Advancements in AI have come a long way in the last decade alone. AI, or artificial intelligence, has taken the world by storm with tools such as ChatGPT and Copilot, to name a few. In education, AI is a real threat to students but CAN be a great tool if used properly: a threat because of the temptation to cheat, and helpful because of the aid it provides. I feel this applies elsewhere, specifically in software engineering. I, for one, have used AI, specifically ChatGPT, many times over the last few years. Most of that usage has been for non-school-related activities (previously to aid my understanding of the Godot game engine), but there are instances where it has caused small errors in larger coding assignments.
While taking ICS314, I heavily minimized my use of AI (specifically ChatGPT).
For all in-class WODs and practice WODs, I did not use AI to complete the assignments; I felt I would be cheating myself. These assignments test our understanding of recent concepts, and using ChatGPT is like copying code from the professor before trying it yourself: you learn nothing.
The same goes for any essays and, currently, the final project. Essays are meant to be written with your own thoughts in mind. As for the final project, it encapsulates everything we learned during the semester, so I would be cheating myself if I used AI there.
I have used ChatGPT on experience WODs that I failed to complete in time because of a code-related issue (if I was just moving slowly, I did not use AI). This was the case for one or two assignments during the semester. I also used AI to help my understanding of specific concepts that I failed to grasp. One instance was asking ChatGPT how to override Bootstrap variables in a CSS file.
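To illustrate the kind of answer I was after (a rough sketch of the idea, not ChatGPT's actual response): Bootstrap 5 exposes many of its settings as `--bs-` prefixed CSS custom properties, which a plain CSS file loaded after Bootstrap can override. The specific property names and colors below are just example values.

```css
/* Sketch: override Bootstrap 5's CSS custom properties.
   Load this stylesheet AFTER bootstrap.css so it wins the cascade.
   The color values here are arbitrary examples. */
:root {
  --bs-body-bg: #1e1e2e;    /* page background */
  --bs-body-color: #e0e0e0; /* default text color */
}
```

One caveat I learned along the way: Bootstrap's older Sass variables (like `$primary`) cannot be changed from a plain CSS file at all; those require recompiling Bootstrap's `.scss` sources with your own values.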
AI was not used for asking or answering questions in this course, nor for documenting any code. I rarely answered questions; they were typically answered before I saw them. Furthermore, I prefer to ask my questions myself, since an AI-written question might not accurately reflect my understanding, or lack thereof. Documentation was in low demand in this course, so AI was not needed for that task.
As previously mentioned, I have used ChatGPT for a few experiences, particularly to ask where an issue in a file might exist in order to help locate errors. Most code given by ChatGPT results in more complications, so I typically use AI to locate errors rather than to find solutions for them.
As stated earlier in this essay, I have used AI for projects relating to game development. Instead of scouring docs, a quick query provides a quick and (for the most part) reliable response. In this particular case, I have found it a toss-up whether the given information is accurate. Commonly, ChatGPT responds with information from previous versions and outdated documentation. A notable example is the recent update from Godot 3 to Godot 4, which introduced significant changes and tripped up ChatGPT.
I can identify two limitations: the lack of up-to-date information producing unreasonable answers, and abuse of the tool. Both stem from the examples and explanations provided earlier in this essay. It is hard to propose solutions or opportunities to improve the integration of AI into education, as it is nearly impossible to enforce restrictions on it. AI, though a very useful tool, is very dangerous to the future of education and needs a plan to prevent disaster. Currently, I cannot come up with any potential opportunities that stay within reason.
Traditional teaching methods are something I'd like to refer to as brain-flex learning. They require straining the mind, committing things to memory, and referencing them later. AI, however, takes this process and turns it on its head. It is quick and does not require storing concepts in memory. As a result, learners won't typically hold on to what they snag from AI queries. Nor will participants in AI-enhanced learning approaches develop skills such as persistence and resilience. I know many who abuse AI and fold after mere minutes of tackling a task (whether hard or trivial). Now, this isn't to say AI is all bad; it's just that the idea of 'AI-enhanced approaches' seems unreasonable. I consider the use of AI a branch of a tree with traditional teaching methods as the root.
A large want among consumers of AI is better, more accurate responses. I believe that will come with time. Moreover, the longer a user's request, the less likely an AI is to provide a proper response (obviously, challenging concepts play a role too). If AI were to develop to the point where thousand-line requests are responded to with near-perfect accuracy, then AI would surely take over the role of day-to-day Google searches, forum searches, etc.
AI is a valuable tool that should be used within guidelines. Students, especially software engineers, should constantly challenge their understanding and make an effort to be self-reliant in their knowledge. For now, students should get to know the tool, but hold off on having the tool teach the student. Leave that to the professors.