Generative AI tools have the potential to revolutionize the way we interact with technology, opening new doors to faster, more efficient, and more creative learning processes. But that accelerated pace can run away with us unless we are aware of the downsides of these tools.
Learn more about possible benefits and pitfalls of generative AI.
Equity
Generative AI tools have the potential to widen the institutional performance gaps that impact learning in higher education, but also the potential to create a more equitable learning environment. Generative AI is prone to error and can perpetuate biases and stereotypes. Bias may also arise in faculty assessments of which students are making unauthorized use of AI. Students may over-rely on the tools and neglect to develop skills critical to success in higher education and beyond. Inequitable access to broadband internet, and to previous education that fostered prompt writing and related skills, may exacerbate existing institutional performance gaps. However, generative AI can also be thought of as an assistive technology that supports the learning of many of our students. Tools such as ChatGPT can provide useful suggestions for succeeding in college or getting started on assignments, offer immediate feedback on writing, and help students develop communication and planning skills.
Learn more about using generative AI to promote equity in your classroom.
Mis/disinformation
As the efficacy and availability of generative AI tools advance, both we and our students will face a variety of information-related challenges. Generative AI can be used to automate the generation of online misinformation and propaganda, flooding our information environment with disinformation, distracting from accurate information, and increasing skepticism toward content generated by credible scholarly and journalistic sources. Source material used for training data, the design of a model’s algorithm, data labeling processes, product design decisions and policy decisions all contribute to the possible replication of biases and unfair stereotypes (Ferrara, 2). Content generated by AI tools can seem accurate but be entirely made up, a phenomenon known as AI hallucination. Our task as educators is to prepare our students to navigate an information environment characterized by the use of generative AI: inoculating them against disinformation, helping them develop the skill and habit of verifying information, and building their understanding of what a healthy information environment requires.
Learn more about inoculating our students against mis- and disinformation in the age of AI.
Productive Struggle and Deep Learning
In a series of blog posts on the impact of AI on student reading skills, Marc Watkins writes, “Unchecked, the lure of frictionless reading could produce a profoundly atrophied culture of surface-level ideas rather than exploring them in depth.” As noted above, AI tools can function as assistive technologies that support students in meeting course goals. However, if used uncritically, they can reduce productive struggle with course content and skill development. While an AI tool may be able to quickly generate code that performs a task, answer a difficult mathematical problem, or summarize a challenging text, the learning often comes in grappling with the algorithm, working through the problem, or synthesizing the ideas in a text. Allowing a tool to step in and complete the task, or to outline for the student the exact steps to take, may limit the learning that would have come from working through the process on one’s own. There are ways to use generative AI to provoke deep learning and valuable skill building (see What are some approaches to using or working around AI in my class?); however, for this to happen, these tools must be integrated into our courses thoughtfully and in support of course goals.