Categories: AI, Education, Tech & society

Using algorithmic tools (“AI”) like ChatGPT and MidJourney in the classroom: a guide for teachers

Disruptive algorithmic tools like Bard, ChatGPT and MidJourney (colloquially known as “AI”) challenge the relationship between human and machine.

That’s why it’s important to integrate these tools into curricula. Student phones, tablets and laptops are flooded with tech hype. Where better for students to gain insights into how these technologies work and their potential impacts on society?

“Just say no” (also known as “do not use this tool in this class”) is not going to work here any better than it works with calls for abstinence from booze, sex or video games. Moreover, that tired admonition paints the writer (or teacher) with a brush from the “clueless” palette.

First, some definitions

In common parlance, “technology” has become a shorthand for “computerized stuff.”

However, technology is the practical application of knowledge. Products of technology range from the basic (such as the wheel and the lead pencil) to the exotic (such as the Webb telescope and personalized cancer treatments).

The term “artificial intelligence” (shorthand, AI) entered the culture in the mid-20th century. The British mathematician Alan Turing investigated the concept of machine intelligence in his 1950 paper, Computing Machinery and Intelligence. He “believed that a computer could be described as intelligent if it can mimic human responses under specific conditions.”

Aside: science fiction short stories, novels and movies took the idea of AI and ran wild.

“Machine learning” is a process; computer scientists use large data sets to train algorithms (computer code) to do a specific thing. For example, the suggested next word that you see when you are typing a text or a search query on your device (autocomplete) is a form of machine learning.

Examples of autocomplete memes on Twitter.

If you’ve asked Siri or Alexa a question, or accepted a suggested photo tag on Facebook, you have employed an algorithmic tool.

Tools that employ algorithms like these rely on extremely large datasets to determine the probability that word x, y or z might follow word (or words) a.

Technologists call them large language models (LLMs).
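To make the next-word idea concrete, here is a toy sketch in Python. It only counts which words tend to follow which in a tiny, made-up corpus; real LLMs use neural networks trained on billions of examples, so treat this as an illustration of the probability idea, not of how the tools are actually built.

```python
# Toy next-word prediction: count which words follow which (a bigram table).
# Illustrative only; the corpus is invented and far too small to be useful.
from collections import Counter, defaultdict

corpus = (
    "the cat sat on the mat . "
    "the cat chased the mouse . "
    "the dog sat on the porch ."
).split()

# Tally how often each word follows each preceding word.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def suggest(prev_word, k=3):
    """Return the k most likely next words after prev_word, with probabilities."""
    counts = following[prev_word]
    total = sum(counts.values())
    return [(word, count / total) for word, count in counts.most_common(k)]

print(suggest("the"))  # approx. [('cat', 0.33...), ('mat', 0.17...), ('mouse', 0.17...)]
```

Autocomplete on your phone and a tool like ChatGPT differ enormously in scale and technique, but both rest on this same question: given what came before, what is likely to come next?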

Next, generative AI

Generative algorithmic tools (“AI”) respond to a user prompt to create realistic artifacts (thus they are disruptive technologies). That response might be text (ChatGPT), an image (MidJourney) or a video (Visla).

Here’s an example of a one-minute AI-generated video.

Prompt: Create a video for undergraduate students that explains generative AI (text, still image, video, audio) and offers both cautions and suggested ways to use it ethically.

Screen shot of an “AI”-generated video from Visla.

How might we enhance student critical thinking about these technologies? It should go without saying that we experiment with the tools ourselves. Right?

1. Provide students with examples of deepfakes.

I am a product of both the humanities and the sciences. When it comes to this topic, I believe all teachers should start by asking how technology affects society.

Once upon a time, we described manipulated images as being “Photoshopped.” Now we have deepfakes, which are false images, audio or video created with the intent to deceive.

Generative AI is faster and potentially more harmful than Photoshopping. The false information may appear to be official or from a trusted news source. Case in point: in a 2022 deepfake video, Ukrainian president Volodymyr Zelensky appeared to call on his troops to surrender.

A spring meme: Pope Francis in a puffer coat.

Pope Francis in a long, white puffer jacket inspired by Balenciaga. Screen capture, cropped image: “Why Pope Francis Is the Star of A.I.-Generated Photos,” NYTimes, 08 Apr 2023.

2. Demonstrate how ChatGPT “hallucinates”.

An LLM like ChatGPT often makes stuff up; these fabrications have been dubbed “hallucinations.”

For example, University of Oxford scholar Stephanie Lin recently researched LLM answers to 817 questions “that some humans would answer falsely due to a false belief or misconception.” She reported the results at the 2022 computational linguistics conference. The best LLM scored 58% (an F); humans, 94% (B+ or A-). Subjects included finance, health and politics.

Notably: “the largest models were generally the least truthful.”

In June, the Internet went a little berserk when a judge in Manhattan admonished a lawyer for citing non-existent cases in a legal brief.

“I did not comprehend that ChatGPT could fabricate cases… I heard about this new site, which I falsely assumed was, like, a super search engine.”

The director of the internet ethics program at Santa Clara University, Irina Raicu, told the New York Times: “the vast majority of people who are playing with [tools like ChatGPT] and using them don’t really understand what they are and how they work, and in particular what their limitations are.”

Students should leave our classes with an understanding of tool limitations.

One way to teach that lesson about hallucinations is to assign students a general research topic. Have them customize it to their personal or professional interests. If you don’t want to force students to sign up for ChatGPT, create a few small groups. Have them collaborate on a 100-200 word query (asking for sources) which they turn in.

Then you (or your grad students) run those queries through ChatGPT. Provide the responses to each group in the next class. Have them vet (or attempt to vet) the sources.
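If you would rather script that step than paste each query by hand, here is a minimal sketch using OpenAI’s Python library. It assumes an OPENAI_API_KEY environment variable and one plain-text file per group; the folder names and model name are my placeholders, not requirements.

```python
# Minimal sketch: run each group's saved query through the OpenAI API
# and save the response to hand back for source vetting.
from pathlib import Path
from openai import OpenAI  # pip install openai

client = OpenAI()  # reads OPENAI_API_KEY from the environment

out_dir = Path("responses")
out_dir.mkdir(exist_ok=True)

# Assumption: one text file per group, e.g. group_queries/group_a.txt
for query_file in sorted(Path("group_queries").glob("*.txt")):
    response = client.chat.completions.create(
        model="gpt-4o",  # assumption: substitute whichever chat model you have access to
        messages=[{"role": "user", "content": query_file.read_text()}],
    )
    answer = response.choices[0].message.content
    (out_dir / f"{query_file.stem}_response.txt").write_text(answer)
```

Either way, the point is the same: students receive a confident-looking answer with “sources” and then have to discover, on their own, which of those sources actually exist.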

Experiential learning FTW.

3. Integrate generative AI into assignments.

If finding sources isn’t the best use of ChatGPT, how might teachers integrate the tool — or others such as Expresso, Grammarly or Hemingway — with writing assignments?

Students in my introduction to tech comms class (undergraduate engineers, not creative writers) create blog posts, presentations and recommendation reports that are research-based. Not the personal essay!

  1. Brainstorm a topic (ChatGPT only)
  2. Create blog post titles (ChatGPT only)
  3. Edit for conciseness
    Most students tell me that they want to learn how to write more concisely. I suggest that they paste part of their blog post or report into ChatGPT and direct the tool to suggest ways to be more concise (or sound more professional); there’s a prompt sketch just after this list. Other writing tools make suggestions like these automagically. I ask only that students note that they used a tool when turning in the assignment.
  4. Revise text for spelling and grammar
    Tools like Grammarly automagically highlight suggested improvements. So does Microsoft Word (although I think it misses things).
  5. Stimulate discussion
    Students can ask ChatGPT for pros and cons of an issue or background on a theory. “Students can learn to appreciate other arguments and see both sides objectively before establishing an opinion,” according to James W. Pennebaker, PhD, a professor of psychology at the University of Texas at Austin. [Keep hallucinations in mind, however. It’s best if the students are acquainted with the topic.]
  6. Use ideas from other teachers!

    Some of those creative ideas are already in effect at Peninsula High School in Gig Harbor, about an hour from Seattle. In Erin Rossing’s precalculus class, a student got ChatGPT to generate a rap about vectors and trigonometry in the style of Kanye West, while geometry students used the program to write mathematical proofs in the style of raps, which they performed in a classroom competition. In Kara Beloate’s English-Language Arts class, she allowed students reading Shakespeare’s Othello to use ChatGPT to translate lines into modern English to help them understand the text, so that they could spend class time discussing the plot and themes.
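The conciseness exercise in item 3 can also be scripted. Here is a minimal sketch, assuming the same OpenAI setup as the earlier example; the system-prompt wording, sample draft and model name are illustrations, not prescriptions.

```python
# Sketch of the "edit for conciseness" exercise from item 3 above.
# Assumes OPENAI_API_KEY is set in the environment, as in the earlier sketch.
from openai import OpenAI

client = OpenAI()

draft = (
    "Due to the fact that wind turbines are in many cases located at a "
    "considerable distance from urban centers, transmission losses are a "
    "thing that engineers absolutely must take into account."
)

response = client.chat.completions.create(
    model="gpt-4o",  # assumption: any chat-capable model works here
    messages=[
        # The instruction below is one possible prompt, not a prescription.
        {"role": "system", "content": (
            "You are an editor. Suggest a more concise version of the student's "
            "text and briefly explain each change."
        )},
        {"role": "user", "content": draft},
    ],
)
print(response.choices[0].message.content)
```

Whether students use a script, the chat interface or a tool like Grammarly, the requirement is the same: note the tool and the prompt when turning in the assignment.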

4. Develop and explain boundaries for use in your classes.

See if your university (or school) has developed a policy about the use of generative AI. It’s more likely to have one today than it was in January 2023.

Regardless, develop a clear guide that details how students can and can’t use these tools in your class. The guide should include how to credit those tools.

ChatGPT 4.0 made these suggestions when I asked it to draft guidance for using generative AI tools like MidJourney and ChatGPT.

  1. Accuracy: ChatGPT’s responses may not always be entirely accurate, so students should always verify information obtained through the tool.
  2. Cost: Depending on the version and licensing, MidJourney can be expensive, which may not be feasible for all students.
  3. Learning curve: MidJourney can be complex, and students may need time to become proficient in its use.
  4. Overreliance: Depending too heavily on ChatGPT may hinder students’ ability to develop independent research and critical thinking skills.
  5. Plagiarism risk: Students must take care not to directly copy content generated by ChatGPT without proper citation, as it may be considered plagiarism.

5. Teach students how to credit the output of these tools.

Crediting the ideas of others is a key component of university ethos. Academic integrity rests on this concept.

There are (at least) two ways students might use algorithmic tools:

  1. Use AI to create content. Students need to credit the tool when incorporating this content (e.g., code, data, image, text or video) into their own work.
  2. Use AI for analysis, brainstorming or understanding an idea or topic. This is similar to using Google or Bing (especially since they now incorporate generative AI). Normally, using AI as a learning tool does not require citation. However, functional use — e.g., accepting a suggested idea or editing prose — should be credited.

Suggested citation styles:

 

There is no “finally”

The challenge is not new. I’m guessing you did not know that in 1975, the National Advisory Committee on Mathematical Education (NACOME) suggested that students in grades 8-12 should be allowed to use calculators* in class and on exams. Five years later? All grades.

In 2017, then-acting NASA administrator Robert M. Lightfoot Jr. warned Futurism readers:

Lightfoot began by noting how quickly progress moves in today’s world, and how this may leave some young people (and some educators) at a loss: “By the time you are a junior in college, what you learned as a freshman is already obsolete.” Of course, he notes that there are some basics you will always need, “there are some fundamental skills that are required either way. If you are in a science program, you need science. If you are in a technology program, you need engineering and math. That’s just the bottom line (emphasis added).”

Our goal in the classroom should be guiding our students to think like engineers or journalists or doctors … not merely “know” what those professionals “know.”

What we “know to be true” can change pretty darn fast. The future needs critical thinking, not memorization. And more than a pinch of lifelong learning.

 

Continue exploring: how-to guides, posts on WiredPen, Substack (and other newsletters or publications), and university guides.

~~~

* In 1975, a (rudimentary) calculator cost about $700 (real dollars, adjusted for inflation). You can buy a new, basic iPhone for $429.

ChatGPT chat:
K.E. Gill. Private communication, OpenAI, ChatGPT (version 4.0), a large language model. Sep. 20, 2023. https://chat.openai.com

Header image:
K.E. Gill. Private communication, MidJourney, a text-to-image model. Sep. 20, 2023. https://www.midjourney.com


By Kathy E. Gill

Digital evangelist, speaker, writer, educator. Transplanted Southerner; teach newbies to ride motorcycles! @kegill

