Not too long ago I was with a group of administrators and we were talking about STEM. For those of us who have been involved with STEM/STEAM for a while, it's hard to remember that not everyone is on board. I was asked about virtual classes, about robots, and about AI. The tones were not so much inquisitive as defensive. I recognize this is an ongoing challenge because technology moves so fast these days. Yes, some of the administrators were older, but that is no excuse. I am a bona fide boomer and neither inept with nor afraid of technology. (So please stop bashing boomers and apologizing that we're not digital natives. We're not digital natives; we're freakin' digital pioneers, because it's boomers who designed and wrote the first iterations of most things people take for granted today.)
So, where was I? Oh yes, robots and AI. I'm old enough to remember when 2001: A Space Odyssey first came out, and holy moley, HAL was freaky. Most of us are familiar with the Matrix movies, in which machines really do become sentient and things get weird and fascinating. Yea, weird and fascinating. There are plenty of movies speculating about what could happen if robots became sentient, and lots of discussion about whether or not AI robots can become sentient or conscious beings. What are the moral and ethical implications? What does that mean for humanity?
The questions and responses also made me think about Kate Darling's TED Talk, "Why We Have an Emotional Connection to Robots." I haven't named all of my robots, but I know I treat them in ways that sometimes cause students to look at me askance. The students, by the way, will name the robots they're using at the time. I've had students burst into tears when they dropped a robot on the floor because they were afraid they'd hurt it.
What has that to do with AI? Well, think of the way you respond to Siri, Cortana, Alexa, Watson, or any other AI technology to which you have access. How many of you say "thank you" to Alexa after the device has done as you asked or answered your question? I do. I also giggle when Alexa thinks I've called out her name and then mumbled some sort of direction, when I've likely just mumbled something that sounded to "her" like "Alexa." How many of you feel bad when you get irritated with Alexa and use a harsher tone of voice? I've not yet apologized to her, but it's only a matter of time.
As an educator, if you're using a platform that makes adjustments based on student responses and interactions (i-Ready, DreamBox), you're already using a level of machine learning and AI to help students personalize their learning pathways.
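The core idea behind those adaptive platforms can be sketched very simply: step the difficulty up after a run of correct answers, down after repeated misses. The thresholds and level range below are invented for illustration; real products use far more sophisticated models.

```python
# Toy sketch of adaptive difficulty adjustment.
# Levels run 1..max_level; recent_results is the last few answers.
# All numbers here are illustrative, not from any actual product.

def next_level(level: int, recent_results: list, max_level: int = 5) -> int:
    """Adjust difficulty based on the last few answers."""
    correct = sum(recent_results)
    if recent_results and correct == len(recent_results):
        return min(level + 1, max_level)   # all correct: step up
    if correct <= len(recent_results) // 2:
        return max(level - 1, 1)           # mostly wrong: step down
    return level                           # mixed results: stay put

print(next_level(3, [True, True, True]))    # → 4
print(next_level(3, [False, True, False]))  # → 2
```

Even this crude rule "personalizes" in the sense the platforms advertise: two students answering the same questions can end up on different pathways within a few items.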
There are, of course, immediate ways teachers can introduce and use AI in their classrooms. There were plenty of educators at ISTE 2018 talking about using Echo Dot in their classrooms. Sure, it can be fun to have a conversation with Alexa, but there could be privacy issues, so be careful. Students can learn how to create chatbots, and it could be interesting to have students hold conversations with them.
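A first chatbot project doesn't need machine learning at all. Here's a minimal rule-based sketch of the kind students might start with; the keywords and canned responses are invented for illustration, not from any particular curriculum.

```python
# Minimal rule-based chatbot: match a keyword, return a canned reply.
# Rules are checked in order; the first keyword found in the message wins.

RULES = {
    "hello": "Hi there! What would you like to talk about?",
    "robot": "I'm just a program, but I like robots too.",
    "bye": "Goodbye! Thanks for chatting.",
}

def reply(message: str) -> str:
    """Return the first canned response whose keyword appears in the message."""
    text = message.lower()
    for keyword, response in RULES.items():
        if keyword in text:
            return response
    return "I'm not sure I understand. Can you say that another way?"

if __name__ == "__main__":
    print(reply("Hello, chatbot!"))
```

Watching where a bot like this fails, and arguing about which rules to add next, is itself a nice exercise in thinking about how Alexa and Siri must work under the hood.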
I was with some high school students recently who were asked to create their own rubric for their presentations. It was an interesting exercise: frustration on both sides. So then I imagined students having a conversation with Alexa or a chatbot to try to explain what they were really trying to say.
If you're looking for other suggestions or ideas for using AI in the classroom, check out this blog post by Shake Up Learning.
It's important to understand that AI isn't coming; AI is here. It's already in your home, your car, your classroom. Data analytics is only a small part of the role AI plays and can play. Heck, sports scouts are using AI to help find the best athletes for their teams. Students (and their parents) fret about how robots will change their lives, but robots have already changed their lives, often without their realizing it. And if it's not robots, it's cobots, collaborative robots designed to work alongside humans. Yep, that's a thing.
The dark side of AI and technology exists. Students can't understand why they have to learn things they can look up on their phones if they don't remember them. Students can't understand why they have to make connections between something in French class and something in Economics class. Students can't understand why they have to remember stuff from Math I and Math II to help them in Math III. Why? Because they can look stuff up or ask Siri or Alexa.
Even as we are embracing AI and playing with robots and recognizing that cobots are a part of some workplaces, we have to help students understand that none of this relieves them of responsibility for learning. In fact, the truth seems to be that if they are unable and unwilling to learn, and unable and unwilling to think critically and creatively, they may have less of a future than most cobots. On the other hand, if they are able to think critically and creatively and figure out ways to help solve problems and find solutions using AI and robots and cobots and whatever else they might invent, well, who knows how they might be able to control and manage their futures?
For more information about AI and education, you might read:
- How Is AI Used In Education -- Real World Examples Of Today And A Peek Into The Future (July 25, 2018)
- The surprisingly boring role AI could play in classrooms (June 18, 2018)
- CoSN Issues Guidance on AI in the Classroom (June 18, 2018); you can request the brief from Michael Kaplun
- The Promise (and Pitfalls) of AI for Education (August 29, 2018)
- 32 Ways AI is Improving Education (August 10, 2018)
- How artificial intelligence could help teachers do a better job (July 30, 2018)