Wednesday, July 25, 2018

A message about the future of teaching and learning


I recently read More Now, A Message from the Future for the Educators of Today by Mark Wagner, Ph.D. It’s a new release, published by EdTechTeam Press; Wagner is the CEO of EdTechTeam.

I have to say I was a little disappointed. I’ve always admired the EdTechTeam team; I recognize a number of their names and am familiar with their work and their efforts.

Let me clarify: I wasn’t disappointed with a lot of what Dr. Wagner had to say. I think much of it was spot on, but his message from the “future” seemed lacking. His future felt more like the upcoming school year, and maybe that’s the best one can hope to do in these unpredictable times. At the end, however, his future and message felt a little too Borg-like for me. I don’t believe resistance is futile, but I also don’t think resistance is actually necessary. Yet. I hope. Maybe.

Technology is growing and changing in ways unfathomable even a few years ago. In the past couple of weeks I’ve read articles highlighting how AI is making a positive difference for talent recruiters and for hiring managers; I’ve also read how AI seems to be a personal and professional horror for job-seekers who don’t know how the algorithms work and who may be overlooked because the algorithm or the interview chatbot didn’t process answers the way a human might. There are many who worry that AI will overlook some quality candidates because AI can’t adjust for certain elements of humanity—AI can’t (yet?) detect insight, humor, savviness, and that difficult, often immeasurable quality of potential.

As technology changes workplaces, it also influences education, though the changes are equally erratic and uneven.

One of the messages from the future reads:
Students don’t need to learn from us anymore. Learning is post-human. Learning math, learning the capitols [sic], learning all this other stuff, it’s post-human. Computers are going to be able to do it better than we do. They are going to be able to leverage that information, beat us, better than we are at whatever it is.
What we’re trying to focus on are those real, human characteristics that we need to push in education: collaboration, communication, creativity, and entrepreneurship. Those are the uniquely human traits that computers, we feel, will never be able to replicate, duplicate, or be better than us in accomplishing. The focus is not learning anymore; it’s on nurturing the human traits that already exist in students. We have to ask, How do we make sure those things flourish, grow, and become strong? Because those are the only things they are going to have thirty years from now (p. 147).
Where to begin? Well, I have an issue with using “anymore” instead of “any more,” but that’s a small quibble, though someone should have made sure he used “capital” instead of “capitol.” I will not minimize correct grammar and mechanics, but, in this case, those are not my major concerns.

I need to start with that first concept: “Students don’t need to learn from us. . . Learning is post-human.”

Um, who programs those computers that are going to leverage information and beat us at whatever we’re trying to do? And if they’re going to beat us at whatever we’re trying to do, then of what value are those uniquely human traits of collaboration, communication, creativity, and entrepreneurship (an interesting deviation from the usual 4Cs of collaboration, communication, creativity, and critical thinking)? I suppose that could mean humans would be free to be more creative and entrepreneurial about what computers might be able to do. . .to beat us at whatever the next thing is that we’re trying to do.

A later message from the future has this chilling observation:
Your job, as an expert in knowledge and information, is about to be replaced by robots. You should not be afraid of that. Instead of being a source for knowledge and information, you should be a source of the kinds of things that humans can only get from humans: friendship, mentorship, guidance, empathy, and what it takes to actually work together.
Your job, as a teacher, should be more like a mentor, a community leader, a mediator, an arbiter, potentially an exemplar of the whole person. Your job will become more like being a coach-therapist-mentor, but that’s still a form of education. You’re just educating certain facets of humanity that are the hardest for robots to replicate (p. 153).
Yes, robots will be able to gather and retain zillions more bits of information than I ever will or will ever want to. And, in time, robots may be able to sift through those zillions of bits of information to construct knowledge, which is a theoretical and practical understanding of something that requires more than just bits of information. But computers do not yet have knowledge; they have information. As algorithms are built and AI is developed, AI will be more adept at sifting through bits of information to make logical connections, but AI (as yet) does not have the very human capacity of reason, nor can AI or robots construct what we know as knowledge.

Does Alexa know what type of music I prefer to listen to first thing in the morning? No, she doesn't know. But the algorithms could check to see what I listen to most often in the morning and report that as my preference. Of course, the algorithms don't know if that's really my preference or if that's what I've been listening to because there were specific reasons I chose that music.
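To make that concrete, here is a minimal sketch of the kind of frequency counting such an algorithm might do. The play log, the function name, and the "morning hours" cutoff are all made up for illustration; this is not Alexa's actual code, just the shape of the logic: count what was played, report the most common item, and call it a "preference."

```python
from collections import Counter

# Hypothetical play log of (hour_played, what_was_played) entries.
# A sketch of frequency counting, not any assistant's real implementation.
play_log = [
    (7, "classical"), (8, "classical"), (7, "jazz"),
    (6, "classical"), (9, "news radio"), (8, "classical"),
]

def reported_morning_preference(log, morning_hours=range(5, 10)):
    """Report the most frequently played morning item as the 'preference.'

    This is a count of what was played, not a reason why it was played:
    information, not knowledge.
    """
    morning_plays = [item for hour, item in log if hour in morning_hours]
    if not morning_plays:
        return None
    most_common_item, _count = Counter(morning_plays).most_common(1)[0]
    return most_common_item

print(reported_morning_preference(play_log))  # prints "classical"
```

Notice what's missing: any notion of why I chose that music. The count is in the data; my reasons are not.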

Do robots know how to do particular tasks? No, robots are mechanical devices with circuits, sensors, and programmable elements. A robot does what it is programmed to do; a robot follows a sequence of commands. Period. It does not question if that command makes sense because it does not have reason.
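A tiny sketch of that point, with a made-up command list and a pretend robot (nothing here is a real robot API): the commands are carried out in order, and any "does this make sense?" check would itself have to be written by a human.

```python
# A pretend robot that follows a fixed sequence of commands exactly as given.
# The command names are invented for illustration; no real robot API is implied.
commands = ["forward 2", "turn left", "forward 2", "forward 2"]

def execute(command_sequence):
    for command in command_sequence:
        # No judgment here: the command runs whether or not it makes sense
        # (e.g., driving straight toward a wall). Any "sanity check" would
        # have to be programmed in by a human.
        print(f"executing: {command}")

execute(commands)
```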

So I’m thinking about a group of 3rd graders who worked on a habitat project. They each chose an animal and were supposed to be doing research to answer some specific questions about their chosen critter. iPads in hand, students routinely asked Siri. Now Siri is not (yet) a hugely sophisticated AI system, and mostly she sent them to web sites they had to read, much to their dismay, frustration, and irritation. They just wanted the answer. They didn’t want to understand anything about this animal or its habitat. I got that. Too often this habitat project is part of the curriculum checklist and doesn’t seem to be part of much else, so it’s not like they’re really learning about the desert or the savannah or anything else. Animal duly “researched” and little habitat model duly created in empty pizza boxes with construction paper, cotton balls, and other assorted stuff.

Admittedly, some of the habitats were really creative. Some kids really got into the design of their habitats, then wanted to better understand their animals, so they went back to the computer to gather more information and put more bits of information together to expand their knowledge and their understanding. Silly children. That’s so post-human.

I mean, why in the world would we want kids to know how to learn? Or get at all interested in or excited about knowing how to learn? How to solve problems and puzzles when computers and robots are going to do that for us?

Fast Company recently published an article about Siri, Apple's voice assistant, and some of its next features. If you read through that article and you're a Siri fan, try not to be distracted by the features, because what I'd like you to notice is how many other functions and apps it mentions. Here's the deal: we've been using smartphones for over a decade now and we've become accustomed to what apps can do for us. We've become accustomed to working in the cloud. We've become accustomed to what car systems can do for us. Driverless cars? Who would have thought that possible even five years ago?

We also know what we wished certain apps would do for us because of the way we think. Others have different ideas of the way they wished those same apps would work for them because of the way they think.

My point is this: we cannot confuse knowledge with information. We cannot diminish what humans are able to do with the power of learning, with the unique capabilities we have for making unusual and unexpected connections with the paltry bits of information our little brains can manage to process. So yes, we let the computers and the robots do the heavy lifting of processing information, but it's likely that, for a time yet, the actual creative process of producing knowledge will remain a human task.


I think about how often I use my phone to look up something and I remember those students who don’t know how to frame a question to get an answer because they haven’t learned how to do research. Because I’m old, I remember sitting on the floor of my mother’s sewing room/”office” because the encyclopedias were on the bottom shelf of the bookcase in there. I was so excited when those beautiful Encyclopedia Britannica boxes showed up, and I’d sit on the floor to do research for school and get distracted and follow different articles to different things until I had to pause to try to remember what I was looking for in the first place. I still get lost doing research on the internet. Anyway, I know that I learned how to do research. And I learned how to learn. And I know that one of the distinctly human qualities many of us have is to find connections between apparently disparate things. I’m not sure robots can do that yet, or if they ever will be able to do that.

I agree that there are some things it may not be necessary for me to learn because a computer can do it for me or get the straight-up answer for me. People disparage the practice of teaching kids state capitals, and maybe it’s not one of the most important things; however, in the process of learning their capitals, they might also learn how to read a map and learn how to find different states in their own country. And then they might learn something about topography and geography, which might also lead to learning something about geology, transportation, history, agriculture, weather, immigration, and more. As a result of learning this messy compendium of stuff, students learn how to make connections between things in the present, between things in the present and the past, and they can begin to think about what could happen in the future.

I heard you ask why in the world kids would need to learn how to read a map when they have Google Maps and GPS. It's an ongoing argument/discussion. I use GPS all the time when I travel. Some months ago I was going to the same city pretty much bimonthly, and I suddenly realized I could not get to the hotel from the airport without GPS because I never paid attention to direction or where I was or landmarks or anything. Then I wondered what would happen if the satellites went down or lost connection (let’s not be too catastrophic) or my phone had no more juice and I forgot my charger. So I stopped using GPS as much until I could find my way without it. And guess what? I discovered a lot about the area in which I was working because I was finally paying attention! Come the apocalypse, the kids who know how to use an actual compass and a map are going to beat the zombies. Just sayin’.

I don’t want students to be docile recipients of information, even with those distinctly human qualities of collaboration, communication, creativity, critical thinking, and entrepreneurship. . .and cultural awareness. That’s almost like the old factory model of kids being vessels that teachers fill with information so they can spew it out on a test. Only this time the robots are giving students information.

I want students to be architects of knowledge. I want them to have some proficiency with skills that they can wield as they take in that information, from wherever and however they get it. I want them to continue to learn how to learn and figure out how to use their bits of information as they are collaborating and communicating and being creative through critical thinking, cultural awareness, and entrepreneurship.

I think teachers will still be a source of knowledge because they will be a source of experience about how they used information to craft something, to figure out something, to solve something, to become.

I’ve been trying to imagine being in a situation in which I never have to do research again, but I can’t. Even if I ask Alexa or Siri for some information, that information might lead to other questions. I have to figure out which questions are more important. That’s learning. Learning is realizing that the first answer given by the robot or computer may not be the only answer or even the best answer. Maybe the question wasn’t phrased correctly. Or maybe the question wasn’t the right question because the questioner didn’t have enough information or the right information.

Discernment. Knowledge. These are human qualities.

I actually love getting lost in information. It is a quirky fault of mine. I can start looking up something and, like many, I end up with a few dozen tabs open because I find I have several trails I need or want to follow. Because I have knowledge—stuff I’ve learned over time because of experience and because of the things I’ve figured out or bits of information I’ve combined—I can often discern when one trail has fizzled out or isn’t worth following. That’s part of learning.

I may be reading Dr. Wagner incorrectly, I give you that. But I have to say that I reject the idea that learning is post-human. I think that learning is one of the most brilliant of human qualities. Do teachers have to be the sole purveyors of information? No, no, and no. In fact, most of the really great teachers I know point students to the computer and say, “Let’s look it up.” Why? Because they know that the process of research, of figuring out what’s relevant and why, is one of the most important skills students can ever learn.

Will robots be able to do that eventually? I suppose it’s possible. But that will have to be a highly personalized robot that can discern what’s important to me about those combined bits of information in that situation at that moment.

Until then, yes, teachers, you will need to be coaches and mentors, just as you’ve been for years. You will need to be an arbiter—is this robot answer better than that robot answer?—and help students learn how to collaborate, communicate with each other, be creative, and be critical thinkers who are culturally and globally aware and might be entrepreneurs.

As I contemplate Dr. Wagner’s book and the video of Simon Sinek and Jim Kwik, I say we have to strive for balance. Yes, technology will continue to change how we learn and it could change it faster than we ever imagined. At the same time, we cannot simply yield to technology and let it make us stupid and victims of digital dementia.

Educators have to continue to teach students to learn how to learn, and that may be more imperative now than ever.




Some resources for AI in hiring/recruiting:
3 Ways AI Will Transform Recruitment

Two other resources on maps vs. GPS:
Digital maps vs. Paper Maps
Paper Maps vs. GPS: When to Go 'Old School'

Some other resources on digital dementia:
5 Ways to Combat Digital Dementia
Dealing With the Effects of Digital Dementia
Digital Dementia: Video games improve attention, but is there also a link with dementia?
Digital Dementia: Keynote by Dr. Manfred Spitzer (1:24)
