I found myself scratching my head over this article. Something about it rubbed me the wrong way. Then I realized what it was: the arguments made for AI are the same ones made for XYZ edtech back in the days of the Apple //e and Macintosh LC II, word processors, databases, and the rest. In essence, the old arguments were dragged out, given a fresh coat of “AI paint,” then rushed into service.

AI Is a Critical Skill for Curriculum

Who benefits from putting AI into the curriculum? Who makes money off of this, and does AI distract from what students need to learn, which is how to process information and ideas for long-term retention? Is critical thinking still of value, or is it more critical for Big Tech to push AI into school curricula to ensure more money flows their way from school coffers?

Teaching students how to use ChatGPT properly is a critical skill that schools should incorporate into their curriculum. Education should prepare students to function effectively later in their lives by teaching them how to learn and how to use the tools of modern society. Source

As I read this, I’m struck by how this is the same argument edtech advocates relied on before. It’s an old argument, and it is less persuasive now. Human beings need to learn how to use the tools of modern society; presumably, introducing those tools at an earlier age accustoms students to them.

Monetizing Children with Tech

Let’s not forget that one possible reason Big Tech (e.g., Microsoft, Google) wants those tools in the hands of children is so that, when they become adults, they will continue to use them. A free child user now may very well translate into a paying adult user later. Big Tech has a vested interest in introducing children to its ecosystems. Too cynical?

In the late 1800s, new technology emerging from the Industrial Revolution offered commercial interests a new opportunity to monetize childhood—and those commercial interests took it. At the time, there were about 765,000 American child laborers, many working in extremely unsafe industrial conditions. Children suffered numerous harms, from crushed limbs and broken bones to death by industrial accidents like factory fires.

Big Tech’s social media platforms are similarly exploiting children today. And just as policymakers needed to act to protect children then, they must do the same now. The Digital Revolution has created new ways to exploit children for profit, and Big Tech has seized the opportunity enthusiastically. (source)

As educators, as veteran edtech advocates, perhaps we need to be a little more critical and a little less Pollyanna-ish about yesterday’s edtech adoptions in schools.

A Side Trip into Meaninglessness

Ever since I started blogging, I’ve been reading Harold Jarche’s work. His ideas certainly pushed me into knowledge management in the organizations I worked in, even if I was the only one doing it (er, oops). This quote from Andrew Perfors, which Jarche includes in his piece Careening Toward a Meaningless World, speaks to me:

More likely, while the rich might be able to create walled gardens of meaning, the system for most of us will become a swamp of falseness and distortion, a cursed transformation of humanity’s greatest asset – our cumulative cultural knowledge – into our greatest weakness. —Andrew Perfors 2024-02-14

Is it possible AI’s incorporation into curriculum is yet another way of creating a walled garden of meaning? In the early days of blogging in schools, breaking down the classroom walls seemed a good thing. Now, I’m not so sure.

The conspiracy or grand plan to exploit public schools is, at last, revealed. Move the children of the rich into private schools where they create meaning. For public schools, the goal is exploitation and redirecting funding into expensive curriculum innovations (like AI) that leave children mired in the swamp Perfors describes.

In the meantime, UTSA is establishing a new college focused on AI:

“We don’t want [students] to spend time early in their careers just trying to figure out AI,” said Jonathon Halbesleben, dean of UTSA’s business school who is co-chairing a task force to establish the new college. “We’d love to have them be career-ready to jump right into the ability to sort of shape AI and how it’s used in their organizations.” source

I’m not sure. It’s something to think about and reflect on.

AI Integration into Student Work

Should we, as educators, be adopting the latest AI tech for use by students? Consider the following:

There is no way to undo the technological innovations that ChatGPT has brought, and the engine is certainly not the last AI breakthrough. Students will have to be able to integrate AI into their work throughout their lives.

Using ChatGPT takes practice; phrasing prompts and devising creative uses for the AI requires skill.

It is important to acknowledge that overreliance on ChatGPT can harm students’ learning. Source

If Not AI, Then What?

Well, let’s not be too hasty. There must be SOME use for AI in K-12 schools, especially for teachers. Consider AI integration matched to high effect-size instructional strategies:

| AI Integration Strategy in K-8 Education | Visible Learning MetaX Teaching Strategy | Effect Size | Citation |
| --- | --- | --- | --- |
| Personalized Learning | Feedback (Tasks & Processes) | d=0.63 | Visible Learning MetaX Database |
| Interactive Learning Environments | Inquiry-based teaching | d=0.53 | Visible Learning MetaX Database |
| Support for Diverse Learning Needs | Response to Intervention | d=0.73 | Visible Learning MetaX Database |
| Enhanced Assessment Tools | Effects of testing | d=0.63 | Visible Learning MetaX Database |
| Development of Digital Literacy | Explicit teaching strategies | d=0.63 | Visible Learning MetaX Database |
| Facilitation of Project-Based Learning | Problem-solving teaching | d=0.61 | Visible Learning MetaX Database |
| Language Learning | Direct instruction | d=0.56 | Visible Learning MetaX Database |
| AR and VR | Constructivist teaching | d=0.92 | Visible Learning MetaX Database |
| Teacher Assistance | Scaffolding and situated learning | d=0.52 | Visible Learning MetaX Database |
| Encouraging Creativity | Philosophy in schools | d=0.54 | Visible Learning MetaX Database |
| Collaborative Learning | Cooperative learning | d=0.53 | Visible Learning MetaX Database |
| Critical Thinking and Problem-Solving | Epistemic cognitive training | d=0.51 | Visible Learning MetaX Database |

Further explanation for each strategy can be found here.
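
A quick gloss of my own, not part of the MetaX database: an effect size d expresses the difference between group means in pooled standard deviation units, and Hattie treats d = 0.40 as the “hinge point” for better-than-average impact, so every strategy in the table above clears that bar.

$$
d = \frac{\bar{x}_{\text{intervention}} - \bar{x}_{\text{comparison}}}{s_{\text{pooled}}}
$$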

Wait, What Does AI Say About This?

Let me run my thoughts here through AI, using the FLOATER acronym, and see what it says:

To apply the FLOATER approach to the provided text and evaluate the claims made within it, we will assess each element of FLOATER and assign a score from 1 (lowest) to 5 (highest) based on the information given.

| Element | Evaluation | Score (1-5) |
| --- | --- | --- |
| Falsifiability | The article questions the motives behind integrating AI into the curriculum and suggests that it could be driven by financial interests rather than educational benefits. This claim is falsifiable as it can be tested by examining the financial relationships between tech companies and educational institutions. | 3 |
| Logic | The argument that AI might be pushed into schools for profit rather than educational value follows a logical structure, but it may not consider the full range of reasons for AI integration, such as genuine educational benefits. | 3 |
| Objectivity | The article seems to present a skeptical view of AI in education, which may indicate a bias. It does not appear to provide evidence from both sides of the argument. | 2 |
| Alternative Explanations | The article does consider alternative explanations for the push of AI into schools, such as the historical context of technology in education and potential exploitation. | 4 |
| Tentative Conclusions | The article raises questions and presents a critical view without making definitive conclusions, suggesting a level of tentativeness. | 4 |
| Evidence | The article references historical examples and draws parallels to current events but does not provide direct evidence of AI’s impact on education or the motives of tech companies. | 2 |
| Replicability | The claims made are more speculative and do not provide a clear methodology for testing or replicating the assertions, making it difficult to assess replicability. | 2 |

The overall FLOATER analysis suggests that while the article raises important questions and provides a critical perspective on the integration of AI into education, it may lack objectivity and sufficient evidence to fully support its claims. The scores reflect a need for more balanced reporting and comprehensive evidence to strengthen the arguments presented.
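
For anyone curious how such a rubric check might be scripted rather than pasted into a chat window, here is a minimal sketch. It assumes the OpenAI Python client and a placeholder model name (“gpt-4o” here); any AI tool with an API would work, and the prompt wording is mine, not an official FLOATER template.

```python
# A minimal sketch of scoring a blog draft against the FLOATER rubric with an LLM.
# Assumes the OpenAI Python client (pip install openai) and an OPENAI_API_KEY in the
# environment; the model name is a placeholder -- swap in whatever tool you actually use.
from openai import OpenAI

FLOATER_PROMPT = """Apply the FLOATER rubric to the text below. For each element --
Falsifiability, Logic, Objectivity, Alternative explanations, Tentative conclusions,
Evidence, Replicability -- give a score from 1 (lowest) to 5 (highest) and a
one-sentence justification, then summarize the overall result.

Text:
{draft}
"""


def floater_check(draft: str, model: str = "gpt-4o") -> str:
    """Return the model's FLOATER evaluation of the supplied draft."""
    client = OpenAI()
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": FLOATER_PROMPT.format(draft=draft)}],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    # Evaluate a draft saved as plain text before hitting "publish."
    with open("blog_draft.txt", encoding="utf-8") as f:
        print(floater_check(f.read()))
```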