Image source: Someone in a Facebook ChatGPT for Teachers group

Great news, right? Custom GPTs will be usable by free users of ChatGPT; those custom GPTs will no longer be behind a paywall. The letter above was shared by an educator who is creating custom GPTs for his K-12 students to use.

Can vs Should

Just because you can use a technology in a K-12 classroom, should you? In a blog entry that goes to the heart of the values mismatch between the mission of schools and AI development, which is fraught with a morass of ethical issues, Tom Mullaney explores the issues.

Is AI Compatible with Schools?

That question is the title of Tom’s blog entry. In it, he asks the question many have been asking as they watch AI tech companies make a run at school funding, mount a mass charge to bring AI into schools, charge every person who uses it, and use that data for AI model training all at the same time.

His blog entry highlights several issues with AI, such as equity, racism, bias, copyright, human exploitation, and AI lies or hallucinations. To this pile, I would add climate change impact, water usage, and the destruction of the environment to build massive data centers.

AI Bits: What Are We Worried About?

Tom isn’t the only one with concerns. John Dolman makes a list of concerns, but doesn’t go into details, by his own admission. His list includes:

  • Plagiarism and academic dishonesty. This is on target. Someone tells you not to use AI, you do, and now you have academic dishonesty. It pales in significance compared to the next bullet, in my opinion. [Miguel]
  • Skill reduction. This is that ubiquitous, and valid, concern that AI supplants student thinking, resulting in students who can’t think their way out of a paper bag, put their thinking down in writing, or use writing to clarify their thinking.
  • Misinformation. This covers AI lies, hallucinations, and factual inaccuracies.
  • Intellectual Property and Data Privacy. This is copyright, but also privacy: dropping all that private data from students, staff, and schools into AI could result in the information popping up somewhere else without you ever knowing about it.
  • Equity and access. We’re seeing this already: Teacher A has access to super-powered, customized AI (e.g. MagicSchool, Diffit, Brisk, Eduaide, Teachaid, etc.) while someone else is using lame, er, Llama, via Meta, or struggling along with a free AI. The quality of the responses from the paid solutions is markedly better.
  • Pedagogical impact. What IS the impact on pedagogy and education? Some see AI as the tool to dismantle public education (stuck with bottom-tier AIs) while private and charter schools (with deep-pockets-funded AI tools available to the rich) get the benefit of anything-goes AI. It’s a “new arms race” for education, and AI is already being “weaponized” to give advantage to the privileged elite, whoever they might be, over regular working folks and their kids, who are forced to attend intentionally underfunded public schools.
  • Environmental and resource concerns. You have but to reflect on the environmental impact to know this is worrisome.

I love John Dolman’s appeal to all of us to embrace Epictetus or Stoic principles, which I’ll quote here for those unfamiliar:

Rejoice in what you have, cease worrying about things which are beyond the power of your will, make the best of what is in your power and take the rest as it comes.

Stoic Wisdom

I suspect that many teachers who hope to go back to a time without AI and technology are wishing for the impossible. Teachers, like everyone who isn’t financially independent, will find themselves forced to use AI. Only someone with the most basic needs already met, and a source of income, can hope to embrace Stoic wisdom:

“He is a wise man who does not grieve for the things which he has not, but rejoices for those which he has. Remember that it’s not only the desire for wealth and position that debases and subjugates us, but also the desire for peace, leisure, travel, and learning. It doesn’t matter what the external thing is, the value we place on it subjugates us to another . . . where our heart is set, there our impediment lies. Do not spoil what you have by desiring what you have not; remember that what you now have was once among the things you only hoped for.” -Epictetus

A quick review of those Stoic habits John Dolman refers to:

My 7 Habits of Highly Stoic People

  1. Be self-aware.
  2. Control what you can, let go of the rest.
  3. Practice gratitude, find joy in simplicity.
  4. See obstacles as growth opportunities.
  5. Live in the present, no past regrets or future worries.
  6. Cultivate inner peace against chaos.
  7. Strive for virtue and wisdom.

The more I blend AI into my workflow, the more I ask myself, “Are these benefits worth the price paid by others, the environment?” But then, that’s true of EVERY technology that powers modern society, no?

Ok, enough philosophy. Let’s get down to brass tacks for educators.

Two More Questions

Mike Bell, author of The Fundamentals of Teaching, suggests we ask a tough question when someone says, “The research says it’s effective.” In the case of AI, preliminary research is already suggesting it’s not effective for student learning, and may even hamper critical thinking and long-term information retention. More evidence is needed, of course.

Bell’s point still applies. He writes:

When someone says, “The research says it’s effective,” ask for the effect size. If it requires a lot of work to implement, and has a low effect-size, pause.

What is the effect size of AI in the classroom? Does it cost a lot of money?
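For readers unfamiliar with the term, "effect size" is usually reported as Cohen's d: the difference between the treatment and control group means, divided by their pooled standard deviation. The sketch below uses made-up test scores (the numbers are purely illustrative, not from any real study) to show how the calculation works:

```python
import math

def cohens_d(treatment, control):
    """Cohen's d: difference in group means over the pooled standard deviation."""
    n1, n2 = len(treatment), len(control)
    m1 = sum(treatment) / n1
    m2 = sum(control) / n2
    # Sample variances (with Bessel's correction)
    v1 = sum((x - m1) ** 2 for x in treatment) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in control) / (n2 - 1)
    # Pooled standard deviation across both groups
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Hypothetical scores: one class using an AI tool, one without.
with_ai = [72, 75, 78, 80, 74, 77]
without_ai = [70, 73, 76, 79, 71, 75]
print(round(cohens_d(with_ai, without_ai), 2))  # → 0.64
```

For context, John Hattie's widely cited "hinge point" treats d = 0.40 as the threshold where an intervention is worth the effort, which is exactly the kind of bar Bell is asking us to apply.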

But, of course, that’s the opposite of what’s going on with AI, right? Every company that can make money from AI adoption in K-16 education is promoting it. How can humanity overcome the engines of its own economy? And, of course, schools can’t. They are the tail on the dog, whipped this way and that by whatever catches the attention of a creature with a toddler’s attention span.

Issues Remain

Let’s be honest. AI makes work a lot easier, minimizing the “friction” of making, organizing, and analyzing content. Students need that work to develop their own critical thinking and brain power. Adults focused on making money are willing to use AI. The ethical dilemmas posed by AI tools are massive, but no different than any other technology (see table of tech and ethical concerns) humans use to the detriment of stakeholders.

Those stakeholders are the workers, the environment, anyone who is exploited.

Some day, there will be a reckoning. And AI, like every other technology, will reach a point where it is either no longer an issue or provides such a fantastic benefit (think of your smartphone) that it won’t matter.

It is an unpleasant truth, but the majority of people in society will use AI, no matter its negative effects. After all, I don’t see anyone making their own clothes a la Gandhi in an effort to stop production.

Do you?

Table of Tech and Ethical Concerns

Examples of Technologies in K-12: 1980s and Onward

Both tables below were generated by

| Technology | Ethical Concerns | Resolutions |
| --- | --- | --- |
| Channel One News (1990s) | Commercialization of education; forced exposure to advertising; biased or inappropriate content | Criticism led some schools to opt out or seek alternatives; increased media literacy education efforts |
| Internet access in schools (1990s) | Exposure to inappropriate content; online safety and privacy risks; digital divide and unequal access | Implementation of content filters and firewalls; digital literacy and cyber safety education; E-rate funding and initiatives to bridge the digital divide |
| Webcams in classrooms (2000s) | Privacy concerns for students and teachers; potential for misuse or unauthorized access; chilling effect on classroom behavior | Strict guidelines for webcam use and placement; secure storage and access controls for recordings; opt-out policies for students and teachers |
| Biometric identification (2000s) | Collection and storage of sensitive biometric data; potential for data breaches and misuse; consent and privacy concerns for minors | Strict data protection and encryption measures; limit use to specific purposes (e.g., lunch lines); opt-out options and alternative identification methods |
| RFID tracking (2000s) | Surveillance and privacy concerns; potential for data misuse or unauthorized access; consent and privacy issues for minors | Limited use to specific purposes (e.g., library books); secure data storage and access controls; opt-out options and alternative tracking methods |
| Classroom management software (2010s) | Excessive surveillance and control; data privacy and security risks; potential for misuse or bias in application | Transparent policies on data collection and use; strict data protection measures and access controls; regular audits and reviews for fairness and effectiveness |

Consider this list:

Here are some examples of technology being deployed in K-12 schools, along with their effectiveness, ethical concerns, and resolutions:

| Technology | Deployment | Effectiveness | Ethical Concerns | Resolutions |
| --- | --- | --- | --- | --- |
| Facial recognition systems | Used for attendance tracking, campus security, and monitoring student behavior | Mixed results; accuracy concerns, especially for students of color | Privacy violations, potential for misuse and bias, lack of student consent | Strict guidelines for use, opt-out policies, transparency about data collection and usage |
| Online learning platforms (during COVID-19) | Widespread adoption for remote learning during school closures | Varied effectiveness; issues with access, engagement, and learning outcomes | Digital divide and inequity, student data privacy, screen time concerns | Providing devices and internet access, training for teachers and students, data protection policies |
| AI-powered adaptive learning software | Personalized learning paths and content based on student performance | Promising results for individualized learning, but limited evidence of long-term efficacy | Algorithmic bias, privacy concerns, reduced teacher autonomy | Ensuring diverse training data, human oversight, and transparency; protecting student data; empowering teachers |
| Social media monitoring tools | Tracking students' social media activity to identify potential threats or concerning behavior | Questionable effectiveness; high rates of false positives and limited prevention | Invasion of student privacy, chilling effect on free speech, disproportionate impact on marginalized students | Clear policies on monitoring scope, student and parent notification, opt-out options, data retention limits |
| Virtual and augmented reality (VR/AR) | Immersive educational experiences and simulations | Engaging for students, but limited research on learning outcomes; high costs and technical challenges | Potential for physical discomfort, psychological impact, and addiction; unequal access | Age-appropriate content, time limits, adult supervision, ensuring equal access for all students |
| Wearable devices (e.g., fitness trackers) | Tracking student physical activity and health data for PE classes and wellness programs | Can encourage physical activity, but accuracy and long-term engagement are concerns | Student privacy, data security, potential for body shaming or unhealthy competition | Opt-in policies, strict data protection, focusing on overall wellness rather than individual metrics |