By: Jennifer Nelson

I am not opposed to technology in the classroom. It has its uses. For example, I am able to teach language arts to public school students online. While they are in their classroom, I am at home interacting with them via WizIQ. I began teaching this way before the madness of 2020, and the format continues today.

I know that there are many teachers in the United States who teach in Asian public schools this way. The schools have a large monitor set up at the front of the classroom and an assistant teacher there with the students, while the teacher in the US gives the lessons from home. This is all wonderful, and there are many other positive uses of technology.

Yet there is also a bigger picture to consider. My point of view is not likely to fall within the scope of what is commonly discussed, but I am okay with that. My main concern is that children could become trained to rely so heavily on technology that they end up controlled by it. As technology develops, there could come a point where humans decline to use their own mental faculties because they believe technology is superior. And even if they don’t believe that, they may live in a society that does.

A society controlled by technology could be quite unpleasant because it could lead to a digital dictatorship. Harari suggests, “information technology is continuing to leap forward; biotechnology is beginning to provide a window into our inner lives—our emotions, thoughts, and choices. Together, infotech and biotech will create unprecedented upheavals in human society, eroding human agency and, possibly, subverting human desires. Under such conditions, liberal democracy and free-market economics might become obsolete” (2018).

This is not something that happens automatically. It is something that is implemented over time, conditioning the recipients to accept it. If one wants to be effective, one does not begin by saying, “This will be used to reduce humans to a slave class in order to implement a worldwide dictatorship.” That would be ridiculous and hard to believe.

Yet it is already happening at some level. For example, “China’s government is embracing technologies to monitor its population. A national plan to develop artificial intelligence highlights its ‘irreplaceable role in effectively maintaining social stability.’ Surveillance cameras with facial recognition, policing platforms that crunch big data and the monitoring of smartphones and social media are being deployed” (Yuan, 2018).

That’s not something that can be easily implemented worldwide, but that doesn’t mean it’s impossible.

It’s like the tale of how to boil a frog. This common saying suggests that you can’t put a frog into hot water because it will jump out; you must put it into water that is comfortable and then slowly turn up the heat. Whether or not that is actually true is irrelevant, because it’s the imagery that matters.

In this case, the imagery conveys that if you want to change a society, you start with the children. And you begin with a normalization process that tells educators things like, “the simplest application of AI often provides the most immediate benefit: By automating straightforward tasks such as grading, digital asset categorization or timetable scheduling, educators can increase the amount of time they spend actively engaging with students” (Bonderud, 2019).

Then you push the envelope further with “emotional care robots” and other things that disconnect humans from engaging with one another in favor of having some technology do it. Chen, Yeh, Tseng, Wu, and Chung (2009) outline using an emotional robot as a teaching assistant: “For each robot like a performer, it will behave individually in its own distinctions including character, expression and emotion. Hence, they can reply different responses to the same stimulus” (Chen et al., 2009).

Behavioral Signals notes that it has a tool that can read emotions from a child’s voice in order to alert the teacher to the child’s state (n.d.). This is promoted as helpful for classroom management, but in my opinion it sets a dangerous precedent.

My second concern with technology is the health of the children. If classrooms are full of wireless technology, there should be safety protocols to protect students and teachers from exposure to the radiation. This is important because “people with chemical and/or electromagnetic sensitivities can experience debilitating reactions from exposure [to] electromagnetic fields emitted [by] computers, cell phones, and other electrical equipment” (Carpenter, Melnick, Herbert, Scarato, & Clegg, 2019).

With children being issued laptops and other devices, I find the subject of radiation to be quite relevant.

Furthermore, schools are not always following the recommended guidelines: “per current guideline Wi-Fi antennas must be at least 20 cm / 8 inches away from [the] body. In practice students are very close to antennas for many hours a day” (Carpenter et al., 2019). I question whether educators are even aware of this.

The point of implementing technology in the classroom is to enhance the students’ educational experience. Yet educators need to be cognizant of how to do this in a harmless manner.

In addition to the potential for conditioning children to accept anti-human values, there is also the issue of health. Carpenter et al. note that “there is strong scientific support to argue that EMF/RFRs are important contributors to degrading the optimal chemical-electrical function of our bodies – thereby detuning our brains and nervous systems” (2019).

I support technology that is used in a healthy and responsible way. To use technology in this fashion, people have to challenge themselves to be conscious of potential issues so that we can all conduct ourselves in a manner that values human life. This involves cultivating a high level of awareness of the potential outcomes and setting limits on what technology will and will not be used for.

(Date: 10/22/2020)


References


Behavioral Signals. (n.d.). The Impact of AI on Education. https://behavioralsignals.com/the-impact-of-ai-on-education/

Bonderud, D. (2019, August 27). Artificial Intelligence, Authentic Impact: How Educational AI Is Making the Grade. EdTech Magazine. https://edtechmagazine.com/k12/article/2019/08/artificial-intelligence-authentic-impact-how-educational-ai-making-grade-perfcon

Carpenter, D., Melnick, R., Herbert, M., Scarato, T., & Clegg, F. (2019, March 25). Questioning the Safety of Our Children’s Exposure to Wireless Radiation in School [Video]. YouTube. https://www.youtube.com/watch?v=Rpa2XFxYax8

Chen, J., Yeh, L., Tseng, H., Wu, G., & Chung, I. (2009). Development of an Emotional Robot as a Teaching Assistant. In Chang, M., Kuo, R., Kinshuk, Chen, G. D., & Hirose, M. (Eds.), Learning by Playing: Game-based Education System Design and Development. Edutainment 2009. Lecture Notes in Computer Science, vol. 5670. Springer, Berlin, Heidelberg. https://link.springer.com/chapter/10.1007/978-3-642-03364-3_64

Harari, Y. N. (2018, October). Why Technology Favors Tyranny. The Atlantic. https://www.theatlantic.com/magazine/archive/2018/10/yuval-noah-harari-technology-tyranny/568330/

Yuan, L. (2018, March 1). Stranger Than Science Fiction: The Future for Digital Dictatorships. The Wall Street Journal. https://www.wsj.com/articles/stranger-than-science-fiction-the-future-for-digital-dictatorships-1519900866