
Biases and Breaches: The Importance of Teaching Students About Human Influence in Technology

Updated: Nov 21, 2023




From the widespread use of the Internet and social media to the digital transformation of sectors including education, the impact and importance of technology in our society cannot be denied. It is in this context that the World Economic Forum sets forth digital literacy as a skill that schools and universities must teach our youth. Students must learn about the increasingly digital world not only in technical terms but also with the understanding that science, technology, engineering, and mathematics (STEM) are not neutral: they are shaped by human influence and social constructs. It is crucial that students have the capacity to reflect on and analyze how technologies are developed, accessed, and used, and how these choices carry implications for our political, social, and economic worlds.


Analyzing how politics and bias seep into tech

The politics and biases of coders, engineers, designers, and developers can be embedded in the code, data, algorithms, or any other aspect of the technology they work on. What they create and release into the world is therefore not purely objective; it can reflect only their own lived experiences and exclude people who live differently from them. This is exactly what data scientist Cathy O’Neil highlights in her book Weapons of Math Destruction: algorithms are not just built for efficiency but can codify society’s prejudices and inequalities. For example, firms can discriminate against people from low-income backgrounds or of a particular race or gender when their hiring models screen out applicants based on credit score, zip code, or the given name on a résumé. Such hiring practices perpetuate and reinforce existing cycles of unemployment and poverty, so students must be taught to value fairness and to avoid letting their biases skew data and algorithms.
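
To make the zip-code example concrete, here is a minimal, hypothetical Python sketch (not taken from O’Neil’s book): a screening rule that never mentions race or income, yet filters on proxies for both. The thresholds, names, and zip codes are invented for illustration.

```python
# Hypothetical sketch: a "neutral" resume screen that rejects applicants
# on credit score and zip code. Because both features correlate with
# income, race, and neighborhood, the rule quietly encodes those biases.
from dataclasses import dataclass

@dataclass
class Applicant:
    name: str
    credit_score: int
    zip_code: str
    years_experience: int

# Arbitrary, invented thresholds for illustration only.
MIN_CREDIT = 650
EXCLUDED_ZIPS = {"60623", "60621"}  # hypothetical low-income areas

def passes_screen(a: Applicant) -> bool:
    """A rule that never mentions race or income, yet filters on proxies for both."""
    return a.credit_score >= MIN_CREDIT and a.zip_code not in EXCLUDED_ZIPS

applicants = [
    Applicant("Ana", 700, "60614", 2),
    Applicant("Ben", 610, "60623", 8),  # most experienced, yet screened out
]
for a in applicants:
    print(a.name, "advances" if passes_screen(a) else "is screened out")
```

The rule looks neutral on its face, but whoever chose the threshold and the excluded zip codes baked their assumptions into the outcome; that is exactly the kind of human influence students should learn to spot.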


Using tech to solve problems and improve society

Becoming aware of one’s own judgments and biases is only the first step. The younger generation must also be equipped to identify not just the risks but also the benefits of technology. As future STEM experts, students should be guided toward shifting the power dynamics in technology and becoming better citizens of the world. This positive, yet not uncritical, view of technology as an avenue for changing the world is shared by space entrepreneur Peter Diamandis in The Future Is Faster Than You Think. Diamandis surveys the usefulness of existing tech across industries like food, healthcare, and education, as well as the possibilities that can be unlocked through the convergence of two or more technologies such as artificial intelligence (AI), 5G, and blockchain. Educators can start instilling this idea of STEM, with its careers and advancements, as a powerful catalyst for change by encouraging students to investigate problems within their communities, then giving them a chance to collaborate on solutions that are inclusive of their many views and perspectives about the world.


Recognizing the values of privacy, security, and transparency

Beyond asking questions and sparking conversations about the implicit biases and societal implications of technology, students must also be prepared to value privacy, security, and transparency when approaching current and future tech. As discussed in a previous post entitled Teaching AI Ethics to High School Students, the emerging role of AI involves gathering data from various sources and platforms, from which patterns and assumptions can be made. As such, the people behind AI must be careful not to exploit their access to a wide range of information and must secure it from being compromised by third parties or unauthorized actors. To illustrate, there has been discussion of leveraging AI in medical devices and health records to automate and streamline processes, but this must come with tight, built-in cybersecurity measures so that health data remains only between patients and providers.
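
As one small illustration of what “built-in” security can mean, here is a minimal Python sketch that encrypts a health record at rest using the cryptography library’s Fernet cipher, so the stored data is unreadable without the key. The record contents and key handling are simplified for illustration; a real system would keep the key in a dedicated key-management service.

```python
# Minimal sketch: encrypting a health record at rest so only key holders
# can read it. Requires the third-party `cryptography` package
# (pip install cryptography). The record below is invented for illustration.
from cryptography.fernet import Fernet

# In practice the key would live in a secure key-management service,
# never stored alongside the data it protects.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b'{"patient_id": "demo-001", "diagnosis": "example only"}'
token = cipher.encrypt(record)          # ciphertext is safe to store
print(token[:16], b"...")               # unreadable without the key

assert cipher.decrypt(token) == record  # only key holders recover the data
```

Encryption at rest is only one layer; in a real deployment it would sit alongside access controls, audit logs, and transport encryption.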




This article was written by Remy Anne Jacobs exclusively for AIClub.



AIClub's educator curricula include a range of introductory yet diverse AI topics to teach K-12 students. In our course AI Ethics, we break down privacy, bias, and trust when using and building AIs. Learn more about this course and teaching ethics to K-12 students here.
