Activity 4
AI Ethics
AI Ethics is an important part of AI that students of all ages should be exposed to. In this activity, we tackle a critical aspect of AI Ethics: Trust and Bias.
Let's Build an AI to Classify Apples and Bananas
Build an AI in Navigator to tell the difference between apples and bananas
How to start using Navigator
1. Go to https://aiclub.world/try-navigator
2. Follow the instructions in the video below to build the AI
Discussion
1. What is the accuracy of the AI?
2. Try a few pictures of apples - does it work?
3. Try a few pictures of bananas - does it work?
4. How about a picture of an orange - what does the AI think it is? Why? (See the code sketch after this list.)
5. How can we teach this AI to recognize oranges?
(a) give it one picture of an orange
(b) give it fewer pictures of apples or bananas
(c) give it as many pictures of oranges as of apples or bananas
(d) any of the above will work
6. What does this exercise tell us about AI Bias?
(a) If an AI is taught about Caucasians and Asians, it may not answer questions about African Americans correctly
(b) If an AI is taught about adults, it may not answer questions about teenagers correctly
(c) If an AI is taught about men's purchase patterns, it may not recommend relevant products for women
(d) All of the above
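To see why the AI is forced to call an orange either an apple or a banana, here is a minimal sketch in Python. This is not how Navigator works internally; the color-based features, the made-up numbers, and the use of scikit-learn are illustrative assumptions only.

```python
# A toy two-class classifier. Features are hypothetical average
# (red, green, blue) color values for each picture.
from sklearn.neighbors import KNeighborsClassifier

# Tiny made-up training set: red-ish apples, yellow-ish bananas.
X_train = [
    [200, 30, 40],   # apple (red)
    [180, 40, 50],   # apple
    [150, 20, 30],   # apple
    [230, 220, 60],  # banana (yellow)
    [240, 230, 80],  # banana
    [220, 210, 70],  # banana
]
y_train = ["apple", "apple", "apple", "banana", "banana", "banana"]

model = KNeighborsClassifier(n_neighbors=3)
model.fit(X_train, y_train)

# An orange has high red and medium green. The model has no "orange"
# class, so it can only answer "apple" or "banana" - never "orange".
orange = [[240, 140, 30]]
print(model.predict(orange))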
In this activity, we will see how a dataset can contain inherent bias which, if left unmanaged, can affect the quality of an AI's predictions.
We will use the COMPAS dataset, which comes from a well-known study illustrating how race as a feature can influence predictions of recidivism risk.
Algorithms were used to predict an inmate's risk of re-offending. Studies later showed that these algorithms were significantly biased by race.
By building your own AI and exploring it, you can see whether an AI built from this dataset is biased.
The dataset we will use is a processed version of the one found in the Kaggle repository at https://www.kaggle.com/danofer/compass
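If you would like to look at the raw data outside Navigator, a short pandas sketch can show how the risk scores are distributed across race groups. This assumes you have downloaded the processed file as compas.csv and that it has columns named race and decile_score; adjust the names to match your copy.

```python
import pandas as pd

# Assumed filename and column names for the processed dataset.
df = pd.read_csv("compas.csv")

# Average risk decile per race group: a first, rough look at whether
# the scores differ systematically across groups.
print(df.groupby("race")["decile_score"].mean().sort_values())

# Row counts per group - imbalanced group sizes are themselves a
# possible source of bias in any AI trained on this data.
print(df["race"].value_counts())
```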
Let's build our AI
How to start using Navigator
1. Go to https://aiclub.world/try-navigator
2. Follow the instructions in the video below to build the AI
Observe and Discuss
Try leaving all other fields the same and changing only the race field. Does the decile score change?
Which features is the algorithm most sensitive to? Try using Range Query to find out. For example, set an age range of 20 to 50, select all races, and see how the decile scores vary. A code sketch of this kind of sensitivity check appears below.
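The same "hold everything fixed, change one field" experiment can be sketched in code. The sketch below trains a simple model on the processed CSV and sweeps the race field for one fixed age and prior count. The filename, the column names (age, priors_count, race, decile_score), and the choice of a random forest are assumptions for illustration; they are not what Navigator actually uses.

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

df = pd.read_csv("compas.csv")               # assumed filename
features = ["age", "priors_count", "race"]   # assumed column names

# One-hot encode race so the model can use it as an input.
X = pd.get_dummies(df[features], columns=["race"])
y = df["decile_score"]

model = RandomForestRegressor(random_state=0)
model.fit(X, y)

# One hypothetical person: fix age and priors, then sweep the race field
# and see how the predicted decile score changes.
base = {"age": 30, "priors_count": 2}
for race in df["race"].dropna().unique():
    row = pd.DataFrame([{**base, "race": race}])
    row = pd.get_dummies(row, columns=["race"]).reindex(columns=X.columns, fill_value=0)
    print(race, round(model.predict(row)[0], 2))
```

If the predicted scores change noticeably when only the race field changes, that is the same kind of sensitivity you should be able to observe in Navigator's prediction page and Range Query.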