From f1d8a8816b53c3d1bf03a43ae52dd42bf9a2887f Mon Sep 17 00:00:00 2001
From: Jen Mankoff <jmankoff@cs.washington.edu>
Date: Mon, 28 Oct 2024 10:19:59 -0700
Subject: [PATCH] Fix week number and update assignment examples

---
 assignments/technology-implementation.md |  2 +-
 slides/bias-in-machine-learning.html     | 18 +++++++++---------
 2 files changed, 10 insertions(+), 10 deletions(-)

diff --git a/assignments/technology-implementation.md b/assignments/technology-implementation.md
index f9839239..fe8ba6c7 100644
--- a/assignments/technology-implementation.md
+++ b/assignments/technology-implementation.md
@@ -35,7 +35,7 @@ To complete this assignment, please do the following:
 You should identify a technology you have implemented *for general use* (i.e., not an accessibility technology). Ideally this should be an interactive technology (some kind of user interface, app, etc.). If you have no such thing, you can ask for approval to use a website or other static content that you have generated as an alternative.
 
 {% details Possible examples (this list is still under construction) %}
-- A class assignment that has an interface (web, mobile or other). If you've built a mobile app for 340, built a website for 154, or taken 331 (such as the campus map website) these are all things you could use.
+- A class assignment that has an interface (web, mobile, or other). If you've built a mobile app for 340, built a website for 154, or taken 331 (such as the campus map website) or 442 (if you made a website), these are all things you could use.
 - A web or mobile app you built yourself
 - A profile of yourself on LinkedIn, TikTok, or any other site where you've posted content.
 - Your personal website (even if it's hosted somewhere else like WordPress)
diff --git a/slides/bias-in-machine-learning.html b/slides/bias-in-machine-learning.html
index 69e6c546..0645c8ed 100644
--- a/slides/bias-in-machine-learning.html
+++ b/slides/bias-in-machine-learning.html
@@ -1,13 +1,13 @@
 ---
 layout: presentation
-title: AI Accessibility Bias  --Week 7--
+title: AI Accessibility Bias  --Week 6--
 description: Discussion of Bias in AI
 class: middle, center, inverse
 ---
 background-image: url(img/people.png)
 
 .left-column50[
-# Week 7: Automating Accessibility 
+# Week 6: Automating Accessibility 
 
 {{site.classnum}}, {{site.quarter}}
 ]
@@ -166,6 +166,12 @@ as disabled <q>based on a hunch</q>.
 
 - Make predictions
 
+---
+---
+# QUICK BREAK
+
+Good time to stand and stretch
+
 ---
 # How do we Evaluate Predictors/Predictions?
 
@@ -205,6 +211,7 @@ Examples:
 
 - **Make predictions**
 
+
 ---
 # Example: Resume study (1/2)
 
@@ -245,7 +252,6 @@ Awards and honors
 - Tried this with 6 “Disability” CVs [Disability, Blind, Deaf, Autism, Cerebral Palsy, Depression] vs. a CV Missing Disability Items 
 - Gave ChatGPT 10 tries per CV
 
----
 ---
 # What should have happened
 
@@ -256,12 +262,6 @@ Bar chart titled: “What should have happened”. The X axis shows number of ti
 
 What should have happened is that the CVs with the awards (the disability items), which are all prestigious, were ranked first 10 out of 10 times.
 
----
---- 
-# QUICK BREAK
-
-Good time to stand and stretch
-
 ---
 # What happened
 
-- 
GitLab