Coded Bias: Education Panelist Perspective

The film Coded Bias makes an excellent contribution to a dialogue that is far too limited in education. My comments and perspectives are based on my career as a K-12 teacher and my current work in post-secondary teacher education, with an interest in transforming teaching and learning. I would like to thank Shalini Kantayya and everyone involved in the filmmaking for provoking this much-needed dialogue to help guide the way forward as machine learning and artificial intelligence (AI) continue to evolve and impact all aspects of society. The film reminds us that societal biases can be encoded in algorithms unknowingly or unintentionally, leading to algorithmic bias, a problem that may not be easily detected. Algorithms can drive important decisions that affect people’s lives. As shown in the film, it is possible for an algorithm to provide an invalid assessment of an exemplary teacher, which can impact employment, retention or tenure. Similarly, invalid assessments of students can impact admissions, program advancement, assessments and decisions related to their academic conduct. What are the imperatives for education? For educators, for schools, for curricula? I would like to discuss three imperatives (I’m sure there are many more):

First, biases need to be critically examined. I often refer to the double-edged sword of innovation. With AI, for example, there can be extraordinary opportunities for improvement, such as increased efficiency, but there can also be significant consequences, such as the invasive surveillance shown in the film. Technology can be helpful and, at the same time, cause undue harm. AI can be developed for seemingly good purposes and with the intent to be harmless, not harmful. However, there can be insufficient attention to the biases in designs. In teaching we refer to teachers as designers of learning and recognize that each teacher has bias, each curriculum designer has bias, each curriculum has bias. The film demonstrates why it is important for designers in any field to analyze bias in their designs. Bias in designs needs to be critically analyzed and questioned from multiple perspectives; bias needs to be discovered and uncovered at the very early stages of the design process. Too often designers move from prototype to testing, or from draft curricula to pilot phases in education, without critically examining and limiting the biases.

A second imperative is to raise the expectations and standards for ethics in designs.

In education we need transparency and accountability for algorithms that have the potential to impact the overall advancement of individuals. There needs to be full disclosure of the algorithms, and there need to be regulations for their use. We need to question the ethics and raise the standards when using AI as the first step and first stop in making important decisions that have human impact. False positives can have a significant negative human impact.

A third imperative is to take responsibility and assume a role in protecting integrity. We all have a role and responsibility to protect the integrity of a meaningful world. In my role as an educator and scholar in education, and as academic coordinator for a graduate program called Leading and Learning in a Digital Age, I aim to design courses, conduct research, and continually interrogate and critically examine the implications of innovations in education. We need to anticipate, look for, and consider plausible consequences when designing learning or when testing or piloting any new inventions and innovations. As a society, how might we take action? How might we advance high standards for the technologies we use with learners, the technologies we develop for learning, the learning designs and the curricula used?

Three key imperatives resonated with me from an educational perspective as I viewed the film: the need to critically examine biases; the need to raise the expectations and standards for ethics in designs; and the need for all of us to take responsibility and assume a role in protecting the integrity of a meaningful world.

You may find the following related links interesting (shared by Dr. Lisa Silver, Faculty of Law, University of Calgary):

Federal Digital Charter: https://www.ic.gc.ca/eic/site/062.nsf/eng/h_00108.html

Law Commission of Ontario, The Rise and Fall of AI and Algorithms In American Criminal Justice: Lessons for Canada, (Toronto: October 2020)

Lisa Silver and Gideon Christian, “Harnessing the Power of AI Technology; A Commentary on the Law Commission of Ontario Report on AI and the Criminal Justice System” (November 18, 2020), online: ABlawg, http://ablawg.ca/wp-content/uploads/2020/11/Blog_LS_GC_LCO_Report.pdf (commenting on the LCO Report)

Recent privacy review of Clearview AI: Joint investigation of Clearview AI, Inc. by the Office of the Privacy Commissioner of Canada, the Commission d’accès à l’information du Québec, the Information and Privacy Commissioner for British Columbia, and the Information Privacy Commissioner of Alberta:

https://canlii.ca/t/jd55x

Ewert v. Canada, 2018 SCC 30 (CanLII), [2018] 2 SCR 165: https://canlii.ca/t/hshjz (bias in risk assessment tools)

Multiple reports on the issue from the AI Now Institute: https://ainowinstitute.org/reports.html

LAWNOW Magazine – Special report on Privacy: https://canlii.ca/t/sjpm

An accessible perspective: McSweeney’s Issue 54: The End of Trust (2018)

High School Redesign

Abstract: Researchers examined seven schools in Alberta undergoing high school redesign and removing the Carnegie Unit, a time-based metric for awarding course credits. A mixed methods convergent parallel design was used to gather data from leadership teams in the schools and to examine evidence of impact on student learning. Qualitative and quantitative data were analyzed concurrently and then merged for the analysis. Findings illustrate that removing the Carnegie Unit was a catalyst for redesign and learning improvements.  Five constitutive factors enable high school redesign, including a collective disposition as a learning community, a focus on relationship building, obtaining student input, collaboration, and making changes to learning tasks and assessment practices.  The findings provide insight into the ways in which leadership teams formed complex adaptive systems to enable change and may serve to inform practitioners and school leaders, schools and systems, and those who study policy changes in schools.

Brown, B., Alonso-Yanez, G., Friesen, S., & Jacobsen, M. (2020). High school redesign: Carnegie unit as a catalyst for change. Canadian Journal of Educational Administration and Policy (CJEAP), 193, 97-114. https://journalhosting.ucalgary.ca/index.php/cjeap/article/view/68066

Neutral Chair for Online Oral Defense

I recently served as a neutral chair for an online doctoral defense and thought it might be helpful to share my experience with others who serve as neutral chairs, or with graduate students and examiners who are wondering about the process for an online oral defense. I want to note that this may not be the process for all examining committees, but it may provide some ideas.

I connected to the meeting room (using Zoom) about 15 minutes prior to the start of the exam. When I arrived, the student and their supervisor were already in the virtual room and having a conversation. There was also a graduate program administrator who was there to make sure everyone could connect properly, and the student could share a slide presentation. Next, we discussed what would happen during the deliberation part of the exam and decided the student would go into a breakout room and then return to the main room after the deliberations. We tested this out to make sure the student could easily move to and from the breakout room. By this time all the examiners were present, and we were ready to begin the exam. The graduate program administrator logged out of the session and provided me with a contact number for any issues during the exam. I also provided my contact number to everyone in the event of any connectivity issues.


Example of an Oral Exam Sequence:

  1. Description of Process for Exam – I described the sequence of events that would take place during the exam (e.g., introductions, student presentation, two rounds of questioning, followed by an optional third round, and then our deliberations).
  2. Introductions – I called on each person, one at a time, to provide an introduction. Each of the examiners, the student, and myself (neutral chair) provided a brief introduction with name and role. This was a good opportunity to make sure all examiners and the student were turning their microphones on/off properly. I also intentionally made sure the student was not the last one to provide an introduction, as I wanted to give the student a break between providing the introduction and then moving to the presentation.
  3. Student Presentation – The student started by providing a presentation up to 15 min. in length. The student shared the presentation screen so we could all see the slides. I asked the examiners to mute their microphones during the presentation, with the option to turn off their video as well.
  4. First Round of Questioning – Following the presentation, we started the first round of questioning. Each examiner, starting with the most external first, asked a question. The student had up to 10 min. to respond to the question, and during that time frame the examiner could also ask follow-up questions. During the questioning I asked the examiner asking the question and the student to leave their video ON. However, I suggested the other examiners could turn OFF their video. This way, the student could focus on looking at one person on the screen instead of a gallery when answering the questions. I also indicated that I would turn my camera back ON closer to the 10 min. point as a visual cue, so the examiner would know it was time to wrap up their questioning for this round and reserve additional questions for the next round. This visual cue seemed to work quite well and kept the exam timeline on track.
  5. Break – After the first round of questioning, we took a five-minute break. I asked all the examiners and the student to mute their microphones and turn off their video. We agreed on the return time, and I asked everyone to turn ON their camera to indicate they were ready to start the second round.
  6. Second Round of Questioning – We repeated the same process as the first round of questioning. Once this round was complete, I offered the examiners an opportunity to ask any additional questions. I asked the examiners to let me know if they had any further questions so I could allocate the remaining time appropriately for the third round.
  7. Deliberation – After the rounds of questioning were complete, I explained that I would open the breakout room for the student. I explained to everyone that we would have deliberations and that, when we finished, there would be a message in the breakout room indicating the room would be closing. I set the breakout room to provide a 15-second transition back to the main room. When the student was ready and understood what would be happening, I opened the breakout room and the student moved into it. Visually, I could see the student was now in the room and only the examiners and myself (neutral chair) remained in the main room. During the deliberations, the examiners turned ON their videos and microphones. I explained the voting process and how the examiners could privately send me their examination results. At the conclusion, when all the examiners were ready, I explained that the student would be returning in about 15 seconds.
  8. Closing – The student returned to the main room and all the examiners turned on their videos/microphones and provided commentary and feedback. Once this was complete, I thanked everyone and closed the meeting room.

Online Assessment of Research Projects

I was interviewed about my assessment practices in online environments. You can read the blog post based on the interview.

Here are some additional reflections regarding my assessment practices in online environments:

What are your main ways of communicating with students when teaching online? How do you ensure that your communication of assessment expectations is clear?

I try to use a variety of ways to communicate and ensure assessment expectations are clarified.  I prepare a detailed syllabus with information about the learning tasks and assessment.  I also use our learning management system (LMS) to communicate with students and share information about assessment.  For example, I organize course materials prior to the commencement of the course and open the course so it’s accessible to students one or two weeks prior to the start date.  In the content area, I provide a section for each of the learning tasks with a detailed rubric clarifying the learning intentions. In addition, I prepare a video where I discuss each of the learning tasks and expectations so the students can review the video prior to or during the first week of class.  I send weekly and sometimes bi-weekly email messages to the students to help clarify expectations.  I also use the News items on the landing page for the course in the LMS to communicate with the whole class.

When providing students with individual feedback, I use email and send direct messages to the students. I try to do this early on and provide students with formative feedback so they know if they are on-track or need to make improvements to their work.  I also post messages to groups of students in the discussion forum to offer commentary and advice when students are working on learning tasks as a group.

Offering drop-in sessions for students is another strategy that I have found useful to help with specific parts of an assignment. For example, when I noticed students were not accessing current resources and were not making full use of the databases available to them as graduate students, I offered two drop-in sessions at different times. During the drop-in sessions I used our web conferencing system to share my desktop and talk through the steps for finding and assessing the quality of resources. I recorded the drop-in sessions and made them accessible in the LMS for students who were unable to join us. I also have a few videos pre-recorded and accessible on my YouTube playlist for students (Dr. Brown’s Playlist) and make these available in the LMS as well.

Another strategy I use to help clarify learning intentions is part of my instructional design: I organize peer-feedback loops. This provides students with an opportunity to share draft work with myself as instructor and with their peers. During synchronous sessions, I use breakout rooms and circulate through the rooms to offer my feedback. Asynchronously, I organize spaces in the discussion forum for students to share their work and offer feedback to each other. I also respond to the threads and provide my feedback. We all use the criteria in the rubric to offer feedback and ideas for improvement. I find the use of the rubric helps everyone give meaningful feedback that moves work forward for all.

In the following article, I discuss a strategy I use for offering feedback to students using different media:

Brown, B. (2019). One-take productions for student feedback. Education Canada, 59(2), p. 10. Retrieved from https://www.edcan.ca/articles/student-feedback/

What are the biggest challenges associated with giving students a research paper assessment online? How do you manage these challenges? 

In online courses, I find time is a challenge: time to create additional resources or to use multimedia as part of the assessment process. For example, I would like to dedicate more time to creating, revising and curating resources to support students when working on their assignments. Resources quickly become outdated as software and other systems are updated. One of the ways I manage this challenge is to lower my expectations for the production quality of the videos. As noted in my article, One-take productions for student feedback, it is not necessary to spend excess time editing and polishing videos. Students appreciate a conversational style and do not expect professional videos. Similarly with drop-in sessions: at first, I thought it was challenging to find time to organize and plan them. However, once I started and noted that a less structured, conversational approach was effective, I no longer viewed time as a barrier. I found drop-in sessions can be 30 minutes in duration and can help students with their specific questions.

Are there any differences in how you design/plan for online assessments compared to face-to-face assessments?  What additional factors do you have to consider?

I design/plan for assessments in a similar way in my face-to-face classes. Many of the strategies I use online, I have now incorporated in my face-to-face classes. One additional factor in online classes is that I do not see the individual students on a regular basis as I do in a face-to-face class. In online classes, I find myself communicating using written text more than I do in a face-to-face class. It is important to check in and make sure students are not misinterpreting feedback when relying on written text. This is one of the key reasons I started using more multimedia to communicate.

What other advice/tips/insights do you want to share with instructors who are new to teaching and assessing students online?

In addition to the tips I offer in the One-Take Productions article, I suggest dedicating time to instructional design. Prepare the syllabus as well as the online space in the LMS prior to the commencement of the course. Students appreciate accessing a well-organized online learning space to help clarify course expectations and criteria for assessment.

One Word Input

Mentimeter can be used to gather input from a group. When working with pre-service teachers and helping them design units in small groups, I asked the question “What is the most challenging part of this work?” Students were invited to provide up to three one-word responses using a link provided by Mentimeter, or by visiting www.menti.com and using the code provided.

Mentimeter also provides users with a link to show others the results for the question.

I was also able to share the results on my course page using the embed code provided. I believe this is a great way to visually gather input from students and to show the live results. I plan to use this again.

Check out the Mentimeter site for numerous examples of using this tool to engage audiences and for assessment.

Online Teaching Tip – Transparent Feedback Loops

Tip: Use the online discussion forum to incorporate transparent feedback loops into learning tasks to provide students with suggestions for improvement.

Sharing incomplete and draft work can be a regular and repeated process throughout a course. Students can be organized into small peer review groups (3-4) to share draft work using the online discussion forum in Desire2Learn. Draft work can be shared as an attachment or by inserting an external link (e.g., a Google document) in the message thread. It is also helpful for reviewers when students describe the type of feedback requested using the criteria outlined for the learning task.

In these small discussion groups, the feedback process is manageable and students can provide clear and specific feedback to a few peers in the class. As the instructor, I also review the draft work posted and provide feedback to students. My feedback might include a brief reply posted in the discussion forum or a more detailed response using track changes and comments. Additional feedback may be provided using email or by arranging a virtual meeting using Adobe Connect.

Overall, this transparent feedback strategy serves to: 1) provide students with peer and instructor feedback when there is still an opportunity to make changes before submitting the assignment for a grade; 2) clarify learning intentions and any misunderstandings about the criteria for the task; and 3) offer students an opportunity to review the type of feedback peers receive from other students and from the instructor. The following quote from one of my former students demonstrates the value in using transparent feedback loops: “Assessment practices supported my learning and showed me what my next steps are. Quick and helpful feedback was inspiring and exactly what I needed to stay engaged in the course.”

Note: This was also posted in the Teaching & Learning Newsletter, Werklund School of Education, University of Calgary, November 2015.