Two excellent resources made available to me are the interviews and presentations of Chris Gilliard and Ian Linkletter. Ian Linkletter is an Emerging Technology and Open Education Librarian at the British Columbia Institute of Technology. He is also a public-interest activist who has been raising awareness of SLAPP (strategic lawsuit against public participation) suits ever since he was sued by a monitoring-software company. Chris Gilliard is the co-director of the Critical Internet Studies Institute; he specializes in research on ‘luxury surveillance’ and has a book of the same name coming out this year.

Chris Gilliard
In his podcast-style talk, Chris Gilliard spoke about the growth of digital surveillance, both online and offline. One of the concepts he discussed was “luxury surveillance,” in which expensive, high-end technology, like Apple Watches and smart cars, acts as a high-tech security camera that constantly watches its owner. It is somewhat sinister how large companies spend millions of dollars on mass-psychology marketing campaigns to convince people that the products they sell are necessities and that owning them reflects well on your social standing. This is especially troubling when those products are then used to track and monitor the people who use them, and now there is a new push toward necklaces and pins that you wear at all times and that are always listening.

Ian Linkletter
Ian Linkletter specializes in the many ways digital surveillance is used to monitor students unethically. Digital proctoring is the use of software to watch for cheating while a student does schoolwork online. The proctoring software observes you as you work, tracking your head movements and checking whether you are still in frame. He covered how this software is often more punitive toward students who are neurodivergent or disabled, because their behavior is more likely to fit the vague definitions of ‘suspicious’ activity that the programs are built around. The proctoring software that scans a computer’s webcam feed to check whether a student’s face is present has also been known to discriminate, this time against people with darker complexions. Its advocates may claim that the software is ‘colorblind,’ and in one sense it quite literally is, but this does not change the fact that these unthinking programs still carry the pre-existing biases of the people who made them, as I pointed out in my blog post on Nodin Cutfeet’s talk.
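To make the face-presence check concrete, here is a minimal sketch of how such a check might work. This is my own illustration, not the actual code of any proctoring product: it uses OpenCV’s stock Haar-cascade face detector as a stand-in for whatever model a real vendor ships, and the “suspicion” threshold is an arbitrary number chosen for the example.

```python
# Illustrative sketch only: a toy face-presence check, NOT any vendor's actual code.
# Uses OpenCV's bundled Haar-cascade detector as a stand-in for a real model.
import cv2

# Load the stock frontal-face detector that ships with OpenCV.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def face_present(frame) -> bool:
    """Return True if at least one face is detected in the webcam frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return len(faces) > 0

capture = cv2.VideoCapture(0)  # default webcam
missed_frames = 0

while True:
    ok, frame = capture.read()
    if not ok:
        break
    if face_present(frame):
        missed_frames = 0
    else:
        # The program has no idea WHY no face was found: the student may have
        # leaned out of frame, the lighting may be poor, or the detector's
        # training data may simply not represent their face well. All of these
        # look identical to the software and count toward a "suspicion" flag.
        missed_frames += 1
    if missed_frames > 150:  # roughly 5 seconds at 30 fps; an arbitrary threshold
        print("Flagged: face not detected")
        missed_frames = 0

capture.release()
```

The point of the sketch is that the loop cannot tell why no face was found: a detector trained mostly on lighter-skinned faces, or a student who moves differently than the model expects, produces exactly the same flag as someone who walks away from the exam.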