Student Privacy @ Home
Week 37, 2021
Welcome to this week’s Appropriate Future, a roundup of news and info on AI, data, and algorithms, and their impact on the planet, humans, privacy, and public policy. In today’s edition: continued calls for regulating high-risk AI; privacy vs. pandemic data; and tracking schoolkids at home.
If you know anyone who might be interested in this topic, please share!
Ban the algorithms that threaten our human rights. The UN’s human rights chief, Michelle Bachelet, is pushing for a ban on AI applications that run contrary to international human rights law (specifically calling out applications in surveillance, discrimination, and biometrics). Until safeguards are in place to ensure the technology is used responsibly, governments should rein in artificial intelligence as a matter of urgency. ‣ zdnet.com
Technological progress does not have to come at the expense of safety, security, fairness, or transparency. In fact, embedding our values into technological development is central to our economic competitiveness and national security. Our federal government has the responsibility to work with private industry to ensure that we are able to maximize the benefits of AI technology for society while simultaneously managing its emerging risks.
The Path to Fairer AI Starts With Audits, Standards. Government needs to establish clear procedures for vetting high-risk AI systems for bias and discriminatory impacts, according to the report “Cracking Open the Black Box” from New America’s Open Technology Institute. ‣ govtech.com
Biden's FTC is adding a staunch critic of facial recognition tech. Alvaro Bedoya, a professor at Georgetown Law, previously served as a lawyer for a U.S. Senate subcommittee on tech privacy and is an outspoken critic of surveillance technology’s use in law enforcement. ‣ inputmag.com
How data, analytics, and AI power public health. The pandemic has put a spotlight on how big data and analytics technologies are being used in the public health sector. ‣ gcn.com
Pandemic tech left out public health experts.
“Singapore said, ‘We’re not going to use your data for other things.’ Then they changed it, and they’re using it for law enforcement purposes. And the app, which started out as voluntary, is now needed to get into office buildings, schools, and so on. There is no choice but for the government to know who you’re spending time with.”
‣ Tufts professor Susan Landau via technologyreview.com
An Inside Look at the Spy Tech That Followed Kids Home for Remote Learning — and Now Won’t Leave.
The data, gleaned from those 1,300 incident reports in the first six months of the crisis, highlight how Gaggle’s team of content moderators subjects children to relentless digital surveillance long after classes end for the day, including on weekends, holidays, late at night, and over the summer. In fact, only about a quarter of incidents were reported to district officials on school days between 8 a.m. and 4 p.m., bringing into sharp relief how the service extends schools’ authority far beyond their traditional powers to regulate student speech and behavior, including at home.