More than ever, technology is shaping, and being shaped by, public policy. This has an enormous impact, particularly for marginalized communities. The artificial intelligence and computer algorithms increasingly driving government and industry decisions—from the allocation of social services to hiring—reflect and reinforce social biases against women, people of color, and disabled people, among others. Global climate change is having a disproportionately negative impact on low- and middle-income countries, and on historically disadvantaged communities of color in the United States. Communities are increasingly concerned that they are not benefiting from government research funding, and that the regulation of emerging technologies is inadequate.
The interconnectedness of technology, policy, and equality raises crucial questions for scientists, technologists, and leaders in public policy, civil society, and industry. How can technology be built, implemented, and governed more equitably? How can the concerns of marginalized communities be better integrated into technology and related policies? How should community knowledge and concerns be combined with technical expertise and scientific evidence in the development of public policies?
This course aims to help learners understand how inequity and injustice can become embedded in technology, science, and associated policies, and how this can be addressed.
Combining real-world cases with scholarly insights, this course introduces learners to these challenges and offers tools for navigating them. You will learn about:
– The landscape of technology policymaking
– How technology and related policies both reflect and reinforce social values, biases, and politics
– The power and limitations of technology in solving social problems
– New ways to think about “experts” and “publics”
– The politics of innovation policy
The course is designed for people from diverse professional, advocacy, and academic backgrounds. No scientific, technical, or policy background is necessary.