The opening keynote at EDUCAUSE 2021 was delivered by Ruha Benjamin and was entitled Beyond Buzzwords: Innovation, Inequity and Imagination in the 21st Century (abstract).
Benjamin began the address with two images of robots. There are two stories of technology: one of apocalypse, where technology is out to destroy us, and one of utopia, where technology will save us. For the latter to come about, Benjamin proposes, we need to broaden whose imagination is included in these technologies, and that broadening needs to begin in higher education.
Structural inequities and discriminatory design can and do occur. Innovation brings inequities unless we are aware of them and take steps to avert them. Benjamin provided an example of park benches that are designed for sitting but do not allow for sleeping. Similar situations occur with the metering of healthcare and education (by which I assume Benjamin means pay-for-service options), solutions that do nothing to address the underlying social issues.
Image description: a park bench fitted with spikes; a user must insert coins to make the spikes retract. This is the work of design student Fabian Brunsing, calling out hostile architecture.
According to Benjamin, these obvious versus insidious spikes occur in technology as well. She gave the example of immigration data that is later used to discriminate against the very population it was meant to serve. Benjamin states that the 'do gooder' unwittingly enables harm by not involving the community in decisions about data collection and by not looking at the context or the history of the situation.
Technology is also coded to be discriminatory, as algorithms in proprietary systems inherit their creators' biases and prejudices. Benjamin refers to this as the New Jim Code: the insidious spikes that you may be completely unaware of. Examples included AI making discriminatory decisions about care, profiling by applicant name, and other decisions made about you based on your data. Benjamin showed a clip from Better Off Ted that exemplified how biased tech can arise inadvertently.
So what can be done? Acknowledge that bias exists. Practitioners and technologists should consider the social science, the history, and the context when they are creating a tool. Benjamin stressed that we all have a responsibility to read the fine print, to turn technology on its head, to question where the expertise lies, to create new patterns, and to move beyond tokenism and predatory inclusion.
Books:
Benjamin, R. (2013). People's science: Bodies and rights on the stem cell frontier. Stanford, California: Stanford University Press. https://casls-primo-prod.hosted.exlibrisgroup.com/permalink/f/cun44o/01CASLS_SPL_ALMA5138717020003480 Available at Saskatchewan Polytechnic.
Benjamin, R. (2020). Race after technology: Abolitionist tools for the New Jim Code. Polity. https://my.saskatoonlibrary.ca/sm/search/results?terms=ruha+benjamin&search_type=KW&format=all&language=all&region=&branch=&availability=&sort=MP&page=1 Available at the Saskatoon Public Library.