To: The CU Faculty Council Executive Committee
We’re writing to request that the University of Colorado System ban the use of facial recognition and detection technologies (referred to hereafter as FRT) by any CU institution, in all educational settings, online and in person.
FRT are invasive, biased, and violate the privacy of students, faculty, and staff. They exacerbate discrimination and put our safety at risk. Several studies have shown that FRT misidentify Black people and trans people at disproportionately high rates, and misidentify non-binary people 100% of the time. FRT systems often share data with law enforcement agencies, which have used FRT to target undocumented immigrants and to surveil protesters and communities of color. FRT systems can also sell or share biometric data with for-profit companies and federal agencies without the consent of the people whose faces are scanned.
Diversity, Inclusion, Equity, and Access is one of the four pillars of the University of Colorado System’s Strategic Plan. Banning FRT is a definitive and measurable outcome that supports this pillar and sends a strong message to our community that we care about equity. We ask that administrators work together with our coalition, as well as with IT leadership, to keep FRT off campuses, except for personal use (like unlocking a phone) or for ethically conducted research.
Signed,
CU Anschutz Medical Campus Student Senate
UCCS Student Government Association
United Campus Workers Colorado
CLAS Council on Diversity and Inclusion, CU Denver
Ethnic Diversity Committee, CU Denver
Office of Diversity and Inclusion, CU Denver | Anschutz
Center for Excellence in Teaching and Learning, CU Denver
College of Liberal Arts and Sciences Council, CU Denver
Digital Pedagogy Lab, CU Denver
Women & Gender Center, CU Denver
Boulder Faculty Assembly Diversity Committee, CU Boulder
Faculty Minority Affairs Committee, UCCS
Faculty Assembly Women's Committee, UCCS
LGBTQ+ Faculty Committee, CU Denver
Auraria Library Faculty, CU Denver
Libraries Faculty, CU Boulder
Philosophy Department, CU Denver
Economics Department, CU Denver
History Department, CU Denver
Sociology Department, CU Denver
CCTSI Community Engagement Core, CU Denver | Anschutz
Antiracist Advocacy and Action Committee, CU Denver | Anschutz
Media Archaeology Lab, CU Boulder
Shea Swauger, Senior Instructor, Auraria Library, CU Denver
Charlene Ortiz, Colorado School of Public Health, Dept. of Community & Behavioral Health
Amanda Beyer-Purvis, Office of Inclusive Excellence in STEM, CU Denver | Anschutz
David Mays, Professor, Civil Engineering, College of Engineering, Design Computing, CU Denver
Matthias L. Richter, Associate Professor, Asian Languages and Civilizations, CU Boulder
Jacquie Richardson, Department of Chemistry, CU Boulder
Wendy J Rauf, Community College of Denver
libi rose striegl, Manager, Media Archaeology Lab, CU Boulder
Lori Emerson, Associate Professor of English, Director of Intermedia Arts, Writing, and Performance, CU Boulder
Miles Huskey, MSUD alum, former CU Denver employee
Jenn Greiving, Instructor, University of Colorado Denver
Laurel Sindewald, PhD Candidate, Integrative and Systems Biology, CU Denver
Salim Lakhani, Assistant Professor, Dept. of Computer Science and Engineering, CU Denver
Carly Setterberg, MD Candidate, University of Colorado School of Medicine
Gordon Matthewson, MD Candidate, University of Colorado School of Medicine
Current CU technologies that use either facial detection or facial recognition:
Automated Proctoring Software such as Proctorio, Respondus, ProctorU, Examity, etc.
Resources about FRT and equity:
A US Government Study Confirms Most Face Recognition Systems Are Racist. Karen Hao, MIT Technology Review, December 2019. https://www.technologyreview.com/2019/12/20/79/ai-face-recognition-racist-us-government-nist-study/
Facial Recognition Software Has a Gender Problem. Lisa Marshall, October 8, 2019. https://www.colorado.edu/today/2019/10/08/facial-recognition-software-has-gender-problem
ICE Used Facial Recognition to Mine State Driver’s License Databases. Catie Edmondson, The New York Times, July 7, 2019. https://www.nytimes.com/2019/07/07/us/politics/ice-drivers-licenses-facial-recognition.html
Clearview’s Facial Recognition App Has Been Used by the Justice Department, ICE, Macy’s, Walmart, and the NBA. Caroline Haskins, Ryan Mac, Logan McDonald, BuzzFeed News, February 27, 2020. https://www.buzzfeednews.com/article/ryanmac/clearview-ai-fbi-ice-global-law-enforcement
Racist Facial Recognition Technology Is Being Used by Police at Anti-Racism Protests. Lucy Ingham, Verdict, June 5, 2020. https://www.verdict.co.uk/facial-recognition-technology-racist-police-protests/
Amazon “Stands in Solidarity” Against Police Racism While Selling Racist Tech to Police. Sam Biddle, The Intercept, June 3, 2020. https://theintercept.com/2020/06/03/amazon-police-racism-tech-black-lives-matter/
How White Engineers Built Racist Code – And Why It’s Dangerous for Black People. Ali Breland, The Guardian, December 4, 2017. https://www.theguardian.com/technology/2017/dec/04/racist-facial-recognition-white-coders-black-people-police
‘It’s Techno-racism’: Detroit Is Quietly Using Facial Recognition to Make Arrests. Tom Perkins, The Guardian, August 17, 2019. https://www.theguardian.com/us-news/2019/aug/16/its-techno-racism-detroit-is-quietly-using-facial-recognition-to-make-arrests
Amazon Face-Detection Technology Shows Gender and Racial Bias, Researchers Say. CBS News, January 25, 2019. https://www.cbsnews.com/news/amazon-face-detection-technology-shows-gender-racial-bias-researchers-say/
‘Racist’ Passport Photo System Rejects Image of a Young Black Man Despite Meeting Government Standards. James Cook, The Telegraph, September 19, 2019. https://www.telegraph.co.uk/technology/2019/09/19/racist-passport-photo-system-rejects-image-young-black-man-despite/
‘A White Mask Worked Better’: Why Algorithms Are Not Colour Blind. Ian Tucker, The Guardian, May 28, 2017. https://www.theguardian.com/technology/2017/may/28/joy-buolamwini-when-algorithms-are-racist-facial-recognition-bias
IBM Will No Longer Offer, Develop, or Research Facial Recognition Technology. Jay Peters, The Verge, June 8, 2020. https://www.theverge.com/platform/amp/2020/6/8/21284683/ibm-no-longer-general-purpose-facial-recognition-analysis-software
Our Bodies Encoded: Algorithmic Test Proctoring in Higher Education. Shea Swauger, Hybrid Pedagogy, April 2, 2020. https://hybridpedagogy.org/our-bodies-encoded-algorithmic-test-proctoring-in-higher-education/
Universities around the country have recognized the dangers FRT create and have banned their use on campus, including:
Boston University
Brown University
California Institute of Technology
Carnegie Mellon University
Colorado State University
Cornell University
Emory University
Georgetown University
Harvard University
Howard University
James Madison University
Johns Hopkins University
Kalamazoo College
Kent State University
Massachusetts Institute of Technology
Michigan State University
New College
New York University
Northwestern University
Oregon State University
Portland State University
Rice University
Sarah Lawrence College
Stanford University
Tufts University
Tulane University
University of California Berkeley
University of California Davis
University of California Irvine
University of California San Diego
University of California Santa Barbara
University of California Santa Cruz
University of California Los Angeles
University of Chicago
University of Florida
University of Maryland
University of Michigan
University of Minnesota
University of Missouri
University of North Carolina Chapel Hill
University of Pennsylvania
University of Rochester
University of San Francisco
University of Utah
University of Virginia
University of Washington
University of Wisconsin-Madison
Vanderbilt University
Wake Forest University
Washington University
Western Washington University
Yale University