{"id":91237,"date":"2021-08-31T14:05:44","date_gmt":"2021-08-31T18:05:44","guid":{"rendered":"https:\/\/cdt.org\/?post_type=insight&p=91237"},"modified":"2021-08-31T14:18:17","modified_gmt":"2021-08-31T18:18:17","slug":"protecting-student-privacy-and-ensuring-equitable-algorithmic-systems-in-education","status":"publish","type":"insight","link":"https:\/\/cdt.org\/insights\/protecting-student-privacy-and-ensuring-equitable-algorithmic-systems-in-education\/","title":{"rendered":"Protecting Student Privacy and Ensuring Equitable Algorithmic Systems in Education"},"content":{"rendered":"\n
Using student data responsibly is about more than adhering to legal requirements. It also requires schools and their partners to use data in ways that help students, and to guard against using technology to discriminate against, stigmatize, or otherwise harm students and their families.

As the technology and data used in education continue to evolve, discrimination and bias have taken on new forms, including in the use of algorithms. This summer, the Center for Democracy & Technology (CDT) submitted two sets of comments (here and here) to the U.S. Department of Education (ED), asking it to address the discriminatory effects of some algorithmic systems on marginalized groups of students.

An algorithm is a process performed by a computer to answer a question or carry out a task, such as sorting students into schools, analyzing social media posts, or flagging students at risk of dropping out. Algorithms, however, are not neutral decision-makers. Subjective human judgments shape an algorithm's purpose, design, and function, and in turn influence its outcomes. Moreover, the data used to train an algorithm may itself implicitly embed biases (a simplified, purely hypothetical illustration appears at the end of this post).

Algorithmic bias has very real effects on students, especially those in marginalized groups such as Black and LGBTQ students and students with disabilities.

To combat these harms, CDT is calling on ED to begin working to address the discriminatory effects of some algorithmic systems. Algorithmic bias may run afoul not only of the principles of responsible data use, but also of students' legal rights to non-discrimination under Title VI of the Civil Rights Act of 1964, Title IX of the Education Amendments of 1972, and Title II of the Americans with Disabilities Act. Those laws broadly protect students from discrimination due to their "race, color, [or] national origin," sex, gender identity, or disability status. Those protections apply not only to explicit discrimination but also to an "otherwise neutral policy or practice" that has a "disproportionate impact" due to race, sex, gender identity, or disability.

Because algorithmic systems are increasingly used throughout education and have the potential to benefit, as well as harm, students and families, it is important for ED to examine questions such as which types of algorithmic systems can have disparate impacts on marginalized students, what categories of training data can lead to discriminatory outcomes, and what mitigating steps can help reduce the potential for discrimination. Informed by research and fact finding, ED should consider providing resources for schools, creating guidance, and/or engaging in rulemaking to help detect, mitigate, and avoid algorithmic bias. The scope of any such guidance or rules should be appropriately tailored to the harms algorithmic systems pose.

CDT applauds the Department of Education's efforts to protect students' rights to non-discrimination and ethical data use, and we look forward to working with the Department to ensure all students have an opportunity for an equitable education.
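The points above about biased training data and disproportionate impact can be made concrete with a small, purely hypothetical sketch. Nothing below is drawn from CDT's comments or any real school system; the data, group labels, thresholds, and flagging rule are invented solely to illustrate two ideas: an algorithm trained on biased historical decisions will tend to reproduce those decisions, and a simple selection-rate comparison (such as the "four-fifths" screening heuristic) can help surface the resulting disparity.

```python
# Illustrative sketch only: a toy "dropout risk" flagging rule trained on
# hypothetical historical records in which Group B was flagged at lower
# absence counts than Group A for the same underlying behavior.

from collections import defaultdict

# Hypothetical historical records: (group, absences, was_flagged_by_staff)
history = [
    ("A", 2, False), ("A", 5, False), ("A", 8, True),  ("A", 10, True),
    ("B", 2, False), ("B", 5, True),  ("B", 8, True),  ("B", 10, True),
]

# "Train" a per-group threshold: the lowest absence count that was flagged.
# A real system would fit a statistical model, but the effect is similar:
# the rule learns the historical pattern, including its bias.
threshold = {}
for group, absences, flagged in history:
    if flagged:
        threshold[group] = min(absences, threshold.get(group, absences))

# Apply the learned rule to a new cohort with identical absence patterns.
new_cohort = [("A", a) for a in range(1, 11)] + [("B", a) for a in range(1, 11)]
flags, totals = defaultdict(int), defaultdict(int)
for group, absences in new_cohort:
    totals[group] += 1
    if absences >= threshold[group]:
        flags[group] += 1

rate = {g: flags[g] / totals[g] for g in totals}
print("Flag rate by group:", rate)

# A common screening heuristic (the "four-fifths rule"): if one group's
# selection rate is less than 80% of another's, the practice may have a
# disparate impact and warrants closer review.
ratio = min(rate.values()) / max(rate.values())
print("Selection-rate ratio:", round(ratio, 2), "-> review if below 0.8")
```

In this invented example, identical behavior yields different flag rates because the rule inherited the bias in its training data; the same kind of selection-rate comparison can be run against whatever group attributes and decisions a system records, which is one reason CDT asks ED to examine which categories of training data can lead to discriminatory outcomes and what mitigating steps can reduce them.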
<\/p>\n","protected":false},"featured_media":86101,"template":"","content_type":[7251],"area-of-focus":[834,10207,10204,78,833],"_links":{"self":[{"href":"https:\/\/cdt.org\/wp-json\/wp\/v2\/insight\/91237"}],"collection":[{"href":"https:\/\/cdt.org\/wp-json\/wp\/v2\/insight"}],"about":[{"href":"https:\/\/cdt.org\/wp-json\/wp\/v2\/types\/insight"}],"version-history":[{"count":6,"href":"https:\/\/cdt.org\/wp-json\/wp\/v2\/insight\/91237\/revisions"}],"predecessor-version":[{"id":91243,"href":"https:\/\/cdt.org\/wp-json\/wp\/v2\/insight\/91237\/revisions\/91243"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/cdt.org\/wp-json\/wp\/v2\/media\/86101"}],"wp:attachment":[{"href":"https:\/\/cdt.org\/wp-json\/wp\/v2\/media?parent=91237"}],"wp:term":[{"taxonomy":"content_type","embeddable":true,"href":"https:\/\/cdt.org\/wp-json\/wp\/v2\/content_type?post=91237"},{"taxonomy":"area-of-focus","embeddable":true,"href":"https:\/\/cdt.org\/wp-json\/wp\/v2\/area-of-focus?post=91237"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}