ACM FAccT Conference 2023




Hyatt Regency McCormick Place

2233 S Martin Luther King Dr

Chicago, IL 60616




Date: Monday, June 12 – Thursday, June 15

Time: 10:00 am – 6:30 pm CT

Algorithmic systems are being adopted in a growing number of contexts, fueled by big data. These systems filter, sort, score, recommend, personalize, and otherwise shape human experience, increasingly making or informing decisions with major impact on access to, e.g., credit, insurance, healthcare, parole, social security, and immigration. Although these systems may bring myriad benefits, they also carry inherent risks: codifying and entrenching biases, reducing accountability, and hindering due process. They also increase the information asymmetry between the individuals whose data feed these systems and the big players capable of inferring potentially relevant information from them.

ACM FAccT is an interdisciplinary conference dedicated to bringing together a diverse community of scholars from computer science, law, social sciences, and humanities to investigate and tackle issues in this emerging area. Research challenges are not limited to technological solutions regarding potential bias, but include the question of whether decisions should be outsourced to data- and code-driven computing systems. We particularly seek to evaluate technical solutions with respect to existing problems, reflecting upon their benefits and risks; to address pivotal questions about economic incentive structures, perverse implications, distribution of power, and redistribution of welfare; and to ground research on fairness, accountability, and transparency in existing legal requirements.

The sixth annual ACM FAccT conference will be held at the Hyatt Regency McCormick Place in Chicago from Monday, June 12 through Thursday, June 15, 2023. (Hotel rooms at the Hyatt can be booked at a special group rate.) The conference brings together researchers and practitioners interested in fairness, accountability, and transparency in socio-technical systems. It builds on the success of the 2022 conference held in Seoul, South Korea.

Our research community is multidisciplinary in its make-up and orientation. The conference organizers welcome your contributions and insights. If you have thoughts and suggestions about this year’s conference, please e-mail the ACM FAccT 2023 General Co-chairs Christina Harrington, Sarah Fox, Aziz Huq, and Chenhao Tan at [email protected].

From Research Insight to Policy Impact: How You Can Engage in Current AI Policy Debates

June 12 at 4:30-6 pm CT / 5:30-7 pm ET


Now is an incredibly active time for AI policy and regulation, and FAccT researchers have much to contribute. How can you plug in? This interactive session will provide an overview of the major AI policy-making efforts currently underway in the US and EU, and will offer guidance and build connections for researchers who want to get more involved. The session leads are advocates and former government staff working on policy efforts ranging from the AI Act in Europe to next steps for the AI Bill of Rights, the NIST AI Risk Management Framework, and various other agency and state initiatives in the U.S. Participants will leave the workshop with specific ideas for how they might engage in policy processes, along with tools, guidance, and relationships to act on those ideas. Attendees already working on these and other policy efforts are warmly welcome to share their own experiences and build connections.

Co-Design Perspectives on Algorithm Transparency Reporting: Guidelines and Prototypes

June 15 at 1:15-2:15 pm CT / 2:15-3:15 pm ET


Recommendation algorithms largely determine what people see on social media, yet users know little about how these algorithms work or what information they use to make their recommendations. What exactly should platforms share with users about recommendation algorithms that would be meaningful to them? Prior research has examined frameworks for algorithm explainability as well as design features across social media platforms that can contribute to transparency and accountability. We build on these efforts to explore what a recommendation algorithm transparency report might include and how it should present information to users. Through a human-centered co-design research process, we present the following results: (1) a set of guidelines for recommendation algorithm transparency reports; (2) initial suggestions, in the form of prototypes, for more engaging and interactive forms of transparency; and (3) an evaluation of these prototypes' strengths and weaknesses, along with areas of exploration for future work.

Register here.