Vanessa Aceves

The Project

Vanessa’s (she/her/hers) project will promote academic success and future independence for low-income students with complex communication needs by expanding access to much-needed augmentative and alternative communication (AAC) devices.

Upwards of 91,000 students in Illinois have complex communication needs and would benefit from communication devices that allow them to express themselves and fully participate in their education. Despite federal and state laws protecting students who require communication devices, Illinois school districts systematically fail to evaluate students appropriately and to train educators to implement these devices. For low-income families, school is often the only place they can turn to for access to a communication device for their children, given the barriers they face in accessing health care.

Fellowship Plans

During her Fellowship, Vanessa will provide direct legal assistance to families in both English and Spanish, including advocacy support, representation at Individualized Education Program (IEP) meetings, mediations, and administrative due process hearings. Vanessa will conduct community outreach trainings to build advocacy capacity and inform families of their legal rights to communication devices. She will also collaborate with community partners to develop a direct AAC referral network throughout Illinois.

As a lifelong Chicagoan and former CPS student, Vanessa is deeply passionate about quality public education for all. When she discovered special education law existed, she knew it was the path for her.

My Equal Justice Works Fellowship is a personal mission, based on my brother’s lack of access to critical communication devices. Through my project, I want to uplift the voices of similar students who have been denied opportunities to effectively communicate.

Vanessa Aceves /
2023 Equal Justice Works Fellow

The Project

Grant (he/him/his) investigates the automated decision-making systems used in government benefits programs and advocates for low-income individuals across the United States whose benefits have been unfairly reduced or eliminated because of algorithmic bias.

More than 37 million Americans live in poverty, and with rising inflation and the ongoing COVID-19 pandemic, many more rely on government benefits to keep themselves afloat. To meet rising needs, state and local agencies across the United States have turned to automated decision-making systems to make government benefits programs more efficient and effective. However, many of these systems are deeply flawed, exhibiting serious errors and biases that unfairly reduce or eliminate government benefits for those most in need: low-income communities of color. For the millions who rely on government assistance, these algorithmic errors can cause serious harm—often without impacted individuals ever knowing.

The son of an engineer, Grant has long believed that technology and justice go hand-in-hand. He has worked both within and outside government to fight algorithmic bias, surveillance, and corruption.

Fellowship Plans

During his Fellowship, Grant will file Freedom of Information Act (FOIA) requests to publicize how government agencies use automated decision-making systems and will challenge in court those agencies that fail to disclose information. Alongside organizations serving low-income communities, Grant will develop educational materials and support impacted individuals by filing amicus briefs. And to prevent future harm, Grant will push state agencies and legislatures alike to adopt A.I. guidelines that protect individuals from algorithmic harm.


Outsourced and Automated: How AI Companies Have Taken Over Government Decision Making

Too many families have struggled to stay afloat amid COVID-19, and they should be able to trust our government to support them. EPIC’s work ensures that agencies use artificial intelligence to help, not hurt, those in need.

Grant Fergusson /
2022 Equal Justice Works Fellow

The Project

Evelyn will provide pro se record sealing resources to low-income individuals in Harris County, Texas through an innovative, technology-driven tool.

The free, public self-help website, launched by The Beacon in August 2021 with funding from the Texas Access to Justice Foundation, allows anyone with a Harris County criminal record to determine their eligibility to seal that record through nondisclosure and to navigate the step-by-step process of self-filing in Harris County.

There are approximately 400,000 individuals with Harris County criminal records who are eligible to have all or part of their record sealed. This Fellowship project is focused on connecting with those individuals whose records are preventing them from accessing housing, employment, and educational opportunities. In doing so, the project seeks to address the racial and economic disparities reflected in the criminal justice system.

Fellowship Plans

Evelyn will help users navigate the pro se process, and she will track and troubleshoot common barriers to using the site. She will provide direct representation for ancillary civil legal issues to wholly remove individuals’ legal obstacles to employment and housing. Through case manager workshops and clinics, she will also help community organizations and pro bono attorneys assist users in completing the forms for filing. The long-term goals of the project are to increase the number of pro se filings and to work with courts and government entities to improve procedures for pro se filers in Harris County.

For people who do not have much, a criminal record becomes the weight that keeps them from accessing all the basic things that help one thrive in life: an education, a good job, and a place to live. By removing legal barriers, this project creates opportunity and gives people in need a chance to overcome other obstacles in their lives.

Evelyn Garcia Lopez /
2022 Equal Justice Works Fellow

The Project

Ben investigated and published information about automated decision-making systems used in high-risk government services throughout the criminal justice cycle. He educated the public, advocates, and legislators working to address and combat the biases inherent in both the underlying data and the algorithms used in that cycle.

The increasing reliance on data and algorithms to make decisions about the length and severity of punishment, among other important determinations, is an underappreciated trend in the criminal justice system today. One example is the algorithm used to estimate recidivism risk and set bail, commonly referred to as a “risk assessment,” which has been shown to have disparate impacts on people of color. Other algorithms are used to determine eligibility for government benefits and more. Yet despite the increasingly significant role these algorithms play in our justice system, they operate largely in a black box. Bringing them to light and instituting proper accountability and testing procedures will be essential to curbing the disparate impact these systems are having on underrepresented and over-incarcerated communities.

Fellowship Highlights

During the two-year Fellowship, Ben has:

  • Testified in support of a bill establishing transparency and accountability in government procurement of automated decision-making systems before the Washington State Legislature and submitted written testimony to the Massachusetts State Legislature
  • Published a report, Liberty At Risk, featuring significant FOIA documents, legal analysis, and recommendations around the use of pretrial risk assessments
  • Worked with government agencies and other organizations to understand and strategize about the use and impacts of automated decision-making systems
  • Published and maintained web pages highlighting open government work, legal analysis, and critical educational context

Next Steps

Ben will transition to a Counsel role at EPIC, where he will continue similar work leading AI and human rights efforts both within and outside the criminal legal context.


AI legislation must address bias in algorithmic decision-making systems

In California, voters must choose between cash bail and algorithms

An Algorithm That Grants Freedom, or Takes It Away

Algorithms Were Supposed to Fix the Bail System. They Haven't

Going back to work or school? An algorithm may warn you to keep your distance from others

Technology Adoption Around the Criminal Justice System is a Tightrope

The capabilities that algorithms have to improve impartiality and efficiency within the courts and policing are vast and exciting—but can’t come at the cost of equality, transparency, and understanding in order to mitigate the perpetuation of inequitable incarceration.

Ben Winters /
Equal Justice Works Fellow

The Project

Margaret’s project focused on providing direct Freedom of Information Act (FOIA) assistance to nonprofit organizations and systematically analyzing the most pressing FOIA needs of the nonprofit community in order to strategically litigate FOIA cases. Coordinating these activities through a newly created Public Interest FOIA Clinic at PCLG and using a web-based interactive feedback tool, Margaret helped nonprofit organizations achieve their goals of assisting underserved communities.