
Resources

The MCDC works with the following facilities, centers, and resources to facilitate connections between community organizations and the Metropolitan Chicago Data Science Corps:

Further resources will be added over time. If you have ideas, please contact us!

Chicago-relevant data sources

Collections of data and more general sources

  • ChiVes is a data collaborative and community mapping application that brings data on Chicago’s environment together at the neighborhood level.

  • Chicago Metropolitan Agency for Planning

  • Chi Hack Night is Chicago’s weekly event to build, share, and learn about civic tech.

  • GitHub: many public repositories host data that has already been pulled and compiled (for example, census or voting data), so much of the collection work is already done.

  • US Census data
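
For projects that need Census figures directly, the Census Bureau exposes its data through a simple HTTP API. The sketch below builds a request URL for ACS 5-year estimates by county in Illinois; the variable code `B01001_001E` (total population) and the Illinois state FIPS code `17` are illustrative choices, and you should look up the variables your project actually needs at data.census.gov.

```python
from urllib.parse import urlencode

def build_census_url(year=2020, variables=("NAME", "B01001_001E"),
                     state_fips="17"):
    """Build a Census Bureau API URL for ACS 5-year county-level estimates."""
    base = f"https://api.census.gov/data/{year}/acs/acs5"
    params = {
        "get": ",".join(variables),      # which variables to return
        "for": "county:*",               # every county...
        "in": f"state:{state_fips}",     # ...within the chosen state (17 = IL)
    }
    return f"{base}?{urlencode(params)}"

print(build_census_url())
```

Requesting the resulting URL returns a JSON array of rows (header row first), which loads cleanly into a dataframe for analysis.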

Air Quality

  • EPA-sponsored air quality measurements

    • AirNow: real-time air quality data with a focus on usability. Easy to access, but offers less detail for historical data.

    • AQS: official archived data. More detailed, but requires API access.

  • Microsoft-sponsored air quality project (beta).

  • PurpleAir air quality data.
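
The two EPA sources above differ mainly in how you reach them. As a minimal sketch, the function below builds a request URL for AirNow's observations-by-ZIP endpoint; the endpoint path and parameter names follow AirNow's public API documentation, but verify them (and register for a free key) at docs.airnowapi.org before relying on this, as they are assumptions here rather than tested values.

```python
from urllib.parse import urlencode

def airnow_url(zip_code="60601", api_key="YOUR_KEY", distance_miles=25):
    """Build a request URL for current AirNow observations near a ZIP code.

    The ZIP code (downtown Chicago) and key placeholder are illustrative.
    """
    base = "https://www.airnowapi.org/aq/observation/zipCode/current/"
    params = {
        "format": "application/json",
        "zipCode": zip_code,
        "distance": distance_miles,  # search radius if the ZIP has no monitor
        "API_KEY": api_key,
    }
    return f"{base}?{urlencode(params)}"

print(airnow_url())
```

The same URL-building pattern applies to the AQS API, which additionally requires registering an email/key pair; fetching the URL (e.g., with `urllib.request` or `requests`) returns JSON records of monitor readings.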

Traffic data

Health

Environment

Social

  • AI Now Institute. The AI Now Institute, housed within New York University, is an interdisciplinary research center analyzing the social implications of artificial intelligence. It focuses on four core domains: rights and liberties, labor and automation, bias and inclusion, and safety and critical infrastructure. The institute hosts workshops, and its website collects its publications.

  • Algorithmic Justice League. The Algorithmic Justice League is a collective that aims to highlight algorithmic bias, provide space for people to voice concerns and experiences with coded bias, and develop practices for accountability during the design, development, and deployment of coded systems.

  • Data for Black Lives. A group of activists, organizers, and mathematicians committed to using data science to create concrete and measurable change in the lives of Black people. Data for Black Lives seeks to mobilize scientists around racial justice issues.

  • Fairness, Accountability, and Transparency in Machine Learning (FAT/ML). FAT/ML is an organization that has hosted various events over the past five years discussing the impacts and practices surrounding bias in big data. Its website includes recommended principles and best practices, a schedule of planned events, and information about the organizers’ current projects to advance fairness, accountability, and transparency in data science and machine learning.

  • The Fenway Institute’s LGBT Population Health Program. The Fenway Institute of Fenway Health in Boston houses the LGBT Population Health Program, which develops and supports collaborative research and education programs to understand and improve the health of sexual and gender minorities. Findings and further details on this work are available on the program’s website.

  • National LGBTQ Task Force. “We’re building a future where everyone is free to be themselves in every aspect of their lives. Today, despite all the progress we’ve made to end discrimination, millions of LGBTQ people face barriers in every aspect of their lives: in housing, employment, healthcare, retirement, and basic human rights. These barriers must go. That’s why the Task Force is training and mobilizing millions of activists across our nation to deliver a world where you can be you.”

  • The Williams Institute (Census and LGBT Demographic Studies). The Williams Institute at the UCLA School of Law houses a wealth of studies on topics within the LGBT community, including LGBT health, data collection, the impact of legislation, incarceration rates, hunger, and socioeconomic wellbeing.

 

Books

  • Design Justice: Community-Led Practices to Build the Worlds We Need. This book explores the theory and practice of design justice, demonstrates how universalist design principles and practices erase certain groups of people—specifically, those who are intersectionally disadvantaged or multiply burdened under the matrix of domination (white supremacist heteropatriarchy, ableism, capitalism, and settler colonialism)—and invites readers to “build a better world, a world where many worlds fit; linked worlds of collective liberation and ecological sustainability.” Along the way, the book documents a multitude of real-world community-led design practices, each grounded in a particular social movement. Design Justice goes beyond recent calls for design for good, user-centered design, and employment diversity in the technology and design professions; it connects design to larger struggles for collective liberation and ecological survival.
  • Data Feminism. Data Feminism hosts a community review site where readers can access the work and contribute their knowledge to the project. The project draws on experience-based testimonies as well as contributions from scholars and students across an array of fields. The community review site currently contains eight chapters, an introduction, a conclusion, and various other readings that discuss the impacts of data bias. D’Ignazio, C., & Klein, L. (n.d.). Data Feminism · MIT Press Open. https://bookbook.pubpub.org/data-feminism
  • Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy We live in the age of the algorithm. Increasingly, the decisions that affect our lives–where we go to school, whether we can get a job or a loan, how much we pay for health insurance–are being made not by humans, but by machines. In theory, this should lead to greater fairness: Everyone is judged according to the same rules. Mathematician and data scientist Cathy O’Neil reveals that the mathematical models being used today are unregulated and uncontestable, even when they’re wrong. Most troubling, they reinforce discrimination–propping up the lucky, punishing the downtrodden, and undermining our democracy in the process.
  • Algorithms of Oppression: How Search Engines Reinforce Racism Safiya Umoja Noble challenges the idea that search engines like Google offer an equal playing field for all forms of ideas, identities, and activities. Data discrimination is a real social problem; Noble argues that the combination of private interests in promoting certain sites, along with the monopoly status of a relatively small number of Internet search engines, leads to a biased set of search algorithms that privilege whiteness and discriminate against people of color, specifically women of color. Through an analysis of textual and media searches as well as extensive research on paid online advertising, Noble exposes a culture of racism and sexism in the way discoverability is created online. As search engines and their related companies grow in importance—operating as a source for email, a major vehicle for primary and secondary school learning, and beyond—understanding and reversing these disquieting trends and discriminatory practices is of utmost importance.
  • Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor Automating Inequality systematically investigates the impacts of data mining, policy algorithms, and predictive risk models on poor and working-class people in America. The book is full of heart-wrenching and eye-opening stories, from a woman in Indiana whose benefits are literally cut off as she lays dying to a family in Pennsylvania in daily fear of losing their daughter because they fit a certain statistical profile.
  • Race After Technology: Abolitionist Tools for the New Jim Code From everyday apps to complex algorithms, Ruha Benjamin cuts through tech-industry hype to understand how emerging technologies can reinforce White supremacy and deepen social inequity. Benjamin argues that automation, far from being a sinister story of racist programmers scheming on the dark web, has the potential to hide, speed up, and deepen discrimination while appearing neutral and even benevolent when compared to the racism of a previous era. Presenting the concept of the “New Jim Code,” she shows how a range of discriminatory designs encode inequity by explicitly amplifying racial hierarchies; by ignoring but thereby replicating social divisions; or by aiming to fix racial bias but ultimately doing quite the opposite. Moreover, she makes a compelling case for race itself as a kind of technology, designed to stratify and sanctify social injustice in the architecture of everyday life. This illuminating guide provides conceptual tools for decoding tech promises with sociologically informed skepticism. In doing so, it challenges us to question not only the technologies we are sold but also the ones we ourselves manufacture.
  • Data Action: Using Data for Public Good Big data can be used for good—from tracking disease to exposing human rights violations—and for bad: implementing surveillance and control. Data inevitably represents the ideologies of those who control its use; data analytics and algorithms too often exclude women, the poor, and ethnic groups. In Data Action, Sarah Williams provides a guide for working with data in more ethical and responsible ways. Williams outlines a method that emphasizes collaboration among data scientists, policy experts, data designers, and the public. The approach generates policy debates, influences civic decisions, and informs design to help ensure that the voices of people represented in the data are neither marginalized nor left unheard.

Papers

  • Bias in Big Data 2019 Workshop White Paper. The Bias in Big Data 2019 Workshop White Paper is a living record of the Bias in Big Data 2019 Workshop held at Northwestern University, for those who want to learn and do more to challenge bias in big data and data science. The workshop sought to stimulate intersectional discussion about the role of bias in big data and to explore, in particular, how bias in data and data science impacts the health of sexual and gender minority populations. The workshop was also intended as a space for academic researchers and data scientists to become engaged with the data justice movement led by Data for Black Lives.
  • Datasheets for Datasets. Abstract: The machine learning community currently has no standardized process for documenting datasets, which can lead to severe consequences in high-stakes domains. To address this gap, we propose datasheets for datasets. In the electronics industry, every component, no matter how simple or complex, is accompanied with a datasheet that describes its operating characteristics, test results, recommended uses, and other information. By analogy, we propose that every dataset be accompanied with a datasheet that documents its motivation, composition, collection process, recommended uses, and so on. Datasheets for datasets will facilitate better communication between dataset creators and dataset consumers, and encourage the machine learning community to prioritize transparency and accountability.
  • Datasheets for Datasets help ML engineers notice and understand ethical issues in training data. This Medium article describes testing of the Datasheets for Datasets intervention for addressing ethical problems in training data. These files accompany datasets to help engineers “notice potential ethical issues in unfamiliar training data by documenting the dataset’s context.”

Talks

  • Design Justice, Dr. Sasha Costanza-Chock
    • CONNECT is focused on using a systems-science approach to understand the complex mechanisms that drive the health disparities of stigmatized populations, in particular gender and sexual minorities. As part of this mission, CONNECT invited Dr. Sasha Costanza-Chock to give a talk on Design Justice on 8/5/21.
    • Dr. Sasha Costanza-Chock (they/them or she/her) is a researcher and designer who works to support community-led processes that build shared power, move toward collective liberation, and advance ecological survival. This one-hour talk, followed by a 30-minute Q&A session, focused on their work on networked social movements, transformative media organizing, and design justice.

 

  • Bias in Big Data, Prof. Michelle Birkett
    • CONNECT is focused on using a systems-science approach to understand the complex mechanisms that drive the health disparities of stigmatized populations, in particular gender and sexual minorities. A systems approach is important when thinking about health disparities because stigma isn’t just expressed through discrete interactions between people; it is structural, meaning it is baked into all aspects of life: not just the victimization we experience, but where we live, who surrounds us, and what opportunities we have access to. Many of the privileges and barriers individuals experience on a day-to-day basis are ones they are likely not even consciously aware of. If we want to understand stigma, then, we can’t just talk to individuals; we need to understand the system around individuals that shapes their health and well-being. As part of this work, the group has tried to promote interdisciplinary discussion about the role of bias in big data and to explore, in particular, how bias in data and data science impacts the health of racial, sexual, and gender minority populations. This lecture is derived from Dr. Birkett’s talk at the 2019 Bias in Big Data workshop, a free, half-day event CONNECT organized on big data, bias, and health justice. The lecture introduces the many ways in which bias can infiltrate science, research, and data science. CONNECT Research Program: https://isgmh.northwestern.edu/center… Bias in Big Data White Paper: https://doi.org/10.21985/n2-kax9-ew70