Environmental Reporting with Sensors

Sensor journalism uses sensors to collect information about our environment. It opens new possibilities for journalists, enabling them to collect and process data that might otherwise be unavailable, or available only at a coarser level of detail.

Technology for building sensors to collect environmental data has become affordable and extensible thanks to platforms like Arduino and Raspberry Pi. Sensor data can include, but is not limited to, information about water quality, air quality, temperature, sound level, wind, and soil. This project applies those tools to measure particulate matter in the air at different places on campus; the full project plan is laid out in detail below.
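
To give a sense of what the hardware side can look like, here is a minimal Python sketch of reading a serial particulate-matter sensor from a Raspberry Pi. The port path and frame layout follow the widely used Plantower PMS5003 sensor's published protocol; that specific sensor and wiring are assumptions for illustration, not a record of the parts this project actually used.

```python
# Minimal sketch: read particulate-matter values over serial on a Raspberry Pi.
# Assumes a Plantower PMS5003-style sensor on the Pi's UART; the 32-byte frame
# layout below follows that sensor's published protocol.
import struct

import serial  # pyserial

PORT = "/dev/serial0"  # assumption: default Pi UART; adjust for your wiring
BAUD = 9600


def read_frame(ser):
    """Return (pm1_0, pm2_5, pm10) in ug/m3 from one sensor frame, or None."""
    # Sync on the 0x42 0x4D ("BM") start-of-frame marker.
    if ser.read(1) != b"\x42" or ser.read(1) != b"\x4d":
        return None
    body = ser.read(30)  # frame length + 13 data words + checksum
    if len(body) != 30:
        return None
    words = struct.unpack(">15H", body)  # big-endian 16-bit fields
    # words[0] is the frame length; data fields 4-6 are the "atmospheric
    # environment" PM1.0 / PM2.5 / PM10 concentrations.
    return words[4], words[5], words[6]


if __name__ == "__main__":
    with serial.Serial(PORT, BAUD, timeout=2) as ser:
        reading = None
        while reading is None:  # keep polling until a full frame arrives
            reading = read_frame(ser)
        print("PM1.0 %d  PM2.5 %d  PM10 %d (ug/m3)" % reading)
```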

Sensor journalism is not new; many news organizations have been using sensors for powerful storytelling for years. Recent examples include The Houston Chronicle’s investigative story “In Harm’s Way,” which used sensors to examine air quality in neighborhoods near oil refineries and factories. The investigation identified poor air quality in the area even though the factories and refineries were operating within state and federal legal limits. USA Today’s “Ghost Factories” series examined soil contamination from old metal factories. The reporters used X-ray gun sensors to scan the soil, and their analysis showed that arsenic and lead levels at several sites exceeded the EPA’s limits.

Media Shift

Sensor journalism could have the most impact in environmental reporting. In the current political climate, journalists cannot assume that government agencies will continue collecting data on the environment. The data that is openly available to the public and to journalists can be limited, but affordable technology and the journalistic process can help fill in the gaps to inform the public and influence public policy.

This project will measure the air quality of different places on campus. Sensors will be built that measure particulate matter in the air and report their readings at frequent intervals. The data will be collected either over our experimental long-range, point-to-point LoRa transmitters or through systematic collection, and will be logged to a database. Using the data collected, students will visualize air quality across campus as well as how it fluctuates throughout the day. This project is meant to be a prototype of an end-to-end reporting process that uses sensors to collect data and then synthesizes and visualizes that data for an audience; a sketch of the logging step appears below. Part of this project will include documenting the build process so that others can replicate our sensors for data reporting.
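
As an illustration of the "log at intervals, then summarize" step described above, here is a hedged Python sketch that stores readings in a local SQLite database and computes hourly averages for a daily-fluctuation chart. The read_pm25() stub, database path, and location label are placeholders, not part of the project's actual tooling.

```python
# Sketch of the logging half of the pipeline: take a PM2.5 reading at a fixed
# interval, store it with a timestamp and location label, and later summarize
# readings by hour of day for a "fluctuation throughout the day" chart.
import sqlite3
import time
from datetime import datetime

DB_PATH = "air_quality.db"   # placeholder path
LOCATION = "library-roof"    # placeholder sensor-location label
INTERVAL_SECONDS = 300       # one reading every five minutes


def read_pm25():
    """Placeholder for an actual reading (serial sensor, LoRa packet, etc.)."""
    raise NotImplementedError("wire this to the particulate sensor")


def log_readings(db_path=DB_PATH):
    """Append timestamped readings to a local SQLite table, forever."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS readings (ts TEXT, location TEXT, pm25 REAL)"
    )
    while True:
        conn.execute(
            "INSERT INTO readings VALUES (?, ?, ?)",
            (datetime.now().isoformat(), LOCATION, read_pm25()),
        )
        conn.commit()
        time.sleep(INTERVAL_SECONDS)


def hourly_averages(db_path=DB_PATH):
    """Average PM2.5 by hour of day, the basis for a daily-fluctuation chart."""
    conn = sqlite3.connect(db_path)
    return conn.execute(
        "SELECT strftime('%H', ts) AS hour, AVG(pm25) FROM readings "
        "GROUP BY hour ORDER BY hour"
    ).fetchall()
```

The hourly averages returned by hourly_averages() can feed whatever charting tool students prefer for visualizing how air quality shifts over the course of a day.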

Students should be exposed to the idea that they can go out and generate their own data for reporting. Many engineering students would like more experience working at the hardware level, interacting with and building their own devices. These skills will distinguish our students and empower them with capabilities that are in demand and very much needed in journalism.

Results

Results from the project:

  • Democratizing data reporting from the ground up

    Building a dust sensor network

    As the Trump administration continues to curtail funding for longstanding federal agencies like the EPA, citizens have taken it upon themselves to gather and contextualize data formerly compiled by those agencies. Scientists, academics, and individuals have banded together to hold [save the data](https://www.datarefuge.org/) events to protect existing environmental data for fear it may be deleted or made inaccessible. This trend raises the question: Where will future data come from?...

Faculty and Staff Leads

Scott Bradley

Senior Engineer

With engineering and computer science degrees from Purdue and DePaul, Scott focuses on AI, NLP & web solutions for science and media.

Zach Wise

Associate Professor

Emmy-winning interactive producer and Associate Professor @NorthwesternU, @KnightLab. Formerly of The New York Times. Creator of TimelineJS and StoryMapJS.

Students

Holly Kane

Victoria Cabales

David Wallach

Rachel Inderhees