What if we created machine-readable policies that connect to state-of-the-art evidence on visual and noise pollution, enabling dynamic, outcomes-based regulation? We increasingly understand how disturbances in the sensory experience of the city (e.g. noise pollution, blue lights, visual noise) influence human cognitive function and psychological wellbeing. We could create place-based experiments and regulatory sandboxes that collaboratively explore alternatives to this particularly insidious form of pollution.
Sounds of New York City (USA)
Citizen scientists help address NYC's noise problem by annotating audio clips recorded by SONYC's sensor network. These labeled urban soundscapes are used to train machine learning models that identify, and ultimately help mitigate, sources of noise pollution.
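The labeled-clips-to-trained-classifier idea can be sketched in a few lines. This is a deliberately minimal toy, not SONYC's actual pipeline (which involves sensor networks and far richer models): the synthetic "siren" and "broadband" clips, the two hand-crafted features, and the nearest-centroid rule are all invented here purely to illustrate how citizen-provided labels turn raw audio into a classifier.

```python
import numpy as np

def features(clip, sr=8000):
    """Two toy features per clip: RMS energy and spectral centroid (Hz)."""
    rms = np.sqrt(np.mean(clip ** 2))
    spectrum = np.abs(np.fft.rfft(clip))
    freqs = np.fft.rfftfreq(len(clip), d=1 / sr)
    centroid = np.sum(freqs * spectrum) / np.sum(spectrum)
    return np.array([rms, centroid])

rng = np.random.default_rng(0)
sr, n = 8000, 8000  # 1-second clips at 8 kHz

def tone_clip():
    # Narrowband, "siren-like" signal: a 700 Hz tone plus light noise.
    t = np.arange(n) / sr
    return np.sin(2 * np.pi * 700 * t) + 0.05 * rng.standard_normal(n)

def noise_clip():
    # Broadband, "jackhammer-like" signal: white noise.
    return rng.standard_normal(n)

# Labeled training set, standing in for citizen-annotated clips.
train = [(tone_clip(), "siren") for _ in range(5)] + \
        [(noise_clip(), "broadband") for _ in range(5)]

# Train a nearest-centroid classifier: one mean feature vector per label.
centroids = {}
for label in ("siren", "broadband"):
    feats = np.array([features(c) for c, lab in train if lab == label])
    centroids[label] = feats.mean(axis=0)

def classify(clip):
    """Assign a clip the label of the closest centroid in feature space."""
    f = features(clip)
    return min(centroids, key=lambda lab: np.linalg.norm(f - centroids[lab]))

print(classify(tone_clip()))   # siren
print(classify(noise_clip()))  # broadband
```

The design point this illustrates: the classifier is only as good as the human labels it is trained on, which is exactly why SONYC treats citizen annotation as a core part of the system rather than an afterthought.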