We develop simulation systems to predict and quantify the potential impact of disruptive interventions within new and existing cities. We place a special emphasis on augmented reality decision support systems (ARDSS) that facilitate non-expert stakeholder collaboration within complex urban environments. Such systems blend hardware, software, human interface design, cloud computation, and variants of so-called big data. "CityScope" is an open source platform for shared, interactive computation.
Contact the Developer: Ira Winder, Research Scientist
CityScope Featured in White House Report
The President’s Council of Advisors on Science and Technology (PCAST) recently published a report to the President, "Technology and the Future of Cities," that featured the CityScope project developed by Ira Winder.
We're always busy working with partners to bring our prototypes to reality.
As part of a cooperation between MIT and HCU, the first urban model of this kind was built in an interdisciplinary student workshop using Lego bricks as well as traditional materials. On a surface of 4 square meters, the model displays parts of Hamburg-Rothenburgsort, currently a prospective urban development area.
By placing optically tagged Lego bricks representing different building types on the CityScope, changes to the city are revealed in real time through color codes projected onto the pieces. Different variables, such as walkability and access to jobs, housing, or open space, can thus be estimated at a glance. Configurations and data from the Hamburg model are uploaded to our cloud server and are viewable from the web.
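The feedback loop described above can be sketched in code. The following is an illustrative example only, not the CityScope implementation: it assumes a hypothetical mapping from decoded optical tag IDs to land-use types, and computes a simple grid-based walkability proxy (the fraction of housing cells within a few steps of open space).

```python
# Illustrative sketch (assumptions: tag-ID mapping, 4-neighbour grid metric).
from collections import deque

# Hypothetical mapping from decoded optical tag IDs to land-use types.
TAG_TO_USE = {0: "housing", 1: "jobs", 2: "open_space", 3: "empty"}

def decode_grid(tag_ids):
    """Map a 2-D grid of decoded tag IDs to land-use labels."""
    return [[TAG_TO_USE.get(t, "empty") for t in row] for row in tag_ids]

def walkability(grid, target="open_space", max_steps=3):
    """Fraction of housing cells within max_steps (4-neighbour) of a target use."""
    rows, cols = len(grid), len(grid[0])
    targets = [(r, c) for r in range(rows) for c in range(cols)
               if grid[r][c] == target]
    # Multi-source BFS: distance of every cell to the nearest target cell.
    dist = {t: 0 for t in targets}
    queue = deque(targets)
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in dist:
                dist[(nr, nc)] = dist[(r, c)] + 1
                queue.append((nr, nc))
    housing = [(r, c) for r in range(rows) for c in range(cols)
               if grid[r][c] == "housing"]
    if not housing:
        return 0.0
    reachable = sum(1 for h in housing if dist.get(h, max_steps + 1) <= max_steps)
    return reachable / len(housing)
```

In a deployment, a score like this would be recomputed on every scan of the table and projected back onto the bricks as a color code.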
We led a workshop in Saudi Arabia, with staff from the Riyadh Development Authority, to test a new version of our CityScope platform. With only an hour to work, four teams of five professionals competed to develop a redevelopment proposal for a neighborhood near the city center.
We are working to deploy AR Decision Support Tools with cloud-based API in Chinese developing cities with in-kind support from iSoftStone.
The Mobility Futures Collaborative in the MIT Department of Urban Studies and Planning (DUSP) and the Changing Places group at the MIT Media Lab developed new interactive tools aimed at better communicating the possible impacts of new transit systems. Read the Blog Post.
Contributors: Anson Stewart, Ira Winder, Ariel Noyman, Alley Michel, Phil Tinn, Chris Zegras, Ryan Chin, Barr Foundation. Video by Ariel Noyman.
Structure of Collaboration
Our researchers are actively exploring new technologies for real-time data visualization, tangible interaction, and simulation. Check out some of our recent work below:
CITYSCOPE: Urban Intervention Simulator
The platform is a tangible, interactive environment for rapid urban prototyping. Users from any background can collaboratively configure a neighborhood with their own vision for land use and density, and receive real-time feedback on urban metrics including walkability, energy, daylighting, and trip generation.
MIT CityScope is patent pending.
Contributors: Ira Winder
Andorra Tourism Flows
CITYSCOPE: ANDORRA MODEL
A unique implementation of the CityScope platform is being developed for the country of Andorra, with an emphasis on tourism, energy, and traffic congestion.
CITYSCOPE: Land Use Transportation Simulator
CITYSCOPE: Playground for Zoning Kendall Square
A student team leverages the CityScope platform in a unique implementation, "Playground," that empowers non-expert stakeholders to experiment with form-based zoning. The platform provides feedback when user input violates zoning restrictions coded into law, but does not prohibit such interventions.
CityScope: Real-Time Data Observatory
GIS data is used to create "LEGO-tized," 3-dimensional representations of existing urban areas. The model is augmented with layers of information via projection mapping.
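The "LEGO-tizing" step described above can be illustrated with a small sketch. This is an assumption-laden example, not the actual pipeline: it quantizes GIS building heights into whole brick counts at an assumed 1:1000 model scale, using the standard LEGO brick height.

```python
# Illustrative sketch (assumed model scale; not the actual pipeline).
BRICK_HEIGHT_M = 0.0096   # height of a standard LEGO brick in metres
SCALE = 1000              # assumed model scale of 1:1000

def bricks_for_height(height_m):
    """Number of stacked bricks approximating a building height at model scale."""
    model_height = height_m / SCALE
    # Every building gets at least one brick so it remains visible and tangible.
    return max(1, round(model_height / BRICK_HEIGHT_M))
```

For example, a 50 m building at this scale rounds to a five-brick stack, while low structures still get a single brick so they stay legible on the table.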
Click through the above images to see various visualizations of Kendall Square in Cambridge, Massachusetts.
The CityScope "Scout" prototype transforms any tabletop into a canvas for land use planning and allows rapid prototyping with real-time evaluation. The system strives to create an intuitive experience, since users are unlikely to have expert knowledge of the simulation models we use.
Contributors: J. Ira Winder, Joshua Fabian
The Reconfigure prototype allows users to edit a digital model by moving physical abstractions of building typologies. Movements are automatically detected, scanned, and digitized so as to generate inputs for computational analysis. 3D information is also projected back onto the model to give the user feedback while edits are made.
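The detect-and-digitize loop described above can be sketched as a diff between successive scans of the table. This is a hypothetical illustration (the data shapes are assumptions): only the cells that changed between two scans need to be fed back into the computational analysis.

```python
# Illustrative sketch (assumed grid representation of scanned piece positions).
def changed_cells(prev, curr):
    """Return (row, col, old, new) for every cell that differs between two scans."""
    return [
        (r, c, prev[r][c], curr[r][c])
        for r in range(len(curr))
        for c in range(len(curr[r]))
        if prev[r][c] != curr[r][c]
    ]
```

Recomputing only the affected inputs keeps the feedback loop fast enough that projected 3D information can update while the user is still moving pieces.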
The "Mark IV" prototype (below) was developed for a short demonstration at the Edinburgh Culture Summit in August 2014.