
Book Review: Who Counts? The Power of Participatory Statistics

In the past year, I’ve been focusing more on one of the roles that PlaceMatters is uniquely positioned to play—getting information from academics and researchers into the hands of practitioners. Because PlaceMatters bridges the research and tool/method development world and the practitioner world, we can help make sure practitioners are learning from research happening in universities on topics ranging from individual and group decision-making to tool development to effective data visualization.

Along these lines, I was recently sent a copy of the book “Who Counts? The Power of Participatory Statistics,” a collection of articles and case studies on the use of participatory data and information collection and knowledge sharing. The book, edited by Jeremy Holland at the Institute of Development Studies at the University of Sussex, focuses on international development and describes best practices for a variety of participatory techniques.

The book’s introduction is packed with useful information: an overview of participatory statistics and grassroots data collection and use, and a look at how recent improvements in methodology are making participatory research more robust. In addition, I was particularly interested in the chapter on Participatory 3D Modelling, by Giacomo Rambaldi of the Technical Center for Agricultural and Rural Cooperation in the Netherlands.

Rambaldi discusses examples of the use of large 3D representations of an area, the base of which is generally constructed from thin layers of durable material. Participants then add push pins, yarn, and other markers to represent their mental maps and knowledge of the area. Many features can then be verified using GPS, and the resulting maps are frequently more accurate than official maps. Rambaldi describes a process like this in Ethiopia that was geared toward helping the community repair environmental degradation from deforestation.

The mapping process in Ethiopia provided an opportunity for adults to remember and describe to youth what the environment had been like, and to recognize the impacts of deforestation on livelihoods. In addition, it provided an educational opportunity for all, since the mapping exercise prompted participants to think about the connections between different parts of the ecosystem.

While this chapter was perhaps the most similar to the work that PlaceMatters does in the U.S., there are many case studies and articles relevant to informed civic engagement in decision-making processes, as well as to the use of participatory statistics in evaluating program success. A few points raised across projects around the world:

  • Participatory GIS, research, modeling, and similar methods must be carefully designed to be authentically participatory and to include more than just local elites.
  • The information that comes out of participatory processes can lead to community empowerment, but also to the use of the data and information to further disempower residents—for example, identification of additional resources to be extracted by business or government. (PlaceMatters has been working on the flip side of this coin, advocating for the opening of datasets for the public to use).
  • New technologies such as OpenStreetMap, Ushahidi’s Crowdmap, wikis, and mobile tech are making participatory research, data collection and statistics easier and more accurate.

While “Who Counts?” focuses on international projects, its take-home lessons resonate with anyone working on community engagement. Some of the methodologies described could be transferred easily to domestic settings and could improve the way we are engaging (particularly in non-urban settings). The book is worth a look, even for those of us not in international development.

Want the book? Here’s the info: “Who Counts? The Power of Participatory Statistics.” Edited by Jeremy Holland, with an Afterword by Robert Chambers. Published by Practical Action Publishing.


Civic Hacka-what?


Hack4Colorado will be just one of 100 civic hackathons happening all across the U.S. on the weekend of May 31, all under the umbrella of the National Day of Civic Hacking. A hackathon is an event where computer programmers and others in software development, along with graphic designers, interface designers, and project managers, collaborate intensively on software projects. These food- and caffeine-fueled events are where innovation happens and new ideas are born.

What does it mean to participate and support a Civic Hackathon? Well, it means different things to different people.

Some people come with visions of venture funding, a great new startup, building a company, and becoming the next Techstars company. That’s a great aspiration, but it’s not really the primary goal of a “civic” hackathon. It could happen. You could build an app that really blows up, form a company, sell the app to every city, state, and municipality, and retire like Ted Turner. But a civic hackathon has a different spin. It’s about the community we live in and giving something meaningful back to that community.

There are others who come because they are just sick and tired of not having an app that tells them to move their car because it’s street-sweeping day, or because they are desperate for an app that really addresses veterans’ struggle to overcome PTSD. At our last organizing committee meeting, one of our members talked about her frustration at not being able to get live bus data to help her catch the next bus without standing around waiting. Can you get your head around that one? Imagine: you open the app on your mobile device or tablet and it tells you that the #6 will be at your stop in 2 minutes. Better run!

Why will you come, invest a weekend, hack, collaborate, and compete? For the challenge? The food? The fun? The camaraderie? To give back? For the prizes? Hack4Colorado promises to be challenging, super cool, and, if you’re good, very rewarding!

The organizers come from OpenColorado, PlaceMatters, and Executive Lattice. The sponsors include some great local companies like iTriage, SendGrid, ReadyTalk, Noodles & Company, Illegal Pete’s, Galvanize, COIN, and Commerce Kitchen.

Check it out at www.hack4colorado.com. Registration is open for the event, May 31st – June 2nd, and we’d love to have you! Come on and join the Geeks for the Good of Colorado! Follow @Hack4Co on Twitter for more updates.
If you missed the Hackathon we put on last year, you can read a re-cap of it here.

Re-posted with permission from writer, Ann Spoor. Original post here.


Big Data, Open Data and Planning at #APA13

At APA in Chicago?  If so, join me at 4 PM in Regency C for a panel discussion on Big Data, Open Data and Planning.  We will explore themes around data, technology, and urban planning with some of the leading minds in both planning and technology.  This aims to be not just a “how-to” session but also a broader cross-sector conversation where the audience and panelists can learn from each other.  The panel brings together senior technologists and their senior planning counterparts from San Francisco, Chicago, New York, and Philadelphia, including:

  • John Tolva, Chief Technology Officer, Chicago
  • Peter Skosey, Executive VP, Metropolitan Planning Council
  • Gina Tomlinson, Chief Technology Officer, City and County of San Francisco
  • Teresa Ojeda, Manager, Information and Analysis Group, SF Planning
  • Gary Jastrzab, Executive Director, Philadelphia City Planning Commission
  • Andrew Nicklin, Director of Research and Development, NYC Dept of Information Technology and Telecommunications

Look forward to seeing you there!


Participation by Design: Providing Context for Data

This post, by guest blogger Daniel Saniski, is the eighteenth in a slightly-more-than-a-month-long series on the impressive diversity of participatory decision-making tools that communities can use for land use plans, transportation plans, sustainability plans, or any other type of community plan. Our guest bloggers are covering the gamut, from low-tech to high-tech, web-based to tactile, art-based to those based on scenario planning tools, and more. Daniel’s post explores the challenges and importance of unpacking complex quantitative data, using unemployment statistics as an illustration. We welcome your feedback and would love to hear about the participatory design strategies that you’ve found to be the most useful.

“Unemployment is down!”
“Imports are up!”
“The price of coffee skyrocketed last month!”

News headlines scream data points at us each day, assuming we understand their meaning, source, and context. Although we see the same greatest hits of data each month (unemployment rate, inflation, job openings, GDP, imports/exports, etc.), many people do not realize that much of this data is available not just at the federal and state level but also for their town. The sheer quantity is sure to induce information overload, and it takes great care to find exactly the right data points. Local data, along with comparison data from other cities, states, or the federal average, can and should be used in community decision-making, but wrangling that data without misleading people is a challenge. Graphs provide enormous rhetorical power and should be kept close to the question at hand. Given the terabytes of possible data series we could explore, today we will look at some ways to frame and contextualize one metric: unemployment.

The Bureau of Labor Statistics (BLS) tracks data on employment, prices, and consumer spending habits, and it publishes well over 10,000 data series, ranging from standard headline numbers to narrow measures like intercity bus and train fares. Most of the major data sources contain federal, state, regional, and city/metro sub-series, which provide endless curation opportunities. Using the unemployment data alone, we can frame a number of discussions.
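As a concrete illustration of how accessible these series are, here is a minimal sketch of pulling one of them programmatically. It assumes the BLS public time-series API (v2) and the commonly cited series ID for the national unemployment rate; both are assumptions to check against current BLS documentation, and state or metro series would use different IDs.

```python
import requests

# A minimal sketch, assuming the BLS public time-series API (v2) and the
# commonly cited national unemployment rate series (LNS14000000).
# A registration key may be required for v2; if so, add
# "registrationkey": "<your key>" to the payload.
BLS_API = "https://api.bls.gov/publicAPI/v2/timeseries/data/"

payload = {
    "seriesid": ["LNS14000000"],  # swap in a state or metro series for local data
    "startyear": "2008",
    "endyear": "2012",
}

response = requests.post(BLS_API, json=payload, timeout=30)
response.raise_for_status()
result = response.json()

# Print year, period (e.g. M01 = January), and the reported rate.
for series in result["Results"]["series"]:
    for observation in series["data"]:
        print(observation["year"], observation["period"], observation["value"])
```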

Consider the dramatic contrast between the unemployment rates of California and North Dakota: North Dakota has an unemployment rate of 3.1%, while California clocks in at 10.9%. Why? Theories range from North Dakota’s use of a state bank to its extensive oil reserves. The answer (and keeping correlation separate from causation) matters less here than the framing of the visual question. Seeing such a great disparity prompts a lot of compelling civic questions and can easily be used to start a discussion, although this is still a narrow context. In order to better inform the unemployment discussion, we need more numbers—some of which are equally dramatic.

The newspaper-headline unemployment rate and the “real” one are often pretty far apart. The headline rate counts only people who are out of work and actively looking for a job; it misses people who have dropped out of the formal economy or who work less than they would like. The broadest measure, the U-6 (“total unemployed, plus all marginally attached workers, plus total employed part time for economic reasons”), captures all the people on the employment fringes and is over 14%, compared to the headline measure at 8.2%. Now we have some sense of measurement disparities, but these numbers do not tell the whole story.

One must also look at the Labor Force Participation Rate and the Civilian Employment-Population Ratio. These two figures tell you the rate at which people are participating in the economy. If the unemployment rate drops and these measures drop as well, then one of three things probably contributed: a whole lot of people retired, went to prison, or dropped out of the labor force. Dropping out means they are no longer counted in the 8.2%, but if they still want work they may show up in the 14%. When trying to make sense of unemployment’s ups and downs and how they might affect your town, keep these measures in mind to make better decisions.
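To keep these measures straight, here is a small worked sketch using made-up counts chosen to land near the figures quoted above (they are illustrative, not BLS data). It follows the standard definitions: the headline rate divides the unemployed by the labor force, U-6 adds marginally attached workers and involuntary part-timers, and the participation measures divide by the civilian noninstitutional population.

```python
# Illustrative (made-up) counts, in thousands -- not real BLS data.
employed = 140_000
unemployed = 12_500            # jobless and actively looking (headline numerator)
marginally_attached = 2_500    # want work, but have not searched in the last 4 weeks
part_time_economic = 8_000     # part time only because full-time work is unavailable
civilian_population = 243_000  # civilian noninstitutional population, age 16+

labor_force = employed + unemployed

# Headline rate: unemployed as a share of the labor force.
u3 = unemployed / labor_force

# Broadest (U-6) rate: add marginally attached and involuntary part-timers,
# and add the marginally attached to the denominator as well.
u6 = (unemployed + marginally_attached + part_time_economic) / (labor_force + marginally_attached)

# The participation measures discussed above.
participation_rate = labor_force / civilian_population
employment_population_ratio = employed / civilian_population

print(f"Headline: {u3:.1%}   U-6: {u6:.1%}")
print(f"Participation: {participation_rate:.1%}   Employment-population: {employment_population_ratio:.1%}")
```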

To bring this closer to PlaceMatters, here is some data about Denver, which we will unpack in a minute. First, local unemployment data lags the national numbers; Denver’s unemployment rate was 9.2% as of January, 0.9 percentage points higher than January’s national average. Second, payroll in Denver grew from 1990 to 2000 but has been essentially flat since then, even though the unemployment rate changed drastically. Third, the percentage of government employees in Denver has been stable at about 14% since at least 1990.

From these figures we gain some interesting insight. First and foremost, unemployment in Denver should be addressed, as it is well above average. But we would be remiss to blame it on the crash of 2008. Why? Payroll in Denver has not substantially changed in more than ten years. How very odd—who are these unemployed people? The city’s population has expanded dramatically since 1990. There is a serious discussion to be had in Denver, since people keep coming, aren’t getting jobs, and haven’t been for a decade.

As a final example, the last graph on this page shows the percentage of people working for the government in Denver. In an age of claims about “bloated” government, we can show with some quick calculations that those arguments do not hold (at least in Denver). Taking some time to dig into employment statistics can help you track where your city has been, where it is, and where it is going.

By adding context at the federal, state, and/or local level, we can gain greater understanding and start asking better questions—and framing the questions we do ask—with data. As we saw when comparing the two big unemployment measures and their supporting participation rates, it takes more than one carefully curated number to employ data graphs successfully and fairly. Using these and other contextualized figures, we can help take government data out of the headlines and into our civic discussions, where it belongs.


This post was contributed by Daniel Saniski, the managing editor at Data360.org and an associate consultant at Webster Pacific LLC. He catalogs and writes news about government data and guides site development for Data360, and he provides business intelligence and information systems design services at Webster Pacific LLC.

Most Exciting Trends in 2012: Better Data and Apps for Planners


Shareabouts is an open source app from OpenPlans that makes sharing ideas on a map simple.  Applications like this will make 2012 a year of more usable apps and better data for community decision making.

This past year, we’ve seen the growth of community decision making tools around planning.  In my estimation, 2012 will continue this trend and bring more usable, integrated apps to the world of community decision making, giving planners and community leaders a broader and more efficient toolkit for engaging stakeholders in a decision-making process.

In the world of mapping, we’ll see more ways for people to easily contribute to maps about the places they live.  These apps have been around for a while, but now they’re getting easier to manage and deploy.  For example, our friends at OpenPlans have an emerging platform called Shareabouts (blog | git repo), which is open source and has a clean, usable interface.  MindMixer just added maps to their web-based community idea platform, and these guys have given a lot of thought to user-centered design.  These more usable apps will increase our ability to crowdsource relevant geographic data. The mapping interfaces of yore were pretty clunky, but this will be less the case in 2012.


Communicating Hot Data

At PlaceMatters we are always looking for ways to better communicate information relevant to helping stakeholders make informed sustainability and planning decisions.  For example, our recent Cape Cod Interagency Transportation, Land Use, and Climate Change Pilot Project used CommunityViz and our DIY touch-tables to let participants place future jobs and housing on a map, and then immediately see the impacts of those development patterns on greenhouse gas emissions and the amount of development threatened by sea level rise.

Cal-Adapt Temperature Map Showing Impacts of Climate Change

The Geospatial Innovation Facility at the University of California, Berkeley has created Cal-Adapt, a tool that has some excellent features for communicating potential impacts of climate change on California.  Cal-Adapt has visuals that show changes in temperature, snow pack, precipitation, sea level rise and wildfire risk.  Users can zoom in on the map and quickly click between layers to see impacts on their locality, which could help drive local decision-making around adaptation to climate change.  The Cal-Adapt website also has links to resources and publications, and a section that tells stories that highlight the impacts of climate change, from impacts on natural systems to changes in electricity demand.

The site is smart in its presentation of the facts–using news and other “stories” to bring home the impacts, showing data clearly on maps and graphs, and providing additional information.  It’s a full suite of educational options for those with varied interests or who might be impacted by different information.  At PlaceMatters, we are striving to make all of our projects and tools similarly diverse in how they communicate detailed and complex information.  As we use and develop tools to measure impacts on a variety of sustainability indicators, Cal-Adapt will serve as a great example.

New Spatial Decision Support Portal Available


SDS Knowledge Portal graph. Users can click on the graph to explore the relationships within the ontology.

The folks at the University of Redlands have announced the latest version of the Spatial Decision Support (SDS) Knowledge Portal.  Working with the SDS Consortium, of which PlaceMatters is a member, they have polished the interface to make the SDS ontology more accessible to users.  The SDS ontology is an expert-driven approach to providing semantic consistency when talking about spatial decision and planning related processes.  The ontology is built on linked concepts that structure the way in which we think and talk about spatial problem solving.  For example, planning is a spatial decision problem type, which has a series of related methods, which may have a set of phases or steps, and so on.  These relationships are mapped graphically on the site through an interactive browser that allows the user to explore the ontology by clicking on related nodes (image above), or the user can browse the ontology in a hierarchical fashion.
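As a rough illustration of what “linked concepts” can mean in practice, the sketch below encodes a few such relationships as a tiny directed graph. The node names and relation labels are invented for the example; they are not drawn from the actual SDS ontology.

```python
# A toy graph of linked concepts, loosely modeled on the idea of problem types,
# methods, and phases described above. All names here are invented for
# illustration; the real SDS ontology is richer and expert-curated.
ontology = {
    ("planning", "is_a"): ["spatial decision problem type"],
    ("planning", "has_method"): ["suitability analysis", "scenario planning"],
    ("scenario planning", "has_phase"): ["framing", "alternative building", "impact assessment"],
}

def related(concept, relation):
    """Return the concepts linked to `concept` by `relation`, or an empty list."""
    return ontology.get((concept, relation), [])

# Walk one path through the graph, much as a user might click through the portal.
for method in related("planning", "has_method"):
    print("planning -> has_method ->", method)
    for phase in related(method, "has_phase"):
        print("   ", method, "-> has_phase ->", phase)
```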

Additionally, the SDS portal has a special GeoDesign view that can be accessed in the upper right-hand corner.  Switching to this view gives the user a perspective on a slightly narrower domain of GeoDesign problem types.  GeoDesign is an evolving field of cross-disciplinary practitioners, educators, and researchers who are looking at the relationship between geography and design.  While there is no single agreed-upon definition, SDS characterizes GeoDesign the following way:

From the SDS perspective, GeoDesign has a broader sense and a narrower sense. In the broader sense, GeoDesign is a process of creating solutions for geo-spatial problems, involving design and decision making activities at various stages of the process. Typical stages of this process include

  • design goal identification and design requirement development
  • design process mapping
  • condition assessment including data development and domain knowledge process model development
  • suitability assessment
  • design (with iterative, creative design activities supported by fast feedback on the design through constraints, function, performance checking models)
  • impact and performance analysis of the alternative designs
  • design alternative selection
  • etc.

This sense is similar to Steinitz’s definition above.

In a narrower sense, GeoDesign focuses specifically on the design phase of the above mentioned GeoDesign process, focusing on the creative design activities (such as drawing) and the associated feedback methods and technology that allows rapid design iteration and modification. This sense is similar to Flaxman’s definition.

The GeoDesign view of the portal is a great start on clarifying terms and approaches, and it is quite likely there will never be a single definition, but rather frameworks and concepts that emerge over time.  As part of that, we are working on building GeoDesignWorld, which will be the social component of this work.  The goal of this site is to connect people working on GeoDesign-related problems to each other to help solve some of our greatest challenges.  More will be posted on this as the site emerges from the development stages.

Let us know what you think of the resource and how it can be improved by commenting below.  What other related resources would you want to see come out of this work?

Data Smorgasbord: IBM’s City Forward

http://www.youtube.com/watch?v=DeZ6Sgu2Pu8

I’ve recently blogged a couple of times about scientists measuring attributes of cities and some of the interesting correlations and predictions that can come out of those measurements.  Jacob Smith recently alerted me to IBM’s launch of City Forward, an online tool that contains large amounts of information about 55 cities–including traffic patterns, health metrics, consumer spending, and a lot more.  Now users can do the science and see if they can tease out other correlations and connections within the data.  Check out their YouTube introductory video (including some cool tilt-shift videography) above.


PlaceMatters Weekly Blog Roundup: September 7, 2010

Next American City offers a few thoughts about the growth of real-time data feeds and opportunities to improve the efficiency of city services and quality of life (e.g., negotiating traffic).

Urban STL laments the lack of participation in the “Framing a Modern Masterpiece” competition to redesign the grounds of the St. Louis Arch.

Digital Urban published a report on data mashups and the future of mapping.

Digital Urban also writes about new features available in Sketchup 8 and about integration with Lightup.

Here on the PlaceMatters blog, Jason Lally wrote about the relationship between geodesign and planning, the role of geodesign in facilitating collaboration, and next year’s GeoDesign Summit.

The National Coalition for Dialog and Deliberation writes about the Open Model for Citizen Engagement (OM4CE).

The Mind of Mbuga Njihia describes the winners of the Knight Foundation’s “Knight News Challenge.”

Transportation for America kicked off a twelve-part series – one case study a day – on livability (h/t to StreetsBlog).

Transit Information for Better Urban Living

Real-time information for the Chapel Hill transit system. Image originally featured on NextBus news.

Recently New Urban News featured an article on how transit information and car sharing are making it easier for urban dwellers to get around, save money and shed some CO2 (Smart phones + shared cars = better urban living).  It’s been over a year since Google added transit information to the iPhone.  I blogged about this way back when.  I am glad to see that cities have finally picked up on the importance of shared transit data and open APIs.  For example, Massachusetts released their data and an API last November for five of the busiest bus lines in the system.  Christopher Dempsey of the Massachusetts Dept of Transportation had this to say about the release:

• Within an hour, an application (“app”) using the information was placed on Google Earth, giving real-time location of buses on those lines.

• In two days, a programmer created a web page that tracks the buses’ movements.

• In five weeks, the data was on apps for iPhones and Android phones.

• In seven weeks, the data was available for delivery to any phone.

• In March, a shop in the Jamaica Plain section of Boston installed an LED sign counting down to the arrival of the next bus. The sign cost the shop — JP Licks, a café and ice cream store — $300. It reportedly has brought in additional business from individuals who now know how much time they have for a coffee or a snack before the bus arrives. Many more shops may follow suit.
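For readers curious about what consuming one of these open feeds looks like in code, here is a minimal sketch that reads a GTFS-realtime trip-updates feed and prints upcoming arrivals at a single stop. The feed URL and stop ID are placeholders, and the gtfs-realtime-bindings package is an assumption; the Massachusetts release described above exposed its own API, so treat this as an illustration of the general pattern rather than that specific service.

```python
import time

import requests
from google.transit import gtfs_realtime_pb2  # pip install gtfs-realtime-bindings

# Placeholder URL -- substitute the trip-updates feed published by your agency.
FEED_URL = "https://example.com/gtfs-realtime/TripUpdates.pb"
MY_STOP_ID = "1234"  # hypothetical stop ID from the agency's static GTFS

feed = gtfs_realtime_pb2.FeedMessage()
resp = requests.get(FEED_URL, timeout=30)
resp.raise_for_status()
feed.ParseFromString(resp.content)

# Print minutes until each predicted arrival at the chosen stop.
now = time.time()
for entity in feed.entity:
    if not entity.HasField("trip_update"):
        continue
    for stop_time in entity.trip_update.stop_time_update:
        if stop_time.stop_id == MY_STOP_ID and stop_time.HasField("arrival"):
            minutes = (stop_time.arrival.time - now) / 60
            print(f"Route {entity.trip_update.trip.route_id}: arrives in {minutes:.0f} min")
```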