How do you measure that?
This was the question at the heart of last week's MuseumCamp. MuseumCamp is an annual professional development event at the Santa Cruz Museum of Art & History in which teams of diverse, creative people work on quick and dirty projects on a big theme. This year, the theme was social impact assessment, or measuring the immeasurable. We worked closely with Fractured Atlas to produce MuseumCamp, which brought together 100 campers and 8 experienced counselors to do 20 research projects in ~48 hours around Santa Cruz.
We encouraged teams to think like artists, not researchers. To be speculative. To be playful. To be creative. The goal was to explore new ways to measure "immeasurable" social outcomes like connectedness, pride, and civic action.
The teams delivered. You can check out all twenty research projects here. While all the projects are fast, messy, and incomplete, each is like a small test tube of ideas and possibilities for opening up the way we do social impact research.
Here are three lessons I learned at MuseumCamp about research processes:
- Look for nontraditional indicators. The JerBears group used "passing of joints" as an indicator of tribal affinity at a Grateful Dead tribute concert. The San Lorenzo Levee group used movement of homeless people as an indicator of social disruption. People x (Food + Place) looked at photos taken by children in a park to understand what contributed to their sense of community. Some of these indicators didn't yield anything useful, but others turned out to be surprisingly helpful proxies for complex human interactions.
- Don't (always) call it a survey. Several groups created projects that were somewhere between engagement activity and research activity. Putting stickers on signs. Taking photos. Finishing a sentence mad-libs style. My favorite example of this was the One Minute Art Project group, which rebranded a fairly standard sticker survey as a "fast, fun, free and easy" activity. Several participants told them, "I wouldn't do a survey, but I like doing this."
- Every active research method is an intervention. It's easy to look at the One Minute Art Project referenced above and see a red flag - maybe people self-select into this because it's "art" instead of "research." But I realized through this process that a survey solicitation is just as active an intervention as an engagement solicitation. There are different biases in who participates and why. But we shouldn't assume that any one research method is inherently "neutral" just because it is more familiar. Many of the most interventionist projects, like the Karma Hat, yielded really interesting information that was not visible through more passive research methods.
And here are three of my favorite findings from the experiments:
- On depth of bridging among strangers. Two groups dove into the MAH's work on social bridging - one with the Karma Hat game, and one with a photobooth project. The Karma Hat required people to wear a hat, write their name on it, and pass it on. It was used heavily. On the other hand, a photobooth where people were prompted to take a photo with a stranger they met at the museum was barely used. We saw that people were ready and willing to engage with strangers at the museum, but not necessarily to build relationships out of those engagements. This is just a drop in the bucket of the exploration we are doing around bridging at the museum.
- On smartphone usage at natural sites. We Go to 11 studied how people's moods changed at a beautiful site overlooking the ocean, relative to their smartphone use. They found that people with smartphones used them to go from a state of active negativity (tension, anxiety) to active positivity (energy, joy). People who didn't use smartphones at the same site tended to embody passive positivity (serenity, calm). Not a shocker, but a pretty interesting project.
- On the power of programming to spark civic action. This project, measuring the connection between empathy and action at an indigenous solidarity film screening, is full of useful insights. Read their report for thoughts about the challenges of participant observational research, the power of spiritual experiences, and the results of a compelling survey about ignition to action.
I encourage you to explore all the projects and see what insights might connect to your own work and research goals. You can comment on the projects too and share your own ideas. Please bear in mind that these were very quick projects and are more like research sketches than full evaluations.
What did you get out of MuseumCamp? If you didn't attend, what do you want to know more about?