City Data - Design Research Method

I developed a set of research methods to identify the needs of city employees and quickly deliver an inexpensive, easy-to-use data tool.


My research methods connected raw data to team goals and responsibilities. In one meeting, I helped our stakeholders:

  • Surface the most actionable data points

  • Identify why staff would use these data points to achieve their goals

  • Ensure that people’s roles and responsibilities could support the final plan

Role: Product Researcher

Tools: User research, product design, web analytics

Timeline: 3 weeks


To get the full story behind the city partnership described below, check out my separate blog post.

My Role

This project took place during my time as a Product Apprentice at NYC Opportunity, a city agency working to reduce poverty and increase access.


While there, I helped the digital product team solve tough interaction design problems, run user sessions, compose product briefs, and build web analytics dashboards.


The awesome apprentice team at NYC Opportunity.

The Challenge

A partner city agency asked us to provide a weekly web analytics report for a digital product we built for them. At the time, we were already sending them similar data.

We could have immediately jumped to a solution. But first we wanted to better understand the reasons behind the request:

  • Why was weekly data being requested?

  • Among dozens of web analytics data points, which would be most useful?

  • How could we offer a data solution that was maintainable for both teams?

With all of these challenges in mind, I tailored common user research methods to gather information to scope a new data report.


Designed worksheets and cards used during the interview with partner staff.



  • I gathered baseline research on the history of the city agency partnership by interviewing immediate team members.

  • At the end of those interviews, I tested the research methods I was developing to get feedback from my colleagues.

  • I then used the final research methods in a meeting with our partner staff. The research methods included a Stakeholder Interview and Closed Card Sort.

    • The Stakeholder Interview took the form of a worksheet with partially blank user stories describing each partner staff role, how each person might use the requested data, and why.

    • The Closed Card Sort included printed cards for each of the 40+ web analytics data points that staff requested. I asked them to sort the cards into three categories: "I can act on it now," "I can act on it later," and "I don't expect to act on it for a while."

      You can learn why I chose these methods further below.


Closed card sort activity in progress.


  • We did most of the synthesis together in the meeting. With this collaborative approach, we arrived at a conclusion in a manner that nurtured the partner relationship.

  • Together we generated insights from the patterns we noticed in the staff's answers to the research questions.

  • From these insights we could more clearly see our opportunities. In response, we came up with a few solutions:

    • We went from a request for 40+ data points to only three. The three data points closely matched their current goals, making them more actionable.

    • We also went from a weekly report for an unclear audience to a monthly report for senior staff meetings. This change made sense, as it was primarily senior staff who could first act on the data.


 We went from 40+ data points to the most actionable three.


  • I designed a report using the three data points in Microsoft Excel.

  • Excel was a perfect tool because it didn’t require costly technical development. Plus, anyone with spreadsheet software could open the report.


  • We gave staff the Excel report for immediate use because we already knew their needs well. While more user testing would have been ideal, the research we did together was enough to match the project's needs with its available resources.

  • When we checked in, we learned they could easily use the monthly reports and had no further requests.


  • Even better, we learned they no longer needed us to send the monthly report because they started to produce the report themselves.

  • They took the raw data and used Python scripting to complete an automated analysis in time for their senior staff meetings.
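As a rough sketch of what such an automated analysis might look like: the script below rolls daily web analytics data up into monthly totals ready for a senior staff meeting. The column names, sample data, and metric are illustrative assumptions on my part, not the partner's actual code or data points.

```python
# Hypothetical sketch of a monthly analytics rollup, similar in spirit
# to the partner's Python automation. Columns and numbers are made up.
import csv
import io
from collections import defaultdict

# Stand-in for the raw web analytics export sent to the partner agency.
RAW_CSV = """date,page,visits
2020-01-03,/dashboard,12
2020-01-10,/dashboard,30
2020-01-17,/reports,8
2020-02-04,/dashboard,25
"""

def monthly_summary(raw_csv: str) -> dict:
    """Aggregate daily visit counts into per-month totals."""
    totals = defaultdict(int)
    for row in csv.DictReader(io.StringIO(raw_csv)):
        month = row["date"][:7]  # "YYYY-MM"
        totals[month] += int(row["visits"])
    return dict(totals)

summary = monthly_summary(RAW_CSV)
for month, visits in sorted(summary.items()):
    print(f"{month}: {visits} visits")
```

A script like this can run on a schedule, so the summary is always ready before the monthly meeting with no manual spreadsheet work.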

How It Works

The methods I used are familiar to user researchers: a Stakeholder Interview and a Closed Card Sort. What makes my approach unique is how I remixed these methods to apply to data needs.


Essentially, my research methods managed to ask a fairly pointed question without putting our partners on the spot: what were they planning to do differently with all the data they were requesting, and why?


These research methods helped us arrive at the necessary answers, but in a collaborative way.


  • I conducted a Stakeholder Interview that relied on user stories as interview prompts to describe people's larger roles, goals and motivations (whereas more typically, user stories describe a specific product feature or functionality from a user's perspective).

  • User stories were a practical substitute for in-depth research interviews, which we didn’t have time for and didn’t need because we knew our partner well.

  • However, we did need to get on the same page about what they really needed, and why. I broke the user stories down into two parts:

  1. Ask the staff point person to describe people’s roles: "I'm a program manager. My role is ______."

  2. Then ask them how each team member could use the data: "I can use data to ______ so that ______."


User stories as interview prompts.

  • User stories quickly consolidated our shared understanding of who would use the data and why. 

  • Without this step, we might have lost our sense of purpose as we got into the nitty-gritty of data points.


  • I asked the partner staff to sort the data points they requested into the three categories.

  • The categories I proposed were time-based: could they act on the data point now, later, or not for a while? The purpose was to surface which data points were most actionable now.

  • There were three steps:


  1. Place 40+ web analytics data points, printed out on cards, onto the table.

  2. Ask the staff person to sort them into three buckets:

    I can act on this data now.
    I can act on this data later.
    I don't expect to act on this data for a while.

  3. Once the cards are sorted, discuss how to handle the data points needed now: who exactly could use them, how often, and in what format?


Sorting data points according to actionability.

  • The Closed Card Sort helped naturally surface the current goals of the partner staff.

  • By asking why certain data points were more actionable than others, we learned their most important goal.

  • We aligned around a final reporting plan with a perfectly understandable justification: they needed just a few data points to show if their staff were using this new digital product or not.

What’s Happened Since?

  • Our partners gained a clearer understanding of their own needs, which led to an easier-to-maintain solution that met their most important goals.

  • Later on they adopted the reporting themselves. They took the raw data we sent and used Python scripting to complete an automated analysis in time for their senior staff meetings.

  • I’ve since used this method at my current role on a data and product team at the NYC Taxi & Limousine Commission.

  • I’ve given a presentation on the principles behind using these research methods as applied to data purposes.