User-Centered Research & Evaluation Project
January 2023 - May 2023
For the Spring 2023 semester, I took the User-Centered Research and Evaluation course. For the course, our group had to come up with a solution that could take action against algorithmic bias in ads. In my group, I served as the product manager. Below, I present the research process that led to our final product. (For brevity, I have omitted several other exercises we did; the content below covers the main path by which we arrived at a solution to the problem of algorithmically suggested ads.)
A project in user-centered research, completed during the User-Centered Research and Evaluation course at Carnegie Mellon University.
About Our Research
Since our research focused on algorithmic biases and how we could create a design that improves the user experience around them, we needed a sense of how people felt about the topic, whether they were aware of it, and what their experiences had been.
Understanding Users' Pain Points
To establish our starting point, we conducted one-on-one contextual user testing. Using a directed storytelling method, we listened to participants' experiences and the hardships they went through. With this step, we wanted to gather initial information on people's thoughts about and experiences with tailored ads. As product manager, I made sure we had a succinct goal and suggested the storytelling method in the hope of hearing personal experiences.
Through this step, we created affinity maps, customer journey maps, and empathy maps.
We categorized the notes we had taken under labels such as users' emotions, users' interpretations of algorithmic biases, and so on.
We took a closer look at the emotional states users go through when dealing with algorithmic biases so we could understand the root problem before developing our ultimate solution.
Customer Journey Map
By creating a persona and an empathy map, we took a closer look at the steps a user would take to report an ad and take control of their data privacy.
Overall Thoughts & How the Exercises Affected Our Final Product
The processes above shaped how we would suggest ways to fix the problem of algorithmic bias; we were getting closer to understanding how users felt and how aware they were of the subject. In other words, before implementing any solution, we did background research on users' current attitudes toward reporting ads. As product manager, I grouped the thoughts and quotes we had collected and engaged the group in discussing how we could use this initial background research to build a product that empathizes with users' hardships in reporting ads. I wanted to focus on what users were feeling and on the source of their main frustration.
Getting a Taste of How Users Process Design Elements
Having learned how users felt about algorithmic bias, we wanted to learn how they take action against it. We also wanted to find out why they don't report ads and which design elements could help them.
Through a survey, our goal was to learn whether users report ads often.
The earlier user testing had suggested that finding the report-ad button was hard, so we asked questions like the one below:
But the results indicated that users felt neutral about whether reporting an ad was difficult.
Also, people thought that reporting ads as a way of pushing back against biased algorithms did not make much of a difference.
Through this survey, we learned that many people were not motivated to report ads and that there were mixed feelings on whether finding the report button is hard. Based on the survey results, we conducted a “Walking the Wall” exercise.
Walking the Wall
The survey pointed to the root cause behind the constant stream of algorithmically biased ads and why people weren't taking control: people were not motivated to take control of personalized ads!
My role during this session was to make sure we reached as many participants as possible and gathered diverse opinions: whether they felt their reports had an impact, whether they found ad-reporting designs intuitive, and which design elements were straightforward. Using these findings, I wanted our group to think about how we could change users' feeling that they have no impact, and how we could incorporate universal design elements to make reporting ads easier.
To get a better idea of the solutions we could tailor, we ran a Crazy 8s session in which all five of us sketched eight different ideas in eight minutes for motivating users to report ads. Each of us then selected one idea to develop further.
From the survey, we understood the root problem of algorithmically biased ads to be a lack of motivation. To dig into this, we conducted interview sessions using the storyboards that came out of the Crazy 8s session, discussing why people were not motivated to take action against ads and whether our suggested solutions would motivate them. The storyboards covered five different solutions, each addressing a different need:
A Universal design across platforms for reporting
More intuitive and engaging icons for reporting
Impact of reporting
Control over user preferences and filters
An easier, simple process to report advertisements
With each successive solution, the stakes grew riskier so as to draw out how users would truly feel about the presented ways of encouraging them to report ads. From the interviews, we collected some points to consider:
Finding reporting bothersome
Disinterest in reporting
Lack of impact
Using these points, we decided on an ultimate solution to develop further: a prototype that presents reporting as a collective act and makes the reporting flow easier to navigate.
During this exercise, I encouraged my peers to be critical of the pros and cons of each prototype we created for the storyboards, and I listened to their thoughts from the speed dating sessions they conducted. To keep things fair, I then suggested we vote on which prototype best addressed the problem. Throughout, I made sure the group was heading in the direction we envisioned: helping solve the problem of people lacking the motivation to act against algorithmic bias.
Creating a Lo-Fi Prototype & Conducting Final User Testing
In the end, we acknowledged that people are not motivated to report ads, and we thought a solution would be to show that reporting ads can be a collective act, letting people know that they are not alone. We also realized that reporting an ad currently takes two clicks; making it one click away would be the clearest and most motivating option for users.
Using this prototype, we conducted speed dating sessions and surveys.
The notes (combined with the survey results) suggested that the users easily knew how to navigate through the prototype and report ads. As for motivating users to take action, there was a bit more variety in terms of responses. Whereas some users found that the simplified/shortened process to report and the possibility of ads being removed motivated them, others were still not inclined to report and felt their actions still weren’t impactful enough.
However, users still thought the prototype was easy to navigate, and understood the purpose of collective action. Thus, we came to the conclusion that the prototype addresses the need for a simpler process, but we need to re-investigate how to further motivate users and convince them that their actions have an impact.
Based on the survey, we wanted to take a closer look at how we might improve the prototype in the future, time allowing. Through this exercise, we understood where our prototype stood and identified its strengths and weaknesses.
In this session, as PM, I led the group in reinforcing our prototype and solidifying our idea. I encouraged others to share results even when some were negative. Since we knew what product we wanted to create but were having a hard time envisioning the design, I created the lo-fi prototype and gathered feedback on how to improve it. Based on that lo-fi prototype, I also created a hi-fi prototype for our final presentation. While discussing our outcome and ways to improve it, we acknowledged that with the course coming to an end, we would present the prototype as it was, note in our final report and poster the improvements we would have made with more time, and, during the presentation, convey the prototype's strengths and how it could positively affect the current problem of users not reporting ads.
From the initial user testing, we received feedback that there were too many categories under "Why are you reporting this post," which tired users, so we reduced the number of categories. Users also said the success page was too cheerful, making them feel pressured or forced into reporting an ad. Thus, the final prototype we made is less cheerful and more formal.
For the poster, the group came up with the overall layout, design scheme, and content. As I was familiar with prototyping in Figma, I had the opportunity to build the poster as our group envisioned it, and I took the initiative to present a more developed version that showed our learnings from the user testing sessions. Since our group envisioned the prototype being shown on different platforms, I also designed the prototype to be interactable across platforms and added it to the poster.
During the spring semester of my sophomore year at Carnegie Mellon University, I had the chance to take the User-Centered Research and Evaluation course. The course offered me an in-depth look at how design decisions are made and how much research it takes to implement a design. Through the semester-long project on algorithmic bias, ad reporting, and data privacy, I was glad to delve deeper into the research side of technology, which will serve me well as I aim to become a leader in the tech world. I learned how users respond to different design elements and how their thoughts and emotions heavily influence the decisions they make. I also learned a range of research methods: Speed Dating, Think Aloud sessions, Affinity Diagramming, Customer Journey Mapping, Empathy Mapping, conducting surveys, and more. The project gave me a chance to collaborate with my awesome peers, Ethan Lin, Ethan Huang, Lynn Kim, and Sarah Yun, and to experience the collaborative spirit of teammates giving one another constructive feedback to improve the overall product. Even though I plan to build my career on the development side, this course taught me how research is conducted and how I could use it in my future career to better understand users and create ethical, user-centered products. If I revisit this project in the future, I would want to gather continuous user feedback on our prototype to improve it and help users take control of the personal data used in algorithms and ad suggestions.