
Redesigning the Critique Process

Role // User Researcher, UI/UX Designer, Ideation Lead

Duration // 5 weeks

Tools // Figma, Procreate

Team // Morgan Creek, Gennifer Hom, Sienna Gonzalez, Rochelle Dai

Overview

Context

For UC Davis’s Design Interactive Fall 2020 cohort, six teams were created to find solutions to various problems that student designers face. Each team included 3-5 junior designers and one senior design mentor. While some teams were asked to redesign existing digital products, including Zoom, Notion, LinkedIn, and ScheduleBuilder, my team was asked to tackle the broader topic of the design critique process. Our project was voted Most Innovative UX by the panel of industry professional judges and won the Audience Choice Award on presentation day.

My Role

As one of four product designers, I was involved in every step of our design process, including gathering user research, analyzing user data, creating wireframes, developing content strategies, designing interfaces, and prototyping. For two weeks, I served as ideation lead, responsible for directing user data analysis and translating our research findings into concept ideas for our product. I led the data synthesis, affinity mapping, and solution statement portions of this sprint, which helped my team explore diverse solutions to our problem statements and develop a clear understanding of the design direction for our product.

Timeline

We spent the majority of the 5-week sprint conducting research and prototyping. Our user research continued during our ideation process to allow for adequate time researching our main user groups. A key component of our process was conducting user testing between our lo-fi iterations.


Project Brief

Challenge

My team was asked to streamline the design critique process. We were given a large amount of creative freedom, but a major constraint was the inability to meet and design in person per CDC guidelines. At the time, UC Davis Design was offering only remote courses, so we needed to take this into consideration as well.

Problem

We needed to create a comfortable and beneficial critique environment for design students to receive feedback on their projects in a remote setting.

Solution

We wanted to create a platform to host remote critiques given the current pandemic. Our solution was a new digital product that increases transparency between participants and centralizes feedback for future reference.


Sprint #1 - Research

Survey and Interviews

We knew the design critique process involved two groups of users: professors and students. To get rapid student feedback, we created a survey to gather data about their critique experience. For professors, we conducted four one-on-one interviews with design faculty. Our interview questions included…

  • What is your experience with design critique like?

  • In what ways has remote learning changed your experience with critique? 

  • How would you describe your feelings/thoughts throughout the design critique process?

Our goal was to find the strengths and weaknesses of both in-person and online environments so we could cross-analyze the two. We felt the best way to elicit these answers without bias was to first bring up the idea of design critique in each setting, rather than immediately directing our participants toward any pros or cons.


Brief Analysis

After surveying and interviewing 20 students and professors about their experience with remote critique, we found three common pain points:

  1. A lack of personal connections between students and between professors and students

  2. Frequent awkward silences where students were waiting for one another to speak

  3. Disorganized documentation as a result of decentralized feedback found in different documents or locations, making the feedback difficult to implement in future work


Sprint #2 - Research & Ideation

Affinity Mapping

From our research, we used affinity mapping to identify common themes and organize them into distinct pain points. We recognized that the remote critique experience is a multi-step process, which we call the “critique journey”: what happens before critique, during critique, and after critique.

By categorizing pain points into the three phases of the “critique journey”, we made sure our solution accounts for each step of the experience.


User-Research Synthesis

From our surveys and interviews, we received many contrasting responses about the experience during critique.

  • Students have varying comfort levels regarding microphone use. One student felt that speaking on mic was nerve-wracking, whereas another disliked when the audience muted themselves.

  • Each professor has a different class structure, which affects the efficiency of critique. Students reported varying experiences with how efficiently critique sessions were run.

  • Students felt the text-based format of critique had both pros and cons. For example, they found the format beneficial when receiving feedback, but more time-consuming when providing it.

We then brainstormed problem statements for each step of the critique journey.


Sketching Solutions

We began to brainstorm possible designs to help alleviate each critical pain point found during our research. Each team member explored solutions to a couple of different problem statements and depicted them with digital sketches.


Sprint #3 - Lo-Fi

Wireframes

Once we finished brainstorming solutions and further analyzing our data, we moved into the low-fidelity prototyping stage of the project. Our main goal was to get a rough working prototype for usability testing early on in the project timeline. My focus was the pre-critique phase, where my main goals were to increase personal connections and ease anxiety before critiques actually began (see my designs below).

We implemented an icebreaker question to create a friendly atmosphere and let users choose their presentation time to erase any ambiguity or anxiety. My team also wanted to offset disorganized documentation and make referencing feedback after critique easier for students, so our wireframes for after critique centralized all the feedback a student received into one document.


Lo-Fi Prototypes

We also wanted to minimize the feeling of awkward silence during critique. To combat this, we decided to implement emotes, which would allow the audience to quickly show the presenter that they are actively engaged during the critique process. The emotes would be visible to all attendees, similar to Instagram Live. We hoped this would help offset longer periods of silence in a fun and encouraging way. We prototyped a simple animation from our wireframes depicting how they would work.


Sprint #4 - User Testing & Mid-Fi

User Testing I

We conducted remote, moderated usability testing and discovered further pain points in each of our lo-fi iterations described above.

  • Before critique - our lobby screens lacked hierarchy, and users did not know where to look or what actions to take.

  • During critique - public emotes actually made students more self-conscious about their work.

  • After critique - there was an edge case where multiple annotations could pile up in the same area.


Mid-Fi Prototypes

From here, we wanted to shift emotes from a performative action to a more static tool: a sticker that students could place onto the artwork to indicate favorite areas. Only the presenter would be able to view these stickers. We felt this would be a better way for students to receive more specific feedback without having to compare emotes between peers.


User Testing II

After another round of user testing, we discovered our emotes were still not functioning quite the way we wanted. Users felt that the sticker function was redundant and unintuitive because it was essentially another form of annotation. We would further address this problem in our hi-fi prototypes.


Sprint #5 - Hi-Fi & Final Presentation

Hi-Fi Prototypes

Now that we had useful feedback from a few rounds of usability testing, we moved into our high fidelity prototyping. Our goal for this sprint was to address the concerns brought up through usability testing and refine the overall aesthetic of our designs.

Before Critique

With both rounds of user feedback in mind, we redesigned the lobby screens and simplified them into two distinct steps: choosing a presentation time and answering the icebreaker question. Simplifying the interactions and using color were the two main ways we addressed the hierarchy issues from our lo-fi prototype.


During Critique

We simplified the emotes so that students react to the overall piece rather than to specific areas. To address our earlier user feedback, we made the emotes more subtle and provided the option to hide the chat and emotes to make them less distracting. When hovering over the icons, the presenter can see the number of reactions each emote is receiving; this information is viewable only by the presenter. We also limited emotes to positive options because we wanted to encourage users to provide more detailed, constructive feedback in the form of written critique.
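Under the hood, the reaction counts the presenter sees on hover amount to a simple per-icon tally that is gated on who is viewing. A minimal TypeScript sketch of that idea; the names, types, and functions here are illustrative assumptions, not our actual implementation:

```typescript
// A single emote reaction sent by an audience member (hypothetical shape).
type Emote = { icon: string; from: string };

// Tally how many reactions each icon has received.
function tallyEmotes(emotes: Emote[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const e of emotes) {
    counts.set(e.icon, (counts.get(e.icon) ?? 0) + 1);
  }
  return counts;
}

// Per the design above, counts are visible only to the presenter;
// everyone else just sees the emotes animate, not the numbers.
function visibleCounts(viewerIsPresenter: boolean, emotes: Emote[]): Map<string, number> | null {
  return viewerIsPresenter ? tallyEmotes(emotes) : null;
}
```

Keeping the tally presenter-only mirrors the hover behavior described above: the audience gets lightweight encouragement, while the presenter gets the aggregate signal.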


After Critique

To solve the overlapping-annotation edge case, we combined annotations from the same area. These annotations overlap by default but expand vertically on click. We also added an option for users to filter between text and hand-drawn annotations to prevent cognitive overload.
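The overlap handling described above boils down to clustering annotations whose anchor points fall close together, then filtering by type. A minimal TypeScript sketch of that logic; the field names and the pixel-radius threshold are illustrative assumptions, not our actual implementation:

```typescript
// One annotation pinned to a point on the artwork (hypothetical shape).
type Annotation = { id: number; x: number; y: number; kind: "text" | "drawing" };

// Group annotations whose anchors fall within a small pixel radius,
// so overlapping notes collapse into one marker that expands on click.
function groupAnnotations(notes: Annotation[], radius = 24): Annotation[][] {
  const groups: Annotation[][] = [];
  for (const note of notes) {
    // Attach to the first existing group whose anchor is within the radius.
    const hit = groups.find(
      (g) => Math.hypot(g[0].x - note.x, g[0].y - note.y) <= radius
    );
    if (hit) hit.push(note);
    else groups.push([note]);
  }
  return groups;
}

// The text vs. hand-drawn filter is then a simple pass over the notes.
function filterByKind(notes: Annotation[], kind: Annotation["kind"]): Annotation[] {
  return notes.filter((n) => n.kind === kind);
}
```

Grouping by proximity rather than by exact coordinates is what makes the "expand on click" interaction possible: each marker on screen represents a cluster, not a single note.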


Final Presentation

Future Changes

If we had more time, we would want to further research and test our product in these ways…

  • Explore and accommodate for various learning styles

  • Improve the professors’ experience with conducting in-class critique

  • Further refine the emotes and test with a large audience

What We Learned

Overall, our team learned how to better design with the user flow in mind and incorporate diverse feedback. We also learned how to conduct face-to-face interviews in a remote setting for research purposes.

>> FINAL PRODUCT

Design Critique

Imagine stepping into a design critique session as a student. First, you select your presentation time and answer a fun icebreaker question while viewing your classmates’ responses. As you review your peers’ work, you can engage with their designs using various tools: react with emotes, leave targeted text annotations directly on specific areas, or sketch ideas with a pencil tool to visually communicate your feedback. You can also provide in-depth written critique through the chat. Once the critique session ends, all conversations and feedback are automatically saved alongside the artwork.

When it’s your turn to present, you’ll receive real-time feedback from your classmates. If the emotes and chat feel distracting, you can easily hide them to focus. Notifications for new messages are highlighted with a purple circle, and you can hover over them to save important notes for later. After the session, all feedback (classmates’ and professor’s comments, annotations, and saved chats) is compiled into a single document. You can filter feedback by type, explore specific annotations, and scroll through the chat history from your critique, giving you a comprehensive view of your progress and insights for improvement.

 