I joined Degreed to lead design for its B2B and enterprise product. Degreed is a progressive EdTech company focused on helping people be recognized for all of their learning and skills. After almost two years, I transitioned into the role of Product Manager for the same team to drive our enterprise offerings.
While most of my work from Degreed is under NDA, this case study covers my work designing a reporting ecosystem for our enterprise product.
What might a reporting ecosystem consist of that generates engagement paths back into the product?
After several rounds of testing and iteration, a robust reporting ecosystem became a flagship feature within our enterprise product, giving customers the information they needed to provide better content and learning paths to their employees (our end users).
One of the challenges of enterprise design is the separation between you and your end users. To bridge this gap, I led a brief research project to better understand users of all types.
Through existing research and interviews I conducted with our customer success and sales teams, I identified customer and user archetypes that helped kickstart conversations about our product initiatives and goals early in the design process. These personas helped surface assumptions and constraints, and served as a tool to contextualize direct research and usability studies later in the process.
Another challenge I faced was the volume and frequency of information. Working on a B2B product, input flows in constantly from internal and external stakeholders in the form of complaints, ideas, feedback, and research. I used tools like Trello, Dropbox Paper, and Productboard to track these inputs by theme. This helped me organize the firehose and identify synthesized problems (and who to talk with to dig deeper).
From my research related to reporting, I identified the common and major customer pains to solve for:
Armed with the context for initial exploration, I started my process on paper and at the whiteboard. I often begin by writing down questions from a user’s perspective that I believe the person would want answered by interacting with the product. I also note edge cases I encounter, or different scenarios where someone might use the solution.
This helps me to frame the problem from the perspective of different personas or people I know use the app. I use different approaches to mental modeling, depending on the situation, sometimes deconstructing a full experience map, other times using cartoons to visualize the emotional side of the problem.
Once I've established a general direction, I work in low-fidelity to explore interaction patterns and architecture. Degreed is a distributed team, which makes transparency during this phase particularly important. I often export my wireframes into interactive prototypes to share with stakeholders early in the process to stay aligned.
Once I had an established direction, I moved into Sketch and began to lay out what the whole flow would look like, accounting for the edge cases I identified in my initial brainstorming session. I often approach this by creating artboards for each step in the flow (as a large-scale IA map) before I visually mock up anything, so I keep myself focused on the full interactive experience as I work.
At this point I start to break down the stories into interaction journeys. I begin very lo-fi, and gradually shake out a few directions to explore in higher fidelity that I will take into Sketch, getting feedback from my team and stakeholders along the way.
Building prototypes in InVision and Principle often helps me explore concepts and get feedback from stakeholders. Few things communicate like a gif or video.
By this point, I had established an assumption of what I thought the interface should look like in order to best balance internal constraints with customer value.
After creating a high-fidelity comp, I worked with the product manager to set up meetings with key customers to get their feedback. The reporting interface was a tool for the customer: value would only be realized if they could use it to take action on content and individual performance in ways that helped them reach their internal goals. Therefore it was critical that testing take place well before we got close to implementation.
After several initial rounds of testing, I discovered that some of the visual choices I had made were not clear in use. For example, while I had simplified some charts, our customers communicated that the categories alone were only useful if they were placed in context. I explored several additional iterations and presented them to customers for feedback.
Throughout this process, I worked closely with a data engineer to produce prototypes of charts using real customer data. Our partnership allowed customers to view their actual results, which ensured their feedback was based on real-world scenarios and not just reactions to the visual design.
When I joined Degreed, one of the first projects I took on was to simplify our styleguide and pattern library. We have since gone through a redesign, but the initial work continues to influence how our pattern library is organized and how we track for consistency.
I started by going through the entire app and creating an inventory of the variations of each element. (This approach is outlined by Brad Frost as a step towards Atomic Design.) This helped the team visualize the variance in our patterns and drove conversation to simplify our buttons, color palette, and form library.
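An inventory like this can be partially automated. As a rough sketch of the idea (the stylesheet, selectors, and hex values below are hypothetical stand-ins, not Degreed's actual CSS), a short script can normalize and count the color values used across a codebase to show how many near-duplicate variants exist:

```python
import re
from collections import Counter

# Hypothetical stylesheet snippet standing in for a real app's CSS.
css = """
.btn-primary { color: #FFFFFF; background: #2A6BD4; }
.btn-save    { color: #fff;    background: #2a6bd4; }
.btn-danger  { color: #ffffff; background: #D43A2A; }
"""

def normalize(hex_color):
    # Collapse #fff, #FFFFFF, and #ffffff into a single canonical form.
    h = hex_color.lstrip("#").lower()
    if len(h) == 3:
        h = "".join(ch * 2 for ch in h)
    return "#" + h

# Count every hex color declaration after normalization.
colors = Counter(normalize(m) for m in re.findall(r"#[0-9a-fA-F]{3,6}", css))

for color, count in colors.most_common():
    print(color, count)
```

Running an audit like this against real stylesheets makes the variance concrete: three spellings of white and two of the same blue collapse into single entries, which is exactly the kind of evidence that drives a conversation about consolidating the palette.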
I worked with the other designers and the front-end developers to land on consistent styles for each element. As we narrowed down common styles, I collected those in a structured styleguide that the design team and front-end developers could reference.
Once I started working on the visual design for reporting, I realized that we needed to expand our color palette. Our base colors would not allow for the range of data visualizations I was creating.
Working with the constraints of our existing brand colors, I began to explore additional colors using the charts as a base. I had to stay in close partnership with the design team, as we were simultaneously defining colors by purpose while ensuring they met accessibility standards in use.
Using the app Sim Daltonism, I tested each variation against different forms of color blindness to create an accessible pattern that could apply to our entire chart library.
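Alongside simulation tools like Sim Daltonism, contrast can be checked numerically. The sketch below implements the WCAG 2.x relative luminance and contrast ratio formulas (the specific color values tested are illustrative, not the actual palette):

```python
def channel(c8):
    # Convert an 8-bit sRGB channel to its linear value, per WCAG 2.x.
    c = c8 / 255
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def luminance(rgb):
    # Relative luminance: weighted sum of the linearized channels.
    r, g, b = (channel(v) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    # WCAG contrast ratio: (L_lighter + 0.05) / (L_darker + 0.05).
    l1, l2 = sorted((luminance(fg), luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black on white is the maximum possible ratio, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

WCAG AA requires at least 4.5:1 for normal text and 3:1 for large text and graphical objects, so a check like this can gate every proposed chart color against its background before it ever reaches a simulation tool.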
Using findings from early testing, we determined what a minimum viable product needed to include. This would allow our customers to access enough of their data to give us a breadth of feedback and analytics to improve the product, balanced against technical resources on our end.
Future iterations included downloadable report generation, expanded filters, “drill down” insights going into detail on individual charts, a refined calendar widget, and more.
While the initial reporting launch revealed weaknesses in use, the early testing and open feedback process reduced waste and technical debt and set us up to succeed with iterative releases. Reporting was a major differentiator, and its placement in the product was a catalyst to the growth of our enterprise business.