Atomist
A catalog of automations that enables engineers to automate software processes with ease.
1.5 years (2019-Present)
Product definition & design
End-to-end project
Qualitative user testing interviews
Product
Atomist’s pre-built automations, called “Skills”, make it easy for developers to automate work.

The web app UI is the golden nugget and what makes Skills so easy to use: it guides the user through the steps to turn on an automation, and it lets users fully customize their automations the way they want.

Contribution
I’ve worked with the Atomist team for 1.5 years, contributing everywhere from high-level product definition and design all the way down to a pixel-perfect icon set for the automation catalog.
Case study: Wizard design and testing.
Problem
We were focused on problem-solution fit while rapidly iterating on the product, and we could see that our conversion rate on automation enablement was low.

We hypothesized that we needed to minimize friction in the user flow of enabling an automation. I set out to conduct qualitative testing in order to validate our hypotheses and discover pain points.

Core KPIs to consider:

  1. Increase in number of new users
  2. Increase in skill enablement
  3. Increase in monthly active users — our north star metric
A key element in reducing friction in the user flow was designing a wizard experience, guided by this module, which follows the user every step of the way.
Problem Definition & Lightning Brainstorm
Crafted a problem summary and used it to drive a lightning brainstorm with a cross-functional team.
Problem Definition
I facilitated a team discussion to craft a problem definition for testing.

I used this formula to help us identify the user, the user problems we were trying to solve, and what the team believed would happen as a result of a solution.

Lightning Brainstorm
The problem definition helped our team align on testing goals. I also used it to drive a lightning brainstorm session with our cross-functional team of engineering, marketing, product, and design.

We considered various use cases and reviewed several concept ideas for guiding users through a step-by-step process.

As a result of our brainstorm, we narrowed down to a single solution to pursue and test: a “light” wizard.

Live Prototype
Collaborated with engineering, product, and marketing to rapidly iterate on the UI/UX of a live prototype developed by our engineer.
Live prototype: the very first version of the wizard concept, before rapid iteration.
Test Prep & Execution
Single-handedly planned, ran, and analyzed moderated, qualitative user testing with 7 participants.
Test Plan & User Outreach
Crafted a test plan, which documented:
  • Objectives for what we wanted to learn
  • Testing method
  • Testing script, including tasks and questions to cover

I also ran outreach to existing users, which included manually sending personal emails to users signed up with Atomist, tracking responses, and scheduling interview sessions.

Conducted User Testing
I ran 7 testing sessions, each approximately 45-60 minutes long. These were moderated, qualitative tests centered on an open-ended task in our live app, go.atomist.com: exploring, researching, and evaluating automations of interest to the participant, particularly ones that solved a problem they were currently experiencing.

Our objectives included assessing the quality and relevance of content on automation detail pages, learning how users explore, and uncovering the friction, pain points, and questions users encounter in the process, along with what information they expect to find along the way.

Analysis & Synthesis
The open-ended tasks uncovered insights about users’ expectations, questions, and needs, which drove our content analysis and iteration of automation detail pages.

I conducted thematic analysis and synthesis of the feedback and documented themes and key findings. I observed 6 primary themes and 3 secondary themes, plus automation-specific feedback and feedback about our marketing website.

Surprising results:

  • Users didn’t understand what a “skill” was or how skills differed from integrations.
  • Usability problems with the wizard.
  • Users wanted to know the value-add compared to other tools almost immediately.
Design Proposals → Implementation
I turned my findings into proposed action items, prioritized them, grouped them, and shaped those groups into epics of work that we are still tackling today, months later.
Correlate Key Findings to Action Items
I took my key findings, per theme, and proposed corresponding action items to address user pain points uncovered in the testing.

I then prioritized the action items, weighing effort and impact, and identified whether they required strategy, content, or design decisions.

Prioritization & Epics
After review and re-prioritization by the product stakeholders, we grouped the action items in a way that made sense for testing and validation of the changes we would make. These were translated into epics and individual issues, essentially planning months of work ahead.
Design, Implementation, & QA
I closely collaborated with product and marketing to make design decisions throughout the app and website.

As we put more strategy and planning in place, we quickly saw a need for more structured meetings and communication, especially with our team distributed across 4 countries. I started joining the daily engineering standup, and we held frequent design reviews and walk-throughs.

Close collaboration with engineering was also essential, both to get the updates implemented and to conduct proper design QA on everything we shipped.

Results & Reflection
I ran a successful round of user-testing interviews that surfaced recurring patterns and high-quality, actionable feedback. I completed the process end to end, which slightly increased our conversion rate and helped reduce friction for our users. Looking back, there are definitely things I would improve.
End to End
End-to-end product testing and improvement, solo-driven.
What to do differently:
  • Our target users are difficult to reach, so our feedback loops were rare and overly dense, without enough targeted results.
  • We waited too long to get feedback; we should have started sooner with lower-fidelity prototypes.
  • Scope creep increased the delivery time for the epic work.
Client Testimonial
COLLEAGUE
“Jenny is top-notch. She has a creative yet methodical approach to complex problems. She is incredibly skilled at analyzing and articulating complex ideas and does a great job striking the right balance between the big picture and the subtle details. Jenny is a strong advocate for the user and uses data to make informed decisions, and because of this, she’s always looking for ways to push the boundaries of how we can make our product better. She’s an excellent team player, quickly shifting focus from one department to another as we all clamor for her assistance. Jenny is also just a fantastic person to work with; she’s fun, hard-working, and very talented.”
Michelle Urban
VP of Marketing at Atomist
COLLEAGUE
“I work as a web engineer in collaboration with Jenny, who is responsible for the UX and design of our web app. Jenny’s designs always demonstrate both a thorough understanding of the user perspective and meticulous attention to detail. She consistently offers clear explanations and listens carefully to others’ opinions. Jenny is also excellent in discussions with divergent strong opinions, asking pertinent questions and seeking to understand other viewpoints. She researches topics she’s unfamiliar with, filling gaps in both her and the group’s knowledge, which improves decision making. As an engineer, I can rely on Jenny to engage in thoughtful and productive discussion, but ultimately make decisions so that we can move forward. This, coupled with her positive attitude, makes her great to work with.”
Danny Smith
Engineer at Atomist