📈 User Experience Design: Case Studies
Three examples of contributing to, guiding, and shipping projects.
1. Pattern Library Cuts Days Off Build Time
- My agency built sites mostly from scratch. Devs and designers each relied on their own personal methods of reusing code and design for efficiency.
- Designers delivered desktop mockups only, so devs were responsible for all responsive breakdowns. Because many design decisions were made by non-designers, the mobile UX wasn't optimal.
- I was excited 🤩 and drooly 🤤 about Atomic Design and pitched wireframes and a vision to the tech team explaining how a design system would improve our reuse methods. To be clear, each site already had a mini design system of its own, comprised of reusable elements like colors, buttons, and styles. The agency design system model would make building those mini systems easier, similar to the WYSIWYG of Squarespace.
- It didn't catch on, but the spark surfaced a need to unify the names of every single site piece. A new committee and I created a company-wide glossary, so everyone knew what you meant when you said "Hero" and we banished the use of "Banner."
- A year later, the company was ready to implement. My team and I designed three responsive views for each piece of functionality, based on existing sites.
- Since I understand front-end code, I acted as the conduit to development during build.
- Months later we had a working library of 100+ patterns that dev and design could leverage to reuse work.
- We had a process for library maintenance (retiring and adding new patterns).
- It decreased dev and design build time by a day.
- It improved our mobile UX because each pattern included breakdowns for three responsive views.
Lesson learned: Ideas turn into realities much faster when you have top-down approval. Keep pitching until one gets prioritized.
2. Improved Outdated Landing Page Designs with Research
- Many of our clients used landing pages to drive traffic from various marketing channels. Every landing page was built on a single template which was outdated, not optimized for conversion, and clunky to launch.
- The company green-lit improving this system with the goal of having pages designed and built with specific purposes in mind as well as leveraging existing data.
- I sat on the research team converting findings into visuals.
- Our research team listened to customer calls to determine common buyer personas.
- Together, we created four personas and told user-stories which tied each persona to various types of landing pages.
- Based on our findings, I designed a low-fi wireframe that satisfied our personas and the dynamic features the system was receiving. I leaned on internal and industry best practices for landing pages.
- For example, someone with a plumbing emergency needs to know the company can fix the problem fast, as opposed to someone considering a new heating system, who may want to learn a lot about the system and the company installing it.
- In response to feedback from stakeholders, I designed higher fidelity wireframes displaying more data.
- Once approved, the design team and I retrofit existing clients to the new system and created processes to get new clients up and running.
- The new landing page system launched.
Lesson: This project was a challenge across the board because no one person owned it, so there was no authority to go to with questions. It launched and was a success, but it was a serious learning experience for the company about project management.
3. Data-Team Creates An AB Testing Protocol & Gets A Couple Wins
As a new team, we had no processes yet, just the dream of using more data at our company. On it sat an SEO specialist, a developer, an SEM specialist, and le designer (moi).
Actions / Results
Processes! I led the following:
- Determined platforms for AB testing, usability testing, and heatmapping.
- Created and implemented workflows.
- Answered questions as they came in by running tests or analyzing data.
Proved a reduced word-count doesn't ruin SEO!
- I helped strategize and run an AB test of existing content length vs. shorter content length. Our hypothesis was that fewer words wouldn't hurt conversion.
- Our page-views were low, so we bundled the test across 5 clients for more data.
- The UX improved because no one likes reading an SEO-stuffed page; we want our question answered. Conversions improved. And SEO didn't suffer.
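For readers curious how a pooled test like this gets evaluated, below is a minimal sketch of a two-proportion z-test using only the Python standard library. The visitor and conversion counts are hypothetical placeholders, not our actual client numbers.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test; returns (z, p_value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pool the conversion rate under the null hypothesis (no difference).
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical pooled numbers across 5 client sites:
# variant A = existing long copy, variant B = shorter copy.
z, p = two_proportion_z_test(conv_a=120, n_a=4000, conv_b=138, n_b=4100)
print(f"z = {z:.2f}, p = {p:.3f}")  # a p above 0.05 means no significant drop
```

Pooling the clients simply grows `n_a` and `n_b`, shrinking the standard error enough for the test to say anything at all.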
Retired a button strategy because of data!
- Through research, we learned that a flip-button method we used didn't follow industry best practices. These multi-state elements flip over to display new info on their back. What was wrong with them?
- They hide conversion opportunities.
- They require a mouse-hover on desktop, but on touchscreens, they lack any indication they're flippable. So people get surprised, which makes them feel less in control. A no-no.
- Clients liked seeing the flipping animation more than they considered what links went on the back. Some instances wound up with a dump of links that didn't consider the user.
Learned our hero buttons convert better when they go to a service page rather than a form page!
- The buttons had variations ranging from simply stating a service, like "Air Conditioning," to the more conversational "I need Air Conditioning."
- Using 3 research methods (heatmaps, click-tracking, and analytics research), I determined that:
- In general, the hero buttons get a good amount of engagement, so we shouldn't stop using them altogether.
- We learned from click-tracking that we should stop sending people to forms because service pages converted better. This may be because someone ends up on a form page too early in the process, before getting a chance to research whether the company is a good fit. It's also not clear you're heading to a form. Again, we don't want surprises.
Moved field titles outside of the text-field box!
- Putting field titles where placeholder text should go does save vertical space and reduces some clutter, but we decided the cons outweigh the aesthetic benefit.
- Cons: As the viewer begins typing a response in the form field, the prompt disappears. If there's an error on submit, the title is no longer present, which may confuse someone and deter them from submitting the form.
Lesson: AB tests need A LOT of data for statistical significance. We ended up canceling a number of experiments due to a lack of it. Our solution was to use aggregate client data and to be strategic when deciding to use AB testing.
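To illustrate just how much data "A LOT" can be, here's a rough sample-size sketch using the standard two-proportion power calculation. The 3% baseline conversion rate and 20% relative lift are hypothetical, chosen only to show the scale.

```python
from math import ceil, sqrt
from statistics import NormalDist

def sample_size_per_variant(base_rate, mde, alpha=0.05, power=0.8):
    """Visitors needed per variant to detect a relative lift `mde`
    over `base_rate` with a two-sided two-proportion z-test."""
    nd = NormalDist()
    z_alpha = nd.inv_cdf(1 - alpha / 2)  # significance threshold
    z_beta = nd.inv_cdf(power)           # power requirement
    p1, p2 = base_rate, base_rate * (1 + mde)
    p_bar = (p1 + p2) / 2
    n = ((z_alpha * sqrt(2 * p_bar * (1 - p_bar))
          + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
         / (p1 - p2) ** 2)
    return ceil(n)

# At a 3% baseline, detecting a 20% relative lift takes roughly
# 14k visitors per variant -- which is why low-traffic sites had
# to pool data or skip AB testing entirely.
print(sample_size_per_variant(base_rate=0.03, mde=0.20))
```

Smaller effects or lower baseline rates push the requirement even higher, which is exactly the wall our low-traffic client pages ran into.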
Please don't share this URL. Some of the info is proprietary. Thanks!