Case study · CloudHealth by VMware

CloudHealth by VMware — multicloud, day one.

Strategy, research, and design lead for the CloudHealth Cloud Management Experience — a live multicloud platform that unifies AWS, Azure, and GCP cost and operations into one product and delivers value on the first login, with no setup. A rebuild of a sophisticated tool around a single rule: stop asking the user to earn the value; put the value where the user lands.

Role
Strategy · User Research · UX/UI · Interaction Design
Engagement
Multi-year program · CloudHealth → VMware
Surface
Multicloud platform · AWS · Azure · GCP
Status
Live · now part of Broadcom
CloudHealth Cloud Management Experience — desktop and mobile views of the multicloud Cost Overview
The CloudHealth Cloud Management Experience — multicloud cost, security, and operations in one platform, captured from the live rebuild. Toggle to view in light or dark mode.
// tl;dr

CloudHealth was a sophisticated platform that asked too much of new users on day one. We rebuilt the experience around a single rule: deliver value in the first session, no setup, no training, no support call. Six concrete moves carried it — zero-click insights, role-curated content, off-canvas detail, pre-computed actions, goal-led navigation, and recommendation-next-to-action — and the redesign became the playbook for the rest of the platform.

The challenge

CloudHealth (acquired by VMware, now part of Broadcom) is a leading public cloud management platform — and a sophisticated one. By the time I came in, it had grown into a power tool for cloud administrators. The downside of that evolution: new users couldn't get value out of it without extensive onboarding. They'd sign in, struggle, schedule a training session, and only after a support call did the platform start paying back the ticket price.

Our post-onboarding survey told the story plainly. Only 30% of new users called the platform "Empowering." Only 56% called it "Intuitive." The product was powerful and people knew it. They just couldn't get to the power.

Project goals


Research approach

We used a mixed-methods program — competitive analysis, customer interviews, brainstorming workshops, prototype testing, personas, end-to-end customer journey maps, and information architecture work. The questions we kept coming back to:

What we asked // framing

  • Which design and system changes will have the greatest impact on time-to-value?
  • What obstacles block new-user adoption today?
  • Which data could we surface earlier in the experience?
  • Which tasks could we automate or eliminate to deliver value sooner?
  • How do customers actually interact with the app on a daily basis?

How we found out // methods

  • Customer interviews across roles and seniority levels.
  • Brainstorming workshops with stakeholders, ranked on an impact-effort matrix.
  • Competitive analysis of the multicloud management category.
  • Personas built from interview data and re-validated with the field.
  • End-to-end customer journey maps to visualize friction.
  • Sketch → wireframe → high-fi prototype → user-test loops in Figma and InVision.

Personas

Three personas shaped the design. The primary kept us honest: if Jenny the Cloud Analyst couldn't get to value in her first session, the redesign hadn't shipped.

Primary persona

Jenny — Cloud Analyst, Day One

Jenny signs in to the platform on her first day at work. She tries to build a report on AWS Reserved Instance savings. After ten minutes of clicking around, she calls support — and finds out she needs to sign up for training before the platform will start helping her.

"As a Cloud Analyst, I have several internal and external tools at my disposal. I don't have the time for exhaustive training to learn another tool to do my job effectively."
Three persona cards: Jenny the Cloud Analyst, Jason the Cloud Admin, Shauna the Cloud Ops lead — each with a description and tools list
Persona cards we kept on the wall. Jenny is the one we built for first.

Workshops & the customer journey

I ran stakeholder workshops to surface needs and roadblocks early. We used an impact-effort matrix to rank feature proposals so the team could sequence them honestly — the easy-and-impactful ones first, the moonshots tagged for later.

Impact-effort matrix plotting feature ideas in four quadrants — quick wins, big bets, fill-in, thankless — color-coded by priority tier
Impact-effort matrix from the stakeholder workshops. Top-left was where we started — high impact, low effort.
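The ranking method is simple enough to sketch. This is a hypothetical illustration of how workshop ideas fall into the four quadrants named above — the scores and idea names here are invented for the example, not the actual workshop data:

```python
# Hypothetical sketch: sorting workshop ideas into impact-effort quadrants.
# Scores (1-5) and idea names are illustrative, not the real workshop output.

def quadrant(impact: int, effort: int, midpoint: int = 3) -> str:
    """Classify an idea by which quadrant of the matrix it lands in."""
    if impact >= midpoint and effort < midpoint:
        return "quick win"       # high impact, low effort: start here
    if impact >= midpoint:
        return "big bet"         # high impact, high effort: the moonshots
    if effort < midpoint:
        return "fill-in"         # low impact, low effort: do when idle
    return "thankless"           # low impact, high effort: avoid

ideas = [
    ("Zero-click overview insights", 5, 2),
    ("Off-canvas detail drawer",     4, 2),
    ("Natural-language search",      5, 5),
    ("Per-provider chart theming",   2, 4),
]

for name, impact, effort in ideas:
    print(f"{name}: {quadrant(impact, effort)}")
```

The point of the exercise wasn't the scores themselves — it was forcing every proposal into the same two axes so sequencing became an honest conversation instead of a popularity contest.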

From the interviews and workshops, I built an end-to-end customer journey map. Each friction point in the map became a candidate feature for the redesign — and a place where, in hindsight, an agent could now do what we did with rules.

End-to-end customer journey map across five stages: Discover, Try, Buy/Adopt, Use, Grow/Consumption — with rows for user goals, personas, touch points, pain points, emotions, ideas, and metrics
The end-to-end journey map. Friction points became feature candidates.

Research insights drove the redesign

We synthesized the research with affinity mapping, sketched a new direction, built a high-fidelity prototype, and re-tested with users to validate. The redesign came back to the same handful of moves, over and over.

Before Testing
CloudHealth dashboard before testing — all eight research insights annotated 1 2 3 4 5 6 7 8
What testing surfaced
  1. The download action was not as popular as we thought.
  2. Instead of total spend, show the most impactful top cost drivers.
  3. Opportunities by month were not as important as we thought.
  4. Branded colors in the chart were misleading.
  5. The detail was used twice as often when we changed it to an off-canvas panel.
  6. Security-focused users did not like seeing cost and security mixed in one view.
  7. Hero report was seen as a powerful motivator, but admins thought it would be difficult to define success.
  8. We found the efficiency score was very important, so we made it more prominent.
After Testing
CloudHealth dashboard after testing — clean redesign with annotations resolved
What shipped

What we removed or quieted. The Download CTA, which testing showed was barely used, dropped from a primary button to a quiet icon link. Opportunities-by-Month was retired entirely — a trailing 12-month chart never beat current-state at a glance. Branded sparkline colors gave way to a single neutral CloudHealth blue once we saw users reading orange and green as positive/negative signals instead of as provider identity. Security was lifted out of the cost view because security-focused users told us the mix was friction, not context.

What we promoted. Total spend was reframed around the top cost drivers — the aggregate number wasn't what anyone made decisions from. The detail panel that crowded the canvas moved to an off-canvas drawer and was used twice as often. The hero report tested as a strong motivator, but admins worried about defining success, so it shipped as an opt-in module rather than the page's lead. The efficiency score that had been buried in the rail became the page's lead metric.


What shipped — six moves that made the redesign

The new design moved from low-fi wireframe to hi-fi prototype the moment the research had landed. Layout, IA, and the day-one content slots came first; visual treatment and motion came after.

Side-by-side wireframe and high-fidelity rendering of the Cost Overview, showing the redesign moving from grayscale layout to the polished CloudHealth UI
Wireframe to hi-fi. Same layout, same content slots — the structure was settled in low-fi before the visual layer landed on top.

1. A new top-level overview experience

Overview landing pages became the user's first stop on every login. Previously hidden, hard-to-calculate data — savings opportunities, efficiency scores, anomaly counts — sat right at the top, no setup required.

Cost Overview landing page from the CloudHealth rebuild
Top-level overview. The first thing on the page is the most useful thing on the page.

2. Off-canvas drilldown that does the work for you

Users wanted to drill into the opportunities surfaced on the overview without navigating away. Our first attempt put detail persistently on the page and promptly created visual noise nobody could parse. We moved detail to an off-canvas drawer that opened on demand — but kept going past "drilldown." Each panel reads like an analyst's brief: a plain summary, the metrics that matter, and the next thing to do. The right rail isn't showing you data, it's handing you a draft.

Every metric panel is shaped to its question. AWS cost surfaces a numbered action plan ("three plays I'd run this week"). Azure cost flips into an auto-fix card with an Approve / Skip pair. Public exposure traces a chronological root cause from a Sep 12 Terraform apply to the buckets that lost their policy 90 seconds later. Latency spikes get a minute-by-minute timeline with the deploy that caused them. Approval-heavy metrics like Automation Actions show a queue of pending changes with bulk approve. None of these are tabs the user has to find — they're already in the right shape when the rail opens.

Cost Overview with the AWS detail rail open. The rail shows a Quick Answer summary, four metric tiles, and a numbered three-step action plan (Rightsize 1,675 EC2 instances · Convert cold S3 to Glacier IR · Renew Compute Savings Plans) with a Stage All Three button
Click the AWS card on the Overview and the rail offers a draft, not a dashboard. Each step has its impact in dollars and a one-click "stage" — the assistant did the analysis; the user provides the approval.
Security · Public exposure rail showing a chronological root-cause timeline. Sep 12 14:32 Terraform apply on prod-network · 14:34 two buckets lost their policy · Sep 13 09:00 posture score dropped 1.3 points · Sep 14 04:11 exposure detected. Below: a re-apply locked policy module autofix with Stage Terraform Plan / Show Diff buttons.
Public exposure on Security. Same off-canvas pattern, different brief: a root-cause timeline, then a Terraform plan the system already drafted to put both buckets back behind the locked policy.

3. Improved navigation

We rebuilt navigation around the user's job, not the product's catalog. The IA stopped asking "where do you want to go" and started asking "what are you trying to do."

Four section sub-navigations stacked: Cost, Security, Performance, Automation, each with its own contextual sub-items
Goal-led navigation. Each business domain — Cost, Security, Performance, Automation — has its own contextual sub-nav. Less catalog, more verbs.

4. Zero-click cloud insights & actions

High-impact optimization opportunities appeared by default for every user — efficiency scores, top recommendations, one-click actions to remediate. No filters to set, no reports to build, no training to take. The smartest thing on the page was already on the page.

Recommendations page showing $184,727 in available potential monthly savings, broken down by Rightsizing, Reservations & CUDs, and Idle & orphaned, with a list of one-click apply / review / resolve actions
Zero-click insights. $184,727 / mo in pre-computed savings opportunities, with the action button right next to the diagnosis. The pattern that, eight years later, became the AI Recommendation Banner.

5. A search bar that answers, not just finds

Search lived in the top nav, but clicking it stopped looking like search. The icon expanded into a dark inline field that took the user's question — phrased in plain English — and the dropdown that opened underneath returned an answer first, links second. The empty state offered four typed-out questions worth asking ("Where are my biggest savings opportunities?", "Which security findings should I fix first?"). A typed query routed across nine topic models — cost, savings, accounts, budget, anomalies, security, AWS, Azure, GCP — and produced a one-paragraph summary plus four metric tiles tuned to the question. Below the answer, matched pages and related insights kept the navigation use case alive.

For "show me the AWS accounts trending over budget" the panel returns the two budgets that breached, the dollar overage, the account that will breach next, and a forecast date — before the user has hit enter. The same query a year earlier would have asked the user to pick a report, then a filter, then a column.

Two-up showing the search dropdown's two states. Left (Empty state): the dropdown right after clicking the search icon, listing four typed-out prompts under 'Try asking', plus 'Recently viewed' and 'Insights worth your attention'. A gray arrow connects to the right (Answered state): the same dropdown with 'Show me the AWS accounts trending over budget' typed into the dark inline header field, returning a Quick Answer card with a one-paragraph summary and four metric tiles.
Empty → answered. Left: the dropdown right after the icon click — four worth-asking prompts and a stack of insights worth the user's attention. Right: a typed question gets a one-paragraph brief and four metric tiles tuned to the query, before the user has hit enter.
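The routing behind answer-first search can be sketched in a few lines. This is a deliberately minimal, hypothetical version — matching a query to a topic handler by keyword overlap, then returning a summary before any links. The topic names and phrasing are illustrative; the shipped models were richer than a bag-of-words match:

```python
# Hypothetical sketch of answer-first search routing. A query is matched to a
# topic by keyword overlap; the handler returns a summary first, links second.
# Topics and copy here are illustrative, not the shipped nine topic models.

TOPIC_KEYWORDS = {
    "budget":   {"budget", "over", "forecast"},
    "savings":  {"savings", "save", "rightsize"},
    "security": {"security", "exposure", "findings"},
}

def route(query: str) -> str:
    """Pick the topic whose keyword set overlaps the query the most."""
    words = set(query.lower().split())
    best = max(TOPIC_KEYWORDS, key=lambda t: len(TOPIC_KEYWORDS[t] & words))
    return best if TOPIC_KEYWORDS[best] & words else "general"

def answer(query: str) -> dict:
    """Answer-first response shape: summary leads, navigation follows."""
    topic = route(query)
    return {
        "topic": topic,
        "summary": f"Quick answer drawn from the {topic} model.",  # answer first
        "links": [f"/reports/{topic}"],                            # links second
    }

print(answer("show me the AWS accounts trending over budget")["topic"])  # → budget
```

The design choice that mattered wasn't the matching — it was the response shape: the summary is the first field, and the links exist to serve the users who still wanted search-as-navigation.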

6. Accessibility & mobile responsiveness, designed in

Accessibility and responsive layout weren't a "phase two" — they were in the original spec, the original tokens, the original component library. The Elevate Design System we'd already shipped at CloudHealth made this affordable to do correctly.

Desktop browser window showing the full Cost Overview, with a phone in the bottom-right corner showing the same page in mobile responsive view
Designed accessible from day one. WCAG, keyboard nav, mobile parity. Same data, same actions, both sized for whichever screen the user landed on.

The principles underneath

Six concrete moves are easy to inventory. The principles underneath them are the harder thing to name — and the ones the rest of the team kept reaching for as the redesign spread to other workflows.

P_01

Earn day one

The smartest thing on the page is already on the page. No setup wizard, no empty state, no "configure to get started."

P_02

Curate to the role

An analyst is not an admin is not a finance partner. The default view matches the role; everything else hides until asked for.

P_03

Pre-compute the answer

If a question gets asked every Monday, answer it before the user shows up. Move the work from the user's morning to the platform's overnight.
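A minimal sketch of this principle, with hypothetical function names and placeholder data — an offline job writes the Monday-morning numbers into a cache so the first page load is a read, not a query:

```python
# Hypothetical sketch of "pre-compute the answer": an overnight job fills a
# cache so the day-one landing page is a read, not a query. Function names,
# the metric list, and the account IDs are illustrative.

from datetime import datetime, timezone

CACHE: dict[str, dict] = {}

def nightly_precompute(accounts: list[str]) -> None:
    """Runs offline, after billing data lands; users never wait on this."""
    for account in accounts:
        CACHE[account] = {
            "top_cost_drivers": ["EC2", "S3", "RDS"],   # placeholder analysis
            "monthly_savings_usd": 184_727,             # figure from the case study
            "computed_at": datetime.now(timezone.utc).isoformat(),
        }

def overview(account: str) -> dict:
    """Day-one landing page: a cache read, zero setup, zero clicks."""
    return CACHE.get(account, {"summary": "computing…"})

nightly_precompute(["acme-prod"])
print(overview("acme-prod")["monthly_savings_usd"])  # → 184727
```

The trade is deliberate: compute cycles are cheap overnight and user attention is expensive on Monday morning, so the platform absorbs the cost.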

P_04

Diagnosis next to action

Every problem the platform surfaces travels with the way to fix it. No dead-end alerts, no "go find someone who can do this."

These four sit underneath every screen in the redesign. They are also the part of the work I find myself going back to whenever I'm stuck on a new product — different domain, different stack, same questions.


Measuring success

We tracked both qualitative and quantitative metrics. The interviews told us if the new experience felt right; the numbers told us if it actually worked.

+45%
Increase in users rating CloudHealth as "Intuitive."
+60%
Increase in users rating CloudHealth as "Empowering."
−15 min
Time-to-completion drop on common tasks like AWS Rightsizing.

Beyond the numbers, the redesign got picked up as the new standard for priority workflows across CloudHealth — Northstar Dashboards, Cost Reports, AWS Savings Plans, Instance Rightsizing, Discount Manager, Anomaly Detection, API Test Access, Perspective Management, Partner Billing, and CloudHealth One. One redesign that became the playbook for the rest of the platform.

Measuring success — qualitative sentiment shift bars on the left (Intuitive +25 pts, Empowering +60 pts, Easy to learn +36 pts) and quantitative outcome cards on the right (+45% Intuitive rating, +60% Empowering rating, −15 min on common tasks)
Measuring success — qualitative interviews and quantitative usage targets, side by side.

Lessons learned

Persistence and patience

This project was funded for development and then deprioritized on the roadmap — twice. It wasn't until we presented the research and the value of the improved experience to wider audiences that the project finally got its real funding. Sometimes the right move is to keep telling the story until the right ears hear it.

Use meaningful data points early on

Projects often don't have an obvious ROI or a way to articulate success. Find data that expresses the value of the improvements in a language stakeholders already speak. The "Empowering" / "Intuitive" rating delta sold this redesign more than any heatmap or flow diagram could.

And one quieter lesson, the one I take with me to every product after this: get the principles right and the surface stops being the hard part. Earn day one, curate to the role, pre-compute the answer, put the diagnosis next to the action — once those rules are in the bones of the system, individual screens get easier and the whole product gets clearer.