Case Study

AI Dashboard: Designing Visibility into AI Usage Across the Organization

  • Role: UX Lead

  • Team: Internal client stakeholders, 1–2 design collaborators

  • Duration: Ongoing

  • Scope: Design strategy, early concept testing, prototyping, stakeholder alignment, and validation planning

Some details and visuals have been redacted or adapted to protect proprietary information. The case study emphasizes leadership approach and outcomes over exact designs.

1. Problem Statement

As AI adoption grew internally, leadership needed a way to understand how teams were using AI tools: not just for oversight, but to identify opportunities, trends, and enablement gaps. However, no system existed to track this at a glance.

We were asked to explore how to visualize AI usage data across the company, starting from a basic stakeholder sketch, and transform that into an intuitive, role-appropriate interface.

Strategic Goal:
Provide leadership with a centralized view of how AI tools are being used across the organization, enabling informed decisions around enablement, governance, and cross-team knowledge sharing, while ensuring the solution fits naturally into existing workflows.

A blurred wireframe image labeled “Initial Sketch,” showing the early concept for an internal AI usage dashboard. Sticky notes on the interface capture exploratory stakeholder comments such as “What kind of awards could be?” and “Anything we can do after achievement?” The visual reflects the starting point for transforming a high-level idea into a validated, user-informed product design.

2. Early Concept Testing

Moderated walkthroughs of the initial concept produced the following task-level results:

  • Locate and interpret The Score: 67% success. Most could find it, but had trouble interpreting percentages and comparisons.
  • Navigate to the leaderboard: 0% success. All users mistakenly tried to access it via the Badge area.
  • Access training suggestions: 50% success; only 2 of 3 users answered. Some success, but most shared that the content was visually lost or unengaging.
  • Identify top-used AI tool: 67% success. Generally interpreted well, especially via Teams usage cards.
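
As a sanity check on how figures like these are read, the short sketch below computes per-task success rates from raw pass/fail walkthrough records. The participant data is hypothetical and only illustrates the calculation; it is not the actual study data.

```python
from collections import defaultdict

# Hypothetical walkthrough records: (participant, task, completed?).
# Illustrative only, not the actual study data.
results = [
    ("P1", "Locate and interpret The Score", True),
    ("P2", "Locate and interpret The Score", True),
    ("P3", "Locate and interpret The Score", False),
    ("P1", "Navigate to the leaderboard", False),
    ("P2", "Navigate to the leaderboard", False),
    ("P3", "Navigate to the leaderboard", False),
]

attempts = defaultdict(int)
successes = defaultdict(int)
for _, task, completed in results:
    attempts[task] += 1
    successes[task] += int(completed)

for task, n in attempts.items():
    print(f"{task}: {successes[task] / n:.0%} ({successes[task]}/{n})")
```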

User Suggestions for Improvement
  • Group usage insights by work vs personal topics
  • Visualize time saved using AI vs doing the task manually
  • Add filters for better focus and customization
  • Clarify terminology
  • Improve visual hierarchy, especially in cluttered areas like the suggestions panel
  • Make widgets and icons more interactive and self-explanatory

3. KPIs & Evaluation Summary

This section highlights how early testing informed the usability and satisfaction benchmarks for the dashboard. It also outlines how this research phase laid the foundation for post-launch comparison and ongoing improvement.

After completing moderated concept walkthroughs, participants were asked to fill out a structured post-interview survey. The survey included standardized UX metrics (CSAT and SUS) along with open-ended prompts on clarity, visual hierarchy, and feature usability.

Quantitative Results

1. Customer Satisfaction Score (CSAT)
  • Average: 3.67 / 5
  • Interpretation: Users found value in the dashboard concept, but visual clarity and ease of navigation require improvement.
2. System Usability Scale (SUS)
  • Average: 60.83 / 100
  • Interpretation: Slightly below the industry benchmark of 68, indicating usability friction, particularly in interaction clarity and visual hierarchy.
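
For context on how these numbers are produced, the sketch below applies the standard SUS scoring rule (odd-numbered items contribute the response minus 1, even-numbered items contribute 5 minus the response, summed and multiplied by 2.5) and averages CSAT as the mean 1-to-5 rating. The response data is invented for illustration and is not the study data.

```python
from statistics import mean

def sus_score(responses: list[int]) -> float:
    """Standard SUS scoring for one participant's ten 1-5 answers."""
    raw = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # odd-numbered items vs even-numbered items
        for i, r in enumerate(responses)
    )
    return raw * 2.5  # scales the 0-40 raw sum to a 0-100 score

# Invented responses for three participants, not the actual survey data.
sus_responses = [
    [4, 2, 4, 3, 3, 2, 4, 3, 3, 2],
    [3, 3, 4, 2, 4, 3, 3, 2, 4, 3],
    [4, 2, 3, 3, 3, 2, 4, 2, 3, 3],
]
csat_ratings = [4, 3, 4]  # 1-5 overall satisfaction ratings

print(f"Average SUS:  {mean(sus_score(r) for r in sus_responses):.2f} / 100")
print(f"Average CSAT: {mean(csat_ratings):.2f} / 5")
```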

4. Design Approach

To move quickly from idea to direction, I used a lean, iterative design process grounded in early feedback, low-effort prototyping, and scalable UI design practices.

Tools & Process

  • Started with Figma Make to quickly prototype the concept based on the initial sketch
  • Used these interactive low-fidelity prototypes to align with stakeholders early — before investing in visuals
  • Prioritized clarity of structure, navigation flow, and data hierarchy during this stage

 

Transition to Final Design

  • Once direction was validated, I shifted to high-fidelity design using our internal design system
  • Leveraged existing UI components to ensure consistency and speed up implementation
  • Extended visual patterns where needed, while respecting system tokens and layout rules

Collaboration & Iteration

  • Maintained tight feedback loops with stakeholders and users
  • Adjusted layout and data structure based on real user interpretation of prototypes
  • Iterated based on insights from usability walkthroughs and survey findings
  • Focused on modularity and flexibility to support future integrations (e.g., grouping, filters, personalization)

Design Principles

  1. Move fast with clarity: early interactive prototypes helped unblock discussions
  2. Design for evolution, not perfection: structure was prioritized over polish early on
  3. Balance feasibility and vision: aligning user needs with stakeholder expectations while staying technically grounded

5. Current Status and Next Steps

The UI is finalized, reflecting both stakeholder direction and early user feedback.

We are now:

  • Preparing to test the prototype on UserTesting.com
  • Fielding a company-wide survey to inform final validation scenarios
  • Awaiting broad testing results before finalizing the MVP direction

We decided not to over-invest in polishing the current version; instead, we are embedding key dashboard concepts into upcoming feature designs, where the usage context is clearer and more organic.

Metrics I plan to track (and know how to measure):

  • Customer Satisfaction (CSAT), to compare against the previous version
  • System Usability Scale (SUS), to compare against the previous version
  • Success rate and time spent on each task
  • Active User Growth
  • Feature Adoption Rate
  • Usage Frequency
  • Drop-off Rate
  • Conversion to Power Users

Metrics that would be nice to track (need help to set up):
  • Time Saved per Task
  • Productivity Gain
  • AI Efficiency Uplift

 

Why

This will help me connect user experience outcomes with business metrics, showing how improvements in usability and satisfaction translate into higher engagement, faster adoption, and greater efficiency. It will also help us focus the next updates on the right areas.

1. Customer Satisfaction → Engagement & Retention

Logic: Happier users engage more often and are less likely to churn.

  • Active User Growth: % increase in weekly/monthly active users after launch
  • Feature Adoption Rate: % of users actively using AI-related features (e.g., AI insights, automation tools)
  • Usage Frequency: Avg. number of sessions per user per week
2. System Usability Scale (SUS) → Adoption & Task Completion

Logic: When a system feels easy to use, people explore more and adopt features faster.

  • Drop-off Rate: % of users abandoning workflows mid-task
  • Conversion to Power Users: % of users using advanced AI features weekly
3. Task Success Rate & Time on Task → Productivity & Efficiency

Logic: Faster, more successful task completion translates directly to business efficiency.

  • Time Saved per Task: Avg. reduction in task completion time (e.g., minutes saved)
  • Productivity Gain: Aggregate time saved across all users, converted to FTE hours
  • AI Efficiency Uplift: % improvement in speed when users leverage AI features vs manual workflows
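
To make these connections actionable, the sketch below shows one way a few of the listed metrics could be derived from product usage events. The event schema, field names, and sample values are assumptions made for illustration, not an agreed instrumentation plan.

```python
from dataclasses import dataclass

@dataclass
class TaskEvent:
    """One workflow attempt. Fields and values are illustrative assumptions."""
    user: str
    used_ai: bool     # whether the AI-assisted path was used
    completed: bool   # whether the workflow was finished
    minutes: float    # time spent on the task

events = [
    TaskEvent("u1", used_ai=True,  completed=True,  minutes=6.0),
    TaskEvent("u1", used_ai=False, completed=True,  minutes=14.0),
    TaskEvent("u2", used_ai=True,  completed=True,  minutes=7.5),
    TaskEvent("u3", used_ai=False, completed=False, minutes=10.0),
]

# Drop-off Rate: share of workflows abandoned mid-task
drop_off_rate = sum(not e.completed for e in events) / len(events)

# Feature Adoption Rate: share of users who used an AI feature at least once
all_users = {e.user for e in events}
ai_users = {e.user for e in events if e.used_ai}
feature_adoption = len(ai_users) / len(all_users)

# Average completion time for AI-assisted vs manual runs
def avg_minutes(use_ai: bool) -> float:
    times = [e.minutes for e in events if e.used_ai == use_ai and e.completed]
    return sum(times) / len(times)

# AI Efficiency Uplift: % improvement in speed when using AI vs manual workflows
uplift = (avg_minutes(False) - avg_minutes(True)) / avg_minutes(False)

# Productivity Gain: minutes saved on AI-assisted runs vs the manual average,
# expressed in hours (a stand-in for a fuller FTE-hours conversion)
minutes_saved = sum(avg_minutes(False) - e.minutes
                    for e in events if e.used_ai and e.completed)
productivity_gain_hours = minutes_saved / 60

print(f"Drop-off rate:        {drop_off_rate:.0%}")
print(f"Feature adoption:     {feature_adoption:.0%}")
print(f"AI efficiency uplift: {uplift:.0%}")
print(f"Productivity gain:    {productivity_gain_hours:.2f} hours saved")
```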