
Case Study Presentation

Democratizing Machine Learning for Everyone

Anuja Harsha — Lead Product Designer

ML Functions · Cloud Software Group — WebFOCUS

Case Study 002


I came in with zero ML background and got MIT certified to do this work. Over two years, I navigated engineering delays, layoffs, and resource constraints — leading design through it all.

Role & Expertise

Lead Product Designer (End-to-End Ownership)

Timeline

Jan 2023 – Jan 2025

SHIPPING 2026

Company

Cloud Software Group — WebFOCUS

Act I

Zero ML Knowledge. So I Got MIT Certified.

Turned a knowledge gap into an advantage. MIT certified. Months learning before designing.

Zero ML background could have been a liability — instead, it became an advantage. I enrolled in MIT's AI/ML product design certification and embedded weekly with our Principal Data Scientist.
By the time I started designing, I understood the domain deeply enough to challenge assumptions and ask the right questions.
// CHALLENGE: ZERO_ML_KNOWLEDGE

I entered this project knowing nothing about machine learning. So I got obsessed.

01

MIT Professional Certificate

MIT xPRO, Boston

Product Design for Machine Learning & AI

  • ML product lifecycle
  • Design for AI systems
  • Responsible ML practices
02

Weekly DS Embedding

Constant collaboration with Principal Data Scientist

  • Model training logic
  • Evaluation metrics
  • Domain expertise transfer
03

AI-Accelerated Learning

Filled knowledge gaps in real-time

  • Concept clarification
  • Technical terminology
  • Quick domain ramp-up
outcome_log.txt
> RESULT: Within weeks, I could challenge technical assumptions, translate DS requirements into UX patterns, and earn the trust needed to redesign the entire workflow.

System Audit: The Baseline

Identified critical friction points in the legacy 9.2 workflow.

Legacy ML workflow showing fragmented data flow canvas
Legacy 9.2

Fragmented Workflow

// 01_FRAGMENTATION
  • 4+ scattered steps to train a single model
  • Drag "model pill" onto data flow canvas
  • Configure in disconnected popup windows
  • Hidden toolbar controls for execution
Legacy ML interface with hyperparameters hidden in context menus
Legacy 9.2

Hidden Configuration

// 02_HIDDEN_COMPLEXITY
  • Hyperparameters buried in right-click menus
  • Settings only accessible after training
  • No inline guidance or documentation
  • Technical jargon without explanation
Legacy ML error state showing results not generated message
Legacy 9.2

Opaque Error States

// 03_DEAD_END_ERRORS
  • "Results not generated" with no explanation
  • No guidance on how to fix issues
  • Silent failures during model training
  • Users abandon workflow mid-process
Legacy Run Model landing page - disconnected from main WebFOCUS workflows
Legacy 9.2

Platform Disconnect

// 04_PLATFORM_DISCONNECT
  • No connection to main WebFOCUS workflows
  • Felt bolted-on, not platform-native
  • Multiple context switches required
  • No clear entry points from hub
audit_summary.log
> AUDIT_CONCLUSION: The legacy ML workflow wasn't just hard for beginners; it was fragmented, opaque, and frustrating.
> NEW MANDATE: Make predictive modeling usable, understandable, and trustworthy for analysts and everyday business users.

Act II

No Users. No Access. So I Built a Proxy Network.

Three personas with conflicting needs. The existing experience served none of them well.

Three personas. Three different needs: data scientists wanted depth and control over hyperparameters, business users wanted simplicity and guidance, analysts wanted both. The existing experience served none of them well.
After documenting every workflow and decision point, I realized: if I find this frustrating after weeks of study, a first-time user has no chance. That insight drove the redesign.

Gathering Insights Without Direct Access

How I built user empathy through proxy networks when direct research wasn't possible.

CONSTRAINT: NO_DIRECT_ACCESS

Enterprise security policy blocked direct access to end users. I could not interview the actual people using the tool.

// BLOCKED_METHODS
× Direct user interviews
× On-site observation sessions
× Usage analytics access
STRATEGY: THE_PROXY_NETWORK

Talked to everyone I could — support reps, technical staff, and data scientist friends outside work. If they touched ML, I picked their brain.

// PROXY_SOURCES
+ Support ticket pattern analysis
+ DS friends outside work
+ SME usability sessions
+ Internal domain experts
BUILT A PROXY NETWORK

User Personas

Two distinct user types drove the dual-experience approach.

Techy Analyst

TECHNICAL

Self-sufficient power users who need right-click entry and advanced controls.

Financial Strategist

BUSINESS

Goal-oriented users who need guided workflows and plain-language explanations.

Dual-experience approach: Technical users get right-click entry + advanced controls. Business users get guided workflows + inline teaching.

Act III

From Black Box to 4-Step Guided Flow.

Designed a structured 4-step guided flow: Problem Type → Target → Predictors → Hyperparameters.

The mapping revealed the problems, but solving them required inventing infrastructure that didn't exist.
The biggest gap: there was NO Predict Data landing page. The original workflow was so broken you could barely begin training a model. To increase adoption and discoverability, a landing page was a must. I designed it to serve multiple purposes: select or change a dataset, show both Run and Train model options, and display existing models, both previously trained models and models available to run on the current dataset.

Initially, I put everything on a single display. Engineering pushed back: "It's too confusing — code-wise it's also difficult to manage." So we split Train and Run into two tabs. It took real brainstorming to get there: our need for clean UX and easy decision-making vs. engineering's requirement of separating the two workflows.

The model display was another breakthrough. The original approach used dense tables, rows of metrics that made comparison nearly impossible for non-experts. I introduced a **model card** pattern instead: each model gets its own card with the key details needed for an informed decision. I iterated on the card design with our Principal Data Scientist (who ensured the right metrics appeared) and my Director of Design (who ensured I was following the design system; I was still less than a year at the org).

The 4-step UX spine came from asking the right question: "What do you absolutely need to train a model responsibly?" Problem type, target variable, predictors, and hyperparameters. That became the guided flow that made ML accessible without dumbing it down.
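The four required inputs translate naturally into gated wizard state. A minimal sketch in TypeScript, assuming hypothetical names (`TrainWizardState`, `canAdvance`); this is illustrative, not the shipped WebFOCUS code:

```typescript
// Hypothetical sketch of the 4-step guided flow as typed wizard state.
// All identifiers here are illustrative assumptions, not the actual API.

type ProblemType = "classification" | "regression";

interface TrainWizardState {
  step: 1 | 2 | 3 | 4;
  problemType?: ProblemType;                        // step 1
  target?: string;                                  // step 2: target variable
  predictors: string[];                             // step 3: predictor columns
  hyperparameters: Record<string, number | string>; // step 4, pre-filled with safe defaults
}

// Each step gates the next: users cannot reach hyperparameters
// without a problem type, a target, and at least one predictor.
function canAdvance(s: TrainWizardState): boolean {
  switch (s.step) {
    case 1: return s.problemType !== undefined;
    case 2: return s.target !== undefined;
    case 3: return s.predictors.length > 0;
    case 4: return true; // safe defaults make step 4 optional to edit
  }
}
```

Because step 4 ships with defaults, a business user can train responsibly without ever opening the hyperparameter panel, while an expert can still tune everything.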

The System Blueprint

Physical sketches and architecture maps that defined how the new ML wizard would work within the WebFOCUS platform.

Sketches · 11 pages
Early concept wireframes

Sketching the "Train Model" wizard steps on paper.

System Notes

Initial thoughts and requirements gathering.

User Flow Mapping

Mapping the data scientist's journey through the system.

Architecture Diagram

Connecting the disparate machine learning subsystems.

Logic Map

Defining decision trees for model training.

Early Wireframes

Conceptualizing the drag-and-drop interface.

Refining Flow

Refining the step-by-step model configuration.

Identifying Pain Points

Solving for parameter fragmentation.

Brainstorming Solutions

Exploring different layout options.

UI Layout Sketches

Visualizing the results dashboard.

Finalizing Approach

The chosen direction for the ML workflow.

Architecture Diagrams
ML UI Structure

How the ML workflow redesign fits within the existing WebFOCUS shell and navigation.

ML workflow by user type

How Train Model and Run Model connect and flow into each other by user type.

ML workflow in IQ Plugin

Entry points, data selection, and training steps within IQ Plugin.

All model types architecture

Complete system architecture for all ML model types supported.

Mapping the Existing Black-Box Workflow

I documented every workflow step, every user decision point, and every place where users got stuck.

01

Data Selection

  • > How users selected data
  • > Data flow integration
  • > Dataset compatibility
02

Training Configuration

  • > How they configured training
  • > Hyperparameter access
  • > Model type selection
03

Model Execution

  • > How models ran (or failed)
  • > Tiny toolbar play icon
  • > Hidden execution states
04

Results Interpretation

  • > How results were interpreted
  • > Confusing error messages
  • > Unclear model outputs
CORE_PROBLEMS_IDENTIFIED
Technical language everywhere
Fragmented flows across screens
No guidance for non-experts
Hidden states and errors
No WebFOCUS integration
system_insight.log
> INSIGHT: The core problem wasn't the algorithm; it was the fragmented workflow. Users didn't know *where* to start or *how* to finish without error.

DESIGN PIVOTS

Three Critical Design Pivots

The mapping revealed the problems, but solving them required multiple iterations.

01

Split Train vs Run

LEGACY PATTERN
ALL

Unified Train+Run experience mixing two mental models

NEW ARCHITECTURE
T
R

Separate tabs for distinct mental models: Create vs Apply

RATIONALE: We were mixing two different mental models — creating models vs applying them. Splitting them simplified both UX and implementation.

02

Remove Data Flow from ML UI

LEGACY PATTERN

Engineering wanted data flow visible inside ML UI

NEW ARCHITECTURE
ML

Data flow accessible elsewhere, ML UI stays focused

RATIONALE: After training models hundreds of times myself, I concluded the data flow was adding noise and risking users losing progress.

03

Model Cards vs Tables

LEGACY PATTERN

Legacy UI used tables for everything, including models

NEW ARCHITECTURE

Model cards for better scanning and comparison

RATIONALE: Despite initial pushback, cards became the primary way to choose a model, with tables reserved for dense tabular data like training logs.

// BREAKTHROUGH

These pivots weren't just UX decisions — they were architectural choices that simplified implementation while improving user experience. The structured guided flow emerged from our domain expert's answer to "What do you absolutely need?" → problem type, target, predictors, hyperparameters.

Act IV

10+ Iterations. One Breakthrough Screen.

10+ iterations on the confusion matrix. Productive tension with Data Science → best screen in the project.

Led cross-functional alignment across Product, Engineering, and Data Science. Weekly syncs. Shared Figma. Screen-by-screen design reviews.
The confusion matrix screen alone went through 10+ iterations. We moved from standard tables to a real-time, three-panel visualization that gives users immediate control.
Our Principal Data Scientist pushed for advanced metrics; I pushed for clarity. That productive tension produced a multi-panel view that serves both data scientists (threshold control, AUC scores) and business users (visual charts, side-by-side comparisons). He called it "the best screen in the entire UX revamp."
What I'd do differently: Earlier, cleaner alignment with our Data Scientist. He surfaced insights late in the process that were harder to incorporate — some we did, but it cost us time.
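The "immediate control" in that three-panel view comes from recomputing the confusion matrix client-side as the threshold slider moves. A minimal sketch of that recomputation, with assumed names (`confusionAt`, the score/label arrays); not the shipped implementation:

```typescript
// Illustrative sketch: recompute TP/FP/FN/TN as the user drags the
// classification-threshold slider. Names and shapes are assumptions.

interface ConfusionMatrix { tp: number; fp: number; fn: number; tn: number; }

function confusionAt(
  scores: number[],  // model probabilities for the positive class
  labels: number[],  // ground truth: 1 = positive, 0 = negative
  threshold: number  // slider value in [0, 1]
): ConfusionMatrix {
  const m: ConfusionMatrix = { tp: 0, fp: 0, fn: 0, tn: 0 };
  scores.forEach((s, i) => {
    const predicted = s >= threshold ? 1 : 0;
    if (predicted === 1 && labels[i] === 1) m.tp++;
    else if (predicted === 1 && labels[i] === 0) m.fp++;
    else if (predicted === 0 && labels[i] === 1) m.fn++;
    else m.tn++;
  });
  return m;
}
```

Because the scores never change during slider interaction, only the threshold does, the matrix can update in real time without a round trip to the training backend.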
legacy.exe
modern.tsx
New Guided ML Workflow - Linear and accessible
Legacy ML Workflow - Fragmented and technical
Legacy Workflow
Guided Workflow
Key Differences

Transformed a fragmented, error-prone technical process into a guided, linear wizard accessible to business users.

Before: Fragmented 4+ step workflow hidden behind context menus

After: Guided 4-step wizard with clear error handling

Balancing Model Control with Simplicity

Layered disclosure: serve multiple user types within a single experience.

Default Experience

Non-technical users

  • Guided, safe, error-proof flow
  • Clear steps, clear terminology
  • Tooltips, wizards, onboarding text
  • Inline explanations
  • Progressive disclosure

Advanced Controls

Technical users

  • Expandable sections for parameters
  • Ability to adjust algorithms
  • Feature selection options
  • Training configuration

Expert Mode

Data scientists

  • Full control available but not required
  • Advanced tuning options
  • Hyperparameter presets
  • All within the same unified experience
Safety guardrails
  • Incompatible datasets blocked early (Step 1)
  • Inline warnings in plain language
  • Standardized error patterns from design system
  • Users always have a way out: go back, change input, or safely cancel
Future Roadmap
  • Auto-suggesting best model type based on dataset
  • Deep hyperparameter presets and advanced tuning
  • Unified Train+Run single-screen concept
  • Heavier onboarding overlays / carousels

Default users never see expert controls unless needed. Experts can dive deep without wading through tutorials.
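The Step 1 guardrails described above (block hard incompatibilities early, surface soft issues as plain-language inline warnings) can be sketched as a small validation pass. Everything here, the `DatasetMeta` shape, thresholds, and function name, is a hypothetical illustration, not the WebFOCUS API:

```typescript
// Hedged sketch of upstream (Step 1) dataset validation.
// Shapes, names, and thresholds are assumptions for illustration.

interface DatasetMeta {
  rowCount: number;
  columns: { name: string; type: "numeric" | "categorical" | "date" }[];
}

interface ValidationResult {
  ok: boolean;        // false = blocked before training can start
  warnings: string[]; // plain-language, shown inline next to the dataset picker
}

function validateForTraining(ds: DatasetMeta): ValidationResult {
  const warnings: string[] = [];
  if (ds.rowCount < 50) {
    warnings.push("This dataset has very few rows; the model may not learn reliable patterns.");
  }
  if (ds.columns.length < 2) {
    warnings.push("Training needs at least one target column and one predictor column.");
  }
  // Only hard incompatibilities block; soft issues warn but let the user proceed.
  return { ok: ds.columns.length >= 2, warnings };
}
```

Validating at Step 1, rather than after a failed training run, is what replaced the legacy "results not generated" dead end with an early, explainable exit.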

System Inspection: The Design Artifacts

Predict Data - Empty State

Entry Points — Initial landing when no models exist

The Guided 4-Step Workflow

Every screen of the redesigned ML wizard.

Empty State — Train Models

Entry Point: Empty state with clear call-to-action

Legacy ML Workflow
New Guided Workflow


Before
  • Fragmented 4+ step workflow
  • 12+ clicks through data flows and menus
  • Hidden hyperparameters behind right-click
  • Confusing "results not generated" errors
After
  • Structured 4-step guided flow
  • 7-9 clicks via right-click entry
  • Clear error handling and validation
  • Accessible for non-technical users

Legacy vs. Redesign

legacy_ui.exe
workflow_step_1.tsx
After
Before
Legacy
Redesign

The Confusion Matrix

"The best screen in the entire UX revamp." — Principal Data Scientist

Confusion Matrix — 10+ iterations to perfect

Design System Components

Shared design system used across all three case studies. A unified language for coherence.

Color Palette
Typography
Buttons
Content Guidelines
Samples


Act V

I Led This While Owning Three Other Products.

Dual-experience approach emerged from constraints. Led this while simultaneously owning ReportCaster and IQ Plugin.

The dual-experience approach emerged from engineering constraints. We couldn't rebuild the advanced mode — it had to coexist with the new guided flow. What felt like a limitation became a feature: experts got their power, newcomers got guidance.
I led this while simultaneously owning ReportCaster and IQ Plugin. After the January 2025 layoffs, I took on WebFOCUS Designer as well, juggling four major enterprise systems at once. Workload prioritization wasn't optional; it was survival. I created daily trackers, weekly trackers, and ticket trackers, all in Slack, and shared them with my manager and Director of Design so they never had to wonder where I was at. They always knew my progress.

The layoffs didn't force me to cut ML features, but they made me more protective of the work and more aware of my unique position: I was one of the few people left who understood the full ecosystem end-to-end. Cross-project pattern sharing meant solutions in one project accelerated the others: the structured flows and upstream validation patterns from ML directly informed IQ Plugin, and the modal architecture from ReportCaster became a platform-wide reference.

// TRUST_EARNED

The Explainability Deep Dive

How tackling the most complex ML visualization earned the trust to revamp the entire experience

// THE_CHALLENGE

My Principal Data Scientist handed me a screenshot from an external tool and said: "This is the explainability visualization I need in WebFOCUS. Can you figure it out?"

// REFERENCE_INPUT: Data Scientist's Requirement
Reference explainability visualization from external tool - what the Principal Data Scientist wanted recreated

// REF_IMAGE: External explainability tool our Data Scientist wanted me to recreate for WebFOCUS

// MY_SOLUTION: WebFOCUS Native Implementation
New explainability UI designed for WebFOCUS ML workflow

// NEW_DESIGN: Custom explainability visualization I created for WebFOCUS

testimonial.log
"

The other designers I worked with before didn't really understand what they were doing—they just gave designs. But you sat with us, talked to us, and actually understood. That's why I trust you.

> AUTHOR:

Marcus Horbach

Principal Data Scientist

// OUTCOME

Understanding before designing: I sat with the data scientists, learned the domain, and understood why explainability mattered before designing how to show it. This approach—earning trust through genuine understanding—unlocked the opportunity to redesign the entire ML training workflow from scratch.

Navigated PM Change. Earned DS Trust.

Weekly syncs with Principal Data Scientist. 10+ iterations on the confusion matrix alone.

DS Partnership

1:1s with Principal Data Scientist · ML concepts → UX decisions · Weekly UX + ML syncs

PM Transition

PM changed mid-project · Onboarded new PM to engineers · Early flows as shared language

Trust Earned

Principal DS became advocate · Owned ML + RC simultaneously · Leadership greenlit early

"Anuja turned one of our most complex ML flows into an experience users can finally understand." — Principal Data Scientist

Act VI

5/5 Validated. Patterns Became the Platform Standard.

5/5 SMEs found entry without help. Patterns became foundation for IQ Plugin and platform-wide AI strategy.

5/5 SMEs found the entry point without help. Dead-ends → clear guidance. Design demos to 150-200 person business unit earned leadership support.
The patterns I developed here — structured flows, upstream validation, right-click entry, dual-experience — became the foundation for IQ Plugin and platform-wide AI strategy.

Impact & Validation

Demo-Ready at Scale

For the first time, ML was stable enough for 200+ person org-wide demos. Sales engineering could finally showcase the capability confidently.

Zero Abandonment

Eliminating dead-end errors didn't just save clicks; it stopped users from quitting. All SME testers completed the full workflow.

1 Core Pattern

The 4-step guided flow pattern was so robust it was directly inherited by the IQ Plugin, reducing future design/dev time.

New User Tier

Lowering the technical barrier allowed Business Analysts to self-serve, expanding the addressable market beyond just Data Scientists.

Voice of the Team

The clarity of her designs, in spite of the underlying data science and machine learning complexity, is impressive and has greatly contributed to the success of our products. Her design solutions are rooted in a deep understanding of the purpose of the product.

Marcus Horbach, Principal Data Scientist

During a User Acceptance Test session, Anuja observed me navigating the screen. I was highly impressed with Anuja's approach. Her design was clean, intuitive, and clearly addressed the needs of users across different skill levels.

Anita George, Principal Account Tech Strategist

Honest Reflection

What I'd Push Harder For

Early Data Science Alignment: I would have established the "productive tension" earlier. By the time I challenged the Principal DS on the confusion matrix, we had already lost sprint cycles on less effective visualizations. Bringing design into the definition phase—not just the execution phase—would have saved ~3 weeks.

Where I'd Take It Next

Real-Time Visualization: Currently, users must finish training to see results. I'd implement streaming websocket updates to show the confusion matrix evolving during training. This would turn a passive wait time into an active monitoring experience, allowing users to abort bad runs early (saving compute costs).
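One way that streaming idea could work: the training backend emits periodic progress messages over a WebSocket, and the client updates the confusion-matrix panel on each one. The message shape, endpoint, and handler below are all assumptions about a future design, nothing here exists in the product today:

```typescript
// Hedged sketch of the proposed real-time training visualization.
// The "training-progress" message shape is a hypothetical contract.

interface ProgressEvent {
  epoch: number;
  matrix: { tp: number; fp: number; fn: number; tn: number };
}

type OnUpdate = (e: ProgressEvent) => void;

// Parse one streamed message and notify the UI layer.
function handleProgressMessage(raw: string, onUpdate: OnUpdate): ProgressEvent {
  const event = JSON.parse(raw) as ProgressEvent;
  onUpdate(event); // e.g. re-render the three-panel view, enable "abort run"
  return event;
}

// Browser wiring (illustrative; the endpoint is a placeholder):
// const ws = new WebSocket("wss://example.invalid/train/progress");
// ws.onmessage = (msg) => handleProgressMessage(msg.data, renderMatrixPanel);
```

Keeping the parse/notify step separate from the socket wiring would let the same handler drive both live training and replaying a saved run.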

Interested in working together?