GenAI
Automation
Design Systems
Innovation
Automating Product Design with AI
Skippr, Head of Product Design
Mar–Sep 2023
Context
Problem
Designing digital products is slow and resource-heavy, with teams spending countless hours translating ideas into UI.
Goal
Skippr set out to pioneer AI-driven workflows that generate production-ready screens from text prompts and export directly to Figma.
Challenge
No dataset of high-quality, labeled design screens existed, so the first task was to create one robust enough to train AI models effectively.
My Role
As a founding team member, I worked at the intersection of design, product, and AI — defining the data strategy, leading research into design patterns, building the component library and workflows, and designing Skippr’s user-facing interface. I also hired and mentored interns, coordinated with ML engineers, and partnered with founders on priorities.

Approach
How can we generate data that fuels machine learning and delivers both speed and design value?
The key decisions early on were about where to focus and how to make the AI outputs valuable to real designers. We chose mobile-first because smaller screen real estate reduced complexity and made it achievable for an MVP. We also prioritized Figma export since it’s the industry standard and would allow product teams to continue iterating seamlessly on generated files.
With those foundations in place, the main challenge was defining how to generate a dataset that was large enough, diverse enough, and consistent enough to train the ML model effectively within limited resources. My strategy was to frame this as an experiment: start with the most flexible approach to maximize learning potential, then pivot if speed and quality demanded it.

Iteration 1
Atoms-based screens
01
Hypothesis
If the AI saw a large variety of complete screens built from atomic UI elements, it would learn how to combine them logically into new layouts.
02
Execution
Defined a workflow where interns manually designed mobile screens by combining UI atoms.
Prioritized diversity of layouts to “teach” flexibility.
Scoped the work mobile-first, since the smaller screen real estate made the problem more constrained and achievable for the MVP.
03
Outcome
Building full screens from atoms proved too slow, and the resulting layouts were too inconsistent to scale into a training-ready dataset within our resources; speed and quality demanded the pivot below.
Iteration 2
Component library system
01
Pivot
Instead of full screens from atoms, I defined a structured, reusable component library (“organisms”) covering the most common patterns.
02
Execution
Researched the most popular apps across industries to identify essential vs. optional patterns.
Designed detailed, variant-rich components covering almost every common use case.
Built complete user flows by combining these components — e.g., onboarding, checkout, search.
Added semantic naming conventions and descriptive metadata for each screen so the model could map text descriptions to visuals more precisely (a sketch of one such record follows this list).
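To make the naming-and-metadata idea concrete, here is a minimal sketch of what one labeled screen in the dataset might have looked like. The TypeScript shape, field names, and sample values are illustrative assumptions, not Skippr’s actual schema.

```typescript
// Hypothetical shape of one labeled screen in the training dataset.
// Field names and example values are illustrative, not Skippr's actual schema.
interface ComponentUse {
  name: string;                   // semantic library name, e.g. "organism/checkout/payment-form"
  variant: string;                // which variant of the component is used on this screen
  props?: Record<string, string>; // descriptive metadata such as copy, state, or content type
}

interface ScreenRecord {
  id: string;                     // stable identifier, e.g. the exported Figma frame name
  flow: string;                   // user flow the screen belongs to (onboarding, checkout, search, ...)
  description: string;            // natural-language prompt the model learns to map to this layout
  components: ComponentUse[];     // ordered list of library components composing the screen
}

// Example record for a checkout screen.
const checkoutScreen: ScreenRecord = {
  id: "checkout-payment-01",
  flow: "checkout",
  description: "Payment step with saved cards, an add-card form, and a sticky pay button",
  components: [
    { name: "organism/navigation/top-bar", variant: "with-back-button" },
    { name: "organism/checkout/saved-cards", variant: "two-cards" },
    { name: "organism/checkout/payment-form", variant: "collapsed" },
    { name: "organism/actions/sticky-cta", variant: "primary", props: { label: "Pay now" } },
  ],
};
```

Records like this pair a natural-language description with an explicit list of semantically named components, which is what lets a text-to-UI model learn the mapping between prompt phrasing and concrete layouts.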
03
Outcome
The component-based workflow produced consistent, richly labeled screens at a fraction of the previous effort; the gains are quantified in the Result section below.


Result
5–10x faster screen creation (from ~30 minutes per screen to 3–7 minutes)
Delivered a high-quality dataset powering functional AI screen generation
Enabled Skippr to secure next round of investment and validate its product vision
Screen recording: Skippr generating a mobile app
Reflection
Even though Skippr didn’t ship commercially, it was a first-mover experiment that gave us deep insight into both AI’s potential and its limitations.
The experience proved that design leadership in frontier tech is as much about inventing methodology as shipping features. My impact was in showing how design can guide the responsible, usable application of new technologies.