
UI/UX Design for AI Products: An AI-Driven Marketing Platform Case Study

HOW WE TURNED A 3–5 DAY CAMPAIGN PROCESS INTO A 31-MINUTE WORKFLOW
Nataliya Sambir
Chief Design Officer

Background

Every marketer knows the feeling: campaigns that take days to assemble across half a dozen disconnected tools. But UI/UX design for AI products adds a layer most design briefs ignore entirely: you're not just designing an interface; you're designing trust in a system users can't fully see.

Element X was built by a U.S.-based startup accelerator that incubates AI-driven marketing tools, grows them to MVP, and spins them off as independent ventures. The founder came to Linkup ST with a specific frustration — one shared by every marketing team working with modern tools: even with all the data available, building a campaign still feels like assembling a machine from spare parts. Strategy in one platform. Audiences in another. Creatives somewhere else. Hours of setup before a single ad goes live.

The goal was a system where a marketer types an objective in plain language and gets a launch-ready campaign back. Not in days — in 31 minutes.

What UI/UX Design for AI Products Actually Demands

From the first briefing, it was clear that Element X couldn't look or behave like a typical SaaS platform. The users weren't general consumers learning a new tool on the weekend. They were marketers — people who spend their careers inside ad platforms, who have strong opinions about workflows, and who will abandon a product the moment it slows them down or treats them like a beginner.

Every action inside Element X had to lead somewhere meaningful. Dead ends, redundant confirmations, and vague loading states weren't just bad UX — they were product-killers for this audience.

That raised the bar significantly. The interface needed a modular, component-driven architecture with state-aware interactions. Micro-feedback had to communicate AI reasoning in real time — not as a technical readout, but as a human signal that the system understood what the user was trying to do. The AI had to feel less like a tool being operated and more like a collaborator doing the heavy lifting.

The resulting experience achieved a 93% task success rate and a System Usability Scale (SUS) score of 86/100, rated "Excellent" by test participants. But those numbers came later. First, we had to understand who we were designing for.

The Hardest Part Wasn't Technical

Designing UX for AI products requires mapping not just screens, but moments of uncertainty. Before we touched a single screen, we needed to understand something more fundamental than feature requirements: why would a marketer resist handing control to an AI, even if that AI performed perfectly?

The research phase started with qualitative interviews and behavioral mapping across three distinct roles — digital marketers, performance specialists, and growth strategists. Twenty-three conversations in total. We asked about daily workflows, about the moments that caused frustration, about which decisions they'd never want automated and which ones they'd gladly give away.

The pattern that emerged was consistent across every role and every seniority level. Marketers are not anti-AI. They are anti-opacity. They are open to automation — genuinely open — but only when the system shows its reasoning, allows adjustment, and doesn't make them feel like passengers in their own campaign.

In parallel, the Linkup ST UX team, working alongside our internal marketing team, conducted a competitive audit of the platforms these users lived in every day: Meta Ads, Google Ads, Google Analytics, Canva. We went further than usability patterns. We looked specifically at how marketers were already incorporating ChatGPT into their workflows: for audience segmentation, ad copy drafts, performance interpretation. That detail mattered. These weren't AI-skeptical users who needed convincing. They were AI-curious users who had already experimented, and who had clear expectations about what good AI assistance should feel like.

The core insight that came out of all of it: the challenge of Element X wasn't building a system that worked. It was building a system that felt trustworthy while it worked. Those are two completely different design problems.

Mapping the Journey Before Designing the Interface

With the research complete, the team moved into a structured ideation phase — but not by jumping straight to screens. Instead, we mapped the entire ideal campaign journey from the moment a marketer types their first prompt to the moment they're looking at post-launch analytics.

At every step we asked the same question: what does this person need to feel right now — guided, supported, or simply informed? That question sounds soft. In practice it was one of the most useful filters we applied throughout the project, because it forced us to prioritize emotional clarity alongside functional clarity.

From that mapping process, three core UX principles emerged. They weren't invented in a workshop — they came directly from what the research told us users needed at each critical moment.

Conversational Flow replaced traditional form-based inputs with a guided chatbot interface. Instead of filling out campaign fields one by one, users describe what they want in plain language. The system interprets intent and translates it into campaign logic. This decision removed the blank-page problem entirely — the moment of staring at an empty brief and not knowing where to start.

Progressive Context Visibility was about discipline as much as design. At each stage of the flow, users only see what's relevant to that exact moment. Advanced options and settings don't appear until the AI has enough context from user behavior to make them meaningful. This kept cognitive load consistently low without hiding the platform's depth. Users who wanted more control could always find it — but they weren't confronted with it before they needed it.

Behavioral Feedback Loop was the principle that did the most work on trust. Every AI recommendation inside Element X is visible, explained, and editable. Users can see why the system made a particular audience suggestion or creative choice — and they can change it with immediate visual feedback. This transparency wasn't just a UX feature. It was the mechanism through which marketers moved from skepticism to confidence.

Together, these three principles produced a frustration score of just 1.9 out of 5 and a task performance index of 0.84, meaning users moved through even the most complex flows with speed, accuracy, and minimal friction.

Architecture That Moves With the User

The information architecture of Element X followed one organizing principle: a single, guided path where users never lose context or momentum.

Most complex platforms fragment the experience across multiple dashboards, settings panels, and separate modules. Users have to carry mental context from one screen to another, remember where they were, and rebuild their orientation every time they navigate. Element X eliminated that entirely.

The core flow connected every major module in one uninterrupted journey: Prompt → Strategy → Creative → Insights → Learning

A marketer begins with intent and ends with behavioral understanding — without switching platforms, opening new tabs, or retracing steps. The architecture itself was modular and data-driven, which meant real-time AI responses could modify the interface dynamically: adding sections when new data became available, collapsing options that were no longer relevant, adjusting content density based on where the user was in the process.

Instead of navigating between static pages, users moved through contextual states of one evolving workspace. The platform came to them, rather than asking them to find it.
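For readers who think in code, the "contextual states of one evolving workspace" idea can be sketched as a single state object that carries campaign context forward as the user advances. This is purely an illustrative TypeScript sketch of the pattern; every name in it is hypothetical, and it does not represent Element X's actual implementation.

```typescript
// Illustrative sketch only: the guided flow as one evolving workspace state,
// not separate pages. All identifiers here are hypothetical.
type Stage = "prompt" | "strategy" | "creative" | "insights" | "learning";

interface WorkspaceState {
  stage: Stage;
  // Sections the system has enough context to show (progressive visibility)
  visibleSections: string[];
  // Campaign context accumulates across stages, so users never rebuild orientation
  campaignContext: Record<string, unknown>;
}

const STAGE_ORDER: Stage[] = ["prompt", "strategy", "creative", "insights", "learning"];

function advance(state: WorkspaceState, newContext: Record<string, unknown>): WorkspaceState {
  const nextIndex = Math.min(STAGE_ORDER.indexOf(state.stage) + 1, STAGE_ORDER.length - 1);
  return {
    stage: STAGE_ORDER[nextIndex],
    visibleSections: state.visibleSections, // expanded elsewhere as context grows
    campaignContext: { ...state.campaignContext, ...newContext }, // context is never lost
  };
}
```

The key design property the sketch captures: moving forward merges new information into the existing context rather than replacing it, which is what lets the interface reshape itself without asking the user to re-orient.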

This structural decision contributed directly to a 93% task success rate, a 7% drop-off rate, and a CSAT score of 4.7 out of 5.

Visual Language and Interaction Design

The visual direction for Element X had a single brief: make the AI feel like a smart colleague, not a black box.

The aesthetic was intentionally restrained. Clean typography, balanced spacing, and a disciplined color palette kept attention on content and decisions — not on the interface itself. The color logic wasn't decorative. It was functional: each hue was chosen to communicate professionalism, innovation, and trust — the qualities you associate with a knowledgeable collaborator who knows what they're doing and doesn't need to prove it.

Microinteractions carried a significant portion of the communication load. Subtle motion confirmed that the AI was processing, showed users where the system was taking them, and signaled completion without interrupting the flow. These weren't animations added for visual polish. Each one was mapped to a specific moment of uncertainty in the user journey — the moments our research had identified as the highest-risk points for frustration or abandonment.

Data visualization followed the same logic. Performance metrics were displayed as minimal, explanatory charts — designed to tell a story rather than display a dataset. The goal was immediate comprehension, not analytical depth.

Tone of voice throughout the interface was concise, human, and confident. The copy guided without over-explaining, answered questions users hadn't asked yet, and never talked down to a professional audience.

In the 5-second first impression test, 92% of users correctly identified the platform's purpose and described it as "innovative and confident." That result reflects the architecture as much as the aesthetics. Purpose clarity at that speed comes from structure, not styling.

These decisions produced an ease of use rating of 4.6 out of 5 and a trust in AI score of 4.2 out of 5.

When the Testing Humbled Us

The first version of the Creative Builder launched and the numbers looked reasonable. Then we looked more carefully at what users were actually doing inside it.

They weren't approving outputs. They were fighting the interface as they tried to edit them. Brand colors they couldn't change. Tone settings that didn't sync with the rest of the campaign. Image choices they disagreed with but couldn't replace. Ad formats that were fixed when they needed to be flexible.

The feedback was direct: marketers didn't want a generator. They wanted a collaborator. The distinction sounds subtle. The product implication was significant.

We rebuilt the Creative Builder from the ground up. Editable brand, color, and tone fields that synchronized with campaign data across the platform. Real-time caption and CTA generation previewed directly on the ad mockup — so users could see the change as they made it, not after. Background image selection and upload capability with AI-based visual matching. Flexible ad format management that let users control layouts and sizes for different placements.

The new builder transformed Element X from an automated output machine into a genuine design environment — one where AI assistance remained visible and useful, but human judgment had real room to operate.

The A/B results between Creative Builder v1 and the Enhanced Canvas Builder v2 were among the clearest signals we saw across the entire project: creative completion rate up 31%, satisfaction up 23%, AI trust scores up 16 percentage points, and return usage within 20 days up 50%.

That iteration taught us something worth stating plainly: in AI-powered products, the moment you remove control from the user — even in service of making their life easier — you risk losing them entirely. Agency and automation aren't opposites. The best AI interfaces make both feel natural at the same time.

Results

All performance indicators came from structured UX research and quantitative testing during the Element X closed beta. Three moderated usability-testing rounds with 12 participants each, tracked through Maze and Hotjar, with behavioral data validated through Google Analytics 4 and SPSS.

Task Success Rate: 93%
Average Time on Task: 31 minutes from prompt to launch
System Usability Scale: 86/100 (Excellent)
AI Creative Builder Usage: 72% of users
Audience Builder Usage: 54% of users
Return Usage: 49% within 7 days
Manual Edit Rate: 18% of AI outputs
Ease of Use: 4.6 / 5
Trust in AI: 4.2 / 5
Frustration Score: 1.9 / 5
Drop-off Rate: 7%
CSAT: 4.7 / 5
First Impression Test: 92% correct identification
A/B Uplift, Creative Completion: +31%
A/B Uplift, Return Usage: +50% within 20 days