
Case Study

FURLATE – Translating paws into plans

Brought structure to an AI experiment so pet parents can record sounds, tag context, and learn what their cats or dogs might be asking for.

Role: Product & Research Lead · Timeline: 4 weeks · Status: App Store launch

Overview

FURLATE pairs on-device audio capture with a lightweight classifier hosted on Firebase. I designed the data tagging flow, reward moments, and iconography so the experience feels playful yet trustworthy.
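For orientation, here is a minimal Swift sketch of how that pairing could look, assuming the classifier sits behind a Firebase callable function; the "classifySound" name and payload shape are placeholders, not the shipped API.

    import AVFoundation
    import FirebaseFunctions

    // Records a short clip on-device, then asks the Firebase-hosted
    // classifier for a label. "classifySound" and the payload keys are
    // hypothetical; the real endpoint isn't documented in this case study.
    final class CaptureSession {
        private var recorder: AVAudioRecorder?
        private let clipURL = FileManager.default.temporaryDirectory
            .appendingPathComponent("clip.m4a")

        func startRecording() throws {
            let settings: [String: Any] = [
                AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
                AVSampleRateKey: 16_000,
                AVNumberOfChannelsKey: 1
            ]
            recorder = try AVAudioRecorder(url: clipURL, settings: settings)
            recorder?.record()
        }

        func stopAndClassify(completion: @escaping (String?) -> Void) {
            recorder?.stop()
            guard let clip = try? Data(contentsOf: clipURL) else {
                completion(nil)
                return
            }
            // Hand the clip to the lightweight classifier behind a callable.
            Functions.functions()
                .httpsCallable("classifySound")
                .call(["audio": clip.base64EncodedString()]) { result, _ in
                    let payload = result?.data as? [String: Any]
                    completion(payload?["label"] as? String)
                }
        }
    }

Keeping the heavy lifting in a single callable is one way to match the near-zero inference cost constraint described below, since nothing runs until a clip is explicitly submitted.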

Problem / Goal

Pet parents collect plenty of meows and barks but have no structured way to learn what those sounds mean. The goal was to turn a loose AI experiment into a focused flow for recording, tagging, and reviewing pet sounds.
My Role

Conducted user interviews, mapped flows, built the UI kit, and defined the tone of voice for insights.

Constraints

Needed to keep ML inference costs near zero and respect Apple's microphone permission guidelines.
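In practice the permission constraint means a small gate in front of the capture flow. A hedged sketch using the standard AVAudioSession prompt; the function name is illustrative, and the prompt copy lives in Info.plist under NSMicrophoneUsageDescription.

    import AVFoundation

    // Ask for microphone access explicitly before any capture starts.
    func requestCaptureAccess(onGranted: @escaping () -> Void) {
        AVAudioSession.sharedInstance().requestRecordPermission { granted in
            // If the pet parent declines, keep the capture UI disabled
            // rather than failing silently mid-recording.
            guard granted else { return }
            DispatchQueue.main.async(execute: onGranted)
        }
    }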

Process

  1. Wireframe → storyboarded the capture flow and data review moments.
  2. UI → used neon gradients, sticker-style badges, and haptics to signal success (see the SwiftUI sketch after this list).
  3. Shipping → delivered SwiftUI-ready specs plus a motion pack for the listening avatar.
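One way the success haptic and sticker badge could come together in SwiftUI, sketched under the assumption of an iOS 17 target; the view name and "Feed me" label are illustrative, not pulled from the shipped specs.

    import SwiftUI

    // Sticker-style success moment: when a clip gets a label, a badge
    // pops in and a success haptic fires.
    struct CaptureResultView: View {
        @State private var label: String?

        var body: some View {
            VStack(spacing: 16) {
                Button("Tag this sound") {
                    label = "Feed me"   // would come from the classifier
                }
                if let label {
                    Text(label)
                        .padding(.horizontal, 12)
                        .padding(.vertical, 6)
                        .background(Capsule().fill(.pink.opacity(0.2)))
                        .transition(.scale)
                }
            }
            .animation(.spring(), value: label)
            .sensoryFeedback(.success, trigger: label)
        }
    }

Tying the haptic to the same state change that animates the badge keeps the reward moment feeling like one event rather than two separate effects.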

Outcomes

Live Product

Next Steps

Rolling out shared pet profiles and improving accuracy with community-labeled clips. Download the build on the App Store or reach out for a teardown.