Available for work

hello@amercer.com

Eat n' Log

From idea to App Store: A food journaling app that captures how a meal actually felt

Timeline

2021–2024

Company

Eat n' Log Technologies Inc.

Role

Founder, Product Designer, Product Manager, eventual sole developer

Teams

Up to 14 at peak — 7 developers, 4 marketers, me as founder, PM, and lead designer

Tools

Figma, Maze, React Native (alpha), FlutterFlow + Firebase (beta), Google Maps API, ChatGPT API

Outcome

Shipped to the App Store. NPS improved 140% from -38.5 to 16.3. Recognised by UBC Innovative Project Fund Grant, AcceleratedIP, and Microsoft for Startups.

Link

PROBLEM

69% of millennials photograph food. Users end up with messy albums.

During COVID I ended up with thousands of food photos that captured nothing — no context, no flavor, no memory of who I was with or how it actually tasted.

The behavior existed at scale. A tool that made those photos meaningful didn't. I went to users before building anything.

RESEARCH

56 interviews across two rounds confirmed the gap in the market

Six interviews in Summer 2021 — food lovers aged 22–35, casual diners and enthusiasts. I was then accepted into Entrepreneurship@UBC in Fall 2022, where I ran 50 additional market interviews and sized the market both top-down and bottom-up.

Three things came back consistently: food photos alone felt incomplete without context, people were stitching experience journaling across three or more tools, and no app existed for the category — everything was either restaurant discovery or calorie tracking.

Conviction and research aren't the same thing. I was certain users wanted this. The research happened to agree — but I hadn't earned that certainty until the data came in.

BUILDING THE ALPHA

Team of 14. Shipped to TestFlight. Spring 2023.

I assembled 14 UBC students — seven developers, four marketers — and built in React Native. As founder and PM, I ran weekly standups in Discord, maintained the product backlog in Notion, and coordinated Git workflow across the full development team. I directed implementation across all core flows, provided UX sign-off before every release, and launched an Instagram campaign to seed the early user base.

Before launching on TestFlight we ran usability testing in Maze to validate core flows and resolve critical IA issues. The alpha shipped in Spring 2023.

THE PIVOT

The data was clear. A surface fix wouldn't move the number.

Following the TestFlight release, 34 participants completed a post-use survey. NPS came back at -38.5. SUS at 42. I went back into the data.

WHAT THE NUMBERS SAID

Net Promoter Score (NPS): -38.5
System Usability Scale (SUS): 42

Both quantitative attitudinal metrics pointed to the same conclusion — real usability problems, not just rough edges.

WHAT USERS SAID

"Tedious." "Like a survey." "Too many steps."

Entry creation felt like overhead, not experience.

WHAT THE PRODUCT DID

50+ bug reports.

Occasional failed saves and UI glitches. The product wasn't stable enough to trust.

The NPS and SUS told me the product was struggling. The qualitative data told me why — two separate problems pulling in the same direction. The bugs were destroying trust; the entry flow was exhausting users even when it worked. Patching one wouldn't move either number. Both needed to be solved at the root.

OPTION A

Patch the React Native app.

The codebase was not production quality, the technical founder had left for personal reasons, and the student teammates had all returned for their fall term.


OPTION B — CHOSEN

Rebuild from scratch in FlutterFlow with AI at the core.

Own the full stack alone and address the core usability issues with AI.


SOLO REBUILD

Same product, new premise: entry creation in under 30 seconds.

Auto location detection removed the most repetitive field. AI-driven flavor suggestions let users edit rather than articulate from scratch. Roni, the app's AI assistant, handled the rest — taking the food name and autofilling nutritional information automatically.

The goal behind all three was the same: automate the toil so users could focus on what actually made the meal worth logging — the memory, the people, the moment. Roni wasn't just a functional shortcut though. Giving the AI a character — a personified gourmet-dog with a name — turned a data entry task into something that felt worth returning to. No competitor in food journaling had treated the emotional layer of the experience as a design problem.
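
The "prefill, then edit" pattern behind those three features can be sketched in a few lines. This is an illustrative sketch, not the app's actual code — the names (`EntryDraft`, `suggest_flavors`, `build_draft`) are hypothetical, and the AI call is stubbed with canned descriptors:

```python
from dataclasses import dataclass, field

@dataclass
class EntryDraft:
    food_name: str
    location: str                                 # auto-detected (e.g. reverse-geocoded GPS)
    flavors: list = field(default_factory=list)   # AI suggestions the user edits, not writes
    notes: str = ""                               # the only field left fully free-form

def suggest_flavors(food_name: str) -> list:
    """Stand-in for the AI suggestion call: return editable starting descriptors."""
    canned = {"ramen": ["savory", "rich broth", "umami"],
              "gelato": ["sweet", "creamy"]}
    return canned.get(food_name.lower(), ["tasty"])

def build_draft(food_name: str, detected_location: str) -> EntryDraft:
    # Every automatable field arrives prefilled; the user only confirms or edits,
    # which is what makes a sub-30-second entry plausible.
    return EntryDraft(food_name=food_name,
                      location=detected_location,
                      flavors=suggest_flavors(food_name))
```

The design point is that the user's job shifts from generation to correction — editing a suggested flavor list is far faster than articulating one from a blank field.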

Alpha → Beta: The Numbers

System Usability Scale (SUS)

42 → 68

With the system now stable and bug-free on FlutterFlow and Firebase, the beta SUS reached the industry-average threshold of 68.

Net Promoter Score (NPS)

-38.5 → 16.3

A substantial 140% improvement, reaching the B2C industry standard.
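
For context on the headline number: NPS is a point scale, so the percentage here is the point change relative to the magnitude of the alpha baseline — a quick sketch of that arithmetic:

```python
# Point change in NPS from alpha to beta, expressed as a percentage of
# the alpha baseline's magnitude (which is how the ~140% figure arises).
alpha_nps, beta_nps = -38.5, 16.3
point_change = beta_nps - alpha_nps                    # 54.8 points
pct_improvement = point_change / abs(alpha_nps) * 100  # ~142%, reported as ~140%
```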

SHIPPING

Apple rejected the app. I found Tim Cook's email and wrote directly.

Apple rejected the app under Guideline 4.3(a) — flagged as a spam clone despite being a new category. I diagnosed it as a misclassification, found Tim Cook's office email, and wrote directly: explained the product, the category, the error.

Apple called within 48 hours and personally assisted with publishing. The app went live on the App Store and was featured on Product Hunt.

REFLECTION

Pushing for more conceptual range

The research validated the problem — but I committed to the most obvious solution without pushing further into conceptual range first. The more interesting question was whether users needed easier manual logging at all, or something further upstream: an app that automatically organizes and contextualizes food photos using machine learning image recognition, no input required.

When working on any problem, the first instinct is a useful place to start — but rarely the most interesting place to land.

That said, machine learning image recognition at that level of accessibility simply wasn't there yet when Eat n' Log was conceived, so there were real technical constraints.

I’m proud of what it became, proud that it made it onto the App Store after emailing Tim Cook, and grateful for what it taught me.

What I'd measure if I continued:

30-day entry frequency and photo-to-entry conversion rate. NPS and SUS tell me about first-session quality — but the original problem was about changing a behavior that happens every day. Whether the habit actually forms, and whether users reach for the app instead of the camera roll, are the only metrics that answer whether the product worked at all.

© 2026 Built by Henry Yang 楊添 - All rights reserved.
