NutriLens - AR Smart Glasses

"Lens that look after your nutrition!"

"Lens that look after your nutrition!"

About Project

A proposition for AI-powered AR glasses that analyze physiological health data, recognize food, scan surroundings, and suggest personalized healthy meals in real time.

Theme: Data Mirrors

Brief

Research, conceptualise and design a product, digital ecosystem or interactive material intervention that responds to the theme Data Mirrors. The project can span the diametrically opposed extremes, from a commercially focused product to a speculative social intervention.

Role

UX UI Designer


3 months


NutriLens

Digital Companion

A pair of AR smart glasses that provide real-time health insights and food guidance by analyzing physiological data, identifying food, and interpreting the user’s physical environment. It continuously analyzes the user’s physiological signals (like heart rate, hydration, and glucose trends) and meal history, and provides personalized healthy food suggestions via a minimal AR display, helping users make better dietary choices.

Challenge

Most people are unaware of how their internal health metrics relate to the food they consume.
People are unable to find healthier food alternatives or keep track of their activities and meals.
There’s a gap between biometric feedback and real-time, actionable dietary decisions, especially for those with conditions like hypertension, diabetes, or dehydration.

Solution

NutriLens bridges the gap between biometric data and mindful eating (see the sketch after this list).
Passively collects real-time physiological data
Recognizes food visually and contextualizes it against the user’s current health.
Delivers timely food recommendations through a subtle, non-intrusive AR HUD.
Enables better awareness, prevention, and smart food choices, seamlessly integrated into daily life.
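
To make this concrete, here is a minimal sketch of that loop in TypeScript, assuming the glasses expose the latest vitals and the food currently in view. Every type, function, and threshold below is a hypothetical placeholder for illustration, not the actual NutriLens implementation.

interface Vitals { heartRateBpm: number; systolicBp: number; hydrationPct: number; glucoseMgDl: number; }
interface FoodInView { name: string; calories: number; sodiumMg: number; sugarG: number; }

// Stubbed sensor and camera readings; the real device would stream these continuously.
const readVitals = (): Vitals => ({ heartRateBpm: 82, systolicBp: 138, hydrationPct: 61, glucoseMgDl: 104 });
const recognizeFood = (): FoodInView | null => ({ name: "pepperoni pizza", calories: 285, sodiumMg: 640, sugarG: 4 });

// One pass of the loop: sense -> recognize -> contextualize -> suggest.
function suggest(vitals: Vitals, food: FoodInView | null): string | null {
  if (!food) return null;                                  // nothing in view, stay silent
  if (vitals.systolicBp > 130 && food.sodiumMg > 500) {    // illustrative thresholds, not medical guidance
    return `Blood pressure is up - a lower-sodium option than ${food.name} would be kinder today.`;
  }
  if (vitals.glucoseMgDl > 140 && food.sugarG > 20) {
    return `Glucose is elevated - maybe hold off on ${food.name} for now.`;
  }
  if (vitals.hydrationPct < 50) {
    return "Hydration is low - try a water-rich snack or a glass of water first.";
  }
  return null;                                             // no nudge needed
}

console.log(suggest(readVitals(), recognizeFood()));       // rendered on the AR HUD in the real product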

Chapter 1

Research and Analysis

Objective

To understand how users make food decisions and how physiological signals (like hydration or blood pressure) can inform real-time, health-conscious eating behavior.

Competitive Analysis

How do users currently track health and food?
Where are the gaps in real-time guidance?
How effective are their UX/UI solutions?

SWOT Analysis

Persona

Pain Points

Too Many Touchpoints
Users are required to log every meal and drink manually.
Food tracking apps often depend on barcode scanning or search menus.
Switching between devices (watch, phone, app) to check vitals or track food breaks flow.
Logging requires visual focus, typing, and confirmation, which is especially inconvenient while eating or cooking.
High friction leads to low user engagement over time.


No Environmental Awareness
Apps can’t detect where the user is or what food is physically present.
They don't know if you're in a kitchen, store, or near a food source.
Context-free advice feels generic and often irrelevant.


Lack of Real-Time Suggestions
Most apps only provide retrospective analysis (after meals or at day’s end).
Health devices show vitals but rarely link them to actionable food advice.
Users don’t get help when they need it during food decisions.

Insights

Users want automation, not input.
Systems must work in the background, detecting and suggesting without manual effort.
Switching apps or typing disrupts flow; wearables should remove these points of friction.
AR interface ➝ frictionless, ambient guidance


Context is everything.
Food guidance must be aware of location, timing, and surroundings to be meaningful.
Object/environment recognition ➝ contextual nudges


Data should lead to decisions.
Raw stats (like 120/80 BP) mean little to most users unless paired with smart suggestions.
Real-time biometric data ➝ smarter food suggestions

Chapter 2

Define and Ideate

Visual Form

AR Heads-Up Display (HUD) integrated into AI-powered AR glasses

This HUD overlays real-time, personalized visual feedback onto the user’s field of view without requiring manual input or disrupting their environment.

Physiological signals: Vital signs like Heart Rate (color-coded), Blood Pressure, Hydration Status, Glucose Level.
Food recognition: Displays nutritional value and logs the meal passively.
Surrounding context: Scans surroundings (e.g., kitchen, bakery) to offer timely suggestions.
Activity tracking: Tracks user activity and prompts them toward healthier food options.
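
One rough way to picture what the HUD renders is a single overlay payload that bundles these four streams. The sketch below is purely illustrative; the field names and example values are assumptions, not the final interface spec.

// Hypothetical shape of one HUD update; all field names here are illustrative, not a spec.
type Zone = "good" | "caution" | "alert";   // drives the color coding on the overlay

interface HudOverlay {
  vitals: { heartRateBpm: number; zone: Zone; systolicBp: number; hydrationPct: number; glucoseMgDl: number };
  food?: { name: string; calories: number; autoLogged: boolean };                 // present only when food is in view
  context?: { place: "kitchen" | "store" | "bakery" | "other"; prompt: string };
  activity: { stepsToday: number; nudge?: string };
}

// Example frame the glasses might render while the user stands in a bakery.
const frame: HudOverlay = {
  vitals: { heartRateBpm: 76, zone: "good", systolicBp: 122, hydrationPct: 58, glucoseMgDl: 98 },
  food: { name: "croissant", calories: 231, autoLogged: true },
  context: { place: "bakery", prompt: "High-sugar environment: you have already had 40 g of sugar today." },
  activity: { stepsToday: 3200, nudge: "A short walk after this snack would balance it out." },
};

console.log(JSON.stringify(frame, null, 2));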

Why Smart Glasses?

Smart Glasses

Users receive health data and food suggestions directly in their field of vision: no touching, typing, or swiping.

Automatically detect surroundings (e.g., kitchen, bakery), and respond with contextual prompts.

Identify food in real time using computer vision, and display nutrition info.

Show live vitals (e.g., heart rate, BP, hydration, glucose) in a subtle AR overlay, connected to body sensors.

No need to pause and switch context between eating and using a phone.

Mobile Apps

Require constant interaction (opening the app, searching for food, entering values).

Lack real-world spatial awareness unless manually prompted.

Require barcode scanning, manual search, or image upload for proper food scanning.

Often display vitals after syncing from another device or lack real-time insight.

Break immersion, requiring effort and focus to operate.

"Data mirrors" Aspect

"Data mirrors" Aspect

Physiological Health Data

(via built-in biosensors)

These metrics help determine what the body needs at any given time.

Heart Rate (BPM): Indicates exertion, rest state, or stress levels.

Blood Pressure: Helps avoid high-sodium food if elevated.

Blood Glucose: Critical for managing sugar intake, especially for diabetic users.

Hydration Level: Suggests water-rich foods when levels are low.
(Optional future extension: Body temperature, oxygen saturation)
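
Taken together, these signals can be reduced to a small set of dietary flags for the recommendation logic to act on. A minimal sketch, assuming simplified placeholder thresholds rather than clinically validated ones:

interface Biosensors { heartRateBpm: number; systolicBp: number; glucoseMgDl: number; hydrationPct: number; }
type DietaryFlag = "limit-sodium" | "limit-sugar" | "needs-fluids" | "post-exertion-refuel";

// Map raw readings to what the body needs right now; thresholds are placeholders, not medical guidance.
function dietaryFlags(v: Biosensors): DietaryFlag[] {
  const flags: DietaryFlag[] = [];
  if (v.systolicBp > 130) flags.push("limit-sodium");
  if (v.glucoseMgDl > 140) flags.push("limit-sugar");
  if (v.hydrationPct < 50) flags.push("needs-fluids");
  if (v.heartRateBpm > 120) flags.push("post-exertion-refuel");   // e.g., right after a workout
  return flags;
}

console.log(dietaryFlags({ heartRateBpm: 131, systolicBp: 136, glucoseMgDl: 102, hydrationPct: 47 }));
// -> [ "limit-sodium", "needs-fluids", "post-exertion-refuel" ]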

Food Recognition Data

(via AR lens and computer vision)

This ensures the system knows what the user is about to eat.

Food Identification: Real-time visual detection of items (e.g., pizza, apple).

Portion Size Estimation: Helps calculate nutritional value more accurately.

Nutritional Breakdown: Calories, protein, carbs, fats, sugar, sodium, fiber.

Meal Logging: Automatically logs recognized meals for dietary tracking.
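
The recognition output can be pictured as one structured result per detected item, which the glasses turn straight into a meal-log entry. The types and nutrition numbers below are illustrative assumptions, not a real NutriLens schema:

// Hypothetical result of one computer-vision detection.
interface Nutrition { calories: number; proteinG: number; carbsG: number; fatG: number; sugarG: number; sodiumMg: number; fiberG: number; }
interface FoodDetection { label: string; confidence: number; estimatedPortionG: number; per100g: Nutrition; }
interface MealLogEntry { food: string; timestamp: string; portionG: number; totals: Nutrition; }

// Scale per-100 g values by the estimated portion so the log reflects what was actually eaten.
function toLogEntry(d: FoodDetection, when: Date = new Date()): MealLogEntry {
  const f = d.estimatedPortionG / 100;
  const p = d.per100g;
  const totals: Nutrition = {
    calories: Math.round(p.calories * f), proteinG: Math.round(p.proteinG * f),
    carbsG: Math.round(p.carbsG * f), fatG: Math.round(p.fatG * f),
    sugarG: Math.round(p.sugarG * f), sodiumMg: Math.round(p.sodiumMg * f),
    fiberG: Math.round(p.fiberG * f),
  };
  return { food: d.label, timestamp: when.toISOString(), portionG: d.estimatedPortionG, totals };
}

const apple: FoodDetection = {
  label: "apple", confidence: 0.94, estimatedPortionG: 180,
  per100g: { calories: 52, proteinG: 0, carbsG: 14, fatG: 0, sugarG: 10, sodiumMg: 1, fiberG: 2 },
};
console.log(toLogEntry(apple));   // logged passively, with no manual input from the user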

Environmental Context Data

(via object and location detection)

The glasses adapt based on where the user is and what’s around them.

Physical Location Recognition: Kitchen → triggers meal suggestions; bakery/store → alerts about high-sugar environments

Object Detection: Fridge, stove, counter → determines food preparation context

Time of Day Awareness: Helps align food suggestions (e.g., breakfast vs. dinner)
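
A handful of simple rules is enough to illustrate how place, detected objects, and time of day combine into a contextual nudge. Everything named below is a hypothetical placeholder:

type Place = "kitchen" | "bakery" | "grocery-store" | "other";

interface Context { place: Place; objectsInView: string[]; hour: number; }   // hour: 0-23, local time

// Illustrative hard-coded rules; a real system would be far more nuanced.
function contextNudge(c: Context): string | null {
  const mealWord = c.hour < 11 ? "breakfast" : c.hour < 16 ? "lunch" : "dinner";
  if (c.place === "kitchen" && c.objectsInView.includes("stove")) {
    return `Looks like you are cooking - want a balanced ${mealWord} idea?`;
  }
  if (c.place === "bakery") {
    return "High-sugar environment - you are close to today's sugar target.";
  }
  if (c.place === "grocery-store") {
    return "Shopping trip detected - fiber-rich items would balance this week's meals.";
  }
  return null;   // unfamiliar context, stay quiet
}

console.log(contextNudge({ place: "kitchen", objectsInView: ["fridge", "stove"], hour: 19 }));
// -> "Looks like you are cooking - want a balanced dinner idea?"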

Historical Meal Data

(via recorded food data)

This ensures food recommendations consider what the user already consumed.

Last Meal Composition & Time: Avoids repeating high-sodium/sugar meals and adjusts meal portion or type based on recent intake

Daily Caloric and Nutritional Summary: Guides toward balance (e.g., more fiber if low)
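
The running daily summary can be sketched as a simple fold over the day’s logged meals, compared against target ranges. The targets and advice strings here are assumptions for illustration, not recommended intakes:

interface MealTotals { calories: number; sodiumMg: number; sugarG: number; fiberG: number; }

// Illustrative daily targets; real values would come from the user's profile and health conditions.
const dailyTarget: MealTotals = { calories: 2000, sodiumMg: 2300, sugarG: 50, fiberG: 30 };

// Sum everything logged so far today.
function summarize(meals: MealTotals[]): MealTotals {
  return meals.reduce(
    (acc, m) => ({
      calories: acc.calories + m.calories,
      sodiumMg: acc.sodiumMg + m.sodiumMg,
      sugarG: acc.sugarG + m.sugarG,
      fiberG: acc.fiberG + m.fiberG,
    }),
    { calories: 0, sodiumMg: 0, sugarG: 0, fiberG: 0 }
  );
}

// Turn the gap between intake and target into guidance for the next meal.
function balanceAdvice(today: MealTotals): string[] {
  const advice: string[] = [];
  if (today.fiberG < dailyTarget.fiberG * 0.5) advice.push("Low on fiber - favor vegetables, fruit, or whole grains next.");
  if (today.sodiumMg > dailyTarget.sodiumMg * 0.8) advice.push("Close to the sodium limit - keep the next meal low-salt.");
  if (today.sugarG > dailyTarget.sugarG) advice.push("Sugar target exceeded - maybe skip dessert today.");
  return advice;
}

const mealsToday: MealTotals[] = [
  { calories: 420, sodiumMg: 900, sugarG: 18, fiberG: 4 },    // breakfast
  { calories: 650, sodiumMg: 1100, sugarG: 22, fiberG: 6 },   // lunch
];
console.log(balanceAdvice(summarize(mealsToday)));
// -> low-fiber and near-sodium-limit advice for the next meal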

User Journey 1

User Journey 2

Chapter 3

Design

Interactive Mockup

Prototype

See Smart, Eat Smarter!