
Neurotechnology Wearables
07
2025
Prototyping
XR
Neurotech
DELIVERABLES
Working prototypes
INTRO
Pregnancy is a journey filled with emotions, sensations, and connections—yet these experiences are often felt in isolation. Our project, CoFeel, seeks to bridge this gap.
The problem
Pregnancy Is Felt in Isolation
Pregnancy brings intense physical and emotional changes—but often, these experiences are invisible to loved ones. This emotional gap can increase anxiety and leave expectant mothers feeling alone, even with strong support systems.



This is what a Radar looks like, but there are more than 20 more of them!
What it is
Turning Pregnancy Into a Shared Journey
CoFeel is an immersive VR experience that lets partners and family feel pregnancy together. By translating real fetal movements and emotional states into a shared virtual environment, CoFeel makes pregnancy a multisensory, participatory experience rather than a solitary one.



Discover the Bigger Picture



Build Depth for a Personalized Path
Sensing
We used EMG sensors on the abdomen to detect fetal movements and EEG sensors on the frontal and occipital lobes to monitor emotional states. These signals form the bridge between the pregnant individual's inner experience and the shared virtual world.
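As a rough illustration of the sensing side, a fetal-movement detector over a raw EMG stream can be sketched as a rectify–smooth–threshold loop. Everything below (sampling rate, window length, threshold) is illustrative, not the values used in the prototype:

```python
from collections import deque

def detect_kicks(emg_samples, fs=250, window_s=0.2, threshold=0.5):
    """Flag candidate fetal-movement events in a raw EMG stream.

    Rectifies the signal, smooths it with a moving-average envelope,
    and records the sample index each time the envelope rises above
    the threshold. All numbers here are illustrative placeholders.
    """
    window = max(1, int(fs * window_s))
    buf = deque(maxlen=window)
    events, above = [], False
    for i, x in enumerate(emg_samples):
        buf.append(abs(x))              # full-wave rectification
        envelope = sum(buf) / len(buf)  # moving-average envelope
        if envelope > threshold and not above:
            events.append(i)            # rising edge = candidate kick
            above = True
        elif envelope <= threshold:
            above = False
    return events
```

In the actual prototype the samples would stream from the OpenBCI hardware rather than a Python list, and the threshold would need per-user calibration.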






Environment Visualization in VR
Emotions as Weather
Partners immersed in VR can respond by lighting a symbolic lotus lamp—sending comforting LED patterns back to the pregnant person's wearable vest. Interacting with the baby's avatar in VR also triggers glowing feedback, creating a continuous emotional loop between mother, partner, and baby.
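The VR-to-vest half of that loop can be sketched as a simple event-to-pattern lookup. The event names, colors, and modes below are hypothetical placeholders, not the shipped protocol:

```python
# Hypothetical mapping from VR-side interaction events to vest LED patterns.
LED_PATTERNS = {
    "lotus_lit":    {"color": (255, 180, 60),  "mode": "breathe", "duration_s": 8},
    "avatar_touch": {"color": (120, 200, 255), "mode": "ripple",  "duration_s": 3},
}

def vest_command(event):
    """Resolve a VR interaction event to an LED command for the vest.

    Unknown events fall back to a gentle idle glow so the vest never
    receives an undefined state.
    """
    default = {"color": (40, 40, 40), "mode": "idle", "duration_s": 0}
    return LED_PATTERNS.get(event, default)
```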



Galaxy Visualization: Enables open-ended exploration, revealing hidden connections through an intuitive, visual map.



Personal Dashboard: Adapts to individual roles and goals, surfacing timely, relevant insights that support day-to-day decisions.
Final Result
The project was recognized as a Grand Finalist at MIT Reality Hack 2025 and received 2nd Place for Best Use of OpenBCI, celebrating its innovative use of biosignals and immersive technology to create new forms of intimate connection.
Listening to the Body and Mind



Technology View
What it does: Groups projects by the technologies they apply or explore.
Hierarchy: Tech field → Specific methods/models → Projects
Why it’s helpful: Ideal for identifying innovation trends, technical overlaps, or evaluating tech adoption maturity.



Domain View
What it does: Groups projects based on problem space within SLB, application area, or impact field (e.g., “Subsurface,” “Grid Modernization”).
Hierarchy: Domain → Product → Projects
Why it’s helpful: Useful for strategists or external partners to see applied impact areas and identify gaps or overlaps in research.



Team View
What it does: Organizes projects by contributing teams (e.g., AI Lab, Frontend, Robotics).
Hierarchy: Team → Sub teams within a Lab → Projects
Why it’s helpful: Great for internal alignment, performance tracking, and collaboration mapping across the org.
Inside the VR world, calm waters mirror tranquility, while shifting weather patterns (sunlight, rain, clouds) reflect emotional changes.
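One way to realize this mapping, assuming the EEG pipeline yields valence and arousal scores normalized to [-1, 1], is a small pure function from emotional state to weather parameters. The formulas are an illustrative sketch, not the implemented logic:

```python
def emotion_to_weather(valence, arousal):
    """Map an EEG-derived emotional state to VR weather parameters.

    valence/arousal are assumed normalized to [-1, 1]; the specific
    formulas below are placeholders for illustration only.
    """
    valence = max(-1.0, min(1.0, valence))
    arousal = max(-1.0, min(1.0, arousal))
    return {
        "sunlight":    (valence + 1) / 2,                        # brighter as mood lifts
        "cloud_cover": (1 - valence) / 2,                        # clouds gather with low mood
        "rain":        max(0.0, -valence) * (arousal + 1) / 2,   # agitated distress brings rain
        "wave_height": 0.1 + 0.4 * (arousal + 1) / 2,            # calm waters when relaxed
    }
```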
Due to time limitations, we didn't get to implement all the features. Below are the proposed features that would be possible with the data available from Radar.



Inspiration Prompt:
Inspire users about what questions to ask
Present popular questions
Build a mental model of how the system works



Intelligent Search & Filters:
Help users ask better questions by:
Evaluating whether the system has enough parameters from the user's input
Asking follow-up questions and suggesting refinements
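A minimal sketch of that sufficiency check, assuming the query has already been tagged into a dict of parameters (parameter names and prompt wording are placeholders):

```python
def refine_query(tags):
    """Check whether a tagged query has enough parameters to answer.

    Returns a follow-up question for the first missing parameter,
    or None when the search can run directly. The parameter names
    and prompts here are illustrative placeholders.
    """
    prompts = {
        "view":   "Which lens do you want: technology, domain, or team?",
        "intent": "Are you exploring broadly, or comparing specific projects?",
    }
    for param, prompt in prompts.items():
        if tags.get(param) is None:
            return prompt
    return None  # enough parameters: run the search directly
```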



The interactive timeline serves as both a filter and a line graph that indicates change in activity over time.



See projects with related domains, technologies, and teams at a glance.
Visual parameter | Mapped to | What's needed to implement
Planet size | Effort invested (time or money) | Actual effort metrics, e.g. based on time and money
Star brightness | Impact or visibility | A quantitative impact score, or stakeholder priority
Rotation speed (within the trail, or self-rotation) | Update frequency within stages, major updates | A more granular way of documenting updates in projects
Split-off trails | Whether multiple spin-off projects emerged from an original one | More relationship data to indicate relationships between projects
Satellite (Level 3 category) | Work that supported that specific project (tech evaluation, partner, spin-off) | Evaluation of DataOS
Planet shape | Entity type | Project, tech evaluation, partner
Domain star cluster | Product sub-concepts | Product line features
Bidirectional Interaction
Feeling Each Other’s Presence



Our approach enables adaptive interfaces through conversational feedback. Users can directly express what they want to track or explore.



By entering their goal or question into a prompt input box, the system performs intent tagging to categorize the query and automatically surfaces the most relevant UI components.
Step 1: Intent Tagging: Understanding the ask
On the input-interpretation side, I defined five parameters to tag a question.
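A rule-based stand-in for the tagging step might look like the following. For brevity it covers only two placeholder parameters rather than the full five, the keyword lists are invented for illustration, and in practice an LLM would fill the same schema instead of keyword matching:

```python
# Placeholder tagging schema: each parameter maps candidate labels to
# trigger keywords. Parameter names, labels, and keywords are illustrative.
KEYWORDS = {
    "view":   {"technology": ["tech", "model"],
               "domain":     ["domain", "impact"],
               "team":       ["team", "lab"]},
    "intent": {"discover":   ["what", "show", "explore"],
               "compare":    ["versus", "compare"]},
}

def tag_question(question):
    """Tag a free-text question with one label per parameter.

    A parameter left as None signals that the system lacks that
    input and should ask a follow-up question.
    """
    q = question.lower()
    tags = {}
    for param, options in KEYWORDS.items():
        tags[param] = next(
            (label for label, words in options.items() if any(w in q for w in words)),
            None,
        )
    return tags
```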








For example ☝️
How did I come up with these parameters? Why?
During the user interviews, I asked each interviewee: "If you could ask the galaxy page any question, what would you ask?" and looked for recurring patterns in what they were fundamentally asking for.
While an LLM could infer UI responses directly, defining a parameter-tagging framework adds a structured, interpretable layer between user intent and system response—putting the LLM on rails as opposed to letting it range free.



Step 2: UI Matching: How to Answer with Different UI Components
On the system output side, each potential intent has a set of corresponding UI components that answer it.
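The intent-to-component lookup can be sketched as a plain mapping; the intent and component names below are illustrative placeholders, not the actual component library:

```python
# Each tagged intent resolves to an ordered list of UI components.
INTENT_TO_UI = {
    "discover": ["galaxy_map", "inspiration_prompts"],
    "compare":  ["side_by_side_cards", "timeline_filter"],
    "track":    ["personal_dashboard", "activity_timeline"],
}

def components_for(intent):
    """Return the UI components that answer a tagged intent.

    Unknown intents fall back to the galaxy overview rather than
    an empty screen.
    """
    return INTENT_TO_UI.get(intent, ["galaxy_map"])
```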

01

02

03

04



For example, for the "Discover" intent, a UI component like this would be fetched ☝️
Step 3: Curate Dashboard Space with Feedback






While brainstorming how to leverage the power of LLMs for better search, I talked to a startup called Glean, which provides software that combines all sources of information across platforms and domains and fuses them into smart search specific to the company.
Reflection
The launch of the new fashion e-commerce platform has had a transformative impact on the client's online presence and sales performance. The enhanced design and functionality have resulted in increased user engagement, higher conversion rates, and improved customer satisfaction.


