SenseWear

SenseWear predicts and responds to potential danger to improve safety in warehouse environments with a smart safety vest and AR headset.

view live product page

What I did

  • Conducted tech-centered research focused on mixed reality technology and spatial interaction design
  • Explored mixed reality case studies to understand the design potential of integrating physical and digital experiences in Unity
  • Prototyped and soldered sensors for data assimilation, enabling multi-sensing interactions

Timeframe

Sep - Dec 2022, 14 weeks

Tools

Unity, Blender, Figma, C#, Python, Arduino, Soldering

Team

Ning Zheng, Lucas Wang, Isabel Li, Mingshan Wang

What if you could predict potential dangers?

Our Solution

SenseWear is an ecosystem of a haptic vest, lights, and an AR HUD display. It tracks users' states and movement data to map their trajectories in relation to their surroundings, predicting and responding to danger for preventive safety and increased productivity.

Our challenge

Inspired by the sensing abilities of smart cars, we wanted to empower humans to sense their surroundings and be warned of hazards in advance.

We used the warehouse as the context for our investigation. Warehouses are hazardous environments, with an estimated 2.4 million nonfatal injuries in 2021, about 40% of them caused by collisions. Addressing these challenges creates a safer, more supportive multi-sensing environment for warehouse workers.

researching
through
prototyping.

Our process

Mixed reality technology is a relatively new medium that has not yet been fully integrated into daily use. As such, it is important to conduct research to better understand its design capabilities.

Using a technology-centric design process, I continually prototyped in Unity to explore data assimilation and to bridge digital and physical experiences.

Prototyping in Unity helped us showcase our ideas and allowed us to test and validate our approach with VR headsets.

Digital system feature 01

status

detection

clone this code on GitHub

Different working states pose different levels of danger. We want SenseWear to sense which state a worker is in so it can offer the best protection.

Behavior Analysis

To design against collisions, we first need to understand how workers behave in the warehouse. Through research, we found that warehouse workers' activity can be categorized into distinct states: resting, standing, moving, lifting, and carrying.

Referring to Hall's proxemic zones and architectural space criteria, we defined a distance zone for each state: the faster a worker moves and the heavier the object they carry, the larger the distance zone.
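As a rough sketch of this sizing rule (the base radii and scaling factors below are hypothetical illustrations, not the tuned values from our Unity prototype):

```python
# Hypothetical distance-zone sizing: the zone grows with movement speed and
# carried load. Base radii are illustrative, loosely inspired by Hall's
# proxemic zones (meters).

BASE_ZONE = {
    "resting": 0.5,
    "standing": 1.2,
    "moving": 2.0,
    "lifting": 1.5,
    "carrying": 2.5,
}

def distance_zone(state: str, speed_mps: float = 0.0, load_kg: float = 0.0) -> float:
    """Return the radius (m) of the worker's danger zone for a given state."""
    radius = BASE_ZONE[state]
    radius += 0.5 * speed_mps   # faster movement -> larger zone
    radius += 0.05 * load_kg    # heavier load -> larger zone
    return round(radius, 2)
```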

tech-centered prototype

To test viability and showcase the idea, I made a low-fidelity prototype that detects changes between states in real time.
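A minimal version of that state detection can be sketched as a threshold classifier; the thresholds and inputs here are hypothetical stand-ins for the prototype's sensor data:

```python
# Hypothetical state classifier: maps raw movement speed and a load flag to
# one of the worker states. Thresholds are illustrative, not tuned values.

def classify_state(speed_mps: float, load_detected: bool) -> str:
    """Naive threshold classifier for the worker's current state."""
    if load_detected:
        return "carrying" if speed_mps > 0.3 else "lifting"
    if speed_mps > 0.3:
        return "moving"
    if speed_mps > 0.05:
        return "standing"
    return "resting"
```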

Digital system feature 02

avoidance

trajectory

We replaced traditional navigation arrows with a humanoid guide in our AR system to foster a sense of warmth and connection with workers as they navigate the warehouse, making the system more intuitive and user-friendly.

clone this code on GitHub

Digital system feature 03

collision

warning

Our guide uses trigger zones to detect other guides and creates a trajectory map three seconds in advance. This lets it predict potential collisions and adjust its path to steer users away from them, improving safety and efficiency.
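The trigger-zone logic can be sketched in 2D as linear extrapolation plus an overlap test, assuming simplified constant-velocity motion (our Unity implementation works on live trajectory data):

```python
import math

# Hypothetical 2D sketch of the trigger-zone logic: extrapolate each worker's
# position three seconds ahead and flag pairs whose danger zones would overlap.

def predict(pos, vel, horizon_s=3.0):
    """Linear extrapolation of an (x, y) position along a velocity (m/s)."""
    return (pos[0] + vel[0] * horizon_s, pos[1] + vel[1] * horizon_s)

def collision_warning(a_pos, a_vel, a_zone, b_pos, b_vel, b_zone, horizon_s=3.0):
    """True if the two predicted danger zones would intersect."""
    ax, ay = predict(a_pos, a_vel, horizon_s)
    bx, by = predict(b_pos, b_vel, horizon_s)
    return math.hypot(ax - bx, ay - by) < a_zone + b_zone
```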


Physical feature

haptic vest

Physical smart PPE vest scanned with photogrammetry

testing with a rapid prototype

We created a rapid haptic vest prototype by stitching 2 haptic gloves to the front and back of the vest, then tested it with several people and found:

  • Front and back alone could not convey a comprehensive sense of direction, so we moved to 6 haptic motors representing front left, front right, left, right, back left, and back right.
  • The frequency and pattern of the haptics were too stiff and did not alert users to direction effectively. For the final prototype, we changed the frequency in the Arduino code.
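A sketch of how the six-motor layout and the retuned pulsing might map to hazard direction and distance (the sector angles and pulse timings are illustrative, not our final Arduino values):

```python
# Hypothetical mapping from the bearing of a detected hazard (degrees,
# 0 = straight ahead, positive clockwise) to one of the vest's six motors,
# plus a pulse interval that speeds up as the hazard gets closer.

MOTOR_BEARINGS = {
    "front_left": -45, "front_right": 45,
    "left": -90, "right": 90,
    "back_left": -135, "back_right": 135,
}

def motor_for_bearing(bearing_deg: float) -> str:
    """Pick the motor whose sector is closest to the hazard's bearing."""
    bearing = (bearing_deg + 180) % 360 - 180   # normalize to [-180, 180)
    def angular_gap(motor_angle):
        return abs((bearing - motor_angle + 180) % 360 - 180)
    return min(MOTOR_BEARINGS, key=lambda m: angular_gap(MOTOR_BEARINGS[m]))

def pulse_interval_ms(distance_m: float, zone_m: float) -> int:
    """Closer hazards pulse faster (hypothetical tuning)."""
    closeness = max(0.0, min(1.0, 1 - distance_m / zone_m))
    return int(600 - 500 * closeness)  # 600 ms when far -> 100 ms at contact
```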

design the haptics with the Arduino

By linking the Arduino board to Unity, we connected the digital signal to the physical experience.
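One way to picture the Unity-to-Arduino link is as a small serial frame per haptic command; this byte layout is hypothetical, shown only to illustrate the idea:

```python
# Hypothetical wire format for the Unity -> Arduino serial link: one 4-byte
# frame per haptic command.
#   start byte | motor index (0-5) | intensity (0-255) | XOR checksum

START_BYTE = 0xA5

def encode_haptic_frame(motor: int, intensity: int) -> bytes:
    """Build one command frame for the vest's haptic driver."""
    if not 0 <= motor <= 5:
        raise ValueError("motor index must be 0-5")
    if not 0 <= intensity <= 255:
        raise ValueError("intensity must fit in one byte")
    checksum = START_BYTE ^ motor ^ intensity
    return bytes([START_BYTE, motor, intensity, checksum])
```

On the Arduino side, a matching loop would read four bytes, verify the XOR checksum, and drive the selected motor; in Unity, the C# side could send the same frame over `SerialPort.Write`.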

Soldering the wires


full Unity prototype

takeaways

Tech-centered design approach

By adopting a design approach centered around technology, I consistently pushed the boundaries and explored new possibilities, resulting in innovative forms of interaction. My biggest takeaways from this experience were "trust the process" and the concept of learning through hands-on experimentation.

focus on scalability

In our workflow, we paid close attention to the scalability of different interactions. I consciously prioritized the use and management of code to implement our features, which made it easier to manage test prototypes and feature versions and to transfer our features to new scenarios and future iterations.
