MIT Media Lab - MAS. 834 - Tangible Interface
Group of 5: Hye Jun Youn, Aria Xiying Bao, Vishal Vaidhyanathan, Joseph Wu, Lillian Liu
Ideation, Wearable design & fabrication, VR software + hardware prototyping, Video production
We propose TeleCon (Tele + Silicon), a pneumatically actuated, flexible haptic wearable: a self-contained soft-robotic silicone patch that can be worn anywhere on the user's skin. TeleCon provides tactile stimuli by leveraging kinesthetic cues from force-feedback silicone actuators that push against the user's body.
Existing wearable haptic devices have mainly focused on 1) domain-specific design and fabrication strategies, such as gloves or rings on a finger that simulate sensations, and 2) real-time haptic feedback. Our goals for this project are 1) to create a self-contained wearable device that can be worn anywhere on the user's body and 2) to provide application scenarios and an interface for both real-time and past-to-present telepresence interaction.
The sensation of feeling the shape of an object when grasping it in Virtual Reality (VR) enhances the sense of presence and the ease of object manipulation. Most prior work focuses on force feedback on the fingers, yet haptic emulation of grasping a 3D shape often requires the sensation of touch across the entire hand. When a user's hand encounters an object in a mixed-reality world, pneumatic patches on the hand and wrist inflate into an appropriate shape, ready for grasping. The patches then deflate when the object is no longer in play.
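The grasping interaction above can be sketched as a simple mapping from a virtual object's size to a per-patch inflation level. This is an illustrative sketch only; the function and port names are hypothetical, and the real project drives FlowIO ports from Unity rather than Python.

```python
# Sketch: map a virtual object's size to patch inflation levels.
# All names and thresholds here are illustrative assumptions.

def inflation_pwm(object_radius_cm, min_r=2.0, max_r=10.0, max_pwm=255):
    """Map a grasped object's radius to a pump PWM duty value.

    Objects at or below min_r produce no inflation; objects at or
    above max_r produce full inflation; sizes in between scale linearly.
    """
    if object_radius_cm <= min_r:
        return 0
    frac = min((object_radius_cm - min_r) / (max_r - min_r), 1.0)
    return round(frac * max_pwm)

def patch_commands(object_radius_cm, ports=(1, 2, 3)):
    """Inflate every hand/wrist patch to the same level while the hand
    overlaps the virtual object; a radius of 0 releases (deflates) them."""
    pwm = inflation_pwm(object_radius_cm)
    action = "inflate" if pwm > 0 else "release"
    return [(port, action, pwm) for port in ports]
```

Deflation on release falls out naturally: passing a radius of 0 when the object leaves play turns every command into a "release".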
When wearing VR goggles, users are unaware of their surroundings and may risk injury from running into surrounding objects, walls, stairs, and other obstacles. A wearable device on the neck alerts the user by inflating when the user gets close to a wall, signalling the user not to move forward. A haptic stimulus placed on the neck is most appropriate because the neck tends to lead movement and can potentially cause a change in movement when forces are applied.
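One plausible control law for this alert, assuming the headset can report a distance to the nearest obstacle, is to scale inflation up as that distance shrinks. The thresholds and names below are assumptions for illustration, not the project's actual tuning.

```python
# Sketch: obstacle-proximity alert for a neck-worn pneumatic patch.
# warn_at / stop_at thresholds are illustrative assumptions.

def neck_alert_level(distance_m, warn_at=1.0, stop_at=0.3, max_pwm=255):
    """Return a pump PWM value for the neck patch.

    No inflation beyond warn_at; full inflation at or inside stop_at;
    linear ramp in between, so the warning intensifies as the user
    approaches the wall.
    """
    if distance_m >= warn_at:
        return 0
    if distance_m <= stop_at:
        return max_pwm
    frac = (warn_at - distance_m) / (warn_at - stop_at)
    return round(frac * max_pwm)
```

A linear ramp (rather than a binary on/off) lets the user feel the obstacle "approaching" before a hard stop is required.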
Visual and auditory information can be easily saved on today's everyday devices and shared on social media. However, tactile sensation is still difficult to save and share with currently available interfaces. Recording a memory of tactile sensation captures something more vivid, immediate, and visceral - a core component of our perception tied to emotions such as happiness, sadness, anger, and excitement. Revisiting memories in media with audio, visual, and haptic outputs makes them more likely to accurately replicate our original experience. We created and designed an application that saves the patterns and forces of inflation on our device and shares the tactile sensation with other people. Users can save ephemeral sensations, such as the feel of bubbles popping on their skin or the impression of their dog's paws. By storing and transmitting past sensations on our devices, one can retrieve memories, share sensations with loved ones, or even sell and buy unique patterns of sensation on social media or in the Metaverse.
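A recorded sensation can be represented as a timestamped list of actuation events, serialized so it can be stored, shared, or replayed on another pad. The schema below is a minimal sketch of that idea, not the application's actual file format.

```python
# Sketch: record a tactile pattern as timestamped actuation frames and
# serialize it with JSON so it can be shared and replayed. The field
# names ("t_ms", "port", "action", "pwm") are illustrative assumptions.
import json

def record_frame(log, t_ms, port, action, pwm):
    """Append one actuation event (e.g. a bubble 'pop') to the log."""
    log.append({"t_ms": t_ms, "port": port, "action": action, "pwm": pwm})

def save_pattern(log, name):
    """Serialize a recorded sensation for storage or sharing."""
    return json.dumps({"name": name, "frames": log})

def load_pattern(blob):
    """Recover a named frame list, ready to be replayed port by port."""
    data = json.loads(blob)
    return data["name"], data["frames"]
```

Replaying is then just walking the frame list in time order and issuing each frame's action to its port.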
We conducted a round of usability tests with 12 pairs of participants. In each pair, we asked one participant to touch a textile and describe the feeling. We then asked the other participant to wear our haptic device and describe the sensation it produced. Participants verbalized which sensations they felt on their skin using a list of pre-existing descriptors. We then compared the two descriptions and evaluated their similarities and differences, with an eye toward making the second participant "feel," through haptic output, something as similar as possible to what the first participant felt through direct touch.
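Because both participants choose from the same fixed descriptor list, one simple way to score a pair is set overlap. The Jaccard measure below is a sketch of such a scoring step, assuming descriptors are compared as sets; the source does not specify the exact metric used.

```python
# Sketch: score how closely the device wearer's descriptors match the
# direct-touch participant's, as Jaccard overlap of descriptor sets.
# The use of Jaccard similarity here is an assumption for illustration.

def descriptor_similarity(direct, device):
    """Return |intersection| / |union| of the two descriptor sets:
    1.0 means identical descriptions, 0.0 means no shared descriptor."""
    a, b = set(direct), set(device)
    return len(a & b) / len(a | b) if a | b else 1.0
```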
During an epidemic, people often encounter situations where they cannot interact face-to-face; this includes patients and physical therapists. TeleCon can support remote physical therapy sessions. By placing TeleCon on the general body area where the patient feels discomfort and inflating each grid cell of the pad one by one, the exact location of the patient's pain can be identified, and the therapist can develop a treatment plan accordingly.
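The one-by-one grid scan described above amounts to iterating over cells and recording where the patient reports discomfort. In this sketch the patient's verbal response is stood in for by a callback; the port layout and names are illustrative.

```python
# Sketch: localize pain by inflating each grid cell in turn and
# recording the patient's response. "feels_pain" stands in for the
# patient's verbal report during a remote session (an assumption).

def scan_for_pain(grid_ports, feels_pain):
    """Inflate each cell of the TeleCon grid in sequence and return
    the cells where the patient reports discomfort."""
    painful = []
    for port in grid_ports:
        # In the real device: inflate `port`, wait, then deflate.
        if feels_pain(port):
            painful.append(port)
    return painful
```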
Model-interactive inflatable shape control: changing size according to the size of the virtual object
Testing the communication from Arduino to Unity:
First, the bubble changes size according to the clicked object.
Then, the bubble detects grasping and changes the color of that object.
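A test like this typically rides on a small line-based serial protocol between Unity and the Arduino. The comma-separated frame below is a hypothetical sketch of such a protocol, not the project's actual wire format.

```python
# Sketch: a minimal line-based serial protocol for Unity <-> Arduino
# actuation messages. The "port,action,pwm" frame is an illustrative
# assumption, chosen because it is trivial to parse on both ends.

def encode_command(port, action, pwm):
    """Pack one actuation command into a newline-terminated frame."""
    return f"{port},{action},{pwm}\n"

def decode_command(line):
    """Parse a frame back into (port, action, pwm)."""
    port, action, pwm = line.strip().split(",")
    return int(port), action, int(pwm)
```

Round-tripping every message through encode/decode on the host side is a cheap way to validate the protocol before involving the hardware.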
With numerous studies in HCI making evident the rapid growth of interest in soft robotics, designing and programming soft-robotic wearable devices still faces a significant upfront hurdle. Rather than designing unique soft actuators, systems, or user experiences, one must usually create their own drivetrain components from the initial concept and devote the majority of their effort to building pneumatics, electronics, and software. FlowIO is a pneumatic development platform with a software toolbox for control, actuation, and sensing of soft robots and programmable materials. With five pneumatic ports and multiple fully integrated modules to meet diverse pressure, flow, and size requirements, FlowIO suits most wearable and non-wearable pneumatic applications in HCI and soft robotics.
Owing to its versatility, ease of use, and accessibility, we chose pneumatic actuation with FlowIO as our primary mechanism for haptic feedback.
We have designed two types of input sources for different application scenarios. One is designed for present-to-present scenarios in the physical world. The other input source is the virtual world, extracting essential variables from VR devices. We rely on software to synchronize the two pads. We compute the inputs into several variables, including ports, inflation time, pneumatic action (inflate, vacuum, release, and stop), and PWM values of the signal.
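The variables listed above can be grouped into a single command record that both input paths (physical touch and VR) produce before actuation. The structure and the touch-to-PWM mapping below are illustrative sketches, not the project's actual code.

```python
# Sketch: one record holding the variables computed from either input
# source before they are sent to the pneumatic hardware. Field names
# and the pressure-to-PWM mapping are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class PneumaticCommand:
    port: int          # which pneumatic port to drive
    action: str        # "inflate", "vacuum", "release", or "stop"
    duration_ms: int   # inflation time
    pwm: int           # drive strength of the signal, 0-255

def from_touch(port, pressure_norm, duration_ms=500):
    """Map a normalized touch pressure (0-1) sensed on one pad to a
    command replayed on the paired pad."""
    return PneumaticCommand(port, "inflate", duration_ms,
                            round(pressure_norm * 255))
```

Having both the touch path and the VR path emit the same record keeps the synchronization software agnostic to where an input came from.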
To sense touch, we connect a touch-sensor microcontroller, the MPR121, to an Arduino. We then use conductive paint, which works like a circuit, to paint the sensing areas on our TeleCon and connect them to the controller board.
To assess morphological features of the design with respect to input pressure, material density, and required output actuation, we implemented a computational fluid dynamics (CFD) simulation of TeleCon. The overall simulation setup and execution entail the following:
The TeleCon design is modeled parametrically in a 3D CAD environment with variable length, breadth, height, cross-section thickness, and pneumatic "pixel" dimensions and density. The parametric variables are chosen based on the extent of their impact on internal air flow and pressure, on the required resolution of stimuli, and on minimising fabrication complexity.
Physical material properties of the silicone - like elasticity, breaking stress, thickness, and softness - are assigned to the geometry. Associated dimensional properties like thickness and size are kept parametric to optimise the design sample space. The air-flow properties of the pneumatics hardware are programmed as parametric variables as well, to closely simulate the target pneumatic actuation.
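Keeping these properties parametric means each CFD run is one point in a design sample space that can be enumerated exhaustively. The sketch below shows that enumeration under assumed parameter names; the actual parameters and ranges used in the simulation are not specified here.

```python
# Sketch: enumerate the parametric design space handed to the CFD
# simulation runs. Parameter names and example ranges are illustrative
# assumptions, not the project's actual sweep.
import itertools

def design_samples(lengths_mm, wall_thicknesses_mm, pixel_counts):
    """Return one design dictionary per combination of patch length,
    cross-section wall thickness, and pneumatic-pixel count."""
    return [
        {"length_mm": length, "wall_mm": wall, "pixels": pixels}
        for length, wall, pixels in itertools.product(
            lengths_mm, wall_thicknesses_mm, pixel_counts)
    ]
```

Each resulting dictionary would be fed to one simulation run, letting the sweep trade off stimulus resolution against fabrication complexity systematically.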