
Octopus Glove
A funded project by OCAD University. The Octopus Glove is a wearable designed for people with special needs, with visually impaired users as the primary audience. Through this project, the team built a haptics-based language to communicate day-to-day information to the wearer, such as the time and directions.
Technology used: Arduino, vibration motors, and piezoelectric actuators
The Idea
Touch is an important and often underutilized sense in human-computer interaction. For people with hearing or visual impairments, touch becomes even more important and may be the only suitable communication channel. Touch can also convey information discreetly. Time, meanwhile, plays a crucial role in our daily lives, so telling the time is an essential everyday task. The two main subjects of this project are therefore touch and time.
The team recognised the importance of senses that we as humans have not fully explored, and therefore set out to research and build a new tactile language, based on haptics, that visually impaired people can use in their day-to-day activities.
The Question:
Can we use actuators to communicate complex messages?
Parameters such as frequency, amplitude, rhythm, and duration can be combined to encode complex messages.
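As a rough illustration of those parameters, the Arduino-style sketch below drives a single coin vibration motor and plays short "tactons" that differ only in amplitude (PWM duty cycle), duration, and rhythm. The pin number and the two example patterns are assumptions made for this sketch, not the project's actual vocabulary.

```cpp
// Minimal sketch: one vibration motor on a PWM pin (driven through a
// transistor). Pin number and pattern values are illustrative assumptions.
const int MOTOR_PIN = 9;

// Play one "tacton": a pulse of a given strength (amplitude) and length
// (duration), followed by a pause that shapes the rhythm.
void pulse(int amplitude, int durationMs, int pauseMs) {
  analogWrite(MOTOR_PIN, amplitude);  // 0-255 PWM duty cycle
  delay(durationMs);
  analogWrite(MOTOR_PIN, 0);
  delay(pauseMs);
}

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
}

void loop() {
  // Two example patterns: a short-short "alert" and a long "confirm".
  pulse(200, 100, 100);
  pulse(200, 100, 600);
  pulse(255, 500, 1500);
}
```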

Research 1
Tactons: structured tactile messages for non-visual information display
Example: reading with spatio-temporal messages.
- Encoding letters and numbers in a grid
- Using time and location to train users to recognize patterns
[3] Stephen Brewster and Lorna M. Brown. 2004. Tactons: structured tactile messages for non-visual information display. In Proceedings of the fifth conference on Australasian user interface - Volume 28 (AUIC '04). Australian Computer Society, Inc., AUS, 15–
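To make the grid idea concrete, the sketch below maps the digits 0-8 onto a 3x3 grid of vibration motors and plays one cell at a time, so each symbol is defined by where and when the skin is stimulated. The pin assignments and the digit-to-cell mapping are assumptions for illustration, not the encoding used in [3].

```cpp
// 3x3 grid of vibration motors; each digit maps to one cell and is
// played as a single spatio-temporal pulse. Pins are assumptions.
const int GRID_PINS[3][3] = {
  {2, 3, 4},
  {5, 6, 7},
  {8, 9, 10}
};

void vibrateCell(int row, int col, int durationMs) {
  digitalWrite(GRID_PINS[row][col], HIGH);
  delay(durationMs);
  digitalWrite(GRID_PINS[row][col], LOW);
  delay(200);  // gap so consecutive cells stay distinguishable
}

// Map '0'-'8' onto the nine cells, left to right, top to bottom.
void playDigit(char digit) {
  int index = digit - '0';
  if (index < 0 || index > 8) return;
  vibrateCell(index / 3, index % 3, 400);
}

void setup() {
  for (int r = 0; r < 3; r++)
    for (int c = 0; c < 3; c++)
      pinMode(GRID_PINS[r][c], OUTPUT);
}

void loop() {
  playDigit('4');  // example: centre cell
  delay(2000);
}
```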
Research 2
Skin Reading: Encoding Text in a 6-Channel Haptic Display
- Choosing locations on the body for the tactile actuators
- An encoding that activates three vibromotors at a time
Types of techniques:
- Actuators
- Air-pumped soft pneumatic actuators
- Piezoelectric actuators
[2] Granit Luzhnica, Eduardo Veas, and Viktoria Pammer. 2016. Skin Reading: encoding text in a 6-channel haptic display. In Proceedings of the 2016 ACM International Symposium on Wearable Computers (ISWC'16). Association for Computing Machinery, New York, NY, USA, 148–155. https://doi.org/10.1145/2971763.2971769
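The sketch below illustrates the general 6-channel idea: each symbol fires up to three of six vibromotors at once. The pin choices and the tiny example symbol table are assumptions for illustration; the actual Skin Reading encoding in [2] is more elaborate.

```cpp
// Six-channel display: a symbol activates up to three motors together.
// Pins and the example symbol table are assumptions, not the paper's encoding.
const int NUM_MOTORS = 6;
const int MOTOR_PINS[NUM_MOTORS] = {2, 3, 4, 5, 6, 7};

struct Symbol {
  char letter;
  bool active[NUM_MOTORS];  // which of the six motors fire together
};

// Three illustrative entries; a full alphabet would extend this table.
const Symbol SYMBOLS[] = {
  {'a', {true,  false, false, false, false, false}},
  {'b', {true,  true,  false, false, false, false}},
  {'c', {true,  true,  true,  false, false, false}},
};

void playSymbol(char letter, int durationMs) {
  for (const Symbol &s : SYMBOLS) {
    if (s.letter != letter) continue;
    for (int i = 0; i < NUM_MOTORS; i++)
      digitalWrite(MOTOR_PINS[i], s.active[i] ? HIGH : LOW);
    delay(durationMs);
    for (int i = 0; i < NUM_MOTORS; i++)
      digitalWrite(MOTOR_PINS[i], LOW);
    delay(300);  // inter-symbol gap
    return;
  }
}

void setup() {
  for (int i = 0; i < NUM_MOTORS; i++) pinMode(MOTOR_PINS[i], OUTPUT);
}

void loop() {
  playSymbol('b', 500);
  delay(2000);
}
```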
Demo Prototype
Reading time with touch.
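As a minimal sketch of how the demo could read out the time, the code below assumes a single motor and a simple count-based code: long pulses for the hour, a pause, then short pulses for the tens of minutes. The pin and the encoding are illustrative assumptions, not necessarily the prototype's exact scheme.

```cpp
// Telling the time through touch with one motor and counted pulses.
const int MOTOR_PIN = 9;

void pulse(int durationMs, int gapMs) {
  digitalWrite(MOTOR_PIN, HIGH);
  delay(durationMs);
  digitalWrite(MOTOR_PIN, LOW);
  delay(gapMs);
}

// e.g. 3:40 -> three long pulses, a pause, then four short pulses.
void playTime(int hour, int minute) {
  for (int i = 0; i < hour; i++) pulse(600, 300);          // long pulses = hours
  delay(1200);                                             // separator
  for (int i = 0; i < minute / 10; i++) pulse(200, 300);   // short pulses = tens of minutes
}

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
}

void loop() {
  playTime(3, 40);
  delay(5000);
}
```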

