Human-Machine Interfaces (HMIs) are constantly evolving. In 2007, the iPhone redefined the concept of portable HMIs with its creative use of a touchscreen interface that gave users the functionality of a phone and a computer, all in their pocket. Prior to the touchscreen's adoption, portable HMIs consisted mostly of single-function mechanisms such as buttons, switches, and knobs, which limited the functionality devices could offer. As the touchscreen revolution has continued, smaller, less visible, and programmable interfaces are finding their way to users.
One of the more recent evolutions, wearable user interfaces (UIs), offers users a narrower range of inputs but within more specialized devices tuned to a smaller subset of activities. For instance, products like the Apple Watch demonstrate that shrinking the screen rules out certain interactions, like typing on a keyboard with two hands, but moves users toward alternate forms of input such as voice or single-finger typing. AirPods, as another example, rely on a few basic gestures (tap, squeeze, and in-ear detection) yet still provide an engaging experience because they match a limited set of inputs to the experiences users most want control over, such as pausing or resuming their audio.
Today, while sometimes limited in the number of available user inputs, wearable UIs offer a different value proposition by being more personal and specific to the activities they are tuned for.
Intelligent fabric sensing designs offer the opportunity to expand the interactions of today's flat or rigid portable and wearable UIs (e.g., smartphones, smartwatches, and earbuds) by being lightweight, thin, flexible, and invisible. When deployed as an interface, fabric-based sensing also provides customizable inputs similar to a touchscreen's, presenting a surface of interaction instead of traditional input controls such as buttons. This touch interaction surface enables familiar gestures like tap, swipe, pinch, zoom, long or hard press, and multi-touch, offering product designers even more options for extending functionality to a wide range of surfaces and objects. To soften some of today's interfaces, wearable e-textiles and soft circuitry can be integrated into products like earbuds and wrist-worn trackers or watches, but also into less common places like helmets or the sleeves of jackets.
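As an illustration, the gesture layer above such a touch surface can be sketched as a simple classifier over raw touch events. The event fields and thresholds below are hypothetical, not any BeBop API; a minimal sketch assuming the sensor reports position, duration, and peak pressure per touch:

```python
from dataclasses import dataclass

# Hypothetical touch event from a fabric sensor surface:
# positions in millimeters, duration in seconds, peak pressure normalized 0-1.
@dataclass
class TouchEvent:
    start_x: float
    start_y: float
    end_x: float
    end_y: float
    duration: float
    peak_pressure: float

def classify_gesture(e: TouchEvent,
                     move_thresh_mm: float = 10.0,
                     long_press_s: float = 0.5,
                     hard_press: float = 0.8) -> str:
    """Map a raw touch event to a familiar gesture name."""
    travel = ((e.end_x - e.start_x) ** 2 + (e.end_y - e.start_y) ** 2) ** 0.5
    if travel >= move_thresh_mm:      # finger moved far enough: swipe
        return "swipe"
    if e.peak_pressure >= hard_press: # stationary but pressed firmly
        return "hard press"
    if e.duration >= long_press_s:    # stationary, light, but held
        return "long press"
    return "tap"

print(classify_gesture(TouchEvent(0, 0, 30, 0, 0.2, 0.3)))  # swipe
print(classify_gesture(TouchEvent(5, 5, 6, 5, 0.1, 0.3)))   # tap
```

Because the classification lives in software, the same fabric surface could ship with different thresholds, or entirely different gesture sets, per product.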
Expanding past the flat or rigid interface (i.e., touchscreens or typical control panels) allows for more than just new wearable UI experiences. Contoured surfaces that naturally fit the human form also become opportunities for new interactions and measurements. So, whereas wearable UIs reside on the user, handheld devices, tools, and equipment built for human users (anything with a formed grip or handle) also benefit from sensing interfaces made with flexible smart fabric solutions.
For example, those who want to embed interfaces into handheld devices (say, VR controllers) can design new interactions contoured to how the device is held in the hand. To maintain the wide range of touchscreen-like inputs and gestures, a flexible sensing solution is the ideal option for comfort and a full-featured user experience.
“Any surface can be your UI.” – Jerry Kurtze, VP of Business Development and Marketing at BeBop Sensors.
Additionally, with the new wave of connected fitness equipment built to improve the human body, an opportunity has emerged to sense and measure the user's interactions using smart fabrics. Today, your connected stationary bike or strength trainer can tell you how hard you are pumping your legs or arms. Tomorrow, it will be able to tell you whether you are putting too much or too little pressure on your handlebars or grips, leaning too far left or right, or have your seat or bench positioned incorrectly relative to the rest of your equipment.
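The lean-detection idea above reduces to a simple comparison of pressure readings from each grip. The sensor layout, value ranges, and tolerance here are hypothetical, purely to illustrate the kind of feedback a fabric-wrapped handlebar could produce:

```python
def grip_balance(left_pressures, right_pressures, tolerance=0.15):
    """Compare total grip pressure on each handlebar and flag a lean.

    left_pressures / right_pressures: normalized readings (0-1) from
    hypothetical fabric pressure sensors wrapped around each grip.
    Returns a short coaching hint.
    """
    left, right = sum(left_pressures), sum(right_pressures)
    total = left + right
    if total == 0:
        return "no grip detected"
    # Signed imbalance: +1 means all weight on the left, -1 all on the right.
    imbalance = (left - right) / total
    if imbalance > tolerance:
        return "leaning left"
    if imbalance < -tolerance:
        return "leaning right"
    return "balanced"

print(grip_balance([0.4, 0.5, 0.6], [0.2, 0.2, 0.2]))  # leaning left
```

The same comparison applied to seat or bench sensors would flag fore/aft positioning issues.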
From injury prevention to optimization, smart sensing surfaces that go beyond the flat and rigid interface add intelligence in places where eyes and optics can’t see.
The sensor revolution has evolved so fast that it's easy to forget we rely on sensors every day, both to detect our interactions and to sense everyday behaviors. When you raise your wrist to look at your smartwatch or tap your fitness band, the screen turns on or a light acknowledges you: the sensors inside are programmed to understand this movement as an intentional input. These same sensors are also programmed to interpret the motion of your bouncing gait or swinging arms as steps or some other form of exercise. In the same way, intelligent fabric interfaces can be programmed to accept various forms of input and provide interaction measurements, all using the same electronics and customized through software.
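The "same electronics, customized through software" idea can be sketched as a per-product mapping from sensed gestures to actions. The profiles and action names below are invented for illustration; nothing here is a real product configuration:

```python
# Hypothetical per-product profiles: the same fabric sensor hardware
# ships with a different software mapping of gestures to actions.
EARBUD_PROFILE = {"tap": "play_pause", "squeeze": "next_track"}
SLEEVE_PROFILE = {"tap": "dismiss_notification", "swipe": "adjust_volume"}

def handle_event(gesture: str, profile: dict) -> str:
    """Translate a sensed gesture into this product's action, or ignore it."""
    return profile.get(gesture, "ignored")

print(handle_event("tap", EARBUD_PROFILE))    # play_pause
print(handle_event("swipe", EARBUD_PROFILE))  # ignored
print(handle_event("swipe", SLEEVE_PROFILE))  # adjust_volume
```

Swapping the profile changes the interface's behavior without touching the sensor itself, which is the flexibility the paragraph above describes.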
So, as we demand that the Human-Machine Interface be more and more seamlessly integrated into our lives and expect these interfaces to provide more input capabilities and intelligence, new forms of sensing will continue to emerge. These interfaces will be worn and carried, but they can also be built into any shaped device with diverse use cases (like combining inputs and capturing measurements for intelligence) that were not possible until the development of smart fabric interfaces. By bringing the capabilities of smart, programmable surfaces into more human-friendly forms and unlocking new and valuable insights, intelligent fabric promises the next evolution of interaction.
To learn more about how fabric sensors are integrated and to see more examples, check out BeBop's next-generation human-computer interactions.