A Path To A Sustainable Haptics Design Process

The haptics design problem

Interhaptics
6 min read · Mar 9, 2022

The haptics design problem is a well-known question among academics and creatives alike. As wideband, expressive haptic actuators make their way into products, the creative and research communities are asking how to leverage this communication channel to improve the user experience across many use cases.

One of the big issues with concentrating only on haptics design is disregarding the enormous impact of the technology stack on the outcome of the design process. As Chris Ullrich (CTO of Immersion) put it in episode 17 of the Haptics Club podcast: “Haptics is a system-level problem that requires careful consideration of all the technology stack to achieve a successful implementation.” For haptics design, this means that results identified in a specific test or experiment can hardly be exported to the whole ecosystem.

One solution could be to simply test these results on several different ecosystems and haptic actuators to derive general design guidelines. However, as discussed in the Open Haptics Ecosystem article, today's siloed haptics ecosystems make this a tremendous overhead. In practice, it is not sustainable.

Fragmented haptics ecosystem

How, then, do we exit this impasse?

The haptics language

As discussed in The MPEG Haptics Standards article, the first step is to identify a common language spoken across different ecosystems. This is what is happening with the MPEG standardization effort and similar initiatives in the IEEE community.

This is the foundation for approaching the haptics design problem. Without a common language, different designers can’t efficiently communicate the results of their design efforts. At Interhaptics, we were lucky enough to be among the main contributors to the upcoming MPEG haptics encoding standard, which is well reflected in the philosophy of our haptics full stack.

What does that mean?

Fundamentally, two things:

  • We have been iterating on design processes, tools, and methods based on an MPEG-like common language for a few years with our users.
  • We are aware of the deployment problem across different haptics ecosystems. We have been developing the Interhaptics engine for 5 years now, and today it powers a wide range of haptic devices on different platforms.

Let’s focus here on the haptics design process and tools; we will discuss the deployment pipeline in another post.

Haptics design today

Ab initio, the haptics design process is usually approached in one of two ways:

  • Waveform design based on audio tool pipelines
  • Abstract amplitude / sharpness designers

Each approach has its pros and cons. Direct waveform design has the advantage of addressing wideband haptics by manipulating periodic effects. It requires a certain level of expertise and is well suited to audio designers converting to haptics design.

Ableton Live audio creation tool
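To make the contrast concrete, here is a minimal Python sketch of the waveform-first approach: the designer works on the drive signal itself, sample by sample, much like in an audio pipeline. Every name, rate, and parameter below is an illustrative assumption, not any product's API.

```python
# Direct waveform design: assemble the drive signal sample by sample.
# Illustrative sketch only; sample rate and parameters are assumptions.
import numpy as np

SAMPLE_RATE = 8000  # Hz; wideband actuators render well beyond simple buzzes

def decaying_sine(freq_hz: float, duration_s: float, decay: float) -> np.ndarray:
    """A periodic effect: a sine carrier shaped by an exponential decay."""
    t = np.linspace(0.0, duration_s, int(SAMPLE_RATE * duration_s), endpoint=False)
    return np.exp(-decay * t) * np.sin(2 * np.pi * freq_hz * t)

# A "click then rumble" effect built directly in the waveform domain.
click = decaying_sine(freq_hz=250.0, duration_s=0.03, decay=80.0)
rumble = 0.4 * decaying_sine(freq_hz=60.0, duration_s=0.25, decay=8.0)
effect = np.concatenate([click, rumble])  # the final drive signal
```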

The abstract approach focuses on high-level manipulation of abstract parameters, called amplitude and sharpness, that drive a synthesizer to create the final waveform. It is accessible to entry-level designers, but it is hard to refine the experience on a specific system due to the absence of direct control over the final deployed signal.

Captain AHAP composition tool
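For comparison, here is an equally minimal sketch of the abstract approach: the designer only sets amplitude and sharpness, and a synthesizer turns the pair into a waveform. The sharpness-to-carrier-frequency mapping below is our assumption for illustration; each engine defines its own, which is precisely why the designer has no direct control over the deployed signal.

```python
# Abstract design: the designer manipulates (amplitude, sharpness) only;
# the synthesizer owns the waveform. The mapping is a hypothetical example.
import numpy as np

SAMPLE_RATE = 8000  # Hz

def synthesize(amplitude: float, sharpness: float, duration_s: float) -> np.ndarray:
    """Render (amplitude, sharpness), both in [0, 1], into a drive waveform."""
    freq_hz = 80.0 + sharpness * 220.0  # assumed mapping: 80 Hz (dull) to 300 Hz (sharp)
    t = np.linspace(0.0, duration_s, int(SAMPLE_RATE * duration_s), endpoint=False)
    return amplitude * np.sin(2 * np.pi * freq_hz * t)

soft_thud = synthesize(amplitude=0.9, sharpness=0.1, duration_s=0.08)
crisp_tick = synthesize(amplitude=0.6, sharpness=0.9, duration_s=0.02)
```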

After shipping the first version of our Haptics Composer in 2019, we focused on user feedback to understand how to reconcile the two approaches in a designer-friendly package without losing complete customization of the signal for pro users.

Haptics Composer 2019

Solution: The haptics note

After quite a lot of user experience research, testing, and frustrating use of our first product, we identified a musical concept around which we could shape our entire design language: the Haptics Note.

// math geek on 🔍

A haptics note is a unique haptics element defined by an arbitrary modulated function and an arbitrary modulating function, of arbitrary length, defined in an arbitrary space.

// math geek off

The future MPEG haptics standard is based on the Haptics Note concept. We align or stack haptics notes to create complex haptics melodies and harmonies that generate the final experience.
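To ground the definition, here is a minimal Python sketch of one haptics note as a modulating function (envelope) applied to a modulated function (carrier) over a given length. The HapticNote class and its field names are ours, purely for illustration; they do not mirror the MPEG schema.

```python
# One haptics note = modulating function x modulated function, of a given length.
# Class and field names are illustrative, not the standard's data model.
from dataclasses import dataclass
from typing import Callable
import numpy as np

SAMPLE_RATE = 8000  # Hz

@dataclass
class HapticNote:
    carrier: Callable[[np.ndarray], np.ndarray]   # the modulated function
    envelope: Callable[[np.ndarray], np.ndarray]  # the modulating function
    duration_s: float                             # the arbitrary length

    def render(self) -> np.ndarray:
        t = np.linspace(0.0, self.duration_s,
                        int(SAMPLE_RATE * self.duration_s), endpoint=False)
        return self.envelope(t) * self.carrier(t)

# One note: a 180 Hz carrier shaped by a linear fade-out.
note = HapticNote(
    carrier=lambda t: np.sin(2 * np.pi * 180.0 * t),
    envelope=lambda t: 1.0 - t / t[-1],
    duration_s=0.1,
)
signal = note.render()
```

Aligning notes in time gives the melody; summing overlapping notes gives the harmony.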

Why is this exciting?

We are falling into a well-known design paradigm used in MIDI music sequencers, video editing software, and almost any other audio- or video-focused editing tool: a timeline with unique elements generating the final experience.

Adobe Premiere Pro timeline

Around this concept, we developed our Haptics Composer 2, which is entering beta now.

Haptics Composer 2

What is cool is that it opens a world of opportunities for haptics.

Every haptics note is fully customizable: pro users can define the behavior of the final effect down to the millisecond, while every note also exposes a super simple high-level customization method.

Preset: Footsteps

You move them around on the timeline, glue them together, stack them, and rapidly test the results with the mobile app, or directly in Unity on supported and custom devices through our SDK.

By the way, the design cycle takes a few seconds for any device.
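Here is a minimal sketch of that timeline paradigm, under the same illustrative assumptions as the sketches above: notes are placed at start times, overlapping notes stack by summation, and the mix is clipped to the actuator's drive range. This is our simplification, not the Haptics Composer internals.

```python
# Timeline assembly: place notes at start times, sum overlaps, clip the mix.
# Illustrative sketch only.
import numpy as np

SAMPLE_RATE = 8000  # Hz

def simple_note(freq_hz: float, duration_s: float) -> np.ndarray:
    """A stand-in note: a decaying sine, as in the sketches above."""
    t = np.linspace(0.0, duration_s, int(SAMPLE_RATE * duration_s), endpoint=False)
    return np.exp(-30.0 * t) * np.sin(2 * np.pi * freq_hz * t)

def render_timeline(placed_notes, total_s: float) -> np.ndarray:
    """placed_notes: list of (start_time_s, rendered_note_samples)."""
    mix = np.zeros(int(SAMPLE_RATE * total_s))
    for start_s, samples in placed_notes:
        i = int(SAMPLE_RATE * start_s)
        mix[i:i + len(samples)] += samples  # stacked notes sum, like chords
    return np.clip(mix, -1.0, 1.0)          # stay within the drive range

# Two footstep-like notes aligned in sequence, as a drag-and-drop user might do.
step = simple_note(freq_hz=110.0, duration_s=0.12)
pattern = render_timeline([(0.0, step), (0.45, step)], total_s=0.6)
```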

Solving the blank haptics canvas problem

This is all cool and fun, but nothing revolutionary. Here comes the part we realized while iterating on the UX, and that user interviews confirmed. What if we could provide a wide range of premade haptics notes that designers can simply drag and drop to prototype and experience? What about tweaking them for the final experience?

No scary blank canvas to start from, but a wide range of pre-designed haptics notes to assemble, making it simple to understand the impact of the various parameters on the user. We call them note presets. From a feature we merely thought useful to test on our users, this is quickly becoming the strength of our product. We sank quite a lot of hours into preparing 60+ haptics note presets, ready to be assembled into complex patterns in the Haptics Composer 2 at launch.

Check this out:

What is cool is that we plan to allow users to customize their haptics presets, so that each designer or project can create its own “haptics language.”

Audio-Haptics problem

The Haptics Note preset concept is also helping us identify how to manage the complex audio-haptics relationship in the future. Converting audio to haptics directly through procedural methods generates a somewhat OK entry-level user experience. The reason is simple: audio does not carry haptics design intention. Good haptics requires haptics semantics and a design language. We are iterating internally on what a semantic audio-haptics design process could be.
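For intuition, here is a minimal sketch of such a procedural conversion: a crude envelope follower drives a low-frequency carrier. It transfers the audio's energy but nothing of the designer's intention, which is why this baseline stays at "somewhat OK". Function names and parameters are ours, for illustration.

```python
# Naive procedural audio-to-haptics: follow the audio envelope, drive a carrier.
# Carries energy, not intent; illustrative sketch only.
import numpy as np

SAMPLE_RATE = 8000  # Hz; assume the audio is already resampled to this rate

def audio_to_haptics(audio: np.ndarray, carrier_hz: float = 90.0) -> np.ndarray:
    # Crude envelope follower: rectify, then smooth with a 10 ms moving average.
    window = int(0.01 * SAMPLE_RATE)
    envelope = np.convolve(np.abs(audio), np.ones(window) / window, mode="same")
    t = np.arange(len(audio)) / SAMPLE_RATE
    return envelope * np.sin(2 * np.pi * carrier_hz * t)
```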

More to come soon!

Want to contribute? Check out the beta! We are looking for your feedback.

About Interhaptics

Interhaptics is a software company specializing in haptics. Interhaptics provides hand interaction and haptic feedback development and deployment tools for metaverse, mobile, and console applications. Interhaptics’ mission is to enable the growth of a scalable haptics ecosystem through haptics standardization, cross-technology support, and cross-platform deployment.
