About the Project


Muscle and motor control rehabilitation is a difficult and slow process. Therapy typically involves task-based exercises that target a specific muscle group or motor control function. Giving verbal instructions and working one-on-one with the patient can be effective, but it is difficult to explain fine movements of a muscle group to a patient who has no control over that group yet. And when a patient goes home and tries the therapy on their own, the exercises may be performed incorrectly, done half-heartedly, or forgotten altogether.

Over the summer of 2012, interaction and sound designer Kyle Kramer conducted research at the Human Biomechanics and Control Laboratory at the University of Michigan, under the supervision of Arthur Kuo, on using inertial sensors to provide aural and visual feedback on a user’s motion. By creating musical instruments and visual environments programmed to respond to specific motions, therapy can become independent, informative, and fun for the patient.

Programming was done in Max/MSP, a patch-based visual programming environment. Gyroscope and accelerometer data from wireless inertial measurement units (IMUs), the SHAKE SK7s, was streamed into Max and used to control various sound synthesis models.
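The patches themselves are visual and can’t be quoted as text, but the first step behind most of the demos, deriving orientation from raw accelerometer samples, can be sketched in Python. The axis conventions and units below are assumptions for illustration, not the SHAKE SK7’s documented ones.

    import math

    def tilt_angles(ax, ay, az):
        """Estimate pitch and roll (degrees) from a 3-axis accelerometer.

        Assumes the sensor is nearly static so gravity dominates the
        reading; the axis convention here is hypothetical.
        """
        pitch = math.degrees(math.atan2(-ax, math.sqrt(ay ** 2 + az ** 2)))
        roll = math.degrees(math.atan2(ay, az))
        return pitch, roll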

The following videos demonstrate different instruments and tools that target specific motions, encourage different ranges of motion and speeds, or discourage a specific movement (such as the Harp demo).

Source code will be posted soon. Questions or comments may be sent to contact@kylekramer.com.
Go to the Human Biomechanics and Control Lab website
Go to www.kylekramer.com

SOUND WAVE:
In this first attempt at mapping motion to sound, data from the SHAKE is used as control signals for a basic FM synthesis model. Tilt, or “pitch,” of the wrist controls a saw wave’s frequency in a low octave, spanning the octave in semitones; rolling of the wrist controls the frequency in a higher octave, so an interval of two notes is always playing. Volume and the amplitude of the modulating frequency in the FM model are directly proportional to the average velocity of the roll axis.
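As a rough sketch of that control mapping in Python (the real version lives in a Max patch), something like the following; the base frequencies, angle ranges, and velocity scaling are illustrative guesses, not the patch’s actual values.

    def semitone_quantize(angle_deg, base_hz, lo=-90.0, hi=90.0):
        """Clamp an angle into [lo, hi] and map it onto one octave of semitones."""
        t = min(max((angle_deg - lo) / (hi - lo), 0.0), 1.0)
        return base_hz * 2 ** (round(t * 12) / 12)

    def fm_params(pitch_deg, roll_deg, avg_roll_vel):
        low = semitone_quantize(pitch_deg, base_hz=55.0)    # wrist tilt -> low octave
        high = semitone_quantize(roll_deg, base_hz=220.0)   # wrist roll -> higher octave
        gain = min(abs(avg_roll_vel) / 200.0, 1.0)          # deg/s -> 0..1 volume
        mod_amp = gain * 100.0                              # FM modulator amplitude (Hz)
        return low, high, gain, mod_amp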

In this example a desired movement is set as a goal, with orientation and rolling-velocity thresholds ensuring the user stays within the right boundaries while executing the motion. Repeating the desired motion produces the same sound, and since the recorded gesture can be played back as audio, the user can search by ear for the desired gesture. Additionally, an optional meter graphic was added to show a “score”: how close the user’s current motion is to the desired gesture.
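A threshold-and-score scheme like the one described could be sketched as follows; the pitch window and target velocity are hypothetical numbers chosen only to show the shape of the logic.

    def gesture_score(pitch_deg, roll_vel, pitch_window=(-30.0, 30.0), vel_target=120.0):
        """Return 0..1 closeness to the target wave gesture.

        Orientation must sit inside the pitch window; roll velocity is
        scored by its distance from a target speed (deg/s). All numbers
        are illustrative, not taken from the original patch.
        """
        lo, hi = pitch_window
        if not lo <= pitch_deg <= hi:
            return 0.0
        return max(0.0, 1.0 - abs(abs(roll_vel) - vel_target) / vel_target)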

KEY PUSHER:
Building on the wave-recognition and sound code, keyboard functionality has been added, mapping the waving gesture to taps of the “w” key. The better and more consistent the user’s wave is, the longer the key stays pushed down.
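In Python, one plausible way to emulate this key-holding behavior is with the pynput library (an assumption; the original ran inside Max):

    import time
    from pynput.keyboard import Controller  # assumed dependency: pip install pynput

    keyboard = Controller()

    def push_w(score, max_hold=0.5):
        """Hold 'w' for a time proportional to the wave score (0..1).

        The linear mapping and max_hold are illustrative choices.
        """
        if score <= 0:
            return
        keyboard.press('w')
        time.sleep(score * max_hold)
        keyboard.release('w')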

ARROW KEY CONTROL:
By setting positive and negative degree thresholds on the pitch and roll values, keyboard keys can be triggered by the orientation of the user’s hand: pointing up/down and rolling left/right correspond to the up/down and left/right arrow keys.
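A minimal sketch of that orientation-to-keys logic, again using pynput and a made-up 25° threshold:

    from pynput.keyboard import Controller, Key  # assumed dependency: pip install pynput

    keyboard = Controller()

    def orientation_to_arrows(pitch_deg, roll_deg, thresh=25.0):
        """Tap arrow keys when hand orientation crosses +/- thresh degrees."""
        if pitch_deg > thresh:
            keyboard.tap(Key.up)
        elif pitch_deg < -thresh:
            keyboard.tap(Key.down)
        if roll_deg > thresh:
            keyboard.tap(Key.right)
        elif roll_deg < -thresh:
            keyboard.tap(Key.left)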

AUDIO PLAYER:
In this example, the average velocity of a chosen axis controls the playback speed of a preloaded audio file. The file can play back too slow, or not at all, but never too fast.
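The one-sided clamp at the heart of this demo is simple; in Python it might read as below, where the full-speed calibration constant is a guess:

    def playback_rate(avg_vel, full_speed_vel=150.0):
        """Map average axis velocity (deg/s) to a playback rate in 0..1.

        Rates below 1.0 slow the file down or stop it; the min() clamp
        guarantees it never plays faster than normal speed.
        """
        return min(abs(avg_vel) / full_speed_vel, 1.0)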

HARP:
Using the control parameters of an FM model, this instrument discourages use of the shoulder to assist a turning motion of the wrist. Rolling the wrist by itself plucks across a pentatonic scale with a slight vibrato, while adding motion in the shoulder causes the amplitude of the modulating sine wave to rise greatly, bringing the vibrato to a very exaggerated depth. An easily adjustable tolerance sets the allowed ratio between wrist and shoulder velocity before the vibrato becomes more unstable.
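One way to express that penalty in Python; the tolerance, base depth, and penalty slope are all illustrative stand-ins for the patch’s adjustable settings:

    def vibrato_depth(wrist_vel, shoulder_vel, tolerance=0.2, base_depth=2.0):
        """Exaggerate vibrato when the shoulder 'helps' the wrist.

        tolerance is the allowed shoulder-to-wrist velocity ratio; any
        excess above it inflates the FM modulator amplitude (Hz).
        """
        ratio = abs(shoulder_vel) / max(abs(wrist_vel), 1e-6)
        excess = max(0.0, ratio - tolerance)
        return base_depth * (1.0 + 10.0 * excess)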

JUMP:
Velocity and range of motion are tracked in this example. The user throws their hand forward and down, and the change in pitch from the beginning to the end of the motion, along with its average velocity, is immediately fed into a square-wave “jump” synthesizer, setting the “height” and duration of the jump.
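A toy rendering of such a jump sound in Python with NumPy; the frequency range and duration scaling are invented for illustration:

    import numpy as np

    def jump_sound(pitch_delta_deg, avg_vel, sr=44100):
        """Render a square-wave sweep: pitch change sets the sweep height,
        average velocity sets its duration. Constants are illustrative."""
        dur = max(0.1, min(1.0, 60.0 / max(abs(avg_vel), 1.0)))  # faster -> shorter
        n = int(sr * dur)
        f0 = 200.0
        f1 = f0 + 8.0 * abs(pitch_delta_deg)           # "height" of the jump
        freq = np.linspace(f0, f1, n)                  # rising sweep
        phase = 2 * np.pi * np.cumsum(freq) / sr
        return 0.3 * np.sign(np.sin(phase))            # square-wave samples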

NOISE BAND:
This example uses two IMUs simultaneously to sense extension of the forearm away from the shoulder. In the video, extending the arm narrows a band of noise.
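Filtering noise down to a band whose width tracks arm extension could be sketched with NumPy and SciPy (assumed dependencies); the center frequency and width range are hypothetical:

    import numpy as np
    from scipy.signal import butter, lfilter  # assumed dependency: pip install scipy

    def noise_band(extension, sr=44100, dur=0.1, center=1000.0):
        """Band-limit white noise; the band narrows as extension (0..1) grows."""
        width = max(50.0, (1.0 - extension) * 1800.0)   # Hz
        lo = max(center - width / 2, 50.0)
        hi = center + width / 2
        b, a = butter(2, [lo / (sr / 2), hi / (sr / 2)], btype='band')
        noise = np.random.uniform(-1.0, 1.0, int(sr * dur))
        return lfilter(b, a, noise)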

MIDI CONTROLLER:
In a Max patch, the accelerometer and gyro data streamed from the SHAKE (a Nintendo Wiimote would work just as well) is converted into MIDI control change messages and sent out on the “Max/msp 1” MIDI port. The controllers were then paired with effect parameters in Ableton Live.
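Outside of Max, the same sensor-to-CC conversion could be done in Python with the mido library (an assumption; virtual ports also need the python-rtmidi backend):

    import mido  # assumed dependencies: pip install mido python-rtmidi

    # A virtual output port playing the role of "Max/msp 1" in the post.
    port = mido.open_output('IMU Controller', virtual=True)

    def send_cc(value, cc=1, channel=0):
        """Send a sensor reading already scaled to 0..1 as a MIDI CC message."""
        value = min(max(value, 0.0), 1.0)
        port.send(mido.Message('control_change', channel=channel,
                               control=cc, value=int(value * 127)))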

In this example, audio from Spotify streams through Live to the main output via internal audio routing with Soundflower (available for free at http://www.cycling74.com; set the computer’s audio output and Live’s audio input to Soundflower (2ch), and set Live’s audio output to the main output), so any audio playing from the computer is first processed by Live. The two effects, “Delay” and “Grain Delay,” are toggled by a sudden outward movement of the shoulder and a downward movement of the wrist, respectively. Pitch, or up/down movement of the wrist, controls the frequency/pitch of the audio stream, while roll of the wrist controls the delay time. Up/down movement of the shoulder controls a rapid delay effect: the higher the shoulder is raised, the more clustered and atmospheric the sound becomes.
