May 29, 2009

Treasure Islands

Last week, I took part in the SenseStage workshop at the Hexagram BlackBox in Montreal (http://sensestage.hexagram.ca/workshop/introduction/). The workshop was designed to bring people from different disciplines (dance, theatre, sound, video, light) together to collaborate with interactive technologies.

During the workshop, there were tons of sensors – light, floor pressure, accelerometers, humidity etc. – all connected to little microcontrollers, which in turn were wirelessly connected to a central computer that gathered all the data and sent it on as OSC to any client connected to the network.

Basically, we had 5 days to complete an interactive performance sequence using the data gathered by the sensor nodes. This is what our group came up with.

We call it Treasure Islands and it’s a somewhat twisted interactive performance/game where a girl finds herself in a weird world, floating on a donut in the middle of the ocean with a mermaid talking in her head. She has to travel to all the different islands around her and collect sounds from them in order to open a portal into this strange dream world for all her friends. Sounds like a good concept, doesn’t it? Check out the video and you’ll see that it actually makes sense.

There was a lot of sensor data available, but we ended up using just the pressure sensors on the floor and camera tracking. With a bit more time we could have made the virtual world more responsive to the real one, but I’m pretty happy with the results we were able to achieve in such a short time. Our group worked really well together, which is not always the case in collaborative projects like this.

Credits:

Sarah Albu – narrative, graphics, performance
Matt Waddell – sound, programming
Me – animation, programming

And I guess I need to include some more technical details for all the people who check my site for that kind of stuff (I know you’re out there).

We used camera tracking with tbeta to follow Sarah and used that data to move the donut and make the environment respond to her movements. All of the real-time animation was done in Animata, which really is a perfect tool for something like this, because it lets me animate things really fast without compromising on quality. Max was used as the middle man to convert the TUIO messages and the OSC from the sensor network into the kind of messages Animata needs to hear.
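Max patches don’t really paste into a blog post, but the middle-man logic is tiny. Here’s a rough sketch of the same idea in C++ using the oscpack library: take a tracked position and re-send it as a joint message to Animata. The /joint address, the joint name and the port are placeholders, so adjust them to whatever your Animata scene actually listens to.

// Sketch of the "middle man": forward a tracked position to Animata as an OSC message.
// Uses the oscpack library; the address, joint name and port are placeholders.
#include "osc/OscOutboundPacketStream.h"
#include "ip/UdpSocket.h"

void sendJoint(UdpTransmitSocket& animata, const char* joint, float x, float y)
{
    char buffer[256];
    osc::OutboundPacketStream p(buffer, sizeof(buffer));
    p << osc::BeginMessage("/joint") << joint << x << y << osc::EndMessage;
    animata.Send(p.Data(), p.Size());
}

int main()
{
    UdpTransmitSocket animata(IpEndpointName("127.0.0.1", 7110));
    // In the real patch these values come from tbeta's TUIO messages.
    float trackedX = 0.5f, trackedY = 0.5f;
    sendJoint(animata, "girl_root", trackedX * 1024.0f, trackedY * 768.0f);
    return 0;
}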

We sewed some IR LEDs on the hat to help with tracking in a dark space.

Each island is an instrument that you can play. Stepping on a certain area triggers loops, adds effects to your voice and so on. Matt could explain the sound part better than I can, but the video should make it pretty clear. It doesn’t reproduce the effect of the quadraphonic sound system we used, though. Some visual cues were also triggered in the animation based on her movements on the sensors.

That’s pretty much it. If you have any questions, leave a comment and I’ll try to get back to you as soon as possible.

April 3, 2009

Mixmaster 1200

The second installment of my Mixed Up series has now seen the light of day. Let me introduce you to the Mixmaster 1200.


The Mixmaster 1200 is a wireless scratching device for the turntablist who prefers to deliver his/her scratches like a 5-star chef. As you can see, the Mixmaster does not have any beaters attached to it. That’s because it has small laser-powered plasma emitter beaters that heat up the airwaves around the device itself, producing its unique-sounding aural explosions.

More information: http://originalhamsters.com/motion/mixedup.php

March 18, 2009

Beat Blender Preview

Have you ever wondered what a banana mixed with a strawberry sounds like? Or how about kiwi-watermelon puree? Watch this video and you will find out.

I found this old blender at a flea market and noticed that the names of the different blending modes are very similar to the terminology used in DJing. So I decided to turn this kitchen appliance into a DJ mixer.

The audio tracks are triggered by inserting different fruits into the blender. The buttons on the front panel control the mixing modes and you also have two different types of transformer switches for cutting the sound in and out.

The options are:

  • Stir
  • Puree
  • Whip
  • Grate
  • Mix
  • Chop
  • Grind
  • Blend
  • Liquefy
  • Frappé

How does it work? (There’s a rough sketch of the Arduino side after the list.)

  • Arduino for brains
  • RFID reader
  • Different kinds of fruits made out of felt
  • RFID tags inside the fruits
  • Max/MSP for converting the serial data to MIDI
  • Ableton Live for playback
  • Mad skills to pay the bills
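Here’s roughly what the Arduino side could look like. This is just a sketch, not the exact code in the blender: it assumes an ID-12-style RFID reader that sends ASCII tag IDs over serial at 9600 baud, plus two transformer switches on digital pins. Max/MSP then parses these lines and turns them into MIDI for Live.

#include <SoftwareSerial.h>

// Placeholder wiring: RFID reader output into pin 2, transformer switches on pins 4 and 5.
SoftwareSerial rfid(2, 3);          // RX, TX (TX unused)
const int switchPins[2] = {4, 5};
int lastState[2] = {-1, -1};

void setup() {
  Serial.begin(9600);               // to the computer, where Max/MSP listens
  rfid.begin(9600);                 // from the RFID reader
  for (int i = 0; i < 2; i++) pinMode(switchPins[i], INPUT_PULLUP);
}

void loop() {
  // ID-12-style frames start with 0x02 and end with 0x03; collect the tag ID in between.
  static String tag = "";
  while (rfid.available()) {
    char c = rfid.read();
    if (c == 0x02) tag = "";
    else if (c == 0x03) { Serial.print("TAG "); Serial.println(tag); }
    else if (isAlphaNumeric(c)) tag += c;
  }

  // Report the transformer switches only when they change state.
  for (int i = 0; i < 2; i++) {
    int state = (digitalRead(switchPins[i]) == LOW) ? 1 : 0;
    if (state != lastState[i]) {
      Serial.print("SW "); Serial.print(i); Serial.print(" "); Serial.println(state);
      lastState[i] = state;
    }
  }
  delay(10);
}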

Stay tuned for more information, pictures and better-quality videos. This is just a preview; I’m hoping to improve it over the following weeks.

I also need to come up with a better name for this thing. Please leave your suggestions in the comments.

Beat Blender documentation page.

Vimeo link for the video.

Youtube link for the video.

Filed under: Electronics,Music,Programming,Video — Posted by: Månsteri @ 8:51

November 16, 2008

Arduino & Quartz Composer

Here is a quick solution for getting the analog inputs from an Arduino into Quartz Composer. For now, it only supports the analog values. Reading the digital input pins wouldn’t be hard to implement, but I still haven’t decided on the best way to do it.

Sending serial data from QC to Arduino is a little bit trickier, but I will definitely try to work on that also.
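For reference, the Arduino end of this can be as simple as streaming the analog readings over serial and parsing them on the computer. A minimal sketch along these lines (the comma-separated format and the baud rate are assumptions – use whatever format the QC side expects):

// Stream the six analog inputs as one comma-separated line per update.
// Assumed format: "a0,a1,a2,a3,a4,a5\n" at 115200 baud.
void setup() {
  Serial.begin(115200);
}

void loop() {
  for (int pin = 0; pin < 6; pin++) {
    Serial.print(analogRead(pin));      // 0-1023
    Serial.print(pin < 5 ? "," : "\n");
  }
  delay(33);                            // roughly 30 updates per second
}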

What do you need for this to work?

Filed under: Electronics,Programming,Tutorial — Posted by: Månsteri @ 23:08

November 10, 2008

Mickey Mann

I’m really interested in stereoscopy, which you might have guessed if you’ve ever seen me running around with my View-Master camera. In my opinion, View-Master is still a superior method for viewing stereoscopic images, but it only works with still images. That’s why I wanted to see if I could improve on the design and make an interactive View-Master for animations.

This little hybrid of Mickey Mouse and Steve Mann lets you control and view stereoscopic animations rendered in real time.

It’s an old View-Master viewer modified with ChromaDepth lenses, some custom buttons, an accelerometer, a Bluetooth radio and an Arduino to control it all. I thought about hiding the electronics with bigger ears, but decided not to, because I like the ghetto-cyborg look he’s got going on.

So how does it work? You look through the viewer at the screen, where you will see some 3-layer Månsteri-action in all of its stereoscopic glory. The great thing about ChromaDepth stereoscopy is that it works with plain colors: you don’t need two video channels to achieve a 3D effect. On a dark background, everything that is blue appears to be in the background and everything that is red appears to be in the foreground. Colors in the spectrum between blue and red appear to be somewhere in the middle. If you didn’t understand my explanation, look it up on the interwebs.
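If you want to play with the idea in code, the mapping is basically just sliding a hue from red (near) to blue (far). A toy example of the principle – this is not how Animata colors the layers, just an illustration:

#include <cstdio>

// Map a depth value (0.0 = foreground, 1.0 = background) to an RGB color
// by sliding the hue from red through green to blue.
void chromaDepthColor(float depth, float& r, float& g, float& b) {
    float hue = depth * 240.0f;                 // 0 = red, 120 = green, 240 = blue
    if (hue < 120.0f) {
        r = 1.0f - hue / 120.0f;  g = hue / 120.0f;  b = 0.0f;
    } else {
        r = 0.0f;  g = 1.0f - (hue - 120.0f) / 120.0f;  b = (hue - 120.0f) / 120.0f;
    }
}

int main() {
    float r, g, b;
    chromaDepthColor(0.0f, r, g, b);            // foreground -> pure red
    std::printf("near: %.1f %.1f %.1f\n", r, g, b);
    chromaDepthColor(1.0f, r, g, b);            // background -> pure blue
    std::printf("far:  %.1f %.1f %.1f\n", r, g, b);
    return 0;
}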

The accelerometer detects your motion and will move the character on the middle layer, giving the illusion that the character is trying to mimic your movement. You can control the content of the layers with the three buttons on the side of the viewer. Button three controls the background, button two the middle layer and button one controls the foreground. Check out the video and you’ll understand what I mean. If you have ChromaDepth glasses, put them on to see the 3d effect.

The Arduino sends the sensor data and the button states wirelessly via Bluetooth to my computer. The information is parsed in Max/MSP, which in turn sends the data as OSC packets to Animata (my favourite software at the moment). Animata then animates everything in real time and handles the hiding/revealing of the different layers.

If you are interested, I have uploaded the Arduino and Max 5 source code and the Animata scene. It’s all very specific to my setup, but someone might find it useful. Download the source.

Filed under: Electronics,Programming,Video — Posted by: Månsteri @ 3:31

May 24, 2007

Phidgets Patches for Quartz Composer from fdiv.net

I wrote a tutorial a while ago for building a Cocoa app that lets you control Quartz Composer with Phidgets sensors. Now there’s a LOT easier way to achieve this: just use the new QC patches from fdiv.net.

Filed under: Stuff That I Like — Posted by: Månsteri @ 19:23

February 1, 2007

Phidgets + Quartz Composer

We’re going to use Phidgets for a university project pretty soon, so I tried to come up with a way to control Quartz Composer with the Phidget sensors. Hiroaki has made custom Accelerometer and RFID patches that you can find here, but I want to use the Interface Kit 8/8/8 and be able to use all the different sensors.

I’m not smart enough to program my own patches for QC, but I was able to build a Cocoa application that sends the data from the sensors to the QC composition. I got a little help from the Phidget Interface Kit sample code that you can download here and from the instructions found on the Phidgets forum. The code in the forum post didn’t work for me, so here’s a small tutorial that shows you how to build a Cocoa app that controls your Quartz Composer composition with Phidgets sensors.

What do you need?

- A computer running Mac OSX 10.4 with the developer tools installed
- Phidgets Interface Kit 8/8/8 + some sensors
- Download and install the Mac OSX Framework 21 from the Phidgets website. This also includes the source code that you need for this tutorial.


OK. Let’s start

1)
Open the CocoaIFKit Xcode project from the /Examples/Source Code/CocoaIFKit folder.

2)
First we need to modify the project so that we can import and use a Quartz Composer composition. These next steps are explained in more detail in Apple’s Quartz Composer programming guide, and I recommend that you try those tutorials first, but I’ll go through the important stuff quickly.

- First, choose Project > Add to Project.
- Navigate to System/Library/Frameworks/Quartz.framework and add it to the project.
- Drag the Quartz.framework to your Frameworks folder in the Groups & Files list.
- Double-click the MainMenu.nib file and Interface Builder will launch.
- If you haven’t done this before, you have to install the Quartz Composer palette in Interface Builder. Choose Tools > Palettes > Palette Preferences and click Add.
- Navigate to /Developer/Extras/Palettes/ and add QuartzComposer.palette.
- Now your palette window should show the Quartz Composer palette. The blue thing is QCView and the green thing is QCPatchController.

- Drag the QCPatchController from the Palette window to the Instances tab in the MainMenu.nib window
- Go to the Cocoa Windows tab in the Palette window and drag a window to the Instances tab also. A new window appears.
- Your MainMenu.nib window should look something like this

- Select the new window and go to Tools > Show Inspector. In the Attributes section, make sure that “Visible at launch time” is checked for the new window.
- Now drag a QCView from the palette to the new window and resize the QCView the way you like it.
- Select the QCView and then open the Bindings pane of the Inspector window. Click the small triangle where it says patch.
- Choose QCPatchController from the “Bind to” pop-up menu. Enter patch in the Controller Key text field.

3)
OK. Now we need a QC composition to use with our app. I won’t go into any detail about using Quartz Composer; there’s plenty of info available and this tutorial is already getting long.

What you need to do is create any kind of composition that has a published input named sensorZero. I just used an Image With String patch connected to a Sprite and published the String input as sensorZero. Save the .qtz file to the CocoaIFKit folder. I used the name phidgets.qtz.

Back in Xcode, drag phidgets.qtz to the Resources folder in the Groups & Files list. In Interface Builder, select the QCPatchController in the MainMenu.nib window, open the Attributes pane of the Inspector and click “Load from Composition File…”, then navigate to your phidgets.qtz file. Choose File > Test Interface. You should see your composition in the window. The sensors won’t work in the testing mode.

4)
Now the application shows our composition, but we want to change the published value with the sensors. Read on and I’ll tell you how to do just that.

- Select the blue PhidgetInterfaceKitController from the instances tab. Then select the classes tab.
- Choose Classes > Add Outlet to PhidgetInterfaceKitController.
- Name the new outlet theController
- In the Instances tab, ctrl-click and drag from the PhidgetInterfaceKitController to the QCPatchController

- Choose Connect to connect the outlet.
- Save and go to Xcode again.

5)
Now we need to change the code so that the data goes from the sensors to the QC composition.

- Double-click the PhidgetInterfaceKitController.h file.
- Add this to the list of outlets:

IBOutlet id theController;

- Close the window and double-click the PhidgetInterfaceKitController.m file.
- Find the SensorChange section and add this code there:

// sensors to Quartz Composer
if (Index == 0) {
    [theController setValue:[NSNumber numberWithInt:(int)Value] forKeyPath:@"patch.sensorZero.value"];
} else if (Index == 1) {
    [theController setValue:[NSNumber numberWithInt:(int)Value] forKeyPath:@"patch.sensorOne.value"];
} else if (Index == 2) {
    [theController setValue:[NSNumber numberWithInt:(int)Value] forKeyPath:@"patch.sensorTwo.value"];
} else if (Index == 3) {
    [theController setValue:[NSNumber numberWithInt:(int)Value] forKeyPath:@"patch.sensorThree.value"];
} else if (Index == 4) {
    [theController setValue:[NSNumber numberWithInt:(int)Value] forKeyPath:@"patch.sensorFour.value"];
} else if (Index == 5) {
    [theController setValue:[NSNumber numberWithInt:(int)Value] forKeyPath:@"patch.sensorFive.value"];
} else if (Index == 6) {
    [theController setValue:[NSNumber numberWithInt:(int)Value] forKeyPath:@"patch.sensorSix.value"];
} else if (Index == 7) {
    [theController setValue:[NSNumber numberWithInt:(int)Value] forKeyPath:@"patch.sensorSeven.value"];
}

- We only published sensorZero from QC, but this code works with all 8 analog inputs on the Interface Kit (Index numbers 0-7), so you can publish more values from QC. Just name them accordingly: sensorZero, sensorOne, sensorTwo…
- Note! I’m a total amateur when it comes to Cocoa, so this code might not be the best way to do this, but it works.

6)
That’s it. Save. Then choose Build and Go to see if it works.

If you want to change the QC composition, you need to reload the composition to the QCPatchController in Interface Builder and rebuild the app.

Pheew! This is the first tutorial I’ve ever written. I hope it’s useful to someone.

Filed under: Electronics,Programming,Tutorial — Posted by: Månsteri @ 9:29