May 29, 2009

Treasure Islands

Last week, I took part in the SenseStage workshop at the Hexagram BlackBox in Montreal (http://sensestage.hexagram.ca/workshop/introduction/). The workshop was designed to bring together people from different disciplines (dance, theatre, sound, video, light) to collaborate with interactive technologies.

During the workshop, there were tons of sensors – light, floor pressure, accelerometers, humidity etc. – all connected to little microcontrollers, which in turn were wirelessly connected to a central computer that gathered all the data and sent it forward as OSC to any client connected to the network.

Basically, we had 5 days to complete an interactive performance sequence using the data gathered by the sensor nodes. This is what our group came up with.

We call it Treasure Islands and it’s a slightly twisted interactive performance/game where a girl finds herself in a weird world, floating on a doughnut in the middle of the ocean with a mermaid talking in her head. She has to travel to all of the different islands around her and collect sounds from them in order to open a portal into this strange dream world for all her friends. Sounds like a good concept, doesn’t it? Check out the video and you’ll see that it actually makes sense.

There was a lot of sensor data available, but we ended up using just the pressure sensors on the floor and camera tracking. With a bit more time we could have made the virtual world more responsive to the real one, but I’m pretty happy with the results we were able to achieve in such a short time. Our group worked really well together, which is not always the case in collaborative projects like this.

Credits:

Sarah Albu – narrative, graphics, performance
Matt Waddell – sound, programming
Me – animation, programming

And I guess I need to include some more technical details for all the people who check my site for that kind of stuff (I know you’re out there).

We used camera tracking with tbeta to track Sarah and used that data to move the doughnut and to make the environment respond to her movements. All of the real-time animation was done in Animata, which really is a perfect tool for something like this, because it lets me animate things really fast without compromising on quality. Max was used as the middleman to convert the TUIO messages and the OSC from the sensor network into the kind of messages Animata needs to hear.

We sewed some IR LEDs on the hat to help with tracking in a dark space.

Each island is an instrument that you can play. Stepping on a certain area triggers loops, adds effects to your voice and so on. Matt could explain the sound part better than me, but the video should make it pretty clear. It doesn’t reproduce the effect of the quadraphonic sound system we used, though. Some visual cues were also triggered in the animation based on her movements on the sensors.

That’s pretty much it. If you have any questions, leave a comment and I’ll try to get back to you as soon as possible.

April 3, 2009

Mixmaster 1200

The second installment of my Mixed Up series has now seen the light of day. Let me introduce you to the Mixmaster 1200.


The Mixmaster 1200 is a wireless scratching device for the turntablist who prefers to deliver his/her scratches like a five-star chef. As you can see, the Mixmaster does not have any beaters attached to it. This is because it has small laser-powered plasma emitter beaters that actually heat up the airwaves around the device itself, producing its unique-sounding aural explosions.

More information: http://originalhamsters.com/motion/mixedup.php

March 18, 2009

Beat Blender Preview

Have you ever wondered what a banana mixed with a strawberry sounds like? Or how about kiwi-watermelon puree? Watch this video and you will find out.

I found this old blender at a flea market and noticed that the names of the different blending modes are very similar to the terminology used in DJing. So I decided to turn this kitchen appliance into a DJ mixer.

The audio tracks are triggered by inserting different fruits into the blender. The buttons on the front panel control the mixing modes and you also have two different types of transformer switches for cutting the sound in and out.

The options are:

  • Stir
  • Puree
  • Whip
  • Grate
  • Mix
  • Chop
  • Grind
  • Blend
  • Liquefy
  • Frappé

How does it work?

  • Arduino for brains
  • RFID reader
  • Different kinds of fruits made out of felt
  • RFID tags inside the fruits
  • Max/MSP for converting the serial data to MIDI
  • Ableton Live for playback
  • Mad skills to pay the bills
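
The Arduino side of that list is pretty simple. Here’s a rough sketch of the idea (not my actual code; it assumes an ID-12-style RFID reader that spits out ASCII tag IDs over serial at 9600 baud, and the pin numbers are just placeholders):

#include <SoftwareSerial.h>

// RFID reader TX on pin 2 (we never send anything back to the reader)
SoftwareSerial rfid(2, 3);

// front panel buttons for the mixing modes and transformer switches (placeholder pins)
const int buttonPins[] = {4, 5, 6, 7};
int lastState[] = {HIGH, HIGH, HIGH, HIGH};

void setup() {
  Serial.begin(9600);   // serial link to the computer, read by Max/MSP
  rfid.begin(9600);     // serial link from the RFID reader
  for (int i = 0; i < 4; i++) {
    pinMode(buttonPins[i], INPUT_PULLUP);
  }
}

void loop() {
  // forward the raw tag bytes; Max matches the ID to a fruit and triggers the right track
  while (rfid.available()) {
    Serial.write((uint8_t)rfid.read());
  }

  // report button changes so Max can switch mixing modes and cut the sound in and out
  for (int i = 0; i < 4; i++) {
    int state = digitalRead(buttonPins[i]);
    if (state != lastState[i]) {
      lastState[i] = state;
      Serial.print("button ");
      Serial.print(i);
      Serial.println(state == LOW ? " down" : " up");
    }
  }
}

Max just watches the serial port, maps the tag IDs and button messages to MIDI, and Live takes care of the playback.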

Stay tuned for more information, pictures and better quality videos. This is just a preview; I’m hoping to improve it over the following weeks.

I also need to come up with a better name for this thing. Please leave your suggestions in the comments.

Beat Blender documentation page.

Vimeo link for the video.

Youtube link for the video.

Filed under: Electronics,Music,Programming,Video — Posted by: Månsteri @ 8:51

December 13, 2008

Tooniversity

Not many people know this, but Concordia University in Montréal also has a toon department deep inside the maze known as the EV building. The university officials would prefer to keep this a secret, since the brutal self-torture that goes on inside the faculty would shock many people. In the same way that the average Joe or Jane does not want to know where the meat in their burger comes from, no one really wants to know the shocking truth about the stories behind their Saturday morning dose of laughter.


Tooniversity from Matti Niinimäki on Vimeo.

When watching cartoons, people rarely think about the amount of time and dedication the cartoon characters spend on perfecting their sketches and routines. Unfortunately, consumers love to see toons getting hurt. There is just something special about dropping heavy anvils on the heads of unsuspecting cartoon characters that appeals to the majority of viewers.

Like in all fields of entertainment, the competition in the cartoon business is also very harsh. You are only as good as your last fall from a huge cliff. That’s why all the aspiring cartoon students at tooniversities across the world practice new and inventive ways of getting themselves hurt.

A group of activists from PETT (People for the Ethical Treatment of Toons) have been able to sneak a spy camera inside the Tooniversity facilities at Concordia University. Because of their brave action, all the dirty secrets inside the Tooniversity will be exposed. Please go to http://tooniversity.originalhamsters.com to find more information and sign a petition to stop this madness.

This project was made at Concordia University for Vincent Leclerc’s Tangible Media and Physical Computing class. You can find technical details on my class website.

Filed under: Electronics,Programming,Video — Posted by: Månsteri @ 1:16

November 16, 2008

Arduino & Quartz Composer

Here is a quick solution for getting the analog inputs from an Arduino into Quartz Composer. For now, this only supports the analog values. Reading the digital input pins would not be hard to implement, but I still haven’t decided on the best way to do that.
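
The Arduino end of it is dead simple: read the analog pins and stream the values out over serial. Something along these lines works (the exact message format is up to you; this sketch just prints the six analog values as one comma-separated line):

void setup() {
  Serial.begin(9600);
}

void loop() {
  // print all six analog inputs as one line, e.g. "512,13,1023,0,678,44"
  for (int pin = 0; pin < 6; pin++) {
    Serial.print(analogRead(pin));
    Serial.print(pin < 5 ? "," : "\n");
  }
  delay(20);   // roughly 50 updates per second
}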

Sending serial data from QC to Arduino is a little bit trickier, but I will definitely try to work on that also.

What do you need for this to work?

Filed under: Electronics,Programming,Tutorial — Posted by: Månsteri @ 23:08

November 10, 2008

Mickey Mann

I’m really interested in stereoscopy, which you might have guessed if you’ve ever seen me running around with my View-Master camera. In my opinion, the View-Master is still a superior method for viewing stereoscopic images, but it only does still images. That’s why I wanted to see if I could improve the design and make an interactive View-Master for animations.

This little hybrid between Mickey Mouse and Steve Mann enables you to control and view stereoscopic animations that are animated in real-time.

It’s an old View-Master viewer modified to have ChromaDepth lenses, some custom buttons, an accelerometer, a Bluetooth radio and an Arduino to control it all. I thought about hiding the electronics with bigger ears, but decided not to, because I like the ghetto-cyborg look he’s got going on there.

So how does it work? You look through the viewer at the screen, where you will see some 3-layer Månsteri-action in all of its stereoscopic glory. The great thing about ChromaDepth stereoscopy is that it works with basic colors, so you don’t need two video channels to achieve a 3D effect. On a dark background, everything that is blue will appear to be in the background and everything that is red will appear to be in the foreground. Colors in the spectrum between blue and red will appear to be somewhere in the middle. If you didn’t understand my explanation, look it up on the interwebs.

The accelerometer detects your motion and moves the character on the middle layer, giving the illusion that the character is trying to mimic your movement. You can control the content of the layers with the three buttons on the side of the viewer: button three controls the background, button two the middle layer, and button one the foreground. Check out the video and you’ll understand what I mean. If you have ChromaDepth glasses, put them on to see the 3D effect.

The Arduino sends the sensor data and the button states wirelessly via Bluetooth to my computer. The information is parsed in Max/MSP, which in turn sends the data as OSC packets to Animata (my favourite software at the moment). Animata then animates everything in real time and handles the hiding/revealing of the different layers.
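
The gist of the Arduino side is something like this (a stripped-down sketch, not the actual code from the download; the Bluetooth modem just shows up as a serial port, and the pin numbers are placeholders):

const int accelPins[] = {0, 1, 2};    // accelerometer X/Y/Z on analog inputs 0-2
const int buttonPins[] = {5, 6, 7};   // the three buttons on the side of the viewer

void setup() {
  Serial.begin(9600);   // the Bluetooth modem sits on the hardware serial port
  for (int i = 0; i < 3; i++) {
    pinMode(buttonPins[i], INPUT_PULLUP);
  }
}

void loop() {
  // one line per frame ("x y z b1 b2 b3"); Max splits this and turns it into OSC for Animata
  for (int i = 0; i < 3; i++) {
    Serial.print(analogRead(accelPins[i]));
    Serial.print(" ");
  }
  for (int i = 0; i < 3; i++) {
    Serial.print(digitalRead(buttonPins[i]) == LOW ? 1 : 0);
    Serial.print(i < 2 ? " " : "\n");
  }
  delay(20);
}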

If you are interested, I have uploaded the Arduino and Max 5 source codes and also the Animata scene. It’s all very specific to my setup, but someone might find it useful. Download the source.

Filed under: Electronics,Programming,Video — Posted by: Månsteri @ 3:31

July 2, 2008

Multitouch Table Post #2 – Acrylic and LEDs

Gather around, children, the multitouch adventures continue. Today we will hear the story of The Acrylic Screen and 112 Little LEDs.

I decided to go for a 4:3 format for my screen and bought a 10 x 820 x 620 mm acrylic sheet from Foiltek. Acrylic that size cost me around 60 euros; I couldn’t find any place cheaper here in Finland. The edges of the acrylic weren’t clear, so I sanded them: I started with 180 grit paper, worked my way up to 1000 grit, and then finished the edges with some metal polish. They should be clear enough now. Here’s my screen.

In order for the FTIR tracking to work, you need a lot of LEDs shining IR light into the acrylic screen. So basically you need to make LED strips that will be placed along the edges of the acrylic. Here’s a handy calculator to help you design your LED array.

I decided to save a lot of time by ordering the circuit boards for the LEDs from dshop.ch. Each of the boards holds 2 x 7 LEDs and 2 x 15 ohm resistors. I got a bunch of the boards with the resistors already mounted, but not the LEDs. I ordered my LEDs from eBay: 120 for $48.

I’m using 4 of these boards on each of the longer sides of the acrylic. That’s 112 LEDs in total. This might be a little overkill, but we’ll see once I get a power supply for the strips and can test it properly.
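
As a rough sanity check for the power supply (assuming a 12 V supply and a typical IR LED forward voltage of around 1.5 V; your LEDs may differ): seven LEDs in series drop about 10.5 V, which leaves 1.5 V across the 15 ohm resistor, so each string runs at roughly 100 mA. With two strings per board and eight boards, that’s 16 strings and about 1.6 A in total, so the supply needs to handle at least that.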

Next time: How to hack a webcam to see only IR light.

Filed under: Electronics,Tutorial — Posted by: Månsteri @ 17:21

July 1, 2008

Multitouch Table Post #1 – How Does It Work?

Most of you have probably seen Jeff Han’s multitouch screen, the Microsoft Surface, and all the other multitouch devices that are popping up everywhere. What’s the first thing that comes to your mind when you see those videos? If you are anything like me, you’ll probably want three of them, preferably yesterday. Alas, I don’t have an extra 10,000 lying around to buy an MS Surface, so I decided to build my own.

Fortunately, the excellent NUI Group forum is filled with people who have the same intentions and are willing to share advice. There are dozens of blogs describing how to build a DIY multitouch screen. I’ve been planning to build one ever since I saw the Reactable in action a few years ago. Now that I have the time and motivation to actually do this, I have started the building process and I will try to document the steps here on the blog. I will also write how-to PDFs in both English and Finnish and release them once the table is finished.

I don’t want to go into too much detail about how different multitouch surfaces work, but I’ll quickly go over the science behind the technique I’m going to use. Basically, I’m going to build a multitouch table that uses the FTIR (frustrated total internal reflection) method that Jeff Han also uses in his screen.


So basically, you have an acrylic screen that is lit up from the side with infrared LEDs. When you put your finger on the acrylic, it reflects the IR light downwards, and your finger would look really bright below the acrylic if you could see IR light. So, you need something that IS able to see infrared light. For this purpose you use a video camera that has been modified to pass infrared light and block visible light. Why not just use normal light then, you might ask? Because you also want to project an image on the screen, and the light from the projector would disturb the finger tracking. What your camera sees is pretty much like this video (after some image processing to make the blobs brighter).

OK, you have a bunch of bright blobs where someone touches the screen/table. How do those blobs then turn into a cool multitouch screen? You need some software that detects the blobs and then does something cool with the information, or sends it along to some other software that might do something even cooler. The weapon of choice for many people is the Touchlib library, but it only works on Windows or Linux. So we Mac users need to use Boot Camp, Parallels etc., or do the tracking with some different software altogether. Unfortunately, there aren’t many options. ReacTIVision 1.4 is pretty much the only free, easily available software for this. It is mainly used for fiducial tracking and I’ve used it in some of my projects. Version 1.4 also does plain finger tracking, but it’s pretty slow with USB cameras. I will cover the software side in more detail some other day and show you some other options that you might have.

The tracking software uses the TUIO protocol to send the information to Flash, Processing, Quartz Composer, Max/MSP or any other programming environment that understands OSC. Then it’s just up to you what to do with the tracking data.
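
To give you an idea of what that looks like, TUIO is just OSC underneath: each finger (cursor) update comes in as a message roughly along the lines of /tuio/2Dcur set <session id> <x> <y> <x velocity> <y velocity> <acceleration>, with the coordinates normalized to 0–1, plus alive and fseq messages so the client knows which fingers still exist and which frame it is looking at.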

This is a really superficial explanation of how this all works. I will go into details over the next days and weeks when I start to document the building process of my own multitouch screen. There is tons of information on the Internet about all of this if you are interested. The NUI Group and Google are your friends.

Tomorrow, Multitouch Table Post #2 – Acrylic and the LEDs.

Filed under: Electronics,Programming — Posted by: Månsteri @ 19:43

May 27, 2008

Modded/Modattu and Other News (e-MobiLArt, Montreal, etc)

If you are in Rovaniemi from the 2nd to the 28th of June, go check out the exhibition by two of my very good friends.

Jeanne Jo and Rachelle Beaudoin are two talented individuals who I met at RISD last summer. Modded/Modattu will have wearables, hacked headphones and other cool shitten. (shitten™ by Steven)

In other news, things are kind of hectic and I’m running out of energy. I’m leaving for Helsinki on Thursday, and on Monday I will be on my way to the e-MobiLArt workshop in Athens, Greece. I’m really glad I have the opportunity to travel and be part of these great projects, but I could really use a few moments when I don’t have some deadline waiting around the corner. My schoolwork is lagging miles behind, countless other projects need a lot of attention, and I’m stressing a little about work. I’ll manage, just bear with me if I sometimes seem like my head is not attached to my body.

What else… I will be moving to Montreal for a year next September. So holla at me if you are from there and interested in working with me (DJing, VJing, design, art, anything…). And also contact me if you know of a good and cheap place for me and my girlfriend to live, preferably in Mile End.

February 1, 2007

Phidgets + Quartz Composer

We’re going to use Phidgets for a university project pretty soon, so I tried to come up with a way to control Quartz Composer with the Phidget sensors. Hiroaki has done Accelerometer and RFID custom patches that you can find here, but I want to use the Interface Kit 8/8/8 and be able to use all the different sensors.

I’m not smart enough to program my own patches for QC, but I was able to build a Cocoa application that sends the data from the sensors to the QC composition. I got a little help from the Phidget Interface Kit sample code that you can download here, and from the instructions found on the Phidgets forum. The code in the forum post didn’t work for me, so here’s a small tutorial that shows you how to build a Cocoa app that will control your Quartz Composer composition with Phidgets sensors.

What do you need?

- A computer running Mac OS X 10.4 with the developer tools installed
- Phidgets Interface Kit 8/8/8 + some sensors
- Download and install the Mac OS X Framework 21 from the Phidgets website. This will also include the source code that you need for this tutorial.


OK. Let’s start

1)
Open the CocoaIFKit Xcode project from the /Examples/Source Code/CocoaIFKit folder.

2)
First we need to modify the project so that we can import and use a Quartz Composer composition. These next steps are explained in more detail in Apple’s Quartz Composer programming guide, and I recommend that you try its tutorials first, but I’ll go through the important stuff quickly.

- First, choose Project > Add to Project.
- Navigate to System/Library/Frameworks/Quartz.framework and add it to the project.
- Drag the Quartz.framework to your Frameworks folder in the Groups & Files list.
- Double-click the MainMenu.nib file and Interface Builder launches.
- If you haven’t done this before, you have to install the Quartz Composer palette in Interface Builder. Choose Tools > Palettes > Palette Preferences and click Add.
- Navigate to /Developer/Extras/Palettes/ and add QuartzComposer.palette.
- Now your palette window should show the Quartz Composer palette. The blue thing is QCView and the green thing is QCPatchController.

- Drag the QCPatchController from the Palette window to the Instances tab in the MainMenu.nib window
- Go to the Cocoa Windows tab in the Palette window and drag a window to the Instances tab also. A new window appears.
- Your MainMenu.nib window should look something like this

- Select the new window and go to Tools > Show Inspector. In the Attributes section, make sure that “Visible at launch time” is checked for the new window.
- Now drag a QCView from the palette to the new window and resize the QCView the way you like it.
- Select the QCView and then open the Bindings pane of the Inspector window. Click the small triangle where it says patch.
- Choose QCPatchController from the “Bind to” pop-up menu. Enter patch in the Controller Key text field.

3)
OK. Now we need a QC composition to use with our app. I won’t go into any detail about using Quartz Composer. There’s plenty of info available for that, and this tutorial is already getting too long.

What you need to do is create a composition of any kind that has a published input named sensorZero. I just used an Image With String patch connected to a Sprite and published the String input with the name sensorZero. Save the .qtz file to the CocoaIFKit folder. I used the name phidgets.qtz.

Back in Xcode, drag phidgets.qtz to the Resources folder in the Groups & Files list. In Interface Builder, select the QCPatchController in the MainMenu.nib window, select the Attributes pane of the Inspector and click “Load from Composition File…” Navigate to your phidgets.qtz file. Choose File > Test Interface. You should see your composition in the window. The sensors won’t work in the testing mode.

4)
Now the application shows our composition, but we want to change the published value with the sensors. Read on and I’ll tell you how to do just that.

- Select the blue PhidgetInterfaceKitController from the instances tab. Then select the classes tab.
- Choose Classes > Add Outlet to PhidgetInterfaceKitController.
- Name the new outlet theController
- In the Instances tab, ctrl-click and drag from the PhidgetInterfaceKitController to the QCPatchController

- Choose Connect to connect the outlet.
- Save and go to Xcode again.

5)
Now we need to change the code so that the data goes from the sensors to the QC composition.

- Double-click the PhidgetInterfaceKitController.h file.
- Add this to the list of outlets:

IBOutlet id theController;

- Close the window and double-click the PhidgetInterfaceKitController.m file.
- Find the SensorChange section and add this code there:

// sensors to Quartz Composer
if(Index==0){
[theController setValue:[NSNumber numberWithInt:(int)Value] forKeyPath:@"patch.sensorZero.value"];
}else if(Index==1){
[theController setValue:[NSNumber numberWithInt:(int)Value] forKeyPath:@"patch.sensorOne.value"];
}else if(Index==2){
[theController setValue:[NSNumber numberWithInt:(int)Value] forKeyPath:@"patch.sensorTwo.value"];
}else if(Index==3){
[theController setValue:[NSNumber numberWithInt:(int)Value] forKeyPath:@"patch.sensorThree.value"];
}else if(Index==4){
[theController setValue:[NSNumber numberWithInt:(int)Value] forKeyPath:@"patch.sensorFour.value"];
}else if(Index==5){
[theController setValue:[NSNumber numberWithInt:(int)Value] forKeyPath:@"patch.sensorFive.value"];
}else if(Index==6){
[theController setValue:[NSNumber numberWithInt:(int)Value] forKeyPath:@"patch.sensorSix.value"];
}else if(Index==7){
[theController setValue:[NSNumber numberWithInt:(int)Value] forKeyPath:@"patch.sensorSeven.value"];
}

- We only published sensorZero from QC, but this code works with all 8 analog inputs on the Interface Kit (Index numbers 0-7), so you can publish more values from QC. Just name them correctly: sensorZero, sensorOne, sensorTwo…
- Note! I’m a total amateur when it comes to Cocoa, so this code might not be the best way to do this, but it works.

6)
That’s it. Save. Then choose Build and Go to see if it works.

If you want to change the QC composition, you need to reload the composition to the QCPatchController in Interface Builder and rebuild the app.

Phew! This is the first tutorial I’ve ever written. I hope it’s useful to someone.

Filed under: Electronics,Programming,Tutorial — Posted by: Månsteri @ 9:29