Evidently Reactable has been around for a couple of years, but I hadn’t heard about it until one of my ex-students, Gabi, sent me the link.
Undoubtedly Reactable is a really great implementation of a tangible interface, and it also plugs into the whole multi-touch mania (although it's a completely different system - it uses a camera to track the markers on the objects' faces and their orientation).
I'm just slightly disappointed to see them using it to create that kind of electronica synth noise blinky-blonk stuff again. If you create a new interface, it's worth thinking about how it lets you do things differently from before. For me, the actual audio they're producing could just as well be a couple of guys behind laptops running Reason or something similar.
It's great to see its collaborative possibilities, but again, same result. Interestingly, one of the creators says in the Berlin video (the one above) that they created it from a concept rather than from the technology, unlike many other technology-heavy projects.
I think this is often a really good approach, especially for things that have another 'purpose' behind the interface (like YouTube, etc., which is about sharing content, not necessarily the interface). But sometimes, especially when trying to develop new interactive paradigms, it's more suitable to work from the technology outwards. I don't mean those awful, totally cold computer-science projects. I mean a balance in the middle, where you play with the technology to see what inherent properties and language it has. That way you can discover new ways of thinking, doing and interacting.
Often starting from a concept means you're basing it on previous paradigms - in the case of Reactable, that's all those Max/MSP patch-style audio applications. They've pretty much swapped the boxes and lines for, well, real boxes and virtual lines.
Maybe they just need to play with Reactable more to work out what it can do. You can take a look at more of their videos on YouTube.