Here are some experiments in VR that I made after watching the Doctor Strange movie.
Visuals created using Unity, sounds created using Ableton Live.
What if you could make things appear in front of you with the power of your voice?
Skyfall
I had an idea for a virtual reality (VR) demo where you can say things and make them fall from the sky.
For the initial demo, the only objects that you can magically summon are a cube, a car, a house, Deadpool and a Star Destroyer.
The “car” command randomly picks between one of two cars.
You can also say the word “clear” to remove all the objects from the scene.
You can also move around the 3D space using an Xbox controller or a keyboard and mouse.
All this works within the Oculus Rift DK2.
To take this project further I can:
Problem
One of the roadblocks I hit early on was that Unity 5.3.2 does not support the .NET speech recognition DLLs, since it uses Mono.
Solution
In order to get around this, I created two applications and used a TCP port for them to communicate with each other.
Components
1) Server: C# .NET application
The voice recognition is being done using System.Speech.Recognition.
A GrammarBuilder object is created with the following words:
private string[] VALID_COMMANDS = { "clear", "cube", "car", "house", "deadpool", "star destroyer" };
Once a word is recognised, a message is sent to a specified port via TCP.
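To give an idea of how little code this needs, here is a minimal sketch of the server. The port number and the plain-text message format are my assumptions for illustration, not necessarily what the demo uses:

using System;
using System.Net.Sockets;
using System.Speech.Recognition;
using System.Text;

class VoiceServer
{
    private static string[] VALID_COMMANDS = { "clear", "cube", "car", "house", "deadpool", "star destroyer" };

    static void Main()
    {
        var recognizer = new SpeechRecognitionEngine();

        // Restrict recognition to the known command words.
        recognizer.LoadGrammar(new Grammar(new GrammarBuilder(new Choices(VALID_COMMANDS))));
        recognizer.SetInputToDefaultAudioDevice();

        recognizer.SpeechRecognized += (sender, e) =>
        {
            Console.WriteLine("Recognised: " + e.Result.Text);

            // Forward the recognised word to the Unity client over TCP.
            // Port 9050 is an assumption; use whatever port the client listens on.
            using (var client = new TcpClient("127.0.0.1", 9050))
            {
                byte[] message = Encoding.UTF8.GetBytes(e.Result.Text);
                client.GetStream().Write(message, 0, message.Length);
            }
        };

        recognizer.RecognizeAsync(RecognizeMode.Multiple);
        Console.ReadLine(); // keep listening until Enter is pressed
    }
}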
2) Client: Unity application
On the client side, there is a TCP listener running on a thread that listens for TCP messages.
If a word is received (e.g. “cube”), the model named “Cube” is cloned and added to the scene in front of and above where the user is looking.
First-person controls and VR support were also added to make the experience more immersive.
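And a minimal sketch of the Unity side. Again, the port, the queue-based thread hand-off and the object lookup are my assumptions for illustration:

using System.Collections.Generic;
using System.Net;
using System.Net.Sockets;
using System.Text;
using System.Threading;
using UnityEngine;

public class CommandListener : MonoBehaviour
{
    private TcpListener listener;
    private Thread listenThread;

    // Words arrive on a background thread, but Unity objects may only be
    // touched from the main thread, so we queue them for Update().
    private readonly Queue<string> commands = new Queue<string>();

    void Start()
    {
        listener = new TcpListener(IPAddress.Loopback, 9050); // port is an assumption
        listener.Start();
        listenThread = new Thread(Listen);
        listenThread.IsBackground = true;
        listenThread.Start();
    }

    void Listen()
    {
        byte[] buffer = new byte[1024];
        while (true)
        {
            using (TcpClient client = listener.AcceptTcpClient())
            {
                int read = client.GetStream().Read(buffer, 0, buffer.Length);
                string word = Encoding.UTF8.GetString(buffer, 0, read);
                lock (commands) { commands.Enqueue(word); }
            }
        }
    }

    void Update()
    {
        lock (commands)
        {
            while (commands.Count > 0)
            {
                Spawn(commands.Dequeue());
            }
        }
    }

    void Spawn(string word)
    {
        if (word.Length == 0) return;
        if (word == "clear") return; // destroying spawned objects is left out of this sketch

        // Clone the model named after the command (e.g. "Cube").
        // GameObject.Find assumes a template object of that name in the scene.
        GameObject template = GameObject.Find(char.ToUpper(word[0]) + word.Substring(1));
        if (template == null) return;

        // Drop it in front of and above where the user is looking.
        Vector3 position = Camera.main.transform.position
                         + Camera.main.transform.forward * 3f
                         + Vector3.up * 5f;
        Instantiate(template, position, Quaternion.identity);
    }
}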
I was trying out Adobe Edge Animate the other night, and wanted to get a simple walk cycle animation happening.
Eventually I decided it was too hard to get something simple going, and that it wasn’t scalable. I could get one Sonic animating, but what if I wanted 100?
So I decided to do it with JavaScript instead. After a quick Google search, and some minor modifications to the code, I created this.
http://benxtan.com/test/sonic/
Features:
(WIP = work in progress)
Copy of Elephantmen #6 cover (WIP) – Here’s the reference: http://www.comicartfans.com/gallerypiece.asp?piece=594821&gsub=73013
Here’s something really quick I made at Flashback 2011 (http://www.defame.com.au/flashback.php).
Pretty much learnt it there and then, thanks to help from the Internet and Mr. Johnny Jim Jams.
Managed to get first place in the wild compo with KMIDIC (http://pmidic.com) too!
http://www.defame.com.au/voting/
Instead of making a stop motion video of my giant Lego man (you can see him in the video below) like I was planning to do, I ended up playing with Processing again. Basically, I didn’t really have the right space to set up my stop motion, and I really needed a green screen to do what I wanted to do. But it’s all good, no time wasted, as I was very productive anyway!
First, here’s the video of my creation today:
Now an explanation…
In summary, I am waving my hand around in the air like an idiot and it is controlling the music coming from my computer. The hardware and software components at play here are: Microsoft Kinect -> Processing -> Ableton Live and a Novation Launchpad. I had a chat with my friend DJ Gustavo Bravetti and he had some good tips for me on how to set up Ableton clips to make the transitions smoother and sound more musical. When I have time, I’ll set up a whole song and give a better performance!
A bit more detail…
Basically, the Kinect is sending the location of my hand to Processing, which is in turn sending MIDI note-on messages to both Ableton Live and the Novation Launchpad. In this version, I have separated the grid into four quadrants, each one playing a different MIDI note that goes into two channels in Live. The first channel has an arpeggiator triggering an Impulse drum kit, and the second channel has an arpeggiated synth. The lights on the Launchpad are also set to light up each of the quadrants as they are triggered.
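The sketch itself is in Processing (source linked below), but the heart of it is a tiny bit of mapping logic. Here is that logic sketched in C# to match the earlier examples; the note numbers are made up for illustration:

class QuadrantMapper
{
    // One MIDI note per quadrant; these values are illustrative only.
    private static readonly int[] QuadrantNotes = { 36, 38, 42, 46 };

    // x and y are the hand position from the Kinect, normalised to 0..1.
    public static int NoteForHand(float x, float y)
    {
        int column = x < 0.5f ? 0 : 1;
        int row = y < 0.5f ? 0 : 1;
        return QuadrantNotes[row * 2 + column];
    }
}

// A note-on for NoteForHand(x, y) is then sent to both Ableton Live
// (to trigger the arpeggiated channels) and the Launchpad (to light
// up the corresponding quadrant).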
For those wanting to delve into the code, it’s not highly commented, but you should be able to get the idea of what I’m doing. Any issues, just leave a message here or send me an email at benxtan [at] gmail [dot] com.
Processing source code and Live set are available here (UPDATED):
http://benxtan.com/kmidic/kmidic_processing_v0.2.zip (latest)
http://benxtan.com/kmidic/kmidic_processing_v0.1.zip (original)
You will need to install Processing, and put the rwmidi and libfreenect libraries in the libraries folder of your Processing sketchbook.
Here are some links if you are after more information.
Software Links:
http://www.ableton.com/
http://processing.org/
http://ruinwesen.com/support-files/rwmidi-0.1c.zip
http://ruinwesen.com/support-files/rwmidi/documentation/RWMidi.html
https://github.com/shiffman/libfreenect/tree/master/wrappers/java/processing
Hardware Links:
http://www.xbox.com/kinect
http://www.novationmusic.com/products/midi_controller/launchpad
Once again, if you are a musician or music business in Australia, it’s free to sign up to http://rockstarhookups.com.au so go do it! I’m giving you free stuff, so help me out here ok? 🙂
Digital Art – Where design and visual art meet programming
Having all this free time is great. I finally have a chance to explore and play with things that have been sitting in my to-do list, collecting (digital) dust.
Today’s topic of interest is digital/generative art and my weapon of choice is Processing. Check out the links below for some truly beautiful works of art.
Artworks:
Tools:
After doing some research, I started playing around with Processing. I began messing around with the built-in examples and started a program that uses the mouse speed to draw circles (speed affects radius) and generate a sine wave (speed affects volume and frequency). I soon tired of this and began the task of getting Processing to talk to Ableton and my Novation Launchpad. The idea is that eventually, I can make an aesthetically pleasing visual display that is controlled by a combination of pre-programmed audio coming from Ableton, and a live performance from me using the Launchpad via MIDI, all working together as one synchronised unit.
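To give a flavour of the mappings involved, here they are sketched in C# (the actual program is a Processing sketch, and these ranges are made up for illustration):

using System;

static class MouseMappings
{
    // Speed is how far the mouse moved since the last frame, in pixels.
    public static float Speed(float dx, float dy)
    {
        return (float)Math.Sqrt(dx * dx + dy * dy);
    }

    public static float Radius(float speed)    { return 5f + speed * 2f; }            // faster = bigger circles
    public static float Volume(float speed)    { return Math.Min(1f, speed / 50f); }  // faster = louder
    public static float Frequency(float speed) { return 220f + speed * 10f; }         // faster = higher pitch
}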
Here’s a screenshot of a work in progress:
What it does:
Sounds pretty simple…but it was a bit of a challenge getting all the settings right! I still have some issues to iron out, and I hope to simplify the system further.
I’m very keen to check out Open Frameworks and Max for Live as well. There’s a 30% discount on Max for Live that ends in four days, and I’m very, very tempted to buy it.
Ableton Live, Processing, Max for Live, Launchpad, iPad, Kinect, Open Frameworks…all wonderful tools to explore and play with. Somewhere in there is just the right combination of hardware and software that will allow me to create something awesome. Something that combines my love of music and programming…
Oh, here’s one more link. It’s a TED talk on research being done on improvisation (jazz and rap). It doesn’t really reveal much that is new, but it’s an entertaining talk and opens up some interesting questions. http://www.ted.com/talks/charles_limb_your_brain_on_improv.html
1. Woke up and checked stats. My most recent software creation, a Kinect MIDI Controller (KMIDIC), is on Engadget in the US, Japan and Germany!
2. Managed to get a ticket to one of my favourite bands, Kyuss, for next year. So happy! Tickets sold out super quick!
3. Got to work, did a bunch of work, then my band’s (Throw Catch) freshly mastered debut EP arrived in the mail, and it sounds amazing! Big thanks to Dylan Adams (recording and mixing) and Michael Lynch (mastering at Shoehorse Sound).
4. Did more work, finished work, and off I went to my first guitar lesson in 12 years…with Peter Northcote! I learnt sooo much in that one-hour lesson, and have now made the commitment to practice guitar at least 10 minutes a day.
5. Started practicing as soon as I got on the train and had a random conversation with a stranger who is a guitarist and IT professional like myself; thanks to him, I now know of Animals As Leaders. Fantastic band!
All in all, pretty much the BEST day ever!!!
Also…if you are a musician or a music business in Australia, check out Rockstar Hookups for your music classifieds needs!
Nostalgia…
So I downloaded a version of QBasic today. This is a re-enactment of one of the first computer programs I wrote in 1995, when I was 11 years old, sitting in computer class.
The code might have been copied from the QBasic help, but looking through here, http://www.qbasicnews.com/qboho/, I couldn’t seem to find an example that does this. Anyone seen an example in QBasic help that draws random purple rectangles on screen?
Basically, when I saw that I could do this with a computer…write commands and create something from nothing…I was hooked.
Source code here:
SCREEN 7                                   ' 320x200, 16-colour graphics mode
COLOR 5, 0                                 ' purple foreground on black
RANDOMIZE TIMER                            ' seed the random number generator
DO
    CLS
    x = INT(RND * 320)                     ' random top-left corner
    y = INT(RND * 200)
    size = INT(RND * 10) + 10              ' side length between 10 and 19
    LINE (x, y)-(x + size, y + size), , BF ' BF = filled box in the current colour
LOOP WHILE INKEY$ = ""                     ' loop until a key is pressed