Here is a new track I made over the weekend (8-9 May 2021).
I wanted to make a hardstyle track with guitar sections. Something loud, fast, aggressive and heavy with lots of distortion. I’ve also been watching sci-fi horror space movies recently (Life, Sunshine, Event Horizon), so the track is subliminally influenced by them.
tl;dr: I followed a lot of steps and connected a lot of pipes together to control audio and visuals in real time with a MIDI controller + Ableton + Unity + MIDI-OX (this was the secret sauce)
Intro
(FYI: This is a really, really long explanation of my process, with no code and no links to download it. It’s mostly a record for myself, but maybe there is something here that will help you!)
I’ve been doing Houdini tutorials and was inspired by complex shapes I could make through combining simple functions like Subdivision, Taper, Twist & Copy To Points.
2021-03-28 – Clawed Ball, Houdini
From this, I wanted to manipulate the object in real time with a MIDI controller and decided to re-create the Houdini functions in Unity. The tweet below is the result:
Here's my creative output from the Easter long weekend. Code + Music + Art + Math! Created with @unity3d, @Ableton, MIDI-OX & @Behringer X-Touch Mini. Inspired by #Houdini.
Then I added a parameter to add subdivisions to the cube along the Y-Axis.
Cube mesh with subdivisions.
Then I added a Twist parameter. Each “layer” starting from the bottom is rotated clockwise.
Next, I added a Taper parameter. Each “layer” starting from the bottom gets smaller until the size is 0 at the top.
I now have a single “tendril”.
Tendril: Twisted and tapered.
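The twist and taper steps can be sketched as a single vertex transform. This is not my actual Unity (C#) code, just a minimal Python illustration of the math, assuming vertices run from y = 0 at the base to y = height at the top; the function name and parameters are hypothetical.

```python
import math

def twist_and_taper(vertices, height, twist_degrees, taper):
    """Rotate and shrink each vertex around the Y axis based on its
    normalised height (0 at the base, 1 at the top), mimicking
    Houdini's Twist and Taper nodes."""
    out = []
    for x, y, z in vertices:
        t = y / height                              # 0..1 along the Y axis
        angle = math.radians(twist_degrees * t)     # more twist higher up
        scale = 1.0 - taper * t                     # 1 at base, (1 - taper) at top
        rx = (x * math.cos(angle) - z * math.sin(angle)) * scale
        rz = (x * math.sin(angle) + z * math.cos(angle)) * scale
        out.append((rx, y, rz))
    return out
```

With taper = 1 the top layer collapses to a point, which is exactly what gives a single tendril its shape.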
Then I needed to find a sphere without too many vertices on it. I ended up using a “cube sphere” from https://catlikecoding.com/unity/tutorials/cube-sphere/ with a Grid Size of 2. This resulted in a sphere with 26 vertices.
Emulating Houdini’s Copy to Points node, I instantiated one tendril per vertex on the sphere, resulting in the Tendril Ball below.
Tendril ball: Copy to points
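The Copy to Points step boils down to computing, for each sphere vertex, a position and an outward direction for one tendril. In Unity this would be an Instantiate call plus a rotation from the vertex normal; the Python sketch below only shows the math, under the assumption that the sphere is centred at the origin (so the normal is just the normalised vertex position). The names are hypothetical.

```python
def copy_to_points(points, distance):
    """For each point on an origin-centred sphere, return the placement
    (position, outward direction) for one tendril, like Houdini's
    Copy to Points node. 'distance' offsets the tendril along its normal."""
    placements = []
    for x, y, z in points:
        length = (x * x + y * y + z * z) ** 0.5
        nx, ny, nz = x / length, y / length, z / length  # outward normal
        position = (x + nx * distance, y + ny * distance, z + nz * distance)
        placements.append((position, (nx, ny, nz)))
    return placements
```

With the 26-vertex cube sphere above, this yields 26 placements, and exposing `distance` as a MIDI-mapped parameter makes the whole ball “breathe” in real time.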
I then exposed 3 parameters:
Distance
Length
Twist Amount
MIDI Control
For many years now I’ve been trying to find a good solution for sending MIDI from a controller to Ableton Live and Unity simultaneously. I finally found the missing piece of the puzzle, and I had actually been on their website multiple times without realising the answer was there all along! This is probably the main reason I wanted to write this blog post.
The answer is MIDI-OX (http://www.midiox.com/). It lets me take a signal from my MIDI controller (Behringer X-Touch Mini) and split/send it to 2 virtual MIDI ports. I am on Windows 10 and I use loopBe30 for virtual MIDI ports.
MIDI-OX
In Ableton, I receive MIDI input from the virtual port “01. Internal MIDI”. I loaded a Wavetable synth and mapped the macros to the first 3 knobs on the X-Touch Mini.
In Unity, I receive MIDI input from the virtual port “02. Internal MIDI”. I used Minis by Keijiro (https://twitter.com/_kzr) to receive MIDI input. I map the first 3 knobs on the X-Touch Mini to control the 3 values on my Tendril Ball (Distance, Length & Twist Amount).
Behringer X-Touch Mini
And that’s it!
Sorry, I’m not sharing any code because this is one small piece of a much larger, long-running project of mine. I don’t really expect anyone to read all this anyway!
The next problem to solve is using the Push 2. Its knobs are endless/relative encoders: turning a knob doesn’t send an absolute value from 0–127. Instead, each event says whether the knob increased (normalised value < 0.5) or decreased (> 0.5). I need to do more research into two’s complement. When I turn a knob in one movement, Unity (Minis) only receives one event, even though I can see many events being triggered in MIDI-OX and in the Unity Input Debugger. So if I turn a knob from 0 to max in one go, Unity only increments the value by a small amount.
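For the record, here is how the decoding would work in one common relative scheme (7-bit two’s complement, where raw values 1–63 mean “increment” and 65–127 mean “decrement”; the exact mode depends on how the Push 2 is configured, so treat this as an assumption). The sketch is Python rather than my Unity code, and the key point is that every event must be accumulated, not just the last one received:

```python
def relative_delta(cc_value):
    """Decode a 7-bit two's-complement relative encoder value:
    1..63 means 'increment by n'; 65..127 means 'decrement by 128 - n'."""
    return cc_value - 128 if cc_value >= 64 else cc_value

def accumulate(current, cc_values, lo=0, hi=127):
    """Apply a stream of relative CC events to an absolute value, clamped.
    Missing events here is exactly why the Unity value only moves a little."""
    for v in cc_values:
        current = max(lo, min(hi, current + relative_delta(v)))
    return current
```

So a fast clockwise turn arrives as many small positive deltas; if only one of them reaches Unity, the accumulated value barely moves, which matches the behaviour I’m seeing.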
It’s been over 3 years since I posted anything here, but this was so frustrating that I thought I would post all the solutions in one place, in case it saves you a few hours of tearing your hair out.
It’s Easter Monday. One of my tasks today, before the end of the long weekend, was to create a soundscape for my friend Adam’s short animation. Sounds easy enough, but I wasted hours trying to get Ableton to load and export video.
I am running:
Windows 10 Home (1903) 64-bit
DirectX 12.0
Ableton Live 10.1.9
Laptop with:
NVIDIA Geforce GTX 1060
Intel Integrated Graphics 630
PROBLEM: Can’t load video in Ableton Live 10 on Windows 10.
SOLUTION: Install Matroska Splitter and FFDShow.
When I tried to drag a video into the timeline in Ableton, it would crash or show error messages like “The file could not be read. It may be corrupt or not licensed”.
We are all used to text search. Google, Bing, Yahoo. In VR, you don’t have a keyboard.
We sometimes use voice search. Siri, Google, Cortana. Voice search is used in the Samsung Internet VR app to search the web.
With VR, will there be new ways to search?
In Steam you can scroll through lists of thumbnail images of VR apps, and you can point at and select them with hand controllers. Other VR apps use gaze as an input mechanism.
What if you wanted to search the web where there are millions of results? What is the best way to represent this in a 3D space where you can have hand controllers, gaze, voice, gestures and eye tracking?
Google has 2D image search to find similar images. Will there be a 3D image search to find similar 3D models?
What if I wanted to do a meta-search across multiple platforms? AltspaceVR, Second Life, High Fidelity, vTime, etc. What if I wanted to search for a location, a person, an object, a colour or an event across these platforms? Will there be new meta-apps that sit on top of other apps to enable this? VR search engines, VR crawlers, vrrobots.txt, VR cookies, etc.
An incognito mode where you can traverse the Metaverse discreetly?
Will there eventually just be one Metaverse, just as there is only one Internet?
On Friday the 26th of June 2015, a collaborative VR artwork between Vaughan O’Connor and myself was exhibited at the MCA. The 2 artworks, a holographic print and a Virtual Reality (VR) experience, were both based on 3D scans of quartzite rocks from Oberon, NSW.
I created the VR component of the artwork based on the scans that Vaughan supplied.
Inspiration
Inspiration came from lyrics to a song by Covenant called Bullet.
“As the water grinds the stone, we rise and fall. As our ashes turn to dust, we shine like stars.”
Idea
The idea behind the artwork is that it is a place outside of time. Here, concepts such as gravity, space and time do not behave as you would expect. Tempus Incognito. The events that occur here, in the event horizon, cannot affect an outside observer. The 4 classical elements, fire, air, water and earth, are present in this place, reflecting the essential building blocks for all things. Even though we are outside of time, matter still exists, albeit in a rather astonishing manner.
Technology
The Quartzite VR artwork was created using Unity (a game engine) and experienced via the Oculus Rift DK2 (a virtual reality headset). The ambient soundscape was composed in Ableton Live and is binaural.
A few interesting things that happened on the night:
one of the computers decided to misbehave (no audio, no projector)
someone decided to walk about 2 metres away from the desk. The Oculus is plugged into a computer!
2 people decided to look under the desk
a lot of people stood up to experience the artwork rather than sit down in the seats provided
people were amazed at the Oculus Rift
people found the experience soothing, calming and meditative
some people wanted to stay inside the artwork forever!
people asked a lot of questions about the technical aspects of the artwork
More
Futures also featured performances, lectures and music by: Vaughan O’Connor & Ben X Tan, Michaela Gleave, Josh Harle, 110%, Mark Brown, Eddie Sharp & Mark Pesce, Andrew Frost, Claire Finneran & Alex Kiers, Kate MacDonald, Sitting & Smiling, Baden Pailthorpe, Hubert Clarke Jr, Polish Club and Ryan Saez.