Archive for the ‘Music’ Category

tl;dr: I did lots of steps and connected lots of pipes together to control audio and visuals in real time with a MIDI controller + Ableton + Unity + MIDI-OX (this was the secret sauce).

Intro

(FYI: This is a really, really long explanation of my process with no code and no links to download it. It’s mostly a record for myself, but maybe there is something here that will help you!)

I’ve been doing Houdini tutorials and was inspired by the complex shapes I could make by combining simple functions like Subdivision, Taper, Twist & Copy To Points.

2021-03-28 – Clawed Ball, Houdini

From this, I wanted to manipulate the object in real time with a MIDI controller and decided to re-create the Houdini functions in Unity. The tweet below is the result:

There are 2 things I’d like to talk about here:

  1. Tendril Ball – The visuals
  2. MIDI Control – Using a MIDI controller to manipulate Ableton Live (audio) & Unity (visual) simultaneously

Tendril Ball

It’s a ball made of tendrils, hence the name.

First, I started by creating a mesh of a cube.
I used this site: http://ilkinulas.github.io/development/unity/2016/04/30/cube-mesh-in-unity3d.html

Then I added a parameter to subdivide the cube along the Y axis.
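Roughly speaking, each subdivision adds a ring of vertices, so the cube’s sides end up grouped into horizontal “layers”. A minimal sketch of the idea (the variable names are mine, not the tutorial’s):

```csharp
using System.Collections.Generic;
using UnityEngine;

public static class CubeLayers
{
    // Returns the 4 corner vertices of each horizontal "layer" of a unit
    // cube, sliced along the Y axis. Side quads are then built between
    // consecutive layers.
    public static List<Vector3[]> Build(int subdivisions)
    {
        var layers = new List<Vector3[]>();
        for (int i = 0; i <= subdivisions; i++)
        {
            float y = Mathf.Lerp(-0.5f, 0.5f, (float)i / subdivisions);
            layers.Add(new[]
            {
                new Vector3(-0.5f, y, -0.5f), new Vector3(0.5f, y, -0.5f),
                new Vector3(0.5f, y, 0.5f),  new Vector3(-0.5f, y, 0.5f),
            });
        }
        return layers;
    }
}
```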

Cube mesh with subdivisions.

Then I added a Twist parameter: each “layer”, starting from the bottom, is rotated clockwise, a little further than the layer below it.

Next, I added a Taper parameter: each “layer”, starting from the bottom, gets smaller, until the size is 0 at the top.
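Both operations boil down to a per-vertex transform driven by the vertex’s normalised height. A sketch of how the two steps work together (the parameter names are mine):

```csharp
using UnityEngine;

public static class TendrilDeform
{
    // Twist + taper for one vertex, where t is its normalised height
    // (0 at the bottom layer, 1 at the top).
    public static Vector3 TwistAndTaper(Vector3 v, float t, float twistDegrees)
    {
        // Twist: rotate around the Y axis, with higher layers rotated further.
        Vector3 rotated = Quaternion.AngleAxis(twistDegrees * t, Vector3.up) * v;

        // Taper: shrink X/Z linearly so the size reaches 0 at the top.
        float scale = 1f - t;
        return new Vector3(rotated.x * scale, rotated.y, rotated.z * scale);
    }
}
```

Run every vertex of the subdivided cube through this and reassign mesh.vertices, and you get the shape below.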

I now have a single “tendril”.

Tendril: Twisted and tapered.

Then I needed to find a sphere without too many vertices on it. I ended up using a “cube sphere” from https://catlikecoding.com/unity/tutorials/cube-sphere/ with a Grid Size of 2. This resulted in a sphere with 26 vertices.

Emulating Houdini’s Copy to Points node, I instantiated one tendril per vertex on the sphere, resulting in the Tendril Ball below.

Tendril ball: Copy to points
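The idea, sketched below, is to place one instance per vertex and use the vertex normal to decide which way the tendril points (“tendril” here is assumed to be a prefab of the twisted/tapered mesh above):

```csharp
using UnityEngine;

public class CopyToPoints : MonoBehaviour
{
    public Mesh pointSource;   // the 26-vertex cube sphere
    public Transform tendril;  // prefab of the tendril mesh

    void Start()
    {
        Vector3[] verts = pointSource.vertices;
        Vector3[] normals = pointSource.normals;
        for (int i = 0; i < verts.Length; i++)
        {
            // Aim each copy's local +Y (the tendril's length axis) along the
            // vertex normal, like Houdini's Copy to Points with point normals.
            Quaternion aim = Quaternion.FromToRotation(Vector3.up, normals[i]);
            Instantiate(tendril, transform.TransformPoint(verts[i]), aim, transform);
        }
    }
}
```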

I then exposed 3 parameters:

  1. Distance
  2. Length
  3. Twist Amount

MIDI Control

For many years now I’ve been trying to find a good solution for sending MIDI from a controller to Ableton Live and Unity simultaneously. I finally found the missing piece of the puzzle, and it turns out I had been on their website multiple times without realising the answer was there all along! This is probably the main reason I wanted to write this blog post.

The answer is MIDI-OX (http://www.midiox.com/). It lets me take the signal from my MIDI controller (a Behringer X-Touch Mini) and split/send it to 2 virtual MIDI ports. I am on Windows 10 and use LoopBe30 for the virtual MIDI ports.

MIDI-OX

In Ableton, I receive MIDI input from the virtual port “01. Internal MIDI”. I loaded a Wavetable synth and mapped the macros to the first 3 knobs on the X-Touch Mini.

In Unity, I receive MIDI input from the virtual port “02. Internal MIDI”, using Minis by keijiro (https://twitter.com/_kzr). I mapped the first 3 knobs on the X-Touch Mini to the 3 values on my Tendril Ball (Distance, Length & Twist Amount).
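For reference, here’s a minimal sketch of reading those knobs with Minis. I’m assuming the first 3 knobs send CC 1-3 and that Minis exposes each CC as a child control named like “control001” with a normalised 0..1 value; check the Minis README for the exact control names:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.InputSystem.Controls;

public class TendrilMidiInput : MonoBehaviour
{
    // The 3 exposed Tendril Ball parameters, updated from MIDI each frame.
    public float distance;
    public float length;
    public float twistAmount;

    void Update()
    {
        // Minis registers each MIDI input port as an Input System device.
        var midi = InputSystem.GetDevice<Minis.MidiDevice>();
        if (midi == null) return;

        distance    = ReadKnob(midi, 1);
        length      = ReadKnob(midi, 2);
        twistAmount = ReadKnob(midi, 3);
    }

    static float ReadKnob(Minis.MidiDevice midi, int cc)
    {
        // Assumption: CC n appears as an axis control named "control00n".
        var knob = midi.TryGetChildControl<AxisControl>($"control{cc:000}");
        return knob != null ? knob.ReadValue() : 0f;
    }
}
```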

Behringer X-Touch Mini

And that’s it!

Sorry, I’m not sharing any code, because this is part of a long, ongoing project of mine and this is just one small piece of a much larger execution. I don’t really expect anyone to read all this anyway!

The next problem to solve is using the Push 2. Its knobs are endless/relative encoders, so turning a knob doesn’t send an absolute value from 0-127. Instead it sends a value saying whether the knob increased (< 0.5) or decreased (> 0.5); I need to do more research into two’s complement. On top of that, when I turn a knob in one movement, Unity (Minis) only receives one event, even though I can see many events being triggered in MIDI-OX and in the Unity Input Debugger. So if I turn a knob from 0 to max in one go, Unity only increments the value by a small amount.
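For the record, here’s how I understand the decoding, as a rough sketch. In two’s complement relative mode, raw CC values 1-63 mean “increment by n” and 65-127 are negative deltas (127 = -1, 126 = -2, and so on), so you accumulate the deltas yourself instead of treating the CC as an absolute position. Whether this exactly matches the Push 2’s encoder mode is one of the things I still need to verify:

```csharp
using UnityEngine;

public class RelativeKnob
{
    // Accumulated position of the endless encoder, kept in 0..1.
    public float Value { get; private set; }

    // Decode a two's complement relative CC: 1-63 => +n, 65-127 => n - 128.
    static int DecodeDelta(int rawCc) => rawCc < 64 ? rawCc : rawCc - 128;

    // Call this once per incoming CC *event*, not once per frame; otherwise
    // several events arriving between frames collapse into one state read,
    // which looks exactly like the "only increments a small amount" problem.
    public void OnControlChange(int rawCc)
    {
        Value = Mathf.Clamp01(Value + DecodeDelta(rawCc) / 127f);
    }
}
```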


It’s been over 3 years since I posted anything here, but this was so frustrating that I thought I’d post all the solutions in one place, in case it saves you a few hours of tearing your hair out.

It’s Easter Monday. One of my tasks today, before the end of the long weekend, was to create a soundscape for my friend Adam’s short animation. It sounds easy enough, but I wasted hours trying to get Ableton to load and export video.

I am running:

  • Windows 10 Home (1903) 64-bit
  • DirectX 12.0
  • Ableton Live 10.1.9
  • Laptop with:
    • NVIDIA Geforce GTX 1060
    • Intel Integrated Graphics 630

 

PROBLEM : Can’t load video in Ableton Live 10 using Windows 10
SOLUTION: Install Matroska Splitter and FFDShow.

When I tried to drag a video into the timeline in Ableton, it would crash or show error messages like “The file could not be read. It may be corrupt or not licensed”.

See: https://help.ableton.com/hc/en-us/articles/209773125-Using-Video

The first option, CCCP, did not work for me. Neither did the K-Lite Codec Pack Mega.

Only Option 2, Matroska Splitter and FFDShow, solved this issue.

 

PROBLEM : Video not playing in Ableton Live 10 using Windows 10
SOLUTION: Set Live to use integrated graphics in the NVIDIA Control Panel

Now that I could load the video, it wouldn’t play in Live. Instead, I would just get a black screen in the video player.

In my case, I have a laptop with:

  • NVIDIA Geforce GTX 1060
  • Intel Integrated Graphics 630

See: https://forum.ableton.com/viewtopic.php?t=232488

Open the NVIDIA Control Panel and select “Manage 3D Settings”.

Click on Program Settings, and add Live to the list.

Set the preferred graphics processor to “Integrated graphics”.

 

PROBLEM : Can’t export video from Ableton Live 10 using Windows 10
SOLUTION: Set the Export PCM Bit Depth to 16

The final problem was that I couldn’t export the video. Live would export the WAV and then just crash.

See: https://forum.ableton.com/viewtopic.php?t=231960

  1. Click on File > Export Audio/Video
  2. Set the PCM > Bit Depth to 16
  3. Select the Video Encoder Settings as you wish
  4. Click Export
  5. Voila!

 

And that’s it. I hope this consolidated post is helpful to you. It seems like a lot of people have issues with Live and video.

 

Cheers,

Your friendly neighbourhood Ben X Tan.

Here are some experiments in VR that I made after watching the Doctor Strange movie.

Visuals created using Unity, sounds created using Ableton Live.






On Friday the 26th of June 2015, a collaborative VR artwork by Vaughan O’Connor and myself was exhibited at the Museum of Contemporary Art (MCA). The 2 artworks, a holographic print and a Virtual Reality (VR) experience, were both based on 3D scans of quartzite rocks from Oberon, NSW.

The artworks were on display at the MCA ARTBAR – Futures, curated by Dara Gill.

I created the VR component of the artwork based on the scans that Vaughan supplied.

Inspiration

Inspiration came from lyrics to a song by Covenant called Bullet.

“As the water grinds the stone,
we rise and fall.
As our ashes turn to dust,
we shine like stars.”

Idea

The idea behind the artwork is that it is a place outside of time. Here, concepts such as gravity, space and time do not behave as you would expect. Tempus Incognito. The events that occur here, inside the event horizon, cannot affect an outside observer. The 4 classical elements, fire, air, water and earth, are present in this place, reflecting the essential building blocks of all things. Even though we are outside of time, matter still exists, albeit in a rather astonishing manner.

Technology

The Quartzite VR artwork was created using Unity (a game engine) and experienced via the Oculus Rift DK2 (a virtual reality headset). The ambient soundscape was composed in Ableton Live and is binaural.

Photos
Photos from the MCA ARTBAR ‘Futures’ night (gallery of 11 images).

Some photos by @mightandwonder

 

Observations

A few interesting things that happened on the night:

  • one of the computers decided to misbehave (no audio, no projector)
  • someone decided to walk about 2 metres away from the desk. The Oculus is plugged into a computer!
  • 2 people decided to look under the desk
  • a lot of people stood up to experience the artwork rather than sit down in the seats provided
  • people were amazed at the Oculus Rift
  • people found the experience soothing, calming and meditative
  • some people wanted to stay inside the artwork forever!
  • people asked a lot of questions about the technical aspects of the artwork

More

Futures also featured performances, lectures and music by: Vaughan O’Connor & Ben X Tan, Michaela Gleave, Josh Harle, 110%, Mark Brown, Eddie Sharp & Mark Pesce, Andrew Frost, Claire Finneran & Alex Kiers, Kate MacDonald, Sitting & Smiling, Baden Pailthorpe, Hubert Clarke Jr, Polish Club and Ryan Saez.

Here’s a game I made for the Oculus Rift with my colleagues at DAN.

http://thehungergame.com.au/

“I believe that there are many ways to be creative within the tools that we use.
Even though the tools are the same, there are still many possible outcomes.
The creativity comes from within.
The individual absorbs and consume the world around them, and their sub conscious processes all this.
The sounds you hear in your head.
The things you see with you eyes.
The emotions you feel with your heart.
All that combined with the technical knowledge of music and how to use your instruments, your software, your hardware etc.
Sometimes you have to detach yourself from the world too. Get away from the mainstream.
Sometimes you get caught in the flow and it grabs you and you can’t get away.
Go for a holiday. Live in another country. Listen to music that is nothing like what you normally listen to.
Same as before. Combine existing elements to make something new.
That is the essence of creativity.”

This is the email as is. Stream of consciousness with no editing.

Here are 2 musical works in progress. One from last week and one from today.

DOWNLOAD: (2012-08Aug-18) Hard Reset

This one has a bit of a new wave and synthpop kinda feel to it. I have some lyrics for it, but will wait until I’ve finished the whole song before sharing them on the Interweb.

DOWNLOAD: (2012-08Aug-26) Soft Reset

This one’s some kinda blues/jazzy piece. I was playing an iPhone game made by a friend, called Pong Beats (http://www.pongbeats.com/), and remembered how much I love the sound of acoustic piano, so I decided to pull out my M-Audio Axiom 25, connect it to Ableton Live and make some music.


I haven’t used it in a long time… and for some reason, the C# and G keys on both octaves don’t work anymore! After doing some googling, it sounds like it’s a common problem. Someone’s solution was to turn it upside down and slap it. I tried that, but it didn’t work. Well, I managed to make some music even without those 2 notes (4 keys all up), and I added a task to my TODO list to figure out what’s wrong some other day. Hopefully it’s not a hardware problem!

Instead of making a stop motion video of my giant Lego man (you can see him in the video below) like I was planning to, I ended up playing with Processing again. Basically, I didn’t really have the right space to set up my stop motion, and I really needed a green screen to do what I wanted to do. But it’s all good, no time wasted, as I was very productive anyway!

First, here’s the video of my creation today:

Now an explanation…

In summary, I am waving my hand around in the air like an idiot and it is controlling the music coming from my computer. The hardware and software components at play here are Microsoft Kinect -> Processing -> Ableton Live and Novation Launchpad. I had a chat to my friend DJ Gustavo Bravetti and he had some good tips for me on how to set up Ableton clips to make the transitions smoother and sound more musical. When I have time, I’ll set up a whole song and give a better performance!

A bit more detail…

Basically, the Kinect is sending the location of my hand to Processing, which is in turn sending MIDI note-on messages to both Ableton Live and the Novation Launchpad. In this version, I have separated the grid into 4 quadrants, each one playing a different MIDI note that goes into 2 channels in Live. The first channel has an arpeggiator triggering an Impulse drum kit, and the second channel has an arpeggiated synth. The lights on the Launchpad are also set to light up each of the quadrants as they are triggered.
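The quadrant logic itself is tiny. Here it is sketched in C# for illustration (the real thing lives in the Processing sketch below, and these note numbers are placeholders, not the ones in my Live set):

```csharp
// Map a hand position on screen to one of 4 MIDI notes, one per quadrant.
static class QuadrantMapper
{
    static readonly int[] quadrantNotes = { 36, 38, 40, 42 };

    public static int NoteForHand(float x, float y, float width, float height)
    {
        int col = x < width / 2f ? 0 : 1;   // left or right half
        int row = y < height / 2f ? 0 : 1;  // top or bottom half
        return quadrantNotes[row * 2 + col];
    }
}
```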

For those wanting to delve into the code, it’s not highly commented, but you should be able to get the idea of what I’m doing. Any issues, just leave a message here or send me an email at benxtan [at] gmail [dot] com.

Processing source code and Live set are available here (UPDATED):

http://benxtan.com/kmidic/kmidic_processing_v0.2.zip

http://benxtan.com/kmidic/kmidic_processing_v0.1.zip

You will need to install Processing, and the rwmidi and libfreenect libraries in the libraries folder of your Processing sketches.

Here are some links if you are after more information.

Software Links:
http://www.ableton.com/
http://processing.org/
http://ruinwesen.com/support-files/rwmidi-0.1c.zip
http://ruinwesen.com/support-files/rwmidi/documentation/RWMidi.html
https://github.com/shiffman/libfreenect/tree/master/wrappers/java/processing

Hardware Links:
http://www.xbox.com/kinect
http://www.novationmusic.com/products/midi_controller/launchpad

Once again, if you are a musician or music business in Australia, it’s free to sign up to http://rockstarhookups.com.au so go do it! I’m giving you free stuff, so help me out here, ok? 🙂

Digital Art – Where design and visual art meets programming

Having all this free time is great. I finally have a chance to explore and play with things that have been sitting in my todo list, collecting (digital) dust.

Today’s topic of interest is digital/generative art and my weapon of choice is Processing. Check out the links below for some truly beautiful works of art.

Artworks:

Tools:

After doing some research, I started playing around with Processing. I began messing around with the built-in examples and started a program that uses the mouse speed to draw circles (speed affects radius) and generate a sine wave (speed affects volume and frequency). I soon tired of this and began the task of getting Processing to talk to Ableton and my Novation Launchpad. The idea is that eventually, I can make an aesthetically pleasing visual display that is controlled by a combination of pre-programmed audio coming from Ableton, and a live performance from me using the Launchpad via MIDI, all working together as one synchronised unit.

Here’s a screenshot of a work in progress:

What it does:

  • Launchpad buttons light up when you press them
  • Launchpad buttons light up when pre-programmed drum samples are triggered (lighting a pad is just a MIDI note-on; see the sketch after this list)
  • Processing screen lights up exactly the same as what is on the Launchpad
  • Processing screen shows what the buttons are mapped to (currently hard-coded…but wouldn’t it be awesome if it was automatic?!)
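Lighting a pad on the original Launchpad is just a note-on message sent back to the device: the note number picks the pad and the velocity encodes the colour. A sketch of the maths as I understand it from Novation’s programmer’s reference (the IMidiOut interface is a placeholder for whatever MIDI library you use):

```csharp
public static class LaunchpadLights
{
    // Placeholder for your MIDI library's output API.
    public interface IMidiOut
    {
        void SendNoteOn(int channel, int note, int velocity);
    }

    // Light one pad on the Launchpad's 8x8 grid. red and green are
    // brightness levels 0-3; the velocity packs the colour bits plus the
    // copy/clear double-buffering flags (0x0C).
    public static void LightPad(IMidiOut midi, int row, int col, int red, int green)
    {
        int note = row * 16 + col;
        int velocity = (green << 4) | red | 0x0C;
        midi.SendNoteOn(0, note, velocity);
    }
}
```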

Sounds pretty simple… but it was a bit of a challenge getting all the settings right! I still have some issues to iron out, and I hope to simplify the system further.

I’m very keen to check out Open Frameworks and Max for Live as well. There’s a 30% discount on Max for Live that ends in 4 days, and I’m very, very tempted to buy it.

Ableton Live, Processing, Max for Live, Launchpad, iPad, Kinect, Open Frameworks…all wonderful tools to explore and play with. Somewhere in there is just the right combination of hardware and software that will allow me to create something awesome. Something that combines my love of music and programming…

Oh, here’s one more link. It’s a TED talk on research being done on improvisation (jazz and rap). It doesn’t really reveal much that is new, but it’s an entertaining talk and opens up some interesting questions. http://www.ted.com/talks/charles_limb_your_brain_on_improv.html