Archive for the ‘C#’ Category

tl;dr: I did lots of steps and connected lots of pipes together to control audio and visuals in real time with a MIDI controller + Ableton + Unity + MIDI-OX (this was the secret sauce).

Intro

(FYI: This is a really, really long explanation of my process with no code and no links to download it. It’s mostly a record for myself, but maybe there is something here that will help you!)

I’ve been doing Houdini tutorials and was inspired by complex shapes I could make through combining simple functions like Subdivision, Taper, Twist & Copy To Points.

2021-03-28 – Clawed Ball, Houdini

From this, I wanted to manipulate the object in real time with a MIDI controller and decided to re-create the Houdini functions in Unity. The tweet below is the result:

There are 2 things I’d like to talk about here:

  1. Tendril Ball – The visuals
  2. MIDI Control – Using a MIDI controller to manipulate Ableton Live (audio) & Unity (visual) simultaneously

Tendril Ball

It’s a ball made of tendrils, hence the name.

First, I created a cube mesh.
I followed this guide: http://ilkinulas.github.io/development/unity/2016/04/30/cube-mesh-in-unity3d.html

Then I added a parameter to add subdivisions to the cube along the Y-Axis.

Cube mesh with subdivisions.
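
To give a rough idea, a minimal sketch of the Y-axis subdivision could look something like the code below. It only builds the four side walls (no top or bottom caps), and the class name and layout are placeholders, not the actual project code.

using UnityEngine;

// Hedged sketch: builds the four side walls of a unit-height box, split into
// `subdivisions` horizontal layers along the Y axis (caps omitted).
[RequireComponent(typeof(MeshFilter))]
public class SubdividedCube : MonoBehaviour
{
    public int subdivisions = 4;

    void Start()
    {
        GetComponent<MeshFilter>().mesh = Build(subdivisions);
    }

    static Mesh Build(int subdivisions)
    {
        // Square cross-section corners in the XZ plane.
        Vector3[] corners =
        {
            new Vector3(-0.5f, 0, -0.5f),
            new Vector3( 0.5f, 0, -0.5f),
            new Vector3( 0.5f, 0,  0.5f),
            new Vector3(-0.5f, 0,  0.5f),
        };

        int layers = subdivisions + 1;
        var vertices = new Vector3[layers * 4];
        for (int layer = 0; layer < layers; layer++)
        {
            float y = (float)layer / subdivisions;   // 0 at the bottom, 1 at the top
            for (int c = 0; c < 4; c++)
                vertices[layer * 4 + c] = corners[c] + Vector3.up * y;
        }

        // Two triangles per wall quad, per gap between layers.
        var triangles = new int[subdivisions * 4 * 6];
        int t = 0;
        for (int layer = 0; layer < subdivisions; layer++)
        {
            for (int c = 0; c < 4; c++)
            {
                int a = layer * 4 + c;
                int b = layer * 4 + (c + 1) % 4;
                int a2 = a + 4;                      // same corner, one layer up
                int b2 = b + 4;
                triangles[t++] = a; triangles[t++] = a2; triangles[t++] = b;
                triangles[t++] = b; triangles[t++] = a2; triangles[t++] = b2;
            }
        }

        var mesh = new Mesh { vertices = vertices, triangles = triangles };
        mesh.RecalculateNormals();
        return mesh;
    }
}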

Then I added a Twist parameter. Each “layer” starting from the bottom is rotated clockwise.

Next, I added a Taper parameter. Each “layer” starting from the bottom gets smaller until the size is 0 at the top.

I now have a single “tendril”.

Tendril: Twisted and tapered.
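
Both deformations depend only on a vertex’s height, so a hedged sketch of the idea could look like this. It assumes the mesh runs from y = 0 at the base to y = 1 at the tip (as in the subdivision sketch above); the names and ranges are placeholders.

using UnityEngine;

// Hedged sketch: twists and tapers an existing mesh around the Y axis.
// Assumes the mesh spans y = 0 (base) to y = 1 (tip).
[RequireComponent(typeof(MeshFilter))]
public class TwistAndTaper : MonoBehaviour
{
    [Range(0f, 720f)] public float twistDegrees = 360f; // total rotation at the tip
    [Range(0f, 1f)]   public float taper = 1f;          // 1 = shrinks to a point at the tip

    Vector3[] originalVertices;
    Mesh mesh;

    void Start()
    {
        mesh = GetComponent<MeshFilter>().mesh;
        originalVertices = mesh.vertices;
    }

    void Update()
    {
        var vertices = new Vector3[originalVertices.Length];
        for (int i = 0; i < originalVertices.Length; i++)
        {
            Vector3 v = originalVertices[i];
            float t = Mathf.Clamp01(v.y);            // 0 at the base, 1 at the tip

            // Taper: shrink the XZ cross-section the higher the layer is.
            float scale = 1f - taper * t;
            Vector3 scaled = new Vector3(v.x * scale, v.y, v.z * scale);

            // Twist: rotate each layer around Y, more the higher it is.
            Quaternion rot = Quaternion.AngleAxis(twistDegrees * t, Vector3.up);
            vertices[i] = rot * scaled;
        }

        mesh.vertices = vertices;
        mesh.RecalculateNormals();
        mesh.RecalculateBounds();
    }
}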

Then I needed to find a sphere without too many vertices on it. I ended up using a “cube sphere” from https://catlikecoding.com/unity/tutorials/cube-sphere/ with a Grid Size of 2. This resulted in a sphere with 26 vertices.

Emulating Houdini’s Copy to Points node, I instantiated one tendril per vertex on the sphere, resulting in the Tendril Ball below.

Tendril ball: Copy to points
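
A rough sketch of that Copy To Points step, assuming the tendril prefab grows along its local Y axis (again, just an illustration, not the project code):

using UnityEngine;

// Hedged sketch of a "Copy to Points" equivalent: place one tendril prefab on
// every vertex of the low-poly sphere, pointing away from its surface.
public class CopyToPoints : MonoBehaviour
{
    public MeshFilter pointSource;   // the "cube sphere" with 26 vertices
    public GameObject tendrilPrefab; // the twisted, tapered tendril

    void Start()
    {
        Vector3[] points = pointSource.sharedMesh.vertices;
        Vector3[] normals = pointSource.sharedMesh.normals;

        for (int i = 0; i < points.Length; i++)
        {
            Vector3 worldPos = pointSource.transform.TransformPoint(points[i]);
            Vector3 worldNormal = pointSource.transform.TransformDirection(normals[i]);

            // Orient the tendril so its local +Y (base-to-tip) follows the vertex normal.
            Quaternion rotation = Quaternion.FromToRotation(Vector3.up, worldNormal);
            Instantiate(tendrilPrefab, worldPos, rotation, transform);
        }
    }
}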

I then exposed 3 parameters:

  1. Distance
  2. Length
  3. Twist Amount

MIDI Control

For many years now I’ve been trying to find a good solution for sending MIDI from a controller to Ableton Live and Unity simultaneously. I finally found the missing piece of the puzzle; I had actually been on their website multiple times without realising the answer was there all along! This is probably the main reason I wanted to write this blog post.

The answer is MIDI-OX (http://www.midiox.com/). It lets me take a signal from my MIDI controller (Behringer X-Touch Mini) and split/send it to 2 virtual MIDI ports. I am on Windows 10 and I use loopBe30 for virtual MIDI ports.

MIDI-OX

In Ableton, I receive MIDI input from the virtual port “01. Internal MIDI”. I loaded a Wavetable synth and mapped the macros to the first 3 knobs on the X-Touch Mini.

In Unity, I receive MIDI input from the virtual port “02. Internal MIDI”, using Minis by keijiro (https://twitter.com/_kzr). I mapped the first 3 knobs on the X-Touch Mini to the 3 values on my Tendril Ball (Distance, Length & Twist Amount).

Behringer X-Touch Mini
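
For illustration, the Unity side of that mapping could look roughly like the sketch below. The onWillControlChange callback reflects my understanding of the Minis API, and the CC numbers 1–3 are assumptions; check the Minis README and your controller’s mapping.

using UnityEngine;
using UnityEngine.InputSystem;

// Hedged sketch: route MIDI CC values from Minis into the three Tendril Ball
// parameters. CC numbers 1-3 are assumptions for the first three knobs.
public class TendrilBallMidiInput : MonoBehaviour
{
    // In the real project these would drive Distance, Length & Twist Amount.
    public float distance;
    public float length;
    public float twistAmount;

    void OnEnable()
    {
        InputSystem.onDeviceChange += OnDeviceChange;
    }

    void OnDisable()
    {
        InputSystem.onDeviceChange -= OnDeviceChange;
    }

    void OnDeviceChange(InputDevice device, InputDeviceChange change)
    {
        if (change != InputDeviceChange.Added) return;

        var midiDevice = device as Minis.MidiDevice;
        if (midiDevice == null) return;

        // Minis normalises CC values to the 0..1 range.
        midiDevice.onWillControlChange += (control, value) =>
        {
            switch (control.controlNumber)
            {
                case 1: distance = value; break;
                case 2: length = value; break;
                case 3: twistAmount = value; break;
            }
        };
    }
}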

And that’s it!

Sorry, I’m not sharing any code because this is part of a long, ongoing project of mine, and this is just one small piece of a much larger execution. I don’t really expect anyone to read all this anyway!

The next problem to solve is using the Push 2. Its knobs are endless/relative encoders. Turning a knob doesn’t send an absolute value from 0-127; instead it sends a value saying whether it increased (< 0.5) or decreased (> 0.5). I need to do more research into two’s complement. Also, when I turn a knob in one continuous movement, Unity (Minis) only receives one event even though I can see many events being triggered in MIDI-OX and in the Unity Input Debugger. So if I turn a knob from 0 to max in one go, Unity only increments the value by a small amount.
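
As a rough starting point, decoding the two’s-complement part could look something like the sketch below. This is an assumption about how the relative values are reported (untested), and it does not solve the missing-events issue.

using UnityEngine;

// Hedged sketch: decode a relative (two's-complement) encoder such as the
// Push 2 knobs. Assumes raw 7-bit CC values where 1..63 means "increment by n"
// and 65..127 means "decrement by 128 - n" (the byte read as a signed number).
public class RelativeEncoder : MonoBehaviour
{
    [Range(0f, 1f)] public float knobValue; // accumulated absolute value

    // Convert the 7-bit CC value into a signed step in the range -64..63.
    static int DecodeRelative(int rawCcValue)
    {
        return rawCcValue < 64 ? rawCcValue : rawCcValue - 128;
    }

    // Call this for every CC event. Minis reports values normalised to 0..1,
    // so multiply by 127 to recover the raw byte first.
    public void OnControlChange(float normalisedValue)
    {
        int raw = Mathf.RoundToInt(normalisedValue * 127f);
        int delta = DecodeRelative(raw);
        knobValue = Mathf.Clamp01(knobValue + delta / 127f);
    }
}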

Skyfall v0.1

Posted: March 6, 2016 in C#, Hardware, Software, Unity 3D, Virtual Reality

Insight

What if you could make things appear in front of you with the power of your voice?

Idea

Skyfall

Execution

I had an idea of making a virtual reality (VR) demo where you can say things and make them fall from the sky.

For the initial demo, the only objects that you can magically summon are:

  • a cube
  • a white car – Ferrari
  • a red car – Dodge Viper
  • a house
  • Deadpool
  • a Star Destroyer

The “car” command randomly picks one of the 2 cars.

You can also say the word “clear” to remove all the objects from the scene.

Additionally, you can also move around the 3D space using an XBox controller or keyboard and mouse.

All this works within the Oculus Rift DK2.

To take this project further I can:

  • add more objects and commands
  • add Leap Motion support so you can interact with the objects using your hands

Tech Details

Problem

One of the roadblocks I hit early on was that Unity 5.3.2 does not support the .NET speech recognition DLLs, since it uses Mono.

Solution

In order to get around this, I created 2 applications and used a TCP port for them to communicate with each other.

Components

1) Server: C# .NET application

The voice recognition is being done using System.Speech.Recognition.

A GrammarBuilder object is created with the following words.

private string[] VALID_COMMANDS = { "clear", "cube", "car", "house", "deadpool", "star destroyer" };

Once a word is recognised, a message is sent to a specified port via TCP.
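
Putting those two pieces together, a hedged sketch of the server could look like this. The port number is a placeholder and must match whatever the Unity client listens on; this is an illustration, not the original project code.

using System;
using System.Net.Sockets;
using System.Speech.Recognition;
using System.Text;

// Hedged sketch: recognise a fixed set of words and push each recognised word
// to the Unity client over TCP.
class SpeechServer
{
    static readonly string[] VALID_COMMANDS =
        { "clear", "cube", "car", "house", "deadpool", "star destroyer" };

    const string Host = "127.0.0.1";
    const int Port = 5005; // hypothetical port, must match the Unity listener

    static void Main()
    {
        using (var recognizer = new SpeechRecognitionEngine())
        {
            // Restrict the grammar to the valid commands only.
            var grammar = new Grammar(new GrammarBuilder(new Choices(VALID_COMMANDS)));
            recognizer.LoadGrammar(grammar);
            recognizer.SetInputToDefaultAudioDevice();
            recognizer.SpeechRecognized += OnSpeechRecognized;
            recognizer.RecognizeAsync(RecognizeMode.Multiple);

            Console.WriteLine("Listening for commands. Press Enter to quit.");
            Console.ReadLine();
        }
    }

    static void OnSpeechRecognized(object sender, SpeechRecognizedEventArgs e)
    {
        Console.WriteLine("Recognised: " + e.Result.Text);

        // One short-lived connection per command keeps the sketch simple.
        using (var client = new TcpClient(Host, Port))
        using (var stream = client.GetStream())
        {
            byte[] bytes = Encoding.UTF8.GetBytes(e.Result.Text);
            stream.Write(bytes, 0, bytes.Length);
        }
    }
}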

2) Client: Unity application

On the client side, there is a TCP listener running on a thread that listens for TCP messages.

If a word is received (e.g. “cube”), the model named “Cube” is cloned and added to the scene in front of and above where the user is looking.
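
A rough sketch of that listener, assuming the same placeholder port as the server sketch above (the “clear” handling and spawn offsets are assumptions, and omitted details are not from the original project):

using System.Collections.Generic;
using System.Net;
using System.Net.Sockets;
using System.Text;
using System.Threading;
using UnityEngine;

// Hedged sketch: a background thread accepts TCP messages and queues them,
// and Update() spawns the matching prefab in front of the camera.
public class CommandListener : MonoBehaviour
{
    public GameObject[] prefabs;   // prefabs named "Cube", "House", etc.
    public Transform head;         // the camera / HMD transform

    private readonly Queue<string> commands = new Queue<string>();
    private TcpListener listener;
    private Thread listenThread;

    void Start()
    {
        listener = new TcpListener(IPAddress.Loopback, 5005); // must match the server port
        listener.Start();
        listenThread = new Thread(ListenLoop);
        listenThread.IsBackground = true;
        listenThread.Start();
    }

    void ListenLoop()
    {
        byte[] buffer = new byte[1024];
        while (true)
        {
            TcpClient client = listener.AcceptTcpClient();
            int read = client.GetStream().Read(buffer, 0, buffer.Length);
            client.Close();
            if (read > 0)
            {
                string word = Encoding.UTF8.GetString(buffer, 0, read);
                lock (commands) commands.Enqueue(word);   // hand over to the main thread
            }
        }
    }

    void Update()
    {
        // Unity API calls must happen on the main thread, hence the queue.
        while (true)
        {
            string word;
            lock (commands)
            {
                if (commands.Count == 0) return;
                word = commands.Dequeue();
            }

            // "clear" handling (destroying spawned objects) is omitted here.
            foreach (GameObject prefab in prefabs)
            {
                if (prefab.name.ToLowerInvariant() == word.ToLowerInvariant())
                {
                    // In front of and above where the user is looking.
                    Vector3 pos = head.position + head.forward * 5f + Vector3.up * 5f;
                    Instantiate(prefab, pos, Quaternion.identity);
                }
            }
        }
    }

    void OnDestroy()
    {
        if (listener != null) listener.Stop();
    }
}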

First person controls and VR support were also added to make the experience more immersive.

 

Here’s a game I made for the Oculus Rift with my colleagues at DAN.

http://thehungergame.com.au/

1. Woke up and checked stats. My most recent software creation, a Kinect MIDI Controller (KMIDIC), is on Engadget in the US, Japan and Germany!

2. Managed to get a ticket to one of my favourite bands, Kyuss, for next year. So happy! Tickets sold out super quick!

3. Got to work, did a bunch of work, then my band Throw Catch’s freshly mastered debut EP arrived in the mail, and it sounds amazing! Big thanks to Dylan Adams (recording and mixing) and Michael Lynch (mastering at shoehorse sound).

4. Did more work, finished work, and off I went to my first guitar lesson in 12 years…with Peter Northcote! I learnt sooo much in that one hour lesson, and have now made the commitment to practice guitar at least 10 minutes a day.

5. Started practicing as soon as I got on the train and had a random conversation with a stranger who, like me, is a guitarist and IT professional; now I know of Animals As Leaders. Fantastic band!

All in all, pretty much the BEST day ever!!!

Also…if you are a musician or a music business in Australia, check out Rockstar Hookups for your music classifieds needs!