Icarus

Posted: May 9, 2021 in Uncategorized
Here is a new track I made over the weekend (8-9 May 2021).

I wanted to make a hardstyle track with guitar sections. Something loud, fast, aggressive and heavy with lots of distortion. I’ve also been watching sci-fi horror space movies recently (Life, Sunshine, Event Horizon) so the track is subliminally influenced by that.

Hope you like it!

tl;dr: I did lots of steps and connected lots of pipes together to control audio and visuals in real time with a MIDI controller + Ableton + Unity + MIDI-OX (this was the secret sauce).

Intro

(FYI: This is a really, really long explanation of my process, without the project’s code or links to download it. It’s mostly a record for myself, but maybe there is something here that will help you!)

I’ve been doing Houdini tutorials and was inspired by the complex shapes I could make by combining simple functions like Subdivision, Taper, Twist & Copy To Points.

2021-03-28 – Clawed Ball, Houdini

From this, I wanted to manipulate the object in real time with a MIDI controller and decided to re-create the Houdini functions in Unity. The tweet below is the result:

There are 2 things I’d like to talk about here:

  1. Tendril Ball – The visuals
  2. MIDI Control – Using a MIDI controller to manipulate Ableton Live (audio) & Unity (visual) simultaneously

Tendril Ball

It’s a ball made of tendrils, hence the name.

First, I created a cube mesh, following this guide: http://ilkinulas.github.io/development/unity/2016/04/30/cube-mesh-in-unity3d.html

Then I added a parameter to add subdivisions to the cube along the Y-Axis.

Cube mesh with subdivisions.
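To illustrate the idea (a simplified sketch of my own, not the project’s code): the side vertices can be generated layer by layer, with a subdivisions parameter controlling how many extra layers get inserted along Y. Triangle generation follows the tutorial linked above and is omitted here.

using System.Collections.Generic;
using UnityEngine;

public static class SubdividedCube
{
    // Generate the side vertices of a unit cube, with extra "layers"
    // inserted along the Y axis.
    public static List<Vector3> SideVertices(int subdivisions)
    {
        // XZ corners of the cube's square cross-section.
        var corners = new[]
        {
            new Vector3(-0.5f, 0f, -0.5f), new Vector3( 0.5f, 0f, -0.5f),
            new Vector3( 0.5f, 0f,  0.5f), new Vector3(-0.5f, 0f,  0.5f),
        };

        var vertices = new List<Vector3>();
        int layers = subdivisions + 2; // bottom layer + top layer + inserted layers
        for (int i = 0; i < layers; i++)
        {
            float y = (float)i / (layers - 1); // 0 at the bottom, 1 at the top
            foreach (var c in corners)
                vertices.Add(new Vector3(c.x, y, c.z));
        }
        return vertices;
    }
}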

Then I added a Twist parameter. Each “layer”, starting from the bottom, is rotated clockwise.

Next, I added a Taper parameter. Each “layer”, starting from the bottom, gets smaller until the size is 0 at the top.

I now have a single “tendril”.

Tendril: Twisted and tapered.
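Each vertex transformation might look roughly like this (an illustrative sketch; the parameter names are my own stand-ins):

using UnityEngine;

public static class TendrilDeform
{
    // Twist and taper a vertex based on its normalised height t
    // (0 at the bottom of the tendril, 1 at the top).
    public static Vector3 TwistAndTaper(Vector3 v, float height, float twistDegrees)
    {
        float t = Mathf.Clamp01(v.y / height);

        // Taper: layers shrink linearly, collapsing to a point at the top.
        float scale = 1f - t;

        // Twist: in Unity, a positive rotation about Y reads as clockwise
        // when viewed from above.
        var rotation = Quaternion.Euler(0f, twistDegrees * t, 0f);

        var xz = rotation * new Vector3(v.x * scale, 0f, v.z * scale);
        return new Vector3(xz.x, v.y, xz.z);
    }
}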

Then I needed to find a sphere without too many vertices on it. I ended up using a “cube sphere” from https://catlikecoding.com/unity/tutorials/cube-sphere/ with a Grid Size of 2. This resulted in a sphere with 26 vertices.

Emulating Houdini’s Copy to Points node, I instantiated one tendril per vertex on the sphere, resulting in the Tendril Ball below.

Tendril ball: Copy to points

I then exposed 3 parameters:

  1. Distance
  2. Length
  3. Twist Amount
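Put together in Unity, the Copy to Points step and those parameters look roughly like this (an illustrative sketch; tendrilPrefab, sphereMesh and the field names are stand-ins for the real thing):

using UnityEngine;

public class TendrilBall : MonoBehaviour
{
    public GameObject tendrilPrefab; // the twisted, tapered tendril
    public Mesh sphereMesh;          // the 26-vertex "cube sphere"
    public float distance = 0f;      // exposed parameter 1
    public float length = 1f;        // exposed parameter 2 (fed to the tendril mesh, not shown)
    public float twistAmount = 90f;  // exposed parameter 3 (fed to the tendril mesh, not shown)

    void Start()
    {
        foreach (var vertex in sphereMesh.vertices)
        {
            // For a sphere centred on the origin, the vertex direction
            // doubles as the outward normal.
            var direction = vertex.normalized;
            var position = transform.TransformPoint(vertex + direction * distance);

            // Align the tendril's Y axis (its length) with the normal,
            // as Houdini's Copy to Points would.
            var rotation = Quaternion.FromToRotation(Vector3.up, direction);
            Instantiate(tendrilPrefab, position, rotation, transform);
        }
    }
}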

MIDI Control

For many years now I’ve been trying to find a good solution for sending MIDI from a controller to Ableton Live and Unity simultaneously. I finally found the missing piece of the puzzle, and I’d actually been on their website multiple times without realising the answer was there all along! This is probably the main reason I wanted to write this blog post.

The answer is MIDI-OX (http://www.midiox.com/). It lets me take a signal from my MIDI controller (Behringer X-Touch Mini) and split/send it to 2 virtual MIDI ports. I am on Windows 10 and I use loopBe30 for virtual MIDI ports.

MIDI-OX

In Ableton, I receive MIDI input from the virtual port “01. Internal MIDI”. I loaded a Wavetable synth and mapped the macros to the first 3 knobs on the X-Touch Mini.

In Unity, I receive MIDI input from the virtual port “02. Internal MIDI”. I used Minis by keijiro (https://twitter.com/_kzr) to receive MIDI input. I mapped the first 3 knobs on the X-Touch Mini to control the 3 values on my Tendril Ball (Distance, Length & Twist Amount).

Behringer X-Touch Mini
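Receiving those knobs in Unity via Minis looks roughly like this (an illustrative sketch based on Minis’ device callbacks; the CC numbers and the TendrilBall fields are assumptions, so check the Minis README for the exact API):

using UnityEngine;
using UnityEngine.InputSystem;

public class MidiKnobInput : MonoBehaviour
{
    public TendrilBall ball; // the component from the Tendril Ball sketch above

    void OnEnable()  { InputSystem.onDeviceChange += OnDeviceChange; }
    void OnDisable() { InputSystem.onDeviceChange -= OnDeviceChange; }

    void OnDeviceChange(InputDevice device, InputDeviceChange change)
    {
        if (change != InputDeviceChange.Added) return;
        var midi = device as Minis.MidiDevice;
        if (midi == null) return;

        // Minis normalises CC values from 0-127 down to 0..1.
        midi.onWillControlChange += (control, value) =>
        {
            switch (control.controlNumber) // CC numbers depend on the X-Touch Mini preset
            {
                case 1: ball.distance = value; break;
                case 2: ball.length = value; break;
                case 3: ball.twistAmount = value; break;
            }
        };
    }
}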

And that’s it!

Sorry, I’m not sharing the actual code because this is part of a long, ongoing project of mine, and this is just one small piece of a much larger execution. I don’t really expect anyone to read all this anyway!

The next problem to solve is using the Push 2. Its knobs are endless/relative encoders, so turning a knob doesn’t send an absolute value from 0-127. Instead, it sends a value saying whether the knob increased (< 0.5) or decreased (> 0.5). I need to do more research into two’s complement. There’s also a second issue: when I turn a knob in one movement, Unity (Minis) only receives one event, even though I can see many events being triggered in MIDI-OX and in the Unity Input Debugger. So if I turn a knob from 0 to max in one go, Unity only increments the value by a small amount.
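The two’s complement part itself is simple to decode (a sketch, assuming the common relative mode where small values mean increments and values from 64 up wrap around to negative deltas):

using UnityEngine;

public static class RelativeEncoder
{
    // value01 is the normalised 0..1 CC value as delivered by Minis (raw 0-127).
    // In two's-complement relative mode, raw 1..63 encodes +1..+63 (clockwise)
    // and raw 65..127 encodes -63..-1 (anticlockwise).
    public static int Delta(float value01)
    {
        int raw = Mathf.RoundToInt(value01 * 127f);
        return raw < 64 ? raw : raw - 128;
    }
}

Accumulating Delta() per event would give an absolute knob value, but that only works once every event actually arrives, so the dropped-events problem above still needs solving first.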


It’s been over 3 years since I posted anything here, but this was so frustrating that I thought I’d post all the solutions in one place, in case it saves you a few hours of tearing your hair out.

It’s Easter Monday. One of my tasks today, before the end of the long weekend, was to create a soundscape for my friend Adam’s short animation. Sounds easy enough, but I wasted hours trying to get Ableton to load and export video.

I am running:

  • Windows 10 Home (1903) 64-bit
  • DirectX 12.0
  • Ableton Live 10.1.9
  • Laptop with:
    • NVIDIA Geforce GTX 1060
    • Intel Integrated Graphics 630

 

PROBLEM : Can’t load video in Ableton Live 10 using Windows 10
SOLUTION: Install Matroska Splitter and FFDShow.

When I tried to drag a video into the timeline in Ableton, it would crash or show error messages like “The file could not be read. It may be corrupt or not licensed”.

See: https://help.ableton.com/hc/en-us/articles/209773125-Using-Video

The first option, CCCP, did not work for me. Neither did K-Lite Codec Pack Mega.

Only Option 2, Matroska Splitter and FFDShow, solved this issue.

 

PROBLEM : Video not playing in Ableton Live 10 using Windows 10
SOLUTION: Set NVIDIA to Integrated Graphics

Now that I could load the video, it wouldn’t play in Live. Instead I would just get a black screen in the video player.

In my case, I have a laptop with:

  • NVIDIA Geforce GTX 1060
  • Intel Integrated Graphics 630

See: https://forum.ableton.com/viewtopic.php?t=232488

Open the NVIDIA Control Panel and select “Manage 3D Settings”.

Click on Program settings, and add Live into the list.

Set the preferred graphics processor to “Integrated graphics”.

 

PROBLEM : Can’t export video from Ableton Live 10 using Windows 10
SOLUTION: Set the Export PCM Bit Depth to 16

The final problem was that I couldn’t export the video. Live would export the WAV and then just crash.

See: https://forum.ableton.com/viewtopic.php?t=231960

  1. Click on File > Export Audio/Video
  2. Set the PCM > Bit Depth to 16
  3. Select the Video Encoder Settings as you wish
  4. Click Export
  5. Voila!

 

And that’s it. I hope this consolidated post is helpful to you. It seems like a lot of people have issues with Live and video.

 

Cheers,

Your friendly neighbourhood Ben X Tan.

Here are some experiments in VR that I made after watching the Doctor Strange movie.

Visuals created using Unity, sounds created using Ableton Live.







Ponderings…

We are all used to text search. Google, Bing, Yahoo. In VR, you don’t have a keyboard.

We sometimes use voice search. Siri, Google, Cortana. Voice search is used in the Samsung Internet VR app to search the web.

With VR, will there be new ways to search?

In Steam, you can scroll through lists of VR apps with thumbnail images, and you can point at and select them with hand controllers. Other VR apps use gaze as an input mechanism.

What if you wanted to search the web where there are millions of results? What is the best way to represent this in a 3D space where you can have hand controllers, gaze, voice, gestures and eye tracking?

Google has 2D image search to find similar images. Will there be a 3D image search to find similar 3D images?

What if I wanted to do a meta-search across multiple platforms? AltspaceVR, Second Life, High Fidelity, vTime, etc. What if I wanted to search for a location, a person, an object, a colour or an event across these platforms? Will there be new meta-apps that sit on top of other apps to enable this? VR search engines, VR crawlers, vrrobots.txt, VR cookies, etc.

Incognito mode, where you can traverse the Metaverse discreetly?

Will there eventually just be one Metaverse, just as there is only one Internet?

So many possibilities…

 

 

Skyfall v0.1

Posted: March 6, 2016 in C#, Hardware, Software, Unity 3D, Virtual Reality

Insight

What if you could make things appear in front of you with the power of your voice?

Idea

Skyfall

Execution

I had an idea of making a virtual reality (VR) demo where you can say things and make them fall from the sky.

For the initial demo, the only objects that you can magically summon are:

  • a cube
  • a white car – Ferrari
  • a red car – Dodge Viper
  • a house
  • Deadpool
  • a Star Destroyer

The “car” command randomly picks one of the 2 cars.

You can also say the word “clear” to remove all the objects from the scene.

Additionally, you can move around the 3D space using an Xbox controller or keyboard and mouse.

All this works within the Oculus Rift DK2.

To take this project further I can:

  • add more objects and commands
  • add Leap Motion support so you can interact with the objects using your hands

Tech Details

Problem

One of the roadblocks I hit early on was that Unity 5.3.2 does not support the .NET speech recognition DLLs, since it uses Mono.

Solution

In order to get around this, I created 2 applications and used a TCP port for them to communicate with each other.

Components

1) Server: C# .NET application

The voice recognition is done using System.Speech.Recognition.

A GrammarBuilder object is created with the following words:

private string[] VALID_COMMANDS = { "clear", "cube", "car", "house", "deadpool", "star destroyer" };

Once a word is recognised, a message is sent to a specified port via TCP.
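A minimal sketch of that server (port 9000 is a placeholder; the real port just has to match the Unity client):

using System;
using System.Net.Sockets;
using System.Speech.Recognition;
using System.Text;

class VoiceServer
{
    static readonly string[] VALID_COMMANDS =
        { "clear", "cube", "car", "house", "deadpool", "star destroyer" };

    static void Main()
    {
        using (var recognizer = new SpeechRecognitionEngine())
        {
            // Constrain recognition to the fixed vocabulary.
            var grammar = new Grammar(new GrammarBuilder(new Choices(VALID_COMMANDS)));
            recognizer.LoadGrammar(grammar);
            recognizer.SetInputToDefaultAudioDevice();
            recognizer.SpeechRecognized += (s, e) => Send(e.Result.Text);
            recognizer.RecognizeAsync(RecognizeMode.Multiple);
            Console.ReadLine(); // keep the process alive while recognising
        }
    }

    static void Send(string word)
    {
        // One short-lived connection per recognised word keeps things simple.
        using (var client = new TcpClient("127.0.0.1", 9000))
        {
            var bytes = Encoding.UTF8.GetBytes(word);
            client.GetStream().Write(bytes, 0, bytes.Length);
        }
    }
}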

2) Client: Unity application

On the client side, there is a TCP listener running on a thread that listens for TCP messages.

If a word is received (e.g. “cube”), the model named “Cube” is cloned and added to the scene in front of and above where the user is looking.
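Sketched out, the client side looks something like this (simplified; the prefab lookup and spawn placement are stand-ins, and port 9000 matches the server sketch above). The key detail is that Unity objects can only be touched from the main thread, so the listener thread just queues words for Update() to consume:

using System.Collections.Generic;
using System.Net;
using System.Net.Sockets;
using System.Text;
using System.Threading;
using UnityEngine;

public class CommandListener : MonoBehaviour
{
    readonly Queue<string> pending = new Queue<string>();
    TcpListener listener;

    void Start()
    {
        listener = new TcpListener(IPAddress.Loopback, 9000);
        listener.Start();
        new Thread(Listen) { IsBackground = true }.Start();
    }

    void Listen()
    {
        while (true)
        {
            using (var client = listener.AcceptTcpClient())
            {
                var buffer = new byte[256];
                int read = client.GetStream().Read(buffer, 0, buffer.Length);
                if (read <= 0) continue;
                lock (pending)
                    pending.Enqueue(Encoding.UTF8.GetString(buffer, 0, read));
            }
        }
    }

    void Update()
    {
        // Dequeue on the main thread, where Instantiate is allowed.
        lock (pending)
        {
            while (pending.Count > 0)
                Spawn(pending.Dequeue());
        }
    }

    void Spawn(string word)
    {
        if (word == "clear") { /* destroy previously spawned objects */ return; }

        // Hypothetical lookup: a prefab named after the command in Resources/.
        var prefab = Resources.Load<GameObject>(word);
        if (prefab == null) return;

        var cam = Camera.main.transform;
        Instantiate(prefab, cam.position + cam.forward * 3f + Vector3.up * 5f,
                    Quaternion.identity);
    }

    void OnDestroy()
    {
        if (listener != null) listener.Stop();
    }
}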

First person controls and VR support were also added to make the experience more immersive.

 

On Friday the 26th of June 2015, a collaborative VR artwork by Vaughan O’Connor and myself was exhibited at the MCA. The 2 artworks, a holographic print and a Virtual Reality (VR) experience, were both based on 3D scans of quartzite rocks from Oberon, NSW.

The artworks were on display at the Museum of Contemporary Art (MCA) ARTBAR – Futures, curated by Dara Gill. Facebook event link here.

I created the VR component of the artwork based on the scans that Vaughan supplied.

Inspiration

Inspiration came from lyrics to a song by Covenant called Bullet.

“As the water grinds the stone,
we rise and fall.
As our ashes turn to dust,
we shine like stars.”

Idea

The idea behind the artwork is that it is a place outside of time. Here, concepts such as gravity, space and time do not behave as you would expect. Tempus Incognito. The events that occur here, in the event horizon, cannot affect an outside observer. The 4 classical elements, fire, air, water and earth, are present in this place, reflecting the essential building blocks for all things. Even though we are outside of time, matter still exists, albeit in a rather astonishing manner.

Technology

The Quartzite VR artwork was created using Unity (a game engine) and experienced via the Oculus Rift DK2 (a virtual reality headset). The ambient soundscape was composed in Ableton Live and is binaural.

Photos
Photo gallery: MCA ARTBAR 'Futures' (11 photos).

Some photos by @mightandwonder

 

Observations

A few interesting things that happened on the night:

  • one of the computers decided to misbehave (no audio, no projector)
  • someone decided to walk about 2 metres away from the desk. The Oculus is plugged into a computer!
  • 2 people decided to look under the desk
  • a lot of people stood up to experience the artwork rather than sit down in the seats provided
  • people were amazed at the Oculus Rift
  • people found the experience soothing, calming and meditative
  • some people wanted to stay inside the artwork forever!
  • people asked a lot of questions about the technical aspects of the artwork

More

Futures also featured performances, lectures and music by: Vaughan O’Connor & Ben X Tan, Michaela Gleave, Josh Harle, 110%, Mark Brown, Eddie Sharp & Mark Pesce, Andrew Frost, Claire Finneran & Alex Kiers, Kate MacDonald, Sitting & Smiling, Baden Pailthorpe, Hubert Clarke Jr, Polish Club and Ryan Saez.






Made this guy today using 123D Creature on iPad. Ordered a 3D printed version of him too. Can’t wait to get it!

2013-12-02 Rahrah Bazaar

Image  —  Posted: December 2, 2013 in 3D, Visual Art