
It’s been over three years since I posted anything here, but this problem was so frustrating that I thought I would collect all the solutions in one place, in case it saves you a few hours of tearing your hair out.

It’s Easter Monday. One of my tasks today, before the end of the long weekend, was to create a soundscape for my friend Adam’s short animation. Sounds easy enough, but I wasted hours trying to get Ableton to load and export video.

I am running:

  • Windows 10 Home (1903) 64-bit
  • DirectX 12.0
  • Ableton Live 10.1.9
  • Laptop with:
    • NVIDIA GeForce GTX 1060
    • Intel Integrated Graphics 630

 

PROBLEM : Can’t load video in Ableton Live 10 using Windows 10
SOLUTION: Install Matroska Splitter and FFDShow.

When I tried to drag a video into the timeline in Ableton, it would crash or show error messages like “The file could not be read. It may be corrupt or not licensed”.

See: https://help.ableton.com/hc/en-us/articles/209773125-Using-Video

The first option, CCCP, did not work for me. Neither did the K-Lite Codec Pack Mega.

Only Option 2, Matroska Splitter and FFDShow, solved this issue.

 

PROBLEM : Video not playing in Ableton Live 10 using Windows 10
SOLUTION: Set NVIDIA to Integrated Graphics

Now that I could load the video, it wouldn’t play in Live. Instead, I just got a black screen in the video player.

In my case, my laptop has both GPUs listed above: the NVIDIA GeForce GTX 1060 and the Intel Integrated Graphics 630.

See: https://forum.ableton.com/viewtopic.php?t=232488

Open the NVIDIA Control Panel and select “Manage 3D Settings”.

Click on the “Program Settings” tab and add Live to the list.

Set the preferred graphics processor to “Integrated graphics”.

 

PROBLEM : Can’t export video from Ableton Live 10 using Windows 10
SOLUTION: Set the Export PCM Bit Depth to 16

The final problem was that I couldn’t export the video. Live would export the WAV and then just crash.

See: https://forum.ableton.com/viewtopic.php?t=231960

  1. Click on File > Export Audio/Video
  2. Set the PCM > Bit Depth to 16
  3. Select the Video Encoder Settings as you wish
  4. Click Export
  5. Voila!

 

And that’s it. I hope this consolidated post is helpful to you. It seems like a lot of people have issues with Live and video.

 

Cheers,

Your friendly neighbourhood Ben X Tan.

Here are some experiments in VR that I made after watching the Doctor Strange movie.

Visuals created using Unity, sounds created using Ableton Live.







Ponderings…

We are all used to text search. Google, Bing, Yahoo. In VR, you don’t have a keyboard.

We sometimes use voice search. Siri, Google, Cortana. Voice search is used in the Samsung Internet VR app to search the web.

With VR, will there be new ways to search?

In Steam, you can scroll through lists of VR apps with thumbnail images, and you can point at and select them with hand controllers. Other VR apps use gaze as an input mechanism.

What if you wanted to search the web where there are millions of results? What is the best way to represent this in a 3D space where you can have hand controllers, gaze, voice, gestures and eye tracking?

Google has 2D image search to find similar images. Will there be a 3D image search to find similar 3D images?

What if I wanted to do a meta-search across multiple platforms? AltspaceVR, Second Life, High Fidelity, vTime, etc. What if I wanted to search for a location, a person, an object, a colour or an event across these platforms? Will there be new meta-apps that sit on top of other apps to enable this? VR search engines, VR crawlers, vrrobots.txt, VR cookies, etc.

Incognito mode, where you can traverse the Metaverse discreetly?

Will there eventually just be one Metaverse, just as there is only one Internet?

So many possibilities…

 

 

Skyfall v0.1

Posted: March 6, 2016 in C#, Hardware, Software, Unity 3D, Virtual Reality

Insight

What if you could make things appear in front of you with the power of your voice?

Idea

Skyfall

Execution

I had an idea of making a virtual reality (VR) demo where you can say things and make them fall from the sky.

For the initial demo, the only objects that you can magically summon are:

  • a cube
  • a white car – Ferrari
  • a red car – Dodge Viper
  • a house
  • Deadpool
  • a Star Destroyer

The “car” command randomly picks one of the two cars.

You can also say the word “clear” to remove all the objects from the scene.

Additionally, you can move around the 3D space using an Xbox controller or a keyboard and mouse.

All this works within the Oculus Rift DK2.

To take this project further I can:

  • add more objects and commands
  • add Leap Motion support so you can interact with the objects using your hands

Tech Details

Problem

One of the roadblocks I hit early on was that Unity 5.3.2 does not support the .NET speech recognition DLLs, since it uses Mono.

Solution

To get around this, I created two applications and used a TCP port for them to communicate with each other.

Components

1) Server: C# .NET application

The voice recognition is being done using System.Speech.Recognition.

A GrammarBuilder object is created with the following words:

private string[] VALID_COMMANDS = { "clear", "cube", "car", "house", "deadpool", "star destroyer" };

Once a word is recognised, a message is sent to a specified port via TCP.
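Putting the server side together, a minimal sketch might look like the following. This is an illustration based on the description above, not the original source: the port number (5000) and the loopback address are assumptions, and the real server may differ.

```csharp
using System;
using System.Net.Sockets;
using System.Speech.Recognition;
using System.Text;

class VoiceServer
{
    private static readonly string[] VALID_COMMANDS =
        { "clear", "cube", "car", "house", "deadpool", "star destroyer" };

    static void Main()
    {
        // Restrict the recogniser to the valid commands only.
        var grammar = new Grammar(new GrammarBuilder(new Choices(VALID_COMMANDS)));

        using (var recognizer = new SpeechRecognitionEngine())
        {
            recognizer.LoadGrammar(grammar);
            recognizer.SetInputToDefaultAudioDevice();

            // Forward each recognised word to the Unity client over TCP.
            // Port 5000 is an assumption for this sketch.
            recognizer.SpeechRecognized += (sender, e) =>
            {
                using (var client = new TcpClient("127.0.0.1", 5000))
                {
                    byte[] payload = Encoding.UTF8.GetBytes(e.Result.Text);
                    client.GetStream().Write(payload, 0, payload.Length);
                }
            };

            recognizer.RecognizeAsync(RecognizeMode.Multiple);
            Console.ReadLine(); // keep the server alive until Enter is pressed
        }
    }
}
```

Using a constrained grammar (rather than free dictation) is what makes recognition of a handful of commands reliable.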

2) Client: Unity application

On the client side, there is a TCP listener running on a thread that listens for TCP messages.

If a word is received (e.g. “cube”), the model named “Cube” is cloned and added to the scene in front of and above where the user is looking.

First-person controls and VR support were also added to make the experience more immersive.
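A minimal sketch of the client side, written as a Unity MonoBehaviour, might look like this. The port, buffer size, and the word-to-template naming convention are assumptions for illustration. One detail worth noting: Unity objects can only be created on the main thread, so the network thread just hands the word over and Update() does the spawning.

```csharp
using System.Net;
using System.Net.Sockets;
using System.Text;
using System.Threading;
using UnityEngine;

public class CommandListener : MonoBehaviour
{
    private TcpListener listener;
    private volatile string pendingCommand; // handed from the network thread to the main thread

    void Start()
    {
        listener = new TcpListener(IPAddress.Loopback, 5000); // port is an assumption
        listener.Start();
        new Thread(Listen) { IsBackground = true }.Start();
    }

    private void Listen()
    {
        var buffer = new byte[1024];
        while (true)
        {
            using (var client = listener.AcceptTcpClient())
            {
                int read = client.GetStream().Read(buffer, 0, buffer.Length);
                pendingCommand = Encoding.UTF8.GetString(buffer, 0, read);
            }
        }
    }

    void Update()
    {
        // Spawning must happen here, on Unity's main thread.
        if (pendingCommand != null)
        {
            string word = pendingCommand;
            pendingCommand = null;

            // Naive mapping, e.g. "cube" -> "Cube"; multi-word commands
            // like "star destroyer" would need an explicit lookup table.
            var template = GameObject.Find(char.ToUpper(word[0]) + word.Substring(1));
            if (template != null)
            {
                Vector3 spawnPos = Camera.main.transform.position
                                 + Camera.main.transform.forward * 3f
                                 + Vector3.up * 5f;
                Instantiate(template, spawnPos, Quaternion.identity);
            }
        }
    }

    void OnDestroy()
    {
        listener.Stop();
    }
}
```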

 

On Friday the 26th of June 2015, a collaborative VR artwork by Vaughan O’Connor and myself was exhibited at the MCA. The two artworks, a holographic print and a Virtual Reality (VR) experience, were both based on 3D scans of quartzite rocks from Oberon, NSW.

The artworks were on display at the Museum of Contemporary Art (MCA) ARTBAR – Futures, curated by Dara Gill. Facebook event link here.

I created the VR component of the artwork based on the scans that Vaughan supplied.

Inspiration

Inspiration came from lyrics to a song by Covenant called Bullet.

“As the water grinds the stone,
we rise and fall.
As our ashes turn to dust,
we shine like stars.”

Idea

The idea behind the artwork is that it is a place outside of time. Here, concepts such as gravity, space and time do not behave as you would expect. Tempus Incognito. The events that occur here, in the event horizon, cannot affect an outside observer. The four classical elements, fire, air, water and earth, are present in this place, reflecting the essential building blocks of all things. Even though we are outside of time, matter still exists, albeit in a rather astonishing manner.

Technology

The Quartzite VR artwork was created using Unity (a game engine) and experienced via the Oculus Rift DK2 (a virtual reality headset). The ambient soundscape was composed in Ableton Live and is binaural.

Photos
[Photo gallery: MCA ARTBAR ‘Futures’, photos 01–11]

Some photos by @mightandwonder

 

Observations

A few interesting things that happened on the night:

  • one of the computers decided to misbehave (no audio, no projector)
  • someone decided to walk about 2 metres away from the desk, even though the Oculus is plugged into a computer!
  • 2 people decided to look under the desk
  • a lot of people stood up to experience the artwork rather than sit down in the seats provided
  • people were amazed at the Oculus Rift
  • people found the experience soothing, calming and meditative
  • some people wanted to stay inside the artwork forever!
  • people asked a lot of questions about the technical aspects of the artwork

More

Futures also featured performances, lectures and music by: Vaughan O’Connor & Ben X Tan, Michaela Gleave, Josh Harle, 110%, Mark Brown, Eddie Sharp & Mark Pesce, Andrew Frost, Claire Finneran & Alex Kiers, Kate MacDonald, Sitting & Smiling, Baden Pailthorpe, Hubert Clarke Jr, Polish Club and Ryan Saez.






Made this guy today using 123D Creature on iPad. Ordered a 3D printed version of him too. Can’t wait to get it!

2013-12-02 Rahrah Bazaar

Image  —  Posted: December 2, 2013 in 3D, Visual Art

Here’s a game I made for the Oculus Rift with my colleagues at DAN.

http://thehungergame.com.au/

Video  —  Posted: September 19, 2013 in Ableton Live, C#, Electronic, Game, Music, Software, Unity 3D

Cute Redhead

Posted: September 19, 2013 in Pen, Visual Art

Black & White
Colour