apexvj
APEXvj 2.0 was a huge challenge for me in so many ways. Most of it is built on things I’ve learned during the last six months or so. From a technical point of view I feel like I have found new horizons to explore. And design-wise… oh well, same shiz as always :)

In this post I’m covering several topics. So if you find some of them boring just keep on scrolling down.

The Escape



Ten months ago I was in the forest picking blueberries with my wife and son when my boss called and reminded me that my vacation was over. I knew exactly what that meant. I would go to the office for another year, solve issues caused by the incompetence of some hand-waver, create mediocre crap and, most dreadfully, would not grow without yet again sacrificing all my personal time.

I looked at my son for a while and thought to myself: “I want to offer that boy the best possible start to his life. I cannot fuck up our finances. But on the other hand, when that boy grows up I want to encourage him to be brave, to fearlessly do what he feels is the right thing to do and, most of all, to follow his heart.”

Later that day I went to the office and resigned.

I joined the Finnish start-up Sofanatics. It was a job where I would be able to use all my skills, from graphics to code, and I was thrilled to start working on a mobile version of the famous online stadium. Three weeks later Sofa went bankrupt. Their promising funding discussions broke down at the last minute.

In my 13 years of work history I had never looked for a job. CEOs always called me, served dinners and beers, and whispered nice things in my ear. So this was all new to me. Determined to take a break from digital marketing, I picked several companies from fields in which I had no reputation. It turned out skepticism towards people from marketing was notable. In most cases they were also looking for a cheap one-skill man, or simply a skill that I did not have.

In the summer of 2012 I visited Assembly, like I do every year, and saw a seminar about UNO & Realtime Studio by Anders Lassen and Bent Stamnes from Outracks Technologies. I was most impressed, and because I had met Bent at some fuzzy bar in Amsterdam during FITC a couple of years earlier, I was brave enough to go say YO to them. It turned out they were looking for a pilot project to help with the development of their products. By the grace of the viking gods they accepted APEXvj.

For the next 5-6 months I developed what was to become APEXvj 2.0: learning UNO, working as a real test case for Outracks, bouncing ideas, reporting bugs and posting feature requests.

UNO & Realtime Studio


Outracks Technologies is developing two products at the moment: the UNO programming language and Realtime Studio. A quote from outracks.com: “UNO is a new GPU-powered front-end programming language. It’s the world’s first hybrid graphics engine and compiler, designed ground-up for a new generation of GPU-driven user experiences. Uno enables programming the GPU and CPU from the same language, taking painful programming plumbing and platform-specific APIs out of the equation. Uno looks and feels like C# slimmed down to its essentials to make it truly portable and independent.”

This has been tried many times before: one language, export to all platforms. However, I think UNO and Realtime Studio do several things that are important to me extremely well:

A) Native export

D) Performance

Uno contains a package called DefaultShading. It’s a basis for creating standard, modern shading effects. It isn’t mandatory to use, though; you can create everything yourself as well. But it’s very useful and I used it a lot with APEXvj. I’m not into drag & drop tools, but that sort of workflow is also possible with Realtime Studio. I’ve seen several incarnations of their Designer tool, and actually the one they are working on at the moment isn’t that bad. I could consider at least trying it. :) Gladly, one can do everything with just code as well.

Realtime Studio and UNO are currently in closed beta, and one can sign up for it at outracks.com. After using UNO & RS for a couple of projects I must say it’s a working environment I would like to spend most of my days in for now.


Making of APEXvj 2.0


I would like to show more examples of UNO code than this post contains, but because it’s not public yet I’ll keep that to a minimum. I’ll write another blog post when UNO goes public.

When I started with UNO last fall it was more than unfinished. There were bugs, and the whole language went through several major refactoring rounds. Every time I internalised some concept, it changed. It really made me stay on the ball every day. You can imagine how hard it is to create with a new language and tool that change every week. :) Most of the time I felt like I didn’t have what it takes to pull this all off. “I’m just not good enough,” said the demons in my head. And frankly, I was right. I had to learn it all.

I would like to introduce all the effects of APEXvj and explain a bit about what they actually contain. Let’s start, so typically for us Finns, with failures. Here are some screenshots of visuals I had in December.

failA
failB

As you can see, the rendering quality isn’t that nice. We’ve got sawtooth edges all over the place. The scenes are also very simple and, frankly, boring. This lack of fidelity and quality was something Anders from Outracks had to point out to me several times before I realized it. My mind was still in “Flash mode” and made me create scenes that were too simple. I didn’t realize how crazy I could go with UNO and WebGL without losing performance. All these effects ended up in the trash bin.

There are a couple of ways to create post-process effects. One is to draw a scene and manipulate the pixels afterwards. It’s the render-to-texture way we’ve used in Flash for years. This approach is quite fast, but you lose the anti-aliasing in the process, which causes sawtooth edges. There are tricks like FXAA to fix this, but Anders taught me another way.

Basically it goes like this: build two Draw methods for every draw call. In the first call, draw the scene into a small framebuffer (or texture) and manipulate that to create post-effects like blur, glow or radial. In the second call, draw the scene again in full resolution and add the pixels from the small framebuffer using clip space. This would be a pain with AGAL, but in UNO one can use the same GPU methods and properties (metaproperties) inside a class and just add the needed pixel in one line.

// DRAW FOR POST-PROCESS
public void Draw() {
        draw
        {
                PixelColor : Color;
        };
}

// SHARED METAPROPERTIES
float4 color : ColorDown + (ColorUp - ColorDown) *  Uno.Math.Min(1.0f, Uno.Math.Max(0.0f, Uv.Y));

// FINAL DRAW
public void Draw(framebuffer Buffer) {
        draw
        {
                pixel float2 Uv : ClipPosition.XY / ClipPosition.W * 0.5f + 0.5f;

                PixelColor : Color + sample(Buffer.ColorBuffer0, Uv, LinearClamp);
        };
}

This is an extremely nice approach, since one can use different settings for those calls. For example, the first one can be more optimized, have different colors, or even different vertex animation. It also allows us to create a fake DOF and pull many other funky moves.

As a result, the rendering quality is as good as it can be on a modern GPU:

success1


Batching


All the effects are made by batching vertices and indices together. Before this was made easy (automatic) in UNO, I had to write my own batcher. The idea of batching is to create a class into which one can just dump polygons, so that they can be rendered in one draw call. In other words, they are baked into large vertex and index buffers. This is obligatory to reach the full performance potential.

It’s not as easy a task as you might think. For example, in order to create buffers efficiently one must know the number of indices and vertices before creating the buffers. Also, adding polygons one by one means a lot of method calls and would take ages to initialize in large scenes. Most of APEXvj is created procedurally; only some textures, the rocks and the hand model are pre-made.

I ended up using several solutions. First of all, I created a ProviderTriangle class that holds all the primitives and models I use in APEXvj. All the models have the ability to cache their geometry, so that if a geometry has already been created it’s served from the cache. The Batcher class itself is an abstract class that receives these models and piles them together. Basically I add all the models and then call the Upload method, which generates the final buffers based on the data that was added. As a result, I have classes where I can create geometry and write shaders all in the same place.
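The batching idea itself is simple enough to sketch outside UNO. Here is a minimal Python sketch of it (class and method names are mine for illustration, not the actual APEXvj code): every added model’s indices get offset by the number of vertices already baked, so everything ends up in one shared vertex buffer and one shared index buffer.

```python
# Minimal sketch of the batching idea: dump models in, bake them into
# one vertex buffer and one index buffer, render with one draw call.

class Batcher:
    def __init__(self):
        self.vertices = []   # baked vertex positions
        self.indices = []    # baked indices into the shared vertex buffer
        self.uploaded = False

    def add(self, model_vertices, model_indices):
        # Offset the model's indices by the vertices already in the batch.
        base = len(self.vertices)
        self.vertices.extend(model_vertices)
        self.indices.extend(i + base for i in model_indices)

    def upload(self):
        # In UNO this is where the final GPU buffers would be generated.
        self.uploaded = True
        return self.vertices, self.indices

batcher = Batcher()
# Two triangles "dumped in"; after upload() they form a single batch.
batcher.add([(0, 0, 0), (1, 0, 0), (0, 1, 0)], [0, 1, 2])
batcher.add([(2, 0, 0), (3, 0, 0), (2, 1, 0)], [0, 1, 2])
vertices, indices = batcher.upload()
print(indices)  # second triangle's indices are offset: [0, 1, 2, 3, 4, 5]
```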


Effects


For example, the mushroom scene has only one draw call (plus one for the post-process, as explained earlier). There is a 10 x 10 grid of mushrooms, all with constrained random properties like size, hat angle, leg curve, radius, etc.
fx_mushroom

The shading itself is the most complex I’ve ever done. The psychedelic mushroom texture is a combination of a DiffuseMap, SpecularMap and NormalMap made from Perlin noise. I use some pretty random math to manipulate how light values are calculated on the surfaces: the vertex tangent is the normalized vertex normal transformed with the view, and the vertex binormal is the cross product of the vertex tangent and the vertex normal. Yeah, I know, pretty random, and found just by trying what happens if I do this or that. This gave the mushrooms their weird poisonous look. In addition I use a cubemap to get nice reflections and a lightmap for color variation. There’s a bug in the latest OS X Chrome update, so I had to disable the mushrooms on OS X for now :) Works perfectly on Windows, though.
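That tangent trickery can be sketched in plain Python (hand-rolled vector helpers; the 3x3 view matrix and the normal are made-up example values, not APEXvj data): the “tangent” is just the normal pushed through the view transform, and the “binormal” is its cross product with the original normal, which is not a proper TBN basis, hence the strange look.

```python
# Sketch of the "pretty random" tangent math described above.

def transform(m, v):
    # Multiply a 3x3 matrix (tuple of rows) with a 3-vector.
    return tuple(sum(m[r][c] * v[c] for c in range(3)) for r in range(3))

def normalize(v):
    length = sum(x * x for x in v) ** 0.5
    return tuple(x / length for x in v)

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

view = ((0, -1, 0), (1, 0, 0), (0, 0, 1))  # example view rotation
normal = (1, 0, 0)                          # example vertex normal

tangent = normalize(transform(view, normal))  # normal pushed through the view
binormal = cross(tangent, normal)             # not a real TBN basis
```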

To get the mushroom forest to look foggy enough, I added a radial blur effect in post-processing. The radial blur only takes the background into account, so the radial flare comes from behind the mushrooms.


I had the privilege of having a 3D artist working with me for a bit at some point. Jonny Ree was nice enough to build me some rock models and that cool hand. He also made me a couple of female legs that I’ll hopefully use at some point :)
fx_rockhand
To get the rocks to blend nicely with the background, I calculate the distance to the camera for every vertex and use that value to add the background color in the model shader. The background color itself is a combination of two colors faded using the UV.Y value. These values change based on sound and time. When combining these with the radial-blurred scene, the light gets a nice foggy feel.
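The blend itself is a simple fog mix. Here is a Python sketch of the idea (the helper names and the fog distance are my assumptions, not the real shader): far-away vertices are pushed toward the background color, so the rocks sink into the same fog the radial blur produces.

```python
# Sketch of the distance-based background blend described above.

def lerp(a, b, t):
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def fog_blend(model_color, color_down, color_up, uv_y, camera_distance, fog_far=50.0):
    # Background is two colours faded along UV.Y, as in the post.
    background = lerp(color_down, color_up, max(0.0, min(1.0, uv_y)))
    # The closer fog_amount gets to 1, the more the vertex takes the background.
    fog_amount = max(0.0, min(1.0, camera_distance / fog_far))
    return lerp(model_color, background, fog_amount)

rock = (0.5, 0.375, 0.25)
near = fog_blend(rock, (0.25, 0.0, 0.5), (0.75, 0.5, 1.0), uv_y=0.5, camera_distance=0.0)
far = fog_blend(rock, (0.25, 0.0, 0.5), (0.75, 0.5, 1.0), uv_y=0.5, camera_distance=100.0)
print(near)  # untouched rock colour
print(far)   # fully the background colour
```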


The rising-spheres scene is my personal favorite. It contains 64 icospheres with 3D ball-physics behavior and a gravitational pull at the center of the scene, on top of the surface. The movement energy is set to a minimum once a sphere goes under the surface, and y-coordinates are scaled according to the distance to the surface. Why? Because it makes the balls do wicked moves when rising from the surface. It reminds me of insects hatching from water, and was born simply because I’m into fly fishing.
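The underwater trick can be sketched like this (a minimal Python sketch; the surface height, damping and scaling constants are assumptions, not the real APEXvj values):

```python
# Sketch of the underwater damping and y-squash described above.

def update_sphere(y, velocity, surface_y=0.0, min_energy=0.05, squash=0.5):
    if y < surface_y:
        # Underwater: kill most of the motion energy...
        velocity = tuple(v * min_energy for v in velocity)
        # ...and scale the y-coordinate toward the surface, which is what
        # makes the spheres "hatch" oddly when they rise back out.
        y = surface_y + (y - surface_y) * squash
    return y, velocity

# Above the surface nothing changes; below it everything is damped.
above = update_sphere(2.0, (1.0, 0.0, 0.0))
below = update_sphere(-1.0, (1.0, 2.0, 0.0))
```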
fx_breakside
The shading uses a technique called sphere mapping, which is drawn simply with the following line:

float4 ReflectEnv : sample(SphereMap, Vector.Normalize(Vector.Transform(ReflectedViewDirection, View).XYZ).XY * .5f + .5f);

fx_breakside2
The scene is yet again rendered with just one draw call.


Variation is very important for a music visualizer. One needs scenes with a serene feel as well as scenes with a sense of speed. The tube scene is meant for speed. I made multiple versions until it worked. The problem with tubes is that the shading very easily looks stretched, which yet again causes those ugly sawtooth edges. I ended up using very mild shading on the tubes and getting the sense of speed from particles and rocks around them.
fx_tubes
The tubes are wrapped around a path. Here’s what creating tube vertices and indices around a path looks like in my UNO code:

for (int i = 0; i < parallels; i++)
{
        x = (float)i / (parallels-1) * 2.0f;

        for (int j = 0; j < meridians; j++)
        {
                dx = Uno.Math.Cos(-x*pi);
                dz = Uno.Math.Sin(-x*pi);

                factor = ((float)j) / ((float)(meridians-1));
                step_from = (int)Uno.Math.Floor((float)lengthPath * factor);
                step_from = (step_from < lengthPath) ? step_from : lengthPath;
                step_to = step_from+1;

                _a = path[step_from];
                _b = path[step_to];

                ax = _a.X; ay = _a.Y; az = _a.Z; aw = _a.W;
                bx = _b.X; by = _b.Y; bz = _b.Z; bw = _b.W;

                val = factor * lengthPath;
                mv = val - Uno.Math.Floor(val);

                stepPosition.X = ax + (bx - ax) * mv;
                stepPosition.Y = ay + (by - ay) * mv;
                stepPosition.Z = az + (bz - az) * mv;
                stepPosition.W = aw + (bw - aw) * mv;

                radius = stepPosition.W;

                _dx = radius * dx + stepPosition.X;
                _dy = stepPosition.Y;
                _dz = radius * dz + stepPosition.Z;

                vertexWriter.Write(float3(_dx, _dy, _dz));
        }
}

int _i = addIndex;
for (int k = 0; k < pstop; k++) {
        for (int h = 0; h < mstop; h++)
        {
                indexWriter.Write((ushort)(_i));
                indexWriter.Write((ushort)(_i+meridians+1));
                indexWriter.Write((ushort)(_i+1));
                indexWriter.Write((ushort)(_i+meridians));
                indexWriter.Write((ushort)(_i+meridians+1));
                indexWriter.Write((ushort)(_i++));
        }
        _i++;
}


In several scenes I use the simple trick of drawing the waveform onto a plane. It works great, since it doesn’t steal the whole show yet changes the feel quite a bit depending on what is happening in the song. When the camera gets close to the plane, the rendering can get ugly; crossing the plane looks like death. I fixed this simply by fading the plane out once the camera comes near enough. It’s one of those tricks that no one realizes is happening, and that’s a good thing. :)
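The fade itself is nothing more than an alpha ramp on the camera distance. A tiny Python sketch, with assumed fade distances:

```python
# Sketch of the near-camera fade: fully visible far away, invisible up
# close, linear in between, so crossing the plane never looks like death.

def plane_alpha(camera_distance, fade_start=4.0, fade_end=1.0):
    if camera_distance >= fade_start:
        return 1.0
    if camera_distance <= fade_end:
        return 0.0
    return (camera_distance - fade_end) / (fade_start - fade_end)

print(plane_alpha(10.0))  # 1.0, plane fully visible
print(plane_alpha(0.5))   # 0.0, plane faded out
```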
fx_appendage
The reflections on the spheres are made with a CubeMap and created with this line:

float4 reff : sample(TexCube,-ReflectedViewDirection);


In the eruption scene I wanted to capture the feel of collision and fission, and squeeze that moment into a single loop. In order to do this I had to find a way to split a sphere. An IcoSphere is perfect for this: it’s made from a regular icosahedron by adding a vertex in the middle of every triangle and normalizing it.
fx_erapt
Now, using the triangles of the icosphere surface, I create tetrahedrons by adding one extra vertex at the center of the icosphere. Once I have only tetrahedrons, it’s easy to split them into more pieces. As a result I have a stack of tetrahedrons in the form of a sphere. From there it’s just a matter of moving them in the vertex shader.
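The triangle-to-tetrahedron step can be sketched like this (a Python sketch of the idea with hypothetical names, not the actual UNO code): every surface triangle plus the added center vertex becomes one tetrahedron.

```python
# Sketch of turning a sphere surface into tetrahedrons, as described:
# each surface triangle + one shared centre vertex = one tetrahedron.

def tetrahedralize(vertices, triangles, center=(0.0, 0.0, 0.0)):
    verts = list(vertices) + [center]
    ci = len(verts) - 1                       # index of the added centre vertex
    tets = [(a, b, c, ci) for (a, b, c) in triangles]
    return verts, tets

# Two triangles of a sphere surface become two tetrahedrons.
verts, tets = tetrahedralize(
    [(1, 0, 0), (0, 1, 0), (0, 0, 1), (0, 0, -1)],
    [(0, 1, 2), (0, 1, 3)])
```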


The stream, or jet, effect is one of the most aggressive in APEXvj. There are 10 000 tetrahedrons moving and rotating individually. I use the same trick that particle engines tend to use: for every vertex there’s a “MoveTo” float4, and in the vertex shader I simply tween between the start and MoveTo coordinates.
fx_stream
Here’s how I made the movement in UNO metaproperties:

float Move :
{
        var ti = MoveTime + MoveTo.W;
        return ti - Math.Floor(ti);
};
               
float3 VertexPositionOffset : VertexPosition + (MoveTo.XYZ - VertexPosition) * Move;

WorldPosition : Uno.Vector.Transform(VertexPositionOffset, World);


I dig tetrahedrons. (How geeky is that? :) ) They are cheap in terms of performance, behave nicely with shading and are cool building blocks for larger models. For the shattered-floor scene I needed a Delaunay triangulation. As a base I used Nicolas Barradeau’s ActionScript 3 code. Once I had a 2D plane made of triangles, I could create tetrahedrons out of them.
fx_shattered_floor
For the shading I’m using a CubeMap, but also a technique called a LightMap (yet another trick Anders taught me). The lightmap is a 2D texture of simply nice colors, sampled using the WorldPosition. It’s a very useful trick for getting color variation; in this scene it also gave a nice swell effect.
fx_shattered_floor2

sample(LightMap, Uno.Vector.Normalize(WorldPosition.XY))


How can you have any pudding if you don’t eat your meat, and how can you have a demoscene product without a tunnel? :) The tunnel is basically a plane bent into the form of a cylinder. For this particular tunnel I generated a stack of 2D points at random coordinates inside a rectangle, triangulated those points, then bent them into a tunnel using sine and cosine, and at the end turned those triangles into tetrahedrons.
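The bending step is just polar coordinates. A Python sketch, assuming x runs across the rectangle’s width (mapped to the angle around the cylinder) and y becomes depth along the tunnel:

```python
import math

# Sketch of bending a rectangle of 2D points into a tunnel.

def bend_to_tunnel(points_2d, width, radius=1.0):
    bent = []
    for (x, y) in points_2d:
        angle = x / width * 2.0 * math.pi   # position across the plane -> angle
        bent.append((math.cos(angle) * radius,
                     math.sin(angle) * radius,
                     y))                     # the other axis becomes depth
    return bent

# A point at the left edge lands at angle 0; a quarter across lands at 90°.
pts = bend_to_tunnel([(0.0, 5.0), (1.0, 0.0)], width=4.0)
```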
fx_tunnel
To get a rough and aggressive feel I removed some of the tetrahedrons to create gaps in the tunnel. This also made the radial blur come out nicely from the background.


One of the first models I built using the Batcher was this tree.
fx_tree1
For a long time it was rotting in the bucket of failed ideas. It’s entirely procedurally generated, and it took me quite a while to get my head around it. I really wanted to find a use for it in the visualizer. After several attempts I ended up using it as part of a larger model: I duplicated it and mirrored it, then duplicated it again and added a rotation of 90 degrees. With some scaling and more rotation, I was able to build a very weird-looking fractal zoom.
fx_tree
The best thing about the 3D fractal is that once you get the zoom looking flawless, you can add rotations and complicated camera moves, as long as you make sure you get back to some safe coordinates when the model switch happens.


As long as I’ve been coding there have been particles, and APEXvj 2.0 is no exception. I used particles in two scenes. The first one, called super massive black hole, is basically a stack of ultra-alpha dust particles moving into the center of the scene or out of it. The ultra-alpha and blending create a nice smoky look. “Ultra-alpha” in this case means that the alpha value of a single particle is very close to zero.
fx_particles
The second particle scene is a combination of dust particles and 64 000 dot particles. For the DOF effect I used a really cheap, and I think smart, trick: the particles are scaled using the distance-to-camera value (cubically, as the code shows), and the scale is clamped. This way I don’t need to use large textures on the particles, and with a bit of blur and alpha it looks just great.

float2 CornerPos: Data.XY * (1.0f + Uno.Math.Min(10.0f, (CameraDistance*CameraDistance*CameraDistance) * 8.0f));
ViewPosition: (Vector.Transform(float4(WorldPosition, 1), View) + float4(CornerPos.XY, 0)).XYZ;


The APEXvj Remote


In the very first APEXvj version we had a remote controller. It was made using the Union Platform service. It worked nicely, but for this incarnation I wanted to build it all myself. All the cool kids in town are using Node.js, so I decided to try to be cool as well. How hard can it be? Very easy, in fact.

I found a perfect service provider called Nodejitsu. They have socket.io in their default set, and since WebSockets work everywhere I didn’t have to look further.

remote

The remote is an HTML5 app that communicates through a Node.js backend that sort of resembles a chat server. It allows users to search for songs and switch between effects and colors. The server makes sure that the visualizer and the remote stay in a synced state. Setting up the remote is easy: just type in the same word on apexvj.com and in the remote, and you’re done.

The app is available for iOS and Android.

The system supports multiple connections to the same keyword. This means that you can share the controller with friends. For this reason I’ve added the “Add to queue” and “Up/Down vote” features.
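Conceptually the shared queue behaves something like this (a Python sketch of the idea only; the real backend is Node.js and this structure is my assumption): several remotes post songs to the same keyword’s queue and vote on them, and the visualizer plays whatever is on top.

```python
# Sketch of a shared song queue with up/down votes.

class SharedQueue:
    def __init__(self):
        self.queue = []          # list of [votes, song] entries

    def add(self, song):
        self.queue.append([0, song])

    def vote(self, song, delta):
        for entry in self.queue:
            if entry[1] == song:
                entry[0] += delta
        # Highest-voted song bubbles to the top.
        self.queue.sort(key=lambda e: e[0], reverse=True)

    def next_song(self):
        return self.queue.pop(0)[1] if self.queue else None

q = SharedQueue()
q.add("song a")
q.add("song b")
q.vote("song b", +1)          # an up-vote from another remote
print(q.next_song())          # the up-voted song plays first
```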


Next steps


I would really love to create a mobile version of APEXvj 2.0 using UNO. It shouldn’t be too much work, but I think for that I want to create yet more visuals first. We’ll see. In the meantime I’m looking for gigs where I can do realtime graphics and use these new skills I’ve learned. I have a little company page at www.oddcircle.com. It’s totally OK to share that URL :D

Hopefully you enjoyed this epic post and have a nice summer!

Go check out the official release track of APEXvj 2.0, Subsquare – Broken (feat. Jane Dawn)!

Cheers,
Simo