binary nightmare #2

Firstly, allow me to apologise for this post being late - it was supposed to be available on the 9th of October. We hit some unforeseen technical hitches along the way related to hosting the content and accessing it reliably. This, coupled with some Unity bugs that were fixed in a patch on the 9th, led to the delay. Delays are an inevitable side effect of software development - sometimes things don't quite go as planned.

For the most reliable experience, please use either the Chrome or Firefox web browser. If you do have problems running Mimis, please let us know in the comments below! We've also provided a video of the app for people who experience any issues.

Looking into Mimir's Well

What is Mimis?

So what is “Mimis” exactly? It’s a simple web-based 3D model viewer - it allows you to view our 3D art, in 3D, through a web browser. It’s been developed using Unity 5 and makes use of WebGL 1.0.

The program itself is not a new concept - other tools provide the same functionality - but for consistency’s sake we’ve developed our own. This will allow us to add features over time and keep up to date with the latest version of Unity as its WebGL support improves. The tool is aimed at desktop browsers; high-end mobile devices may work, but no development effort has gone into supporting them. We won’t be locking out mobile and tablet devices, but we do expect performance and battery life to suffer. We hope that eventually Mimis will work reliably on mobile devices, but right now it is definitely not recommended.

What's its purpose?

Mimis also has a practical purpose for us internally: it allows us to develop shaders, and to test lighting and artwork, within the Unity engine. This lets us rapidly get a look and feel for our artwork without the overhead of loading a large project, because we can run “Mimis” natively as well as deploying it to the web via WebGL for user interaction.

There is another reason we developed Mimis: a lot of companies, large and small, produce slick screenshots. These are commonly referred to as “bullshots” - they’re mostly faked and somewhat unrepresentative of the final product. You may have noticed them out there in the world. Before you bring out your pitchforks - NOT EVERY COMPANY DOES THIS!

The process for creating bullshots varies, and not every technique is used at every company. Often it involves rendering out the screenshot with settings that do not represent real-time results: effects turned up to 11, view distances manipulated, characters posed, and the output image rendered at around 4x standard HD resolution. The image is then further manipulated in Photoshop - shrinking it down to “HD resolution” produces natural anti-aliasing - and it sometimes sees further touch-up by a concept artist.

We feel that this process is fundamentally dishonest, and as a result Mimis lets you see the artwork represented as closely as WebGL allows. This is technically slightly worse than what you should expect when running natively, as we downgrade textures, use compression more heavily, and so on.

Factors to keep in mind about WebGL

It is important to note that WebGL does have limitations. WebGL 1.0 only supports up to Shader Model 2.0 (SM2.0) - relatively dated technology considering the latest Shader Model is 5.0 - which means the results aren’t 100% the same as what we see natively. Unity’s WebGL support is also still considered a preview, which means it is constantly changing. Our hope is that when WebGL 2.0 becomes commonplace (bringing support for Shader Model 3.0), and Unity’s WebGL support moves out of preview, we’ll be able to provide users with a better overall experience.

We had initially planned to use the Unity Web Player (which enjoys better overall support within the Unity engine), but advances in browser technology forced us down the WebGL route - browsers are moving away from an old API called NPAPI, which the web player used for its plugin. Google had already deprecated NPAPI by the time we began work on Mimis, so we decided to build our infrastructure around WebGL.

We use heavily compressed textures, which affect the quality of the result a little but keep the download size reasonable. We’ve also downgraded our texture resolutions from our development size of 4K down to 1K to help with this.

So what are the requirements?

There were a few requirements that Mimis needed to fulfill.

  • Display a 3D Model - in the correct context.
  • Display information about the 3D Model.
  • Shader/Material representations.
    • Show final rendering results.
    • Show texture makeup.
  • Environment Manipulation.
    • Post-Processing Effects.
    • Skybox Environment.
  • Intuitive camera manipulation via mouse input.
    • Orbit model.
    • Move focal point.
  • Simplistic interface.
  • Ability to see wireframe of the model.
  • Fullscreen/Embedded
  • About Panel detailing the software.
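The orbit-camera requirement above boils down to simple spherical-coordinate maths. As a hedged sketch - in Python rather than our Unity C#, with illustrative names, and not the Mimis source - the camera sits on a sphere around the focal point, driven by yaw and pitch angles:

```python
import math

# Illustrative sketch: position a camera on a sphere of radius
# `distance` around `focus`, from yaw/pitch angles in radians.
def orbit_position(focus, yaw, pitch, distance):
    x = focus[0] + distance * math.cos(pitch) * math.sin(yaw)
    y = focus[1] + distance * math.sin(pitch)
    z = focus[2] + distance * math.cos(pitch) * math.cos(yaw)
    return (x, y, z)

# With zero yaw/pitch the camera sits directly "behind" the focus.
print(orbit_position((0, 0, 0), yaw=0.0, pitch=0.0, distance=5.0))
# (0.0, 0.0, 5.0)
```

Dragging the mouse then simply adjusts yaw and pitch, while moving the focal point translates the whole sphere.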

We will be adding new features as development pushes forward, to improve the user experience and better demonstrate our results in line with Unity’s WebGL development.


So here it is. Dan was kind enough to knock up this little scene containing some D6 dice for the purpose of this demonstration. Why dice? Why not actual artwork? Well - I didn’t want to step on the toes of the “Tales from the Art Side” blog, where Dan is showing the processes he goes through and will eventually share his end results using Mimis - in fact, we expect to share a lot of content through Mimis.

An instructional video is provided for those who cannot run Mimis on their device, or who simply don’t wish to.

Wickerman Games: BN2

We use Unity 3D WebGL, which is currently in its early stages of development. If you experience any problems, please let us know.




Whilst we developed the shaders and the general user experience, everything else is handled by Unity, so there wasn’t too much work involved in creating Mimis beyond scripting the UI and camera behaviour. One of the largest portions of work related to integrating WebGL projects with our website - but that’s a post for later.

All this said, there was one feature that did require a little time and effort - something that is purely down to WebGL’s limitations.

That feature was displaying the wireframe of a mesh. This struck me as fundamentally odd - it’s actually a problem for several platforms when targeting OpenGL ES 2.0 (the 3D API available on a lot of mobile devices), so many mobile platforms suffer from the same issue. Whilst we could have wandered along to the Unity Asset Store and spent money on something that would give us these results, I instead opted to solve the problem myself. It made this project a little more challenging and seemed like a good place to get to grips with Unity’s ShaderLab.

Unity and Wireframe

When it comes to rendering the wireframe of a mesh in Unity, there are several solutions available to the end user.

  • GL.wireframe.
  • GL.LINES.
  • Unity’s Line Renderer.
  • A custom shader.

The first thing I did, in my ignorance, was to implement a GL.wireframe solution - the simplest of the four. Except… it’s not compatible with WebGL. It also comes with the drawback of being applied to the whole scene, unless you use a secondary camera and cull everything in the scene you’re not interested in. It’s easy to implement, only requiring the user to add a small script to Unity’s camera implementing the following functions. The behaviour can naturally be more complex, but at its core this is all that needs to be done.

void OnPreRender()
{
    GL.wireframe = true;
}

void OnPostRender()
{
    GL.wireframe = false;
}

Enter the next solution, GL.LINES - which also comes with the drawback of not being compatible with WebGL. It’s also, in my opinion, a pretty poor solution, requiring the mesh to be drawn a second time after it renders. That said, it’s very simple and easy for anyone to implement - you just need to take advantage of the OnRenderObject callback to draw the lines by hand.

List<Vector3> vertexList = new List<Vector3>();
Material lineMaterial; // any unlit material works here

void Start()
{
    MeshFilter meshFilter = gameObject.GetComponent<MeshFilter>();
    Mesh mesh = meshFilter.mesh;
    // Cache one vertex triple per triangle.
    for (int i = 0; i + 2 < mesh.triangles.Length; i += 3)
    {
        vertexList.Add(mesh.vertices[mesh.triangles[i]]);
        vertexList.Add(mesh.vertices[mesh.triangles[i + 1]]);
        vertexList.Add(mesh.vertices[mesh.triangles[i + 2]]);
    }
}

void OnRenderObject()
{
    lineMaterial.SetPass(0);
    GL.Begin(GL.LINES);
    for (int i = 0; i + 2 < vertexList.Count; i += 3)
    {
        Vector3 v1 = transform.TransformPoint(vertexList[i]);
        Vector3 v2 = transform.TransformPoint(vertexList[i + 1]);
        Vector3 v3 = transform.TransformPoint(vertexList[i + 2]);
        // Each triangle contributes its three edges.
        GL.Vertex(v1); GL.Vertex(v2);
        GL.Vertex(v2); GL.Vertex(v3);
        GL.Vertex(v3); GL.Vertex(v1);
    }
    GL.End();
}

Now, in theory, Unity’s Line Renderer seems perfect at first glance: it should work on WebGL, it should provide the desired result, and it gives some control over line thickness. Upon closer inspection, though, it’s quickly apparent that it’s unsuitable - mainly because of the general nature of the Line Renderer, in that all lines are orientated to face the camera. This gives some very odd results, and it’s easy enough to witness the strange behaviour when orbiting the mesh. I dismissed the approach.

This led me down the custom shader route. Oddly enough, there are a few shader solutions out there - the Unity Asset Store has some for sale, and some are even free. The common solutions all revolve around the concept of barycentric coordinates. Some others leverage the geometry shaders available on desktop hardware - and, you guessed it, that wasn’t going to work for WebGL. Barycentric coordinates, however, will - and do.
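The key property being exploited is that hardware interpolation reproduces the barycentric coordinates for free. As a hedged sketch of the idea - plain Python rather than shader code, with illustrative names - give the three corners of a triangle the attributes (1,0,0), (0,1,0) and (0,0,1); linearly interpolating those attributes at barycentric weights (a, b, c) yields (a, b, c) itself:

```python
# Sketch: interpolating per-corner attributes at barycentric weights.
def interpolate(corner_attrs, weights):
    a, b, c = weights
    # Weighted sum of the three corner attributes, per component.
    return tuple(a * x + b * y + c * z for x, y, z in zip(*corner_attrs))

corners = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]
print(interpolate(corners, (0.2, 0.3, 0.5)))  # (0.2, 0.3, 0.5)
```

So every fragment inside the triangle automatically knows its own barycentric coordinate - exactly what the wireframe shader needs.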

Barycentric coordinates mapped to a triangle

Colour spectrum generated by giving each vertex a barycentric coordinate

It does come with its own caveats, though: to map the barycentric coordinates, the information ideally needs to be baked into the mesh data - or generated in some way from the mesh data in the shader. It was clear to me why many solutions required the use of geometry shaders - that, and this Nvidia whitepaper detailing the solution. Knowing that WebGL wouldn’t support this, I set about implementing a solution that would work with SM2.0 and provide the desired results.

At first I attempted to simply fill out the vertices with the correct r,g,b values - a fairly simple problem, except for the small fact that meshes share vertices. It’s pretty easy to prove why this is a problem for the approach: graph theory demonstrates that it’s impossible under certain circumstances - at least without altering the mesh permanently. Altering the mesh is a completely valid approach, and would provide the best results, but we’d then be shipping a sub-optimal mesh even when not rendering a wireframe pass, so I decided I didn’t want to alter the core mesh data to provide a solution.

Results from trying to place barycentric coordinates on certain topologies

Demonstration of the resulting barycentric coordinate when filling a polygon in the fragment shader

This leads us to what I actually implemented. I wrote a small tool which clones the mesh and generates the relevant information. I haven’t optimized this process yet - I should only create extra vertices where necessary, as opposed to my current solution, which makes every vertex unique and so increases the vertex count more than needed. It’s simple, quick and effective, though not the best result for performance; I’m likely to extend it eventually, but it’s not critical at the moment, and it only slightly affects vertex throughput on the GPU. It also means that anyone producing art doesn’t have to roll this data in by hand.
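The cloning step described above can be sketched as follows - this is not the Mimis tool (which is a Unity/C# script working on Mesh data), just the logic in Python with illustrative names. Every index in the triangle list gets its own copy of its vertex, so each corner of each triangle can carry a unique barycentric coordinate:

```python
# One barycentric corner per position within a triangle.
BARYCENTRIC = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]

def split_vertices(vertices, triangles):
    # Duplicate every referenced vertex so no triangle shares one.
    new_vertices, new_bary, new_triangles = [], [], []
    for i, index in enumerate(triangles):
        new_triangles.append(len(new_vertices))
        new_vertices.append(vertices[index])
        new_bary.append(BARYCENTRIC[i % 3])
    return new_vertices, new_bary, new_triangles

# A quad: two triangles sharing vertices 0 and 2.
verts = [(0, 0), (1, 0), (1, 1), (0, 1)]
tris = [0, 1, 2, 0, 2, 3]
v, b, t = split_vertices(verts, tris)
print(len(v))  # 6: the two shared vertices were duplicated
```

An optimized version would duplicate only the vertices whose shared triangles actually demand conflicting coordinates - that’s the improvement mentioned above.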

The shader is then pretty straightforward: pass the barycentric coordinate for each vertex to the fragment shader, work out how close to an edge we are, and colour the pixel based on the result.
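The edge test itself is one line of maths. As a hedged sketch in plain Python (the real thing is a fragment shader, and a production version would smooth the threshold rather than hard-cut it): the smallest barycentric component is a proxy for how close the fragment is to the nearest edge.

```python
# Sketch of the per-fragment edge test: a fragment is "wire" if its
# smallest barycentric component falls under the line-thickness threshold.
def is_wire(bary, thickness=0.05):
    return min(bary) < thickness

print(is_wire((0.02, 0.50, 0.48)))  # True: close to an edge
print(is_wire((0.33, 0.33, 0.34)))  # False: triangle interior
```

In the shader, the boolean becomes a blend factor between the wire colour and the fill colour.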

We’re considering polishing up the wireframe shader/tool and/or Mimis for sale via the Unity Asset Store. Let us know if this would interest you by sending us an email.

What’s next

So what will be occurring next? Well, Dan will be making heavy use of Mimis in his “Tales from the Art Side” blog posts. In terms of features, there are a couple of things which need to be added at some point: a first-person “fly” camera will be useful for larger scenes with multiple meshes, and later on I’ll need to make the desired optimization to the wireframe mesh generation tool.

Anyways, I hope you’ve enjoyed getting a look at one of the tools we’ll be using to deliver content.