Quixel with Houdini & Blender

Six months ago, I was focused on learning lighting techniques, using a lot of primitive geometry to try them out. I tried creating interesting shadows and playing around with the angles of the lights. Things always went smoothly with primitives: the computer ran faster with the low polygon geometry and I could control the shadows easily. But when it was time to swap the primitives for high quality models, nothing ended well. The scene would get too heavy to even move the viewport, which was my worst nightmare, and the shadows never looked the same because of light bouncing off the other objects.

The high quality models also come with specular textures and reflective materials, all of which add secondary light bounces that wash out my shadows. These were some of the obstacles I faced when composing a scene.

One day, I saw a Quixel trailer in my YouTube recommendations.

It blew my mind right away. The ease of use was just too good to turn down. You can literally generate a high quality scene within a day, or within hours if it’s a small enough scene.

The 3D assets in the library load into the scene very quickly and have high detail with a low poly count. I later found out Quixel uses a lot of photogrammetry to scan the assets in Megascans, which is what they call their library of CG assets. Photogrammetry assets load very fast because most of a model’s detail is stored in its textures. If you read my previous post about light maps, you’ll understand the benefits of rendering detail from textures.

Quixel’s Bridge User Interface

Quixel’s Bridge

Quixel has two applications, Bridge and Mixer. Bridge is my favorite of the two. You choose your asset in the Megascans library, click the export button, and Bridge brings the whole asset, textures already set up, into Houdini! All I have to do in Houdini is position the asset in my scene and set up the lighting. So convenient!

Now there is a downside to it all: there’s not much info out there that shows you in detail how to use Quixel with Houdini, though it’s not hard to figure out. I always wondered why Quixel didn’t have a detailed tutorial on using Bridge with Houdini. The reason might be that they support many different applications, not only Houdini, and Bridge is still quite young, so updates come frequently. With every software update, they would probably need to update all those tutorial videos as well.

Bridge also supports Blender! This Blender update came out just a couple of months before the time of writing. It’s very convenient not having to copy the same image texture node five or six times for each texture set I import; some of my Blender materials need five or six sets of textures blended together to make the final material.
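
To give a sense of what Bridge saves you from doing by hand, here’s a rough Python sketch of wiring a few texture maps into a Principled BSDF with Blender’s bpy API. The file paths and material name are placeholders, and this is only an approximation of what the Bridge plugin sets up on export.

```python
import bpy

# Placeholder paths: swap in the maps exported from Bridge.
TEXTURES = {
    "Base Color": "/path/to/rock_albedo.jpg",
    "Roughness": "/path/to/rock_roughness.jpg",
    "Normal": "/path/to/rock_normal.jpg",
}

mat = bpy.data.materials.new(name="MegascansRock")
mat.use_nodes = True
nodes = mat.node_tree.nodes
links = mat.node_tree.links
bsdf = nodes["Principled BSDF"]

for socket_name, path in TEXTURES.items():
    tex = nodes.new("ShaderNodeTexImage")
    tex.image = bpy.data.images.load(path)
    if socket_name == "Normal":
        # Normal maps need non-color data and a Normal Map node in between.
        tex.image.colorspace_settings.name = "Non-Color"
        normal_map = nodes.new("ShaderNodeNormalMap")
        links.new(tex.outputs["Color"], normal_map.inputs["Color"])
        links.new(normal_map.outputs["Normal"], bsdf.inputs["Normal"])
    else:
        if socket_name != "Base Color":
            tex.image.colorspace_settings.name = "Non-Color"
        links.new(tex.outputs["Color"], bsdf.inputs[socket_name])
```

And that’s just one texture set; multiply it by five or six sets plus the mix nodes to blend them, and the appeal of a one-click export becomes obvious.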

Scatter Feature

Houdini has a built-in scatter feature that lets you scatter objects onto other objects. Bridge takes that feature and builds it into its export functionality: when you export a scatterable asset from Bridge, it gets scattered onto an automatically created grid, and you then manually swap the grid for another object of your choosing.
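
If you want to build that same kind of network yourself, it boils down to a Scatter SOP feeding a Copy to Points SOP. Below is a minimal sketch using Houdini’s Python module (hou); the node names, point count, and mushroom file path are placeholders, and parameter names can vary slightly between Houdini versions.

```python
import hou

obj = hou.node("/obj")
container = obj.createNode("geo", "scatter_demo")

# Stand-in for the automatically created grid that Bridge scatters onto.
grid = container.createNode("grid", "ground")

# Scatter points over the grid.
scatter = container.createNode("scatter", "scatter_points")
scatter.setInput(0, grid)
scatter.parm("npts").set(50)  # total point count; parm name may differ by version

# Placeholder path: point this at the asset exported from Bridge.
asset = container.createNode("file", "mushroom")
asset.parm("file").set("$HIP/geo/mushroom.bgeo.sc")

# Copy the asset onto the scattered points.
copy = container.createNode("copytopoints", "copy_mushrooms")
copy.setInput(0, asset)    # geometry to copy
copy.setInput(1, scatter)  # target points
copy.setDisplayFlag(True)
copy.setRenderFlag(True)

container.layoutChildren()
```

Swapping the grid for another object, like the rock in the render below, is just a matter of rewiring the scatter’s first input.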

Below is a screenshot of one of the first projects I made using Quixel and Houdini, rendered with Blender’s Cycles because I didn’t have Redshift at the time. I used Bridge’s scatter export functionality to place some mushrooms onto a rock.

Rock & Mushrooms from Quixel’s Bridge exported into Houdini and rendered with Blender’s Cycles

The rock and mushrooms are assets from the Megascans library. The ground and water geometry are generated in Houdini: some basic Mountain SOP noise for the ground and a water sim to fill its lower points. I then used Blender to set up materials for the water and sand, and Cycles to render. All of this could have been done in Houdini, but at the time I was obsessed with Cycles.
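
For reference, the ground starts out as nothing more than a grid run through a Mountain SOP. Here’s a small hou sketch of that setup; the sizes are placeholders and the water sim is left out entirely.

```python
import hou

obj = hou.node("/obj")
terrain = obj.createNode("geo", "ground")

# A dense grid gives the Mountain SOP enough points to displace.
grid = terrain.createNode("grid", "base")
grid.parmTuple("size").set((20, 20))  # placeholder dimensions
grid.parm("rows").set(200)
grid.parm("cols").set(200)

# The Mountain SOP layers noise onto the grid; tweak its height and
# element size in the parameter editor to shape the terrain.
mountain = terrain.createNode("mountain", "ground_noise")
mountain.setInput(0, grid)

mountain.setDisplayFlag(True)
mountain.setRenderFlag(True)
terrain.layoutChildren()
```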

Quixel’s Mixer User Interface (screenshot taken from a project where I try to recreate the obsidian texture from one of Quixel’s tutorials)

Quixel’s Mixer

A few months ago Mixer wasn’t free, but it became free just a few weeks back from the time of writing and is currently available as a free beta. Mixer is a texture-generating application: you take the existing textures that come with Megascans library assets and layer them together to generate new textures.

I didn’t find Mixer useful at first, but when Quixel introduced noise layers in a January 2019 update, just a few months ago, it became a game changer for me. I was able to produce very interesting surfaces by layering existing Megascans textures with noise mattes, or by using a noise matte to add solid colors or wet maps.
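
Conceptually, layering two texture sets with a noise matte is just a per-pixel linear blend. The snippet below is only a rough illustration of that idea (not Mixer’s actual internals) using NumPy and Pillow, with placeholder file names.

```python
import numpy as np
from PIL import Image

# Placeholder file names for two albedo maps of the same resolution.
base = np.asarray(Image.open("rock_albedo.jpg"), dtype=np.float32) / 255.0
top = np.asarray(Image.open("moss_albedo.jpg"), dtype=np.float32) / 255.0

# A noise matte: white areas show the top layer, black areas the base layer.
# Random noise here; Mixer gives you structured, art-directable noises instead.
matte = np.random.rand(base.shape[0], base.shape[1], 1).astype(np.float32)

# Per-pixel linear blend driven by the matte.
blended = base * (1.0 - matte) + top * matte

Image.fromarray((blended * 255).astype(np.uint8)).save("blended_albedo.png")
```

A similar blend can be applied to the other channels (roughness, displacement, and so on), which is roughly what stacking layers in Mixer does for you across the whole texture set.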

Layering, blending, and mixing textures together isn’t anything new; it’s been part of building materials for 3D rendering engines for a long time. However, it’s the ease of use that really got me liking the application: less technical jargon, and more energy left for creativity. It’s hard enough to create something that stands out in today’s market, so convenience does have its value, but of course it comes at a price. I discuss Quixel’s pricing a little near the end of this post, under the section titled “Pricing”. If you want to fast forward to that, just scroll down to the end.

Making New Geometry with Textures

I saw an awesome post on Twitter the other day and thought it was a clever idea: generate a very basic, low poly mesh of a simple shape, then build a material using either Bridge or Mixer and rely on the displacement maps to do all the hard work of generating the detail of your mesh.

I’m not totally sure this is how the Colossus cliff in the tweet was generated; it’s only a guess. But it makes sense to me, because with a simple mesh you benefit from the low poly count and still get the details at render time from the displacement map.

Displacement in Redshift

Redshift’s Tessellation/Displacement settings in Houdini

If you’re wondering how to displace geometry with Redshift in Houdini, it’s actually really simple. On your mesh’s geometry node at the Object level, there’s a Redshift OBJ tab. Go to Tessellation/Displacement, check “Enable Tessellation”, then check “Enable Displacement”.

Then when you render, the geometry will be displaced, but only in the render view; it won’t be displaced in the viewport. This uses the fast GPU cores to calculate the displacement at render time instead of using the CPU to subdivide the geometry while modeling.
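
If you’d rather flip those two checkboxes from a script, here’s a small hou sketch. It looks the parameters up by the labels quoted above instead of hard-coding Redshift’s internal parameter names, since those can differ between plugin versions; the object path is a placeholder, and the Redshift plugin has to be loaded for the parameters to exist at all.

```python
import hou

# Placeholder path: point this at the geometry node you want displaced.
node = hou.node("/obj/my_megascans_rock")

# Find the Redshift OBJ toggles by their labels rather than their internal names.
for parm in node.parms():
    if parm.description() in ("Enable Tessellation", "Enable Displacement"):
        parm.set(1)
        print("Enabled:", parm.description(), "({})".format(parm.name()))
```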

Pricing

At the time of writing, Quixel’s Personal plan is around $29 USD per month, which gives you an indie license; you qualify for this subscription plan if your revenue is under $100k USD. I do find it a bit pricey for today’s market, but it’s the ease of use that I find most valuable.