SkySaga: Infinite Isles is a voxel-based, sandbox, single/multiplayer exploration and crafting game, currently in closed Alpha. It has a very distinct aesthetic with vivid, saturated colours and complex lighting. It additionally supports a day-night cycle, a weather system, translucent and solid shadows, clouds, lit transparencies, volumetric fog, and many dynamic lights. The game also features a variety of biome types, from sunny and frozen forests to scorching deserts and fog-shrouded underground towns, to name just a few.
A few weeks ago I came across an interesting dissertation that talked about using tessellation on Direct3D11-class GPUs to render hair. This reminded me of the tessellation experiments I was doing a few years ago when I started getting into D3D11, and specifically a fur-rendering one based on tessellation. I dug around, found the source, and decided to write a blog post about it and release the code in case somebody finds it interesting.
Before I describe the method I will attempt a brief summary of tessellation; feel free to skip to the next section if you are already familiar with it. Continue reading “Rendering Fur using Tessellation”
This is a quick one to share my recent experience with branching and texture sampling in a relatively heavy screen-space shader from our codebase. The (HLSL) shader sampled a texture and used one of its channels as a mask to early-out, avoiding the heavy computation and many texture reads that followed. Typically we’d use a discard to early-out, but in this case the shader needed to output a meaningful value in all cases. Continue reading “Branches and texture sampling”
Over the past two years I’ve done quite a bit of reading on Physically Based Rendering (PBR) and I have collected a lot of references and links which I’ve always had in the back of my mind to share through this blog, but never got around to doing it. The Christmas holidays are probably the best chance I’ll have, so I might as well do it now. The list is by no means exhaustive; if you think I have missed any important references, please add them in a comment and I will update it. Continue reading “Readings on Physically Based Rendering”
Recently I had a discussion with an artist about Physically Based Rendering and the normalized Blinn-Phong reflection model. He seemed to have some trouble visualising how it works and the impact it might have in-game.
So I dug into my shader toybox, where I keep lots of shaders and occasionally take them out to play, found a normalized Blinn-Phong one and modified it a bit to add “switches” to its various components. Then I gave it to him to play with in FX Composer and get a feel for the impact of the various features. After a while he admitted that it helped him understand a bit better how a PBR-style reflection works, and also that a normalized specular model is better than a plain one. One artist down, a few thousand to convert! Continue reading “An educational, normalised, Blinn-Phong shader”
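To give a flavour of what “normalized” means here, below is a minimal CPU-side sketch of the specular lobe, assuming the common (n+8)/(8π) normalization factor; the actual shader may use a different factor, and this is just an illustration of the concept, not the shader itself:

```python
import math

def blinn_phong_specular(n_dot_h, shininess, normalized=True):
    """Blinn-Phong specular lobe for a single sample.

    With normalized=True the lobe is scaled by (n + 8) / (8 * pi), a
    common approximation that keeps the total reflected energy roughly
    constant as the shininess exponent changes: tighter highlights
    become brighter instead of simply dimmer.
    """
    norm = (shininess + 8.0) / (8.0 * math.pi) if normalized else 1.0
    return norm * (max(n_dot_h, 0.0) ** shininess)
```

The practical effect the switches demonstrate: with a plain (unnormalized) model, raising the exponent just shrinks and dims the highlight; with the normalization factor, a higher exponent produces a tighter but correspondingly brighter peak.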
For one of my lunchtime projects some time ago I did a bit of research on how objects with transparent materials can be lit using a deferred renderer. It turns out there are a few ways to do it: Continue reading “Lighting alpha objects in deferred rendering environments”
A nice and cheap technique to approximate translucency was presented some time ago at GDC. The original algorithm depended on calculating the “thickness” of the model offline and baking it into a texture (or maybe the vertices). Dynamically calculating thickness is often more appealing though, since, as in reality, the perceived thickness of an object depends on the viewpoint (or the light’s viewpoint), and it also makes it easier to capture the thickness of varying volumetric bodies such as smoke and hair. Continue reading “Dual depth buffering for translucency rendering”
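To make the idea concrete, here is a minimal CPU-side sketch of dual depth buffering for a single pixel; on the GPU the two depths would typically be accumulated with MIN/MAX blending into a render target. This is a sketch of the concept under those assumptions, not the post’s actual implementation:

```python
def pixel_thickness(fragment_depths):
    """Simulate dual depth buffering for one pixel.

    The front buffer keeps the minimum depth seen (MIN blend, the
    nearest surface); the back buffer keeps the maximum (MAX blend,
    the farthest surface). The perceived thickness along the view
    (or light) ray is their difference.
    """
    front = float('inf')    # cleared to "infinitely far"
    back = float('-inf')    # cleared to "infinitely near"
    for d in fragment_depths:
        front = min(front, d)   # nearest surface so far
        back = max(back, d)     # farthest surface so far
    return back - front
```

Because the min/max are taken over whatever fragments actually rasterize, the same code path handles solid meshes and loose volumetric geometry like smoke billboards or hair strands, which is exactly why the dynamic approach is attractive.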