Monday, July 29, 2013

TressFX - Crystal Dynamics and AMD cover TressFX at SIGGRAPH

There were more talks at SIGGRAPH about Confetti's work on TressFX. One talk, by Jason Lacroix, was "Adding More Life to Your Characters With TressFX".

Activision's head demo uses TressFX as well: "Digital Ira: High-Resolution Facial Performance Playback".

If you are a registered developer and you need Xbox One or PS4 implementations, send me an e-mail.

Thursday, July 25, 2013

SIGGRAPH 2013

I would like to highlight the talk "Crafting a Next-Gen Material Pipeline for The Order: 1886":

http://blog.selfshadow.com/publications/s2013-shading-course

The 3D Fabric Scanner is a fantastic idea and the results are awesome. Those are next-gen characters. Great work!

Monday, July 22, 2013

Tiled Resources / Partially Resident Textures / MegaTextures

One of the new features of DirectX 11.2 and now OpenGL 4.4 is Tiled Resources. Tiled Resources make it possible to manage one large texture in "hardware" tiles and implement a megatexture approach. The advantages of using the hardware for this, compared to the software solution that was used before, are:
- no dependent texture read necessary
- hardware filtering works including anisotropic filtering
AMD offers an OpenGL extension for this as well, and it is available on all newer AMD GPUs. NVIDIA has shown it running with DirectX 11.2 at the BUILD conference. So there is a high chance that it will be available on a large part of the console and PC market soon.
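As a rough sketch of how the DirectX 11.2 path can look (the texture dimensions, pool size and helper function below are placeholders, not taken from any particular engine): the tiled texture is created with the D3D11_RESOURCE_MISC_TILED flag, a separate buffer with D3D11_RESOURCE_MISC_TILE_POOL provides the physical 64 KB tiles, and tiles are mapped in and out with UpdateTileMappings on ID3D11DeviceContext2.

// Minimal sketch: create a tiled texture plus tile pool and map one 64 KB tile.
// Requires a D3D11.2 device and context (ID3D11Device2 / ID3D11DeviceContext2).
#include <d3d11_2.h>

HRESULT MapOneTile(ID3D11Device2* device, ID3D11DeviceContext2* context,
                   ID3D11Texture2D** outTexture, ID3D11Buffer** outTilePool)
{
    // Large virtual texture; only mapped tiles consume physical memory.
    D3D11_TEXTURE2D_DESC texDesc = {};
    texDesc.Width = 16384;                      // placeholder dimensions
    texDesc.Height = 16384;
    texDesc.MipLevels = 1;
    texDesc.ArraySize = 1;
    texDesc.Format = DXGI_FORMAT_BC1_UNORM;
    texDesc.SampleDesc.Count = 1;
    texDesc.Usage = D3D11_USAGE_DEFAULT;
    texDesc.BindFlags = D3D11_BIND_SHADER_RESOURCE;
    texDesc.MiscFlags = D3D11_RESOURCE_MISC_TILED;
    HRESULT hr = device->CreateTexture2D(&texDesc, nullptr, outTexture);
    if (FAILED(hr)) return hr;

    // Tile pool that provides the physical 64 KB tiles.
    D3D11_BUFFER_DESC poolDesc = {};
    poolDesc.ByteWidth = 256 * 64 * 1024;       // room for 256 tiles (placeholder)
    poolDesc.Usage = D3D11_USAGE_DEFAULT;
    poolDesc.MiscFlags = D3D11_RESOURCE_MISC_TILE_POOL;
    hr = device->CreateBuffer(&poolDesc, nullptr, outTilePool);
    if (FAILED(hr)) return hr;

    // Map the tile at (0, 0) of mip 0 to the first tile in the pool.
    D3D11_TILED_RESOURCE_COORDINATE coord = {};  // X = Y = Z = 0, Subresource = 0
    D3D11_TILE_REGION_SIZE region = {};
    region.NumTiles = 1;
    UINT rangeFlags = 0;
    UINT startOffset = 0;
    UINT tileCount = 1;
    return context->UpdateTileMappings(*outTexture, 1, &coord, &region,
                                       *outTilePool, 1, &rangeFlags,
                                       &startOffset, &tileCount, 0);
}

Sampling the texture in a shader then works like with any other texture, including anisotropic filtering, which is exactly the advantage over the software indirection approach.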
Let's take a step back and look at the challenge a megatexture is supposed to solve. In open-world games, we achieve high texture detail with two techniques:
- ongoing texture streaming: on a console you keep streaming from the physical media all the time. This requires careful preparation of the layout of the physical media and a multi-core/multi-threaded texture streaming pipeline with, for example, priority queues (a sketch of such a queue follows below).
- procedural generation of "large" textures: a large terrain texture is best generated on the fly. That means stitching a "large" texture together out of smaller textures with one "control texture", which then also requires a dependent texture read.
The advantage of procedural texture generation is that it doesn't require a lot of "streaming" memory bandwidth, while one large texture, or many small textures, eats into the amount of available "streaming" memory bandwidth.
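For the streaming path, the core of such a pipeline is typically a priority queue of load requests that one or more loader threads drain while reading from the physical media. Here is a minimal sketch; the request structure, the priority heuristic and the loader function are made up for illustration:

// Minimal sketch of a priority-driven texture streaming queue.
// TextureRequest, the priority values and LoadTextureFromDisc are placeholders.
#include <condition_variable>
#include <mutex>
#include <queue>
#include <vector>

struct TextureRequest {
    int   textureId;   // hypothetical identifier of the texture/tile to load
    float priority;    // e.g. derived from distance to the camera and mip level
};

struct ByPriority {
    bool operator()(const TextureRequest& a, const TextureRequest& b) const {
        return a.priority < b.priority;   // highest priority first
    }
};

class StreamingQueue {
public:
    void Push(const TextureRequest& r) {
        { std::lock_guard<std::mutex> lock(mutex_); queue_.push(r); }
        cv_.notify_one();
    }
    TextureRequest Pop() {                // called by the loader thread
        std::unique_lock<std::mutex> lock(mutex_);
        cv_.wait(lock, [this] { return !queue_.empty(); });
        TextureRequest r = queue_.top();
        queue_.pop();
        return r;
    }
private:
    std::priority_queue<TextureRequest, std::vector<TextureRequest>, ByPriority> queue_;
    std::mutex mutex_;
    std::condition_variable cv_;
};

// Loader thread: keeps streaming from the physical media, most important requests first.
// for (;;) { TextureRequest r = queue.Pop(); LoadTextureFromDisc(r.textureId); }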
Now with a MegaTexture there is the ability to store much more detail in the large texture, but it comes with the streaming cost. If you have an implementation that doesn't generate the terrain texture procedurally on the fly and you have to stream the terrain data anyway, then the streaming cost might be similar to your current solution, so the MegaTexture might be a win here.
The biggest drawback of Partially Resident Textures / MegaTextures seems to be forgotten in the articles that I have seen so far: someone has to generate them. There might need to be an artist who fills a very large texture with a high amount of detail, pixel by pixel. To relieve the workload, a technique called "stamping" is used. As the name implies, a kind of "stamp" is applied at several places onto the texture. Stamping also means giving up the opportunity to create unique pixels everywhere. In other words, the main advantage of a MegaTexture, offering a huge amount of detail, is counteracted by stamping.
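Conceptually, stamping is nothing more than blitting a small source image into the large texture at a number of offsets. A toy sketch, with a hypothetical 8-bit RGBA image type and without any blending or filtering:

// Toy sketch of "stamping": copy a small stamp into a large texture at an offset.
#include <cstdint>
#include <vector>

struct Image {
    int width = 0, height = 0;
    std::vector<uint8_t> rgba;   // width * height * 4 bytes
};

void ApplyStamp(Image& target, const Image& stamp, int offsetX, int offsetY)
{
    for (int y = 0; y < stamp.height; ++y) {
        for (int x = 0; x < stamp.width; ++x) {
            int tx = offsetX + x, ty = offsetY + y;
            if (tx < 0 || ty < 0 || tx >= target.width || ty >= target.height)
                continue;        // clip against the target texture
            for (int c = 0; c < 4; ++c)
                target.rgba[(ty * target.width + tx) * 4 + c] =
                    stamp.rgba[(y * stamp.width + x) * 4 + c];
        }
    }
}

Every call reuses the same stamp pixels, which is why the result is no longer unique everywhere.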
In practice this might lead to a situation where your MegaTexture doesn't hold much detail, because artists would have to work a long time to add it and this would be too expensive. Instead, the level of detail that is applied to the texture is reduced to an economically feasible amount.
The overall scenario changes when data already exists that, for example, was generated from high-resolution satellite images of the earth. In that case a MegaTexture solution will offer the best possible quality with less art effort, and you can build a workflow that takes the pre-generated data directly and brings it into your preferred format and layout.
For many game teams, the use of MegaTextures will be too expensive. They can't afford the art time to generate the texture if they can't rely on existing data.

Tuesday, July 2, 2013

Link Collection

I was looking through some of the links I saved for further reading today.

An article explaining BC compression formats with a lot of detail and clarity can be found here:

Understanding BCn Texture Compression Formats

There is an interesting blog post by Sebastian Sylvan. He writes about R-trees, a data structure that allows you, for example, to spatially index objects in your game.

A Random Walk Through Geek-Space

He also has other cool articles on hash maps and vector replacements.

We still need desktop PCs in the office so that we can swap discrete GPUs whenever we need to. Because we also need them to be as portable as possible, we decided to build the following setup ourselves:

Maximum PC

So far we have built two and they work well.

For Blackfoot Blade, we worked with a composer in Finland. I love the music he made and I wanted to share his website here:

TAPANI SIIRTOLA

Our friends at Bitsquid released a useful open-source library:

foundation

I quote from the description of the library's design:

Library Design

foundation has been written with data-oriented programming in mind (POD data is preferred over complicated classes, flat arrays are the preferred data structure, etc). foundation is written in a "back-to-C" style of C++ programming. This means that there is a clear separation between data and code. Data definitions are found in _types.h header files. Function definitions are found in .h header files.
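As an illustration of that split (this is not actual foundation code, just the pattern described above): POD structs go into a *_types.h header, and free functions that operate on them go into a separate .h header.

// sprite_types.h -- data definitions only (hypothetical example)
struct Sprite {
    float x, y;
    float width, height;
    unsigned texture_id;
};

// sprite.h -- free functions operating on the POD data
namespace sprite {
    void move(Sprite &s, float dx, float dy);
    bool overlaps(const Sprite &a, const Sprite &b);
}

// sprite.cpp -- implementation
namespace sprite {
    void move(Sprite &s, float dx, float dy) { s.x += dx; s.y += dy; }
    bool overlaps(const Sprite &a, const Sprite &b) {
        return a.x < b.x + b.width  && b.x < a.x + a.width &&
               a.y < b.y + b.height && b.y < a.y + a.height;
    }
}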

If you haven't found the DirectXTex texture library yet, you should check it out at

DirectXTex

MVP Award 2013

Yesterday Microsoft awarded me an MVP award for Visual C++. Now that DirectX is part of Visual C++, I was moved into the Visual C++ category. I am super proud of that, especially now that Visual C++ finally gets C99 support :-)