Final Dissertation Post

This is the final post I will write on the development of my rendering engine as part of my dissertation. The project has proven a gigantic learning curve, not only in C++ and DirectX but also in maintaining a blog and documenting the progress of a project.

The repository for my dissertation artefact can be found on GitHub here:

When I set out my milestones I had very little knowledge of graphics programming, so some of the milestones I set myself were trivial; they could and should have been grouped together, such as the individual lighting tasks, which I completed all at the same time anyway. Knowing what I know now, when I continue development on the project I will be able to judge far more accurately what functionality and tasks constitute an achievable milestone.

An issue I had over the course of this project was finding the time, between the other projects I was part of, to write up the progress I was making. I know that writing is not my strong point, and even when I managed to write posts with ease, I had to teach myself to do it immediately after completing the work. There were many points where I spent all of my time on my other work, neglecting this, only to then spend all my time on this, powering through and neglecting the other projects. I am now far more aware of how I need to manage my time so that I can work effectively on multiple projects concurrently.

Furthermore, getting posts proof-read proved quite an issue due to the complexity of the project; on a few occasions I had posts backed up while I was trying to find proof-readers. Unfortunately, this is something that comes with working on more niche subjects.

One task had me wasting many hours trying to find solutions to many problems: retrieving material data from my FBX files. I found the FBX SDK documentation to be very unhelpful at times; demonstrations were often focused on creating and exporting data rather than importing meshes and the like. I regret spending so long trying to get anywhere with the task instead of treating it as something to come back to later, once I had completed more approachable and urgent tasks such as adding specular and normal maps to my shader, or the scene graph, which is currently only half implemented.

Overall I am happy with how far I have got through the milestones I set out: I completed all the minimum tasks and two of the advanced ones. I had also started implementing the scene graph, only to stop working on it in favour of tidying and organizing my code.

Specular Maps

Specular maps are used in 3D rendering and games to specify which parts of a material are shiny or more reflective. The games industry has moved towards Physically Based Rendering (PBR), which uses both roughness maps and reflectivity maps for more physically accurate results; however, as a beginner I have decided to stick with what I know and implement the more basic model, leaving PBR as a feature to be added far in the future.

Adding a specular map to the shader was very simple. Similarly to how the diffuse texture is multiplied with the diffuse value of the light, we can control the specular intensity with a texture in the same way. To add the specular map to my shader, I added another texture to the pixel shader named gTexSpec, which I assigned to the second texture register, and updated its sub-resource in the same way as I did with the diffuse map, using 1 as the first parameter since I am using slot 1 for the specular map.
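Since the shader code itself is not shown in this post, here is a plain C++ sketch of the idea (the function name and the Phong-style lighting model are illustrative only, not the engine's actual shader): the sampled specular-map colour scales the specular term exactly as the diffuse texture scales the diffuse term.

```cpp
#include <algorithm>
#include <cmath>

struct Colour { float r, g, b; };

// Phong-style specular term scaled by the specular-map sample, mirroring
// how the diffuse texture scales the diffuse term.
Colour ApplySpecularMap(Colour lightSpec, Colour specSample,
                        float nDotH, float shininess) {
    float s = std::pow(std::max(nDotH, 0.0f), shininess);
    return { lightSpec.r * specSample.r * s,
             lightSpec.g * specSample.g * s,
             lightSpec.b * specSample.b * s };
}
```

Where the map is black the surface gets no highlight at all, which is what produces the matte patches in the charcoal render below.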

To demonstrate the effect of the specular map, I have rendered a plane with a charcoal material both with and without the spec map.


This milestone took far less time to complete than I had anticipated back in October when I wrote my proposal, as I had already learned everything I needed when I added support for diffuse textures. Had I realized this, I could have completed the two tasks at the same time, similar to how I completed my lighting milestones, saving time and freeing up more space in the advanced milestones for other tasks.

Reference (n.d.). Charred Wood. [online] Available at: [Accessed 5 May 2017].

Tidying Things Up

The last few commits started out as the beginning of creating the scene manager; however, as I began to move code around I realised that my D3D11App class was far too complicated. Its purpose was originally only to deal with DirectX matters, but as time passed it slowly turned into what can only be described as spaghetti code. Therefore, I decided it would be a far better decision to tidy up the code, keep each class handling only what it should, and have them make sense from an OOP perspective.

An important change I have made is that I have renamed my D3D11App class to D3D11Graphics, and this class now deals only with initializing D3D and drawing. All the remaining functionality that D3D11App had has been moved to the new NoobieEngine class (Noobie being the name I have settled on for the engine), and the Win32 code, like window creation and message handling, is now dealt with in the Source.cpp file alongside the entry point of the program.

My plan with this change is not only to make it easier for the programmer to add their own content to render, but also to let me implement different rendering APIs like DirectX 12 or OpenGL, as per my advanced features list. I would only have to add a new Graphics class and slightly modify the NoobieEngine class to accommodate the variations different APIs bring, without changing the workflow of programming the engine. In its current state, if the user wishes to add meshes, move objects or in any way modify the scene, all of this can be done within this new class without having to go anywhere near unrelated code.
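A rough sketch of the split described above; beyond the D3D11Graphics and NoobieEngine names from the post, the methods and members here are hypothetical, not the engine's actual interface.

```cpp
#include <memory>
#include <utility>

// Abstract backend: the engine only ever talks to this interface, so a
// D3D12 or OpenGL class could slot in later without touching engine code.
class Graphics {
public:
    virtual ~Graphics() = default;
    virtual bool Init() = 0;
    virtual void Draw() = 0;
};

class D3D11Graphics : public Graphics {
public:
    bool Init() override { initialised = true; return true; } // would set up D3D11 here
    void Draw() override { ++drawCalls; }                     // would issue draw calls here
    bool initialised = false;
    int drawCalls = 0;
};

class NoobieEngine {
public:
    explicit NoobieEngine(std::unique_ptr<Graphics> gfx) : mGraphics(std::move(gfx)) { }
    bool Init() { return mGraphics->Init(); }
    void RunFrame() { mGraphics->Draw(); }
private:
    std::unique_ptr<Graphics> mGraphics;
};
```

The engine owns the backend through the abstract base, so swapping APIs means adding one new derived class rather than editing the engine itself.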

Changes to Materials

At one point each mesh had a pointer to a material; however, for some reason I had changed this and had been using a single material pointer in D3D11App. I have now reached the point where the meshes I am using may well require different materials, and eventually these will be imported with the FBX SDK, so I have added the pointer back into the mesh class and removed it from D3D11App.

Additionally, materials now have a pointer to their diffuse texture and I am no longer manually swapping the texture from mTextures. This means that my UpdateMaterial method now only requires the device context and a material. This is a far more sensible way of dealing with materials and textures as the imported textures are always being used with a material anyway.

When I start using bigger and bigger scenes that contain many materials, I will be able to sort my renderables by the material they use. Updating the GPU's resources is a costly procedure, so by grouping all meshes by material I will only need to update that data the minimum number of times necessary, as I will only have to update the resources once we have finished rendering everything that uses one material. Reducing the number of times we update the GPU will, in theory, lead to increased performance.
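The grouping idea could look something like this; the types are simplified stand-ins for the engine's classes, and the bind-counting helper exists only to make the saving measurable:

```cpp
#include <algorithm>
#include <functional>
#include <vector>

struct Material { int id; };
struct Renderable { const Material* material; };

// Group renderables by material so each material's GPU state is bound once.
void SortByMaterial(std::vector<Renderable>& renderables) {
    std::stable_sort(renderables.begin(), renderables.end(),
        [](const Renderable& a, const Renderable& b) {
            // Pointer identity is enough for grouping purposes.
            return std::less<const Material*>{}(a.material, b.material);
        });
}

// Count how many material (re)binds drawing in this order would cost.
size_t CountMaterialBinds(const std::vector<Renderable>& renderables) {
    size_t binds = 0;
    const Material* last = nullptr;
    for (const auto& r : renderables) {
        if (r.material != last) { ++binds; last = r.material; }
    }
    return binds;
}
```

Four renderables alternating between two materials cost four binds in submission order, but only two once sorted.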

In the future, as I add support for more texture types in the material, I would like each map slot to reference both a texture and an alternative colour, like how materials in Unity 3D work. This will allow me to set a colour for a slot when I do not intend to use a texture there, choosing which to use depending on whether the texture is set.
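A minimal sketch of such a slot, assuming Unity-style fallback behaviour (all names here are hypothetical):

```cpp
struct Colour { float r, g, b; };

struct Texture {
    Colour Sample() const { return { 1.0f, 1.0f, 1.0f }; } // placeholder sample
};

// A material map slot: use the texture if one is assigned,
// otherwise fall back to a flat colour.
struct MaterialSlot {
    const Texture* texture = nullptr;        // optional texture for this map
    Colour fallback = { 1.0f, 1.0f, 1.0f };  // colour used when no texture is set

    Colour Resolve() const {
        return texture ? texture->Sample() : fallback;
    }
};
```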

Textured Sponza Render

In its current state, the rendering engine can take in an FBX and use its mesh data, and can load either 24- or 32-bit TARGA files to be used as textures. At this point I want to finally get a textured render of the Sponza scene, and the only thing stopping that from happening is not being able to get texture data from my FBX file. So, for this instalment, I decided to take the long and tedious route of separating the meshes and importing all the parts one by one.

Originally, I had intended to work with the FBX SDK to figure out how its materials and textures work; however, after a few long hours wasted going around in circles, I had got nowhere.

The solution I eventually landed on involves the following process. (Sadly, I did not get any screenshots during the process)

Firstly, I matched up the meshes in the Max file with the textures they were using. I did this by dragging on the texture I thought was correct, using trial and error, then adding the texture name to the end of the object name. To my knowledge there is no way in 3DS Max to populate the Material Editor with the materials in the scene, so this was the only way I knew. It was an arduous task.

Secondly, I attached together all the objects that were using the same texture. This was easy to do using Select by Name in Max, selecting all the objects with the same texture suffix that I set in the first step, then attaching them in the Editable Poly Modify tab.

Finally, I exported all the meshes that had been joined by texture as FBX files, with the texture name as the suffix.



After all of this, all that was left was adding the various lines of code to the start function of the program to load the meshes and their textures into their arrays (in the same order), then adjusting the rendering code to change the texture inside the for loop. And, of course, not forgetting to place the all-important camera at a cool and jaunty angle: we are finally ready to get a render (or two).



Important Lessons

A major discovery I have made is why people make sure their pointers start out as NULL and are set back to NULL after deleting: if a destructor is somehow called before an object has been fully initialized, some pointers may still hold garbage values, and deleting those crashes. Deleting a null pointer, on the other hand, is a well-defined no-op. I will at some point go through and apply this to all my classes just in case.

    delete pObject;
    pObject = nullptr;
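A minimal sketch of how this might look applied to a class (the class and member names are hypothetical, not the engine's actual ones); the important part is that the pointer is initialised to nullptr in the constructor so the destructor is always safe to run:

```cpp
struct Texture { };

class Mesh {
public:
    Mesh() : mTexture(nullptr) { } // initialise so the destructor is always safe

    ~Mesh() {
        delete mTexture;     // deleting a null pointer is a well-defined no-op
        mTexture = nullptr;  // guard against accidental double-deletes
    }

    Texture* mTexture;
};
```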

Additionally, I changed mTexture to be an array of textures. In the future I may use either a list or a dictionary and have the texture index referenced in the materials; however, for this render I only needed the simple array.

Finally, I removed the need for my axis conversion function and the backwards polygon reading inside the LoadFBX method by using FBX's FbxAxisSystem::DirectX.ConvertScene(scene) function. This tells FBX to convert the scene to the Y-up axis system that DirectX uses, as opposed to whatever system it was exported with. (This does make me wonder why, when importing an FBX into Unity3D, Unity does not perform the same conversion, as I have always had to rotate my meshes' pivots.)

Targa: 24-bit Texture Support

At this point in the project I just want to have a render of the Sponza scene with all its textures applied, as it is now close to the end of the project and I am getting excited to see the whole thing come together.

I ran into a problem with the provided textures, which were all in JPEG format, which the engine cannot import; I could only get GIMP to export TARGAs as 24-bit, which was also not supported. I made the sensible choice and added support for 24-bit Targas.

The method

Initially I had to convert the textures to .tga in GIMP, which took far longer than I had anticipated. In Photoshop I could have set up a simple macro, but I rarely use GIMP, so that was not something I knew how to do, and I had to do it the old-fashioned way and manually export each file.

Then I started modifying the importer in my content manager. The first port of call was removing the if statement that threw an error if the texture's bit depth was not 32; I replaced this with a switch block and placed the existing colour-reversing loop under case 32.

To make the data usable as a texture in DirectX I had to add alpha data to the texture, inserting a new byte after every three existing ones. To do this I created a new array of chars of size width * height * 4, which I fill using the data from the file, simultaneously swapping the blue and red channels as I do with the 32-bit files.

This ended up being very simple, as I just iterate through one index variable for the 24-bit texture and one for the 32-bit texture at the same time, advancing them by 3 and 4 respectively. Realizing I could do this in a single for loop was helpful, and I am sure I will be able to use the same trick elsewhere in similar situations. I then fill in the new, larger array using the 3 existing bytes and set the 4th byte to 255, completely opaque.

The loop is as follows:

for (size_t tex24 = 0, tex32 = 0;
    tex24 < width * height * 3;
    tex24 += 3, tex32 += 4)
{
    imgData[tex32 + 0] = rawData[tex24 + 2]; // R (TGA stores BGR)
    imgData[tex32 + 1] = rawData[tex24 + 1]; // G
    imgData[tex32 + 2] = rawData[tex24 + 0]; // B
    imgData[tex32 + 3] = (char)255;          // A, fully opaque
}

Side note Bugfix

I was running into an error where, if textures were a certain size, the width or height would end up negative; this was because the header bytes were plain (signed) chars, so the low bytes were being sign-extended when combined. To fix this I cast the individual parts to unsigned chars before performing the bitwise maths. This uses more lines and involves four temporary variables, but it makes what is happening much clearer and fixes the bug.

unsigned char widthLo = buffer[12];
unsigned char widthHi = buffer[13];
unsigned char heightLo = buffer[14];
unsigned char heightHi = buffer[15];

// Combine the unsigned bytes into the 16-bit dimensions
header.width = widthHi << 8 | widthLo;
header.height = heightHi << 8 | heightLo;

Also, it is important when exporting to Targa in GIMP that you deselect the compression option; leaving it checked led to a few wasted hours of debugging the 24-bit converter, as the resulting image looked like this:


The FBX SDK – FBX Importing

Autodesk’s FBX is a common file format used when exporting 3D model data for use in games and other 3D applications; as you’re reading this, I assume you have already worked with it, so I shan’t go into detail about it. I have used it over the last few years with 3DS Max, Unity 3D and Unreal Engine 4. Autodesk provides the SDK on their website for free, and this is what I have decided to use to import meshes into the engine. Learning the basics was a rather steep learning curve, as the documentation can be cryptic and hard to navigate, so I would like to use this post as more of a tutorial, showing you how I used it and what I did wrong.

Installation and Setup

The SDK installer can be found on Autodesk’s website. On the downloads page, grab the version appropriate for your project; I am using FBX SDK 2017.1 VS2015 for Windows (vanilla, not UWP). For the sake of simplicity, I left it with the default installation location on my C drive.

Linking and Including

To use the SDK, you must link it into your project. Open your project’s Properties window > VC++ Directories > Library Directories and edit the string. Here you must add the location of whichever library is right for your build settings.


Here is the path I use for 32-bit/Debug, but you can repeat the process to add the other libraries for x64 and Release builds.

You must also have the provided libfbxsdk.dll in the build directory for the compiled code to run. At the end of this post there’s a little tip to automate this, like with the linked libraries.

The FBX Format

The FBX format uses a node hierarchy, like how a scene in Unity3D is structured. There is a root/scene node which has a tree of other nodes beneath it. Nodes can have attributes such as lights, meshes and splines, among many others; here we are interested in meshes.

Node Hierarchy

The hierarchy looks roughly like this:

Scene ¬
    Node : Mesh ¬
        Node : Mesh
        Node : Mesh
    Node ¬
        Node : Camera ¬
            Node : Light

You can navigate through the tree and see what attributes each node contains using loops or recursion, whichever you are comfortable with. I iterate through the hierarchy recursively to find all nodes with FBX meshes attached and add them to an array, as I am more comfortable using recursion in this case.
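The recursive pattern looks roughly like this; a stub type stands in for the SDK's FbxNode (the real code asks each node's attribute whether it is a mesh rather than checking a bool) so the shape of the traversal is clear:

```cpp
#include <vector>

// Stand-in for FbxNode: the real SDK exposes GetChildCount()/GetChild()
// and node attributes in the same spirit.
struct Node {
    bool hasMesh;
    std::vector<Node*> children;
};

// Recursively collect every node in the hierarchy that carries a mesh.
void CollectMeshNodes(Node* node, std::vector<Node*>& out) {
    if (node->hasMesh)
        out.push_back(node);
    for (Node* child : node->children)
        CollectMeshNodes(child, out);
}
```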

FBX Mesh

To keep this simple, say we only need to know the positions of the vertices. We do not care about normals or UV co-ordinates for the time being; using that data requires only a small addition to the loops, and the functions are very similar.

FBX uses “Control Points”, “Polygons” and “Polygon Vertices”.


Each polygon vertex is an index into the control-point list. For example:

Polygon   Polygon Vertices
   1        1, 3, 2
   2        2, 3, 4
   3        3, 5, 4
   4        3, 6, 5
   5        1, 6, 3
A simplified version of the code I use to copy the data is as follows:

int pgCount = mesh->GetPolygonCount();                // Number of polygons
Vertex* vertices = new Vertex[pgCount * 3];           // Assume only triangles
FbxVector4* controlPoints = mesh->GetControlPoints(); // List of position vectors

// Loop through all polygons
for (int currentPg = 0; currentPg < pgCount; currentPg++)
      // Loop through the vertices in the polygon
      for (int currentV = 0; currentV < 3; currentV++)
            // Get the position of the current vertex (reversing the winding order)
            vertices[currentPg * 3 + currentV].pos =
                  controlPoints[mesh->GetPolygonVertex(currentPg, 2 - currentV)];

I then construct a mesh and initialize it with this vertex data which is ready to render!

Resource Management

While working on this milestone I had to iterate on my test meshes to fix rotations and scaling, and continually having to copy my textures and meshes into the build folder got very repetitive. After a quick Google search and some poking around in the project properties, I discovered “Build Events”. In Visual Studio you can add commands that are executed before, during or after compiling; I added a little command that copies both my resources and my DLLs into the build directory after the code compiles. This also means I won’t forget to drag the updated files into the build folder, which previously led to time sunk looking for non-existent bugs.

Simply adding the two commands xcopy /Y /E "$(ProjectDir)res" "$(OutDir)res" and xcopy /Y /E "$(ProjectDir)dll\x86\Debug\*" "$(OutDir)" to the post-build events will copy the \res\ folder and the contents of the \dll\ folder into the build location. It will do this for both Debug and Release builds.

This little trick is a convenient quality-of-life addition that can be used in all sorts of projects, and I thought it would be cool to share it in case any readers have not seen it before.

PSA Check Function Parameters

I will finish with a short PSA: when you change the parameters of a function, do not forget to update every call that uses it! A change I made to the Render function on the Lit shader was that instead of taking the World, View and Projection matrices, it now takes the combined WorldViewProj matrix, the inverse WorldViewProj, and the World matrix by itself.

I made the mistake of reverting to the old setup while writing the new OnRender method. All three parameters are still matrices, so it compiled; however, understandably, nothing rendered, as the shader was receiving completely the wrong values, and I lost about six hours rewriting and debugging the import code before I found the cause of the issue.

Thank you

Thank you very much for reading so far and I hope that this little guide will help you.

If you have any questions, feel free to write a comment or contact me at my email (listed on my GitHub profile).