July 6, 2014

I Don't Care Anymore

This weekend, I went to a convention and bought a ton of stuff. This was unusual for me, because I am not normally someone who invests in materialism. Having a lot of stuff is not something that is important to me. Two things changed this year: I now have an obscene amount of disposable income, and I stopped caring about what other people think is important, because they don't care about what I think is important.

Respect is, inherently, a two-way street. When we are talking about things that are inherently subjective, like what kind of food I enjoy eating, or what books I like to read, I will not respect your opinion if you refuse to respect mine. There is absolutely no reason for me to care about what you think if you don't care about what I think. On the flipside, if you respect my opinion, but disagree with me, I will also respect your opinion, even if I disagree with it.

I buy art prints because I want to support artists. I want to support artists because I think art is more important than anything else, but few people share this view. Furthermore, most people don't care what I think, and expect me to conform to whatever vision of importance they subscribe to.

Before, I grudgingly tried to give society the benefit of the doubt. I tried to show respect to other people's admittedly bizarre concept of what is important in a human life, in the hopes that this respect would be reciprocated. It is now clear to me that society at large is dumber than a rock and isn't worth my time, so I'm not even trying to appease it. I simply don't care anymore.

I do not have time for meaningless debates about my life choices. If people don't understand how I spend my money or my time, it's because they don't see the world the way I see it. They don't value the same things I value. This does not automatically make me wrong, it makes me different, and I don't give a shit about stupid ideological bullcrap. I don't care about what they think is important because they don't care about what I think is important. They can sit there all day long, writing stupid comments about how I am wasting my talents, or how I should join a startup, or work for some company they love, or how I'm depriving humanity of some stupid thing I don't care about. I don't care.

I may like art that you think is stupid. I don't care, I like it, and I think it's important. I think every facet of human diversity is a beautiful thing that should be encouraged instead of brutally stamped out in elementary school. I think creativity is what makes us human, and what will ultimately be our last useful skill after robots have taken over everything else. I think we have better things to do than argue about shows for little girls.

I'm going to do everything in my power to support those artists, because other people won't. I will spend my entire life fighting with every fiber of my being for better welfare and support for artists that live in poverty. These artists aren't poor because they're lazy, they're poor because people won't support them. They're poor because society doesn't think they're important.

But I do, and you can be damn sure I'm going to do something about it.

June 28, 2014

How To Make Your Profiler 10x Faster

Frustrated with C profilers that are either so minimal as to be useless, or giant behemoths that require you to install device drivers, I started writing a lightweight profiler for my utility library. I already had a high precision timer class, so it was just a matter of using a radix trie that didn't blow up the cache. I was very careful about minimizing the impact the profiler had on the code, even going so far as to check if extended precision floating point calculations were slowing it down.
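
The timer side of this is straightforward on Windows: cache the counter frequency once, then convert QueryPerformanceCounter deltas into time. A minimal sketch of that idea (not the library's actual cHighPrecisionTimer, whose interface differs) looks something like this:
// Minimal sketch of a QueryPerformanceCounter-based timer (Windows only).
#include <windows.h>

class SimpleTimer
{
public:
  SimpleTimer() { QueryPerformanceFrequency(&_freq); QueryPerformanceCounter(&_start); }
  void Restart() { QueryPerformanceCounter(&_start); }
  // Microseconds elapsed since construction or the last Restart().
  double Elapsed() const
  {
    LARGE_INTEGER now;
    QueryPerformanceCounter(&now);
    return (now.QuadPart - _start.QuadPart) * 1000000.0 / _freq.QuadPart;
  }

private:
  LARGE_INTEGER _freq;
  LARGE_INTEGER _start;
};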

Of course, since I was writing a profiler, I could use the profiler to profile itself. By pretending to profile a random number added to a cache-murdering int stuck in the middle of an array, I could do a fairly good simulation of profiling a function, while also profiling the act of profiling the function. The difference between the two measurements is how much overhead the profiler has. Unfortunately, my initial results were... unfavorable, to say the least.
BSS Profiler Heat Output: 
[main.cpp:3851] test_PROFILE: 1370173 µs   [##########
  [code]: 545902.7 µs   [##########
  [main.cpp:3866] outer: 5530.022 ns   [....      
    [code]: 3872.883 ns   [...       
    [main.cpp:3868] inner: 1653.139 ns   [.         
  [main.cpp:3856] control: 1661.779 ns   [.         
  [main.cpp:3876] beginend: 1645.466 ns   [.         
The profiler had an overhead of almost 4 microseconds. When you're dealing with functions that are called thousands of times a second, you need to be aware of code speed on the scale of nanoseconds, and this profiler would completely ruin the code. At first, I thought it was my fault, but none of my tweaks seemed to have any measurable effect on the speed whatsoever. On a whim, I decided to comment out the actual _querytime function that was calling QueryPerformanceCounter, then run an external profiler on it.
Average control: 35 ns
What?! Well, no wonder my tweaks weren't doing anything; all my code was taking a scant 35 nanoseconds to run. The other 99.9% of the time was spent on that single, stupid call, which also happened to be the one call I couldn't get rid of. However, that isn't the end of the story; _querytime() looks like this:
void cHighPrecisionTimer::_querytime(unsigned __int64* _pval)
{
  // Save the current affinity mask so it can be restored afterwards
  // (DWORD_PTR so the mask isn't truncated on systems with more than 32 logical processors).
  DWORD_PTR procmask = _getaffinity();
  HANDLE curthread = GetCurrentThread();
  // Pin this thread to the first core before sampling the counter...
  SetThreadAffinityMask(curthread, 1);

  QueryPerformanceCounter((LARGE_INTEGER*)_pval);

  // ...then restore the original affinity mask.
  SetThreadAffinityMask(curthread, procmask);
}
Years ago, it was standard practice to wrap all calls to QueryPerformanceCounter in a CPU core mask to force it to operate on a single core due to potential glitches in the BIOS messing up your calculations. Microsoft itself had recommended it, and you could find this same code in almost any open-source library that was taking measurements. It turns out that this is no longer necessary:
Do I need to set the thread affinity to a single core to use QPC?
No. For more info, see Guidance for acquiring time stamps. This scenario is neither necessary nor desirable.
I couldn't get rid of the QueryPerformanceCounter call itself, but I could get rid of all that other crap it was doing. I commented it out, and voilà! The overhead had been reduced to a scant 340 nanoseconds, only a tenth of what it had been before. I'm still spending 90% of my calculation time calling that stupid function, but there isn't much I can do about that. Either way, it was a good reminder about the entire reason for using a profiler - bottlenecks tend to crop up in the most unexpected places.
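
What's left of _querytime() after the surgery is essentially a one-liner (sketched here with the commented-out affinity calls removed entirely):
void cHighPrecisionTimer::_querytime(unsigned __int64* _pval)
{
  // No more affinity juggling; on current Windows versions QPC is safe to call from any core.
  QueryPerformanceCounter((LARGE_INTEGER*)_pval);
}
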
BSS Profiler Heat Output: 
[main.cpp:3851] test_PROFILE: 142416 µs   [##########
  [code]: 56575.4 µs   [##########
  [main.cpp:3866] outer: 515.43 ns   [....      
    [code]: 343.465 ns   [...       
    [main.cpp:3868] inner: 171.965 ns   [.         
  [main.cpp:3876] beginend: 173.025 ns   [.         
  [main.cpp:3856] control: 169.954 ns   [.         
I also tried adding standard deviation measurements, but that ended up giving me ludicrous values of 342±27348 ns, which isn't very helpful. Apparently there's quite a lot of variance in function call times, so much so that while the averages always tend to be the same over time, the statistical variance goes through the roof. This is probably why most profilers don't include the standard deviation. I was able to add in accurate unprofiled code measurements, though, and the profiler uses a dynamic triple magnitude method of displaying how much time a function takes.
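
For the record, one standard way to accumulate a running mean and variance without storing every sample is Welford's online algorithm; here is a minimal sketch of it (not necessarily how the profiler does its bookkeeping internally):
// Welford's online algorithm: update a running mean and the sum of squared
// deviations (m2) one sample at a time. Variance is m2 / (count - 1).
void welford_update(double sample, unsigned __int64& count, double& mean, double& m2)
{
  ++count;
  double delta = sample - mean;
  mean += delta / count;
  m2 += delta * (sample - mean);
}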

The Black Sphere Studios Utility Library for C/C++ is a collection of classes and functions designed for realtime, high-performance applications. It is lightweight, flat, and minimalistic, and if you only use the header-only template code, you won't even need to include the DLL in your project.

April 20, 2014

Complexity Is Irreversible

Once again, programming language flame wars are erupting over the internet. This latest one gives us a helpful list of "harmful things" and "less harmful things". Unfortunately, I felt that it was a little inaccurate, so I decided to improve it:

Harmful things           | Less Harmful Things
Things That Exist        | Things That Don't Exist

Just to clear up any confusion, here is a helpful diagram:

[This diagram intentionally left blank]

As many very smart people have pointed out over the years, the ultimate enemy of software development is complexity. Unfortunately, you can't get rid of complexity if the thing you are making is inherently dependent on something complicated. I think most people will agree that real life is a pretty complicated place, and the human brain is an awfully complex irrational thing that is trying to interact with real life.

This can't end well, and it doesn't. When we try to do anything, we wind up with something complicated. This isn't because all problems are complicated. The issue here is that in order to push a button on your calculator application, you have to go through a UI thread to the software through 10 different functions each calling into the operating system which is on top of the kernel which is on top of the microkernel which is on top of the BIOS which is on top of the motherboard which is controlling the CPU which is built using microcode which is executed using a bunch of extremely tiny logic gates on a chip.

Now, we could build a computer whose entire purpose is to let us push a button on our calculator application, and it would be much simpler! All we would need is a button command built into the motherboard to a RISC CPU that operates a few logic gates on a chip. This mysterious object is called a calculator. You know, the real, physical calculators that nobody uses anymore.

The reason we don't use them anymore is that we need our computer to do a lot more than simply be a calculator. This is why everything is so complex - if you want something that does several different simple tasks, you will end up with something complicated, even if the tasks are themselves simple. We can, of course, still pretend that everything is simple, just as the OS abstracts away all the low-level nonsense required to make your computer work properly, like setting that goddamn A20 gate on the CPU every time it boots. However, simply pretending something doesn't exist does not make it go away, which is why that stupid A20 gate still causes mysterious boot errors. Hence, any simple program written in any language imaginable is still subject to mysterious bugs and memory leaks that don't make any sense because they are caused by the inherent complexity of the system they are built on top of. You can't un-complicate something. Complexity is irreversible.

Nothing you do will fix this. If you use C to try and keep everything simple, you'll destroy the entire internet's secure infrastructure because of a completely retarded mistake involving an asinine and ultimately useless memory allocator. But if you use a managed language, before long you're writing a UI in XAML and suddenly you're looking at a call stack the size of Mount Everest. What the fuck.

Despite this, there is no end to the deluge of misdirected attempts at "solving" or "reducing" complexity, impossible as the task is. What happens is that a programmer makes a bunch of mistakes and notices a pattern. Because they're a programmer, they immediately invent a language that prevents them from making those mistakes, and then claim it's the solution to all the world's problems, while conveniently forgetting that the rest of the world has completely different problems.

But, I digress. My point is that the only way to write a piece of software that doesn't break is to hire someone who's so incredibly smart they can deal with all this complexity.

The problem is, no one is that smart.

April 6, 2014

Fiction

I was reading a story on the plane today. It was the tale of a terrible war, a battle between two civilizations, each bent on the destruction of the other. It spoke of barbaric acts, of unspeakable horrors, of cruelty and pain of such magnitude that it could only exist in a place devoid of morality.

None of it is real. No one is really dying, no one is having their heart broken, no one's life is really being destroyed. And yet, it bothers me. It bothers me because I know such tales of war were not composed in a vacuum. The power that story holds over me does not come from the imaginary characters it paints, but from the real people it is based on. The lives that really were lost, the tragedies that tore us apart, the chapters of human history most of us would prefer to forget.

I sometimes find it difficult to keep reading, to discover the horrors I know are lying in wait for our beloved protagonist. With each tragedy that befalls them, I find myself feeling sorry for the character in the story, even though they aren't real. Yet, I'm also feeling sorry for the millions of people I have never met who suffered the same fate. It's difficult to continue because every chapter reminds me of the perils of human existence. I'm not sure there is a happy ending to this tale, or if there is, what the tremendous costs might be.

Perhaps we seek happy endings in our stories because we care about these imaginary characters that have been invented for our benefit. Or perhaps it is because we desperately want to believe that our lives also have a happy ending. We project our troubles and battles into what we read, wanting to believe that we can be the protagonist in our own tales.

This story speaks of tremendous struggles, of soldiers who lost everything and fought to save their nation from being lost to the sands of time. It is brutally effective in reminding us of what we truly hold dear, of what really matters in the end. It puts you in the armor of a new recruit who is suddenly left wondering if he should have spent more time enjoying life now that his could very well be extinguished in a moment. You watch a soldier die by the hand of his own commander simply because he refused to obey a command he knew was wrong.

You don't need a sword to tear the life out of your closest friend. A drunk driver could do the same. A rare disease. Cancer. We may frame heartbreak in many different ways, but the emotion stays the same. The pain of loss is something that transcends mere words, and we can feel its power even when we're reading about people who never existed.

Sometimes, stories are hard to read because they remind us too much of the world we were trying to escape in the first place. They remind us of everything—and everyone—we could lose.

And those... those are the most precious stories of all.

March 18, 2014

The Problem With Photorealism

Many people assume that modern graphics technology is now capable of rendering photorealistic video games. If you define photorealistic as meaning that any still frame is indistinguishable from a real photo, then we can get pretty close. Unfortunately, the problem with video games is that they are not still frames - they move.

What people don't realize is that modern games rely on faking a lot of stuff, and that means they only look photorealistic in a very tight set of circumstances. They rely on you not paying close attention to environmental details so you don't notice that the grass is actually just painted onto the terrain. They precompute environmental convolution maps and bake ambient occlusion and radiance information into level architecture. You can't knock down a building in a game unless it is specifically programmed to be breakable and all the necessary preparations are made. Changes in levels are often scripted, with complex physical changes and graphical consequences being largely precomputed and simply triggered at the appropriate time.

Modern photorealism, like the 3D graphics of ages past, is smoke and mirrors, the result of very talented programmers and artists using tricks of the eye to convince you that a level is much more detailed and interactive than it really is. There's nothing wrong with this, but we're so good at doing it that people think we're a heck of a lot closer to photorealistic games than we really are.

If you want to go beyond simple photorealism and build a game that feels real, you have to deal with a lot of extremely difficult problems. Our best antialiasing methods are perceptual, because doing real antialiasing is prohibitively expensive. Global illumination is achieved by deconstructing a level's polygons into an octree and using the GPU to voxelize moving objects in realtime. Many advanced graphical techniques in use today depend on precomputed values and static geometry. The assumption that most of the world is probably going to stay the same is a powerful one, and enables huge amounts of optimization. Unfortunately, as long as we make that assumption, none of it will ever feel truly real.

Trying to build a world that does not take anything for granted rapidly spirals out of control. Where do you draw the line? Does gravity always point down? Does the atmosphere always behave the same way? Is the sun always yellow? What counts as solid ground? What happens when you blow it up? Is the object you're standing on even a planet? Imagine trying to code an engine that can take into account all of these possibilities in realtime. This is clearly horrendously inefficient, and yet there is no other way to achieve a true dynamic environment. At some point, we will have to make assumptions about what will and will not change, and these sometimes have surprising consequences. A volcanic eruption, for example, drastically changes the atmospheric composition and completely messes up the ambient lighting and radiosity.

Ok, well, at least we have dynamic animations, right? Wrong. Almost all modern games still use precomputed animations. Some fancy technology can occasionally try to interpolate between them, but that's about it. We have no reliable method of generating animations on the fly that don't look horrendously awkward and stiff. It turns out that trying to calculate a limb's shortest path from point A to point B while avoiding awkward positions and obstacles amounts to solving the Euler-Lagrange equation over an n-dimensional manifold! As a result, it's incredibly difficult to create smooth animations, because our ability to fluidly shift from one animation to another is extremely limited. This is why we still have weird looking walk animations and occasional animation jumping.
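
For reference, the classical form of the Euler-Lagrange equation is below; on an n-dimensional configuration space you get one such equation per generalized coordinate, which is exactly what makes solving it in realtime so unpleasant. (The Lagrangian L here is a stand-in for whatever cost is being minimized, say path length plus joint strain and obstacle penalties; that reading is my gloss, not any particular animation system's formulation.)

\frac{d}{dt}\left(\frac{\partial L}{\partial \dot{q}_i}\right) - \frac{\partial L}{\partial q_i} = 0, \qquad i = 1, \dots, n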

The worst problem, however, is that of content creation. The simple fact is that at photorealistic detail levels, it takes way too long for a team of artists to build a believable world. Even if we had super amazing 3D modeling tools that let an artist craft any small object in a matter of minutes (which we don't), artists aren't machines. Things look real because they have a history behind them, a reason for their current state of being. We can make photorealistic CGI for movies because each scene is scripted and has a well-defined scope. If you're building GTA V, you can't come up with three hundred unique histories for every single suburban house you put in the world.

Even if we did invent a way to render photorealistic graphics, it would all be for naught until we figured out a way to generate obscene amounts of content at incredibly high levels of detail. Older games weren't just easier to render, they were easier to make. There comes a point where no matter how many artists you hire, you simply can't build an expansive game world at a photorealistic level of detail in just 3 years.

People always talk about realtime raytracing as the holy grail of graphics programming without realizing just what is required to take advantage of it. Photorealism isn't just about processing power, it's about content.