

Why Is My iPhone Next to a Lake?

Tue, 30 Aug 2011

Well, I left my iPhone next to a bathroom sink while camping and didn't notice for a couple of hours. When I went back for it, it was gone.

Fortunately, I'd done everything I could to prepare for this day by locking down my phone and setting up Mobile Me. (Read: I had the foresight to anticipate future lapses of intelligence, heh.) I was able to download Find my iPhone on Christine's iPhone and track it to the side of a lake about a kilometer away... huh?

We decided to go after it with the plan of remotely sounding the alarm when we got close. There is no way to stop that sound except for turning off the phone or entering the passcode and we knew the phone was on as it was reporting its position. We got into the car and set out to recover my property.

When we got there, it turned out the phone had been handed in to a lost and found, so that was the end of the hunt. If you have an iPhone, be sure to use a letter-based passcode instead of the stupid numeric PIN, which does not encrypt your email and can be brute-forced in seconds. They should get rid of that feature entirely, as it generates a false sense of security.

And definitely set up Mobile Me and get Find my iPhone.

topic story | permlink

I'm Blogging Now

Tue, 30 Aug 2011

I love it when I come to someone else's blog and the only post they've put up in the past two years is something that says they're going to take blogging more seriously. But, wait! I am being more social all the time!

In fact, I've moved my developer-related blogging over to The Frogtoss Blog, which I am striving to update each day with the most important insight I've encountered. That makes this my personal blog, which I will continue to update at an extremely casual pace with stories about my personal life.

Follow this one if you are interested in me personally, and that one if you are just interested in the brain of an independent game developer.

Oh, and this is the first time I've linked The Frogtoss Blog from anywhere. I'm getting a few articles under my belt and making some refinements before I actually advertise it anywhere.

topic meta | permlink

Hardware is the basis of understanding rendering

Mon, 09 Aug 2010

Hardware is the basis of understanding rendering. Not numerical problems, not geometric problems and certainly not memorizing OpenGL or Direct3D APIs.

One of the first questions that needs to be asked when deciding to implement new graphical features is what needs to be implemented on the general-purpose CPU and what can be computed on a GPU (or even a set of SPEs). This question is impossible to answer without a fundamental understanding of the hardware you are programming for. The abstraction of your favorite API does less to mask this as more programmatic options become available. Consider:

Once you can competently speak on these points, you can start to devise a hypothesis about how to best divide your hardware resources to render a typical scene for your game.

At this point, you have a shot at guessing what data needs to be where in your pipeline and when it needs to be there. This is the point when numerical and geometric issues move into focus.

Learning OpenGL to learn graphics now considered harmful

OpenGL (with the exception of OpenGL ES) is a pool of functions, many of which superficially achieve the same results but with different approaches to dealing with bus latency. As memory latency becomes an increasingly acute problem in paring down hardware-implemented pipeline stalls, literature promoting immediate mode and display lists continues to be the most prominent information on OpenGL, in spite of the strong need for sizable data batches.

I'm always going to be a fan of getting stuff up on the screen quickly, and programming some sample apps in OpenGL does give you something to mentally reference when learning theory, but you aren't programming anything really interesting until you've understood the graphics pipeline and at least the timeline of how hardware acceleration has crept backwards through it over the past ten years.

Recommended Reading

Jim Blinn's Corner: A Trip Down the Graphics Pipeline - An old one that covers software rendering, but gives you a basis for understanding the graphics pipeline.

Real-Time Rendering - Currently in its 3rd edition. The modern replacement for Foley and van Dam. The chapters on performance and hardware combine with a fundamental understanding of the graphics pipeline to help you really understand what's going on.

Technical briefs for hardware prepared by NVidia and ATI for target hardware. This information is made available on their respective sites.

topic code | permlink

The Secret Sauce is Ketchup

Fri, 20 Feb 2009

The secret sauce is ketchup. Really good ketchup.

What makes a game great? These days, it needs to do a lot of things very well, or the user will be annoyed. But, we don't choose our entertainment such that we avoid annoyance. Some of the greatest games of all time have had very annoying experiences associated with them. As core gamers, we overcome annoyances in order to access our preferred form of entertainment. (Ever tried to get a modem game of Doom working back in the 90s?)

The Quake games are great games. Id and John have been well praised for the renderers and tech that underpin these classics. As outdated as the assumptions underlying the Quake technology may be in 2009, the games feel much more responsive and enjoyable to play than, well, a handful of the top titles from this console generation.

The secret sauce to making a game great is making the game respond well to its users. It's obvious and simple. And, like the ketchup on your dining room table, the ingredients are right out there for you to inspect. Where, you ask? Some of the greatest movement and control code in the history of games has been GPL'd and released by Id in their three Quake releases.

For all of the praise Id gets for their tech, their movement code is a scant few hundred lines, overlooked by most. In response to mouse and keyboard input, it glides the user through the level seamlessly, giving the user the expected return for every input. That's the secret sauce.

Behind making a responsive game lies a handful of techniques and principles. There are too many for one blog entry, but I'll address some of the technical considerations here, moving on to design in a future post.

Aim for 60 fps.

60 frames per second just makes your game feel more responsive. If you are making a current-gen console game, it is technically achievable, though it requires team-wide discipline. In return for that discipline, the user receives a viscerally responsive feedback loop.

I understand that there are many development scenarios where 60 is not plausible. Maybe your platform doesn't refresh well (I had this experience while developing a Flash 9 arcade game). Maybe your publisher won't allow you to commit to a game design and art direction that lets you target sixty.

If that's the case, you need to vsync lock at 30 without wavering. If Gears of War can look that good and hit a steady 30, why can't you?

And, if you're making a PC game, just wait a year or two and your game will fly on a $500 machine. Just don't do the dumb thing and lock your renderer at 30 so it can't take advantage of it, eh?

Commit to locomotion based movement at your own peril.

Use velocity-based movement unless you have strong animation talent. If your character moves through the world by calculating the model space displacement of its animations, your animation team is in charge of character movement and your programming team is not.

If you're doing an involved interaction system with doors to open, cover to hide behind and detailed reload animations, you need to work iteratively with your animation team to ensure the animations are interruptible, blend very quickly and can be updated quickly.

Alternatively, a velocity-based movement scheme allows you to assign acceleration impulses to an entity. On each tick, you extrapolate the current position using the velocity, executing collision and response. This is a procedural approach to character movement and it can feel very organic. With this approach, your animations do not change the rate of your player's movement.
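The velocity-based scheme can be sketched in a few lines. This is a minimal illustration, not Quake's actual code; the names (Entity, apply_impulse, tick) are invented for the example, and a real engine would run collision detection and response inside the tick:

```python
# Minimal sketch of velocity-based movement with a fixed tick length.

class Entity:
    def __init__(self):
        self.position = [0.0, 0.0, 0.0]
        self.velocity = [0.0, 0.0, 0.0]

    def apply_impulse(self, impulse):
        # The input layer translates key/stick state into impulses.
        for i in range(3):
            self.velocity[i] += impulse[i]

    def tick(self, dt):
        # Extrapolate position from velocity. A real engine would run
        # collision and response here before committing the move.
        for i in range(3):
            self.position[i] += self.velocity[i] * dt

player = Entity()
player.apply_impulse([2.0, 0.0, 0.0])  # accelerate along +x
player.tick(1.0 / 60.0)                # one 60 Hz tick
```

Note that the animations drive nothing here: movement comes entirely from the impulses and the integration step, which is exactly why the programming team stays in charge of feel.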

I'm not going to say locomotion based movement is bad. It can look better than velocity based movement. You just need to seat your animator and your programmer next to each other. Remain analytical and vigilant and you will come out on top.

As a bottom line, you need to make sure the player gets the movement response he wants and expects from all input at any time.

Sub-sample your digital input to allow for light taps on buttons.

This is a technique that isn't immediately obvious, but is difficult to argue against once you've considered the ramifications.

In a given tick, you can have up to two samples for a single button press, whether it's a keyboard key, an Xbox gamepad button or a mouse button. Test for button down and button up states — cut your output velocity by half if you get both.

There are two cases where this is useful:

First, consider that the user wants to move forward lightly. If a key is tested for down and not up state, you multiply the velocity by 1.0. However, if a key was tapped down and up in a tick, you multiply the velocity by 0.5. You can use this to allow half-height jumps in response to a light tap, for example.

The second point is more important. When a user's framerate dips to, say, 20 fps, he is likely to overshoot his goal by walking off a ledge or firing for too long. By cutting his velocity in half, you assist in reeling in the negative effects of the framerate dip. Your user would thank you for doing this, but he'll never know you did it. He'll just inexplicably like your game better than the competition's.
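The tap test above boils down to a tiny function. This is a sketch under the assumption that your input layer reports both "went down" and "went up" edges within a tick; the function name is hypothetical:

```python
# Sketch of per-tick button sub-sampling: a key held for the whole
# tick gets full velocity, a key tapped down-and-up gets half.

def forward_scale(pressed, released):
    """Return the velocity multiplier for a movement key this tick."""
    if pressed and released:
        return 0.5   # tapped within one tick: half speed
    if pressed:
        return 1.0   # held down: full speed
    return 0.0       # not touched

print(forward_scale(pressed=True, released=False))  # 1.0
print(forward_scale(pressed=True, released=True))   # 0.5
```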

Interpolate analog input, but only if your framerate is good.

Different mice and gamepads send analog updates at different speeds. How do you make sure you have a stick or mouse movement update ready for processing when it's time to queue up a new frame? Ya can't.

Even if your first person shooter is vertically synced and running at 120 FPS on a beautiful 24 inch widescreen CRT, you can pan a crappy Radioshack serial mouse around your scene and watch the camera chunk at 20 updates per second. The rest of your in-game animations may seem smooth, but you won't notice because the panning rate is as crappy as your mouse.

A common approach to dealing with this problem is interpolating the mouse from the last known input and the one before it. This means you are always a percentage of the way between 2 known mouse samples.
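The interpolation itself is a one-liner per axis. In this sketch, `t` is how far the current frame sits between the two samples (0 to 1); a real implementation would derive it from the sample timestamps:

```python
# Sketch of interpolating between the two most recent mouse deltas.

def interpolated_delta(prev_sample, last_sample, t):
    """Blend two (dx, dy) mouse samples; t=0 is the older sample."""
    dx = prev_sample[0] + (last_sample[0] - prev_sample[0]) * t
    dy = prev_sample[1] + (last_sample[1] - prev_sample[1]) * t
    return (dx, dy)

# Halfway between a (4, 0) delta and an (8, 2) delta:
print(interpolated_delta((4, 0), (8, 2), 0.5))  # (6.0, 1.0)
```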

If you are running at 60 fps, this is acceptable for 99.5% of the population. Even if you're 20 years old and hyperperceptive, you're not going to be able to determine that you are a quarter of a mouse delta away from the truth.

If you're running at 25 fps, a mouse or gamepad starts to feel laggy. 25 fps is a new frame every 40 milliseconds, and you are partially between the mouse delta received 40 milliseconds ago and the one received 80 milliseconds ago. That sucks, and perceptibly so.

Unfortunately, unless your framerate is a steady 60 fps, you do not want to filter your inputs. If you are making a PC game, make input filtering an advanced user choice. And, please, think about it - it does not make sense to automatically disable filtering on a framerate threshold or the user will perceive a small mouse leap when you toggle the filtering.

Oh, and if you're a gamer, buy mice that refresh quickly. This is basically resolved by buying a quality, modern gamer mouse. In the old days, we had to seek out specific mice and run custom programs to bump up the refresh rate. Good riddance.

Avoid uneven framerates by avoiding an uneven distribution of work.

When developing a game, there are lots of temptations to halve the processor demand for a calculation by performing it every other frame.

"It's taking our AI 5 milliseconds to assess all of the threats in the world. Let's run that on even frames only!" This is a naive but common suggestion heard in game programming.

Unfortunately, if your AI runs its threat test on every other frame, your frame time fluctuates by 5 milliseconds. Do that enough and the game will have a difficult-to-describe, hitchy feel.

Some modern game perf tools and profilers work on a per-frame basis and this is an excellent way to have your code evade performance analysis tests.

Find another way.
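One such way, sketched here as an assumption rather than a prescription: amortize the threat test by touching a fixed-size slice of entities every frame instead of all of them on even frames. The per-frame cost stays level, which is the whole point:

```python
# Round-robin time-slicing: each frame processes 1/slices of the
# entities, so the work is spread evenly instead of spiking.

def slice_for_frame(entities, frame, slices):
    """Return the entities to update this frame, round-robin."""
    return [e for i, e in enumerate(entities)
            if i % slices == frame % slices]

entities = list(range(10))
# With 2 slices, each frame assesses half the entities, every frame.
print(slice_for_frame(entities, frame=0, slices=2))  # [0, 2, 4, 6, 8]
print(slice_for_frame(entities, frame=1, slices=2))  # [1, 3, 5, 7, 9]
```

Each entity's threat assessment is now slightly staler, but the frame time no longer sees a 5 millisecond sawtooth.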

Conclusion - Part One

In this first entry, I've discussed techniques you should use to make your game feel more responsive. The next part of this entry will go over design principles.

Amongst other points, we will look at how huge game productions miss the point of blood and damage effects in a shooter and miss the opportunity to make their game a beloved classic.

topic code | permlink

Scons Reaches 1.0

Sun, 24 Aug 2008

Scons, the Pythonic build tool, has reached version 1.0. Congratulations to the team, who have obviously worked very hard to make this happen.

I use Scons to achieve cross-platform C++ build scripting. Because it is written in Python, I can have all of my non-trivial projects derive from base classes that eloquently incorporate sets of libraries.
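As a rough illustration of that base-class idea — the class names, flags and libraries below are invented for the example, not from my actual build scripts — an SConstruct is just Python, so projects can inherit the libraries they need:

```python
# Hypothetical SConstruct sketch: project classes bundle their
# library sets, and concrete projects derive from them.
# (Environment and Glob are provided by SCons inside an SConstruct.)

class BaseProject:
    libs = ['m']

    def build(self, env, target, sources):
        env.Program(target, sources, LIBS=self.libs)

class SDLProject(BaseProject):
    # Derived projects extend the base library set.
    libs = BaseProject.libs + ['SDL']

env = Environment(CCFLAGS='-O2')
SDLProject().build(env, 'game', Glob('src/*.cpp'))
```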

I fully recommend Scons for cross platform and reference builds of your software.

Here is a quote from their website:

"SCons is a fantastic build system, written in Python (1.5.2) that does lots of nice things like automated dependencies, cross platform operation, configuration, and other great stuff. I would have to say that it is probably going to be the best thing for building C/C++ projects in the near future."

topic code | permlink
