Reynolds - A couple of lines wouldn't be enough to explain it all, but I'll try to keep it short. The 'General Network Setting Bundle' (GNSB) is how I manage my connection depending on the server, the class I'm playing, and my ping. You'll see how 'Interp Stepping' fits into the GNSB.
**Warning:** This turned out to be pretty long even with the oversimplifications; skip to the bottom if you don't care about the why/what.
Here's the quick and dirty of it. There's a delta between the packets received from the server and what the client displays, which means something can be happening according to the server that the client has not been told about yet and thus can't display. Interp assists in the prediction of the action during these brief information gaps. The easy way to imagine it is that interp tells the client how far to predict actions into the future (i.e.: where someone is moving, or the location of a rocket on its path). The default value is .1, which means the client is essentially predicting roughly 100ms into the future. This is why one sometimes gets hit in the face by a rocket that was seemingly dodged... in fact it was not dodged; the client's prediction of it was wrong, not in trajectory but in time. The examples are endless.
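For the curious, the window the client interpolates over isn't just `cl_interp`: the Source engine clamps it from below by `cl_interp_ratio / cl_updaterate`. A minimal sketch of that relationship (the function name here is mine, not an engine API):

```python
def effective_lerp(cl_interp: float, cl_interp_ratio: float, cl_updaterate: float) -> float:
    """Return the effective interpolation window ("lerp") in seconds.

    The engine uses whichever is larger: the explicit cl_interp value,
    or the ratio-derived floor of cl_interp_ratio / cl_updaterate.
    """
    return max(cl_interp, cl_interp_ratio / cl_updaterate)

# Stock settings: cl_interp .1 dominates, giving the ~100ms window described above.
print(round(effective_lerp(0.1, 2, 66), 4))     # 0.1
# The .0152 floor comes from one update at 66/sec: 1/66 is roughly .01515.
print(round(effective_lerp(0.0152, 1, 66), 4))  # 0.0152
```

This is also why .0152 is quoted as the practical minimum on a 66-tick server: you can't interpolate over less than one update interval.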
This is all gross oversimplification, but it paints a clear picture. Knowing the effects of interp, we can look at why we would want to use different values.
Generally speaking, projectile classes want to use the lowest interp possible (.0152 is the lowest, but rounding up to .02 helps relieve player skipping) so that the client representation of their own pipes/rockets arrives at the target when expected. This has the side effect of players not always being exactly where they appear, though within close proximity; however, splash damage and a close hitbox-to-model relationship negate the minute warping perceived as player positions are corrected by the low "prediction" time.
Hitscan classes (immediate damage - i.e.: scout, sniper, heavy) are a whole different ballgame. When one fires most of the basic guns (shotgun, scattergun, rifle, et cetera), their calculated direction is exactly where the client crosshair was aimed at that instant; the flecks one sees are not accurate representations, since they take time to travel from the player's gun to the point of impact. Understanding hitscan leads one to want the most accurate hitbox-to-model relationship possible. Sparing the math, logic, and arguments for different values, I'll present my usage. On a perfect server with a perfect connection and a constant 66+ fps one could use .0152; since those conditions are never met, approximating them as closely as possible is ideal. Using .02 is viable, but generally only when latency to the server is low: with both the "prediction" window and latency low, there will be fewer errors in prediction/placement. I personally try to use .033, since it accounts for the server dropping every other frame sent to me (which is horrible, and rare except on bad pubs with a ton of people) and is still fairly accurate on a constant 66-tick server, even one that is dropping frames due to player congestion. Any higher interp value is usually just a way to alleviate issues perceived by the client due to poor server performance or high latency.
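Put together, the two cases above can live in per-class config files. A sketch, using the values from the text (the `cl_interp_ratio` choices are my assumption, paired so the ratio floor doesn't override the intended value):

```
// soldier.cfg / demoman.cfg -- projectile classes: lowest workable interp
cl_interp 0.0152        // or 0.02 if player models skip
cl_interp_ratio 1       // ratio 1 keeps the floor at 1/updaterate
cl_updaterate 66

// scout.cfg / sniper.cfg / heavy.cfg -- hitscan classes: favor hitbox accuracy
cl_interp 0.033         // tolerates every other update being dropped
cl_interp_ratio 2
cl_updaterate 66
```

TF2 executes `<classname>.cfg` automatically on class switch, which is what makes a per-class bundle like the GNSB practical.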
Looking at the netgraph, one will see LERP (the interpolation window, displayed in milliseconds) in the middle of all the information presented. It will appear white, orange, or yellow; white and orange are usually acceptable, but one does not want yellow. Generally speaking, white LERP means that the client is predicting further ahead than its estimated need (which can be bad, as we've discussed, but is not necessarily a 'problem'). Orange means that the client is below the default window (<2/updaterate) and ahead of the server frames, but the prediction essentially matches the server's; this is only problematic on low bandwidth or during extended packet loss. The last option is pointless to explain, since one should never play with a LERP indicator that often appears yellow, even if it's just flashing back and forth with white.
**Cliff Notes:**
Projectile classes desire interp as low as possible without yellow LERP: .02 if possible.
Hitscan classes desire low interp but accurate hitboxes, still without yellow LERP: approx. .033 if possible.
Since not all PCs, connections, and servers are equal, one might find that .02 and .033 aren't achievable (usually on bad servers or giant pubs).
With the netgraph open, use 'Interp Stepping' to decrease (-) or increase (=) interp/LERP until it's as low as possible without turning yellow. (Remember, this assumes that .02 and .033 are already yellow.)
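For anyone who wants to reproduce the stepping binds, here is a sketch using the engine's `incrementvar` command, with the keys from the text; the step size and bounds are my assumptions, not necessarily the original GNSB values:

```
// 'Interp Stepping' sketch: nudge cl_interp up/down while watching LERP.
// Note: incrementvar wraps back to min once it passes max, so pick bounds deliberately.
bind "-" "incrementvar cl_interp 0.0152 0.1 -0.0051"   // step interp down
bind "=" "incrementvar cl_interp 0.0152 0.1 0.0051"    // step interp up
net_graph 1                                            // keep the LERP readout visible
```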
Beyond the technical, you want what feels most comfortable to you. Nearly every soldier and demo I know who competes at a high level uses .02 or .0152. High-level scouts all have their own preference: .02 for near-LAN ping, .033 for most comp servers, and .05 or .1 for bad servers/laggy pubs/high ping.
To fully experience the difference interp settings provide, download this map:
http://www.fpsbanana.com/maps/72782
Create your own server and shoot at the bots using an interp of .02 and then .1 (the default). The difference in perceived rocket/pipe launch versus contact with the bots' models should be stark.
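If it helps, the comparison can be run from the console without restarting, along these lines (cvar names are real; the workflow is just my suggested A/B test):

```
// On your listen server, with the map from the link above loaded:
net_graph 1       // watch the LERP value change as you adjust
cl_interp 0.1     // default: note the gap between the rocket model and the impact
cl_interp 0.02    // low: launch and contact should line up with where you aimed
```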