[Image: There are blue ones too!]
My first guess was that it's just a simple high-pass-filter overlay on the video signal. Things that don't move (like the grass, yard markers, and Andy Reid) get the line projected on top of them, while moving objects like players and those damn zip-lining cameras appear in front of it. Unfortunately, that doesn't answer the question of how they're able to project the line from nearly any vantage point (including moving shots), and why it very rarely glitches and appears over players, who after all are stationary quite a bit too. So some combination of a dynamic 3D model of the field and a video overlay? I don't know nearly enough about video editing to know the answer.
The official name for the Thing What Makes The Lines Appear On My Teevee Football is the "1st and Ten Graphics System," and it's quite a beast. A great deal of prep work, modeling, and image processing goes on both before and during the game just to ensure you don't have to squint at the sidelines and try to guess where the refs with the markers are standing.
The first thing you need to be able to do to make this work is know exactly where each point on the field falls relative to every other point. This is accomplished by 3D-mapping every single field used in the NFL, since they're all a little different, and updating the models every year or so. The models' spatial resolution doesn't have to be all that great, since the line projection is usually ~1ft. wide.
So now we've stored that 3D model in a reasonably fast computer somewhere and we're ready to film the game. Said computer is going to be receiving inputs from one or more cameras (in most games, there are three main cameras used for broadcasting). Since a camera can be moved around, refocused, etc., you also need a separate "instrumentation" signal running from each camera to the computer that tells it what the camera's current rotation, focus, tilt, and other settings are. The computer, via what I would imagine is some fairly nasty image processing, then takes all three cameras' images, along with their instrumentation-status signals, and maps them onto the pre-existing 3D model.
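To get a feel for what that camera-to-model mapping involves, here's a toy pinhole-projection sketch in Python. Everything here (the coordinate conventions, the function names, the simple pan/tilt rotations) is made up for illustration; the real calibration math is surely way nastier:

```python
import math

def project_point(pt, cam_pos, pan, tilt, focal):
    """Project a 3D field point into image coordinates for a camera
    described by its position plus pan/tilt/focal instrumentation data.
    A toy pinhole model: x runs down the field, y across it, z is up.
    All conventions here are invented for illustration."""
    dx = pt[0] - cam_pos[0]
    dy = pt[1] - cam_pos[1]
    dz = pt[2] - cam_pos[2]
    # Pan: rotate about the vertical (z) axis so the camera's view
    # axis lines up with the rotated x axis.
    x1 = math.cos(pan) * dx + math.sin(pan) * dy
    y1 = -math.sin(pan) * dx + math.cos(pan) * dy
    z1 = dz
    # Tilt: rotate about the camera's horizontal axis.
    depth = math.cos(tilt) * x1 + math.sin(tilt) * z1
    z2 = -math.sin(tilt) * x1 + math.cos(tilt) * z1
    # Pinhole projection: focal length scales offsets into pixels.
    return (focal * y1 / depth, focal * z2 / depth)
```

A camera aimed straight at a point projects that point to the center of its image; as the instrumentation reports new pan/tilt/focal values, re-running this for the line's endpoints tells you where to redraw it. That's the basic idea, minus a mountain of calibration and lens-distortion correction.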
So now that the camera view is mapped onto the 3D model, the computer can calculate any position on the field from a single reference point, allowing it to do things like draw a line from boundary to boundary with only a yard-line number as a reference. At this point all the hard work is done; all we need to do now is figure out where we want our line. So what clever image-processing routine do they use to figure that out? Something with grabbing the orange markers on the side of the field, or possibly even some more sophisticated method of measuring the actual spot of the ball?
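Once the field is a 3D model, turning a yard-line number into a drawable line really is just a lookup. A made-up sketch, where the elevation function stands in for the stored model (fields are crowned, so the line has to follow the surface rather than run dead straight):

```python
def yard_line_polyline(yard_line, elevation, samples=9):
    """Given a yard-line number (0-100) and an elevation model for the
    field surface, return sample points along that line in field
    coordinates (feet).  The function and parameter names are invented
    for illustration; elevation(x, y) -> z stands in for the 3D model."""
    x = (10 + yard_line) * 3.0  # feet from the back of the end zone
    pts = []
    for i in range(samples):
        y = 160.0 * i / (samples - 1)  # NFL fields are 160 ft wide
        pts.append((x, y, elevation(x, y)))
    return pts
```

Feed each of those sample points through the camera projection and connect the dots, and you've got a line that hugs the crown of the field instead of floating over it.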
Unfortunately it's nothing that clever. There's a dude sitting in the broadcast truck with a monitor showing one of the broadcast cameras in front of him. His job is to click on the ball before each play starts; that position becomes the line of scrimmage, and (on first down) the one ten yards ahead of it becomes the first down line. I'd assume there's also some kind of manual override for all the weird first-and-whatever situations that can come up via penalties and such. The guys who do this must be pros, because I'd never be able to resist the temptation to subtly fuck around with the marker positions if I had their job.
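The ten-yards-ahead arithmetic, with first-and-goal as one of those manual-override cases, might look something like this (purely illustrative; I have no idea how the real system actually encodes field position):

```python
def first_down_line(scrimmage_yard, direction):
    """Given the clicked line of scrimmage on a 0-100 scale and the
    offense's direction of travel (+1 or -1), return the first-down
    yard line, or None when the line would cross the goal line
    (first and goal -- no yellow line to draw).  Names invented."""
    target = scrimmage_yard + 10 * direction
    if target <= 0 or target >= 100:
        return None  # inside the ten: operator override territory
    return target
```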
The system takes those two coordinates and uses them to draw blue (scrimmage) and yellow (first down) lines across the field on all three of its broadcast images. This is a little tricky: we want to draw the lines over the field, but not over the players, ball, or other things people want to be able to see. Luckily the color of most fields is pretty monochromatic (some shade of green), so you can use a simple color filter (called chroma-keying) to make sure the grass gets drawn over but nothing else does. The two situations that can glitch this part are the weather and the Green Bay Packers, interestingly. In outdoor settings, the exact color of the grass on the field can change pretty quickly due to clouds, snow, mud, etc. On cloudy days, the system has to continuously re-reference the field color, and in more severe situations additional colors like white (snow) or brown (mud) have to be added to the color filter. As you'd imagine, the more colors the filter is allowed to draw over, the more times you'll see it "glitch" and over-draw the ball or players.
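The chroma-key decision itself is conceptually dead simple: for each pixel where the projected line would land, draw the line color only if the underlying pixel still looks like field. A toy version (the threshold numbers and function names are invented; the real system references the actual field color and updates it during the game):

```python
def looks_like_grass(px):
    """A crude green-dominance test -- thresholds are made up."""
    r, g, b = px
    return g > 90 and g > r + 20 and g > b + 20

def composite_line(frame, line_mask, line_color, is_field_color):
    """Overlay the line only where the underlying pixel passes the
    chroma-key test.  frame is rows of (r, g, b) tuples, line_mask
    marks where the projected line falls.  Broadening is_field_color
    to whites (snow) or browns (mud) is exactly what makes the line
    start glitching onto players."""
    out = []
    for row_px, row_mask in zip(frame, line_mask):
        out_row = []
        for px, on_line in zip(row_px, row_mask):
            out_row.append(line_color if on_line and is_field_color(px) else px)
        out.append(out_row)
    return out
```

A green pixel under the line gets painted yellow; a yellow-jersey pixel in the same spot gets left alone, which is why players appear to stand in front of the line.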
The other big problem with the system is the Packers, who inconsiderately picked uniform colors that are almost exactly grass-colored in parts, because they are jerks with a highly overrated defense. As a result, you'll see the lines getting colored over bits of Packers during play fairly often, particularly on bright sunny days. Since nobody stays on the first down line for long and the line of scrimmage is semi-transparent, though, it isn't noticeable unless you're looking for it.
Once play starts, the cameras are going to be moving, zooming, tilting, etc., to follow the action. This is where the "instrumentation signal" from the cameras comes in; as long as the computer can keep track of exactly what each camera is doing, it can modify its video-to-model map and keep the lines in the same place no matter what. As computers have gotten faster and the system has improved, it's even become possible to use this system with the "wire-cams" that glide around on cables, although apparently the processing for that is way more complex than the fixed-camera system.
So that's how that happens. The first generations of the system required a crew of 3-4 people to operate, but they've gotten it down to that single guy in the truck, who doesn't have to do much more than update the marker locations and tweak the color filtering on the lines when necessary. As I said, the total amount of image processing involved here is fairly nasty; NFL games are actually broadcast with an approximately 0.66-second delay (which doesn't seem like much but is basically forever in computer years) to allow for processing time. I'd imagine, with the way image processing keeps improving exponentially, they'll eventually get to the point where the system can do its own color adjustment and probably even figure out the spot of the ball, thus removing all human involvement from the system and effectively ceding control of our NFL broadcasts to Skynet, which can only end well.
Wikipedia was vague but thorough and had some good links, so I'll just credit them as usual.