It has to do with the fact that GPUs and monitors draw and count frames differently from the way TVs and movie projectors do.
TVs and projectors are no different at all from a computer monitor; I have no idea where that idea came from. The effect movies and TV use is called motion blur. GPUs show motion blur when they play movies too: it's just how the frames are saved, and it has nothing to do with what is playing them. There is no magical difference between a GPU and the video out on a DVD player; it's all about what is being played. S-Video is a nice example of this, since DVD players and some older GPUs both have it. Same goes for TVs having VGA and DVI ports.
Motion blur is the concept that the human eye sees change, or motion, rather than each frame on its own: each film frame captures the movement that happened while it was exposed instead of a frozen instant. This is also why you don't get a clear picture when you pause a movie during anything involving movement, and why 24 fps seems smooth in a movie but not in a game. They also do it to save data and bandwidth, which leads to point number two...
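To put a rough number on that, here's a quick back-of-the-envelope sketch (my own assumptions: a typical 180-degree film shutter and a near-instant game frame, with a made-up object speed just for illustration):

```python
# Back-of-the-envelope: how much an object smears inside one frame.
# Assumes a 180-degree shutter for film (exposure = half the frame time)
# and an effectively instantaneous sample for a game render.
# All numbers here are illustrative assumptions.

def blur_px(speed_px_per_s, fps, shutter_fraction):
    """Pixels of smear captured inside a single frame."""
    exposure = (1.0 / fps) * shutter_fraction
    return speed_px_per_s * exposure

speed = 1920  # object crossing a 1080p screen in one second

print(blur_px(speed, 24, 0.5))   # film at 24 fps: ~40 px of smear per frame
print(blur_px(speed, 60, 0.01))  # game at 60 fps: ~0.3 px, basically a sharp still
```

That smear is what your eye reads as motion; a game hands you a stack of razor-sharp stills instead, which is why the same frame rate looks choppier.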
"...TV broadcast be as fluid as the 120 Hz TV can display it."
How certain are you that the TV signal is 120 Hz? I can pretty much guarantee it is not. TV broadcasts also use motion blur, and I would be amazed if any broadcaster decided to use roughly four times the bandwidth for little to no gain.
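Quick sanity check on that bandwidth point, using raw uncompressed numbers (my own math; real broadcasts are compressed, so the absolute figures are way off, but the ratio is what matters):

```python
# Raw, uncompressed data rate for 1080p at 24-bit colour.
# Real broadcast streams are heavily compressed, so these absolute
# numbers don't match what goes over the air, but the 30 Hz vs 120 Hz
# ratio is the point: four times the data for the same content.

def raw_gbps(width, height, bits_per_pixel, fps):
    return width * height * bits_per_pixel * fps / 1e9

for fps in (30, 60, 120):
    print(f"{fps:>3} Hz: {raw_gbps(1920, 1080, 24, fps):.2f} Gbit/s")

#  30 Hz: 1.49 Gbit/s
#  60 Hz: 2.99 Gbit/s
# 120 Hz: 5.97 Gbit/s
```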
Also, not all TVs are 'true' 120 Hz sets; some simply show the same frame twice. As far as I know, you would need an HDMI, DVI, or DisplayPort input from a computer to get a 'true' 120 Hz signal anyway, since, again, movies use motion blur and still only play at ~30 fps, or 60 for 3D.
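Here's roughly what I mean by that, sketched with frame numbers standing in for whole images (the blending rule is just a stand-in for whatever interpolation a given TV actually does):

```python
# What a "120 Hz" TV can do with a 60 Hz input, versus a true 120 Hz source.
# Integers stand in for whole frames.

source_60hz = [0, 1, 2, 3]  # frames arriving at 60 Hz

# Pseudo 120 Hz: show each frame twice. Doubles the refresh rate,
# adds zero new information.
doubled = [f for frame in source_60hz for f in (frame, frame)]

# Motion-interpolated 120 Hz: the TV guesses an in-between frame
# (a simple average here). Smoother, but still a guess.
interpolated = []
for a, b in zip(source_60hz, source_60hz[1:]):
    interpolated += [a, (a + b) / 2]
interpolated.append(source_60hz[-1])

print(doubled)       # [0, 0, 1, 1, 2, 2, 3, 3]
print(interpolated)  # [0, 0.5, 1, 1.5, 2, 2.5, 3]
```

A true 120 Hz signal only happens when the source actually produces 120 distinct frames a second, which in practice means a PC over one of those inputs.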
I won't comment on theater projectors though, because I've never seen what they use. For some reason I doubt they use the same Blu-ray disc everyone else does.
And **** multiple monitors, I want one of those hemispherical projection screens.
I love my four screens, but hell yes!