The NTSC standard dictates 525 scan lines, but not all of them carry picture information, so you end up with about 480 visible lines. Depending on the TV, it may display the same number of lines or fewer.
The horizontal resolution depends on the TV, since that part of the signal was always analog and each device reads and displays it differently. NTSC runs at roughly 30 frames per second (29.97, strictly speaking), interlaced at 60Hz; one frame is made of two fields, so 30fps shows smoothly on a 60Hz display.
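The numbers above hang together arithmetically; here is a quick sketch of the NTSC timing math (just back-of-the-envelope figures, not a signal generator):

```python
# Rough NTSC timing arithmetic: 525 total lines, ~480 visible,
# ~29.97 fps, two fields per frame.

TOTAL_LINES = 525
VISIBLE_LINES = 480
FRAME_RATE = 30000 / 1001        # ~29.97 fps (NTSC color timing)
FIELDS_PER_FRAME = 2

field_rate = FRAME_RATE * FIELDS_PER_FRAME        # ~59.94 Hz, what the TV refreshes at
line_rate = FRAME_RATE * TOTAL_LINES              # ~15734 Hz horizontal scan rate
lines_per_field = TOTAL_LINES / FIELDS_PER_FRAME  # 262.5 -- the half line is why fields interleave

print(f"field rate:  {field_rate:.2f} Hz")
print(f"line rate:   {line_rate:.0f} Hz")
print(f"lines/field: {lines_per_field}")
```

The 262.5 lines per field is the trick that makes interlacing work: each field starts half a line offset from the other, so its lines land between the previous field's lines.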
Televisions use interlaced scanning to make up for the low frame rate of the broadcast signal, whether it's NTSC or PAL (60Hz fields to smoothly carry NTSC's ~30fps, 50Hz fields for PAL's 25fps); two fields, each with its own vsync pulse, combine to make one frame.
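The two-fields-make-a-frame idea can be sketched as a "weave": one field holds the odd scan lines, the other the even ones, and interleaving them rebuilds the full frame. The function name and toy data below are my own, not from any particular video library:

```python
# Minimal sketch of "weave" deinterlacing: combine two fields into one frame.

def weave_fields(odd_field, even_field):
    """Interleave two lists of scan lines into one full frame.

    odd_field  -> lines 1, 3, 5, ... of the frame
    even_field -> lines 2, 4, 6, ...
    """
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.append(odd_line)
        frame.append(even_line)
    return frame

# Toy example: a 4-line "frame" split into two 2-line fields.
odd = ["line1", "line3"]
even = ["line2", "line4"]
print(weave_fields(odd, even))  # ['line1', 'line2', 'line3', 'line4']
```

Weaving works cleanly when nothing moved between the two fields; with motion, the half-frame time gap between fields is what produces the comb artifacts you see on paused interlaced video.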
In modern hardware following the VGA standard, interlaced scanning is avoided as much as possible, giving you what a TV would call "progressive scan." Because PC monitors draw every line of every frame, the picture quality is higher.
The lowest refresh rate I have seen in PC graphics settings is 56Hz, though it's usually set to 60Hz. I use 85Hz because my eyes are sensitive enough that I can see the screen refreshing at lower rates (it looks like rippling light). When I use an LCD, however, I set the refresh rate to 60Hz, since a liquid crystal display does not use an electron gun to produce the image but a control matrix driving a panel of thin-film-transistor (TFT) pixels, so it doesn't flicker the way a CRT does. Driving a high refresh rate in a game--especially with vsync on--is computationally demanding, which is why you need a better computer to run games at higher refresh rates and resolutions with vsync enabled.
VGA (640x480, 16 colors) is comparable in line count to 480i on a TV, for a point of reference--though the PC draws those lines progressively rather than interlaced.
The fuzziness could come from a faulty shadow mask in the tube, or from a coarse (high) dot pitch. I have found that CRT monitors with a low dot pitch--such as my 19" Compaq P900--have a crisper image because the phosphor dots are closer together.
----------------
Now playing:
Ambient Wonder - Martin O'Donnell & Michael Salvatori
via FoxyTunes