Everything above 30 FPS is playable, since the human eye can't tell the difference between 1000 FPS and 30 FPS anyway.
The human eye (if I remember correctly) can usually see 24 frames per second.
60 fps used to be the lowest playable speed (60 interlaced fields per second, i.e. 30 full frames).
<EDIT>
There's no real difference in terms of speed between 60 fps on older games and 30 fps on newer ones.
Actually, 24 fps is roughly the limit between a slideshow and something the visual cortex actually recognizes as a moving picture.
As CP said, the eyes, being analog devices, actually send input to the brain much faster than 24 images per second; it's the visual cortex that discerns the signal either as a series of images or as a smoothly moving picture. In fact, the eyes don't send "frames" to the visual cortex at all; they continuously update the changes between what they receive and what is projected onto the visual cortex [in case you didn't know, the area where the visual nerves end in the brain is actually shaped like a projection of the field of vision, so different parts of the retina connect to corresponding physical locations on the visual cortex - now how's that for interesting stuff].

And as you might know, different parts of the retina have different abilities. The edges, with their rod cells, have a relatively fast response time as well as sensitivity to small amounts of light, which is why they are well suited to seeing fast movements and vibrations (which are pretty much the same thing as far as the eye is concerned). The center has the biggest concentration of photoreceptor cells, most of them cone cells, which give us RGB colour vision but are consequently less sensitive to minimal amounts of light and also have a somewhat longer response time.
And actually, the sharp center part of our vision is responsible for the apparent sharpness of almost the entire field of view: our eyes continuously make very fast movements to different parts of the field of vision, sending accurate updates to the visual cortex, which fools our conscious perception into thinking that all of our vision is as accurate as the center. You can try this out if you can concentrate long enough - keep staring at one point and try to move your eyes as little as possible. After a minute or so, the edges of the field of vision start becoming fuzzier and fuzzier, as the visual cortex starts to forget about the surroundings in the absence of accurate updates. In the end you might be able to see how inaccurate the edges of our vision actually are compared to the "yellow spot" in the middle of the retina.
Ehmmm, where was I.
24 fps is, as said, slightly above the limit at which the brain starts to interpret a sequence of images as a movie instead of a bunch of consecutive pictures. Of course, there are also other factors that affect the interpretation, such as how fast the movement in the movie is across the screen and so forth. For example, an object crossing the screen in one second at 24 fps jumps 1/24th of the screen width between frames, and that's definitely gonna show as a stuttering image and might even break the illusion of a moving picture.
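The 1/24th-of-the-screen arithmetic above is easy to sketch in a few lines (a hypothetical helper, assuming a 1024-pixel-wide screen; the function name and numbers are just for illustration):

```python
def per_frame_step(screen_width_px, crossing_time_s, fps):
    """Pixels an object jumps between consecutive frames while
    crossing a screen of the given width in the given time."""
    return screen_width_px / (crossing_time_s * fps)

# Compare how big the per-frame jump is at a few common frame rates
# for an object that crosses a 1024-pixel screen in one second.
for fps in (24, 30, 60):
    step = per_frame_step(1024, 1.0, fps)
    print(f"{fps} fps: {step:.1f} px per frame")
```

At 24 fps the object jumps roughly 43 pixels per frame on a 1024-pixel screen, which is large enough to read as stutter; doubling the frame rate halves the jump.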
I can pretty much always see the 60 Hz flicker on most tube TVs before they warm up properly, and even afterwards it's usually visible, just not as prominent.
And at the movies I'm pretty much constantly annoyed by the low FPS, which makes frames look jumpy whenever the camera pans or characters/stuff move across the screen, as I said.
Oh, and the demo seems to run pretty well at 1024x768 with all settings on low; I have an AMD64 3200+, 3GB of DDR SDRAM and a factory-overclocked 7600GT. I haven't tried increasing any settings yet; it might be interesting to experiment with which settings have the most impact. Also, it feels pretty playable, although the weapons feel horribly inaccurate, even in aimsight mode.