Also, remember that "minimum requirements" usually mean "the crappiest rig we tested it on, and it ran without a hitch". You can take your chances if you're below the requirements in some areas. A 64-bit OS is likely mandatory (if they're not supporting 32-bit at all, they can build the game to take full advantage of 64-bit architecture), and it's hard to argue with HD space either. Same goes for RAM (though if the game is well written, you might get away with 4GB). But I don't think it depends specifically on the CPU being a quad-core or the graphics card being a 7870 or later. It all comes down to how much slideshow/ugliness you're able to stomach. I played ArmA II (singleplayer only; multiplayer obviously wouldn't work) on a rig that should have been incapable of running it, and it was still a fun experience (though it didn't look too good and was very lag-prone). 24FPS is what most movies are played at. If you can run it at around that rate, with occasional lag spikes, then you should be able to play it.
That said, The Witcher 2 did use more resources than it really should have, so YMMV. It wasn't exactly the best-coded game I've played. Perhaps they'll have good optimization this time, but if they don't, there's a chance the actual minimum requirements to make the game playable will be higher than the official ones.
Anyway, I'll likely be looking into a new rig when it's out. I'll be moving to Windows 10 64-bit, with at least 8GB (if not 16) of RAM and a quad-core CPU.