... 8 Jaguar cores (AKA two quad-core, netbook-class CPUs), plus 8GB and a 7850. It's not exactly blowing away PCs right now, never mind all the next-gen stuff coming out in the next few weeks, let alone by the end of the year. In fact, it's middle of the pack, and almost exactly what I would tell someone with a medium budget ($550-650 plus OS) to buy, which is convenient, because it'll probably cost somewhere in that ballpark.
To be fair though, we actually expect 1080p/60fps or better with PCs, while consoles get away with 720 and/or 30 fps. It works out.
I think you're underestimating a few things. One is the insane amount of memory bandwidth the PS4 has. Two, having the GPU and CPU sitting on the same bus, reading the same memory has a rather big advantage in that you can save a lot of otherwise necessary copy operations. Three, Consoles in general do not have abstraction layers like the Windows Display Driver Model, or DirectX, meaning that it's very much possible to program these machines very close to the metal. While the specs do not seem to be that impressive when compared to the kind of i7/GTX TITAN monster rigs it is possible to build in PC land, there's a lot more potential for programmers to really exploit.
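The copy-elision point is easy to put in rough numbers. A back-of-envelope sketch in Python, assuming a practical PCIe 3.0 x16 figure of roughly 15.75 GB/s (an assumption for illustration, not a measurement of any real system):

```python
# Back-of-envelope: the CPU->GPU copy cost a unified-memory design avoids.
# All figures here are illustrative assumptions, not measured numbers.

PCIE3_X16_GBPS = 15.75      # rough practical PCIe 3.0 x16 bandwidth, GB/s

def copy_time_ms(buffer_gb, bus_gbps=PCIE3_X16_GBPS):
    """Time to shuttle a buffer across the PCIe bus, in milliseconds."""
    return buffer_gb / bus_gbps * 1000.0

# On a discrete-GPU PC, a 512 MB texture/geometry upload costs roughly:
discrete_ms = copy_time_ms(0.5)   # ~32 ms, i.e. about two whole 60 fps frames

# On the PS4's shared-memory design the GPU reads the same RAM the CPU
# wrote, so that transfer (and the mirrored allocation) simply disappears.
print(f"discrete copy: {discrete_ms:.1f} ms, unified: 0.0 ms")
```

The win isn't just the milliseconds; it's also not holding two copies of the same asset in two pools of memory.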
Given that we're currently in a phase where graphics programming has hit a point of diminishing returns (Higher levels of fidelity do not offer the same kind of jump in quality as in earlier generations), and given that the industry can't really afford another budget increase like the one we saw going from Xbox/PS2 to 360/PS3, the fact that these systems are not going to compete against the aforementioned monster rigs for raw performance is less relevant than you seem to think.
Both the SportBox One and the PS4 have enough juice to do 1080p at 30 FPS, and doing 1080p at 60 is just a matter of optimization.
I understand HSA/hUMA; I've been keeping an eye on it for a while now. I understand coding directly for hardware, too.
I also recognize mid-range parts when I see them, unlike some people apparently. The PS3 and 360 actually had, for their generation, higher-end hardware than this.
Oh, and coding for bare metal doesn't work the way it does on PC. Developers don't actually target 1080p at 60fps. If they were able to double the framerate, they would spend that extra power on other things, not framerate. They also tend to favor 720p because it's on a TV viewed from across the room, and again, they want that power for other things, be it graphics, physics, whatever.
"It can" and "they will" are two completely different things. Technically the PS3 is capable of 1080p 3D at 60fps. How many games do you see that actually do that?
The PS4 seems to have a real technical advantage in a way the PS3 didn't, maybe even over current and near-future PCs. I'm interested to see how it shakes up the status quo.
The reason the PS4 may outperform PCs is the huge amount of unified memory and memory bandwidth. PC GPUs simply don't have that much VRAM to work with. In the long run PCs will obviously pull ahead again, but it should be an interesting couple of years. Console developers always have the advantage of being able to work directly against a fixed hardware set.
It won't outperform anything over the $1200 mark, even with HSA behind it. You guys really need to read up on the parts going into these machines instead of assuming HSA is the be-all and end-all. It's good, and very likely the future, but it isn't all-powerful.
You all seem to forget that we already have a good chunk of this technology. Anyone with an APU already knows just about everything the new consoles will bring. The GPU half needs that much bandwidth, and it isn't even that massive to begin with, being only about 15% more than a 7850's actual bandwidth. Hint again: the 7850 is a mid-range card. A single 7970 puts out 264GB/s to the PS4's 176, just by having a 384-bit bus instead of a 256-bit one.
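For anyone who wants to check the arithmetic: peak GDDR5 bandwidth is just bus width (in bytes) times the effective transfer rate. A quick sketch, where the clock figures are taken from the cards' published spec sheets (treat them as assumptions):

```python
# Peak GDDR5 bandwidth = (bus width / 8 bytes) * effective transfer rate.
# Effective rates below: 7850 at 4.8 GT/s, PS4 and 7970 at 5.5 GT/s
# (assumed from the published specs).

def bandwidth_gbps(bus_bits, effective_gtps):
    """Peak memory bandwidth in GB/s for a GDDR5 memory bus."""
    return bus_bits / 8 * effective_gtps

hd7850 = bandwidth_gbps(256, 4.8)   # 153.6 GB/s
ps4    = bandwidth_gbps(256, 5.5)   # 176.0 GB/s
hd7970 = bandwidth_gbps(384, 5.5)   # 264.0 GB/s

# PS4 vs 7850: the "only ~15% more" figure above
print(f"PS4 over 7850: {(ps4 / hd7850 - 1) * 100:.0f}%")
```

Same 256-bit bus as the 7850, just faster chips; the 7970 gets its lead purely from the wider bus.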
Seriously, know where the hardware stands...
Also, yeah, we do have GPUs with that much VRAM, dedicated to the card no less, not shared with the CPU.

Oh, and you'd better hope they don't go using all 7GB the devs get on the GPU. 1080p doesn't need more than 2GB, even with higher-res textures, so at minimum they should be putting that extra RAM into properly larger levels to cut down on loading screens.
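The 2GB claim is easy to sanity-check: the render targets themselves are tiny at 1080p; it's textures and geometry that actually eat VRAM. A rough sketch with illustrative numbers:

```python
# Rough VRAM cost of 1080p render targets -- illustrative numbers only.

def buffer_mb(width, height, bytes_per_pixel):
    """Size of one uncompressed render target, in MB."""
    return width * height * bytes_per_pixel / 1024**2

# A single 32-bit 1920x1080 color buffer:
color = buffer_mb(1920, 1080, 4)        # ~7.9 MB

# Even a heavyweight deferred setup -- say four G-buffer targets, a depth
# buffer, and double-buffered output (7 targets total, assumed 32-bit
# each for simplicity) -- stays well under 100 MB:
gbuffer_total = buffer_mb(1920, 1080, 4) * 7

print(f"single target: {color:.1f} MB, fat G-buffer: {gbuffer_total:.1f} MB")
```

So the framebuffers are a rounding error; whatever part of the 2GB a game actually needs at 1080p is almost entirely texture and mesh data.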