Author Topic: Ryzen  (Read 10523 times)


Offline Luis Dias

  • 211
Very dead.

I still can't wrap my head around it. Why then go all the way down to 10 nm in the first place? Why all the talk about 7 and 5 nm? If it's all just about optimization, then why even bother?

 
The good Christian should beware of mathematicians, and all those who make empty prophecies. The danger already exists that the mathematicians have made a covenant with the devil to darken the spirit and to confine man in the bonds of Hell.

 

Offline Spoon

  • 212
  • ヾ(´︶`♡)ノ
Yeah, no idea either. Maybe smaller nm numbers are just kind of akin to "blast processing"-level marketing talk nowadays.
"Look guys, we have the smallest numbers, buy our latest generation of processors that runs on 10 nm!"
Urutorahappī!!

[02:42] <@Axem> spoon somethings wrong
[02:42] <@Axem> critically wrong
[02:42] <@Axem> im happy with these missions now
[02:44] <@Axem> well
[02:44] <@Axem> with 2 of them

 
Maybe. Intel switched from 22 nm to 14 nm in Q2 2015, more than a year and a half ago, and we haven't really seen noticeable performance increases from that switch. Or any increases at all: the 5th and 6th gen Intel i7s are weaker than the 4th gen, which is a bit weird since they're on 14 nm and the 4th gen is on 22 nm.
[19:31] <MatthTheGeek> you all high up on your mointain looking down at everyone who doesn't beam everything on insane blindfolded

 
Process nanometre numbers are essentially completely made up these days; they don't correspond to the size of any feature on the actual chip.

 

Offline Mito [PL]

  • 210
  • Proud Member of Slavicus Mechanicus
The fact is that smaller transistors need less power to operate, and more of them fit in the same amount of space. And since current technology seems to have hit a wall on (reasonable) clock speeds, the way to improve the raw processing power of a CPU is to put more processing units on it and coordinate them properly via architecture and software. Smaller transistors let you pack more of them into the same space at the same power usage as before.
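To put rough numbers on that: under classic (Dennard-style) scaling, shrinking the linear feature size by a factor s gives about s² as many transistors in the same area, each ideally drawing about 1/s² the power. A tiny Python sketch of that ideal (purely illustrative; real modern processes no longer achieve it):

```python
# Idealized Dennard scaling: shrinking linear feature size by factor s
# fits s^2 as many transistors per unit area, each drawing ~1/s^2 power.
def scaled(old_nm, new_nm):
    s = old_nm / new_nm                      # linear shrink factor
    return {
        "density_gain": s ** 2,              # transistors per unit area
        "power_per_transistor": 1 / s ** 2,  # relative to the old node
    }

result = scaled(22, 14)  # Intel's 22 nm -> 14 nm step
print(result)  # density roughly 2.5x, power per transistor roughly 0.4x
```

The breakdown of this ideal scaling (around 2006) is a big part of why clock speeds stalled in the first place.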

And I also think that this lack of performance improvements comes directly from Intel's monopoly in the high-performance, high-price part of the CPU market. Why would you modernise anything when what you've got now is still only a dream for about 90% of your customers?
Oh, and 22-core hyperthreaded Xeons for professional use...
How do you kill a hydra?

You starve it to death.

 
And I also think that this lack of performance improvements comes directly from Intel's monopoly in the high-performance, high-price part of the CPU market. Why would you modernise anything when what you've got now is still only a dream for about 90% of your customers?

You could say the same thing about the GPU market; most people can't afford a GTX 1080 or 1070. Most people couldn't afford a 980 two years ago, yet modern GPUs have advanced since then. Something that was bleeding edge three years ago should be mid-range today, and that's how it is with GPUs. The new RX 480 is $200 and offers almost the same performance as the $550 290X did three years ago.

If AMD were pressing Intel to actually improve their tech year by year, a 4790K wouldn't be "a dream". You'd be able to buy a processor with similar power for a reasonable price, and the top-end $350 i7 would walk all over it in benchmarks.

 
You're comparing apples to oranges. GPUs are substantially different from CPUs: they haven't hit physical diminishing returns nearly as hard, and they can mix up their architecture much more freely than Intel can.

 

Offline Mito [PL]

  • 210
  • Proud Member of Slavicus Mechanicus
Well, I'd say that GPUs are much further from hitting silicon's maximum frequency limits than CPUs are. I've never seen a GPU clocked at 3 GHz.

Plus, it would seem that most new game engines perform better with a mediocre CPU and a monster GPU than with a monster CPU and a mediocre GPU. Of course, there are exceptions (Minecraft?). If you have to choose for cash reasons, investing more in the GPU than the CPU seems to be simply the better option.

 
Oh, sure, there are limitations that prevent rapid advancement of performance. But if AMD were actually competitive, Intel would probably find a way to improve performance a bit more. Or if they couldn't, they'd have to start lowering prices to match AMD.

Both of those are much better options than the current Intel monopoly, where they resell the same performance at the same price three years in a row.

 
Plus, it would seem that most new game engines perform better with a mediocre CPU and a monster GPU than with a monster CPU and a mediocre GPU. Of course, there are exceptions (Minecraft?). If you have to choose for cash reasons, investing more in the GPU than the CPU seems to be simply the better option.

If you hit a GPU bottleneck in a game you can just turn the graphics down. If you hit a CPU bottleneck there's generally nothing you can do about it without compromising the gameplay.

 

Offline Gortef

  • 210
  • A meat popsicle
All I can say is that I'm honestly really hyped for Ryzen. All the bits and pieces so far seem to indicate that AMD does have something good coming.
The only problem so far seems to be people who, either accidentally or intentionally, overhype things for whatever reason. Kind of like what happened with the RX 480.

But anyway, I'm hopeful that around the end of February I'll have a full AMD rig for the first time in a decade.
Habeeb it...

 
Sadly for me, at this point with my resources, I'd be glad just to get a socket AM3 board and 8 GB of DDR3 so that I could take full advantage of my Phenom II X4. This would also let me massively upgrade my wife's rig, as she'd inherit my Phenom X3, the current board, and six gigs of DDR2-800.
There are only 10 kinds of people in the world;
those who understand binary and those who don't.

  

Offline jr2

  • The Mail Man
  • 212
  • It's prounounced jayartoo 0x6A7232
    • Steam
Can probably get that pretty cheap on eBay, especially after Ryzen comes out, no?

 

Offline Dragon

  • Citation needed
  • 212
  • The sky is the limit.
Very dead.

I still can't wrap my head around it. Why then go all the way down to 10 nm in the first place? Why all the talk about 7 and 5 nm? If it's all just about optimization, then why even bother?
As far as I can tell, we're hitting the point at which Moore's law runs head-first into physics constraints. It might be less about Intel getting complacent and more about the fact that it may not even be possible to squeeze more performance out of a single chip. Intel has simply hit the limits of silicon electronics and is just incrementing numbers for marketing purposes (along with some minimal architecture optimizations).
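Even taking the node names at face value, there isn't much runway left: silicon's lattice constant is roughly 0.5 nm, so only a few more halvings are even geometrically possible. A quick back-of-the-envelope sketch (assuming a naive halving every two years from 14 nm, which real roadmaps don't actually follow):

```python
# Naive projection: halve the feature size every ~2 years from 14 nm (2015)
# until a further halving would drop below silicon's ~0.5 nm lattice constant.
feature_nm, year = 14.0, 2015
while feature_nm / 2 >= 0.5:
    feature_nm /= 2
    year += 2
print(feature_nm, year)  # 0.875 2023 -- only four halvings of headroom
```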

Photonics seems like the only way to progress in that case. Using light instead of electrons could allow higher speeds and even smaller sizes, but the field is relatively young, so we don't know whether it could actually be a practical solution for a desktop computer.

BTW, would there be any benefit to using multiple CPUs? IIRC, there are "processor cards" on the market, but they seem to be meant for professional applications such as data analysis and scientific calculations. I'd imagine they'd have the same problems as multicore processors, only worse.

 

Offline jr2

  • The Mail Man
  • 212
  • It's prounounced jayartoo 0x6A7232
    • Steam
IIRC, multiple CPUs back when dual-CPU mobos were a thing would only get you ~30% more speed? Of course, this may have changed since the dual-PIII days.
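That ~30% figure is in the ballpark of what Amdahl's law predicts: if a fraction p of the work can be parallelized, n processors give a speedup of 1 / ((1 − p) + p/n). A minimal sketch (the 50% parallel fraction below is an illustrative assumption, not measured data):

```python
# Amdahl's law: overall speedup on n processors when a fraction p of the
# work parallelizes and the remaining (1 - p) stays serial.
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# If only ~50% of a workload parallelizes, a second CPU buys ~33%:
print(amdahl_speedup(0.5, 2))  # ~1.33x
# And even infinitely many CPUs can't beat 1 / (1 - p) = 2x here.
```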

 
Photonics? That name is so meh to me. But then, being a Trekkie, I'd want to see it called optronics, with actual ODN relays and such. That's just the kind of nerd I am.

BTW, I thought they were using copper transistors now, not silicon. Hence, IIRC, the code name of the initial generation of chips that did it being "Coppermine."

 

Offline Dragon

  • Citation needed
  • 212
  • The sky is the limit.
As long as they're using electrons to do the logic, they're up against the same limits imposed by the electron's charge and wavelength. Copper might help with heat dissipation, but not by much, I think.

And yeah, optronics sounds nice, but I'm afraid people would think it's short for optoelectronics (photodiodes and such), which is another thing entirely.

 

Offline Axem

  • 211
Copper transistors? :confused: That wouldn't work at all (unless you're reworking the laws of physics (in which case, give me your secrets!)). Silicon is a semiconductor, which means you can control whether it conducts or not. That's what turns a transistor into a switch and makes computers tick. Copper is a conductor, and a very, very good one at that, so a copper "transistor" wouldn't do much other than generate heat. (But it does carry heat away very nicely!)
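The switch analogy can be made concrete: treat each transistor as an idealized on/off switch controlled by its gate, and logic gates fall out of how the switches are wired. A toy Python model (idealized switches, not real transistor physics):

```python
# Toy model: an idealized NMOS transistor conducts only when its gate is high.
def nmos_conducts(gate):
    return bool(gate)

def nand(a, b):
    # Two NMOS switches in series pull the output low; that path conducts
    # only when BOTH gates are high, so the output is NOT (a AND b).
    pull_down_path = nmos_conducts(a) and nmos_conducts(b)
    return not pull_down_path

# NAND is functionally complete: every other gate can be built from it.
print([nand(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]])
# [True, True, True, False]
```

A copper "switch" in this model would conduct no matter what the gate does, which is exactly the point being made above.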

 
It might have been that copper was being used for the doping material. I'm a little fuzzy on the details.