Author Topic: Ryzen  (Read 10461 times)


Offline Gee1337

  • 27
  • Sh!tlord/Human Garbage
Okay, if this is the wrong board to post this in then please move it, but I thought here was as good as any and I haven't seen any other threads on this.

Anyway, to the topic...

So AMD held their New Horizon event the other day and revealed their upcoming CPU. The signs are promising, and I was wondering what everyone else's take on the event was, how you feel the CPU will impact the market, and whether you're excited, or whether you have criticisms or worries.

My take is that Intel will have some competition again, and whether you're an Intel or AMD user/fanboy doesn't matter, because we will all benefit from this CPU even if we don't use it. I'm looking at building a high-end rig early next year, so Ryzen and Vega couldn't come at a better time for me.

What does everybody reckon?
I do not feel... I think!

 

Offline jr2

  • The Mail Man
  • 212
  • It's pronounced jayartoo 0x6A7232
    • Steam
I would love to see AMD actually give Intel some competition, putting the two on more even footing, because then both companies would need to actually innovate and keep prices decent, instead of just AMD.  I have a feeling that even if AMD were to come out with processors that topped Intel's for 75% of the cost, Intel would just take a loss for a couple of quarters and slash prices to stomp AMD back down to their previously manageable level.  :ick:

 

Offline Klaustrophobia

  • 210
  • the REAL Nuke of HLP
    • North Carolina Tigers
Pretty much, yeah.  Even if AMD manages to beat Intel on the tech side, they've been written off for far too long now to really recover as a true rival.  AMD could come up with something that tripled Intel's performance and they still probably wouldn't take back any notable market share.  Intel can just sit back and watch AMD struggle, then release their answer a few months later.
I like to stare at the sun.

 
Don't be so sure. That's what everyone thought when the Athlon series was released, and then AMD became the first chip maker to crack the 1GHz barrier on top of that.

On the graphics side of things, I can't say I like AMD graphics just yet, mainly due to issues I had back when AMD's graphics card division was still the company ATi and Linux drivers were a PITA to find and get working. Still, I'm willing to give them a new shot, since I recently learned that nVidia has gone outside the true OpenGL standard so much that developers programming for nVidia's misuse of OpenGL have ended up creating issues on the competition's hardware.
There are only 10 kinds of people in the world;
those who understand binary and those who don't.

 

Offline The E

  • He's Ebeneezer Goode
  • 213
  • Nothing personal, just tech support.
    • Steam
    • Twitter
Pretty much, yeah.  Even if AMD manages to beat Intel on the tech side, they've been written off for far too long now to really recover as a true rival.  AMD could come up with something that tripled Intel's performance and they still probably wouldn't take back any notable market share.  Intel can just sit back and watch AMD struggle, then release their answer a few months later.

I wouldn't be so sure. The first tests for the 7k-series Intel CPUs are coming out, and they show absolutely no improvement over Skylake when run at the same clock speed. This is actually the best chance AMD have had in a long while to catch up; all they need is a range of chips that can perform within a couple percentage points of an i7 or i5 at a lower price point. Right now, all the enthusiast press is warning people off of Intel and recommending to wait until Ryzen drops; this is definitely the best position AMD's been in for years.

On the graphics side of things, I can't say I like AMD graphics just yet, mainly due to issues I had back when AMD's graphics card division was still the company ATi and Linux drivers were a PITA to find and get working. Still, I'm willing to give them a new shot, since I recently learned that nVidia has gone outside the true OpenGL standard so much that developers programming for nVidia's misuse of OpenGL have ended up creating issues on the competition's hardware.

As far as I am aware (and I've been using AMD Desktop GPUs in various forms, from a 7850 to an R9-285 to an R9-380 to an RX-480) AMD's Windows drivers are right now very stable and do not exhibit any surprising behaviour when it comes to OpenGL. There was a bit of wonkiness in the early days of their GCN architecture, but it seems they've worked out most of the bugs by now.
If I'm just aching this can't go on
I came from chasing dreams to feel alone
There must be changes, miss to feel strong
I really need life to touch me
--Evergrey, Where August Mourns

 

Offline AdmiralRalwood

  • 211
  • The Cthulhu programmer himself!
    • Skype
    • Steam
    • Twitter
As far as I am aware (and I've been using AMD Desktop GPUs in various forms, from a 7850 to an R9-285 to an R9-380 to an RX-480) AMD's Windows drivers are right now very stable and do not exhibit any surprising behaviour when it comes to OpenGL. There was a bit of wonkiness in the early days of their GCN architecture, but it seems they've worked out most of the bugs by now.
Well, I did run into a bit of weirdness where FSO was being told by the driver that it supported a certain OpenGL extension when it actually didn't, but that became irrelevant when we moved to OpenGL Core (and it would only have been relevant before that if we had decided to use explicit version declarations in our shaders on Windows, which we didn't).
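
For what it's worth, here's a minimal sketch (not FSO's actual code) of the extension-query side of that, assuming an OpenGL 3.x+ Core context is current and a function loader such as GLAD has already been initialised; the catch, as described above, is that a driver advertising an extension is no guarantee the implementation behind it actually behaves.

Code:
// Minimal sketch: check whether the driver advertises a given OpenGL extension.
// Assumes a Core profile context is current and a loader (e.g. GLAD) is set up.
#include <cstring>
#include <glad/glad.h>

bool has_gl_extension(const char* name)
{
    GLint count = 0;
    glGetIntegerv(GL_NUM_EXTENSIONS, &count);

    for (GLint i = 0; i < count; ++i) {
        const char* ext = reinterpret_cast<const char*>(glGetStringi(GL_EXTENSIONS, i));
        if (ext != nullptr && std::strcmp(ext, name) == 0) {
            return true;  // advertised -- which still doesn't prove it works correctly
        }
    }
    return false;
}

// Example: if (has_gl_extension("GL_ARB_debug_output")) { /* enable the debug path */ }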
Ph'nglui mglw'nafh Codethulhu GitHub wgah'nagl fhtagn.

schrödinbug (noun) - a bug that manifests itself in running software after a programmer notices that the code should never have worked in the first place.

When you gaze long into BMPMAN, BMPMAN also gazes into you.

"I am one of the best FREDders on Earth" -General Battuta

<Aesaar> literary criticism is vladimir putin

<MageKing17> "There's probably a reason the code is the way it is" is a very dangerous line of thought. :P
<MageKing17> Because the "reason" often turns out to be "nobody noticed it was wrong".
(the very next day)
<MageKing17> this ****ing code did it to me again
<MageKing17> "That doesn't really make sense to me, but I'll assume it was being done for a reason."
<MageKing17> **** ME
<MageKing17> THE REASON IS PEOPLE ARE STUPID
<MageKing17> ESPECIALLY ME

<MageKing17> God damn, I do not understand how this is breaking.
<MageKing17> Everything points to "this should work fine", and yet it's clearly not working.
<MjnMixael> 2 hours later... "God damn, how did this ever work at all?!"
(...)
<MageKing17> so
<MageKing17> more than two hours
<MageKing17> but once again we have reached the inevitable conclusion
<MageKing17> How did this code ever work in the first place!?

<@The_E> Welcome to OpenGL, where standards compliance is optional, and error reporting inconsistent

<MageKing17> It was all working perfectly until I actually tried it on an actual mission.

<IronWorks> I am useful for FSO stuff again. This is a red-letter day!
* z64555 erases "Thursday" and rewrites it in red ink

<MageKing17> TIL the entire homing code is held up by shoestrings and duct tape, basically.

 

Offline The E

  • He's Ebeneezer Goode
  • 213
  • Nothing personal, just tech support.
    • Steam
    • Twitter
Yeah, I bet that had more to do with us relying on compatibility behaviour.
If I'm just aching this can't go on
I came from chasing dreams to feel alone
There must be changes, miss to feel strong
I really need life to touch me
--Evergrey, Where August Mourns

 
I was talking about issues with developers programming for nVidia's abuses of the OpenGL standard, and issues (from 20 years ago, good Lord in Heaven I feel old) with ATi support in Linux, not issues with AMD drivers on Windows.
There are only 10 kinds of people in the world;
those who understand binary and those who don't.

 

Offline The E

  • He's Ebeneezer Goode
  • 213
  • Nothing personal, just tech support.
    • Steam
    • Twitter
Can you post a link to those issues? It's kinda surprising for nVidia to screw up OpenGL support; they're generally the ones who stick the closest to the standard.
If I'm just aching this can't go on
I came from chasing dreams to feel alone
There must be changes, miss to feel strong
I really need life to touch me
--Evergrey, Where August Mourns

 
That's not what I heard. The information I was given indicates that they liked to "extend" the standard even when the other members of the OpenGL group didn't want to extend it the way nVidia wanted. Over time, because nVidia was so heavily promoted for gaming, developers supposedly got to where they coded for these nVidia-only additions, and other brands of cards would then have issues with those games, at least during development or perhaps even until the first post-release patch, because the developers only used nVidia in-house while other manufacturers stuck with core (and perhaps agreed-upon extensions) only. I don't have the links at present, though, and during the wee hours of the morning my search-fu is weak at best.
There are only 10 kinds of people in the world;
those who understand binary and those who don't.

 

Offline The E

  • He's Ebeneezer Goode
  • 213
  • Nothing personal, just tech support.
    • Steam
    • Twitter
A lot of the difficulty we had back when Valathil was working on shadows and deferred rendering was due to AMD not actually sticking to the behaviour defined by the OpenGL standard (or rather, AMD made choices that in some cases caused their OpenGL implementation to zig where nVidia would zag, because of the fuzziness with which OpenGL is defined in places). As far as I am aware, that is still the case when it comes to OpenGL Core behaviour: while AMD has gotten much better about it, nVidia's OpenGL implementation is still considered the gold standard.

This post is recommended reading in this regard. The bottom line is this: these days it doesn't really matter whether you use AMD, nVidia or Intel GPUs, as long as your code doesn't try to use vendor-specific extensions. If you do, chances are you're going to see bad things on systems not using cards made by that vendor.

The issue you mention is more an effect of nVidia making a very strong effort to get game developers to use their proprietary software packages, which to my mind is less of a mark against nVidia and more of a mark against those studios.
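
To make that concrete, here's an illustrative sketch (using SDL2 for brevity, not necessarily how FSO sets up its context): explicitly requesting a Core profile context means code that leans on compatibility-only or vendor-lenient behaviour fails on every vendor, instead of happening to work on just one.

Code:
// Illustrative sketch: request an OpenGL 3.2 Core profile context via SDL2.
#include <SDL.h>

SDL_Window* create_core_gl_window()
{
    if (SDL_Init(SDL_INIT_VIDEO) != 0) {
        return nullptr;
    }

    // Ask for Core profile explicitly; deprecated/compatibility-only paths
    // are then rejected instead of being silently tolerated by some drivers.
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 2);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE);

    SDL_Window* win = SDL_CreateWindow("GL Core test",
                                       SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
                                       1280, 720, SDL_WINDOW_OPENGL);
    if (win != nullptr && SDL_GL_CreateContext(win) == nullptr) {
        SDL_DestroyWindow(win);   // the requested Core profile isn't available
        win = nullptr;
    }
    return win;
}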
If I'm just aching this can't go on
I came from chasing dreams to feel alone
There must be changes, miss to feel strong
I really need life to touch me
--Evergrey, Where August Mourns

 

Offline Luis Dias

  • 211
I wouldn't be so sure. The first tests for the 7k-series Intel CPUs are coming out, and they show absolutely no improvement over Skylake when run at the same clock speed. This is actually the best chance AMD have had in a long while to catch up; all they need is a range of chips that can perform within a couple percentage points of an i7 or i5 at a lower price point. Right now, all the enthusiast press is warning people off of Intel and recommending to wait until Ryzen drops; this is definitely the best position AMD's been in for years.

I don't understand. Why would Intel keep releasing chips that show zero improvement year on year? What are they doing at all?

 
They're a little more efficient and amenable to higher clocks, I think, but this generation has definitely hit diminishing returns hard.
The good Christian should beware of mathematicians, and all those who make empty prophecies. The danger already exists that the mathematicians have made a covenant with the devil to darken the spirit and to confine man in the bonds of Hell.

 

Offline The E

  • He's Ebeneezer Goode
  • 213
  • Nothing personal, just tech support.
    • Steam
    • Twitter
Yeah, it's a combination of Intel not being able to make another large architectural revision (like what they did going from Broadwell/Haswell to Skylake) and not being able to ship 10nm chips this year.
If I'm just aching this can't go on
I came from chasing dreams to feel alone
There must be changes, miss to feel strong
I really need life to touch me
--Evergrey, Where August Mourns

 
Also, the fact that AMD's CPUs haven't been competitive for a rather long time has given Intel much more room to sit on their ass and delay real performance upgrades a while longer. Intel hasn't really done anything since 2013.
My 4th gen i7-4790K has almost identical specs to a shiny new 7th gen i7-7700K. The differences are extremely minor, and most come from the integrated graphics, something almost nobody who buys an i7 will actually use. The 7700K has a 100MHz clock speed advantage at factory clocks. That's almost nothing considering the 2.5-year gap.
With the same rig they put up the same numbers in benchmarks; the 2.5-years-newer model is only slightly better in terms of power efficiency and thermals.

I know diminishing returns make it harder, but it seems like they're not even trying to push performance in their consumer models.
And no, the $1700 6950X and the $1000 6900K do not count as consumer models.


Ryzen is at a massive disadvantage, though, because Intel has been trashing AMD for so long. Anyone with an LGA 1151 or 2011 socket can upgrade to a newer Intel model, but they'd have to replace the mobo to upgrade to Ryzen. So unless AMD can provide a reasonable competitor at a lower price AND prove that they can keep releasing reasonable processors for the next 2-3 years, it'll be rather hard to convince people to build or upgrade to an AMD rig, since they might have to switch back to Intel for their next upgrade and would then have wasted money on an AMD motherboard.
« Last Edit: January 06, 2017, 10:02:48 am by FrikgFeek »
[19:31] <MatthTheGeek> you all high up on your mointain looking down at everyone who doesn't beam everything on insane blindfolded

 

Offline Spoon

  • 212
  • ヾ(´︶`♡)ノ
Also, the fact that AMD's CPUs haven't been competitive for a rather long time has given Intel much more room to sit on their ass and delay real performance upgrades a while longer. Intel hasn't really done anything since 2013.
My 4th gen i7-4790K has almost identical specs to a shiny new 7th gen i7-7700K. The differences are extremely minor, and most come from the integrated graphics, something almost nobody who buys an i7 will actually use. The 7700K has a 100MHz clock speed advantage at factory clocks. That's almost nothing considering the 2.5-year gap.
With the same rig they put up the same numbers in benchmarks; the 2.5-years-newer model is only slightly better in terms of power efficiency and thermals.

I know diminishing returns make it harder, but it seems like they're not even trying to push performance in their consumer models.
And no, the $1700 6950X and the $1000 6900K do not count as consumer models.
I looked up where my CPU, an i7-2600K, stands nowadays: https://www.cpubenchmark.net/common_cpus.html
It's #14 on the first chart. That's a second-generation CPU released six years ago (January 2011), and it's outperforming every i3 and i5 that has been released since... And your 4790K is #4 on that list; as you already said, there barely seems to be any improvement over 2.5 years there.
I suppose it's the inevitable scenario when the only direct competitor has not really been competing. It just makes upgrading not all that appealing right now.
Urutorahappī!!

[02:42] <@Axem> spoon somethings wrong
[02:42] <@Axem> critically wrong
[02:42] <@Axem> im happy with these missions now
[02:44] <@Axem> well
[02:44] <@Axem> with 2 of them

 
It's even worse when you look at the single-threaded performance, which is pretty important for gaming. The 2.5-year-old 4790K is second only to the new 7700K, and they're very close.
In terms of single-core clock speeds, processors have hardly advanced since 2004. That's 12 ****ing years since the Pentium 4 570J was released with a base clock of 3.8GHz, which is still very high-end today.
Of course modern processors have bigger caches, more efficient architectures, and faster bus speeds, and they would annihilate the 570J in single-thread benchmarks, but it's a bit concerning that clock speeds haven't moved much in 12 years.

The first AMD CPU on that list is probably not even in the top 100.

The 4790K was a high-end consumer CPU, but it wasn't some supermonster that cost an arm and a leg. The fact that an old ~300€ CPU is near the top in both the multi-thread and single-thread benchmarks of common CPUs shows that Intel really hasn't done much in the past three years.
The biggest performance increases come from the onboard HD graphics chips, which is just mind-boggling: anyone buying an i7 is either building a gaming rig and will therefore have a much more powerful dedicated GPU, or doesn't care about graphics at all and just needs a fast processor. And even those gains aren't all that huge.

Nobody buys i7s for the onboard graphics.
« Last Edit: January 06, 2017, 12:51:46 pm by FrikgFeek »
[19:31] <MatthTheGeek> you all high up on your mointain looking down at everyone who doesn't beam everything on insane blindfolded

 
 

Offline Spoon

  • 212
  • ヾ(´︶`♡)ノ
Urutorahappī!!

[02:42] <@Axem> spoon somethings wrong
[02:42] <@Axem> critically wrong
[02:42] <@Axem> im happy with these missions now
[02:44] <@Axem> well
[02:44] <@Axem> with 2 of them

 
...as long as your code doesn't try to use vendor-specific extensions. If you do, chances are you're going to see bad things on systems not using cards made by that vendor.

The issue you mention is more an effect of nVidia making a very strong effort to get game developers to use their proprietary software packages, which to my mind is less of a mark against nVidia and more of a mark against those studios.

Aha. There it is. Vendor-specific extensions and proprietary software packages on what is supposed to be an open standard. VSEs shouldn't even be a thing on an open standard, and if they are allowed, they should be warned about up front, so that developers know they'll need conditionals to identify vendor-specific hardware and only engage a vendor's VSEs on that vendor's hardware. And while one can argue that it's more a mark against the developers than against nVidia that the devs aren't careful enough about this, it's not like nVidia did anything to encourage caution in the matter. And from a market-share standpoint, why should they? If popular games have issues on competing GPUs, why, that just means more sales for nVidia when people get fed up with how "broken" ATi/AMD and Intel HD GPUs are and buy a new nVidia card so they can play any of those games "the way it's meant to be played," to use an nVidia marketing slogan.
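
For what such a conditional might look like, here's a rough sketch (hypothetical helper names; has_gl_extension() is the check sketched earlier in the thread, and the two draw functions are placeholders): gate the vendor-specific path on both the GL_VENDOR string and the specific extension, and always keep a portable fallback.

Code:
// Rough sketch: only take a vendor-specific fast path when both the vendor
// string and the extension are present; otherwise fall back to portable GL.
#include <string>
#include <glad/glad.h>

bool has_gl_extension(const char* name);  // as sketched earlier in the thread

// GL_VENDOR strings are free-form ("NVIDIA Corporation", "ATI Technologies Inc.",
// "Intel", ...), so substring matching is a heuristic, not something the spec defines.
static bool vendor_contains(const char* needle)
{
    const char* vendor = reinterpret_cast<const char*>(glGetString(GL_VENDOR));
    return vendor != nullptr && std::string(vendor).find(needle) != std::string::npos;
}

void draw_scene()
{
    if (vendor_contains("NVIDIA") && has_gl_extension("GL_NV_bindless_texture")) {
        // draw_with_nv_bindless();   // hypothetical vendor-specific path
    } else {
        // draw_with_core_gl();       // hypothetical portable path every vendor must support
    }
}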
There are only 10 kinds of people in the world;
those who understand binary and those who don't.