PDA

View Full Version : 32 CORE AMD zen CPU'S ARE COMING -



vonpietro
08-18-2016, 04:06 PM
That's right, it's AMD's new offering: a 32-core CPU called Zen.
They are also offering a new 8-core CPU to compete with Intel's current lineup.

It's looking good.
Here's a review of it all:
http://www.pcworld.com/article/3109327/hardware/let-the-cpu-wars-begin-amd-shows-its-zen-cpu-can-compete-with-intels-best.html

I've been waiting for new AMD processors, and they are finally here. This is big, big news.

Hopefully it will drop the prices of some of the Intel CPU offerings now.

After reading it I wasn't sure if the 32-core CPU was running as a dual setup (64 cores) or as a single 32-core CPU.
Dual would be ideal, of course. That's some nice computing power: a mini render farm in one computer.
=)

gamedesign1
08-18-2016, 04:58 PM
: )

rustythe1
08-18-2016, 05:24 PM
Well, I got excited until I read the second paragraph. They said it was as fast as the current i7 8-core, but Intel already has a faster 8-core Extreme, and now a 10-core Extreme as well. (They also said they had to lower the Intel clock speed to 3 GHz to make the test fair, but the Intel will happily run at 4.2 GHz on air cooling.) The 32-core CPU is a server chip, and Intel already has 72-core, 288-thread chips. Also, the AMD chips are not shipping until next year.

js33
08-18-2016, 05:54 PM
That's right, it's AMD's new offering: a 32-core CPU called Zen.
They are also offering a new 8-core CPU to compete with Intel's current lineup.

It's looking good.
Here's a review of it all:
http://www.pcworld.com/article/3109327/hardware/let-the-cpu-wars-begin-amd-shows-its-zen-cpu-can-compete-with-intels-best.html

I've been waiting for new AMD processors, and they are finally here. This is big, big news.

Hopefully it will drop the prices of some of the Intel CPU offerings now.

After reading it I wasn't sure if the 32-core CPU was running as a dual setup (64 cores) or as a single 32-core CPU.
Dual would be ideal, of course. That's some nice computing power: a mini render farm in one computer.
=)

You have to remember it's AMD, which to me has always meant subpar quality compared to Intel/nVidia. I borrowed a friend's quad-core AMD PC once to help with rendering, but it always crashed after a few frames, where my other two Intel PCs ran fine and never crashed.

vonpietro
08-18-2016, 09:30 PM
AMD has always been rock solid for me. It may have been a poor build.

js33
08-18-2016, 09:46 PM
It was a Lenovo machine, so that might be true. But I have never been impressed by anything AMD has made, processor or graphics card.

m.d.
08-18-2016, 10:31 PM
There was a time (quite a while ago) when AMD dominated Intel for a short period. ATI has always been competitive with Nvidia, and the bang for the buck was always better, but because of CUDA they lost out on a lot of pro apps (till now).

I look forward to AMD heating up the horsepower wars, because Intel has been spanking them ever since Core.

js33
08-18-2016, 10:37 PM
Yeah, it's good to have competition nipping at their heels; otherwise all Intel CPUs would start at $1000 and all nVidia GPUs would start at $800.

ActionBob
08-19-2016, 07:24 AM
Core number is impressive, but I don't have much faith in AMD centered machines.

My real daytime job has me in a position where I take care of the office equipment in a decent-sized office.

We have over 100 HPs with AMD processors in them here. I have had to call in warranty repair on more than 10; I can't remember the exact number at the moment. I could look it up, but I am not in the mood.

These were all CPU-related issues where the CPU needed to be replaced due to failure. Every single one...

That's about a 10% failure rate (within a year) on machines that get light use.

Perhaps it was just that family of chips, or a bad run, but I have contacted other offices and they had a similar experience.

Maybe their top-end chips are good, but my experience with these particular machines has not been good. It could be an HP motherboard issue causing the problem, but all of our Intel machines are running fine.

-Adrian

MichaelT
08-19-2016, 08:07 AM
I have never replaced an Intel CPU. Not ever in 30 years. Not even when I took care of school computers at a university. AMD, on the other hand... So no, I don't have much confidence in AMD when it comes to CPUs. In fact, that extends to GPUs too. I have had only one failed nVidia GPU (the very first GeForce 256, where the capacitors for the video output finally gave out after 8 years), but with AMD I have replaced three cards within two years. So when it comes to AMD making anything that stands the test of time... my confidence is pretty low. But hey, that is my experience. For you, it might be entirely different. It could be a number of reasons: maybe the electrical current where I live is affecting the components. Maybe the air, or both. I don't know. But for me it is Intel all the way. And no, I am really not exaggerating: I have literally never replaced an Intel CPU.

m.d.
08-19-2016, 09:58 AM
I've only owned about 20 computers, and only went with an AMD CPU maybe 3 times, but had no problems.
I did have 1 Intel CPU (a P4) go bad on me once, and I've lost 3 Nvidia cards and 1 AMD laptop GPU. The laptop GPU was AMD pushing the binning; it really should not have been clocked that high. When I investigated, the entire line (8750 mobile) was having 20% failure rates at the factory.

So it's a little bit subjective. I think, all things being equal, Intel and Nvidia are the way to go; however, AMD looks like they are learning their lesson.

Exclaim
08-19-2016, 10:18 AM
I like Intel and AMD. My most recent machine is an AMD, for more cores at a lower price. If you go AMD, build your own desktop. Their stock machines are nowhere near as powerful as a separate component selection.

Dan Ritchie
08-19-2016, 10:43 AM
well, got excited until I read the second paragraph

AMD hasn't announced what the final clock speeds will be. They usually keep that quiet until final release because they are still working on it. It may well be 4 GHz when released. What they were showing was that it was clock-for-clock competitive.
I've been using AMD for many, many years and never had CPU problems. There was a trend for a couple of years where OEMs tended to pack cheaper equipment into their machines with AMD processors. It's settled down a bit lately. I had some keyboards and touchpads that didn't work, but I can say the same for Intel machines: speakers not working, etc.

rustythe1
08-19-2016, 11:32 AM
Only 8 MB of L2/L3 cache, a bit lower than Intel's 25 MB (20 MB for the 8-core), and that's one of the areas that impacts things like LightWave render time.

js33
08-19-2016, 02:01 PM
I have never replaced an Intel CPU. Not ever in 30 years. Not even when I took care of school computers at a university. AMD, on the other hand... So no, I don't have much confidence in AMD when it comes to CPUs. In fact, that extends to GPUs too. I have had only one failed nVidia GPU (the very first GeForce 256, where the capacitors for the video output finally gave out after 8 years), but with AMD I have replaced three cards within two years. So when it comes to AMD making anything that stands the test of time... my confidence is pretty low. But hey, that is my experience. For you, it might be entirely different. It could be a number of reasons: maybe the electrical current where I live is affecting the components. Maybe the air, or both. I don't know. But for me it is Intel all the way. And no, I am really not exaggerating: I have literally never replaced an Intel CPU.

AMD stuff always runs hot and wears out faster, that is if it even works at all for a given purpose. I tried a friend's AMD quad core once to help with rendering, and it would render 2 frames and then crash, all 3 times I tried it. My 2 Intel PCs ran the same scenes rock solid and never crashed. So there is also something in the hardware with AMD that makes them not 100% Intel compatible.

Dan Ritchie
08-19-2016, 02:12 PM
I've had 1 AMD and 1 Nvidia card go out in the last 30 years.

js33
08-19-2016, 02:21 PM
I've had 1 AMD and 1 Nvidia card go out in the last 30 years.

I've never had an Intel proc fail on me. My wife and friends have all had AMD fail on them. I had one nVidia GPU blow once but that is mainly because I only use nVidia.

You are mainly a programmer though, right? So if you aren't rendering all the time, you aren't pushing the CPU very hard.

m.d.
08-19-2016, 05:12 PM
Everybody here is forming an opinion from extremely small sample sizes.

Microsoft released a report on overclocking stability based on over a million crashes, and although they didn't publish names, they show crashes were 1 in 400 for company A and 1 in 390 for company B (at stock speeds).
So at stock speeds they are relatively the same, within about 2.5% in reliability. However, once overclocked, one manufacturer is 20x more likely to crash than the other. I am pretty certain that is AMD (they don't say).

http://www.extremetech.com/gaming/131739-microsoft-analyzes-over-a-million-pc-failures-results-shatter-enthusiast-myths
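A quick back-of-the-envelope check on that "within 2.5%" claim, using only the 1-in-400 and 1-in-390 figures quoted above (the per-period crash-rate interpretation is an assumption):

```python
# Stock-speed crash rates quoted from the Microsoft report summary.
p_a = 1 / 400  # crash probability, company A
p_b = 1 / 390  # crash probability, company B

# Relative difference between the two vendors' crash rates.
rel_diff = (p_b - p_a) / p_a
print(f"{rel_diff:.1%}")  # about 2.6%, i.e. roughly the quoted 2.5%
```

So the two are indeed nearly identical at stock speeds; the 20x gap only appears once overclocking enters the picture.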

I think AMD pushes their clock speeds as far as they can go to try to keep up, and there is little headroom left beyond that.
I did experience this with the 8750 mobile GPU, where they were getting huge failure rates and low yields from the factory, trying to push the binning a little more than they should have.

What a lot of people don't realize is the binning process.
For example, 3 different Intel CPUs are likely the exact same die, from the same wafer (some 6- and 4-core parts are even the same die with cores disabled). Intel and AMD will then test the CPUs and, based on their performance, bin them accordingly (i3, i5, i7) and set their clock speeds and other limitations.

When the competition is ahead, there is probably a tendency to push the binning a little higher than normal. This would explain Intel's overclocking abilities.
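The binning idea described above can be sketched as a simple sorting step after per-die testing. This is a toy illustration only: the tier names echo the post, and the frequency thresholds are invented, not real Intel or AMD figures:

```python
# Toy speed-binning sketch: after fabrication, each die is tested for
# its maximum stable frequency and sorted into a product tier.
# Thresholds and tiers below are made up for illustration.
def bin_die(max_stable_ghz: float) -> str:
    if max_stable_ghz >= 4.0:
        return "top tier (i7-class)"
    elif max_stable_ghz >= 3.5:
        return "mid tier (i5-class)"
    elif max_stable_ghz >= 3.0:
        return "entry tier (i3-class)"
    # Dies that miss every bin get salvaged (cores disabled) or scrapped.
    return "reject/salvage"

print(bin_die(4.2))  # a die stable at 4.2 GHz lands in the top tier
```

Pushing the binning "higher than normal" just means lowering those thresholds relative to what the silicon comfortably sustains, which trades yield and reliability for headline clock speed.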

js33
08-19-2016, 05:51 PM
Unless I can see reliability and performance gains versus price from AMD compared to Intel, I will never buy AMD.

MichaelT
08-19-2016, 05:55 PM
Still, I don't have actual failure-rate figures, but I think it is below 1% for both, per year that is. So it could definitely be a case of AMD running their parts closer to the edge. It is a risk like any other. But it is a highly competitive market (which is good for us, I suppose :) ), and now even ARM is becoming more relevant with each year that passes. Their CPUs are already at the PS3 level of performance, but at a fraction of the power needed to do it. No wonder Intel licensed the right to make ARM chips of their own. I don't think nVidia is ignoring them either.

vonpietro
08-19-2016, 08:36 PM
I know you would buy AMD to overclock, and of course when you overclock, you're running hotter (unless you've got water cooling).

js33
08-19-2016, 09:41 PM
AMD stock speeds are usually almost at the limit, so when people try to OC them the failure rate is high. Intel can OC a lot more without failure.

zapper1998
08-21-2016, 03:37 AM
They named it after the Zen shampoo... wow.

32-core Naples CPU... coolness. 32+32+32+32 = 128 cores. Wow, cool!

Have to have water cooling for those puppies...

Dan Ritchie
08-22-2016, 10:13 AM
You are mainly a programmer though, right? So if you aren't rendering all the time, you aren't pushing the CPU very hard.

Yeah, I spent 7 years as an animator. I worked on Star Trek and the Borg ride film, Mystic Knights of Tir Na Nog, Dan Dare, Max Steel, and a number of games (everyone does Barbie games sooner or later, right?), and I did a shot on one of those Spider-Man movies. So yeah, animator too. These days, I just want to render pretty landscapes.

BTW, I think it's 8 MB per core, not total.

Dan Ritchie
08-25-2016, 09:43 AM
I was wrong; it's 8 MB per execution unit, i.e. every 4 cores get 8 MB.
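Taking that correction at face value, the totals for the 32-core part work out as follows. This is just arithmetic on the per-unit figure from the post; the derived total is not a confirmed spec:

```python
# Cache arithmetic based on the figure above: 8 MB of cache per
# 4-core execution unit. The 32-core total is derived, not confirmed.
cores = 32
cores_per_unit = 4
cache_per_unit_mb = 8

units = cores // cores_per_unit            # 8 four-core units
total_cache_mb = units * cache_per_unit_mb  # 64 MB across the chip
print(units, total_cache_mb)               # 8 64
```

That would put the 32-core chip's aggregate cache well above the 8 MB figure quoted earlier in the thread, which only described a single unit.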

zapper1998
08-31-2016, 04:38 AM
Now I want to build a new workstation...

zapper1998
09-04-2016, 03:46 PM
And the new CPUs are Win 10 only. Dang it.

jwiede
09-04-2016, 07:39 PM
You are mainly a programmer though, right? So if you aren't rendering all the time, you aren't pushing the CPU very hard.

There are plenty of digital design (hw & sw) analysis and simulation workloads that can fully occupy as many cores and cycles as a CPU can offer, generating at least as much persistent load as most 3D rendering tasks. There's nothing special about the compute workloads from 3D rendering in that regard.

js33
09-04-2016, 08:58 PM
There are plenty of digital design (hw & sw) analysis and simulation workloads that can fully occupy as many cores and cycles as a CPU can offer, generating at least as much persistent load as most 3D rendering tasks. There's nothing special about the compute workloads from 3D rendering in that regard.

Except that your programming analysis is probably never running for 24-72 hours or more continuously like a 3D scene can. Now, if you were doing high-end weather or scientific modeling, I could see those running for a while, although the places where that is done usually use supercomputers.

vonpietro
09-05-2016, 12:09 AM
I have not seen anything about the 32-core Zen yet, except some misgivings about its size and heat generation. Fitting 32 cores on a die seems to make for a very large die, and there was some question as to how much heat that would generate.
I have not seen any pricing yet either.
Or reviews.

MichaelT
09-05-2016, 01:02 AM
I would be very happy if they release it, with OK performance, and don't charge a gazillion for it. Then it would force Intel to stop procrastinating about delivering more cores at a reasonable price. It wouldn't help me now, since I just got an upgrade, but it would for people in general.

zapper1998
09-05-2016, 09:42 AM
It will be cheaper than an Intel by far...

OFF
09-05-2016, 03:40 PM
Very strange test result from Zen:


http://browser.primatelabs.com/v4/cpu/105227

Dan Ritchie
09-05-2016, 03:56 PM
Not sure what to read into that. The test was at 1.4 GHz, and we don't have a lot to go on for a product that's 6 months away.

jwiede
09-06-2016, 01:11 PM
Except that your programming analysis is probably never running for 24-72 hours or more continuously like a 3D scene can.

Actually, they do. Even a moderately complex FPGA simulation run can easily soak up a single 4- or even 8-core PC for such extended periods, depending on the duration of sim time that needs to be modeled. We have _huge_ high-performance clusters at work wholly dedicated to sim and analysis runs for chip designs, explicitly so we can get the turn-around times on individual jobs down to (barely) less than a day. It's the exact same motivation that leads studios to maintain in-house render farms: to have enough core-hours available to reduce job run times to periods which allow viable work scheduling.

Gate- and cycle-accurate chip and system simulations are actually quite equivalent, computationally, to the scales and types of computation required by 3D rendering, especially when simulating human-measurable time periods (such as "boot to UI"), where a single simulated second yields hundreds of millions to billions of cycles that need to be simulated for each entity, across a chip's or system's worth of different entities. If you think a bit about the similarities in physical scale of what each is simulating (especially as relative orders of magnitude), and the relative complexity of each simulation, that really shouldn't be so surprising.

js33
09-06-2016, 01:22 PM
Do you work at Intel or AMD? Sure, there are some things that will equate to 3D rendering in other areas of computing. I was thinking of an average programmer, not a giant industrial chip-making operation.

jwiede
09-06-2016, 02:01 PM
Oh, and in case you think only hw devs encounter such scales of work, take a look at this site (www.spinroot.com), and in particular at the number of different states that commonly arise in even small programs. SPIN analysis and other formal verification tools are now becoming common toolchain elements in most moderate- to large-scale software development efforts, and "suffer" from a similarly ridiculous scale of elements, computational complexity, and scope as seen in HW gate-/signal-accurate simulations.

As I said before, the notion that loads similar to 3D rendering are uncommon in development is simply no longer the case.
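The reason even small programs produce huge state counts is combinatorial: the global state space of concurrent processes grows as the product of their local state counts. A tiny sketch of that growth (k and n are illustrative, not figures from SPIN's documentation):

```python
# State-space explosion, the core problem SPIN-style model checkers
# face: n concurrent processes with k reachable local states each can,
# in the worst case, combine into k**n global states.
k = 10  # local states per process (illustrative)
for n in (2, 4, 8):
    print(n, k ** n)  # 2 -> 100, 4 -> 10000, 8 -> 100000000
```

Exploring (or even just bounding) spaces like that is what soaks up cores for extended periods, much like a long render.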

js33
09-06-2016, 10:56 PM
I was comparing the average 3D user to the average computer user initially and you turned it into the average 3D user vs. Intel and the NSA. :D

jwiede
09-07-2016, 02:13 PM
I was comparing the average 3D user to the average computer user initially and you turned it into the average 3D user vs. Intel and the NSA. :D

I don't work for Intel (and MS doesn't do nearly as much hw, yet still runs such sims constantly enough to merit expensive high-performance clusters, etc.), and formal verification analysis in sw is vastly more common than just NSA-type organizations. If you choose to believe otherwise, so be it. (shrug) BTW, the fact that maker-type web shops have begun selling VLSI-scale FPGAs hints at ubiquity.

js33
09-07-2016, 03:12 PM
I'm not disagreeing with you about software developers, but you are talking about high-end development, not someone in their basement cranking out iPhone apps and plugins.