30 Comments
2ManyOptions - Monday, August 6, 2007 - link
... for most of the benchmarks the Intel chips performed better than the Opterons, so I don't know why Intel should be scared of these; they can safely wait for Barcelona. I didn't really understand why you put it as AMD still being in the game in the 4S space with these.

baby5121926 - Monday, August 6, 2007 - link
Intel got scared because they don't want to see the real result from AMD + ATI. The longer Intel lets AMD live, the more dangerous it gets for Intel.
That's why you can see Intel attacking AMD really, really hard right now... just to kick AMD out of the game.
Justin Case - Monday, August 6, 2007 - link
What are the units in the WinRAR results table?

coldpower27 - Monday, August 6, 2007 - link
Check Intel's own pricing list and you will see that Intel has already pre-empted some of these cuts, with the Xeon X5355 at $744 and the Xeon E5345 at $455; the "official" Xeon X5365 should be out soon if not already... http://www.intel.com/intel/finance/pricelist/proce...
TheOtherRizzo - Monday, August 6, 2007 - link
I know nothing about 4S servers, but what's the essence of this article? Surely not that NetBurst is crap? We've known that for years. Is the real story here that Intel doesn't really give a s*** about 4S, since otherwise they would have moved to the Core 2 architecture long ago? Just guessing.

coldpower27 - Monday, August 6, 2007 - link
The Xeon 7300 series, based on the Tigerton core (a 4-socket-capable Kentsfield/Clovertown derivative), is arriving in September this year, so Intel does care about becoming more competitive in the 4S space; it is just taking some time. They decided to concentrate on the high-volume 2S sector first. Since Intel has massive capacity, going for the high-volume sector first makes sense.
mino - Monday, August 13, 2007 - link
Yes and no. Actually, having two Intel quads running on a single FSB was a serious technical problem, so they had to wait for a 4-FSB chipset to be able to get them out the door. Not to mention the qualification times, which are a bit longer for 4S platforms than for 2S.
AMD does not have these obstacles, as the 8xxx series are essentially the 2xxx series from a stability/reliability POV.
Calin - Monday, August 6, 2007 - link
The 5160 processor is a Core 2 unit, not a NetBurst one. Also, the 5345 is a quad core based on Core 2.

jay401 - Monday, August 6, 2007 - link
People built 3.0GHz - 3.33GHz E4300 & E4400 systems six months ago that cost roughly $135 for the CPU. Others went for an E6300 or, more recently, an E6320, both again under $200. They were all relatively easy overclocks.
Why does anyone with any skill in building their own computer care about an $800+ CPU again?
Calin - Monday, August 6, 2007 - link
Why don't Ford Mustangs use a small engine, overclocked to hell? Like a turbocharged 2.0L inline four revving high, instead of their huge 4+ liter engines? Why do trucks use those big engines, when they could get the same power from a smaller, turbocharged gasoline engine?
People pay $800+ for processors that work in multiprocessor systems (your run-of-the-mill Athlon64 or E4300 won't run in one). Also, they use error-checking (and usually error-correcting) memory in their systems - again, the Athlon64 doesn't do this. They also use registered DDR in order to access more memory banks - your Athlon64 again falls short. On the E4300 side, the chipset is responsible for those things, so you could use such a processor in a server chassis - if the socket fits.
piroroadkill - Tuesday, August 7, 2007 - link
it is a car analogy

Gul Westfale - Monday, August 6, 2007 - link
good analogy there, except that mustangs (and various other cars) use pickup truck engines for cost reasons. large trucks use larger engines (often diesels) because they offer considerably more torque at much lower RPM than a smaller gasoline engine, and thus provide more pulling power.

Gul Westfale - Monday, August 6, 2007 - link
these are not regular consumer cpus, but intended for use in commercial servers and workstations. they and their motherboards cost more because they support features such as multiple sockets (so in addition to having multiple cores on one chip you can also have multiple chips on one motherboard).

yyrkoon - Monday, August 6, 2007 - link
quote: "Intel has a clear lead in the rendering market. If you are rendering complex high resolution images, the quad core Xeon is clearly the best choice."

They win 1 of 2 tests, and it is clear they are the winner? Why? Because they won the software rendering also? Anyone interested enough in rendering, and HAVING to have this sort of hardware for it, is NOT going to bother with software...
This means your conclusion on this point is incorrect, in which case it boils down to which application the rendering machine is going to run.
Man, you guys come to the weirdest conclusions based on your own data, and I am not even the first to notice/mention this sort of thing...
JohanAnandtech - Monday, August 6, 2007 - link
The quad core wins all high resolution rendering tests. Where do you see the DC Opterons win against the Intel quad core in high resolution rendering? Show me a rendering engine where a 3 GHz K8 DC core is faster in high resolution rendering than a 2.33 GHz quad core. All decent rendering engines used in the real world will show more or less the same picture.
In fact, the "rendering performance" situation will get worse for the K8 as SSE-2 tuning becomes more common. All Intel CPUs since Core and all AMD CPUs since Barcelona will show (or are already showing) a high performance boost from better SSE-2 code.
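(Editor's aside: for readers wondering what "SSE-2 tuning" looks like in practice, below is a minimal, hypothetical C sketch - not taken from any actual rendering engine - showing the basic idea: processing two double-precision values per instruction with SSE2 intrinsics instead of one at a time.)

    #include <emmintrin.h>  /* SSE2 intrinsics */
    #include <stdio.h>

    /* Scale an array of doubles by a constant, two elements per SSE2 instruction. */
    static void scale_sse2(double *data, int n, double factor)
    {
        __m128d f = _mm_set1_pd(factor);        /* put the factor in both 64-bit lanes */
        int i;
        for (i = 0; i + 1 < n; i += 2) {
            __m128d v = _mm_loadu_pd(&data[i]); /* load two doubles */
            v = _mm_mul_pd(v, f);               /* two multiplies in one instruction */
            _mm_storeu_pd(&data[i], v);         /* store both results */
        }
        for (; i < n; i++)                      /* scalar tail when n is odd */
            data[i] *= factor;
    }

    int main(void)
    {
        double pixels[5] = { 1.0, 2.0, 3.0, 4.0, 5.0 };
        int i;
        scale_sse2(pixels, 5, 0.5);
        for (i = 0; i < 5; i++)
            printf("%g ", pixels[i]);  /* prints: 0.5 1 1.5 2 2.5 */
        printf("\n");
        return 0;
    }

(Compile with something like gcc -O2 -msse2. Compilers can sometimes auto-vectorize the scalar loop on their own; hand-tuned inner loops like this are what "SSE-2 tuning" of an engine refers to.)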
yyrkoon - Monday, August 6, 2007 - link
OK, I see now with the graphs that 'lower is better' on 3ds Max; I missed that with the tables, which is actually what I meant this morning by 'table obfuscation'. I personally do not mind tables, but when the data is not in a uniform spot, it confuses / makes it harder to read at a glance.
Anyhow, I was tired when I posted this morning, and cranky, and I think I was overly harsh. However, it *is* much easier for me personally to read the graphs at a glance (I cannot speak for everyone, though).
yyrkoon - Monday, August 6, 2007 - link
Oh, and while on the subject, you guys here at AnandTech have lately mastered the art of graph obfuscation. Is it really THAT hard to leave items in the same rows/columns for different tests? Are we trying to confuse the results, or is there some other reason this happens that has gone completely over my head?

JohanAnandtech - Monday, August 6, 2007 - link
The only reason is that until very recently I hadn't mastered the graphing engine. I got some weird error messages and gave up. But I have found the error, and you should see some nice graphs which don't obfuscate...

Spoelie - Monday, August 6, 2007 - link
The GIF on page 2 is non-looping, so after a very quick jump from 1GHz -> 2.8GHz (why??) -> 3.2GHz, it stays put on the 3.2GHz image. While reading the article, by the time the reader sees the image it has already been sitting on the last frame for 5 minutes, making it for all intents and purposes a static image instead of an animated one. :)
JohanAnandtech - Monday, August 6, 2007 - link
Thanks, fixed that. The reason to show 2.8 GHz is that, for example, SPECjbb and other applications sometimes don't completely stress the CPU, and the CPU then dynamically clocks back to 2.8 GHz. Those are simply the 3 states I saw the most, and found the most interesting to show.
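(Editor's aside: if you want to see which frequency states your own chip exposes, the small sketch below reads them from the Linux cpufreq sysfs interface. This assumes a Linux box with a cpufreq driver such as powernow-k8 loaded; the sysfs paths are the standard ones, but availability depends on the driver.)

    #include <stdio.h>

    /* Print one cpufreq sysfs value (reported in kHz). */
    static void dump(const char *path, const char *label)
    {
        char buf[256];
        FILE *f = fopen(path, "r");
        if (!f) {
            printf("%s: unavailable (no cpufreq driver loaded?)\n", label);
            return;
        }
        if (fgets(buf, sizeof buf, f))
            printf("%s: %s", label, buf);
        fclose(f);
    }

    int main(void)
    {
        dump("/sys/devices/system/cpu/cpu0/cpufreq/scaling_available_frequencies",
             "available states (kHz)");
        dump("/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq",
             "current frequency (kHz)");
        return 0;
    }

(Run it while a benchmark is loading the machine and you can watch the current frequency move between the intermediate states, not just idle and full tilt.)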
Spoelie - Monday, August 6, 2007 - link

Thanks for the clarification; I was under the impression the only real states were idle (1GHz) and full tilt (3.2GHz). I've never seen any other states, but all I ever use are the desktop chips - I wasn't aware CnQ could be more dynamic than that.

yuchai - Monday, August 6, 2007 - link
I believe all A64 chips, including the desktop ones, have multiple power states. For example, my X2 4200+ has 4 states: 1.0, 1.8, 2.0 and 2.2 GHz.

ButterFlyEffect78 - Monday, August 6, 2007 - link
Are they talking about Barcelona? If not, then this is old news.
I'm sure everyone by now knows that Intel's new CPUs are better than the current AMD Opterons.
KingofFah - Monday, August 6, 2007 - link
It really isn't. They were demonstrating the new 3.2GHz Opteron. Also, this was a dual socket setup, and Anand said - and everyone who monitors the server world knows - that the Opterons come out ahead overall in the 4S environment.
The more sockets, the bigger the performance advantage Opterons have over Intel in the server space. This is well known. The purpose of this was to show it in the dual socket environment.
duploxxx - Monday, August 6, 2007 - link
Confused? No, it is the stupidity of people like you who think that all Intel offerings are better than the ones from AMD.
@anand: your conclusion about the database world, that the quad core still rules..... where are the benchmarks?
Now, it is nice to see all these benches next to each other, but when are you going to combine benches? Servers are no longer used for one application; these days they run more combined workloads with multiple apps. Maybe it's time you also had a look at VMware ESX etc.... it will probably give you a different view of AMD's offerings these days.
clairvoyant129 - Monday, August 6, 2007 - link
You don't have to get hostile, because he does have a point. In the desktop market, Intel is clearly better unless we're talking about the low end. In the server market it's still a toss-up, but Intel still has a lead.

yyrkoon - Monday, August 6, 2007 - link
Um, you guys obviously have not been paying much attention, have you?
1) AMD CPUs = cheaper.
2) AMD CPUs of comparable speed perform nearly as well, if not as well or better, than their Intel counterparts. I.e.: I think you'd better check the last benchmarks AnandTech posted, 'homie', because I saw a lot of AMD on top in the game benches (6000+ vs E6600).
3) Yes, a C2D *may* overclock better, and if it is your intention to overclock, it makes perfect sense to buy one; just be prepared to pay more for the CPU.
4) Until recently, and possibly still into the near future, the AMD system boards available often offered more features for less cost. It does seem, however, that with the P35 chipset vendors are starting to come around.
5) Last, but not least, THIS article IS NOT about desktop hardware now, IS IT?! Why bring some stupid lame-ass comment into a place where it does not even fit? God, and I thought I needed a new life...
Final Hamlet - Monday, August 6, 2007 - link
It is these "but"s that make the difference. If they exist, you can't say "all Intel CPUs" anymore, because there are exceptions.
ButterFlyEffect78 - Monday, August 6, 2007 - link
I'm sorry, everybody. English is my 2nd language, so I can't always express what I want to say.
What I meant to say is that Intel's new line of CPUs based on Core 2 Duo tech is better (more advanced) than those based on K8 technology. If this were not true, there would be no reason to introduce the K10 later this year to counterattack the Core 2 Duo/Quad.
But again, I could be wrong.
Calin - Monday, August 6, 2007 - link
Core 2 Duo technology from Intel is better overall than the K8 technology from AMD - this includes the basic architecture, current improvements on the initial architecture (K8 is older and has more of those small improvements), and process/production technology.
However, Intel lagged in the introduction of Core 2-based server processors, and even now their FB-DIMM technology is slower and hotter (more power hungry) than AMD's Opteron/DDR. Until this changes, AMD still has a market in servers, albeit not as good a one as before the Core 2 Xeon processors.