Might upgrade CPU later this year


This topic is locked from further discussion.

#1 Posted by Chris_53 (5105 posts) -

Ok so later on in the year, when the "next gen" type games start appearing, I may decide to upgrade my CPU/motherboard. I may also finally switch from AMD to Intel since, it has to be said, their CPUs are significantly faster compared to their AMD counterparts.

I currently have an AMD Phenom X4 955 which I have overclocked to 3.7GHz (not 3.8 like in my sig; I did have it at that, but it wasn't too stable and I don't think I've changed my sig yet).

For current games this CPU seems to cope fine; it can just about handle Crysis 3 at near-max settings, where I normally get around 35fps.

However, I don't feel confident that my CPU will fare too well with the new games that come out later in the year.

My question is, would upgrading from my current CPU to an Intel Core i5 3570k be worthwhile? 

PS: I have a GTX570 GPU :)

Thanks in advance!

#2 Posted by soolkiki (1737 posts) -

I think that you should! I'm not SUPER familiar with AMD, so I don't know how old your CPU is, but I just got my i5 3570k and it's been absolutely wonderful! It'll be good for upgrades for a few years down the road yet! Not to mention you can get one for about $230 on Newegg. I think it would be a very worthwhile upgrade. This is what I have:

 http://www.newegg.com/Product/Product.aspx?Item=N82E16813157293


http://www.newegg.com/Product/Product.aspx?Item=N82E16819116504 

Overclockable and wonderful! :) The mobo may be different for you, however, depending on your setup.

#3 Posted by ionusX (25715 posts) -

Ok so later on in the year, when the "next gen" type games start appearing, I may decide to upgrade my CPU/motherboard. I may also finally switch from AMD to Intel since, it has to be said, their CPUs are significantly faster compared to their AMD counterparts.

I currently have an AMD Phenom X4 955 which I have overclocked to 3.7GHz (not 3.8 like in my sig; I did have it at that, but it wasn't too stable and I don't think I've changed my sig yet).

For current games this CPU seems to cope fine; it can just about handle Crysis 3 at near-max settings, where I normally get around 35fps.

However, I don't feel confident that my CPU will fare too well with the new games that come out later in the year.

My question is, would upgrading from my current CPU to an Intel Core i5 3570k be worthwhile? 

PS: I have a GTX570 GPU :)

Thanks in advance!

Chris_53

Wait to see what Richland has in store for Piledriver 2.0. If AMD has an improvement here over Vishera, that may be a potential route if you want to stay AMD. If you want to go Intel, the 3570k is looking awfully tempting, as Haswell seems to be a flop performance-wise.

#4 Posted by 5SI-GonePostal (355 posts) -

as haswell seems to be a flop performance-wise.

ionusX

Have you got any more info on this? All I have read has suggested you are getting the normal ~10% increase, as with most CPUs on a similar socket but a new generation, along with much better power consumption and integrated graphics that rival a 650...

#5 Posted by ionusX (25715 posts) -

[QUOTE="ionusX"]

as haswell seems to be a flop performance-wise.

5SI-GonePostal

Have you got any more info on this? All I have read has suggested you are getting the normal ~10% increase, as with most CPUs on a similar socket but a new generation, along with much better power consumption and integrated graphics that rival a 650...

well, 10% at this level isn't exactly a lot. 10% in this bracket of performance is a grain of salt on a beach, and isn't worth any kind of investment for anyone with a higher-end or even mid-tier CPU from the past two gens. If Intel wants to attract new customers and, ya know, crush AMD like I know many want (including them), Intel needs to offer more than a mild improvement.
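For scale, that 10% compounds only slowly even if you stack generations; here's a rough back-of-envelope sketch (the ~10% per-gen figure is the one being discussed in this thread, not an official Intel number):

```python
# Compound a fixed per-generation uplift to see the cumulative speedup.
def cumulative_uplift(per_gen: float, generations: int) -> float:
    """Total speedup factor after `generations` bumps of `per_gen` each."""
    return (1.0 + per_gen) ** generations

# Even three straight 10% generations only compound to ~33% overall,
# which is why a one-gen jump from a recent high-end chip feels minor.
total = cumulative_uplift(0.10, 3)
print(f"after 3 gens: {(total - 1) * 100:.0f}% faster")  # ~33%
```

In other words, someone on a recent high-end chip would need to skip two or three generations before the compound gain starts to look like a real upgrade.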

as for this HD 5100 series Intel GPU, I have serious doubts about its performance. And even if I'm wrong, AMD's offering on Richland is = HD 6790, so all this work by Intel just for AMD's APU GPU side to start taking off anyway.

#6 Posted by 04dcarraher (19171 posts) -

[QUOTE="ionusX"]

as haswell seems to be a flop performance-wise.

5SI-GonePostal

Have you got any more info on this? All I have read has suggested you are getting the normal ~10% increase, as with most CPUs on a similar socket but a new generation, along with much better power consumption and integrated graphics that rival a 650...

Power consumption isn't that much better, nor will it rival a 650m; the IGP is only supposed to be 2x faster than Ivy's IGP. Even with 2x the performance, don't expect much more than GT 640m performance.

#7 Posted by 04dcarraher (19171 posts) -

Ok so later on in the year, when the "next gen" type games start appearing, I may decide to upgrade my CPU/motherboard. I may also finally switch from AMD to Intel since, it has to be said, their CPUs are significantly faster compared to their AMD counterparts.

I currently have an AMD Phenom X4 955 which I have overclocked to 3.7GHz (not 3.8 like in my sig; I did have it at that, but it wasn't too stable and I don't think I've changed my sig yet).

For current games this CPU seems to cope fine; it can just about handle Crysis 3 at near-max settings, where I normally get around 35fps.

However, I don't feel confident that my CPU will fare too well with the new games that come out later in the year.

My question is, would upgrading from my current CPU to an Intel Core i5 3570k be worthwhile? 

PS: I have a GTX570 GPU :)

Thanks in advance!

Chris_53
I would hold off until Intel and AMD release their next lines of CPUs, and also wait to see what the next generation of consoles brings CPU-wise. I think when Crysis 3 is officially released its CPU performance will equalize, like in Crysis 2 where any modern quad core performs similarly. A beta is just that: a beta.

#8 Posted by 5SI-GonePostal (355 posts) -

So what you both said is exactly what I thought - Haswell is to Ivy what Ivy was to Sandy. So if you have either a Sandy or an Ivy it's not worth upgrading, unless you are going from low end to high end, and even then needing to buy a new mobo is a pain. Which also means that if the price of the 4570 is similar to the 3570 (as it was with the 2500 for the first 6 months), and a Z87 mobo to a Z77, and you are wanting to upgrade, then Haswell is the way to go.

#9 Posted by Chris_53 (5105 posts) -
[QUOTE="Chris_53"]

Ok so later on in the year, when the "next gen" type games start appearing, I may decide to upgrade my CPU/motherboard. I may also finally switch from AMD to Intel since, it has to be said, their CPUs are significantly faster compared to their AMD counterparts.

I currently have an AMD Phenom X4 955 which I have overclocked to 3.7GHz (not 3.8 like in my sig; I did have it at that, but it wasn't too stable and I don't think I've changed my sig yet).

For current games this CPU seems to cope fine; it can just about handle Crysis 3 at near-max settings, where I normally get around 35fps.

However, I don't feel confident that my CPU will fare too well with the new games that come out later in the year.

My question is, would upgrading from my current CPU to an Intel Core i5 3570k be worthwhile? 

PS: I have a GTX570 GPU :)

Thanks in advance!

04dcarraher
I would hold off until Intel and AMD release their next lines of CPUs, and also wait to see what the next generation of consoles brings CPU-wise. I think when Crysis 3 is officially released its CPU performance will equalize, like in Crysis 2 where any modern quad core performs similarly. A beta is just that: a beta.

Ok cool, thanks for the advice, I'll wait and see what the future has in store. As I said before, this is an upgrade I'll make later in the year.

#10 Posted by hartsickdiscipl (14787 posts) -

Ok so later on in the year, when the "next gen" type games start appearing, I may decide to upgrade my CPU/motherboard. I may also finally switch from AMD to Intel since, it has to be said, their CPUs are significantly faster compared to their AMD counterparts.

I currently have an AMD Phenom X4 955 which I have overclocked to 3.7GHz (not 3.8 like in my sig; I did have it at that, but it wasn't too stable and I don't think I've changed my sig yet).

For current games this CPU seems to cope fine; it can just about handle Crysis 3 at near-max settings, where I normally get around 35fps.

However, I don't feel confident that my CPU will fare too well with the new games that come out later in the year.

My question is, would upgrading from my current CPU to an Intel Core i5 3570k be worthwhile? 

PS: I have a GTX570 GPU :)

Thanks in advance!

Chris_53

 

I feel like one of the most well-qualified people to answer this question, since my situation 2 months ago was nearly identical to yours. I had a 3.7GHz PII X4 955 and a Gigabyte GTX 560 Ti SOC, which trades blows with a stock GTX 570 in most games. I kept my GPU and upgraded to a 3570k, then OC'd it to 4.5GHz.

The answer is YES.  You will see a small to moderate performance gain in most games, but it will be huge in others.  The 3570k is much better-suited to handling CPU-intensive games, and will push more data to your GPU.  When you eventually upgrade your GPU, the difference will get even bigger.  I'm very happy with my decision to upgrade.  

#11 Posted by FaustArp (1038 posts) -

I'll likely upgrade to a high-end Haswell when they're out.

Should be a nice step up from my 2600k.

#12 Posted by 04dcarraher (19171 posts) -

I'll likely upgrade to a high-end Haswell when they're out.

Should be a nice step up from my 2600k.

FaustArp
Nope, only expect a 10-15% improvement over Ivy.

#13 Posted by hartsickdiscipl (14787 posts) -

I'll likely upgrade to a high-end Haswell when they're out.

Should be a nice step up from my 2600k.

FaustArp

 

It doesn't seem like it will be much of a step-up, from what I've read.  

#14 Posted by 5SI-GonePostal (355 posts) -

I'll likely upgrade to a high-end Haswell when they're out.

Should be a nice step up from my 2600k.

FaustArp

Nope - you will see very little, plus you'll need a new mobo, so it's not cost effective either.

#15 Posted by FaustArp (1038 posts) -

Where are you guys reading up on Haswell performance?

I haven't seen anything yet.

Thanks.  :)

#16 Posted by 04dcarraher (19171 posts) -

Where are you guys reading up on Haswell performance?

I haven't seen anything yet.

Thanks.  :)

FaustArp
From Intel themselves; they're focusing on the IGP portion with Haswell, giving it nearly 2x the performance of Ivy's IGP (the HD 4000).

#17 Posted by FaustArp (1038 posts) -

Well that sucks, I was kinda hoping for a CPU upgrade this year.  

#18 Posted by 04dcarraher (19171 posts) -

Well that sucks, I was kinda hoping for a CPU upgrade this year.  

FaustArp
See what happens when you buy an i7? :P
#19 Posted by FaustArp (1038 posts) -

[QUOTE="FaustArp"]

Well that sucks, I was kinda hoping for a CPU upgrade this year.  

04dcarraher

See what happens when you buy an i7? :P

Lol...

#20 Posted by Chris_53 (5105 posts) -
Thanks for the replies guys and gals. As I said before, I may do this upgrade later on in the year. Do you guys think there will be a price drop on the i5 3570K later in the year?
#21 Posted by TellDaddy (249 posts) -

Very few games are so CPU-intensive that the CPU you have now, at that clock speed, couldn't handle them at a very solid frame rate. If you upgrade the CPU you will also need a new motherboard, so all in you are probably looking at close to $400. There are some RTS games and a few other demanding titles where you will gain 20-25 FPS, but keep in mind that the gain is going from 60 FPS to 80 FPS, which in reality is a fairly meaningless bump. Going from 35 FPS to 60 FPS, on the other hand, is a huge difference - but if you have a good GPU that is not going to be the case.

IMO you would be throwing away money that you should spend in a year or two. There isn't a game out there that I know of where your CPU will hold you back from at least 45-50 FPS, which is more than acceptable. You have to decide if it's worth $400 to play games at 75 FPS instead of 50 FPS, and in most cases the gain won't even be that unless you have a top-end GPU like a 7970 or 680.
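Those FPS jumps are easier to judge as frame times (milliseconds per frame), since that's what you actually feel; a small sketch using the numbers from the post above:

```python
# Convert FPS figures into per-frame time savings, in milliseconds.
def frame_time_ms(fps: float) -> float:
    """Milliseconds spent rendering each frame at a given frame rate."""
    return 1000.0 / fps

def saving_ms(fps_before: float, fps_after: float) -> float:
    """Per-frame time saved by moving from one frame rate to another."""
    return frame_time_ms(fps_before) - frame_time_ms(fps_after)

# 60 -> 80 FPS saves ~4.2 ms/frame, but 35 -> 60 FPS saves ~11.9 ms/frame,
# which is why the second jump feels huge and the first barely registers.
print(f"60->80: {saving_ms(60, 80):.1f} ms, 35->60: {saving_ms(35, 60):.1f} ms")
```

Same "+20-25 FPS" headline, very different real-world impact depending on where you start from.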

#22 Posted by way2funny (4569 posts) -

I would wait until haswell comes out. I'd say wait until the benchmarks to see how much of an improvement haswell is.

#23 Posted by 04dcarraher (19171 posts) -

I would wait until haswell comes out. I'd say wait until the benchmarks to see how much of an improvement haswell is.

way2funny
Intel already stated: don't expect more than 15% on average over Ivy.

#24 Posted by way2funny (4569 posts) -

[QUOTE="way2funny"]

I would wait until haswell comes out. I'd say wait until the benchmarks to see how much of an improvement haswell is.

04dcarraher

Intel already stated: don't expect more than 15% on average over Ivy.

So if you're gonna get an Ivy, why not wait and get that extra 15%? It's not like he'd be waiting a long time; it's set to come out before summer.

#25 Posted by 04dcarraher (19171 posts) -

[QUOTE="04dcarraher"][QUOTE="way2funny"]

I would wait until haswell comes out. I'd say wait until the benchmarks to see how much of an improvement haswell is.

way2funny

Intel already stated: don't expect more than 15% on average over Ivy.

So if you're gonna get an Ivy, why not wait and get that extra 15%? It's not like he'd be waiting a long time; it's set to come out before summer.

however, if you have a Sandy or Ivy it's almost pointless.

#26 Posted by ionusX (25715 posts) -

[QUOTE="way2funny"]

[QUOTE="04dcarraher"] Intel already stated: don't expect more than 15% on average over Ivy.

04dcarraher

So if your gonna get an Ivy why not wait and get that extra 15%? Its not like he'd be waiting a long time, its set to come out before summer

however, if you have a Sandy or Ivy it's almost pointless.

indeed, 15% on top of a 3770k or 3570k is nothing. Basically they're shrinking the power draw, replacing the GPU, and then manually OCing the 3770k and calling it a new flagship CPU. And that's being optimistic about the whole thing.

AMD is promising a much better result, but they're behind. IMO Steamroller will be a heck of a show; we may even see AMD catch up to Ivy Bridge right before Haswell shows up on the scene. They defied the odds and beat Nehalem, and have given Sandy a run for her money. Now we will see in June whether AMD can finally match wits with the megalith that is Intel's current-gen CPUs, if in nothing more than performance.

#27 Posted by FaustArp (1038 posts) -

I myself am kinda torn on the whole Haswell business.  I want to upgrade my 2600k.  There is no point in upgrading to 3770k, the difference is minuscule.

So I'm gonna wait to see what Haswell brings to the table.  I won't upgrade if I can't get a 20% (or more) difference though, so I might just have to stay with my 2600k.  Which kinda bums me out, I want to upgrade.

We'll see I guess.

#28 Posted by blaznwiipspman1 (6028 posts) -

[QUOTE="FaustArp"]

Where are you guys reading up on Haswell performance?

I haven't seen anything yet.

Thanks.  :)

04dcarraher

From Intel themselves; they're focusing on the IGP portion with Haswell, giving it nearly 2x the performance of Ivy's IGP (the HD 4000).

 

that is almost a useless quest. The only ones going to benefit from this are laptop users - which I admit is pretty awesome, having a laptop with an Intel iGPU capable of playing a lot of the great games out there at medium settings. But for desktop users this means nothing. I will be interested in getting an Ivy laptop, but my Lenovo x120e is still plenty for my uses.

#29 Posted by 04dcarraher (19171 posts) -

[QUOTE="04dcarraher"][QUOTE="FaustArp"]

Where are you guys reading up on Haswell performance?

I haven't seen anything yet.

Thanks.  :)

blaznwiipspman1

From Intel themselves; they're focusing on the IGP portion with Haswell, giving it nearly 2x the performance of Ivy's IGP (the HD 4000).

 

that is almost a useless quest. The only ones going to benefit from this are laptop users - which I admit is pretty awesome, having a laptop with an Intel iGPU capable of playing a lot of the great games out there at medium settings. But for desktop users this means nothing. I will be interested in getting an Ivy laptop, but my Lenovo x120e is still plenty for my uses.

AMD's upcoming APUs are going to see 7750-type GPU performance, so Intel will continue to lose the IGP war.

#30 Posted by hartsickdiscipl (14787 posts) -

[QUOTE="blaznwiipspman1"]

[QUOTE="04dcarraher"] From Intel themselves; they're focusing on the IGP portion with Haswell, giving it nearly 2x the performance of Ivy's IGP (the HD 4000).

04dcarraher

 

that is almost a useless quest. The only ones going to benefit from this are laptop users - which I admit is pretty awesome, having a laptop with an Intel iGPU capable of playing a lot of the great games out there at medium settings. But for desktop users this means nothing. I will be interested in getting an Ivy laptop, but my Lenovo x120e is still plenty for my uses.

AMD's upcoming APUs are going to see 7750-type GPU performance, so Intel will continue to lose the IGP war.

 

That is DAMN impressive. People will be able to build a PC for sub-1080p gaming using just an APU and no discrete GPU. Obviously they will have to sacrifice some detail settings, but still... that's quite an accomplishment for the red team.

#31 Posted by bigfoot2045 (773 posts) -

Wait for the new consoles. The new consoles will most likely render anything you could buy right now completely obsolete. 

#32 Posted by 04dcarraher (19171 posts) -

Wait for the new consoles. The new consoles will most likely render anything you could buy right now completely obsolete. 

bigfoot2045

Na, the next Xbox is said to have a 1.6GHz 8-core AMD CPU.

#33 Posted by hartsickdiscipl (14787 posts) -

Wait for the new consoles. The new consoles will most likely render anything you could buy right now completely obsolete. 

bigfoot2045

 

Not from what we've seen about them.  

#34 Posted by FaustArp (1038 posts) -

Wait for the new consoles. The new consoles will most likely render anything you could buy right now completely obsolete. 

bigfoot2045
LMAO
#35 Posted by mitu123 (153911 posts) -

Wait for the new consoles. The new consoles will most likely render anything you could buy right now completely obsolete. 

bigfoot2045

Wrong forum.

#36 Posted by bigfoot2045 (773 posts) -

You guys do realize that you need much more powerful PC hardware to achieve similar results to the consoles? They have minimal OS's and the games are much better optimized. You don't want to end up like people who built Pentium D/Geforce 7800 systems in 2005 only to realize that wasn't going to cut it, or people who built Pentium 3/Geforce systems in 2000. I've seen this all before, and guys like you are always proven wrong. 

The best time to build a new system is usually about a year or two after the consoles launch. Then you know exactly what you'll need to run games, and by that point PC hardware will have far surpassed what the consoles are running to the point where you'll be able to achieve much better results on PC. 

Right now is a horrible time to build a new system, especially if what you have is already decent like the OP. 

#37 Posted by 04dcarraher (19171 posts) -

You guys do realize that you need much more powerful PC hardware to achieve similar results to the consoles? They have minimal OS's and the games are much better optimized. You don't want to end up like people who built Pentium D/Geforce 7800 systems in 2005 only to realize that wasn't going to cut it, or people who built Pentium 3/Geforce systems in 2000. I've seen this all before, and guys like you are always proven wrong. 

The best time to build a new system is usually about a year or two after the consoles launch. Then you know exactly what you'll need to run games, and by that point PC hardware will have far surpassed what the consoles are running to the point where you'll be able to achieve much better results on PC. 

Right now is a horrible time to build a new system, especially if what you have is already decent like the OP. 

bigfoot2045

False; the next consoles are not using fully custom hardware, they are using off-the-shelf PC hardware with minor tweaks ("semi-custom", as stated by AMD). Also, all the API coding for DirectX 11 and OpenGL comes from PC, which means that if any improvements are made on console, PC will see them too. I also like how you ignore the fact that the next Xbox is supposed to allocate 3GB just for its OS and features. Fact is, "optimization" includes downgrading aspects to hit a set target that the devs want. The believable rumored specs have the next consoles' GPU performance ranging from a 7770 to a full-blown 7850.

Sorry to burst your bubble, but no amount of "magic" will allow a 7850 to match a 7970 or GTX 680, let alone any GPU released in the near future or around the consoles' release dates. At best, with some tweaks and work, you can only expect them to gain 15-25% more performance through coding. And the CPUs are AMD-based - either an 8-core at 1.6GHz or a 4-core at 3.2GHz - which means no real gains over PC.

Now, fact is that even with a 7850-type GPU, a 7970GHz is almost 2.5x faster. You could compare it to the 360's Xenos versus the 8800, where the GPU had well over 2x the performance and over 2x the memory and bandwidth, and it creams the 360 to this day in multiplats.

#38 Posted by bigfoot2045 (773 posts) -

The 8800 also came out a year after the 360. The 360 launched at the end of 2005, while the Geforce 8800 came out at the end of 2006. You couldn't even get a PC GPU with unified shaders like the 360 when it launched. You kind of just proved my point. It's better to wait at least a year after the consoles come out before building something. 

#39 Posted by darksusperia (6898 posts) -

The 8800 also came out a year after the 360. The 360 launched at the end of 2005, while the Geforce 8800 came out at the end of 2006. You couldn't even get a PC GPU with unified shaders like the 360 when it launched. You kind of just proved my point. It's better to wait at least a year after the consoles come out before building something. 

bigfoot2045

It's already been stated that this next-gen console round will not be ahead of PCs like it was last time. They are using CURRENT off-the-shelf parts.

 

http://www.pcgamer.com/2012/12/20/next-gen-consoles-will-struggle-to-beat-pc-say-industry-insiders/

#40 Posted by bigfoot2045 (773 posts) -

We don't know that for certain. We know next to nothing about these consoles other than vague rumors. If the console makers have any aces up their sleeves like last time around with the unified shaders on the 360 GPU, or the PS3's heavily threaded Cell CPU, anything you build now would be a waste of money. 

And as I recall, there were very similar articles on tech sites last time around saying that the Xenon CPU and Cell CPU would be weaker than Pentium 4s. There was even an article like that on Anandtech of all places, which they've long since removed. Those kinds of articles look laughable in retrospect.

#41 Posted by 04dcarraher (19171 posts) -

The 8800 also came out a year after the 360. The 360 launched at the end of 2005, while the Geforce 8800 came out at the end of 2006. You couldn't even get a PC GPU with unified shaders like the 360 when it launched. You kind of just proved my point. It's better to wait at least a year after the consoles come out before building something. 

bigfoot2045
These upcoming consoles are not going to use top-tier hardware, nor are they going to use prototype tech based on upcoming standards. They will not even surpass the high-end GPUs from 2011. Even when the 360 launched, the 7800GTX was able to match or surpass the 360's abilities in multiplatform games all the way into 2007, before being phased out. The 8800 series was just the last nail in the coffin when it comes to outclassing the consoles. And the rumor that the next Nvidia GPU is supposed to be nearly 2x faster than the GTX 680 will put the consoles in the same boat as a few years ago, with the same gap between hardware abilities.

#42 Posted by darksusperia (6898 posts) -

We don't know that for certain. We know next to nothing about these consoles other than vague rumors. If the console makers have any aces up their sleeves like last time around with the unified shaders on the 360 GPU, or the PS3's heavily threaded Cell CPU, anything you build now would be a waste of money. 

bigfoot2045
Awful close to announcements for details like these to be popping up, don't you think? http://www.tweaktown.com/news/28310/leakedtt-xbox-720-gpu-specifications-reportedly-leaked-online/index.html

#43 Posted by 04dcarraher (19171 posts) -

We don't know that for certain. We know next to nothing about these consoles other than vague rumors. If the console makers have any aces up their sleeves like last time around with the unified shaders on the 360 GPU, or the PS3's heavily threaded Cell CPU, anything you build now would be a waste of money. 

And as I recall, there were very similar articles on tech sites last time around saying that the Xenon CPU and Cell CPU would be weaker than Pentium 4s. There was even an article like that on Anandtech of all places, which they've long since removed. Those kinds of articles look laughable in retrospect.

bigfoot2045

if you have any knowledge about computer hardware, you know that they will not put in a GPU that has more than a 130W TDP, nor can you ignore the statements and slides made and shown by AMD relating to the next consoles. There is no new standard on the horizon that will replace unified shaders for quite some time. The PS3's Cell was a POS for gaming (it took more time and money using the SPEs just to let the PS3 match the 360); a Pentium 4's core performance is actually faster than one of Xenon's cores or the Cell's PPE (its main CPU). And you're forgetting that in 2005 you had the Athlon X2, which was faster than either console CPU.

#44 Posted by 04dcarraher (19171 posts) -

[QUOTE="bigfoot2045"]

We don't know that for certain. We know next to nothing about these consoles other than vague rumors. If the console makers have any aces up their sleeves like last time around with the unified shaders on the 360 GPU, or the PS3's heavily threaded Cell CPU, anything you build now would be a waste of money. 

darksusperia

Awful close to announcements for details like these to be popping up, don't you think? http://www.tweaktown.com/news/28310/leakedtt-xbox-720-gpu-specifications-reportedly-leaked-online/index.html

Which makes the GPU for the next Xbox fall in line with AMD's prototype model of the 7850 with 12 CUs / 768 shader processors, just slimmed down to save power and reduce heat.

[image: GPU-Z screenshot of the HD 7830]

#45 Posted by bigfoot2045 (773 posts) -

[QUOTE="bigfoot2045"]

We don't know that for certain. We know next to nothing about these consoles other than vague rumors. If the console makers have any aces up their sleeves like last time around with the unified shaders on the 360 GPU, or the PS3's heavily threaded Cell CPU, anything you build now would be a waste of money. 

And as I recall, there were very similar articles on tech sites last time around saying that the Xenon CPU and Cell CPU would be weaker than Pentium 4s. There was even an article like that on Anandtech of all places, which they've long since removed. Those kinds of articles look laughable in retrospect.

04dcarraher

if you have any knowledge about computer hardware, you know that they will not put in a GPU that has more than a 130W TDP, nor can you ignore the statements and slides made and shown by AMD relating to the next consoles. There is no new standard on the horizon that will replace unified shaders for quite some time. The PS3's Cell was a POS for gaming (it took more time and money using the SPEs just to let the PS3 match the 360); a Pentium 4's core performance is actually faster than one of Xenon's cores or the Cell's PPE (its main CPU). And you're forgetting that in 2005 you had the Athlon X2, which was faster than either console CPU.

All I can say to this is LOL. I know this is absolute BS because the Cell CPU can handle full 1080p video playback. You can't even play back a 720p file on a Pentium 4. 

#46 Posted by 04dcarraher (19171 posts) -

[QUOTE="04dcarraher"]

[QUOTE="bigfoot2045"]

We don't know that for certain. We know next to nothing about these consoles other than vague rumors. If the console makers have any aces up their sleeves like last time around with the unified shaders on the 360 GPU, or the PS3's heavily threaded Cell CPU, anything you build now would be a waste of money. 

And as I recall, there were very similar articles on tech sites last time around saying that the Xenon CPU and Cell CPU would be weaker than Pentium 4s. There was even an article like that on Anandtech of all places, which they've long since removed. Those kinds of articles look laughable in retrospect.

bigfoot2045

if you have any knowledge about computer hardware, you know that they will not put in a GPU that has more than a 130W TDP, nor can you ignore the statements and slides made and shown by AMD relating to the next consoles. There is no new standard on the horizon that will replace unified shaders for quite some time. The PS3's Cell was a POS for gaming (it took more time and money using the SPEs just to let the PS3 match the 360); a Pentium 4's core performance is actually faster than one of Xenon's cores or the Cell's PPE (its main CPU). And you're forgetting that in 2005 you had the Athlon X2, which was faster than either console CPU.

All I can say to this is LOL. I know this is absolute BS because the Cell CPU can handle full 1080p video playback. You can't even play back a 720p file on a Pentium 4. 

:lol: are you kidding me? This post right here proves you're one lost little puppy....

#47 Posted by bigfoot2045 (773 posts) -

How do you figure? While it's hard to do direct comparisons of console and PC CPUs, I can compare them in real world usage, like playing back video files. The Cell CPU in the PS3 has no issues with Blu-ray or 1080p video files. Try doing either of those things on a Pentium 4 and you'd be watching a clip show. 

#48 Posted by 04dcarraher (19171 posts) -

How do you figure? While it's hard to do direct comparisons of console and PC CPUs, I can compare them in real world usage, like playing back video files. The Cell CPU in the PS3 has no issues with Blu-ray or 1080p video files. Try doing either of those things on a Pentium 4 and you'd be watching a clip show. 

bigfoot2045

It's not hard to compare PC CPUs vs console CPUs. Xenon can do 6 CPU clock cycles per second (2 per core) while the Cell's PPE can only do 3.2, and an Athlon X2 can do 7.3 (3.65 per core). MIPS is another way to compare: Xenon (3.2GHz) can do 19,000 while the PS3's PPE can only do 10,000, and an Athlon X2 6000 at 3.0GHz can do 22,000 MIPS. Even with the might of the Cell's SPEs, it can't match an Intel C2Q Q6600 in Folding@home, which is 2x faster. Also, I'm not sure why you keep comparing against the Pentium 4, since it was designed and released in 2000... its 2003/2004-era brothers are able to send the video data to the GPU for decoding just fine. Heck, an old laptop using a P4 at 2.6GHz and an FX 5200 can do 720p video playback. As long as you have a GPU that is able to decode Blu-ray, you're not going to have any trouble.
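For what it's worth, the crudest version of this kind of comparison is aggregate clock throughput (cores × clock), which ignores IPC entirely and so actually flatters the narrow in-order console cores; here's a sketch using publicly known core counts and clocks (the MIPS figures above are the poster's own and aren't reproduced here):

```python
# Aggregate clock throughput (cores * GHz): a crude yardstick that ignores
# IPC, so it overstates narrow in-order cores vs. wide out-of-order ones.
cpus = {
    "Xenon (Xbox 360)": {"cores": 3, "ghz": 3.2},  # 3 in-order PowerPC cores
    "Cell PPE (PS3)":   {"cores": 1, "ghz": 3.2},  # the one general-purpose core
    "Athlon X2 6000+":  {"cores": 2, "ghz": 3.0},  # 2 out-of-order x86 cores
}

for name, spec in cpus.items():
    aggregate = spec["cores"] * spec["ghz"]
    print(f"{name}: {aggregate:.1f} billion cycles/s aggregate")
```

Even on this yardstick, which is generous to the consoles, a mid-2000s desktop dual core is in the same ballpark; per-clock work done is where the PC cores pull ahead.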

#49 Posted by kraken2109 (12978 posts) -

guys


#50 Posted by blaznwiipspman1 (6028 posts) -

1080p video playback can be done on cell phones nowadays; it was possible from around 2009, maybe even earlier. I also know for SURE that a Pentium 4 CPU is actually more powerful than an ARM processor... you simply can't compare the two. An A5 processor in an iPhone is much more powerful than an Xbox CPU or a Cell processor. An A5 processor is similar in power to a 1995 Pentium processor. The only reason an A5 can run 1080p properly is because the design of ARM CPUs makes some tasks a lot easier. Try running Windows XP - hell, even Windows 98 would have an absurd amount of difficulty running on an ARM CPU. Which is why you get these garbage wannabe OS's like Android that are incredibly stripped down. So let's just say you're comparing apples to oranges.