Official ALL NextGen rumors and speculation (PS4, 720, Steambox)

This topic is locked from further discussion.


#51 MK-Professor
Member since 2009 • 4214 Posts

[QUOTE="MK-Professor"]

both CPU and GPU look outdated even by today's standards. I mean, a real HD 7970 is almost double the speed of an HD 7970M

Mr_BillGates

7970m is close to desktop 7870, which means "double the speed" is bs.

:lol::lol::lol::lol::lol::lol::lol::lol::lol::lol::lol::lol::lol::lol::lol:

a real HD 7970 has 4097 GFLOPS and the HD 7970M has 2176 GFLOPS

and an HD 7970 is around 60% faster than the HD 7870, while the HD 7970M is slower than the HD 7870, so yes, what I said is 100% correct

and also, I never said "double the speed" but "almost double the speed"
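
(A quick sanity check of those figures, using the GFLOPS numbers quoted in this post; peak FLOPS isn't the whole story, but the ratio itself is easy to verify:)

    # Ratio of the peak-FLOPS figures quoted above (quoted values, not official spec sheets)
    hd7970_gflops = 4097.0    # desktop HD 7970, as quoted above
    hd7970m_gflops = 2176.0   # HD 7970M, as quoted above
    ratio = hd7970_gflops / hd7970m_gflops
    print(f"desktop 7970 / 7970M = {ratio:.2f}x")   # ~1.88x, i.e. "almost double"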


#52 Mystery_Writer
Member since 2004 • 8351 Posts
I hope at least the HD 8770 is faster than my 2 year old HD 6990

#53 TheDidact
Member since 2012 • 3986 Posts
OMG THEY ACTUALLY LISTENED TO ME!!!! :D

#54 clyde46
Member since 2005 • 49061 Posts
I hope at least the HD 8770 is faster than my 2 year old HD 6990
Mystery_Writer
I don't think that's the case.

#55 clyde46
Member since 2005 • 49061 Posts
OMG THEY ACTUALLY LISTENED TO ME!!!! :D
TheDidact
Why would we listen to you?

#56 Rocker6
Member since 2009 • 13358 Posts

[QUOTE="Rocker6"]

[QUOTE="clyde46"] Just wait till they gang up and try to claim ownage over the PC.
clyde46

The tears will be sweet... no custom hardware for them this time, no unified shader model, nothing... :twisted:

By the time consoles come out, we'll already have new AMD and Nvidia series, making a console using something like 7970M well outdated...

GTX 780, a true successor to the 580!

Planning to go for it, huh?

Sweet, would've done the same if I still had my 580, but after the sudden 670 upgrade, will be holding out for 800-series...


#57 clyde46
Member since 2005 • 49061 Posts

[QUOTE="clyde46"][QUOTE="Rocker6"]

The tears will be sweet... no custom hardware for them this time, no unified shader model, nothing... :twisted:

By the time consoles come out, we'll already have new AMD and Nvidia series, making a console using something like 7970M well outdated...

Rocker6

GTX 780, a true successor to the 580!

Planning to go for it, huh?

Sweet, would've done the same if I still had my 580, but after the sudden 670 upgrade, will be holding out for 800-series...

When did you upgrade to a 670? I thought you were still rocking some old hardware.

#58 Rocker6
Member since 2009 • 13358 Posts

[QUOTE="Rocker6"]

[QUOTE="clyde46"] GTX 780, a true successor to the 580!
clyde46

Planning to go for it, huh?

Sweet, would've done the same if I still had my 580, but after the sudden 670 upgrade, will be holding out for 800-series...

When did you upgrade to a 670? I thought you were still rocking some old hardware.

Must've confused me with someone! :P

Been rocking the 670 since May, when my 580 died in a mad experiment gone wrong...


#59 TheDidact
Member since 2012 • 3986 Posts
[QUOTE="TheDidact"]OMG THEY ACTUALLY LISTENED TO ME!!!! :D
clyde46
Why would we listen to you?

Because I'm a pretty cool guy. I kills aliens and doesn't afraid of anything.

#60 clyde46
Member since 2005 • 49061 Posts

[QUOTE="clyde46"][QUOTE="Rocker6"]

Planning to go for it, huh?

Sweet, would've done the same if I still had my 580, but after the sudden 670 upgrade, will be holding out for 800-series...

Rocker6

When did you upgrade to a 670? I thought you were still rocking some old hardware.

Must've confused me with someone! :P

Been rocking the 670 since May, when my 580 died in a mad experiment gone wrong...

What did you do to that poor thing?!

#62 Rocker6
Member since 2009 • 13358 Posts

[QUOTE="Rocker6"]

[QUOTE="clyde46"] When did you upgrade to a 670? I thought you were still rocking some old hardware.
clyde46

Must've confused me with someone! :P

Been rocking the 670 since May, when my 580 died in a mad experiment gone wrong...

What did you do to that poor thing?!

Had a reference Zotac model, great card, but the fan was too loud and was pissing me off. Then I bought an aftermarket cooler, an AC Accelero Xtreme, but while removing the stock cooler, one of those small screws on the PCB got stripped, so I tried removing it by force (a huge mistake) and accidentally scratched the PCB, killing the card... :(

Later, out of curiosity, I drilled out that screw and mounted the cooler, just to see what it'd look like in place. That worked out flawlessly, which only made me more pissed about not thinking straight. If I'd used the drill in the first place, the whole thing would've been a cakewalk...


#63 clyde46
Member since 2005 • 49061 Posts

[QUOTE="clyde46"][QUOTE="Rocker6"]

Must've confused me with someone! :P

Been rocking the 670 since May, when my 580 died in a mad experiment gone wrong...

Rocker6

What did you do to that poor thing?!

Had a reference Zotac model, great card, but the fan was too loud, and was pissing me off. Then I bought an aftermarket cooler, AC Accellero Xtreme, but while removing the stock cooler, one of those small screws on the PCB got stripped, so I tried removing it by force (a huge mistake), I accidentally scratched the PCB, killing the card... :(

Later, out of curiosity, I drilled out that screw, and mounted the cooler, just to see what it'd look like in place. That worked out flawlessly, only making me pissed about not thinking straight. If I used the drill in the first place, the whole thing would've been a cakewalk...

Should have stuck to playing with headphones :P

#64 Rocker6
Member since 2009 • 13358 Posts

[QUOTE="Rocker6"]

[QUOTE="clyde46"] What did you do to that poor thing?!
clyde46

Had a reference Zotac model, great card, but the fan was too loud, and was pissing me off. Then I bought an aftermarket cooler, AC Accellero Xtreme, but while removing the stock cooler, one of those small screws on the PCB got stripped, so I tried removing it by force (a huge mistake), I accidentally scratched the PCB, killing the card... :(

Later, out of curiosity, I drilled out that screw, and mounted the cooler, just to see what it'd look like in place. That worked out flawlessly, only making me pissed about not thinking straight. If I used the drill in the first place, the whole thing would've been a cakewalk...

Should of stuck to playing with headphones :P

Nah, always hated the damn things. Don't want any creepy looking headwear while gaming! :P


#65 Kinthalis
Member since 2002 • 5503 Posts

[QUOTE="clyde46"][QUOTE="Rocker6"]

Meh, won't change anything... 360 and the PS3 are nearly identical in power, yet it didn't stop the pixel counting, and claiming ownages over 0.1% difference in performance and texture quality.

But yeah, seeing the consolites go all-out in such pointless arguments is always good for a few laughs! ;)

Rocker6

Just wait till they gang up and try to claim ownage over the PC.

The tears will be sweet... no custom hardware for them this time, no unified shader model, nothing... :twisted:

By the time consoles come out, we'll already have new AMD and Nvidia series, making a console using something like 7970M well outdated...

Not only that, but the TEH OPTIMIZSIONS!!!!?! excuse will take a significant hit next gen.

Current-gen consoles can be optimized to perform better than PC hardware running the same *DX9* engine. That is because on the consoles, developers can get around a lot of the limitations of DX9.

However, modern APIs like DX11 are going to be standard for next-gen games on PC, and DX11 is a LOT more in line with modern programming and hardware than DX9 is. The difference between optimizing code at a low level on the consoles and running DX11 on a PC will be significantly smaller than it is with DX9.


#66 clyde46
Member since 2005 • 49061 Posts

[QUOTE="clyde46"][QUOTE="Rocker6"]

Had a reference Zotac model, great card, but the fan was too loud, and was pissing me off. Then I bought an aftermarket cooler, AC Accellero Xtreme, but while removing the stock cooler, one of those small screws on the PCB got stripped, so I tried removing it by force (a huge mistake), I accidentally scratched the PCB, killing the card... :(

Later, out of curiosity, I drilled out that screw, and mounted the cooler, just to see what it'd look like in place. That worked out flawlessly, only making me pissed about not thinking straight. If I used the drill in the first place, the whole thing would've been a cakewalk...

Rocker6

Should of stuck to playing with headphones :P

Nah, always hated the damn things. Don't want any creepy looking headwear while gaming! :P

Unless you have a stalker, what are you worried about :P

#67 Rocker6
Member since 2009 • 13358 Posts

[QUOTE="Rocker6"]

[QUOTE="clyde46"] Should have stuck to playing with headphones :P
clyde46

Nah, always hated the damn things. Don't want any creepy looking headwear while gaming! :P

Unless you have a stalker, what are you worried about :P

Well, I've hated headwear ever since I was a little kid. It annoys the crap out of me; from headphones to winter caps, I hate them all!

I know, I know, must sound strange, but I guess you can call it a pet peeve of mine! :P


#68 tormentos
Member since 2003 • 33784 Posts

[QUOTE="Rocker6"]

[QUOTE="clyde46"] Just wait till they gang up and try to claim ownage over the PC.
Kinthalis

The tears will be sweet... no custom hardware for them this time, no unified shader model, nothing... :twisted:

By the time consoles come out, we'll already have new AMD and Nvidia series, making a console using something like 7970M well outdated...

Not only that but TEH OPTIMIZSIONS!!!!?! excuse will take a significant hit next gen.

Current gen cosnoles can be optimized and perform better than PC hardware running the same *DX9* Engine. That is because on the consoles, developers can get around a lot of the limitations of DX9.

However, modern API's like DX 11 are going to be standard for next gen game son PC, and DX11 is a LOT more in line with modern programming and hardware than DX9 is. The differene between optimizing code at a low level on the consoles and running DX 11 on a PC will be singificantly smaller than doing the same on DX9.

How about having extra hardware to handle physics and lighting, wouldn't that help? Because they say the PS4 has extra hardware for things like that, which would mean more free resources from the GPU, much like Cell helped the RSX. Not only that, this CPU and GPU are on the same chip, with GDDR5 not on a PCIe port, so exchange of data is actually faster. There are other variables as well; I bet that if you showed people in 2006 screens of Uncharted 3, they would have told you they were bullshots. Many of you forget how weak the RSX is and how incredible Uncharted 3 looks coming from such weak hardware.

#69 clyde46
Member since 2005 • 49061 Posts

[QUOTE="clyde46"][QUOTE="Rocker6"]

Nah, always hated the damn things. Don't want any creepy looking headwear while gaming! :P

Rocker6

Unless you have a stalker, what are you worried about :P

Well, hated headwear ever since I was a little kid. It annoys the crap out of me, from headphones, to winter caps, I hate them all!

I know, I know, must sound strange, but I guess you can call it a pet peeve of mine! :P

Headphones are the best, you just have to get a set that's right for you. My HD25 Mk 2s aren't the best for gaming as they're designed for monitoring, but they still sound pretty good; the only problem is that after about 4 hours they start to hurt.

#70 Floppy_Jim
Member since 2007 • 25931 Posts


lawl

Sony's devs will do some truly ridiculous things with 4GB of GDDR5 RAM, considering all they did with that weird 256/256MB setup. I'm curious about Durango's leftover 3GB of RAM which, by my limited understanding, is extreme overkill for the OS.


#71 GotNugz
Member since 2010 • 681 Posts

I don't know what to believe anymore. 3GB of RAM for the Durango OS seems overkill, and the rest being DDR3 is just outdated. I'm not sold on Sony using an 8-core CPU; it's the first time I've heard about that, while on the other hand Durango has been rumored for a while to be using some 8- or 16-core processor. I think both companies are using smoke and mirrors to throw each other off. I think Sony's final SDKs are out already, while the final 720 hardware won't be in devs' hands until May 2013.

PlayStation Omni final hardware: AMD APU, quad-core @ 3.2GHz with built-in HD 6670 (Liverpool) GPU, AMD 7850-level dedicated GPU, 4GB GDDR5.

Xbox Durango: AMD octo-core @ 2.0GHz, GPU (Venus) custom-built reference 8900 series, 1050MHz core clock, 8GB DDR3 (I believe it will be split, 4GB DDR3 for the OS and 4GB GDDR5 for games). MS may decide to throw in another GPU, maybe the rumored MARS variant, for a Crossfire-type setup.


#72 Kinthalis
Member since 2002 • 5503 Posts

[QUOTE="Kinthalis"]

[QUOTE="Rocker6"]

The tears will be sweet... no custom hardware for them this time, no unified shader model, nothing... :twisted:

By the time consoles come out, we'll already have new AMD and Nvidia series, making a console using something like 7970M well outdated...

tormentos

Not only that but TEH OPTIMIZSIONS!!!!?! excuse will take a significant hit next gen.

Current gen cosnoles can be optimized and perform better than PC hardware running the same *DX9* Engine. That is because on the consoles, developers can get around a lot of the limitations of DX9.

However, modern API's like DX 11 are going to be standard for next gen game son PC, and DX11 is a LOT more in line with modern programming and hardware than DX9 is. The differene between optimizing code at a low level on the consoles and running DX 11 on a PC will be singificantly smaller than doing the same on DX9.

How about having extra hardware to handle physic and lighting wouldn't that help.? Because is say the PS4 has extra hardware for things like that,which would mean more free resources from the GPU much like Cell helped the RSX,not only that this CPU and GPU are on the same chip,with GDDR5 not on a PCIe port,so exchange of data is actually faster,there are other variables as well,i bet that if you showed people on 2006 screens of Uncharted 3 they would have told you it was bullshots. Many of you forgot how weak the RSX is and how incredible Uncharted 3 look to come from such a weak hardware..

What hardware will be there to help? I would imagine perhaps an APU that can tap a low-powered GPU for physics? That can be compensated for on PC by using DX11's DirectCompute on the GPU, and remember that CPUs on the PC are more robust.


#74 ninjapirate2000
Member since 2008 • 3347 Posts

Both consoles will be the same, but Xbox has the advantage of DirectX.


#75 cryemocry
Member since 2013 • 590 Posts

wii ruined sony and microsoft


#76 deactivated-57d8401f17c55
Member since 2012 • 7221 Posts

[QUOTE="Kinthalis"]

[QUOTE="Rocker6"]

The tears will be sweet... no custom hardware for them this time, no unified shader model, nothing... :twisted:

By the time consoles come out, we'll already have new AMD and Nvidia series, making a console using something like 7970M well outdated...

tormentos

Not only that but TEH OPTIMIZSIONS!!!!?! excuse will take a significant hit next gen.

Current gen cosnoles can be optimized and perform better than PC hardware running the same *DX9* Engine. That is because on the consoles, developers can get around a lot of the limitations of DX9.

However, modern API's like DX 11 are going to be standard for next gen game son PC, and DX11 is a LOT more in line with modern programming and hardware than DX9 is. The differene between optimizing code at a low level on the consoles and running DX 11 on a PC will be singificantly smaller than doing the same on DX9.

How about having extra hardware to handle physic and lighting wouldn't that help.? Because is say the PS4 has extra hardware for things like that,which would mean more free resources from the GPU much like Cell helped the RSX,not only that this CPU and GPU are on the same chip,with GDDR5 not on a PCIe port,so exchange of data is actually faster,there are other variables as well,i bet that if you showed people on 2006 screens of Uncharted 3 they would have told you it was bullshots. Many of you forgot how weak the RSX is and how incredible Uncharted 3 look to come from such a weak hardware..

Yeah, I wonder what they added to the CPU to help with graphics this time. I've recently been playing Uncharted 3 and it is amazing more often than not. Only 2 parts of the game don't stack up.

Seems like the PS4 is taking the same approach to hardware as the PS3 did, while the 720 is taking a page from the Wii U's book: an abundance of DDR3 RAM with tons of eDRAM to offset speed issues.


#77 cryemocry
Member since 2013 • 590 Posts

Both consoles will be weak; they don't care about gamers anymore.


#78 White_Dreams
Member since 2011 • 925 Posts
3GB reserved, so only 5GB of slower RAM compared to 4GB of much faster RAM; looking like the PS4 has the advantage in that department.

#79 PAL360
Member since 2007 • 30570 Posts

Nice thread, Adobe, it was needed! Can't wait to see what devs like Rockstar, Naughty Dog, 343i, Epic, etc. can do on these consoles, considering what they managed to achieve on 2005 hardware!


#80 tormentos
Member since 2003 • 33784 Posts
What hardware will be there to help? I would imagine perhaps an APU that can tap a low-powered GPU for physics? That can be compensated for on PC by using DX11's DirectCompute on the GPU, and remember that CPUs on the PC are more robust.
Kinthalis
Didn't you read the article? "However, there's a fair amount of 'secret sauce' in Orbis and we can disclose details on one of the more interesting additions. Paired up with the eight AMD cores, we find a bespoke GPU-like 'Compute' module, designed to ease the burden on certain operations - physics calculations are a good example of traditional CPU work that are often hived off to GPU cores. We're assured that this is bespoke hardware that is not a part of the main graphics pipeline." Now if that is there to help with physics, lighting and stuff like that, I am sure Sony will be able to get more from the GPU than PC developers will. Also remember that PC games are made with multiple configurations in mind, and aren't targeted as specifically as console games are.

#81 cryemocry
Member since 2013 • 590 Posts

How many people here bought the Xbox 1 and PS1 at launch? lol. Well, I did, in '95 and 2001, and I have no hype for the PS4/720 since they sold out on me.


#82 deactivated-57d8401f17c55
Member since 2012 • 7221 Posts

3GB reserved, so only 5GB of slower RAM compared to 4GB of much faster RAM; looking like the PS4 has the advantage in that department.
White_Dreams

But the 720 could have like 128MB of eDRAM or something crazy like that. Plus it says the PS4's OS will use half a gig, so 3 and a half gigs available for games.


#83 tormentos
Member since 2003 • 33784 Posts
Yeah I wonder what they added to the CPU to help with graphics this time. I'm recently playing Uncharted 3 and it is amazing more often than not. Only 2 parts of the game don't stack up.

Seems like the Ps4 is taking the same approach to hardware as Ps3 did while 720's taking a page from Wii U's book. Abundance of DDR3 ram with tons of eDRAM to offset speed issues.

Chozofication
It's not on the CPU; apparently it's hardware outside the GPU and CPU. I hope Sony was smart enough to learn from Cell's strengths; physics was one of them, and in games like Uncharted 3 and The Last of Us it looks impressive, even superior to much of what's found on PC.

#84 tormentos
Member since 2003 • 33784 Posts
But the 720 could have like 128MB of eDRAM or something crazy like that. Plus it says the PS4's OS will use half a gig, so 3 and a half gigs available for games.
Chozofication
It's rumored to be 32MB. 3 and a half GB of GDDR5 RAM is quite a lot; the PS3 only had 256MB for video. It's about bandwidth: GDDR5 = 192GB/s, DDR3 best scenario = 68GB/s.
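
(For anyone wondering where figures like 192GB/s and 68GB/s come from: peak bandwidth is just bus width times effective data rate. A rough sketch; the data rates below are typical values assumed for illustration, not confirmed console clocks:)

    # Peak bandwidth (GB/s) = bus width in bits / 8 * effective transfer rate in GT/s
    def bandwidth_gbs(bus_width_bits, effective_gtps):
        return bus_width_bits / 8 * effective_gtps

    print(bandwidth_gbs(256, 6.0))     # 256-bit GDDR5 at an assumed 6 GT/s -> 192.0 GB/s
    print(bandwidth_gbs(256, 2.133))   # 256-bit DDR3-2133 (assumed)        -> ~68.3 GB/s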

#85 White_Dreams
Member since 2011 • 925 Posts

[QUOTE="White_Dreams"]3GB reserved, so only 5GB of slower RAM compared to 4GB of much faster RAM; looking like the PS4 has the advantage in that department.
Chozofication

But 720 could have like 128 mb's of eDRAM or something crazy like that. Plus it says PS4's OS will use half a gig, so 3 and a half gigs available for games.

I would think we would have heard about it by now; from what I understand, people have known about Durango's specs longer. And yeah, 3.5, but that is about a 1.5GB difference, with RAM 2.5 times the speed.

#86 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="Rocker6"]

[QUOTE="clyde46"] Just wait till they gang up and try to claim ownage over the PC.
Kinthalis

The tears will be sweet... no custom hardware for them this time, no unified shader model, nothing... :twisted:

By the time consoles come out, we'll already have new AMD and Nvidia series, making a console using something like 7970M well outdated...

Not only that but TEH OPTIMIZSIONS!!!!?! excuse will take a significant hit next gen.

Current gen cosnoles can be optimized and perform better than PC hardware running the same *DX9* Engine. That is because on the consoles, developers can get around a lot of the limitations of DX9.

However, modern API's like DX 11 are going to be standard for next gen game son PC, and DX11 is a LOT more in line with modern programming and hardware than DX9 is. The differene between optimizing code at a low level on the consoles and running DX 11 on a PC will be singificantly smaller than doing the same on DX9.

Xbox 360 can mix pixel and vertex instructions, which you can't do on DX9c PC. DX9c PC doesn't have Xbox 360's memexport (DX10 has a similar feature) or DX10's 3DC+ texture compression features.

#88 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="MK-Professor"]

both cpu and gpu look outdated even for today standards. I mean a real HD7970 is almost double the speed of a HD7970M

Chozofication

Consoles should all use laptop GPUs really; they'd get a top-of-the-line GPU that doesn't create heat like a volcano. The Wii U uses a laptop GPU, and I think it'd be great if Sony and MS take note and start focusing more on power and heat costs. No one wants another RROD.

All I want for next gen is what PC can do now but in a console environment; we'll get a lot better than that.

Laptop GPUs usually avoid flagship desktop GPU chips, e.g. the laptop 7970M is based on the desktop 7870.

#89 deactivated-57d8401f17c55
Member since 2012 • 7221 Posts

[QUOTE="Chozofication"]

[QUOTE="White_Dreams"]3GB reserved, so only 5GB of slower RAM compared to 4GB of much faster RAM; looking like the PS4 has the advantage in that department.
White_Dreams

But 720 could have like 128 mb's of eDRAM or something crazy like that. Plus it says PS4's OS will use half a gig, so 3 and a half gigs available for games.

I would think we would have heard it by now, from what i understand people have known about durango's specs longer, and yeah 3.5, but that is about 1.5 GB difference but with ram 2.5 times the speed.

Yeah, the PS4 will have super fast RAM, and it should have plenty. I don't think the difference in RAM amount will matter much. Durango won't have that much eDRAM, but it will have to have at least double the Wii U's; 32MB would be an unthinkably bad decision, I would expect 96.


#90 ronvalencia
Member since 2008 • 29612 Posts
I hope at least the HD 8770 is faster than my 2 year old HD 6990
Mystery_Writer
The 8770 would be faster than the 8760 OEM (a renamed 7770), but slower than the 8870 OEM (a renamed 7870). It's unlikely the 8770 would beat two Radeon HD 6970s, i.e. you'd need two 7850s.

#91 TheXFiles88
Member since 2008 • 1040 Posts

[QUOTE="Chozofication"]But 720 could have like 128 mb's of eDRAM or something crazy like that. Plus it says PS4's OS will use half a gig, so 3 and a half gigs available for games.
tormentos
Is rumored to be 32MB. 3 and half GB of GDDR5 ram is quite allot the PS3 only had 256 for video.. Is about bandwidth.. GDDR5 = 192GBS DDR3 best scenario = 68GB..

Lol, you are such a hypocrite. You just want to believe whatever looks good for $ony but always rebuff any advantages that the next Xbox might have. Also, you still don't understand the RAM types to begin with.


#92 ronvalencia
Member since 2008 • 29612 Posts

both cpu and gpu look outdated even for today standards. I mean a real HD7970 is almost double the speed of a HD7970M

MK-Professor

The desktop HD 7970 uses the AMD Tahiti XT GPU, while the HD 7970M uses a downclocked/gimped AMD Pitcairn XT (aka 7870).

----

Just to mess everybody up, AMD releases/allows the "7870" model number with an AMD "Tahiti" GPU, i.e. 256-bit VRAM + AMD "Tahiti" LE GPU, e.g. the PowerColor Radeon HD 7870 MYST Edition.

You have two GPU designs referring to the same "7870" model number. PowerColor's Radeon HD 7870 MYST Edition (AMD "Tahiti" LE) is usually faster than a reference "7870 GHz Edition" (AMD Pitcairn XT).


#93 Mystery_Writer
Member since 2004 • 8351 Posts

More rumors just emerged - sauce

=========

The cycle of current-gen hardware is coming to an end, which inevitably means the cycle of rumors about the specs of the next generation must begin. Right on cue, an unnamed insider has released a hefty amount of detail regarding the specs of the PS4 and Xbox 720. Be warned, these are only rumors, but they do give us some interesting scenarios to ponder.

Spec rumors of prominent pieces of tech must be taken with a grain of salt, but they almost always pose some interesting questions. Toward the latter years of the PS3 and Xbox 360 generation, games have ended up relatively on par, from both a graphical and a technical standpoint. This was largely because the machines are similar in power, and because many popular franchises are multiplatform (Call of Duty, Mass Effect, Assassin's Creed, etc.), and if you're a developer, there's really no need to make one version of a product technically superior to the others.

PlayStation 4

The insider noted the name of the console will be the Omni, and Sony chose that name to show the unit can do everything. It'll be interesting to know if it'll simply be called Omni, or something like PlayStation Omni. If Sony does drop the PlayStation brand from the name, it would be a first for the company's gaming systems and a huge marketing move.

More notable than the naming is the rumor that the Omni won't be capable of displaying in a native 4K resolution, which is most likely going to be the next significant upgrade from HDTV (though ExtremeTech's Joel Hruska suggests OLED TVs are going to be the next big thing). Instead, the Omni will upscale to 4K, and as most of us know, a large enough upscale can significantly reduce image quality.


The insider also says that he has heard from other testers that the Omni dev kits are experiencing heating problems. This is said to be a result of Sony not having enough money to devote to research and development, like they did with the PS3's Cell processor. He also claimed that Omni dev kits have quietly been available since December of 2012, as Sony is attempting to launch the Omni before Microsoft's NextBox. Furthermore, the Omni's chipset, dubbed Starsha, is supposedly going into production on January 23 of this year.

Of course, previous rumors stated that the PS4 would launch in September of 2013, so if the chipset is going into production just a few days from now, the units could easily be ready by then.

The supposed tech specs of the Omni:

  • CPU is an x86 system with a 256-bit bus
  • An APU with a fast GPU
  • 4GB of DDR3 RAM
  • 4GB of GDDR3 RAM
  • Capable of 3.2TFLOPS of data
  • An off-the-shelf design, more reminiscent of a PC than a next-gen gaming console

==========

X720

Supposedly, the Xbox 720 is designed to employ ray tracing, a method of graphics rendering that yields very pretty results, but requires much more computational power than the current system. Because of this, ray tracing isn't ideal for graphics that need to be generated on-the-fly, such as those found in video games, and is better suited for imagery that can be created beforehand, such as movies. Pixar films are generated using ray tracing, so compare those to a standard modern-day video game and you'll get some idea of the difference.

The console will supposedly contain three SoCs, one Mars model used for the system, and two Venus models used for applications. The insider's specs of the Xbox 720 are a lot more detailed and specific than those of the Omni:

  • Mars SoC: AMD 8850-spec GPU clocked at 600MHz, x86 1.8GHz quad-core CPU, audio digital signal processing
  • Venus SoC: AMD 8900-spec GPU clocked at 800MHz, quad-core CPU clocked at 2.5GHz, 1.5GB GDDR3 RAM on each SoC clocked at 1.2GHz
  • Power brick of 300 watts
  • Console has 8GB of RAM, but 1GB is dedicated to the operating system
  • Capable of 4.2Tflops of data

Whether or not all this information is true, most of the information the insider released about the PS4 was presented with a somewhat negative spin, whereas the information released about the Xbox 720 was put forth in a positive way. Again, none of this information could be true, but if it is, either the PS4 is looking like the inferior unit, or the insider has a noticeable bias.


Conversely, VG247 spoke with developer sources after CES 2013, and were told that the PS4 will have a run-capability of 1.84Tflops, whereas the Xbox 720 will only have a run-capability of 1.23Tflops. VG247 pointed out that if these rumors are true, then the PS4 will be 50% more computationally powerful than the NextBox. As for the conflicting RAM, the CES sources claimed that the PS4 will have 4GB of RAM, and use 1GB of it strictly for the operating system, whereas the Xbox 720 will have 8GB of RAM, but will have 3GB dedicated to the operating system. The CES sources also stated that both consoles will play 100GB Blu-ray discs.

Previous rumors suggested that the PS4 is built on AMD's A10 APU, but Joel Hruska swooped in to note that AMD's A10-5800K is not capable of generating the aforementioned 1.84Tflops, and can only manage about 736Gflops. That delta, he noted, could not be made up with a simple variant of the chip; it would require a different design.

So, while both variations of the console specs that have been recently leaked seem to be very different from each other, one common thread is that the PS4 and Xbox 720 will have some sort of significant gap in power between one another, whereas the PS3 and Xbox 360 are comparatively similar. If you don't want to start arguing on the internet about which console is rumored to be more capable, at least you can all agree that both consoles will blow the Wii U out of the water.


#94 ronvalencia
Member since 2008 • 29612 Posts

Both consoles will be the same, but Xbox has the advantage of DirectX.

ninjapirate2000

DirectX 11.1 feature level 11_1 doesn't expose AMD GCN's non-DX GPGPU features.

From AMD's POV, DirectX is part of legacy support.



#95 deactivated-57d8401f17c55
Member since 2012 • 7221 Posts

Lol 4.2 teraflops and ray tracing.


#96 Mystery_Writer
Member since 2004 • 8351 Posts
[QUOTE="Mystery_Writer"]I hope at least the HD 8770 is faster than my 2 year old HD 6990
ronvalencia
8770 would be faster than 8760 OEM (rename 7770), but slower than 8870 OEM (rename 7870). It's unlikely 8770 would beat two Radeon HD 6970s i.e. you need two 7850s.

OK, which one is faster, 256-bit GDDR5 or 384-bit GDDR3 clocked at 1.2GHz?
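
(Depends on what the GDDR5 is clocked at, but as a rough sketch: GDDR5 moves 4 bits per pin per clock, GDDR3 moves 2, and peak bandwidth is bus width times effective data rate. The 1.25GHz GDDR5 clock below is an assumption for illustration, not a quoted spec:)

    # Peak bandwidth (GB/s) = bus width in bits / 8 * clock (GHz) * bits per pin per clock
    gddr5 = 256 / 8 * 1.25 * 4   # 256-bit GDDR5 @ 1.25GHz (assumed) -> 160.0 GB/s
    gddr3 = 384 / 8 * 1.2 * 2    # 384-bit GDDR3 @ 1.2GHz            -> 115.2 GB/s
    print(gddr5, gddr3)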

#97 clyde46
Member since 2005 • 49061 Posts

Lol 4.2 teraflops and ray tracing.

Chozofication
And that 8900 series GPU....

#98 ronvalencia
Member since 2008 • 29612 Posts

Previous rumors suggested that the PS4 is built on AMDs A10 APU, but Joel Hruska swooped in to note that AMDs A10-5800K is not possible of generating the aforementioned 1.84Tflops, and can only manage about 736Gflops. That delta, he noted, would not be able to be made up with a simple variant of the chip, it would require a different design.

Mystery_Writer

The AMD Kaveri APU gets about 1 TFLOPS from a Radeon HD 7750-level GPU, i.e. 8-CU GCN. To reach 1.8 TFLOPS, it needs to double the CU count to 16, which rivals a Radeon HD 7850-level GPU, i.e. 16-CU GCN.

The AMD Kaveri APU can be extended with an additional 8 CUs plus the related fixed-function GPU hardware, i.e. it's like duct-taping another 7750 GCN onto the AMD Kaveri APU design.
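
(The arithmetic behind those figures, using the usual GCN peak-FLOPS formula; the 800MHz clock is an illustrative assumption, not a confirmed console clock:)

    # GCN peak TFLOPS = CUs * 64 shaders per CU * 2 ops per clock (FMA) * clock in GHz / 1000
    def gcn_tflops(cus, clock_ghz):
        return cus * 64 * 2 * clock_ghz / 1000

    print(gcn_tflops(8, 0.8))    # ~0.82 TFLOPS, roughly 7750-class
    print(gcn_tflops(16, 0.8))   # ~1.64 TFLOPS, approaching the rumored 1.84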


#99 cryemocry
Member since 2013 • 590 Posts

1080p kinect


#100 04dcarraher
Member since 2004 • 23829 Posts
[QUOTE="Chozofication"]

Lol 4.2 teraflops and ray tracing.

clyde46
And that 8900 series GPU....

No 8850 and a 8950 duct taped together! :P