AMD: Nvidia "completely" sabotaged Witcher 3's performance


#1  Edited By Gue1
Member since 2004 • 12171 Posts

As always, it's a battle over who gets the optimization attention, and in this case AMD claims it got shafted in favor of Nvidia's HairWorks.

First, it was shiny new racing sim Project Cars that drew the ire of the Reddit community, with users claiming that the game is built on a version of PhysX that "simply does not work on AMD cards." Never one to turn away from a good Nvidia-bashing, AMD's corporate VP of alliances Roy Taylor responded with a tweet, saying "Thank for supporting/ wanting an open and fair PC gaming industry."

Naturally, with the complaints flowing in thick and fast, Project Cars developer Slightly Mad Studios joined the fray, and proceeded to place the blame for the game's issues squarely on AMD. "We've provided AMD with 20 keys for game testing as they work on the driver side," said Slightly Mad Studios' Ian Bell. "But you only have to look at the lesser hardware in the consoles to see how optimised we are on AMD based chips."

A CD Projekt Red developer (speaking to Overclock3D) weighed in on the performance issues, saying: "Many of you have asked us if AMD Radeon GPUs would be able to run NVIDIA’s HairWorks technology—the answer is yes! However, unsatisfactory performance may be experienced, as the code of this feature cannot be optimized for AMD products. Radeon users are encouraged to disable NVIDIA HairWorks if the performance is below expectations."

The trouble with GameWorks, at least as AMD tells it, is that Nvidia isn't willing to share the source code for its proprietary graphics APIs like HairWorks and HBAO+. Without that source code, AMD can't optimize its drivers for Nvidia's tech.

Which brings us to the next question: why did CD Projekt Red choose to include HairWorks but not AMD's TressFX? It's entirely possible to include tech from both companies, and as Nvidia claimed to PC Perspective, its "agreements with developers don't prevent them from working with other IHVs [independent hardware vendors]."

Ultimately, though, there's an additional amount of time and cost attached to including two very different types of technology to produce largely the same effect.

The article is super long; I just copy-pasted a few lines.

http://arstechnica.co.uk/gaming/2015/05/amd-says-nvidias-gameworks-completely-sabotaged-witcher-3-performance/1/


#2 Effec_Tor
Member since 2014 • 914 Posts

Didn't Nvidia get sued for that practice in the past?


#3  Edited By lostrib
Member since 2009 • 49999 Posts

Isn't that more the developers' fault?


#4  Edited By RedentSC
Member since 2013 • 1243 Posts

Yeah, this is Nvidia being a dickhead. They don't have to release the source, just provide it to AMD, who can work on driver optimization. I know there could be some issues with them being the competition, but all AMD APIs are open source....

Basically Nvidia are money grubbing bastards


#5  Edited By ReadingRainbow4
Member since 2012 • 18733 Posts

That's pretty shit, but things like this as well as the exclusive gpu based things like Physx are part of the reason I only look to purchase Nvidia GPU's now.


#6  Edited By mbrockway
Member since 2007 • 3560 Posts

I'd say **** Nvidia, but my laptop has a 960M in it. Desktop rig has AMD's crossfired, though. This seems like giant BS. It sounds like it's not actually adding features; it's getting devs to add code that intentionally bogs down AMD cards. Then they throw it in AMD's face and say "make it work lol" knowing full well AMD can't do crap about it without Nvidia's source code. That's stupid and anti-competitive. The games probably run fine on the PS4 and Xbone because that code isn't there.

Good to know not to buy Project Cars. Frankly, Shift 2 was garbage, and I don't like supporting devs that support crap like this. They know full well why it runs well on the consoles and not on AMD PCs.


#7 aroxx_ab
Member since 2005 • 13236 Posts

Aren't even Nvidia users turning off hair effects and PhysX in games to get better performance?


#8 uninspiredcup
Member since 2013 • 58822 Posts

Nvidia are the best, Cobra Kai forever.


#9  Edited By Heil68
Member since 2004 • 60695 Posts

How scandalous for AMD PC users. :(


#11  Edited By lamprey263
Member since 2006 • 44532 Posts

They made the same claim after AC Unity.

They made this claim long before then, too.

The issue has to do with Nvidia helping devs squeeze better performance out of their games, so much so that the games are basically tailored to Nvidia hardware at the expense of ignoring development for AMD hardware.


#12 Metallic_Blade
Member since 2005 • 565 Posts

I second that nVIDIA are being a bunch of pricks. They always have been. At least AMD is nice enough to offer open APIs. That, and they don't price gouge as much as nVIDIA does on their GPU line. You know what, we should do away with all of this proprietary crap that constantly plagues development.

Eh, Physx looks gimmicky anyways.


#13 mbrockway
Member since 2007 • 3560 Posts

@Metallic_Blade said:

I second that nVIDIA are being a bunch of pricks. They always have been. At least AMD is nice enough to offer open APIs. That, and they don't price gouge as much as nVIDIA does on their GPU line. You know what, we should do away with all of this proprietary crap that constantly plagues development.

Eh, Physx looks gimmicky anyways.

PhysX is open source now. TressFX is not, however, along with all the GameWorks crap that everyone just turns off.


#14 Metallic_Blade
Member since 2005 • 565 Posts

@mbrockway said:
@Metallic_Blade said:

I second that nVIDIA are being a bunch of pricks. They always have been. At least AMD is nice enough to offer open APIs. That, and they don't price gouge as much as nVIDIA does on their GPU line. You know what, we should do away with all of this proprietary crap that constantly plagues development.

Eh, Physx looks gimmicky anyways.

PhysX is open source now. TressFX is not, however, along with all the GameWorks crap that everyone just turns off.

You don't say? Huh, I guess I was wrong about Physx then.


#15 Gamer_Durandal
Member since 2014 • 129 Posts

If the market share was reversed, you can bet AMD would do it to NVidia in a heartbeat. I kind of take what AMD is saying with a grain of salt (or an entire salt lick...) because they make crappy drivers. ATI made horrible drivers, and AMD has continued that tradition - apparently not figuring out that if you make advanced hardware, you need to invest in making solid, performing drivers to match.

Perhaps DirectX12 will level the playing field a bit - allow people to mix and match both camps' cards to get the best of both worlds. Until then, AMD will just need to step up their game if they want to compete (for now).


#16 DEadliNE-Zero0
Member since 2014 • 6607 Posts

lol, AMD is salty


#17 vtoshkatur
Member since 2011 • 1962 Posts

Nvidia has many shady practices, such as gimping the GTX 960 with a sub-par 128-bit bus.


#18  Edited By JangoWuzHere
Member since 2007 • 19032 Posts

Nvidia has always been a bunch of shady dirtbag assholes. I would have bought an AMD card this year if they had at least announced one by now.

Even though I have a 970, I still recommend everyone turn off those added Nvidia effects like HairWorks and HBAO+. They look like total garbage, and they drain performance significantly.


#19 lundy86_4
Member since 2003 • 61466 Posts

I don't even have HairWorks enabled on my 970. Way too much of a hit.


#20 04dcarraher
Member since 2004 • 23829 Posts

@lundy86_4 said:

I don't even have HairWorks enabled on my 970. Way too much of a hit.

Not hard to fix: in rendering.ini, turn the HairWorks AA down to 2 instead of 8.
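For anyone looking for the tweak being described, it boils down to a one-line config change. A sketch of what the edit looks like (the file path and key name here are approximate and may vary by game version, so verify against your own install):

```ini
; The Witcher 3: <install dir>\bin\config\base\rendering.ini
; (path and key name approximate; check your own install)
; Lowering the HairWorks MSAA level from the default 8 to 2 recovers
; most of the performance cost with little visible difference.
HairWorksAALevel=2
```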


#21 lundy86_4
Member since 2003 • 61466 Posts

@04dcarraher said:
@lundy86_4 said:

I don't even have HairWorks enabled on my 970. Way too much of a hit.

Not hard to fix: in rendering.ini, turn the HairWorks AA down to 2 instead of 8.

Yeah, I saw the post on that. I could bother with it, but the lack of HairWorks really doesn't bother me. Nice feature, but nothing crazy.


#22 04dcarraher
Member since 2004 • 23829 Posts


lol, people blaming Nvidia... AMD tends to blame everyone but themselves...

The game in general uses a good amount of tessellation, and HairWorks adds more heavy tessellation on every object that uses it. AMD's GCN-based GPUs are weak at tessellation: they're good with direct compute but perform poorly with tessellation. Nvidia's Kepler performs OK with tessellation but poorly with direct compute, and Maxwell handles both quite well, with tessellation 2-3x faster than Kepler. AMD users can bypass the performance hit by going into their control panel and limiting tessellation to 8x or 16x, which fixes the overuse of tessellation. Editing the rendering.ini file to change the HairWorks AA from 8 to 2 helps as well. The game's 1.03 patch also lowered tessellation usage in HairWorks, speeding things up some.

AMD blaming Nvidia for sabotage is just plain stupid.


#23  Edited By parkurtommo
Member since 2009 • 28295 Posts

Nvidia has a clear monopoly on the PC gaming industry... Not sure how they're getting away with it. But I like their products and don't give a shit what they do towards AMD or anything like that. Just find it curious how their practices are legal.


#24 osan0
Member since 2004 • 17802 Posts

It's the one thing I don't like about Nvidia (currently a 780M owner). They develop all this cool tech but then make it as difficult as possible for other hardware manufacturers to use it. In the end no one wins: nearly half the PC gaming market can't use it (or can't use it well), and developers in general are reluctant to use it because it runs like crap on too many computers, so there's no point in implementing the feature.

With TressFX, AMD made it known how it works and gave all the information needed to get good performance. Nvidia had a driver update sorted quickly, and any issues Nvidia had with TressFX quickly became history.

I don't expect Nvidia to write AMD's drivers for them, but I don't like it when they make it unnecessarily difficult to get up to speed. AMD has to reverse-engineer the problem.

So, like so many of these cool features, we're again in a situation where developers won't integrate it themselves, and the only games that will use it are games where Nvidia does most of the legwork to get it running. This nonsense has to stop. It benefits no one.


#25 parkurtommo
Member since 2009 • 28295 Posts

@lundy86_4 said:
@04dcarraher said:
@lundy86_4 said:

I don't even have HairWorks enabled on my 970. Way too much of a hit.

Not hard to fix: in rendering.ini, turn the HairWorks AA down to 2 instead of 8.

Yeah, I saw the post on that. I could bother with it, but the lack of HairWorks really doesn't bother me. Nice feature, but nothing crazy.

It's quite noticeable on large creatures. But yeah, too much of a hit for it to be worth it, really. And whenever the camera gets close to Geralt's hair because of some obstacle or wall, the fps drops to like 10 for an instant (because of the progressive rendering of HairWorks based on distance, and the built-in AA).


#26  Edited By Daious
Member since 2013 • 2315 Posts

The sub-par driver support for my 780 Ti is extremely depressing. My performance was on par with a GTX 980 earlier in the year. Definitely not the case anymore.

Meanwhile, they were able to somewhat optimize a game for two consoles based on AMD hardware.


#27 dxmcat
Member since 2007 • 3385 Posts

When considering upgrades......

My system with 2 potential GTX 970s would use 480W

My system with 2 potential R9 290X would use 780W

So for needing over 60% more power, and putting out that much more heat, do I get a matching jump in performance?

No. K Thx Bye AMD.
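Sanity-checking the arithmetic on those wattage figures (the 480 W and 780 W totals are the poster's own estimates, not measured numbers):

```python
# Compare the two hypothetical dual-GPU builds using the post's own
# wattage estimates. Figures are the poster's, not measurements.
sli_970_watts = 480   # system with 2x GTX 970
cf_290x_watts = 780   # system with 2x R9 290X

extra_watts = cf_290x_watts - sli_970_watts
ratio = cf_290x_watts / sli_970_watts

print(f"extra draw: {extra_watts} W")   # extra draw: 300 W
print(f"power ratio: {ratio:.3f}x")     # power ratio: 1.625x
```

So the 290X build draws about 62% more power, not double, and since essentially all of that power ends up as heat, the heat difference scales by the same ratio.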


#28 lundy86_4
Member since 2003 • 61466 Posts

@parkurtommo said:

It's quite noticeable on large creatures. But yeah, too much of a hit for it to be worth it, really. And whenever the camera gets close to Geralt's hair because of some obstacle or wall, the fps drops to like 10 for an instant (because of the progressive rendering of HairWorks based on distance, and the built-in AA).

I ran it for the first little while, and it definitely looks nice. Not a huge deal breaker.

Really cool tech, mind you.


#29  Edited By Daious
Member since 2013 • 2315 Posts

@dxmcat said:

When considering upgrades......

My system with 2 potential GTX 970s would use 480W

My system with 2 potential R9 290X would use 780W

So for needing over 60% more power, and putting out that much more heat, do I get a matching jump in performance?

No. K Thx Bye AMD.

The R9 290 series is what, two years old?

Maxwell is less than a year old. AMD's Fiji architecture (the competitor to Maxwell) doesn't come out until June.

The major benefit of the R9 290 is its lower price: it's in the price range of a GTX 960 but offers way more performance.

We have to see what Fiji brings to the table to see how it compares with the 900 series.


#30 SolidTy
Member since 2005 • 49991 Posts

Nvidia plays dirty. Doesn't surprise me one bit.


#31 ominous_titan
Member since 2009 • 1217 Posts

@uninspiredcup: sweep the leg

Kobra_kai is my psn gamertag


#32 04dcarraher
Member since 2004 • 23829 Posts

@SolidTy said:

Nvidia plays dirty. Doesn't surprise me one bit.

AMD tends to blame everyone but themselves...

You didn't see Nvidia complain about the games using Mantle, the games that were promoted by AMD.

Witcher 3 in general uses a good amount of tessellation, and HairWorks adds more heavy tessellation on every object that uses it. AMD's GCN-based GPUs handle tessellation poorly: they're good with direct compute but weak with tessellation. Nvidia's Kepler performs OK with tessellation but poorly with direct compute, and Maxwell handles both decently. Maxwell handles tessellation better than Kepler and GCN.


#33  Edited By SexyJazzCat
Member since 2013 • 2796 Posts

Why the hell would a company share technology with its competitor? Lol.


#34 ronvalencia
Member since 2008 • 29612 Posts

@Gue1:

Read http://www.reddit.com/r/witcher/comments/36jpe9/how_to_run_hairworks_on_amd_cards_without/

How to run Hairworks on AMD cards without crippling performance

It's the same method as the Crysis 2 NVIDIA patch: excessive tessellation with large overdraw that doesn't improve visuals.
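To put rough numbers on that claim: in a simplified model, the number of triangles a tessellator emits grows with the square of the tessellation factor, which is why capping the factor in the driver recovers so much performance. (The 64x figure below is the level widely reported for HairWorks at the time; the patch count is made up for illustration, and real D3D11 tessellation output also depends on the partitioning mode and per-edge factors.)

```python
# Simplified model: triangles emitted per patch scale roughly with the
# square of the tessellation factor. Real D3D11 output also depends on
# partitioning mode and per-edge factors; this shows only the trend.
def approx_triangles(patches: int, tess_factor: int) -> int:
    return patches * tess_factor ** 2

patches = 10_000                        # hypothetical hair patch count
full = approx_triangles(patches, 64)    # factor reportedly used by HairWorks
capped = approx_triangles(patches, 16)  # cap set via AMD's driver panel

print(full // capped)   # capping 64x down to 16x cuts triangle work ~16x
```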

Project Cars issue


Up to 40 percent frame-rate improvements with Windows 10 on a Radeon R9 280X. Wait for Windows 10's release in July 2015.

http://wccftech.com/nvidia-responds-witcher-3-gameworks-controversy/

As a result of the backlash, Nvidia changed its policy to allow game developers access to the source code as of mid-April of last year. However, according to AMD, this did not alter the situation, as game developers engaged in the Nvidia GameWorks program were still not allowed to work with AMD to optimize the code for their hardware, something Nvidia initially denied but which Nvidia's Tom Petersen and Rev Lebaredian later admitted. Witcher 3 developer CD Projekt Red reaffirmed this again two days ago.


#35 ronvalencia
Member since 2008 • 29612 Posts

@04dcarraher: GCN Tonga handles HairWorks similarly to a 960.


#36 Pedro
Member since 2002 • 69345 Posts

@gamer_durandal said:

If the market share was reversed, you can bet AMD would do it to NVidia in a heartbeat. I kind of take what AMD is saying with a grain of salt (or an entire salt lick...) because they make crappy drivers. ATI made horrible drivers, and AMD has continued that tradition - apparently not figuring out that if you make advanced hardware, you need to invest in making solid, performing drivers to match.

Perhaps DirectX12 will level the playing field a bit - allow people to mix and match both camps' cards to get the best of both worlds. Until then, AMD will just need to step up their game if they want to compete (for now).

The claim that AMD has horrible drivers is parroted drivel, and you should stop spreading misinformation.


#37 HalcyonScarlet
Member since 2011 • 13659 Posts

I still prefer Nvidia. I like their hardware, software, and driver support. I keep hearing about people having performance issues on AMD cards. It's up to AMD to improve any time they want.

I don't think Nvidia should share anything. Think about it: they are a graphics company, they are proactive, and they invent decent new graphics hardware and software technologies. Why should they share, just because AMD is passive and doesn't try to lead the way itself?


#38 Pedro
Member since 2002 • 69345 Posts

@HalcyonScarlet said:

I still prefer Nvidia. I like their hardware, software, and driver support. I keep hearing about people having performance issues on AMD cards. It's up to AMD to improve any time they want.

I don't think Nvidia should share anything. Think about it: they are a graphics company, they are proactive, and they invent decent new graphics hardware and software technologies. Why should they share, just because AMD is passive and doesn't try to lead the way itself?

AMD, passive? That makes no sense. You have G-Sync from Nvidia, which is proprietary, and then you have FreeSync from AMD, which is an open standard. They both accomplish the same task, but G-Sync adds about $100 to the price of a monitor. And you've got HairWorks from Nvidia versus TressFX from AMD.


#39 Gamer_Durandal
Member since 2014 • 129 Posts

@Pedro: ...or I could be making an observation based on the number of AMD systems I've put together - every single one of them has had driver issues of some sort. Like the ALi Aladdin chipset... the ATI Radeon driver problems with Windows 95... I've seen driver issues with my mother's system, which is an A88X-based A10 system - great for web surfing and occasional video tasks. It took a serious amount of tinkering and under-clocking to even get that thing working properly - almost all the performance-boosting features in the A10 cause Windows 8 to lock up or stop responding - it took me three tries to get the OS to even install on it.

So, parroting, huh?

The only AMD system I have that hasn't given me trouble is serving as a FreePBX server box and that's because it isn't really pushed all that hard (No GUI, no fancy features needed, etc...).

I'd really like to see AMD/ATI and nVidia be on equal footing, but that can't happen as long as they don't address these problems. AMD hardware has some pretty impressive features listed and the designs of some of their products are pretty innovative. They just shoot themselves in the knees with the drivers and supporting software.


#40 HalcyonScarlet
Member since 2011 • 13659 Posts
@Pedro said:
@HalcyonScarlet said:

I still prefer Nvidia. I like their hardware, software, and driver support. I keep hearing about people having performance issues on AMD cards. It's up to AMD to improve any time they want.

I don't think Nvidia should share anything. Think about it: they are a graphics company, they are proactive, and they invent decent new graphics hardware and software technologies. Why should they share, just because AMD is passive and doesn't try to lead the way itself?

AMD, passive? That makes no sense. You have G-Sync from Nvidia, which is proprietary, and then you have FreeSync from AMD, which is an open standard. They both accomplish the same task, but G-Sync adds about $100 to the price of a monitor. And you've got HairWorks from Nvidia versus TressFX from AMD.

Maybe it's just a perception thing, then. But they need to stop whining and get competitive.

Anyway, personally, AMD products just don't excite me right now. They come off as the stuff you get when you just want bang for your buck. It's up to AMD to market their stuff right and get people excited about it. Stop blaming the competition.


#41 HalcyonScarlet
Member since 2011 • 13659 Posts
@gamer_durandal said:

@Pedro: ...or I could be making an observation based on the number of AMD systems I've put together - every single one of them has had driver issues of some sort. Like the ALi Aladdin chipset... the ATI Radeon driver problems with Windows 95... I've seen driver issues with my mother's system, which is an A88X-based A10 system - great for web surfing and occasional video tasks. It took a serious amount of tinkering and under-clocking to even get that thing working properly - almost all the performance-boosting features in the A10 cause Windows 8 to lock up or stop responding - it took me three tries to get the OS to even install on it.

So, parroting, huh?

The only AMD system I have that hasn't given me trouble is serving as a FreePBX server box and that's because it isn't really pushed all that hard (No GUI, no fancy features needed, etc...).

I'd really like to see AMD/ATI and nVidia be on equal footing, but that can't happen as long as they don't address these problems. AMD hardware has some pretty impressive features listed and the designs of some of their products are pretty innovative. They just shoot themselves in the knees with the drivers and supporting software.

I don't hear about that too often. Is it as literal as it sounds? You really had to turn the clock frequency down to get it to work?


#42 Gamer_Durandal
Member since 2014 • 129 Posts

@HalcyonScarlet: Yeah, it's advertised as a 3.7 GHz chip with 4.0 available through Turbo Boost. Not only would Turbo Boost cause the system to just hang for no reason, I had to clock down to 3.5 GHz to get the system to remain stable. The graphics power on it is impressive for a 'budget build' machine; it's just a shame I can't get it to live up to its higher potential, since that would really future-proof the system.


#43  Edited By Pedro
Member since 2002 • 69345 Posts

@gamer_durandal said:

@Pedro: ...or I could be making an observation based on the number of AMD systems I've put together - every single one of them has had driver issues of some sort. Like the ALi Aladdin chipset... the ATI Radeon driver problems with Windows 95... I've seen driver issues with my mother's system, which is an A88X-based A10 system - great for web surfing and occasional video tasks. It took a serious amount of tinkering and under-clocking to even get that thing working properly - almost all the performance-boosting features in the A10 cause Windows 8 to lock up or stop responding - it took me three tries to get the OS to even install on it.

So, parroting, huh?

The only AMD system I have that hasn't given me trouble is serving as a FreePBX server box and that's because it isn't really pushed all that hard (No GUI, no fancy features needed, etc...).

I'd really like to see AMD/ATI and nVidia be on equal footing, but that can't happen as long as they don't address these problems. AMD hardware has some pretty impressive features listed and the designs of some of their products are pretty innovative. They just shoot themselves in the knees with the drivers and supporting software.

Referencing Windows 95 is truly grasping at straws. A10s are APUs, and anyone expecting stellar performance from those is being unrealistic, whether it's Nvidia, AMD, or Intel. So yes, you are parroting. I have extensive experience with both Nvidia and AMD, and this claim that AMD drivers are terrible is nothing but unfounded trash talk. Out of all of the machines that I have constructed and supported (hundreds that I've worked on, not a handful), Nvidia has been the only one that has had driver issues. That's not to say either company is free of driver issues; they are more or less on the same level, with Nvidia being the only one that has actually failed on me because of drivers.


#44 04dcarraher
Member since 2004 • 23829 Posts

@ronvalencia said:

@04dcarraher: GCN Tonga handles Hairworks similar to 960

Tonga is the exception, ie GCN 1.2,


#45 Pedro
Member since 2002 • 69345 Posts

@HalcyonScarlet said:

Maybe it's just a perception thing, then. But they need to stop whining and get competitive.

Anyway, personally, AMD products just don't excite me right now. They come off as the stuff you get when you just want bang for your buck. It's up to AMD to market their stuff right and get people excited about it. Stop blaming the competition.

You are aware that AMD was sabotaged by Intel to the tune of billions, dramatically undermining its earnings?


#46  Edited By HalcyonScarlet
Member since 2011 • 13659 Posts

@gamer_durandal said:

@HalcyonScarlet: Yeah, it's advertised as a 3.7 GHz chip with 4.0 available through Turbo Boost. Not only would Turbo Boost cause the system to just hang for no reason, I had to clock down to 3.5 GHz to get the system to remain stable. The graphics power on it is impressive for a 'budget build' machine; it's just a shame I can't get it to live up to its higher potential, since that would really future-proof the system.

Surely AMD is aware of this issue, right?


#47 deactivated-583e460ca986b
Member since 2004 • 7240 Posts

This sounds like something someone in 2nd place would say. AMD needs to fix their drivers and make a CPU that's worth a damn.


#48  Edited By 04dcarraher
Member since 2004 • 23829 Posts
@Pedro said:

Referencing Windows 95 is truly grasping at straws. A10s are APUs, and anyone expecting stellar performance from those is being unrealistic, whether it's Nvidia, AMD, or Intel. So yes, you are parroting. I have extensive experience with both Nvidia and AMD, and this claim that AMD drivers are terrible is nothing but unfounded trash talk. Out of all of the machines that I have constructed and supported (hundreds that I've worked on, not a handful), Nvidia has been the only one that has had driver issues. That's not to say either company is free of driver issues; they are more or less on the same level, with Nvidia being the only one that has actually failed on me because of drivers.

AMD has stepped up its game since late 2012. But AMD's general DX11 optimizations and overhead are awful compared to Nvidia's. They have to specifically optimize around its shortcomings every time a popular game comes out. A recent example is Project Cars: running the game in Win 8 versus Win 10 shows massive gains.


#49 Gamer_Durandal
Member since 2014 • 129 Posts

@Pedro: Windows 95, Windows 98, Windows XP (though the latter was a system a friend built, not me). I've even seen some of the newer AMD notebooks have driver issues. I owned a Sony notebook that had an ATI Mobility Radeon chip in it with driver issues and performance problems. So you're going to nit-pick my argument because I didn't lay down an extensive listing of the systems I've seen problems with? Oh, come on...

I'm not expecting stellar performance out of an APU, but *ADVERTISED* performance is my baseline of comparison. When it won't even operate as designed, that's a sign to me that there is a problem. Either it's a serious hardware deficiency or it's a supporting-software problem, and I am placing my bets on the supporting software. Why? It's where a TON of companies fsck things up. They make FANTASTIC hardware, but forget to back it up with software to make it all work.

You keep glossing over all the praise I've had for AMD/ATI in your blind defense of an entire company, both past and present. I mean, nVidia has had some stinkers too, if I remember right; it's just the nature of the business. So... believe me or don't believe me. I don't care; it won't change what I've seen and experienced, no matter how loudly you want to call dissenting opinions 'trash talk'.


#50  Edited By Ten_Pints
Member since 2014 • 4072 Posts

Which is one of the reasons I have never bought an Nvidia card in my life: they are overpriced, and they do stupid shit that fragments PC gaming, which is a loss for all gamers.