Is 10.1 really going to make a difference? Think of it this way: no game supported DX9.1 or DX8.1; developers went straight to 10.0 and 9.0. So my question is, why does ATI make DX10.1 when it won't make a difference and no game is going to support it?bcroger2
I think it's better that a company supports standards, whether they end up being used or not. Better to leave the door open to developers and the consumer. There are a couple of games that use DirectX 10.1, and one of its main benefits is that the number of required framebuffers is reduced, which leaves you more memory to crank up the resolution or apply AA.
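To put the framebuffer point in rough perspective, here is a back-of-the-envelope sketch. The numbers and function name are purely illustrative, not from any DirectX specification:

```python
# Rough VRAM arithmetic, purely illustrative: a render target's size is
# width * height * bytes_per_pixel, and some effects need several of them.
def render_target_bytes(width, height, bytes_per_pixel=4, targets=1):
    return width * height * bytes_per_pixel * targets

# One fewer intermediate buffer at 1920x1200 with 32-bit color frees
# roughly 8.8 MB of VRAM that could go toward resolution or AA instead.
saved = render_target_bytes(1920, 1200)
print(saved / (1024 * 1024))  # ~8.79 (MB)
```

The exact savings depend on the effect and resolution, but the principle is the same: fewer intermediate buffers means more VRAM left over for everything else.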
Maybe we should make more games with DX10, and get them running a little better, first. DX11 will probably become a standard before 10.1 really starts to get used.Kurushio
Hence I will pull my hair out if DX11 graphics cards come out less than two years from now (if they're needed for new games).
DX10.1 looks like what DX10 should have been. DX10 sucked imo; it was way too taxing even on good hardware for pretty much no boost in graphics. I hope DX11 is different.V4LENT1NE
I agree. DX9 was the best version of DirectX yet.
Can you please tell me the name and approximate date of the first DX11 card coming out this summer? I want to buy a GTX 295 / i7 920 computer NOW, but should I wait until this card comes out? If it's around June that's OK, but I really don't want to wait until September! Also, will this card be as powerful as, or more powerful than, the GTX 295?
If we don't have a clue, what can we possibly tell you? And besides, why would you want a first edition of a card with new technology? You'll be paying through the nose for that privilege, and with the exception of the 8800GTX, you won't see much of a return on your money once games that actually USE the new API version start showing up on the market.
ket222
I don't know where you guys are getting your information from, but DX10.1 has no graphical improvements over DX10. At all.
The only thing it does is make certain optional hardware features a requirement. For a card to be DX10.1 compliant, it MUST be able to do 4x AA, 32-bit floating-point texture filtering, and a few other things. High-end cards already had these features. The idea was to force low-end GPUs that claim DX10 compliance to meet a higher level of performance (such as the capability of doing 4x AA).
DX10.1 does nothing else. There are no new software features or graphical improvements, and it was already redundant for high-end video cards.
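A minimal sketch of that compliance idea, with hypothetical names (this is not a real DirectX API, just the logic described above: DX10.1 promotes formerly optional DX10 features to hard requirements):

```python
from dataclasses import dataclass

# Hypothetical capability record, not a real DirectX structure.
@dataclass
class GpuCaps:
    max_msaa: int          # highest multisample (AA) level the card can do
    fp32_filtering: bool   # supports 32-bit floating-point texture filtering

def meets_dx10_1_minimum(caps: GpuCaps) -> bool:
    # DX10.1 makes these formerly optional DX10 features mandatory.
    return caps.max_msaa >= 4 and caps.fp32_filtering

high_end = GpuCaps(max_msaa=8, fp32_filtering=True)   # already had both
low_end = GpuCaps(max_msaa=2, fp32_filtering=False)   # DX10-only budget part
print(meets_dx10_1_minimum(high_end))  # True
print(meets_dx10_1_minimum(low_end))   # False
```

The point of the sketch: a high-end DX10 card passes the 10.1 bar without any hardware change, which is exactly why the version bump meant so little at the top end.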
[QUOTE="XaosII"][QUOTE="clyde46"]
So does that mean that ATI cards with DX10.1 run games slightly better than cards with DX10? I don't mean graphically, but performance-wise.
clyde46
Nope.
So why are ATI pushing 10.1, then?Because they are 10.1 compliant.
All 10.1 did was raise the standards regarding the features a video card must support. These features were already there on high-end and most mid-range cards.
A low-end 10.1 ATI card is guaranteed to support 4x AA, because it's part of the minimum specification for DX10.1 compliance. A low-end DX10 nVidia card does NOT have to do 4x AA, because DX10 doesn't require it.
It's really more marketing than anything for ATI. It's ATI saying "oh, look at us! Our cards are more advanced than nVidia's! Buy ours! It's 0.1 better!" and nVidia saying "whatever. Our cards already supported the 10.1 features. We're not going to bother updating our marketing for something so minor, because we're lazy."
DX10.1 is a significant step for integrated GPUs and extremely low-end video cards, because it raises the performance standard for them quite a bit. It's meaningless for mid-range and high-end cards, because they already have these features.
This is why developers don't code for DX10.1: they already were.
Aside from that, DX10.1 supports tessellation and reduces the required framebuffers for certain effects. Also, there is no DX9.1. The highest DX9 is DX9.0c, which is effectively a "DX9.1" because of its new parallax-mapping support, you know, crazy brick walls popping out.magicalclick
Yeah, I was going to say DirectX 9.0c. I know a bunch of games were made with that, because that is when my old graphics card wasn't able to play them anymore.
This has to be the most well-put, understandable read I've come across in a long time. I didn't know much of that surrounding DX10.1, but now I do. (I kinda figured 10.1 was nothing special.)