omnimodis78's forum posts

omnimodis78

I distinctly remember reading so much about this about two years ago, probably even a bit longer, and the coverage constantly made it sound like it was months away from retail. So what's the hold-up? Are there still technical issues to iron out, or are we talking simple economics here?

omnimodis78

@Dannerfjord @Hellidol Investment? I can easily resell a roughly one-year-old mid-range card on Kijiji for about 70% of what I paid, but what sort of person willing and able to drop cash on a thing this crazy would ever buy it used? So really, there is no resale value. And how much better is his ROI, anyway? My GTX 770, for example, is limited to 60fps (v-sync) and plays every game out there on max, or close enough that I don't sweat not dropping >$1,600. Not an investment at all; it's just the sucker principle being proven.

omnimodis78

@evil13killer The jury is still out on which came first, BTW, but because AMD was closely associated with DX12's development from a very early stage, it does seem likely that they picked up some ideas there... It is looking more and more like AMD ran with an idea that was not really their own. So, nothing to thank AMD for.

If you're interested in facts: << LINK REMOVED >>

omnimodis78

I seriously doubt that most of you complaining about this could tell the difference between 720p and 1080p in a controlled test. You think you could, you say you could, but believe me, you couldn't. And in fact 720p is the minimum on the Xbox One; more games run at ~900p than at 720p, if I'm not mistaken. On the whole, though, it's unfortunate that this even has to be an issue: 1080p is the current HD standard, and compromising on it was just stupid on Microsoft's part.

omnimodis78

<< LINK REMOVED >><< LINK REMOVED >> I remember those days quite well, and I can tell you that Doom 3 did set a new standard in real-time graphics and sound design. The game was fantastic. But beyond both games being masterpieces (and the timing of their releases), there's nothing else to compare.

omnimodis78

@smartest_gamer @Joeguy00 My dear friend, do you know what the 'K' means in i7-4770K? It means Intel is begging you to overclock it. They can't officially tell you to do it, but if ten Intel engineers came to your home and saw you running your i7-4770K at stock, all ten would line up and slap the silly out of you, one by one! Need help overclocking? Ask; many of us can point you in the right direction. In fact, you can OC your processor so it runs faster on all cores (right now your CPU only hits full speed on a limited number of cores) and still have it use LESS power. Not kidding. Not kidding at all. Oh, and an overclock will actually scale with your SLI setup, so the benefits wouldn't just be synthetic.

omnimodis78

@smartest_gamer You should be able to run Crysis 3 without any difficulty, since I can do so with just one 770, half your RAM (which doesn't matter), and a step down from your CPU: the i5-3570K. By the way, what do you mean by "smooth"? Are you experiencing stuttering or micro-stuttering, or can you downright not reach 60fps?


omnimodis78

I wish there were some standard for identifying whether a game is more GPU- or CPU-intensive, and also some benchmark to indicate whether the game is just unoptimized or the system genuinely can't keep up. I can play nearly every game on this list either on max or on a "tweaked max" (e.g. ubersampling in TW2 kills frame rates even to this day, but it's completely unnecessary to enable), yet ACIV played like sh*t even with most settings on medium (until I discovered a way to make it work right...). It's frustrating because I think most performance issues these days come down to unoptimized PC releases/ports, and it's unfair that we have to pick up the slack by buying parts with more raw power.
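To make the idea concrete, here's a toy sketch of the kind of verdict a standard benchmark could report. All the numbers and the `margin` threshold are invented for illustration; real tools capture per-frame CPU and GPU times, and whichever side consistently takes longer is the bottleneck:

```python
# Hypothetical per-frame timings in milliseconds, as a capture tool might log them.
# These values are made up for the example.
cpu_ms = [6.1, 5.8, 6.4, 6.0, 6.2, 5.9]        # time the CPU spent preparing each frame
gpu_ms = [14.2, 15.0, 13.8, 14.5, 14.9, 14.1]  # time the GPU spent rendering each frame

def bottleneck(cpu_samples, gpu_samples, margin=1.2):
    """Label a capture GPU-bound or CPU-bound.

    Whichever side's average frame time exceeds the other's by the
    `margin` factor is the limiter; otherwise call it balanced.
    """
    cpu_avg = sum(cpu_samples) / len(cpu_samples)
    gpu_avg = sum(gpu_samples) / len(gpu_samples)
    if gpu_avg > cpu_avg * margin:
        return "GPU-bound"
    if cpu_avg > gpu_avg * margin:
        return "CPU-bound"
    return "balanced"

print(bottleneck(cpu_ms, gpu_ms))  # GPU-bound here: ~14.4 ms vs ~6.1 ms per frame
```

If a game came out "CPU-bound" on hardware that crushes every other title, that would be a strong hint the port is unoptimized rather than the system being too weak.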

omnimodis78

@AzureBonds76 @Sun-Tzu-GE Yes, that's why Diablo III was rated a great/perfect game by the first round of professional reviewers even though indie reviewers were scoring it somewhere between mediocre and bad. That's also why, this site included, instead of reviewing the state of the game as it was at the time, they focused on where the game could one day go. Yes, most are professional. And of course, Diablo III is just one tiny example of the exact same nonsense happening with AAA games that are actually mediocre - but not according to the professionals.

omnimodis78

Between this and The Witcher 3 (and one or two other high-profile titles I know to be open world but can't think of right now), I actually think something rare is about to happen in gaming. An open-world RPG is just so much fun when done well, and besides Skyrim, I can't think of one being done well in a while.