webedoomed's forum posts

#1  Edited By webedoomed
Member since 2013 • 25 Posts

I think you mean the R9 290X. You're better off saving $150 with an aftermarket R9 290 and overclocking it ~50MHz, since the clock speed is the main difference between the two.
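
A back-of-envelope way to see the value argument, in a quick Python sketch. The shader counts, reference clocks, and launch prices are quoted from memory, so treat every number as approximate:

```python
# Rough perf-per-dollar for the R9 290 vs. 290X argument.
# Specs and launch prices are from memory; board-partner cards vary.
def tflops(shaders, clock_mhz):
    return shaders * clock_mhz * 2 / 1e6  # 2 FLOPs per shader per clock (FMA)

cards = {
    # name: (shaders, core clock MHz, launch price $)
    "R9 290":      (2560,  947, 399),
    "R9 290 (OC)": (2560, 1000, 399),
    "R9 290X":     (2816, 1000, 549),
}
for name, (sp, mhz, usd) in cards.items():
    t = tflops(sp, mhz)
    print(f"{name}: {t:.2f} TFLOPS, {t / usd * 1000:.1f} GFLOPS per dollar")
```

By that math the overclocked 290 comes out well ahead on throughput per dollar, even if the 290X keeps a small absolute lead from its extra shaders.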

There's still a slight chance I'll splurge for it.

#2 webedoomed

Is there actually any good reason someone can give me for having a 120Hz monitor or going higher than 1080p? Are there any games that feel cramped at a mere 1920x1080? Is it even possible for the best eyes out there to process more than 45-60 frames per second?

I was reading an article saying that most people can't even distinguish between 720p and 1080p unless they're really close up. I don't plan on ever sitting hunched over a computer desk again; the monitor is about 7 ft away. Can someone give me a reason why I would want or need to go any higher?
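
For what it's worth, here's the usual back-of-envelope acuity math, assuming 20/20 vision resolves roughly 60 pixels per degree and a 42" 16:9 panel at 7 ft (both assumptions, not measurements):

```python
import math

# Pixels per degree of visual angle for a 16:9 panel, given its diagonal
# and viewing distance in inches. ~60 px/deg is the usual 20/20 benchmark;
# beyond that, extra pixels are hard to resolve.
def pixels_per_degree(diag_in, h_pixels, distance_in, aspect=16 / 9):
    width = diag_in * aspect / math.hypot(aspect, 1)  # panel width from diagonal
    h_angle = 2 * math.degrees(math.atan(width / (2 * distance_in)))
    return h_pixels / h_angle

for name, hp in [("720p", 1280), ("1080p", 1920), ("4K", 3840)]:
    print(f'{name}: {pixels_per_degree(42, hp, 84):.0f} px/deg')  # 42" at 7 ft
```

By that sketch, 1080p at 7 ft already sits above the ~60 px/deg threshold (~78) while 720p lands just under it (~52), which lines up with the article you read.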

I'll eventually jump to 4K when monitors drop under $500 (probably 24-36 months away), but I don't see the point until then. I doubt any current game will see real improvement from it. And if developers cater to 4K, they lose 99% of their market, because everything will be too cramped at lower resolutions. Is this reasoning correct?

#3 webedoomed

That gap will disappear when Mantle is released. The question is, how many engines will use it? I'm hearing AMD is probably the better bet, considering the next-gen consoles are built on their architecture. The 270X should slightly outperform the PS4's GPU. I think it's a safe bet to keep me going for the next few years.
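
A quick sanity check on that claim, using the usual GCN peak-throughput formula (shaders × clock × 2 FLOPs per cycle); the shader counts and clocks are from memory, so approximate:

```python
# Peak single-precision throughput for GCN parts: shaders x clock x 2 (FMA).
# Shader counts and clocks quoted from memory; treat as approximate.
def tflops(shaders, clock_mhz):
    return shaders * clock_mhz * 2 / 1e6

print(f"R9 270X: {tflops(1280, 1050):.2f} TFLOPS")  # ~2.69
print(f"PS4 GPU: {tflops(1152, 800):.2f} TFLOPS")   # ~1.84
```

On paper that's closer to a 45% gap than a slight one, though console optimization narrows it in practice.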

#4 webedoomed

Appreciate the comments. I have a 42" 1080p LCD in the front room, and that's what I'll use. I see no reason to game above 60fps or above that resolution anytime soon. Three years down the line I'll buy a 4K set and upgrade the hardware.

I'm going with a 2600K from Fry's for $145. Still debating how much I want to spend on the video card; I'll probably stick with the 270X. It's a beefed-up 7870, and that seems like more than enough. I should be able to build it out for $700 and be satisfied.
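
Roughly how the $700 pencils out; only the CPU and GPU prices come from this thread, the rest are placeholder estimates:

```python
# Hypothetical $700 build budget. CPU and GPU prices are from the thread;
# everything else is a placeholder estimate, not a quote.
parts = {
    "i7-2600K (Fry's)": 145,
    "R9 270X":          199,
    "motherboard":       90,
    "8GB RAM":           70,
    "1TB HDD":           60,
    "power supply":      55,
    "case":              50,
}
print(f"subtotal: ${sum(parts.values())}")  # $669, under the $700 target
```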

#5  Edited By webedoomed

I'm a complete noob to gaming.

My question is whether it's really worth it to buy such expensive video cards when most of the tech refreshes every 6-18 months, depending on the component. Couldn't I spend $199 on an R9 270X and play 99% of the titles that have ever existed at 1080p on high to ultra settings? Isn't it just a dozen or fewer titles that wouldn't run well on that card?

From what I'm gathering, there's practical gaming with a focus on actual gameplay, and then there are the uber gamers who value the eye candy. Is there really that much difference in the feel of a game going from medium to high, or high to ultra, settings and textures?

I figure I can put a decent rig together for $700 that will play nearly anything on high/ultra, or splurge an extra $400 and play the rest maxed out. I'm not sure that's worth it. The newest games rarely seem to get large discounts, whereas older games are heavily discounted. Doesn't it make more sense to stay a generation behind and play all you like on high settings, on the cheap?
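
A toy version of that math, just to make the trade-off concrete (the game counts and prices are hypothetical):

```python
# Toy cost model: "stay a gen behind" vs. "play new, maxed out".
# Game counts and prices are hypothetical, purely for illustration.
games_per_year = 12
plans = {
    "$700 rig + older discounted games": (700, 10),
    "$1100 rig + new full-price games":  (1100, 50),
}
for plan, (rig_cost, price_per_game) in plans.items():
    yearly = games_per_year * price_per_game
    print(f"{plan}: ${rig_cost} up front, ${yearly}/yr on games")
```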

Serious question. I know it sounds rather noobish; I'm tech savvy, but I've been out of gaming for about 15 years now.