@techhog89: Thanks, I appreciate the informative knowledge. I had bought the first Witcher game for my birthday. Using the little knowledge I picked up from one semester of a 9th grade computer class and some how-to videos on YouTube, I replaced my prebuilt PC's GeForce 7300 LE with a BFG GeForce 8600 GT OC 512MB and a new PSU. I had a 720p 19-inch Magnavox television, and at the time I hadn't played a video game since the PC version of BloodRayne 2, which came with a DVD copy of the film BloodRayne; before that, the last games I played were Killer Instinct and Flying Dragon on the Nintendo 64 my parents bought me for Christmas in the late '90s. The Witcher was very buggy and repeatedly crashed, so I gave up on it about 3/4 to 4/5 of the way through the game.
truebond's forum posts
@techhog89: Thanks! I did not know that about developer kits. I thought extra RAM would have helped raise the fps and resolution as long as the level of graphical detail stayed the same as it is now. I assumed that because the same graphics card can sell with different amounts of memory; for example, the Radeon RX 570 comes in 4GB, 8GB, and 16GB versions. I wonder why companies sell higher-memory versions of the same card if it doesn't help with fps and resolution? Is the memory meant for something else? Sorry if I'm asking too many questions. I don't know much about the technical aspects of gaming, for I'm just a once-in-a-blue-moon casual gamer.
I heard that the chip (CPU/GPU) Nintendo chose was the best option Nintendo had to achieve efficient battery life for a viable handheld console while still being a viable home console. Was the Nintendo Switch sold at launch for a profit or a loss, or did they break even? If it sold for a profit, do you think Nintendo should have added 8 GB of main memory instead of 4 GB? Would that have made porting games easier and allowed it to sustain 1080p docked and 720p mobile at 60fps, since Nintendo couldn't get a better cost-effective, low-power-consuming chip (CPU/GPU) at that time while maintaining efficient battery life for viable mobile gaming? I know that PC games have many different settings at different resolutions, and that the gap from low-end to high-end PC hardware running games on low settings to high settings is much bigger than the gap from the Nintendo Switch (low settings) to the Xbox One and Sony PlayStation 4 (high settings), so why isn't the third-party support better when The Witcher 3 will be on the Nintendo Switch?
I suspect Cafe will be powered by a GPU comparable to the HD 6670 and have 1GB of DDR3 RAM, with a CPU similar to the Xbox 360's but more powerful. If it is less powerful than this, then I will not be interested in it. I will just game on my entry-level PC and wait to see if Microsoft or Sony has what I want.
I think Nintendo will put 1GB of DDR3 RAM in Project Cafe, and Sony and Microsoft will put 2GB of DDR3 or GDDR5 RAM in their next-generation consoles. I'm more concerned with the amount of RAM than with the CPU and GPU, for it is confining PC games. PC developers are designing games with the consoles' limited amount of RAM in mind. The current average RAM requirement for PC games is 2GB; if consoles had had half this amount, we would not have received compromised games such as Crysis 2, which has a less open-world environment. Just like the Gears of War developers pushed Microsoft to add an extra 256MB of RAM to the Xbox 360, PC developers should do the same for the next generation of consoles.
System RAM: 2GB (I will settle for 1GB). The console should cost no more than $300. No 3D; 3D would deplete the system resources (frame rate drops, decreased graphics capability, physics slowdowns that affect the AI) and bring the total system cost up. It should have graphics comparable to mid-range PC graphics cards.
I hope they decide to bring it over to PC, so I and many others without consoles can enjoy it.
No; gameplay and story should be equally great.
Graphics are important, but they do not trump story or gameplay.
I can enjoy playing a game with a good story and gameplay that has only average graphics.