Why (How?) Does Computer Processing Power Increase Every Year?



#1 Srbanator
Member since 2006 • 790 Posts

I just spent a few moments wondering about how far graphical rendering has come in the past, say, 30 years. We've gone from 2D pixels to 3D polygons, and now we're making games that look ridiculously detailed. I wondered then: where will we be 30 years from now? Will we be able to render down to body tissues? Cells? Molecules?

It just got me thinking: what is it really that makes computers so much stronger every year? It's not as if we only have breakthroughs every once in a while. No, computer technology improves each and every year, exponentially and consistently. What is it that makes a processor "better"? How is there seemingly no ceiling on the level of improvement we achieve every year? It's not an engine, where we can physically find ways to push air through it better. How is an object that operates digitally strengthened so frequently and so rapidly?


#2 YoshiYogurt
Member since 2010 • 6008 Posts
They keep figuring out ways to fit more tiny transistors on a chip as time passes. We are about to hit a wall with Moore's law, but they say that's still a couple of years or so away.

#3 Jetset314
Member since 2011 • 234 Posts

I would suggest looking into Moore's Law. It gives some good information on the how. However, knowing what technology will look like 30 years from now is impossible. In most cases we know (vaguely) what the next decade holds, and even that can change drastically given the right innovation.

I just spent a few moments wondering about how far graphical rendering has come in the past, say, 30 years. We've gone from 2D pixels to 3D polygons, and now we're making games that look ridiculously detailed. I wondered then: where will we be 30 years from now? Will we be able to render down to body tissues? Cells? Molecules?

It just got me thinking: what is it really that makes computers so much stronger every year? It's not as if we only have breakthroughs every once in a while. No, computer technology improves each and every year, exponentially and consistently. What is it that makes a processor "better"? How is there seemingly no ceiling on the level of improvement we achieve every year? It's not an engine, where we can physically find ways to push air through it better. How is an object that operates digitally strengthened so frequently and so rapidly?

Srbanator


#4 GummiRaccoon
Member since 2003 • 13799 Posts

The transistor is what makes a computer do all of its work: the more transistors you have, the faster the work gets done. Roughly every 18 months, transistors shrink to the point that their count just about doubles. This has been going on for a few decades.

A few decades of transistor count doubling every 18 months = moar powa
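
To put rough numbers on that compounding (a quick back-of-the-envelope sketch, not data from any real chip):

```python
# Rough Moore's-law arithmetic: transistor counts doubling every 18 months.
# The time spans below are illustrative; this is napkin math, not chip data.

DOUBLING_PERIOD_YEARS = 1.5

def growth_factor(years: float) -> float:
    """How many times the transistor count multiplies over `years`."""
    return 2 ** (years / DOUBLING_PERIOD_YEARS)

for years in (10, 20, 30):
    print(f"after {years} years: ~{growth_factor(years):,.0f}x the transistors")

# after 10 years: ~102x the transistors
# after 20 years: ~10,321x the transistors
# after 30 years: ~1,048,576x the transistors
```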


#5 JigglyWiggly_
Member since 2009 • 24625 Posts
and you can always add more cores
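
Though there's a catch worth noting: extra cores only speed up the part of the work that can run in parallel. Here's a quick Amdahl's-law sketch, assuming a hypothetical 90%-parallel workload:

```python
# Amdahl's law: the speedup from N cores when a fraction p of the work
# can run in parallel. p = 0.9 here is a made-up illustrative workload.

def amdahl_speedup(p: float, cores: int) -> float:
    """Overall speedup when the serial part (1 - p) never gets faster."""
    return 1.0 / ((1.0 - p) + p / cores)

for cores in (2, 4, 8, 64):
    print(f"{cores:>2} cores: {amdahl_speedup(0.9, cores):.2f}x faster")

# 2 cores: 1.82x,  4 cores: 3.08x,  8 cores: 4.71x,  64 cores: 8.77x
```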

#6 Srbanator
Member since 2006 • 790 Posts
Those were excellent replies, thank you. I've heard of computer power doubling every 18 months, but I'd never known what physical quantity was actually being doubled.

#7 kraken2109
Member since 2009 • 13271 Posts

Just you wait for quantum computing.


#8 ionusX
Member since 2009 • 25777 Posts

It's still increasing, but as far as gaming is concerned the CPU's share of the work is nearly declining. As resolution goes up, the GPU takes over, and as a result Intel integrated graphics are probably screwed: the GPU workload is much higher at these resolutions and Intel isn't evolving fast enough.

On the CPU side of things, as stated, your CPU matters very little once we start talking resolutions over 1200p. When 1600p is the norm, for example, you could technically game on a Core 2 Extreme and get performance nearly always identical to, say, a 4770K (Haswell's flagship). The only cases where this wouldn't happen are when your fps would be so bloody high anyway that it wouldn't matter (zomg, 120fps as opposed to 160... end of galaxy!).


#9 wis3boi
Member since 2005 • 32507 Posts

Those were excellent replies, thank you. I've heard of computer power doubling every 18 months, but I'd never known what physical quantity was actually being doubled. Srbanator

It will hit a wall in about 10 years, when silicon reaches its physical limit. You can't make a feature much less than about five atoms across before it stops working (at that scale electrons tunnel right through it, and the heat can't be managed). At that point a new material and/or method will have to come into play. While that's being figured out, we'll continue to tweak what we already have (like Intel going with 3D transistors instead of flat ones). Quantum computing is very far off for now... the record calculation done on a quantum computer is factoring 15 into 3 x 5.
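
A back-of-the-envelope way to sanity-check that "10 years" figure (every number here is an assumption for the sake of the estimate: a 22 nm starting node, a ~1 nm / five-atom floor, one full shrink every two years):

```python
import math

# Count the halvings of feature size left before silicon features reach
# an atomic-scale floor. All values are assumptions for this estimate,
# not an industry roadmap.

CURRENT_FEATURE_NM = 22.0  # assumed current process node
FLOOR_NM = 1.0             # ~5 silicon atoms at roughly 0.2 nm each
YEARS_PER_SHRINK = 2.0     # assumed cadence of one full halving

halvings_left = math.log2(CURRENT_FEATURE_NM / FLOOR_NM)
years_left = halvings_left * YEARS_PER_SHRINK
print(f"~{halvings_left:.1f} halvings left, ~{years_left:.0f} years")
# ~4.5 halvings left, ~9 years
```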


#11 JohnF111
Member since 2010 • 14190 Posts
In 30 years games will make themselves. We'll have one massive game, and within it the game will make more mini-games for us to play, but they won't be small; they'll be full-sized, fully featured games, since technology will have moved on so much.

#12 XaosII
Member since 2003 • 16705 Posts

Those were excellent replies, thank you. I've heard of computer power doubling every 18 months, but I'd never known what physical quantity was actually being doubled. Srbanator

This is all going to be an oversimplification: the most typical physical advance tends to be making thinner "lasers," i.e., finer and finer etching.

The most basic unit of a CPU is the transistor. A transistor is an electrical component that won't conduct electricity until the voltage across it reaches a certain threshold, and then it lets electricity through with almost no resistance. It's a 0 when there's little or no charge, and a 1 when there's a small charge. A CPU is a thin layer of semiconductor that is cut up into tiny rectangles, each forming a transistor. This is what a CPU looks like magnified to the nanometer scale:

[Image: an Intel 32nm transistor under magnification]
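
To make that 0/1 behavior concrete, here's a toy threshold-switch model (the 0.7 V threshold is made up for illustration; real transistors are analog and leaky):

```python
# Idealized transistor-as-switch: below the threshold voltage the output
# reads 0; at or above it, 1. The 0.7 V threshold is purely illustrative,
# not a real device parameter.

THRESHOLD_V = 0.7

def transistor_bit(gate_voltage: float) -> int:
    """The logical bit this idealized transistor represents."""
    return 1 if gate_voltage >= THRESHOLD_V else 0

for v in (0.0, 0.3, 0.7, 1.0):
    print(f"{v:.1f} V -> {transistor_bit(v)}")
# 0.0 V -> 0, 0.3 V -> 0, 0.7 V -> 1, 1.0 V -> 1
```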

They only need to make each transistor small enough to be an independent, noncontiguous section. The easiest way to accomplish this is to have lasers that cut and etch the processor with thinner features, leaving less of a gap, or "wasted space," between transistors.

While the picture I'm about to show you isn't the best example, it demonstrates the effect of a thinner laser:

[Image: two versions of Apple's A5 chip side by side, the original next to the die-shrunk version from the Apple TV]

The left was manufactured on a 45-nanometer process; its cuts must leave gaps of at least 45 nanometers. The right is the same design on a 32-nanometer process. There's a significant savings in area just by translating an old design onto a smaller laser.
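
The arithmetic behind that savings (a sketch; density scales roughly with the inverse square of the feature size):

```python
# Same design translated to a smaller process: linear dimensions shrink,
# so area shrinks with the square. Node sizes from the A5 comparison above.

def relative_area(old_nm: float, new_nm: float) -> float:
    """Fraction of the original die area the shrunk design occupies."""
    return (new_nm / old_nm) ** 2

old_nm, new_nm = 45.0, 32.0
area = relative_area(old_nm, new_nm)
print(f"{new_nm:g} nm version: {area:.0%} of the area, {1/area:.1f}x density")
# 32 nm version: 51% of the area, 2.0x density
```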

Every transistor generates heat, and the bigger the transistor, the more heat it gives off as its entire surface heats up. A CPU designed on a smaller process, if nothing else is changed, tends to run much cooler at the same load. That also means that when you move to a thinner process, you can devote more area to processing and still run at the same temperature. In other words: the same physical size as before, but a lot more transistors of processing power.
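
A sketch of that trade-off, under the old rule of thumb that a transistor's power draw falls roughly with its area (all the numbers are illustrative, not measurements):

```python
# Shrink features by `scale` (e.g. 0.7x): each transistor draws ~scale^2
# the power, and ~1/scale^2 as many fit in the same die area.
# The 1-billion-transistor, 100 W starting chip is a made-up example.

def shrink_tradeoff(scale: float, transistors: float, watts: float):
    per_transistor = (watts / transistors) * scale ** 2
    cooler_same_chip = per_transistor * transistors   # same design, less heat
    denser_same_area = transistors / scale ** 2       # same area and heat
    return cooler_same_chip, denser_same_area

watts_after, count_after = shrink_tradeoff(0.7, transistors=1e9, watts=100.0)
print(f"same design after the shrink: ~{watts_after:.0f} W (was 100 W)")
print(f"or the same 100 W die with ~{count_after:.1e} transistors")
# same design after the shrink: ~49 W (was 100 W)
# or the same 100 W die with ~2.0e+09 transistors
```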

Modern-day GPUs are made up of *billions* of transistors while still taking up around the same space as GPUs made 10 years ago with only tens of millions.

Most processor advances are related to developing thinner lasers and making the manufacturing process more reliable with them. There are relatively few breakthroughs in terms of radical processor designs themselves.

The biggest problem facing CPUs today, in terms of increasing their performance, is how to handle the heat. We are nearly reaching the limits of silicon's thermal envelope. We may have to move to a new material, but otherwise the theory of simply cutting with thinner lasers stays the same.

Again, most of this is an oversimplification of the process, but it should give you a basic idea.


#13 JohnF111
Member since 2010 • 14190 Posts
How is there seemingly no ceiling on the level of improvement we achieve every year? Srbanator
There is, and we're reaching it already... (Link). There always was a ceiling; Moore just generalized the idea of transistor density doubling every 18 months, but I highly doubt he thought the trend would last forever. He was an Intel engineer after all and would know the limits of physics.