# Anyone up to date on CPU news?



## ADF (Jul 16, 2008)

I haven't been keeping track of the CPU market, so I'm a little out of the loop. Does anyone know what Intel/AMD have planned post-Conroe/AM2+? I've looked around and haven't found much.


----------



## Runefox (Jul 16, 2008)

Intel's roadmap for 2010 includes 8-core (Nehalem, Westmere, Sandy Bridge) and higher CPUs with 16MB+ shared L2 cache, hyperthreading (16+ virtual CPUs), and on-chip memory controllers to increase performance across the board. Also on the agenda, I believe, is a HyperTransport-like replacement for the front side bus. The CPU socket is slated to change for these chips, to LGA-1366.

I'll see if I can pull up the official roadmap I was reading not long ago.


----------



## Runefox (Jul 16, 2008)

Sorry, this is for 2008-2009. http://www.pcper.com/article.php?aid=534&type=expert&pid=2 They're shooting for 32-core by 2010.

And the cache is L3, at 8MB, not 16MB (not hard to believe that 16MB may appear in this series, though).


----------



## Pi (Jul 16, 2008)

Still doing pretty well with a 166 MHz single-core Pentium.

Programmers still haven't quite figured out what to do with dual-core machines, so why are we bothering with 8x?


----------



## ADF (Jul 16, 2008)

Thanks for that, Runefox. I'm hoping to get some information on both Intel and AMD so I know what to compare for my next CPU purchase.

The way my current dual core is lasting, there will probably be affordable 8/16-core CPUs out by the time I upgrade :lol: Just hope there's stuff to take advantage of them; quad core is hardly utilized these days.


Pi said:


> Programmers still haven't quite figured out what to do with dual-core machines, so why are we bothering with 8x?


I've had my dual core utilised quite a few times; dual cores seem mostly useful for performance gains rather than for changing how games play. I think once quad cores have a similar install base, we'll start seeing them improve gameplay.


----------



## Pi (Jul 16, 2008)

ADF said:


> I've had my dual core utilised quite a few times; dual cores seem mostly useful for performance gains rather than for changing how games play. I think once quad cores have a similar install base, we'll start seeing them improve gameplay.



Uh, I wasn't talking about games. When you're doing concurrent multiprocessing, as on an n-core system, programmers have to worry about shared state, bus contention, and memory coherency. Until we've solved that problem (i.e., by getting rid of every single program out there), we can't effectively utilize multiple cores in a way that makes sense.
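To make the shared-state point concrete, here's a minimal Python sketch (my own illustration, nothing from any roadmap): an unsynchronized read-modify-write can lose updates when threads interleave, and the fix is exactly the locking discipline programmers have to get right on multi-core machines.

```python
import threading

counter = 0
lock = threading.Lock()

def unsafe_increment(n):
    """Read-modify-write with no lock: another thread can run between
    the read and the write, so increments can be silently lost."""
    global counter
    for _ in range(n):
        tmp = counter
        counter = tmp + 1

def safe_increment(n):
    """The same loop, but the lock makes each increment atomic."""
    global counter
    for _ in range(n):
        with lock:
            counter += 1

threads = [threading.Thread(target=safe_increment, args=(10_000,))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 40000 with the lock; the unsafe version can come up short
```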


----------



## Hollud (Jul 16, 2008)

Pi said:


> Programmers still haven't quite figured out what to do with dual-core machines, so why are we bothering with 8x?



Multiple cores are useful if you're in a relevant industry that requires some heavy-duty processing power: multimedia (graphic design, video editing, music creation, etc.), engineering (CAD/CAM, building safety simulation, etc.), and medicine, amongst others, would all benefit from a multiple-core machine.

Different people have different requirements. However, as we come into an age of increasingly graphics-intensive documents, processing power would be expected to exponentially increase. A Pentium 166 MHz may have been great a decade ago, but in today's world of Web 2.0 applications, broadband Internet, and blink-of-an-eye information at your fingertips, people want the best experience and the best productivity they can get out of their hardware.





Pi said:


> Uh, I wasn't talking about games. When you're doing concurrent multiprocessing, as on an n-core system, programmers have to worry about shared state, bus contention, and memory coherency. Until we've solved that problem (i.e., by getting rid of every single program out there), we can't effectively utilize multiple cores in a way that makes sense.



Well, unfortunately, we live in a world where we can't make everybody happy. Someone has got to swallow the bitter pill.


----------



## Pi (Jul 16, 2008)

Hollud said:


> Multiple cores are useful if you're in a relevant industry that requires some heavy-duty processing power: multimedia (graphic design, video editing, music creation, etc.), engineering (CAD/CAM, building safety simulation, etc.), and medicine, amongst others, would all benefit from a multiple-core machine.


All of which an average user of the FA forums does on a day-to-day basis. Additionally, in some of your quoted cases, the problem is inherently sequential, so parallelism will do nothing.



> Different people have different requirements. However, as we come into an age of increasingly graphics-intensive documents, processing power would be expected to exponentially increase.


Flinging bits around at 10 GHz is fairly meaningless if the rest of the bus is chugging along at a tenth of that speed. Processing power hasn't exponentially increased; only clock speed has.



> A Pentium 166 MHz may have been great a decade ago, but in today's world of Web 2.0


As TeX might put it, Emergency stop.


----------



## nrr (Jul 16, 2008)

Pi said:


> Uh, I wasn't talking about games. When you're doing concurrent multiprocessing, as on an n-core system, programmers have to worry about shared state, bus contention, and memory coherency. Until we've solved that problem (i.e., by getting rid of every single program out there), we can't effectively utilize multiple cores in a way that makes sense.


Hey, want to write a massively concurrent OS that forces userspace programs to be massively concurrent themselves?


----------



## Pi (Jul 16, 2008)

nrr said:


> Hey, want to write a massively concurrent OS that forces userspace programs to be massively concurrent themselves?



Be was on the right track. :V

ps also smalltalk and erlang!


----------



## nrr (Jul 16, 2008)

Hollud said:


> Multiple cores are useful if you're in a relevant industry that requires some heavy-duty processing power: multimedia (graphic design, video editing, music creation, etc.), engineering (CAD/CAM, building safety simulation, etc.), and medicine, amongst others, would all benefit from a multiple-core machine.


... like they've all been benefiting from SMP machines for the last decade or so?



Hollud said:


> However, as we come into an age of increasingly graphics-intensive documents, processing power would be expected to exponentially increase.


It's actually increased geometrically for the most part.



Hollud said:


> A Pentium 166 MHz may have been great a decade ago, but in today's world of Web 2.0 applications, broadband Internet, and blink-of-an-eye information at your fingertips, people want the best experience and the best productivity they can get out of their hardware.


A Pentium 166 is absolutely awesome today.  The run-of-the-mill mobile handset with a J2ME implementation running on it has about as much processing power as a Pentium 100, if even that.

Imagine having a Pentium 166 in your pocket every day.  Wouldn't that be awesome?


----------



## nrr (Jul 16, 2008)

Pi said:


> Be was on the right track. :V
> 
> ps also smalltalk and erlang!


No real need for Smalltalk if you have Erlang.  You can pretty much reimplement all of Smalltalk using Erlang's message passing facilities.
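For anyone curious what "message passing instead of shared state" looks like without installing Erlang, here's a rough Python sketch of the same idea (the class and message names are my own invention): one thread owns the state, and everyone else talks to it only through a mailbox.

```python
import queue
import threading

class CounterActor(threading.Thread):
    """A toy actor: a thread that owns private state and only
    reacts to messages pulled from its mailbox, Erlang-style."""

    def __init__(self):
        super().__init__(daemon=True)
        self.mailbox = queue.Queue()

    def run(self):
        count = 0  # private state; no other thread touches it directly
        while True:
            msg, reply_to = self.mailbox.get()
            if msg == "incr":
                count += 1
            elif msg == "get":
                reply_to.put(count)  # replies also travel by message
            elif msg == "stop":
                break

actor = CounterActor()
actor.start()
actor.mailbox.put(("incr", None))
actor.mailbox.put(("incr", None))

reply = queue.Queue()
actor.mailbox.put(("get", reply))
result = reply.get()
actor.mailbox.put(("stop", None))
print(result)  # 2
```

Because the mailbox serializes everything, there's no lock in sight; that's the property both Smalltalk's message sends and Erlang's processes are built around.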


----------



## Hollud (Jul 16, 2008)

nrr said:


> Imagine having a Pentium 166 in your pocket every day.  Wouldn't that be awesome?



Well, I'm a more hands-on horse... I'll only believe the performance statistics when I like the way it handles my day-to-day tasks.

In any case, I think we're drifting a little off-topic here. Best to check the alignment of the socket pins before they get bent horrendously and the socket is rendered unusable.


----------



## Xenofur (Jul 17, 2008)

Actually Pi...

There are ways multi-core processors can be easily taxed properly for normal users already. Let me just look at what my computer is doing at any given time:
- web browser with 50+ tabs open, some of them executing self-refreshing directives
- web browser with email/rss client that constantly refreshes the state of those
- e-mail spam pre-filter
- mp3 player
- mirc + miranda + skype (work)
- bandwidth monitoring tool with graphs
- launchy monitoring my available links in the startmenu, quickstart, desktop and other places
- process explorer constantly monitoring the state of all programs and keeping a history
+ a slew of other small utilities, since this is a laptop

now when i start my coding ide, you can add the following tasks:
- constant monitoring of my project files against outside changes, so the ide keeps up-to-date with when i move shit around in total commander
- constant monitoring of the svn state of each file
- syntax-checking of each file as soon as i haven't typed for a second

As you can see, normal users' computers can easily be doing a lot of shit concurrently even with normal day-to-day crap. On a single-core system all this would be rather painful, since, for example, the syntax-checking can get VERY CPU-intensive and thus slow down the IDE.

Put the whole stack on a multi-core system and suddenly everything runs completely smoothly, since the syntax-checking grabs one of the cores and leaves the others free for the other programs.

Also, i like to play stuff on a PS2 emulator, and without a multi-core CPU that stuff runs like total shit.


----------



## nrr (Jul 17, 2008)

I think what Pi is getting at is mostly that application developers haven't boned up on ways to make individual processes concurrent across multiple threads, which would let the OS's process scheduler work its magic and shift things across cores as appropriate.  What Xenofur has described here is just a bunch of disjoint processes, which will, of course, see automatic improvement because the scheduler will shove individual processes around as it sees fit anyway.  This is old news.
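The thread-level version of this can be sketched in Python (illustrative only; for CPU-bound pure-Python work the interpreter's GIL blunts the benefit, but the structure is the same for C-extension or I/O-heavy code): split one process's work into chunks so the scheduler has multiple threads it can place on cores.

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    # Each worker thread handles an independent slice of the data,
    # so there is no shared mutable state to coordinate.
    return sum(chunk)

data = list(range(40_000))
chunks = [data[i::4] for i in range(4)]  # four interleaved slices

with ThreadPoolExecutor(max_workers=4) as pool:
    partials = list(pool.map(partial_sum, chunks))

total = sum(partials)
print(total)  # same answer as a plain single-threaded sum(data)
```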



Xenofur said:


> As you see, computers of normal users can easily do a lot of shit concurrently even with normal day-to-day crap. On a single-core system all this would be rather painful, since, for example, the syntax-checking can get VERY CPU-intensive and thus slow down the IDE.


I don't know if this is the case for you, but syntax checking in Emacs for me is implemented by actually compiling the file in the buffer and parsing the compiler errors appropriately.  That's probably where it becomes reasonably CPU-intensive.


----------



## virus (Jul 17, 2008)

You only really need around 500 MHz bus speed for a really fast computer. The problem is, none of the components talk to each other at the same rate; some are MUCH faster or slower than others. This is a huge flaw of the PC. Instead, they just bump up processing power to try to make up for it.


----------



## ADF (Jul 17, 2008)

Right... perhaps people can take a breather long enough to tell me the name of AMD's upcoming chip? If I knew what it was called, I might get better results in Google.


----------



## Hollud (Jul 18, 2008)

ADF said:


> Right... perhaps people can take a breather long enough to tell me the name of AMD's upcoming chip? If I knew what it was called, I might get better results in Google.



*Puma* is AMD's next notebook platform. It's a combination of a new processor (code name *Griffin*, otherwise known as the *AMD Turion X2 Ultra*), an AMD M780G mobile chipset, and an 802.11a/b/g/n WiFi chip.

Think big cats and mythological animals. Mrrrrrowwwl... 



EDIT: I'll go one up and give you the direct link on their site. Click!


----------



## dietrc70 (Jul 18, 2008)

ADF said:


> Thanks for that, Runefox. I'm hoping to get some information on both Intel and AMD so I know what to compare for my next CPU purchase.
> 
> The way my current dual core is lasting, there will probably be affordable 8/16-core CPUs out by the time I upgrade :lol: Just hope there's stuff to take advantage of them; quad core is hardly utilized these days.
> 
> I've had my dual core utilised quite a few times; dual cores seem mostly useful for performance gains rather than for changing how games play. I think once quad cores have a similar install base, we'll start seeing them improve gameplay.



I recommend anandtech.com if you want to keep up on PC tech news.  I also think Intel is going to be the best choice in the near future.  AMD has had a lot of problems, and even if their new CPUs are good, it would still be a good idea to wait until any issues with their chipsets are ironed out.  It's very hard to beat a 45nm Intel Core 2 Duo with a P35 or X38 chipset for proven reliability and performance.

Dual core processors are terrific, but I wouldn't get a quad.  You're better off spending the money on a faster dual-core with a larger cache and faster FSB, and RAM with good latency specs.

An awful lot of processor intensive software will only use one core anyway.  If all the cores are used, they can easily wind up fighting over the limited cache and memory bandwidth.


----------



## Runefox (Jul 18, 2008)

virus said:


> You only really need around 500 MHz bus speed for a really fast computer.


Considering most CPUs have an FSB of 200MHz, 266MHz, or 333MHz, I guess we're a fair way off. Remember that all these "800MHz, 1066MHz, and 1333MHz" FSBs are in reality clocked four times lower: the FSB transfers data four times per clock cycle, and for marketing purposes, 1333MHz sounds tons better than 333MHz x4.

The same goes for DDR RAM speeds. Old DDR RAM is rated at twice its real frequency, and DDR2 RAM at four times its real frequency.
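The arithmetic is simple enough to check; a quick sketch (illustrative, with base clocks rounded to whole MHz):

```python
def effective_rate(base_mhz, transfers_per_clock):
    """Marketing rate = real base clock x data transfers per clock."""
    return base_mhz * transfers_per_clock

# Quad-pumped FSB: a real ~333 MHz clock gets sold as "1333 MHz"
assert effective_rate(333, 4) == 1332   # ~1333 once the .33 MHz is included
# DDR doubles its real clock; DDR2 is rated at 4x its internal clock
assert effective_rate(200, 2) == 400    # DDR-400
assert effective_rate(200, 4) == 800    # DDR2-800
print("rates check out")
```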



> An awful lot of processor intensive software will only use one core anyway. If all the cores are used, they can easily wind up fighting over the limited cache and memory bandwidth.


Quad cores currently have this limitation, but the new Intel Nehalem architecture coming out within the next year should mitigate the cache issue by having a large L3 cache shared directly across all cores. I can personally find a lot of use for a quad-core CPU, and I'm really quite excited about them becoming more mainstream. Remember that the same things were said about dual-core CPUs when they were first introduced; more software is being designed to take advantage of parallelism, even software that is normally single-threaded.


----------



## nrr (Jul 18, 2008)

Runefox said:


> More software is being designed to take advantage of parallelism, even software that is normally single-threaded.


Right, and a lot of developers are kindly taking the hint to do everything necessary to make their applications parallelizable, even if that means using some of the more expensive, but simpler, concurrency models (separate processes and IPC) to achieve that.

Mathematica is a reasonable example of something kind of like this, but Wolfram did it more in the interest of modularization than anything else.  The GUI frontend is one process, but the actual CAS kernel itself is yet another process.  You can spread certain operations over other kernel instances (you can start up more than just one) in order to make things a little more parallel computationally.


----------



## Aden (Jul 18, 2008)

Pi said:


> All of which an average user of the FA forums does on a day to day basis.



*raises hand*


----------

