# Microsoft, this has got to be the stupidest...



## ADF (Nov 29, 2008)

Microsoft seems to have forgotten why we moved from software to hardware graphics: they are adding software DX10 rendering to Windows 7.

Link

Surely a system with a cheap/integrated GPU thrown in for Aero Glass would be significantly cheaper than a CPU capable of running it? Vista likes a dual core just for running the OS day to day, and likely Windows 7 will too; they want to add 3D processing on top of that?

Seeing how the hardware costs of running Vista, as opposed to XP, are deterring some consumers from adoption, they think moving load from the GPU to the CPU is going to help?


----------



## mrredfox (Nov 29, 2008)

Basically Microsoft are crap. I'll stick with my XP and Mac; I'm not gonna upgrade just for the sake of making a game's graphics a tad better... a game isn't all about the graphics.. it's about the gameplay!


----------



## net-cat (Nov 29, 2008)

Oh, god no.

Heaven forbid people should have _choices_ and have to consider _trade-offs_.

(e.g. a single quad-core CPU takes up less space on a laptop motherboard than a dual-core CPU plus a dedicated GPU chip.)


----------



## ADF (Nov 29, 2008)

net-cat said:


> Oh, god no.
> 
> Heaven forbid people should have _choices_ and have to consider _trade-offs_.
> 
> (e.g. a single quad-core CPU takes up less space on a laptop motherboard than a dual-core CPU plus a dedicated GPU chip.)


It's not a matter of choices but of what's practical; the CPU power needed for software 3D would cost massively more than an add-in card or a compatible integrated chip.

They needed an eight-core Core i7, probably the most powerful CPU available to consumers to date, to get slightly more performance (a 2fps increase) in 3D rendering than Intel integrated graphics. A bit much, don't you think? Rendering Aero on the CPU would probably tie up two or three cores to keep everything running smoothly; it's impractical when GPUs perform so much better for far less money.

It's not about choices and trade-offs, not when it is so one-sided.


----------



## Runefox (Nov 29, 2008)

Surely you've heard of Intel's foray into the "GPU on the CPU" concept?


----------



## ADF (Nov 29, 2008)

Runefox said:


> Surely you've heard of Intel's foray into the "GPU on the CPU" concept?



That would be AMD's Fusion processor; Intel's Larrabee is an add-in card. Either way, neither is relevant here, because what Microsoft is talking about is old-fashioned graphics on the CPU, just done with DX10.

There is a reason we moved graphics work away from the CPU: CPUs are far less efficient at it and cost significantly more to do the same job. Whereas a cheap GPU would run Aero just fine, we are talking several CPU cores to do the exact same thing. It's actually far more expensive to run Aero on the CPU than on a GPU; how this is supposed to make Aero more accessible is beyond me.


----------



## indrora (Nov 29, 2008)

Yeeeaaah, I've heard of the "GPU on the CPU" thing -- load of crap.
I have an Nvidia graphics card for a REASON: it's so I CAN crank it up to full res and play Garry's Mod at full blast.


----------



## Runefox (Nov 29, 2008)

(Actually, aside from Larrabee, Intel does have a GPU-on-CPU coming down the pipes: Auburndale. Also, all of the next-gen Intel chips have a direct PCI Express connection to the GPU.)

I guess what they're getting at is that as CPUs become more powerful (the Core i7, for a recent example of a performance spike) and more efficient, it might be better to let the CPU handle graphics-related tasks than to deny the user the ability to run those tasks at all. Taking up a single core, or a Hyper-Threading virtual core, on something like a Core i7 (which should be more mainstream by Windows 7's release) won't adversely affect most other tasks.

That said, games would probably suck quite heartily under this approach; I'm sure what they're trying to pull here is a cheap answer to the "why can't I have Aero?" question, and a way to help laptop manufacturers forgo a power-hungry hardware accelerator chip, perhaps by running all graphics in software.

And of course, I wouldn't recommend actually using this feature, but it's there. A bit of a waste of time, maybe, since most PCs nowadays have rudimentary 3D acceleration at worst.


----------



## mrredfox (Nov 29, 2008)

The whole point of a GPU is that it does all of the graphics processing, so the CPU, RAM and motherboard have more resources for other tasks (hence why dedicated cards are better than integrated). Now you would have to have a fucking awesome CPU to cope with both GPU and system processes, and you would also need the RAM to back that up, and a decent motherboard that can cope with it all. Not to mention that Vista or Windows 7 does/will need a spectacular specification for it to be effective.


----------



## Runefox (Nov 29, 2008)

mrredfox said:


> The whole point of a GPU is that it does all of the graphics processing, so the CPU, RAM and motherboard have more resources for other tasks (hence why dedicated cards are better than integrated). Now you would have to have a fucking awesome CPU to cope with both GPU and system processes, and you would also need the RAM to back that up, and a decent motherboard that can cope with it all. Not to mention that Vista or Windows 7 does/will need a spectacular specification for it to be effective.



Well, to be brutally honest, if a GMA 950 can do it (in precisely the way you mention, just as an off-CPU chip), so can a Core i7 core, or even a Core 2 Quad core. 

The same thing could be said of the sound card: that used to offload ALL sound processing tasks from the motherboard and CPU, and now we've turned around 180 degrees, to the point where Microsoft has even dropped DirectSound from DirectX.

Considering Intel has always been drooling over the possibility of Ray Tracing on the CPU and not requiring a GPU at all, perhaps this is part of Microsoft's infallible will to please Intel?


----------



## mrredfox (Nov 29, 2008)

Runefox said:


> Well, to be brutally honest, if a GMA 950 can do it (in precisely the way you mention, just as an off-CPU chip), so can a Core i7 core, or even a Core 2 Quad core.
> 
> The same thing could be said of the sound card: that used to offload ALL sound processing tasks from the motherboard and CPU, and now we've turned around 180 degrees, to the point where Microsoft has even dropped DirectSound from DirectX.
> 
> Considering Intel has always been drooling over the possibility of Ray Tracing on the CPU and not requiring a GPU at all, perhaps this is part of Microsoft's infallible will to please Intel?


Well, I have a quad core with 2 gigs of RAM running XP, and the CPU alone costs £180. If all new PCs had to have a MINIMUM of that, I doubt they would sell many, as they would all be £500+, which people don't want to pay. Also, the PC won't run as efficiently if the CPU literally has to do everything, rather than having two components each do their own job. It's like telling me I have to write a game, but also create the concept art, the graphic art, the mission stories and the 3D renders all at the same time; each component has its own job and should keep it that way. And as you said, the cooling would be ridiculous; you would need like a 20 cm fan minimum for it to run cool enough.


----------



## Runefox (Nov 29, 2008)

Ah, don't worry about that. Dual-core processors were only just becoming popular by the time Vista was released, and they were rather expensive, too (the Core 2 Duo E6300 was about the same price as the Core 2 Quad Q6600 is today, IIRC). Same with the Pentium 4 processors and Windows XP (though the point of having a Pentium 4 back then was pretty nonexistent, since the performance gap between the P4 and P3 processors was very slight at the time). By the time Windows 7 is released, we'll be seeing a lot more quad-core CPUs hitting the market.


----------



## lilEmber (Nov 29, 2008)

Also to toss in a bit of information here, Direct3D 11 is what's being shipped with Windows 7, not Direct3D 10.


----------



## ADF (Nov 30, 2008)

NewfDraggie said:


> Also to toss in a bit of information here, Direct3D 11 is what's being shipped with Windows 7, not Direct3D 10.


DX11 is not a backwards-compatibility cutoff like DX10 was; both DX10 and DX11 will run under Windows 7. It just seems, according to this article, that DX10 mode will also run in software.


----------



## lilEmber (Nov 30, 2008)

ADF said:


> DX11 is not a backwards-compatibility cutoff like DX10 was; both DX10 and DX11 will run under Windows 7. It just seems, according to this article, that DX10 mode will also run in software.



What that means is: basically, you have DX11 coming with Windows 7, but 10.1 (not 10, but 10.1; slight difference, not a lot) will be the "backwards"-compatible mode you're speaking of. Windows 7 will utilize the CPU to run DX10.1 IF your video card can't handle it on its own. Basically, it allows you to run it with no issues, even if your video card can't run or support it alone.

If you don't like this (which is kinda stupid, seeing as you've said you've got an 8800-series card and this will be no issue for you), then you will need to purchase a video card to lighten the load on your CPU. Not that it will matter, because as Runefox said, the new cores like the i7 can get better FPS (like 5 more, big whoop) in games doing it this way in Windows 7 versus not doing it in Vista.

You are basically able to run things better, including things you would otherwise be unable to run because you didn't whip out cash for a better video card.

At least, so far that's my understanding from this article, which I read through briefly (I'm being honest here; I never read it all).


----------



## ADF (Nov 30, 2008)

For some odd reason I somewhat doubt that the average Joe consumer, who apparently cannot afford the GPU needed for Aero Glass, has a Core i7 sitting idle as a substitute for the job. Something about them being the most powerful processors you can buy today, natively quad-core and DDR3-exclusive, tells me they will not be in cheap budget machines in two years' time.

I'm not saying this affects me; I just think this is a ridiculous idea if Microsoft thinks it will broaden the Aero Glass-ready system spec.


----------



## lilEmber (Nov 30, 2008)

Uh... it can work with a P4 core clocked at 800MHz. :\



> The minimum CPU spec needed is just a simple 800MHz CPU, and it doesn't even need MMX or SSE, although Microsoft says that WARP 10 will work much quicker on multi-core CPUs with SSE 4.1.


From the link I posted, here.

You can read about this, it's called WARP.


----------



## ADF (Nov 30, 2008)

NewfDraggie said:


> Uh... it can work with a P4 core clocked at 800MHz. :\
> 
> From the link I posted, here.
> 
> You can read about this, it's called WARP.


And the minimum to run Crysis is a 2.8GHz Pentium 4; that doesn't mean it will run it well. We are also talking about the company currently in a court case over misleading minimum specifications when labelling PCs "Vista ready".

Also, the link you posted before is the exact same link as in the original post.

Microsoft is implementing this to broaden what can be considered Aero-ready in consumer PCs; but seeing how a software 3D setup will be more costly than a GPU setup, it is hardly going to make previously non-Aero-ready machines ready in Windows 7. If you buy a Vista retail machine today and it doesn't have the GPU to run Aero, it is doubtful it will have a CPU that can take its place, because it will be a bottom-of-the-line machine.


----------



## lilEmber (Nov 30, 2008)

ADF said:


> And the minimum to run Crysis is a 2.8GHz Pentium 4; that doesn't mean it will run it well. We are also talking about the company currently in a court case over misleading minimum specifications when labelling PCs "Vista ready".
>
> Also, the link you posted before is the exact same link as in the original post.
>
> Microsoft is implementing this to broaden what can be considered Aero-ready in consumer PCs; but seeing how a software 3D setup will be more costly than a GPU setup, it is hardly going to make previously non-Aero-ready machines ready in Windows 7. If you buy a Vista retail machine today and it doesn't have the GPU to run Aero, it is doubtful it will have a CPU that can take its place, because it will be a bottom-of-the-line machine.



Wait... what? You're saying that because of this you will be unable to get a decent framerate with a P4 in Crysis?

*Facepalm, hard*
You do know that Windows 7 will be -lighter- than Vista, right? It will give higher framerates in games, too.


----------



## ADF (Nov 30, 2008)

NewfDraggie said:


> Wait... what? You're saying that because of this you will be unable to get a decent framerate with a P4 in Crysis?
> 
> *Facepalm, hard*


I only brought up Crysis to demonstrate that a minimum spec is not evidence of something running well; the reality often requires considerably more performance. Anything else you may have read into that post is of your own creation.

That said, what exactly are you trying to argue here? Using a standard general-purpose CPU for 3D graphics is a bad idea; no amount of rationalizing is going to change that.


----------



## lilEmber (Nov 30, 2008)

ADF said:


> I only brought up Crysis to demonstrate that a minimum spec is not evidence of something running well; the reality often requires considerably more performance. Anything else you may have read into that post is of your own creation.
>
> That said, what exactly are you trying to argue here? Using a standard general-purpose CPU for 3D graphics is a bad idea; no amount of rationalizing is going to change that.



No... it will only use the CPU if you don't have the video hardware, or it will share the load. So instead of not working at all, it works sluggishly. So basically you're arguing that people need to purchase a new video card if they want Windows 7; that's fine, I mean, they don't have to get Windows 7, so they don't have to get a new video card. But if they want Windows 7 and aren't gaming, running Direct3D 10/10.1 through their current video card and CPU will be fine. If they're gaming, then they're going to need a new video card, which is hardly surprising, seeing as I've never seen anybody able to game in DX10 with a non-DX10 card... hmm... or anybody planning on gaming within the past year without having something at least DX10-capable.


----------



## Xenofur (Nov 30, 2008)

ADF said:


> I only brought up Crysis to demonstrate that a minimum spec is not evidence of something running well; the reality often requires considerably more performance. Anything else you may have read into that post is of your own creation.


You apparently didn't read the fucking article. Let me quote the relevant parts here.



> In what could be seen as an easy answer to the Vista-capable debacle


Translation: They're trying to make Aero feasible for people without needing to shell out for a full gfx card when they're only going to run MS Office anyhow.



> Running Crysis at 800 x 600 with the lowest quality settings, an eight-core Core i7 system managed an average frame rate of 7.36fps, compared with 5.17fps from Intel's DirectX 10 integrated graphics.


In order to test it, they ran one of the most ridiculous games in regard to hardware requirements, and also one that is a fucking industry standard at this point. They managed to get a 42% improvement over an Intel card.
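For what it's worth, the quoted numbers do work out to roughly 42%; a quick sanity check, using only the two frame rates from the article and nothing else:

```python
# Frame rates quoted in the article for Crysis at 800x600, lowest settings.
warp_fps = 7.36        # eight-core Core i7 running the software rasterizer
integrated_fps = 5.17  # Intel DirectX 10 integrated graphics

# Relative improvement of the software path over the integrated chip.
improvement = (warp_fps - integrated_fps) / integrated_fps * 100
print(f"Software rendering is {improvement:.0f}% faster")  # -> 42% faster
```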



> Microsoft says that the technology is also targeted at casual games


The actual target isn't the Crysis crowd, but games which make the GPU yawn ANYHOW and which can easily run on the CPU as long as it knows HOW to run them.


----------



## net-cat (Nov 30, 2008)

It should also be pointed out that, as far as hardware-accelerated graphics go, Aero is pretty damned trivial, despite the fact that they coded it to take advantage of things available in DX9.

This is not a way for people to run Crysis. (Although CPUs might get there eventually.) This is a way to run Aero. Why fly a 747 when all you need is a Cessna?


----------



## TheGreatCrusader (Nov 30, 2008)

ADF said:


> Microsoft seems to have forgotten why we moved from software to hardware graphics: they are adding software DX10 rendering to Windows 7.
> 
> Link



Maybe it's because you need a balance between software and hardware. You can have a brilliant machine, but if you have DirectX 7 installed on your system, then the games will look like crap, and vice versa. That's why Microsoft updates DirectX every few years: so it can stay up to par with the hardware advances in the industry. Besides, Microsoft has done this in the past. They packaged DirectX 8 with XP; that was brilliant software and it changed the way we looked at games. The problem with DirectX 10 was that it was buggy, it took massive systems to run it correctly, and not many games supported it at the time of its initial release.



> Surely a system with a cheap/integrated GPU thrown in for Aero Glass would be significantly cheaper than a CPU capable of running it? Vista likes a dual core just for running the OS day to day, and likely Windows 7 will too; they want to add 3D processing on top of that?


Microsoft has already confirmed that they are optimizing 7 for systems with single-core processors and a small amount of RAM. In other words, it's going to take LESS to run 7 than it takes to run Vista.

I'm looking forward to 7. From what I have seen, it has lots of useful features that I would use and it will be DirectX 11 compatible, similar to XP and DirectX 9.

http://gizmodo.com/5070219/giz-explains-why-windows-7-will-smash-vista


----------



## ADF (Nov 30, 2008)

Xenofur said:


> You apparently didn't read the fucking article. Let me quote the relevant parts here.
> 
> Translation: They're trying to make Aero feasible for people without needing to shell out for a full gfx card when they're only going to run MS Office anyhow.


I get that part; the point I'm making is that software graphics make Aero even less accessible than GPU graphics, because they require a more expensive machine than a GPU solution does. So how is this making Aero more accessible?



Xenofur said:


> In order to test it, they ran one of the most ridiculous games in regard to hardware requirements, and also one that is a fucking industry standard at this point. They managed to get a 42% improvement over an Intel card.


And it only required the most powerful retail CPU known to date; so much better than an integrated chip that costs as much as a cup of Starbucks coffee. Care to do the math on price per frame?
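To make that taunt concrete, here is a rough price-per-frame comparison. The frame rates are the article's; the prices are purely illustrative guesses on my part (call it $1000 for a top-end Core i7 and $5 for the integrated chip), not quoted figures:

```python
# (price in USD, average Crysis fps from the article)
# Prices are illustrative assumptions, NOT figures from the article.
options = {
    "Core i7 (software rendering)": (1000.0, 7.36),
    "Intel integrated (hardware)": (5.0, 5.17),
}

for name, (price_usd, fps) in options.items():
    # Dollars spent per frame-per-second delivered.
    print(f"{name}: ${price_usd / fps:.2f} per frame/s")
```

Under those assumptions the CPU route costs over a hundred times more per frame, which is the point being made.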



Xenofur said:


> The actual target isn't the Crysis crowd, but games which make the GPU yawn ANYHOW and which can easily run on the CPU as long as it knows HOW to run them.


At no point have I judged this from a gaming point of view. I only brought up Crysis to show that you cannot trust minimum specifications; just because WARP will run on an 800MHz minimum doesn't mean that is all it will take to run Aero well.

I seriously don't want to waste my time in an argument over whether software graphics is suitable; that question is ancient history, and software lost. NewfDraggie has a habit of getting into arguments with me, so I can expect him to be argumentative; I don't expect others to question software vs hardware graphics, however.

Dedicating a core or two just to running a 3D interface is highly inefficient, especially when a cheaper and better-performing solution is available. Software vs hardware graphics performance is not up for debate; it is a well-known fact, this is the route the industry has taken, and it has worked pretty well for a long time. Even if every retail machine is equipped with a quad core by Windows 7, talk about a waste of performance that could have been used to actually speed up day-to-day programs.

Now, are people seriously going to continue to debate the merits of having an entire operating system's 3D interface running in software, or can I give this a rest?


----------



## net-cat (Nov 30, 2008)

Not in terms of performance, no. Any time you have a specially engineered part (GPU) versus a generic one (CPU), the specially engineered part is going to win on performance (unless it was really old or really badly designed).

In terms of costs and board design? You're damn right I am. It's about volume. 

Let's take your claim of "an integrated chip that costs as much as a cup of Starbucks coffee." Let's say $5. Now make five million boards with that chip on it. Oh, look. $25,000,000.

Now, you've pointed out many, many times that it took the latest and greatest core from Intel to almost kind of sort of not suck. So what? That "latest and greatest core" will eventually be a dime a dozen. So, while it may not be economically feasible with chips on the market today, _that won't be true in a year or two._ Remember when Dual Core was "cutting edge" and insanely expensive? Now they make Celeron Dual-Core. So, while Quad Core and more are expensive and cutting edge now, they _will_ become dirt cheap to the point of it being more economically feasible to do graphics on CPU for basic and trivial things like Aero.

While this may seem quaint and silly to gamers, anyone who has done any work in board design for mass production is looking at this as a way to cut costs in the future.


----------



## ADF (Nov 30, 2008)

net-cat said:


> Not in terms of performance, no. Any time you have a specially engineered part (GPU) versus a generic one (CPU), the specially engineered part is going to win on performance (unless it was really old or really badly designed).
> 
> In terms of costs and board design? You're damn right I am. It's about volume.
> 
> Let's take your claim of "an integrated chip that costs as much as a cup of Starbucks coffee." *Let's say $5. Now make five million boards with that chip on it. Oh, look. $25,000,000.*


And CPUs are going to be in a better position? It took an eight-core chip to perform slightly better than an Intel integrated chip. That is *never*, under *any* circumstances, going to become a cheap standard in the next couple of years, and especially not in time for Windows 7. Even if by some miracle it did, an add-in GPU or integrated solution is always going to be significantly cheaper for the same performance.



net-cat said:


> Now, you've pointed out many, many times that it took the latest and greatest core from Intel to almost kind of sort of not suck. *So what? That "latest and greatest core" will eventually be a dime a dozen.* So, while it may not be economically feasible with chips on the market today, _that won't be true in a year or two._ Remember when Dual Core was "cutting edge" and insanely expensive? Now they make Celeron Dual-Core. So, while Quad Core and more are expensive and cutting edge now, they _will_ become dirt cheap to the point of it being more economically feasible to do graphics on CPU for basic and trivial things like Aero.
> 
> While this may seem quaint and silly to gamers, anyone who has done any work in board design for mass production is looking at this as a way to cut costs in the future.



We are talking about a natively quad-core, DDR3-only, top-of-the-line CPU that has only just been released. On top of that, the performance benchmark was done on a yet-to-be-released eight-core chip that is still in the works. I think it is going to take a tad more than two years for that to become "a dime a dozen"; technology doesn't get cheap that fast.

Seriously though, I just find it amazing that this is even up for debate in 2008. It just goes to show that anything said on the Internet, no matter how true, will always have someone disagreeing with it. We are clearly not going to convince each other, so how about stopping here? When Windows 7 comes out we will see how a modern 3D interface runs in software mode; I'm just saying, right here right now, that it is a terrible idea.


----------



## Runefox (Nov 30, 2008)

It isn't about performance, though; it's about compatibility and forward compatibility. In a decade or two, how much do you want to bet that you'll be running something incapable of running games like Crysis because the APIs are completely different? Emulation in a virtual machine is then the way to go, and look at that: no 3D acceleration possible there. Hmm. Thank god for the software renderer, eh? There are already examples of that in today's world: Final Fantasy VII, for one, and any number of the games that jumped on the 3dfx bandwagon before OpenGL and DirectX were popular (or even around).

And going closer to our own time, you can run Windows in a virtual machine and have things like Aero and decent video playback working, instead of disallowing them altogether. By the time Windows 7 is released, we'll definitely be running quad cores like they're going out of style, and octo-core CPUs will be hitting the market in force. All with Hyper-Threading.

It's really not so far-fetched. The point is, it's not supposed to compete with discrete graphics, but supplement it or offer alternatives when it's not feasible to use it.

EDIT:


> It just seems, according to this article, that DX10 mode will also run in software.


Is that what you think this is? No, this is a software _fallback_: in other words, piping the graphics into a software renderer when a proper hardware renderer isn't available, without notifying the application (unlike the current method for choosing the software renderer, which alerts the application and requires the SDK). In essence, it is transparent: if you have an accelerator, it will accelerate; if not, it'll use software. If you want to compare it to something else in the computing world, Mesa3D on Linux is pretty much precisely the same thing.
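Mechanically, the transparent fallback being described looks something like this sketch. The renderer classes here are hypothetical stand-ins, not the real D3D API; as far as I know, in Direct3D 10.1 the equivalent is passing `D3D10_DRIVER_TYPE_WARP` instead of `D3D10_DRIVER_TYPE_HARDWARE` when creating the device:

```python
# Hypothetical sketch of a transparent software fallback: the caller asks
# for "a device" and never learns which back end it actually got.
class HardwareRenderer:
    def __init__(self):
        # Pretend this machine has no DX10-capable GPU.
        raise RuntimeError("no DX10-capable GPU present")

class SoftwareRenderer:
    """Stand-in for a WARP-style software rasterizer."""
    name = "software (WARP-style)"

def create_device():
    """Try hardware first; silently fall back to software."""
    try:
        return HardwareRenderer()
    except RuntimeError:
        return SoftwareRenderer()

device = create_device()
print("Rendering on:", device.name)  # the application never had to ask
```

The point is that the choice happens inside `create_device()`, so the application doesn't need to know which path it got; that is exactly what makes the fallback transparent.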


----------



## Xenofur (Nov 30, 2008)

ADF said:
			
		

> :words:



Dude. You're an idiot.

Next time you read a post, read it as one thing, not just each sentence on its own.

Now, before I go one step further, I'll need to say this, and I'll need you to realize and accept it, otherwise any sort of communication with you is utterly pointless and a waste of time:

Aero does not work on Intel cards right now. In order to make Aero work one has to buy a really expensive graphics card, or in the case of laptops, an entirely new laptop. Microsoft set out to rectify that.

Do you understand?


----------



## net-cat (Nov 30, 2008)

Yes, it took an eight-core processor to match the performance of a chip running something it was never meant to run in the first place.

Crysis and the people who would run it are not, and never would be, the target of this technology. Just because it takes eight cores to run Crysis at 7 FPS in pre-release software does not mean it will take eight cores to run Aero, or any number of other basic, non-gaming applications. Using Crysis as the benchmark for this is utterly meaningless at best and outright misleading at worst.

For the record, DirectX has always supported software rendering in development environments, and that continues to be one of the targets. And they think they're getting good enough at this to break into the low-end graphics market. Time will tell if they are right. But consolidating many chips onto single chips is a trend that's been going on for decades; graphics are not immune.



Xenofur said:


> Aero does not work on Intel cards right now. In order to make Aero work one has to buy a really expensive graphics card, or in the case of laptops, an entirely new laptop. Microsoft set out to rectify that.


Bzzt. Wrong. Aero has "worked" on every Intel graphics chipset since the GMA 950. It requires DirectX 9c support.


----------



## CyberFoxx (Nov 30, 2008)

On one side this almost looks like another example of "everything hardware is software now" (e.g. software mixing of sound, softmodems, softprinters).

But on the other hand, this looks like MS actually admitting that they goofed up on something. "Yeah, offloading desktop rendering to the GPU was a good idea, but we added too many bells and whistles for it to be efficient on the average on-board card, which is what most of our customers have."

Hell, when Compiz had to use the CPU to do some of the effects on some Nvidia cards, until Nvidia fixed the binary blob, there was hardly any complaint. But as soon as MS screws up, people jump down their throat. Then again, I guess that's the difference: "experimental" software is expected to have shortcomings, but "release-grade" software is supposed to "work all the time, every time." And let's not get into the whole "But I paid real money for it to 'work perfectly all the time, every time'!!!!!!"

(Insert rant about "making stuff work perfectly on an insanely complex and always changing configuration platform, and the amazement that any OS even manages to boot" here.)


----------



## net-cat (Nov 30, 2008)

CyberFoxx said:


> Hell, when Compiz had to use the CPU to do some of the effects on some Nvidia cards, until Nvidia fixed the binary blob, there was hardly any complaint.


This has always bothered me about Aero. Compiz worked fine on my GeForce 4 at work, and Apple has been doing something similar in OS X with a lot less.


----------



## ADF (Nov 30, 2008)

Xenofur said:


> Do you understand?


Do you understand that I cannot be arsed to get into a debate over something that was resolved over a decade ago, and have taken a wait-and-see approach now? Read my previous post; this is not something worth getting into a huge argument over.

Do you also understand that my reference to the integrated chip was about the price/performance difference between the eight-core chip and the Intel integrated graphics, and had nothing to do with the capability of running Aero?

Good, now leave me in peace and keep your insults to yourself.


----------



## lilEmber (Nov 30, 2008)

ADF, you're not understanding what this means...

Ok, let me break this down -really- simple for you.

This allows people who can't run Windows Vista to run Windows 7, and people who can already run Vista will be able to run Windows 7 -much- better and faster.

It's not -only- software, but instead a *fallback* in case your GPU can't take it or run it. This is a *fallback* system, so that a shitty single-core computer with integrated video will be able to run Windows 7. This is not a step down in -any- way; instead it is a massive boost up in all ways.


----------



## DuncanFox (Nov 30, 2008)

ADF said:


> Now, are people seriously going to continue to debate the merits of having an entire operating system's 3D interface running in software...



Microsoft, the company which helped make Bill Gates the richest man in the world, with a market cap of $151 Billion, has decided to spend significant time and money to make DX10 work in software.

You ... make belligerent posts about it on a furry forum.

Whose opinion am I more inclined to believe on the merits of running DX10 in software?


----------



## mrredfox (Nov 30, 2008)

DuncanFox said:


> Microsoft, the company which helped make Bill Gates the richest man in the world, with a market cap of $151 Billion, has decided to spend significant time and money to make DX10 work in software.
> 
> You ... make belligerent posts about it on a furry forum.
> 
> Whose opinion am I more inclined to believe on the merits of running DX10 in software?


bill gates owns microsoft.


----------



## DuncanFox (Nov 30, 2008)

mrredfox said:


> bill gates owns microsoft.



Yes, yes he does.  Thank you for that.


----------



## mrredfox (Nov 30, 2008)

DuncanFox said:


> Yes, yes he does.  Thank you for that.


you're welcome


----------



## TheGreatCrusader (Nov 30, 2008)

mrredfox said:


> bill gates owns microsoft.


No, no he doesn't. He was the CEO of Microsoft for years, not the owner. Microsoft is a publicly held company. Nobody can _really_ own it.


----------



## ADF (Nov 30, 2008)

DuncanFox said:


> Microsoft, the company which helped make Bill Gates the richest man in the world, with a market cap of $151 Billion, has decided to spend significant time and money to make DX10 work in software.
> 
> You ... make belligerent posts about it on a furry forum.
> 
> Whose opinion am I more inclined to believe on the merits of running DX10 in software?



Appeal to authority much? I'm attempting to drop out of this discussion now, but silly comments like this are irritating. Being Microsoft does not rewrite the history of graphics acceleration, and currently being in a court case for false advertising, over lowering the 'Vista ready' specification so Intel hardware could qualify, doesn't help their credibility either.


----------



## Xenofur (Nov 30, 2008)

Please show me how all this fancy graphics acceleration does shit for voxel technology. :v

Also, you started this shit, now live with it. If you want to get out, DO IT, instead of talking about it and playing victim.


----------



## DuncanFox (Nov 30, 2008)

ADF said:


> Appeal to authority much?



Congratulations, you know the name of a logical fallacy.  Now work on applying it properly.

http://en.wikipedia.org/wiki/Appeal_to_authority



> Since we cannot have detailed knowledge of a great many topics, we must often rely on the judgments of those who do. There is no fallacy involved in simply arguing that the assertion made by an authority is true, in contrast to claiming that the authority is infallible in principle and can hence be exempted from criticism: It can be true, the truth can merely not be proven, or made probable by attributing it to the authority, and the assumption that the assertion was true might be subject to criticism and turn out to have actually been wrong.



To put it in simpler terms for you, just because you can correctly label something an "Appeal to Authority," that does not dismiss it.

Microsoft may turn out to be wrong about this.  But it's a topic they know a lot about and have arguably more experience in than anyone else, and they're willing to devote significant time and resources to the project.  Their combined experience and confidence speaks volumes about the probability of their being correct.

The fact that they are an "Authority" does not automatically make them wrong.  In fact, it does quite the opposite: it puts more weight behind their argument.

It would be a logical fallacy to say "Microsoft is always correct, therefore they are correct about it."  But that's not what I'm saying.  I'm saying they probably know more about what they're doing than you do.


----------



## ADF (Nov 30, 2008)

You assume I am wrong on the basis that Microsoft are the ones doing it, even though we have a decade-plus history of GPUs indisputably replacing CPUs for graphics, doing better in both performance and price. It doesn't matter how much experience they have; what they are doing goes against what has been tried and proven for *years*. Despite all this, if you still think I'm wrong just because it is Microsoft doing it, you're not even using evidence; you are relying on the authority of the name *as* evidence. 

As I keep saying, Microsoft is currently in court for lying about Vista hardware requirements; e-mail evidence was retrieved of communications with Intel about reducing the 'Vista ready' requirements below what was actually needed, just to help Intel out at the consumers' expense. This is the company whose word you're taking, over the tried and proven GPU acceleration that is still in use to this day.

You know what, to hell with it. I just think it is a stupid idea because a CPU costs more to do the same job as a GPU, which negates the supposed broadening of the audience that dropping the GPU requirement would bring. That is my view on this, and I have gotten it across plenty of times now.

Thank you to the people who, although they disagreed with me, kept it civil. Since I have a bad habit of getting pulled back into topics I want to leave based on user responses, I'm requesting that this gets locked.


----------



## DuncanFox (Nov 30, 2008)

ADF said:


> Despite all this if you still think I'm wrong just because it is Microsoft that's doing it...





DuncanFox said:


> It would be a logical fallacy to say "Microsoft is always correct, therefore they are correct about it."  But that's not what I'm saying.  I'm saying they probably know more about what they're doing than you do.



Read first, then reply.  That order is important.


----------



## Archibald Ironfist (Nov 30, 2008)

Considering that AMD is integrating the GPU, CPU, and memory controller on one package, Intel plans to, and VIA has been doing so for a while, this is quite logical.

The GPU is both physically closer, more efficient, and better integrated when it's part of the CPU board, chip, or substructure.  Even a slower CPU-GPU link with lower latency has higher performance than a faster GPU with high latency.  And besides, this is the entire reason PCI Express 2.0 added the option of a direct CPU-GPU link.  It will deliver equal or better performance with less wattage, less heat output, and less latency.


----------



## Runefox (Nov 30, 2008)

Archibald Ironfist said:


> It will deliver equal or better performance with less wattage, less heat output, and less latency.



That's debatable. I guess you could say it takes less wattage than two separate chips, and has less heat output than two separate chips, but the single chip will produce much more heat than either one of the two chips did before. That means the cooling for that chip needs to be more efficient, which I'm sure they're fine with doing, but from all the laptops I've seen that had to be taken apart to be cleaned properly because of all the dust, I wonder what the life span of these units will be.


----------



## yak (Dec 1, 2008)

Xenofur said:


> They're trying to make Aero feasible for people without needing to shell out for a full gfx card when they're only going to run MS Office anyhow.


I've got a better solution - kill the GMA* series of embedded graphics chips. Honestly, just let them die a painful death already.
They're horribly outdated, cheap as a handful of dirt, and they're slowing down the progress of mobile solutions: better embedded chips can't quite compete on price even though they're much more advanced, and manufacturers stick to what's cheaper.

AMD embedded graphics are more than suitable for running light-to-medium games, much less Aero.


-----

I am baffled.

Guys, guys, we are talking about a program that can display a list of my files and launch my applications. Why in the name of all things holy should it require 1 GB of RAM and a dual-core processor just to run? Why should typing text in a document require as much raw processing power as guiding a thousand intercontinental nuclear missiles to their destinations all at once?

I have no idea why people put so much stock in these bells-and-whistles eye-candy interfaces. Is it like the super-mandatory thing all Vista users can't live without if they can't use Aero on their bargain laptops?


I do agree with ADF.

If a laptop was cheap enough not to include a graphics card with DirectX 9 support, do you really think it will have a CPU powerful enough to run software graphics? 

I mean, Vista on budget laptops already runs like shit even without _any themes enabled at all_, much less with Aero running. It's already eating away at the CPU like crazy for god knows what, it lags like hell, and people are already trying to disable as much stuff as possible in futile attempts to make it lag less.

You really think these people have a raging boner for running Aero? They are the only people software DX10 is targeted at, since everyone else will run Aero on hardware with no problem. They are stuck with Vista because it's what came preinstalled, and they don't have enough computer knowledge to downgrade back to XP. Their experience is already miserable, for crying out loud; don't make it worse than it already is.


Software DX10 is not a solution to the 'Vista ready' fiasco - it's a lame attempt to put a big enough plug into the problem and hope it goes away.


----------



## Runefox (Dec 1, 2008)

Yak: While it isn't strictly necessary, something like this does the following:

1) It enables the Larrabee chipset to nicely use DX10 whether or not it actually can when it's released. (MSFT is in bed with Intel; otherwise the GMA series wouldn't have been certified as "Vista Compatible" to begin with.)

2) More importantly, it will enable DX10 to be run in software at a future date, when we'll all be running (X) technology that has no support for our ancient ways of today. Try using anything requiring Direct3D in a virtual machine under VMware or VirtualBox; it's not going to happen, and unless a graphics API comes out that allows itself to be piped like that (with guest OS drivers that accept such calls), we're looking at software rendering or no rendering. With software DX10, things like a virtualized workstation or thin client can be used with all the features of a full machine, or, a decade from now, we can boot up our lowly Windows 7 virtual machine and play some good old Crysis. Remember Crysis? Man, it was a love/hate kind of game. The graphics are so rudimentary, but man, it was awesome back then.

In my opinion, aside from pleasing Intel, there's nothing that this technology will do for us _today_. But I'm all for it.


----------



## yak (Dec 1, 2008)

But it _was_ presented as a solution to the Vista compatibility fiasco, wasn't it? That's not a solution for the future; that's a solution to current problems.


And honestly, I don't quite see the need to think that far into the future, assuming everyone will already be running mobile octo-core CPUs. The architecture in them will likely be different, mobile power-saving features will likely evolve much further in both the CPU and the video cards, video cards might be fused with CPUs, blah, blah, etc. etc., in the end eliminating the need for software emulation like that.


As for virtualization, IMO, the way to go is to give the underlying OS direct/switching/whatever access to the video hardware rather than trying to emulate all its functions with a severe performance loss.


----------



## net-cat (Dec 1, 2008)

yak said:


> But it _was_ presented as a solution to the Vista compatibility fiasco, wasn't it?


Not by Microsoft, it wasn't. Just because custompc.co.uk says so doesn't make it so.



yak said:


> The architecture in them will likely be different, mobile power-saving features will likely evolve much further in both the CPU and the video cards, video cards might be fused with CPUs, blah, blah, etc. etc., in the end eliminating the need for software emulation like that.


You do realize that the x86 instruction set hasn't changed significantly since the 386, right? Sure, it's been extended and instructions have been added along the way. But code that ran on an 8088 back in the day will run just fine on a Core 2 Extreme. Software may be slow, but it's also a safe bet.



yak said:


> As for virtualization, IMO, the way to go is to give the underlying OS direct/switching/whatever access to video graphics hardware rather then trying to emulate all the functions with a severe performance loss.


IAWTC


----------



## yak (Dec 1, 2008)

net-cat said:


> You do realize that the x86 instruction set hasn't changed significantly since the 386, right? Sure, it's been extended and instructions have been added along the way. But code that ran on an 8088 back in the day will run just fine on a Core 2 Extreme. Software may be slow, but it's also a safe bet.



Yes, but at that rate we'll not come close to implementing AI anytime soon; too slow.
There is a solution to the problem, right there on the surface. GPUs have been doing this for a while now.



> Shaders are used to allow a 3D application designer to program the graphics processing unit (GPU) "programmable pipeline", which has mostly superseded the older "fixed-function pipeline", allowing more flexibility in making use of advanced GPU programmability features.



CPU drivers could provide a HAL of sorts that the OS could use, for example; anything, whatever. 

But why would CPU vendors do such a thing? Research costs money, deployment costs money, and advertising and convincing regular Joe Average that he needs this new product and not a QUAD CORE 4GHZ CPU costs an ungodly amount of money, to the point of hardly turning any profit on a sale until the market becomes popular.

And why would they want to spend billions when they can milk the market dry by first releasing faster, then multi-core, versions of chips with the same architecture as the older ones?

Can't blame them really, that's business for you.


----------



## Xenofur (Dec 2, 2008)

Just noticed something else this'll be just plain awesome for: overclock testing. If you overclock your gfx card, you'll notice artifacts if you go too far. If you go just a *slight* bit too far, the artifacts may not be noticeable enough.

By rendering on the CPU, you have an absolutely pristine reference image to compare your gfx card's test render against, which of course can be automated.
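The automated comparison could look something like this minimal sketch: render the same frame on the CPU (reference) and the overclocked GPU, then flag pixels that differ beyond a tolerance as possible artifacts. Frames are modeled as plain lists of `(r, g, b)` tuples; the function name and tolerance are illustrative, and no real rendering API is involved.

```python
# Illustrative sketch of CPU-reference artifact detection for GPU
# overclock testing. A frame is a flat list of (r, g, b) pixel tuples.

def count_artifacts(reference, test, tolerance=8):
    """Count pixels whose colour channels differ from the CPU
    reference by more than `tolerance` (on a 0-255 scale)."""
    bad = 0
    for ref_px, test_px in zip(reference, test):
        if any(abs(a - b) > tolerance for a, b in zip(ref_px, test_px)):
            bad += 1
    return bad

cpu_frame = [(10, 10, 10), (200, 200, 200)]   # pristine CPU render
gpu_frame = [(10, 10, 10), (200, 120, 200)]   # one corrupted channel
assert count_artifacts(cpu_frame, gpu_frame) == 1
```

A real harness would render many frames and stop raising the clock once the artifact count crosses some threshold, but the core check is just this pixel-wise diff against the software-rendered reference.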


----------

