# The future of physics/animation, today! And free!



## Bokracroc (Jul 9, 2007)

http://www.naturalmotion.com/ele.htm

The rag-dolling of the future. It's simple to use (well, editing the existing scenes is) and funny too.
Like editing the Trapeze scene only for him to fall off halfway.
A 60 MB download is a small price to catch what they'll be doing in the future.


----------



## Zero_Point (Jul 9, 2007)

Too bad you can't develop any commercial application with it. As such, I don't consider it "free". And I'm much more impressed by Digital Molecular Matter myself. The applications for that engine are limitless. Imagine: selectively gibbing an opponent. *BAM!* There goes a thumb... *BAM!* There goes an ear...


----------



## ADF (Jul 9, 2007)

Like any of today's new simulation technologies it is also a CPU hog. Oh well, whatever gets Havok off its butt; they have been dominant for way too long and have done little in that time to improve their tech.


----------



## Zero_Point (Jul 9, 2007)

Actually, I think the new Havok engine is designed to run on multiple cores OR the graphics card if it supports it.


----------



## ADF (Jul 9, 2007)

Zero_Point said:

> Actually, I think the new Havok engine is designed to run on multiple cores OR the graphics card if it supports it.


Bleh! I hate that new thing, but it is Havok's answer to all the recent competition; they only announced these new concepts a couple of months after a lot of competitors popped up with more impressive APIs.


----------



## Bokracroc (Jul 9, 2007)

Zero_Point said:

> Too bad you can't develop any commercial application with it. As such, I don't consider it "free".


Look at who uses it:
http://www.naturalmotion.com/endorphin.htm
Be thankful they let you have an unlimited trial of it (you can't export).
From what I've scanned through, Digital Molecular Matter and endorphin/euphoria look like rivals of sorts.

The new GTA game is using euphoria (it does the same thing, just 'on the fly'). It's basically Ragdoll 2.1.


----------



## ADF (Jul 9, 2007)

http://uk.youtube.com/watch?v=3bKphYfUk-M

Here is a little something if you want to see a video demo of the euphoria API.


----------



## Zero_Point (Jul 9, 2007)

Bokracroc said:

> Zero_Point said:
> ...



The new Star Wars game uses both Euphoria and DMM to great effect, as will the next Indiana Jones game. Also, I consider stuff they let you try a demo, or at the least a physics "toy". All it's really for is to convince you that it's worth shelling out however many thousands of dollars it would cost to get a full dev license. I mean, you can't even make a real demo app out of it like you could with Novodex. When you said "It's FREE!", I assumed you meant "free to use for whatever".


----------



## OnyxVulpine (Jul 9, 2007)

In my opinion, it's not all THAT great. With the wood it just seems like they did the same thing as the first wood demo, just with a LOT more pieces, and whatever part it hits is the first to give way. It doesn't generate the cracks from the randomly thrown object. Same with the glass: it's all set up, it just gives way at the point of collision. With the metal, I'm pretty sure metal doesn't bend as one perfect curve like that. I don't know how to explain it, but it is better than anything I have previously seen, though not as real as I would like. You see the thrown object: it moves as fast as a tossed ball but devastates that metal.

Hehe... I'm going into Video Game design and simulation later on.. I'm gonna have to experience this stuff first hand a few years from now. If I'm still here til then I'll give you guys the heads up if I can ^^


----------



## ADF (Jul 9, 2007)

OnyxVulpine, I'm going to tell you what I tell everyone else: did we have photo realism with the first GPU? We are only now getting the CPU performance to power these effects in games thanks to multi-core processors. Like HDR, devs need to play with it for a while before it can be implemented effectively. Until then expect over-the-top and weird uses of next-gen physics as devs show off; CellFactor for instance (even though it is a different tech).


----------



## Bokracroc (Jul 9, 2007)

Zero_Point said:

> Bokracroc said:
> ...



When you look at the state of gaming these days (a nice bunch of games being basically glorified tech demos), yeah, this is free.
It offers a nice range of interactivity and capability, and insight into what the next few years of gaming, animation and movies might be capable of.
Shitloads better than watching hordes of "The Making Of" clips or a one-trick-pony game/thing.


----------



## ADF (Jul 9, 2007)

Bokracroc if you want something physics related to play with try the CellFactor beta demo, put EnablePhysX=false in the shortcut so it can be run without the PPU. The full game is out but software physics got nerfed.

Boom Boom ^.=.^


----------



## ceacar99 (Jul 9, 2007)

This sort of thing has been on my mind for a while: physics that affect players in game while they are still alive. It does bring problems to gameplay, but it also brings a new level of realism to whatever you want to play. I can imagine playing Red Orchestra and being knocked on your ass by a grenade that blew up in front of you but for some reason didn't shred you with shrapnel... The only issue is that you basically have to create an AI to interpret the controls in context. If someone is blown into a sitting position and they press forward, the game has to recognise that command as "get up", and so on.

The only issues I see arising in the system right now are how efficient it is and how easy it is for someone to code new content using that system into a game. With Havok in Oblivion, for example, it's a BITCH to add any new object to the world, mainly because you cannot create a new physbox and have to borrow one from another object... The efficiency could also be bad, because that system essentially generates animations on the fly; your CPU is literally thinking about how your character (and everything else) should move. It would look fantastic, especially seeing someone run over rough ground, but it might eat up a lot of power...


----------



## Bloodangel (Jul 9, 2007)

I prefer this kind of physics myself.


----------



## OnyxVulpine (Jul 9, 2007)

I was just throwing in what I thought it looked like. I said that it is better than what we have seen before. I just can't wait until it looks just about perfectly real to me.

I'm even a critic with special effects in movies, like how they rig explosions: why did the car blow up before it hit the wall? Why did the wall blow up? And all that good stuff. It's just my nature; I suppose I'm a bit of a perfectionist.


----------



## Zero_Point (Jul 9, 2007)

OnyxVulpine said:

> In my opinion, it's not all THAT great. With the wood it just seems like they did the same thing as the first wood demo, just with a LOT more pieces, and whatever part it hits is the first to give way. It doesn't generate the cracks from the randomly thrown object. Same with the glass: it's all set up, it just gives way at the point of collision. With the metal, I'm pretty sure metal doesn't bend as one perfect curve like that. I don't know how to explain it, but it is better than anything I have previously seen, though not as real as I would like. You see the thrown object: it moves as fast as a tossed ball but devastates that metal.



Actually, what's happening with DMM is that the engine is creating new vertices and collision boxes on the fly, and those actions are governed by a script. One can modify the script until the desired effect is achieved. The reason the metal was so over-the-top in the way it bent was so that you could clearly see "oh shit, the metal's denting!".
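A rough way to picture the "script-governed" part (purely illustrative; this is not DMM's actual API, and the material thresholds are invented numbers an artist would tweak):

```python
# Illustrative sketch (not real DMM code): deformable-material behaviour
# driven by a small, tweakable parameter table. The numbers get tuned
# until the dent or shatter "reads" clearly on screen.

MATERIALS = {
    # name: (yield_threshold, shatter_threshold) -- made-up units
    "wood":  (40.0, 90.0),
    "glass": (10.0, 12.0),   # barely bends before it shatters
    "metal": (25.0, 300.0),  # bends a lot, almost never shatters
}

def impact_response(material: str, impact_force: float) -> str:
    yield_t, shatter_t = MATERIALS[material]
    if impact_force >= shatter_t:
        return "fracture"   # engine would generate new vertices/colliders here
    if impact_force >= yield_t:
        return "deform"     # plastic dent: vertices displaced permanently
    return "elastic"        # springs back, no permanent change

print(impact_response("metal", 50.0))  # prints "deform"
```

Making the metal's yield threshold very low relative to its shatter threshold is exactly the "exaggerated dent" tuning described above.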


----------



## OnyxVulpine (Jul 9, 2007)

I'm only going off what I see; I'm a real noob when it comes to actual computer programs and such. But one day I will see ^^


----------



## Rostam The Grey (Jul 9, 2007)

Two words: Physics cards

The next generation of gaming will use not only the standard processor and video card; it will also utilize physics cards. Here is a preview of what you get... I particularly like the machine gun slicing the leaves up in the first video...


----------



## Bokracroc (Jul 9, 2007)

So far the PhysX stuff is a crock of shit.
CellFactor was glitchy and laggy as hell, even if you have a PhysX card. The Havok in HL2 did heaps better without a PhysX card


----------



## Rostam The Grey (Jul 10, 2007)

Bokracroc said:

> So far the PhysX stuff is a crock of shit.
> CellFactor was glitchy and laggy as hell, even if you have a PhysX card. The Havok in HL2 did heaps better without a PhysX card



Of course new stuff will have problems!


----------



## Bokracroc (Jul 10, 2007)

If you're going to release a tech demo, make sure the tech is actually useful.
ADF posted it: you can disable the PhysX physics on earlier builds with very little difference between them. Great showcasing there. After they plonked in cloth physics it actually mattered.


----------



## ADF (Jul 10, 2007)

The problem with the PPU is not its capabilities; it has proven numerous times that it completes the tasks it is given far better than the CPU does. The card has been running several years now without needing to be upgraded, and API improvements over time would also allow it to accelerate euphoria-style effects without burdening the CPU.

However, it has not been successful thus far for a couple of reasons.

1) Software Implementation

Ageia develops the software API and sells the PhysX card; they have no role in its implementation. They cannot make developers utilize it effectively; most of them probably only support the card because it allows them to use Ageia's physics API for free, and once they have it for their games there is no incentive to spend additional money to use it to its fullest.

On top of that, because advanced physics can augment gameplay, you can reach a point where the PPU is required to run the game. An example of this is in CellFactor, where players can use their powers to push liquid lava towards enemies/other players; without the PPU accelerating those liquids, players' frame rates would slow to a crawl even on the best CPUs (hell, I can say that from experience). To get around this problem, developers limit the card's use to effects-only physics, allowing the physics to be scaled depending on the player's hardware without changing gameplay. But that of course reduces the merit of owning the card in the first place; the incentive to buy it is to allow previously impossible gameplay.
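The effects-only scaling described above might look something like this sketch: gameplay physics identical for everyone, cosmetic physics scaled by hardware tier. The tier names and budget numbers are invented for illustration:

```python
# Hypothetical sketch of "effects-only" physics scaling. Gameplay-relevant
# physics (ragdolls, projectiles) stay identical on every machine so play
# is consistent; only cosmetic effects scale with hardware. All tiers and
# budget figures below are made up.

EFFECTS_BUDGET = {
    "cpu_only":  {"debris": 50,   "cloth_verts": 200,  "fluid": False},
    "dual_core": {"debris": 300,  "cloth_verts": 1000, "fluid": False},
    "ppu":       {"debris": 5000, "cloth_verts": 8000, "fluid": True},
}

def physics_settings(hardware: str) -> dict:
    settings = dict(EFFECTS_BUDGET[hardware])
    # Gameplay physics are deliberately NOT in the budget table: they never
    # scale, which is why the PPU ends up adding only eye candy.
    settings["gameplay_objects"] = "full"
    return settings
```

Note how the design guarantees the card can never change gameplay, which is precisely the criticism: it removes the strongest reason to own one.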

2) Chicken and Egg

Developers will support the PPU better and utilize it in gameplay when a large enough install base justifies its implementation in a greater number of games. However, gamers refuse to purchase the card en masse until there are enough games that utilize it in an impressive manner; you can see the problem with this. The only people justifying its support in a very small number of titles are the enthusiast crowd who took a risk and jumped on board, and those who purchased high-end retail rigs that included the card.

3) Fake Physics Looks Better

We are only now seeing advanced physics in games; the focus has always been on games looking real, but now it is shifting to making them act real. However, physics processing has not reached a point where it looks as impressive as fake physics; fake physics has had many years to be tweaked and perfected, to the point that real physics cannot match it yet.

You see fake physics every time you look at a particle simulation like fire or smoke, the water falling down a waterfall, or clothing moving around a character's frame. Developers have had years to tweak techniques to make these look good; try the same in real physics and most likely the processor, CPU or PPU, will die under the strain. This causes people to complain that new physics don't appear realistic enough, and even to question the point of using them at all.

There is no question physics processing will add new levels of realistic interaction in games, but whether they will look as good as fake physics is another thing. The trick is to combine the two just right to allow the new level of physical realism while keeping the visual superiority of fake physics.

4) Hardware Power

There is no question that the PPU reduces the load on the CPU, allowing far greater real-time physics than previously possible. However, there is more to physics than just making it run fast enough in real time; there are many other performance factors to consider. For instance, in modern games every single physical object will have a mesh, normal mapping, textures, shaders, HDR lighting and so on. Take the object count in a game like CellFactor and your high-end computer is brought to its knees by the hundreds of objects being rendered on screen. This isn't just a problem for the PPU, but for all advanced physics-driven games. If you want an incredible number of physical objects on the screen, you are going to need good hardware in all areas to run it.

5) Adoption Refusal

Even if you somehow resolved all of the above problems, there are people who simply refuse to add another item to their upgrade list. I have run into these people myself; the PPU could be successful and standardized and they would still make a fuss about having to buy yet another add-on card for their PC, despite it not needing to be upgraded as often as other components.

Not only are they totally against the PPU on all levels regardless of whether it is successful, but they will go out of their way to encourage others to avoid it in hopes they can kill it off through propaganda. They understand the impact consumer perception can have on the success and failure of a product, and intend to exploit it fully in an attempt to ensure they will never have to buy yet another component for their computers. This makes them highly gullible to anything that isn't the PPU; I have seen these people fall for the most ridiculous claims simply because they didn't have PhysX in their name. They will support anything as long as it can be done using the hardware they already buy.

That is not to say you cannot be against the PPU and have valid reasons, the above-mentioned issues being some of them. But these people need to get a clue; they need to research subjects before arguing them into oblivion. For instance, I am getting sick of hearing "what is the point in getting a PPU when today's graphics cards have them built in anyway?".


----------



## ceacar99 (Jul 10, 2007)

> On top of that, because advanced physics can augment gameplay, you can reach a point where the PPU is required to run the game.



You don't need a physics processing card if you don't shove all this extra shit into a game... It's about MANAGING RESOURCES FOR YOUR ERA. One thing I have against most game developers these days is that they don't give a piece of shit whether their game is actually fun or runs efficiently. They just make it look beautiful, attach some crap that eats up CPU time and call it good...

Hell, even with a damned PPU, multiplayer games with physics on that massive scale would be laggy beyond your wildest dreams. HL2, for example, because of the way physics works in the game, is incapable of managing large amounts of physics objects at once without lagging up a multiplayer game. The only game I have played that works well with large amounts of physics interactions at once was UT2004, with the large amounts of ordnance that get flung around... Even then, if the physics has to be worked through down to every single player step, then there is no hope for a good multiplayer game.

In short, a PPU is a nice idea but utterly useless. Unless you want to be stuck in the single-player world, you'll tone down the amount of physics interactions you have in your game.

Anyway... I downloaded endorphin to see what I can do with it... I am more interested in it for the opportunity to QUICKLY rig up some highly realistic and fluid animations. Maybe I could find some volunteer voice actors and make a short? Who knows...


----------



## Bokracroc (Jul 10, 2007)

PPUs are going to come into the mainstream, there's no denying it. They used to say GPUs were a waste of time/pipe dream/useless. But a lot of the PhysX stuff has been pretty unimpressive at this current time. CellFactor wasn't all that. Now it's starting to poke its head in properly.

However, the largest problem by far is that the tech in the gaming industry is quite simply moving too fast. We barely learn how to use our current tech to its max before shoving it out the window for something new. Both consoles and PCs have this problem. I won't go on about this; it's pretty well known.


----------



## Zero_Point (Jul 10, 2007)

Well, apparently, multiple-core CPUs, which are becoming more mainstream and more affordable, are better suited for physics processing than any add-on card. The newest update to Source, which will power HL2: Episode 2, makes use of this functionality to enhance the gameplay with destructible environments such as those observed in the multitude of trailers. BUT, the problem is that apparently multi-threaded physics are a pain in the ass to develop from scratch; once it's done, though, the pay-off is spectacular.
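A minimal sketch of what "multi-threaded physics" means in the simplest case, independent rigid bodies partitioned across worker threads (real engines also need to parallelise collision solving, which is the genuinely hard part this sketch skips):

```python
# Minimal sketch of multi-threaded rigid-body integration: farm independent
# body updates out to a thread pool. Only the embarrassingly parallel part
# of a physics step; contact solving is far harder to parallelise.

from concurrent.futures import ThreadPoolExecutor

def integrate(body, dt=0.016, gravity=-9.81):
    """One semi-implicit Euler step for a single 2D body."""
    vx, vy = body["vel"]
    vy += gravity * dt               # apply gravity to velocity first
    x, y = body["pos"]
    body["pos"] = (x + vx * dt, y + vy * dt)
    body["vel"] = (vx, vy)
    return body

def step_world(bodies, workers=2):
    # Each body is independent here, so the pool can split them freely.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(integrate, bodies))

world = [{"pos": (0.0, 10.0), "vel": (1.0, 0.0)} for _ in range(4)]
world = step_world(world)
```

The "pain in the ass" part is everything this omits: bodies that touch each other can't be stepped on separate threads without careful synchronisation.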


----------



## ADF (Jul 10, 2007)

That is the reason CellFactor only allows multiplayer over a network; I recall hearing someone claim the physics bumped the data transfer rate up to 14 Mbps, far higher than any consumer Internet connection can handle. I have to say, though, that the physics isn't there simply for wow factor; the cloth maybe, but the liquids and objects are there to be used as weapons by the players, which is the main appeal of the game.

Regardless, the exact same thing would happen if you tried it with software running on a CPU; most online games rely on effects physics since they don't impact gameplay, hence it doesn't matter if the physics are different for every player and they don't have to be sent over the Internet to other gamers.

Some people say the answer to the performance issues of next-gen physics is to not use them at all, a line of reasoning which quite frankly I find silly and anti-progress. If all you want to do is frag people online then there are plenty of games that will let you do that, but there are people who would like to experience a level of realism beyond simple eye candy. We don't need better physics in the same way we don't need better visuals in games; they don't make the game experience, but they sure as hell add to it.

I personally look forward to the death of mesh clipping, water that isn't stuck to presets, wind/movement influenced hair & clothing and being able to tear down buildings to rubble. Graphics may make games 'look' realistic, but physics is what will make them 'feel' real.

[random note]

Bloody Internet! Took me ages to post this.

[edit]



			
Zero_Point said:

> *Well, apparently, multiple-core CPUs*, which are becoming more mainstream and more affordable, *are better suited for physics processing than any add-on card.*


Let me tell you right now that this is NOT the case; thinking like that is the propaganda spread by the group mentioned at the end of my issues post. Dedicated hardware will always be more efficient at a task than general-purpose hardware; why do you think the unified GPUs need so many shader pipelines compared to the last generation?

Maybe in a couple of years, when we are all running quad/octo-core processors, we will get PPU-'like' physics, but definitely not in the dual-core generation. Even then, one has to consider the effectiveness of having one or two cores sitting idle the majority of the time, not utilized, just in case a physics event occurs. I mean, is it really worth putting an entire core aside just to run a couple of high-poly cloth outfits? When we are talking about whole cores instead of percentages to run something, you have to ask if it is worth it, especially considering what else that core could be used for.

I am convinced from my findings that while CPU physics will improve dramatically thanks to multi-core processors, it will take years before PPU-style physics is practical at the performance required. Even the toned-down software versions push dual cores to their limits, with fps in the teens. Share that with other data processing like AI, game code and background streaming and you have one hell of a performance issue.

I can say from experience and personal experiments that dual-core CPUs cannot match the PPU; a quad core dedicating *everything* to physics, maybe, but I have yet to see it applied. If it could be done we would have seen Intel flaunting it by now; and yes, I have seen the Alan Wake demo. I am not impressed by those physics if it required that much power to run them.

[yet another edit]

To put even more emphasis on what I mean, check out this video.

Take note of the CPU usage on this 5600+ X2 (2.8 GHz dual core); this is just the physics being run and nothing else. It doesn't include any of the other processes that will have to run along with it, like AI, and this is just a few pieces of high-quality cloth! Can you imagine a character outfit made out of this? It would bring your processor to its knees.

Remember that on a dual core, 50% usage means an entire core's worth of performance is being used; when used on a large scale within a game, do you really think a quad core will make much of a difference? Is it really worth it for the effect? This is the cloth the PPU uses in its games; you will have to reduce its quality dramatically for it to work efficiently in a software game.


----------



## Bokracroc (Jul 10, 2007)

I'm short on time so I'll make it quick.

Bigger doesn't mean better. Just because we can have a physics card doesn't mean we need it. With the rise of multiple-core CPUs/multi-threaded whatevers that can more than likely comfortably process the newfangled physics, why must we have a physics card?
Will it end up like graphics cards, where you'll need to upgrade every year if you want to keep on top of things? The last thing PC gaming needs is another thing to constantly upgrade on top of CPUs, RAM, video cards, hard drives and all.


----------



## ADF (Jul 10, 2007)

My, you were short on time; you brought up a lot of stuff I already talked about, but nevertheless...

I will also make this quick; as before, I put continuous emphasis on the fact that CPUs *cannot* run the high-level physics in games while the PPU *can*. Now I am not saying the PPU is the future of physics processing, I have a post above on its downfalls, but if the industry plans on sticking with multi-core processors in the long term, they need to come right out and admit we won't be seeing PPU effects for a long, long time.

None of this hype propaganda about how a few dedicated cores will solve all our problems, because quad core is a long way from being standardized and even further from having games that utilize it for more than just performance boosts. Hell, even when we have quad core I still doubt it will be on a PPU level; two cores for running the game and another two for physics? How is that different from the tech demos running in software now?


----------



## Zero_Point (Jul 10, 2007)

According to what I've seen from the Episode 2 videos, multi-core technology seems to be working fine. As for tearing physics cloth, what benefit do you get for having every character clothed in that? It's like the awesome hair effects you see in the nVidia tech-demos: Sure, it looks pretty bad-ass, but I've yet to see any game where characters have hair like that. It's just a minor detail that you're hardly going to notice. As for stuff like normal physics objects, sure, a dedicated card might do it "better", but as mentioned earlier, people aren't going to pay upwards of $200 for something that, for the moment, just adds a little eye-candy. Hell, I've seen benchmarks that resulted in LOWER FPS because the graphics card had to render all the extra crap flying around.
On a side note, what's wrong with GPU-accelerated physics? The videos I've seen so far are quite impressive.


----------



## ceacar99 (Jul 10, 2007)

> being able to tear down buildings to rubble



Play Red Faction... It's been soooooo long since I played that treasure that I don't remember how multiplayer performed, but the fact that the entire world (if told to) could be blown up was a serious bonus. For gameplay purposes some objects in the game were made indestructible, but otherwise you were free to blow up what you wanted. I remember flying a jet thingy and blowing down a tower by blasting its base.

There are also systems like the one in Silent Storm which don't use regular physics, but do sit there and calculate the structural integrity of the building. If too much support is lost, then the building parts requiring that support are destroyed. It's actually pretty cool in that game, taking an MG and mowing people down through the floor or walls, and in the process tearing a hole for your soldiers to advance through...
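That kind of integrity check can be sketched as a simple reachability test over a support graph, no full physics needed. The structure and part names below are hypothetical, just to show the idea:

```python
# Rough sketch of a Silent Storm-style integrity system: track which parts
# are still connected (through their supports) to the ground, and collapse
# everything else. All part names here are made up for illustration.

def collapse_unsupported(parts, supports, grounded):
    """parts: set of part ids still intact; supports: dict part -> the parts
    it rests on; grounded: parts attached directly to the ground.
    Returns the set of parts that remain standing."""
    standing = set()
    frontier = [p for p in grounded if p in parts]
    while frontier:
        p = frontier.pop()
        if p in standing:
            continue
        standing.add(p)
        # any intact part resting on p is (transitively) supported
        frontier.extend(q for q in parts if p in supports.get(q, ()))
    return standing  # everything not returned falls

parts = {"foundation", "wall", "floor2", "roof"}
supports = {"wall": {"foundation"}, "floor2": {"wall"}, "roof": {"floor2"}}
# Shoot out the wall: the floor and roof above it lose their support chain.
survivors = collapse_unsupported(parts - {"wall"}, supports, {"foundation"})
```

Mowing a hole through a wall then simply removes parts from the set; whatever can no longer reach the ground comes down.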



> I can say from experience and personal experiments that dual core CPUs cannot match the PPU; a quad core dedicating everything to physics maybe but I have yet to see it applied



Erm... look, a physics card is just basically a 500 MHz processor... It's not that fancy, nor is it irreplaceable. Instead of having that dumb card, you have a processor with 500 MHz more power (or free up that power in the requirements of the game). Done, no problem. Now, the reason video cards became important is something called advanced texture and lighting, and eventually something called DirectX. The fact is that the video card comes with tools built in to handle advanced graphics processes like pixel shaders; a CPU doesn't. However, all physics is, is just plain math. You don't need some fancy firmware system like pixel shaders are for video cards; you just plain need something that will do the math, that's it.
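For a sense of how much "plain math" that actually is, here is a naive sphere-overlap check. The point isn't any one operation being fancy; it's the quadratic pair count that eats processors:

```python
# The "just plain math" in question: even naive collision detection is an
# O(n^2) pile of arithmetic. A toy sphere-vs-sphere broad phase, for scale.

def colliding_pairs(spheres):
    """spheres: list of (x, y, z, radius). Returns index pairs that overlap."""
    pairs = []
    for i in range(len(spheres)):
        x1, y1, z1, r1 = spheres[i]
        for j in range(i + 1, len(spheres)):
            x2, y2, z2, r2 = spheres[j]
            d2 = (x1 - x2) ** 2 + (y1 - y2) ** 2 + (z1 - z2) ** 2
            if d2 <= (r1 + r2) ** 2:   # compare squared distances: no sqrt
                pairs.append((i, j))
    return pairs

# 1000 objects -> ~500,000 distance checks per tick; at 60 ticks a second
# that's ~30 million checks before any actual collision response runs.
```

So whether a card or a spare core should do that math is exactly what the thread is arguing about; the volume of it is not in dispute.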



> Let me tell you right now that this is NOT the case, thinking like that is the propaganda spread by the group mentioned at the end of my issues post. Dedicated hardware will always be more efficient at a task than general purpose, why do you think the unified GPUs need so many shader pipelines compared to the last generation?



Erm... a multi-core CPU CAN have hardware dedicated to the task if it requires it. You can have one of the cores in the CPU dedicated to the physics if you wish it to be so...

Look, you're under the delusion that these PPUs are REALLY powerful and all that stuff. THEY AREN'T THAT FANCY. They are just about 500 MHz, that's all. You're fooling yourself with the damned placebo effect by thinking they make a difference when you plug one in. Listen to me, I've been working in game development (as a modder) since the days of Team Fortress Classic. I've modded probably 20 different games, including some that people think are impossible to mod, like Empire Earth. Listen to my words when I tell you that such a card is pointless and just a way to get you to buy more crap.

Bottom line, if you have a nice processor then you can handle the physics of ANY system, no problem. You're just fooling yourself if you think that little card makes that big of a difference.


----------



## ADF (Jul 10, 2007)

Zero_Point said:

> According to what I've seen from the Episode 2 videos, multi-core technology seems to be working fine.


Define 'working just fine', working being the opposite of 'not working' or 'broken'. I am sure the physics in Half-Life 2 Ep 2 works just fine, but I am more interested in what level it works at and how that compares to next-gen physics. In other words, I am asking for examples of the physics in HL2 Ep 2; YouTube mostly turns up trailers with little focus on physics. If they are anything like what I have read about, then we are mostly talking particle dynamics and additional rigid bodies, but I will need to look into it more before commenting.



			
Zero_Point said:

> As for tearing physics cloth, what benefit do you get for having every character clothed in that? It's like the awesome hair effects you see in the nVidia tech-demos: Sure, it looks pretty bad-ass, but I've yet to see any game where characters have hair like that. It's just a minor detail that you're hardly going to notice.


Whoa, no no no! I am not referring to damageable cloth on all characters; damageable cloth has a much higher poly count than normal physical cloth, and that would be overkill. However, the cloth you are seeing in the provided video is still much higher poly than anything you have seen in today's games, allowing it to fold and swirl like real cloth instead of like a stiff mattress. Even if software games don't need to use it at that level, multiple characters on screen using physics clothing will pose the same performance issue.

As for why use it: why use HDR? It is such a performance-intensive effect, yet it doesn't really have a large impact on the game; the same goes for all the little shader effects going on in the background. Is the difference between a brick-textured wall and one with normal mapping so big that it is worth the resources used? The fact is, all these little effects add together towards the overall realism of the game.

The same with hair; a character can either have a large textured lump attached to the back of their head, or free-flowing strands that move with their body language and sway in the wind. It has a psychological impact: something that is dynamic and ever-changing will keep the player interested longer than something static. It is the difference between a forest with and without SpeedTree. Ageia is actually helping to develop a game, called Heavy Rain I think, that uses physics hair; check out the released video to see it in action.



			
Zero_Point said:

> As for stuff like normal physics objects, sure, a dedicated card might do it "better", but as mentioned earlier, people aren't going to pay upwards of $200 for something that, for the moment, just adds a little eye-candy. Hell, I've seen benchmarks that resulted in LOWER FPS because the graphics card had to render all the extra crap flying around.


First off, I take it those benchmarks are from GRAW and City of Villains; I expect that because not only are they horrible implementations of the PPU that get brought up in every argument against it, but they had to be patched multiple times just so a PPU didn't drop performance. Both of those games had the PPU patched in, and it caused all sorts of problems.

Secondly, judging by the post made by ceacar99 while I was typing this, it looks like this has turned into another 'me defending the PPU against people criticizing it' thread. This seems to happen to me a lot, so let me make something very clear.

**I AM NOT SUPPORTING THE PPU!**

I am not an Ageia fanboy, I don't own a PPU, I don't plan on owning a PPU and I have already gone over the reasons why it is not viable in the market. My references to the PPU are for comparison purposes only, discussing why it could have been successful compared to the other options.

I will tell you right now that in terms of effectiveness and raw power today, the PPU will kick the butt of both CPU and GPU physics, and I will stand by that. But just because I say that doesn't mean I think everyone will have a PPU in their computers a few years from now; there is nothing wrong with the technology, and it has proven itself many times. It is its inability to become integrated into the market that will make it fail.



			
Zero_Point said:

> On a side note, what's wrong with GPU-accelerated physics? The videos I've seen so far are quite impressive.


Oh, don't get me started; I could write paragraphs on how much I think GPU physics should die a horrible death. Let's just say it solves one problem by creating an even bigger one, and it exists more to fix slumping sales of high-end GPUs than to meet the needs of the market. Hell, I would rather have CPU physics than GPU computing.


----------



## Bokracroc (Jul 10, 2007)

ceacar99 said:

> > being able to tear down buildings to rubble
> 
> 
> 
> play red faction.... its been soooooo long since i played that treasure that i dont remember how multiplayer performed but the fact that the entire world(if told to) could be blown up was a serious bonus. for gameplay purposes there are some objects in the game that have been made indestructable, but otherwise you are free to blow up what you want. i remember flying a jet thingy and blowing down a tower by blasting its base



You totally weren't playing Red Faction. At most you could dig holes in the ground every now and then. You could blow a hole in a cement floor, but the partition board in an office was indestructible.
In multiplayer you could tunnel quite a bit, but that's it. You couldn't take down buildings. It wouldn't let you blow open a locked door or the surrounding wall, so you had to find the key. It was pretty minor stuff.


----------



## Zero_Point (Jul 11, 2007)

ADF said:

> Define 'working just fine', working being the opposite of 'not working' or 'broken', I am sure the physics in Half-Life 2 Ep 2 works just fine but I am more interested to what level they work at and how that compares to next gen physics. In other words I am asking for examples of the physics in HL2Ep2, YouTube mostly turns up trailers with little focus on physics. However if they are anything like what I have read about then we are mostly talking particle dynamics and additional rigid bodies, I will need to look into it more before commenting on it.



Oh, it makes quite effective use of it, mostly graphics-wise, but also in physics (both dynamic and cinematic) and AI processing. Just watch all the gameplay trailers to see what I'm talking about. There are also some "benchmark" videos floating around that show off particle physics and enhanced rain effects. Quite neat to watch.



> Whoa no no no! I am not referring to damageable cloth on all characters, damageable cloth has a much higher poly count than normal physical cloth, that would be overkill. However the cloth you are seeing in the video provided is still much higher poly than anything you have seen in todays games, allowing it to fold and swirl like real cloth instead of like a stiff mattress. Even if software games don't need to use it to that level, multiple characters on screen using physics clothing will pose the same performance issue.



Okay... Then why use it? If truly realistic cloth requires a high-end graphics card to render it anyway, might as well do some physics calculations on it, too. 



> As for why use it, why use HDR? It is such a performance intensive effect yet it doesn't really have a large impact on the game, same goes for all the little shader effects going on in the background. Is the difference between a brick textured wall and one with normal mapping that big that it is worth the resources used? It seems the fact is all these little effects add together towards the overall realism of the game.



The shader effects are usually used in eye-candy scenarios and levels where the player can take their time to look around and go "Ooh, wow!". As for HDR, I think people are going overboard with it (that VGCats comic comes to mind... "SHAZAAM!" *eye 'splosion*). BUT, it's also applied universally, not just to a lone flag here and a table-cloth there.



> The same with hair: a character can either have a large textured lump attached to the back of their head, or free-flowing strands that move with their body language and sway in the wind. It has a psychological impact; something that is dynamic and ever-changing will hold the player's interest longer than something static. It is the difference between a forest with and without SpeedTree. Ageia is actually helping to develop a game, Heavy Rain I think, that uses physics hair; check out the released video to see it in action.



Most games these days where such dynamics are involved are usually FPS titles, in which you're too busy running and gunning to notice whether or not your team-mate's hair is messed up. Also, physics hair can be done just as well on a mid-range card; no need to pay $200 for an add-on card just so I can see hair swaying in the wind.



> First off, I take it those benchmarks are from GRAW and City of Villains; I expect that because not only are they horrible implementations of the PPU that get brought up in every argument against it, but they had to be patched multiple times just so a PPU didn't drop performance. Both of those games had the PPU patched in after release, and it caused all sorts of problems.



Actually, those benchmarks were from one of the first games to make use of PhysX: Ghost Recon. While the physics look impressive, there was a hefty frame-rate drop because the graphics card had to render all the extra stuff flying around. It wasn't the physics lagging the game.



> Oh, don't get me started; I could write paragraphs on how much I think GPU physics should die a horrible death. Let's just say it solves one problem by creating an even bigger one, and it exists more to fix slumping sales of high-end GPUs than to meet the needs of the market. Hell, I would rather have CPU physics than GPU computing.



In order for me to consider your argument against GPU physics, I would suggest writing said paragraphs so I can see where you're coming from. So far all I can gather is that you think the GPU manufacturers are trying to con us into buying the high-end cards. As for performance, I see no issues with it. The videos and demos I've seen at least show that they're capable of calculating several hundred, if not several THOUSAND objects at once in a high-detail environment. A friend of mine claims that these objects "exist in a different physics environment". Even if they do, that's the answer to emulating those "fake physics" you were mentioning earlier. You know, the ones you said weren't really important but made for good eye-candy, like waterfalls and stuff?


----------



## OnyxVulpine (Jul 11, 2007)

Wahh, what am I going to learn when I take some course in video games? And is it hard? >< All this talk about physics and animation has got me worried cuz it looks real tough.


----------



## ADF (Jul 11, 2007)

First off, let me start by saying that Ghost Recon is GRAW; look at the name: *G*host *R*econ *A*dvanced *W*arfighter. The additional objects flying around being the cause of the frame rate drops is also a common misconception; it was actually invented by the pro-Ageia crowd as an excuse to draw the performance drop issues away from the PPU. The drops were caused by the horrifically bad implementation; the game is famous amongst the PPU crowd as being the worst one out there. Even Bet on Soldier, a game that also had the PPU patched in, got better performance results for its new effects than GRAW.

With that aside, if you really want to know what 'I' consider wrong with GPU compute physics, then I suppose I can take the time to tell you. However, don't say I didn't warn you. You can either read the needlessly detailed and long but quite informative version, or skip to *In Short* at the end, which covers my opinions on the tech.

*First off, let's go over how GPU physics came to be, how it actually works and how the market was manipulated into favoring it over the competition.*

They discovered that the pixel pipelines on GPUs, which are used to accelerate shader effects within games, are very good at crunching numbers; so good, in fact, that certain workloads run many times faster than on the CPU. Hence GPU computing was born: software designed to take advantage of the GPU's number crunching to greatly accelerate applications. And hell, why not? It is not like your GPU is actually doing anything while you go about your daily tasks; might as well put it to work to get stuff done faster. The GPU companies of course didn't complain, as it meant uses for GPUs outside of games and CAD.

The technique itself is still in its early stages, only just being advertised in business environments to improve productivity.

But then along comes Ageia with the dedicated physics processing unit, and all of a sudden the gaming communities are alive with discussion about physics. No one had seen a leap in performance like this before; it was literally several generations beyond anything we had been seeing in games at the time. This came as a great threat to the CPU companies; they had already lost to the GPU companies many of the jobs that justified the need for more CPU power, and should the PPU become successful it would mean another big chunk out of their business. Try as they might, they couldn't compete with the PPU; multithreading was their answer, but it is still years from being effective enough to compete. Software companies like Havok also recognized the threat: they simply could not compete with Ageia using software physics.

But then we have this thing called GPU computing. Havok needed an edge, and Nvidia is always open to whatever will boost GPU sales, especially at the upper end when mid-range GPUs can run anything at decent settings. So they teamed up to create Havok FX, a GPU-accelerated physics API. The idea was simple: utilize some of the GPU's shader pipelines to crunch the math needed to run physics. It was an especially good time to work on this, thanks to unified shader GPUs offering over a hundred pipelines to utilize. However, it would take time to develop the technology, and in the meantime Ageia already had titles on the market and was gaining market share. So what could they do to deter PhysX adoption?

Many technical demonstrations were leaked to the public, showing tens of thousands of objects colliding in real time. This had a stunting effect on the PPU: gamers put off adopting it because of the promise of something better around the corner, something they were told they would be able to enjoy without spending a penny. Fans started calling it free physics, running next-gen physics on existing hardware. This saying evolved over time into the misinterpretation that Havok FX didn't cost the user anything, either financially or performance-wise; an almost magical unlocking of untapped resources on the GPU to allow great-looking games with PPU-level physics.

This of course only fueled the hype, leading many people to support Havok FX over the alternatives, despite a good deal of them not having all the facts straight. When the unified shader GPUs were released claiming to have physics support, this only increased the confusion; GPUs do not require specific hardware to run Havok FX. But people believed that only these graphics cards had the support and purchased them believing they now had a physics processor built into their graphics card, invalidating any need for the PPU.

Despite all this, however, GPU physics still only existed in tech demo form; there were no titles announced to support the technology. On top of this, none of the tech demos were ever made available to the public for testing; Havok kept very tight control over who could test the API. The only GPU-accelerated physics demos that became available were released by independent developers, while in the meantime Ageia let anyone play with their PhysX API. This led some people to believe Havok had something to hide: if the technology was so great, why keep such tight ropes on it? Public testing of such a great technology would only lead to greater enthusiasm.

*Now let's look at what further digging brought up about the technology, plus how it will be used by the consumer.*

With all the possibilities aside, a limitation was discovered within the design of all GPUs, something that comes from trying to get a piece of hardware to perform a task it was never intended to do. Graphics cards were never meant to perform the tasks physics requires; there was never a need to. The only job of the GPU was to make games look better, to accelerate graphics and add textures and shaders. It was never meant to actually run game logic; that was the API's job. So when the math being run on the shader pipelines makes several objects move together, the only way to detect and resolve a collision is to do it in the API; in other words, in software on the CPU.

This is a significant limitation: GPU physics cannot handle collision events without the software doing all the work. This means whatever the performance limitations of today's software physics are, they are inherited by GPU physics; if a CPU can handle 3000 convex collisions per second, then the same is true of Havok FX. So Havok FX is not really complete GPU computing, but rather a hybrid between the GPU and CPU. But what does this mean to the gamer? It means games like CellFactor, as well as any other games that would use physics gameplay at a next-gen level, are not possible with Havok FX.
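The hybrid split described above can be sketched in a few lines. This is purely illustrative (not Havok FX's actual API), but it shows why the serial collision pass, not the parallel integration, is what caps the body count:

```python
# Hypothetical hybrid physics frame: integration is independent per body
# (the kind of work a GPU's shader pipelines parallelize well), while
# collision detection is pairwise work that, in the scheme described
# above, falls back to the CPU.

def integrate(bodies, dt=1.0 / 60.0, gravity=-9.8):
    """Per-body step: embarrassingly parallel, GPU-friendly."""
    for b in bodies:
        b["vy"] += gravity * dt
        b["x"] += b["vx"] * dt
        b["y"] += b["vy"] * dt

def collide(bodies):
    """Brute-force pair tests: the serial CPU-side bottleneck."""
    tests = 0
    for i in range(len(bodies)):
        for j in range(i + 1, len(bodies)):
            tests += 1  # a real engine would test and resolve the pair here
    return tests

bodies = [{"x": float(i), "y": 10.0, "vx": 0.0, "vy": 0.0} for i in range(100)]
integrate(bodies)
# 100 bodies -> 4950 pair tests per frame on the CPU, however fast the GPU is.
print(collide(bodies))
```

However many shader pipelines you throw at the integration step, the CPU side of this loop grows quadratically with interacting objects, which is the limitation being described.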

So this is quite ironic: on one hand the PPU can accelerate new levels of physics gameplay, but doesn't have the install base to justify it; on the other, the GPU does have the install base to justify using it for gameplay, but not the ability to actually do it.

So Havok FX cannot compete with the PPU in gameplay physics; it can have a massive number of rigid body actors on screen, but they are mostly just for show. So how do you market this while claiming the PPU is not needed? Well, 'affect physics' goes out the window and 'effect physics' is the new keyword. Havok FX may not be able to add new levels of gameplay to games, but it can make them pretty, very pretty. So it seems the GPU is getting back to basics: making the game look prettier rather than play differently.

So those are the capabilities of Havok FX, but how does it handle on the performance side? All the demos released so far have had the entire GPU to use for nothing but physics; in some cases they even had two GPUs to run the demo. This gives a very misleading impression of the type of effects people will be seeing in their games. In real games the physics will be sharing the GPU with all the other visuals, requiring the user to take resolution and quality settings down a few notches. So thousands of objects falling down a hill becomes hundreds; any more will require significant reductions in visual quality to free up resources. For example, look at this and this Havok FX demo: impressive physics, but take note of the game itself. Very minimal use of shaders; even the shadows cast by the environment have no impact on player shading. You also have to consider that the physics alone requires GPU power to render the additional objects on screen, so that is even more strain on an already taxed graphics card.

Another example is the only Havok FX supporting game announced to date, Hellgate: London, which just so happens to be widely criticized as a DX10 game that looks more like DX9. All those additional physics objects flying around the scene probably forced them to develop the game with lower than standard visuals; so even if someone paid out for a GPU powerful enough to run high graphics and Havok FX, the mere existence of the physics has impacted the game's quality.

Of course, Havok and Nvidia have already considered the effects of GPU physics on normal game visuals, hence SLI physics. The idea is to make two GPUs a standard in every gaming system; unlike SLI graphics, however, the GPUs do not have to match. The gamer could, for instance, have an 8800GT rendering the game's graphics and an 8600GT performing the physics. The convenient part is that since effect physics are scalable, gamers without the income to afford dual GPUs can either scale down their graphics/physics to compensate on their single-GPU system or turn off GPU physics completely.

*In Short...*

I do not like GPU physics because it not only fails to provide the gameplay-affecting physics I look forward to, but turns tomorrow's physics into something that can only be obtained by those able to afford it.

Today's CPU physics targets the lowest common denominator, allowing high-end users to get better performance while still allowing the budget crowd to play physics-intensive games. Even Ageia, despite wanting the PPU to succeed, still fully threads their applications to take advantage of today's multi-core CPUs, allowing non-PPU users to enjoy the physics they put in games at some level. Havok FX will change that: either you pay the big bucks or you miss out, while Nvidia rakes in plenty of profit from this new justification for their high-end and SLI setups.

I don't know about you, but I already spend quite enough on this computer, and I don't want next-gen physics to turn into a luxury of the high-end crowd. You may argue: isn't the PPU the same? Keeping in mind that I am aware of its market penetration difficulties, PPU upgrades are more like every 3-4 years than every 18 months, and it doesn't require you to upgrade multiple components, unlike SLI, which requires a high-end power supply and CPU. Everyone gets the same high level of physics performance, not just the enthusiast crowd.

If the PPU dies from poor adoption and support, I would rather physics advance at a snail's pace with multi-core processors than with the too-expensive fake physics of Havok FX. At least then GPU power will be given the chance to catch up so it can render all the additional objects that high-level physics creates.

END

... Agh jeez, look at the size of this thing; I have first-year university assignments this long! I did warn you I would go on and on about it, oh well, apologies. Also apologies for errors due to lack of proper proofreading; hey, can you blame me? If you have any questions on the above, things I wasn't clear on and such, just ask.


----------



## Rostam The Grey (Jul 11, 2007)

LOL, I'm reading all this like "It's just an extra 500MHz processor..." and "you don't need all that crap", etc. etc. And thinking... That's like saying a video card is just an extra processor... NO! Video and physics cards are processors with functionality built in. Does this mean you need them? No. But what you get is an additional processor with the most effective and efficient functionality. Sure, you could figure out how to do all the physics yourself, and it might be semi-realistic. Just like you could figure out how to do all the graphics yourself, and they might look slightly good... Seriously, everything in computers is getting more and more compartmentalized; that's one of the reasons games are getting better and looking more realistic. Sure, fun is important, but graphics help a LOT. I don't care to see block figures when I can see realistic-looking figures. Just as I wish I had an FPS game where I could warp the landscape in a manner that would allow me to create traps and hiding spots...


----------



## Silver R. Wolfe (Jul 11, 2007)

Future of physics and animation?  Check out some of the animations in Uncharted: Drake's Fortune.  Utterly amazing.

http://gamevideos.com/video/id/11515

Naughty Dog (the developer) went on record to say that the main character has over 3000 different animations. These animations are used in a 'layered animation' system which allows you to stack them, creating unique animations on the fly.
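The 'layered animation' idea can be sketched as weighted per-joint blending. This is a hypothetical illustration only; the joint names, weights, and blend rule are mine, not Naughty Dog's actual system:

```python
# Each layer is a pose (joint -> angle) plus a weight. Stacking layers and
# normalizing per joint yields poses no single canned animation contains,
# e.g. "running" legs combined with "aiming" arms.

def blend_layers(layers):
    """Blend stacked (pose, weight) layers by normalized per-joint weight."""
    weighted, totals = {}, {}
    for pose, weight in layers:
        for joint, angle in pose.items():
            weighted[joint] = weighted.get(joint, 0.0) + weight * angle
            totals[joint] = totals.get(joint, 0.0) + weight
    return {joint: weighted[joint] / totals[joint] for joint in weighted}

run = {"hips": 10.0, "arm": 40.0}   # full-body locomotion layer
aim = {"arm": -20.0}                # upper-body layer touches only the arm

pose = blend_layers([(run, 0.5), (aim, 0.5)])
print(pose)  # hips come purely from "run"; the arm is a 50/50 mix
```

With 3000 source animations to stack this way, the number of distinct on-screen poses effectively explodes, which is presumably the appeal of such a system.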


----------



## ADF (Jul 11, 2007)

I have to wonder where people get that idea from, how they somehow think the PPU is just a 500MHz processor and some memory slapped on silicon. Hell, if it were just that I would be very impressed; think about it: looking at the CPU utilization in the video I posted earlier, the very idea of the same thing plus more running on such a low-clocked CPU is ridiculous. There are actually documents out there that explain how the hardware in the PPU is configured to better accelerate physics, documents I should find and reference if I plan on participating in these discussions.


----------



## ceacar99 (Jul 11, 2007)

here's the scoop on the program that started this thread....

i downloaded it and familiarized myself with how it works and i have to say it's downright awesome for some things. however it's also incredibly frustrating. the source of this frustration is that at this point i don't believe you can actually animate your character in the program. you can apply behaviors, or forces upon the target, but not skeletal animations. you have to make the animation in a third party program like max or maya and import it before you can make your character do such things as run, throw a punch or whatever. it's a nice program but terribly inefficient right now....


----------



## Zero_Point (Jul 11, 2007)

That's why it's just the Learning Edition. It's basically just a physics toy right now.

It's funny though, when I first heard of the PPU, my friends and I were joking about how soon there'll be an AI card and such. And I appreciate you typing all that stuff out ADF, it brought up points that I didn't think of before and clarified that which I did know. BUT, there is the possibility of dedicated physics calculation hardware being built into things like graphics cards and even motherboards. IF the technology becomes cheap enough to integrate it into motherboards I'm sure it will really take off, but until then, last I checked they used the PCI bus, and PCI slots are becoming fewer and fewer on modern motherboards. And besides, as you said, I've spent enough money on this machine. It was $1800 without the OS, case, HDDs, monitor, keyboard and all that other stuff. That was just for the mobo, CPU, GPUs, RAM, and an IDE card, because finding more than one IDE connector on a high-end mobo is also difficult. Add the case and cost value of the drives and OS and that puts it up to over $2300. It was top-of-the-line back in Nov. when I put it together. Paying $200 for a PhysX card so I can have better physics on games I have no interest in playing didn't seem worthwhile to me. The ONE game I have that MIGHT support PhysX is Rise of Legends, and I haven't played that in a long time.
As for the argument that PhysX isn't supported by game devs: why not? Why not just use Novodex in your game anyway? It's fast, accurate (more important to me than fast. Blame GMod), AND allows you to support PhysX cards. So really it's the game devs' fault.


----------



## ceacar99 (Jul 11, 2007)

the "learning edition" bit about the program mainly means that you cannot export.

the issue with animations is that stock you don't have ANY skeletons, just "physics collision boxes" or something like that... so literally the platform for a real animation doesn't exist at that point. in order to use an animation in your physics sequence you have to import a skeleton and mesh (along with making the physics dummy fit the mesh) as well as the animation file for that mesh. basically it gets kinda complicated. i know how to do it, i just think it's silly to have to go through all that when they could just rig up a skeleton with their physics dummies to begin with....


----------



## ADF (Jul 12, 2007)

First off, before I address your post, let me clarify that I am not trying to convince anyone to buy a PPU. I know I am getting repetitive with my disclaimers, but more often than I would like, problems have been caused by people's misinterpretation of my role in the thread.

Now then...



			
Zero_Point said:

> It's funny though, when I first heard of the PPU, my friends and I were joking about how soon there'll be an AI card and such. And I appreciate you typing all that stuff out ADF, it brought up points that I didn't think of before and clarified that which I did know. BUT, there is the possibility of dedicated physics calculation hardware being built into things like graphics cards and even motherboards.


You are right, that is a possibility; it is also possible for a sound card or even a general-purpose processor to be strapped onto the GPU, practically turning it from a graphics processing unit into a gaming processing unit. However, the industry has not gone in that direction; they have decided on GPU computing. I imagine cost has played a big part in their decision: if we start configuring GPUs to play all these other roles, it will not only dramatically increase the retail price but also the failure rate during manufacturing. Dividing their efforts would be costly and more than likely outdone by specialist hardware.

You also have to consider the range of consumers such products will appeal to; motherboards, for instance, are used for all sorts of purposes. Up the price and turn it into a gaming-configured motherboard and you limit your market.

I think the all-in-one processing unit is still a few years off; Intel has one in the works with 60+ cores on a giant bus, but you can probably imagine that is nowhere near out of beta. I personally prefer to hand-pick my components rather than hand that much control over my computer to one company.



			
Zero_Point said:

> IF the technology becomes cheap enough to integrate it into motherboards I'm sure it will really take off, but until then, last I checked they used the PCI bus, and PCI slots are becoming fewer and fewer on modern motherboards.


There is actually a PCI-E version in the works that got a hefty die shrink and price drop; I doubt many people would miss one of those slots being used. However, Ageia has been promising this revised PPU for, like, forever! It has been coming out for the past half year; it will be out eventually, but this is getting ridiculous.



			
Zero_Point said:

> And besides, as you said, I've spent enough money on this machine. It was $1800 without the OS, case, HDDs, monitor, keyboard and all that other stuff. That was just for the mobo, CPU, GPUs, RAM, and an IDE card because finding more than one IDE connector on a high-end mobo is also difficult. Add the case and cost value of the drives and OS and that puts it up to over $2300. It was top-of-the-line back in Nov. when I put it together. Paying $200 for a PhysX card so I can have better physics on games I have no interest in playing didn't seem worthwhile to me. The ONE game I have that MIGHT support PhysX is Rise of Legends, and I haven't played that in a long time.


I can see your point; we already spend quite a lot of money on our systems. The PPU has dropped to around $160/£90, but it is still a bit too expensive for what it gives right now. However, will the GPU option really be cheaper?

First off, you need a really high-end GPU to handle both the graphics and the physics, and even then graphics performance will still take a hit. Or you can buy an SLI system and have a second GPU handle the physics; as well as the more powerful power supply to fuel both cards, the high-quality CPU to remove the bottleneck, the high-speed memory to aid the powerful CPU and so on. In the end, the cumulative costs of a Havok FX system are far greater than a PPU; if you have a mid-range setup, you might as well not even bother.

Of course, this only addresses the people who actually want high-end physics. Those who don't care can simply stick with their current setups. That is, if Havok FX gives players the option of turning it off; it may become mandatory, unlike the PPU.



			
Zero_Point said:

> As for the argument that PhysX isn't supported by game devs: why not? Why not just use Novodex in your game anyway? It's fast, accurate (more important to me than fast. Blame GMod), AND allows you to support PhysX cards. So really it's the game devs' fault.


You know the game BioShock? Being based on Unreal Engine 3, it already had the Ageia physics API built in. Even if they didn't want to support the PPU, they already had a physics engine. But you know what they did? They tore out the engine and paid the hefty $250,000+ price tag to license Havok.

Many successful multi-platform games have used the Ageia physics engine; even if you don't add PPU support, the PhysX API is significantly cheaper than anything Havok offers.

So why did they do this? Perhaps game companies are just comfortable with Havok and don't want to risk a different company. The entire industry is used to Havok being 'the' physics API in gaming; trying anyone else could risk development time and costs. Only a few companies have really taken the plunge and tried the PhysX API, but it seems those who can afford it just throw licensing fees at Havok as usual.

The only company that has really backed Ageia, and not just because of financial constraints, is the one behind the Unreal Engine. With OpenGL and Linux on their support list, they seem to have a thing for the little guy.


----------



## Zero_Point (Jul 12, 2007)

ADF said:

> You are right, that is a possibility; it is also possible for a sound card or even a general-purpose processor to be strapped onto the GPU, practically turning it from a graphics processing unit into a gaming processing unit. However, the industry has not gone in that direction; they have decided on GPU computing. I imagine cost has played a big part in their decision: if we start configuring GPUs to play all these other roles, it will not only dramatically increase the retail price but also the failure rate during manufacturing. Dividing their efforts would be costly and more than likely outdone by specialist hardware.
> 
> You also have to consider the range of consumers such products will appeal to; motherboards, for instance, are used for all sorts of purposes. Up the price and turn it into a gaming-configured motherboard and you limit your market.
> 
> I think the all-in-one processing unit is still a few years off; Intel has one in the works with 60+ cores on a giant bus, but you can probably imagine that is nowhere near out of beta. I personally prefer to hand-pick my components rather than hand that much control over my computer to one company.



Only a hard-core gamer would want a PPU anyway, so "limiting the market" isn't really valid there. I've yet to see even a mid-range rig built with a $20 motherboard that's barely suitable for an office machine.
That 60+ core CPU Intel is working on is actually for servers, IIRC.




> There is actually a PCI-E version in the works that got a hefty die shrink and price drop; I doubt many people would miss one of those slots being used. However, Ageia has been promising this revised PPU for, like, forever! It has been coming out for the past half year; it will be out eventually, but this is getting ridiculous.



I definitely wouldn't miss one of them being used. I have NOTHING in those slots as I've found NOTHING that uses it.



> I can see your point; we already spend quite a lot of money on our systems. The PPU has dropped to around $160/£90, but it is still a bit too expensive for what it gives right now. However, will the GPU option really be cheaper?
> 
> First off, you need a really high-end GPU to handle both the graphics and the physics, and even then graphics performance will still take a hit. Or you can buy an SLI system and have a second GPU handle the physics; as well as the more powerful power supply to fuel both cards, the high-quality CPU to remove the bottleneck, the high-speed memory to aid the powerful CPU and so on. In the end, the cumulative costs of a Havok FX system are far greater than a PPU; if you have a mid-range setup, you might as well not even bother.



Actually, for my rig, I've already got everything you've mentioned there, minus the PPU, so all it would do is make my system cost more than it does.



> You know the game BioShock? Being based on Unreal Engine 3, it already had the Ageia physics API built in. Even if they didn't want to support the PPU, they already had a physics engine. But you know what they did? They tore out the engine and paid the hefty $250,000+ price tag to license Havok.
> 
> Many successful multi-platform games have used the Ageia physics engine; even if you don't add PPU support, the PhysX API is significantly cheaper than anything Havok offers.
> 
> ...



Devs like Epic Games are smart. Methinks the only reason anyone would pay an extra $250,000 for an "inferior" physics API is some licensing ass-hattery at work behind the scenes, like if they're sponsored by nVidia or something. In which case, I welcome you to the wonderful world of capitalism!


----------



## ADF (Jul 12, 2007)

Zero_Point said:
			
		

> Only a hard-core gamer would want a PPU anyway, so "limiting the market" isn't really valid there. I've yet to see even a mid-range rig built with a $20 motherboard that's barely suitable for an office machine.
> 
> That 60+ core CPU Intel is working on is actually for servers IIRC.


Tell that to some of the anti-PPU crowd; while there are different groups with varying opinions on the subject, there are those who believe that in the future there will be no sound cards or GPUs or whatever. They believe everything will run on a massive parallel processor that can dedicate, say, 30 cores to graphics, 7 cores to physics, 3 cores to sound and so on. While I can see the design being possible in theory, especially for certain markets, it will be at least a decade before it is affordable for the average PC owner, and even longer before all software shifts to supporting it.

That is the problem with grand ideas: getting them to work in hardware is one thing, actually getting the software support is another, and Ageia knows that more than anyone. That doesn't stop people from using the idea against the PPU, however. I have to wonder how they can even rationalize that technology we won't see for years is a threat to present tech.

Regardless, with that aside, it is true that only the hardcore market will want a PPU to begin with. It is the same way in that we don't need an 8800GTX to run our games; but there are people who will pay extra to fix those small imperfections, and of course those who won't. With that in mind, if PPU-style processors get attached to GPUs, why should everyone have to pay for them? Every GPU would have to include one to build the install base needed to encourage developer support; unlike graphics, gameplay physics cannot simply be turned off without changing the game experience between players.

Nvidia priding itself on being a dedicated GPU company also reduces the possibility; I recall they made fun of ATI for merging with AMD. GPU computing makes more sense for such a company than slapping a physics-configured processor onto the side of their already-too-big graphics cards.



			
Zero_Point said:
			
		

> Actually, for my rig, I've already got everything you've mentioned there, minus the PPU, so all it would do is make my system cost more than it does.


Your rig, maybe, but SLI is for the enthusiasts among enthusiasts; even high-end gamers are completely satisfied with one GPU. You would have to be playing at 1080p-and-above resolutions with full AA to justify a second GPU. Don't get me wrong, my system is fully SLI-ready with the exception of a second graphics card, but such systems do not make up the bulk of the market. While enthusiast hardware is produced, it is mainstream tech like the 8600 that accounts for the majority of customers.

Keeping this in mind, even if you are SLI-ready it doesn't mean you can afford to pay out for a second GPU, just for physics at that. If Nvidia gets their way, the enthusiast setup of today will become the standard setup of tomorrow; I don't think they have considered the impact this will have on the mainstream market. Even if it only exists for high-end gamers wanting a bit of additional physics eye candy, why on earth should developers pay extra to license Havok FX for a minority of their players? The same could be said for the PPU, but Ageia gives out their engine for FREE to anyone who adds PPU support; the licensing fees are only for those who use the engine without PPU support.

With all of this aside, the price of a GPU being used for effect physics is still much higher than a PPU that can handle both gameplay-affecting and effect physics. But the most efficient method is not always the one that wins; the install base alone may be enough for GPU physics to become the standard.

[edit]

On a random note, the new PhysX drivers are out. Go download them now if you want to have a play with that cloth demo you saw earlier.


----------



## Zero_Point (Jul 13, 2007)

The main thing where I just don't see hard-core physics acceleration working is in multiplayer. I mean, most connections have trouble synchronizing the movements of a few dozen players. Imagine having to sync thousands upon thousands of physics objects over an internet game. ._.
Though I can see your point on GPU manufacturers trying to make money. I guess it's no coincidence that the $712 CPU I bought back in Nov. is now <$180, whereas the GPUs I bought are only down to $275 compared to the $300 I paid for them. >.<
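The bandwidth problem is easy to put rough numbers on. Here is a back-of-envelope sketch; the per-object payload size and tick rate are illustrative assumptions, not figures from any real engine:

```python
# Rough estimate of the bandwidth needed to sync N rigid bodies over
# the network. Assumed payload per object: position (3 floats),
# orientation quaternion (4 floats), 4-byte object ID -> ~32 bytes,
# sent at a fixed tick rate. All figures are hypothetical.

def sync_bandwidth_kbps(num_objects, tick_rate_hz=20, bytes_per_object=32):
    """Raw state-sync bandwidth in kilobits per second."""
    bytes_per_second = num_objects * bytes_per_object * tick_rate_hz
    return bytes_per_second * 8 / 1000

# A few dozen player avatars is manageable...
print(sync_bandwidth_kbps(32))    # 163.84 kbps
# ...but thousands of loose physics objects swamps a typical connection.
print(sync_bandwidth_kbps(5000))  # 25600.0 kbps (~25 Mbps)
```

Even before protocol overhead, five thousand synced objects at a modest 20 Hz is orders of magnitude beyond what home connections of the day could carry.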


----------



## ADF (Jul 13, 2007)

That is largely due to the AMD/Intel price war and the lack of competition for Nvidia's 8 series, though.

Multiplayer is an area where gameplay-affecting physics will be stunted; online connections are simply not fast enough to keep all the players up to date on where everything is. However, effect physics will have no impact on such games, an area I believe Unreal Tournament 3 will be focusing on. I heard only 3 of the maps in the game actually use physics to a level where the PPU is mandatory.
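The gameplay/effect split above can be sketched in a few lines: cosmetic debris is simulated locally on each client and never sent over the wire, while objects that affect gameplay must be kept in sync. The object names and `kind` tags here are hypothetical, purely to illustrate the idea:

```python
# Illustrative sketch (not from any real engine): only gameplay-affecting
# bodies need network sync; pure eye-candy "effect" physics can diverge
# between clients without changing the outcome of the match.

GAMEPLAY = "gameplay"  # must be identical for every player -> sync it
EFFECT = "effect"      # local eye candy -> each client simulates its own

def objects_to_sync(objects):
    """Return only the objects whose state must go over the wire."""
    return [o for o in objects if o["kind"] == GAMEPLAY]

world = [
    {"name": "crate_blocking_door", "kind": GAMEPLAY},
    {"name": "shattered_glass_shard", "kind": EFFECT},
    {"name": "cloth_banner", "kind": EFFECT},
]

print([o["name"] for o in objects_to_sync(world)])
# ['crate_blocking_door']
```

This is why a PPU (or GPU) crunching thousands of effect-only particles adds nothing to the multiplayer traffic: those objects simply never enter the sync list.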


----------



## Zero_Point (Jul 13, 2007)

Whew! Good to hear. I built this rig with UT2K7 in mind. ^_^


----------

