# nVidia to buy Ageia, maker of the PhysX physics processor



## Zero_Point (Feb 6, 2008)

http://hardware.slashdot.org/article.pl?sid=08/02/05/006213&from=rss

Yep, graphics giant nVidia is buying out Ageia, maker of the PhysX physics processing unit. What does this mean for nVidia and Ageia? Hopefully more immersive physics for all.


----------



## ADF (Feb 6, 2008)

I'm actually surprised someone made a thread about it here; I've been talking about it around the web for the last few days, but Fur Affinity didn't seem like the right place to make a topic. Hell, my previous discussions on the PPU mostly went unnoticed.

Anyway, as I said elsewhere, this has all sorts of implications, both good and bad.

Nvidia buying Ageia makes perfect sense: when Intel bought up Havok, killing Havok FX in the process, Nvidia was left with a GPU designed to support physics but no physics middleware to let developers effectively utilise it. Ageia has several years' experience in parallel-processor physics programming, a physics API with good market penetration (140+ games) and broad support for hardware-based physics. For Nvidia it was logically the next best choice.

From Ageia's perspective this means their dream of a dedicated physics processor is dead; they will have to make do with GPU graphics/physics multitasking. It is not what they wanted, but it is the closest thing. Having Nvidia as an investor, on the other hand, will allow them to fund development in areas they previously couldn't consider.

The problem with Havok FX is that it was extremely limited in what it could do; it tapped performance from the GPU via the shader pipelines and couldn't fully take advantage of the hardware, and not being able to do gameplay physics was one consequence of this. CUBA, on the other hand, takes full advantage of the GPU and can enhance gameplay. The problem, however, is that only the Nvidia 8 series is 'fully' compatible with CUBA programming, which limits what they can do with previous-generation cards. There is also the big issue of CUBA not being compatible with ATI graphics cards, meaning a game that takes advantage of it will only run on Nvidia cards.

You can see this is a big problem.

Regardless, it could be a year or two before we actually see the fruits of Ageia/Nvidia GPU physics; CUBA-ready GPUs need time for further market penetration, and the Ageia PhysX API needs to be remade to target the GPU. Until then, Nvidia has stated they will support Ageia's current investment in the PPU.


----------



## Zero_Point (Feb 6, 2008)

Hmm, I figured it meant possible PPU integration into nVidia graphics cards, off-loading the physics processing so the GPU can keep doing its thing without hindering performance. Or hell, I've even heard mention of a chipset with integrated PhysX being a possibility.
And it's CUDA, not CUBA. FYI.


----------



## ADF (Feb 6, 2008)

Sorry, but stuff like that has become a bit of a pet peeve for me; for over a year I have constantly heard people talk about GPU physics as if it were a physical physics processor built into the GPU, and it really irritates me. GPGPU uses existing resources with a programmable interface, which in the case of Nvidia is 'CUDA'.

Anyway… Nvidia has already made and incorporated their own GPGPU solution into GPUs; there is no need to integrate a new physics chipset directly into the next GPUs, because everything needed is already there now. Besides, waiting till next generation to start GPU physics will only extend what is already going to be a long wait. They may improve it based on what they learn from the PPU, but nothing radical if they plan to push any standards soon.
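
To make the GPGPU point concrete: physics is naturally data-parallel, so the same tiny "kernel" can run independently for every object on the GPU's existing cores. A rough sketch of that idea in plain Python (all names and numbers are invented for illustration; a real CUDA version would launch the per-particle function as a kernel across thousands of threads):

```python
# Rough illustration of the GPGPU idea: physics as a data-parallel
# "kernel" run independently for every particle. On a GPU (via CUDA)
# each call below would be one thread; all names here are invented.

def integrate_particle(pos, vel, dt, gravity=-9.81):
    """One Euler step for a single particle: the per-element kernel."""
    x, y = pos
    vx, vy = vel
    vy += gravity * dt          # accelerate under gravity
    x += vx * dt
    y += vy * dt
    if y < 0.0:                 # crude bounce off the ground plane
        y, vy = 0.0, -vy * 0.5
    return (x, y), (vx, vy)

def step(positions, velocities, dt):
    """On the GPU this loop disappears: each iteration is a thread."""
    new_p, new_v = [], []
    for p, v in zip(positions, velocities):
        p2, v2 = integrate_particle(p, v, dt)
        new_p.append(p2)
        new_v.append(v2)
    return new_p, new_v
```

The point is that no particle depends on any other during the step, which is exactly the shape of workload shader cores are built for.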


----------



## Ron Overdrive (Feb 6, 2008)

Well, the rumor floating around is that on their new line of mobos that utilize the 8400 MCP, the onboard GPU would become the PPU if another graphics card was put into the case.


----------



## ADF (Feb 6, 2008)

Integrated motherboard GPUs are crap; they wouldn't come anywhere near the performance needed to compare to the existing PPU. The original GPU physics tech demos needed SLI 7900GTs just for rigid body/particle collision, and I doubt an integrated chip has that much power even today.

The more I think about this, the more it seems likely that Nvidia is just going to push effect physics for a year or two. The PPU didn't have to take varying hardware performance into account; whether low or high end, everyone had the same hardware. With GPUs, gameplay will be decided by the lowest-common-denominator performance, and anything above that will have to be effects so as not to alienate the mainstream.
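
The effects-vs-gameplay split can be made concrete: anything that affects the game state has to run identically for every player, so only cosmetic effects are free to scale with GPU power. A hypothetical sketch (tier names and budgets are invented):

```python
# Hypothetical sketch of the effect-vs-gameplay split: the world
# simulation is fixed so every player plays the same game, while
# purely cosmetic particle effects scale with GPU power.
# Tier names and all numbers are invented for illustration.

GAMEPLAY_OBJECTS = 200      # identical on every machine

EFFECT_BUDGET = {           # cosmetic only, so free to vary
    "integrated": 0,        # e.g. motherboard GPU: effects off
    "mainstream": 2_000,
    "enthusiast": 20_000,   # e.g. SLI setups
}

def physics_budget(tier):
    """Return (gameplay_objects, effect_particles) for a hardware tier."""
    return GAMEPLAY_OBJECTS, EFFECT_BUDGET.get(tier, 0)
```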


----------



## Ceceil Felias (Feb 6, 2008)

I'm betting it'll just be a matter of AGEIA gaining the advantage of nVidia's power, much like AMD and ATI. They didn't say AGEIA was being entirely absorbed.


----------



## Ron Overdrive (Feb 6, 2008)

Actually they are according to nVidia's website: http://www.nvidia.com/object/io_1202161567170.html


----------



## ADF (Feb 7, 2008)

Considering CUDA doesn't work on ATI cards and the install base for 8 series Nvidia cards isn't the majority yet, is anyone else seeing that this is going to result in effect-physics-only games? What developer in their right mind would add something to their game that makes it not work on a particular brand of GPU? That's like Intel making Havok not work on AMD CPUs; you are going to severely damage the engine's popularity among developers, who only want the broadest audience possible.

It is not really something you want to fix by encouraging Nvidia GPU adoption; a hardware monopoly is the last thing PC gaming needs right now.


----------



## Ron Overdrive (Feb 7, 2008)

You're going by the assumption that nVidia plans to phase out the PPU. nVidia did announce they don't plan to phase out the PhysX cards any time soon, and honestly it would be a bad idea. The PhysX card would be a good alternative if you don't want to go with SLI or have an ATI card. Hell, I don't need SLI, and I'd rather pay an extra $150 for the PhysX card than another $250 - $350 for a 2nd video card. The PhysX engine is also still mainly software; nVidia won't nix the software, as it would be a marketing disaster otherwise.


----------



## indrora (Feb 7, 2008)

Just out of paranoia,
but has anybody in the thread other than myself actually played with a PhysX card?
The driver for the actual physics is handled in software; the firmware is simply handled as a DLL that's sent down the pipeline to the card. You see, however, that both of them have striking similarities in their signatures to ATI cards within the files (Aegia-n.n.n.dll and phsxproc-n.n.n.dll, hidden away in some small corner).

Also, I've noticed that even when I'm doing hardware rendering with the PhysX processor, I still get a noticeable amount of CPU load, and it's mostly for the rendering! When doing tests with water and such, it was very slow at high resolutions.


----------



## indrora (Feb 7, 2008)


Just curious: if nVidia is buying the company, does that mean they're going to move everything to nVidia hardware? Won't that break some things?


----------



## ADF (Feb 7, 2008)

Ron Overdrive said:

> You're going by the assumption that nVidia plans to phase out the PPU.


Well, they did announce that that is what they are going to do: remake the engine for CUDA-based GPU physics instead of the PPU. But that takes time, so Nvidia will continue to support Ageia's existing solution until demand drops or they finish the GPU alternative.


----------



## Ron Overdrive (Feb 8, 2008)

ADF said:

> Ron Overdrive said:
> 
> 
> 
> ...



I would like you to cite your findings on that, because I've checked several articles, including the nVidia website, and there's no indication they will dissolve dedicated PPU cards in favor of GPU physics in either the short or long term. I've checked a few articles on Slashdot, nvidia.com, and PC Perspective. The only article that gives any idea of what nVidia has planned is on PC Perspective, and it seems nVidia will be producing both dedicated PPUs and GPUs.

PC Perspective said:

> NVIDIA was very excited about GPU based physics, and they will likely pursue that avenue again.  Not only will they support standalone physics cards, but they will likely integrate GPU physics into the PhysX framework.



http://www.pcper.com/article.php?aid=515


----------



## ADF (Feb 8, 2008)

Ok let me just grab the purchase announcement from Nvidia's website...

Here we go.



> "The AGEIA team is world class, and is passionate about the same thing we are: creating the most amazing and captivating game experiences," stated Jen-Hsun Huang, president and CEO of NVIDIA. "By combining the teams that created the world's most pervasive GPU and physics engine brands, we can now bring GeForce®-accelerated PhysX to hundreds of millions of gamers around the world."
> 
> "NVIDIA is the perfect fit for us. They have the world's best parallel computing technology and are the thought leaders in GPUs and gaming. We are united by a common culture based on a passion for innovating and driving the consumer experience," said Manju Hegde, co-founder and CEO of AGEIA.
> 
> ...



Everything about this article screams moving Ageia to processing on the GPU, and you cannot really beat an official source either.


----------



## Ron Overdrive (Feb 8, 2008)

ADF said:

> Ok let me just grab the purchase announcement from Nvidia's website...
> 
> Here we go.
> 
> ...



The official source says nothing about completely phasing out PPU units. They're excited about using the GPU as a physics option again, yes, but there's nothing there stating they will definitely phase out PPU physics in favor of GPU physics. If you think about it from the corporate view, you widen your market by having both. Obviously you can't force people to buy your brand, but you can give them incentives to move over, like not having to buy an additional card for physics. And as long as you supply that additional card, you're still making money off the competition's "brand whores" who won't switch over regardless. It's a win/win situation for nVidia to continue releasing the standalone PPU units, and I doubt they're stupid enough to pass up such a marketing jackpot, considering how good nVidia is at marketing. Besides, they're excited that they can continue where HavokFX left off, and that still has yet to prove itself as a viable technology. There's no reason to drop a PPU they can market right away; it has already proven it sells, so they will most likely continue to sell and improve it as they develop GPU physics alternatives.

Until there is a definite announcement declaring PPU physics dead, let's not assume anything about their plans for removing the PPU. We have yet to see what Intel is cooking up with Havok; for all we know they could start offering a standalone PPU of their own besides the rumored CPU/PPU hybrid.


----------



## ADF (Feb 8, 2008)

We will hopefully find out on the 14th, when Nvidia has said they will release more details on the purchase. Honestly, though, I don't see Nvidia supporting PPU technology longer than they need to; this purchase was mostly about the parallel-processor-optimised physics engine.


----------



## ADF (Feb 14, 2008)

Nvidia has gone insane; if they go through with their plans, they will sooner kill PC gaming than make physics more innovative.

Link

This is the reason I was against GPU physics to begin with; it is going to bump up the cost of PC gaming across all levels, and the mainstream won't be able to cope and will probably run off to consoles.

They are expecting everyone to start buying higher-end GPUs, SLI, or even Tri-SLI? Those greedy bastards; most people struggle to afford a single powerful GPU, while SLI is an enthusiast setup. Developers would be mad to use this system for gameplay physics.


----------



## Ceceil Felias (Feb 14, 2008)

Hahaha oh wow. This is just a mess.

It seems every company with an ego to speak of is going off the deep end, I swear.


----------



## ADF (Feb 14, 2008)

All they are thinking about is how this could raise their GPU sales; they are not taking into account the feasibility of expecting every gamer to go spend a buttload more on GPUs.



> Our expectation is that this is going to encourage people to *buy even better GPUs*. It might and probably will encourage people to *buy a second GPU for the SLI* slot and for the highest-end gamers, it will encourage them to *buy three GPUs*, potentially *two for graphics and one for physics* or *one for graphics and two for physics*, or any dynamic combination thereof.


Getting a little ahead of yourselves there; we need a 'MOAR!' image for this one...

There we go; that suits the greedy fucks.



> "I wouldn't be surprised if this helps our *GPU sales even advance*, and the reason for that is in the end, it's just going to be a software download," he explained. "Every single *GPU that is CUDA enabled* will be able to run the physics engine when it comes."



As I said earlier, CUDA only works properly on 8 series GPUs; that means no ATI, no GeForce 7 series or earlier, and none of the next-gen consoles. That is a really, really small potential audience to design a game for. This makes it much more likely that games will only use physics effects, so they can be turned off without impacting gameplay for the majority.

That being the case, CUDA is worse than Havok FX and less open than the PPU; but it has a brand name people know and trust, so people online are going to love it regardless.


----------



## Zero_Point (Feb 15, 2008)

What's needed here is a unified physics API, possibly (or maybe even definitely, from what I've heard) with DX 11. That way such monopolies can't ever form.
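
A unified API would basically mean games code against one vendor-neutral interface while hardware-specific backends plug in underneath. A toy sketch of the idea (every class and method name here is invented, with a software fallback as the default):

```python
# Toy sketch of a vendor-neutral physics API: the game codes against
# one interface, and a backend (CPU fallback, CUDA, whatever ATI
# ships) plugs in underneath. All names are invented for illustration.

class CpuBackend:
    """Software fallback: works on any hardware."""
    def step(self, bodies, dt):
        # bodies are (position, velocity) pairs; simple Euler update
        return [(x + vx * dt, vx) for (x, vx) in bodies]

class World:
    def __init__(self, backend=None):
        # a real runtime would probe the hardware and pick a backend
        self.backend = backend or CpuBackend()
        self.bodies = []

    def add_body(self, x, vx):
        self.bodies.append((x, vx))

    def step(self, dt):
        self.bodies = self.backend.step(self.bodies, dt)
```

The design point is that the game only ever touches `World`, so swapping the backend can't break game code; that is what keeps any one vendor from owning the API.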


----------



## Ron Overdrive (Feb 15, 2008)

Zero_Point said:

> What's needed here is a unified physics API, possibly (or maybe even definitely, from what I've heard) with DX 11. That way such monopolies can't ever form.



You're just switching it from video cards to Microsoft; you're not stopping a monopoly by giving the tech to a wannabe monopoly. Honestly, the physics engines are good how they are now: software with optional hardware. Both PhysX and Havok are available for Mac, Linux, Windows, and consoles like the Wii, PS3, and Xbox 360. Fusing all physics with DX pretty much means the only systems allowed to use physics would be the Xbox series and Windows computers. There's a much wider spectrum of hardware out there that relies on open APIs like OpenGL and OpenAL to function with games, because it can't run DX or can't get licensing to use DX, as DX support is only available on M$ products without some form of emulation.


----------



## ADF (Feb 15, 2008)

Just to put it in perspective how much GPU power they are expecting you to buy for these CPU-beating physics, here is a chart from back when Havok FX was promoting GPU physics.

Now, that is just a rigid body simulation; I doubt dev artists would want to throw away their polygon budget just for a mass of bouncing objects. The real quantity of horsepower will be directed at smaller-scale calculations, such as soft-body physics, cloth, smoke, hydromechanics, animation, procedural mesh deformation & damage, etc.
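
For the curious, the cloth and soft-body simulations in that list usually come down to mass-spring systems: a grid of point masses joined by springs, with a small force calculation per spring, repeated thousands of times per frame. A toy 2D version of that per-spring step (the stiffness constant is invented for illustration):

```python
import math

# Toy version of the small-scale work listed above: cloth/soft bodies
# are commonly modelled as grids of point masses joined by springs.
# One Hooke's-law spring in 2D; the stiffness value is invented.

def spring_force(p_a, p_b, rest_len, stiffness=100.0):
    """Force on point a from the spring joining a and b."""
    dx = p_b[0] - p_a[0]
    dy = p_b[1] - p_a[1]
    dist = math.hypot(dx, dy)
    if dist == 0.0:
        return (0.0, 0.0)        # degenerate: points coincide
    stretch = dist - rest_len    # positive when over-stretched
    scale = stiffness * stretch / dist
    return (dx * scale, dy * scale)  # pulls a toward b when stretched
```

Each spring is independent of the others within a step, which is why this kind of workload maps well to a parallel processor in the first place.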

The most realistic implementation is to encourage everyone to either move up a GPU tier or buy a x6xx card for their SLI slot. Either way, it is an extra £80-£100 eaten out of your upgrade budget just for the GPU.


----------

