# Nvidia makes world history.



## Nanakisan (Dec 16, 2008)

http://www.engadget.com/2008/11/10/nvidias-quadro-fx-5800-with-4gb-graphics-memory-is-the-most-po/

This is not a joke.

Nvidia has broken and/or shattered graphics card world history with the first and most powerful graphics card known to man, rated at a mind-blowing, drool-inducing 4 GB.
Read the article above for more info.


----------



## lilEmber (Dec 16, 2008)

4 GB of power? Memory isn't power.
This isn't for gaming; it's for servers. If this card is anything, it's the "most powerful server graphics card," and it will not be what you want for anything else.



> Just $3,499 for you big spender -- pennies for the companies who can harness the power for the purposes of oil and gas exploration, 4D modeling, and graphics design.


What is 4d modeling?


----------



## Wait Wait (Dec 16, 2008)

TIME


----------



## Eevee (Dec 16, 2008)

I think the concept of "making world history" has been diluted considerably somewhere along the line here.  Almost EVERY new card or chip or whatsit is going to be the most powerful in history. Shoving more bits into the same finite space just isn't quite so impressive any more.


----------



## yak (Dec 16, 2008)

> What is 4d modeling?



I'm going to guess it's a very fancy name for ray tracing, 3D modeling of sequential frames, etc... But even so, the usage of the term is incorrect.


It's definitely not for gaming and would probably suck at games.
Plus, ATI has always put more graphical horsepower into their products, so I'm definitely waiting for their answer to this. I'm thinking it's going to be the specs of this hardware doubled, only with GDDR5.


----------



## lilEmber (Dec 16, 2008)

yak said:


> I'm going to guess it's a very fancy name for ray tracing, 3D modeling of sequential frames, etc... But even so, the usage of the term is incorrect.
> 
> 
> It's definitely not for gaming and would probably suck at them.
> Plus, ATI has always put more graphical horsepower in their products, so I'm definitely waiting for their answer to this. I'm thinking it's going to be the specs of this hardware doubled, only with GDDR5.



Hmmm, I agree with you entirely. For gaming this wouldn't even work; these are not designed for gaming in the slightest, so the shaders would be the wrong model, either making the game look a LOT worse than a gaming card would, or simply making the framerate drop to near nothing.


Also, I agree with Eevee: this is done every few months.


----------



## HyBroMcYenapants (Dec 16, 2008)

Hell, fuck "Can it play Crysis?"


Can it play outside?


No? All right, I'll stay in my basement.


----------



## Nanakisan (Dec 16, 2008)

lol

I do 3D modeling, and this, if I could ever use it, would make my heaven a reality, seeing as I'm doing photorealistic effects now.
joke


----------



## Archibald Ironfist (Dec 16, 2008)

Considering the Wildcat series of cards from Oxygen came with 4GiB of VRAM back in 2003, I think nVidia is once again artificially inflating their own ego with utter bullshit.

4 GB of video RAM is not new.


----------



## lilEmber (Dec 16, 2008)

Archibald Ironfist said:


> Considering the Wildcat series of cards from Oxygen came with 4GiB of VRAM back in 2003, I think nVidia is once again artificially inflating their own ego with utter bullshit.
> 
> 4GB Video RAM is not new.



I love people who actually read the article. Also, the amount of memory is only as good as the type of memory; more than likely this is GDDR3.



			
the link you didn't look at said:

> 240 CUDA-programmable parallel cores and the industry's first card with 4GB of graphics memory
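For anyone wondering what "CUDA-programmable parallel cores" actually means: each core runs the same small program (a "kernel") over different pieces of the data. A minimal, purely illustrative sketch (a textbook SAXPY kernel, not anything taken from the article or NVIDIA's software) would look roughly like this:

```cuda
#include <stdio.h>
#include <stdlib.h>
#include <cuda_runtime.h>

// Each GPU thread computes one element: y[i] = a * x[i] + y[i].
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main(void)
{
    const int n = 1 << 20;               // one million floats
    size_t bytes = n * sizeof(float);

    // Host-side buffers.
    float *hx = (float *)malloc(bytes);
    float *hy = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    // Device-side buffers, copied over the PCIe bus.
    float *dx, *dy;
    cudaMalloc((void **)&dx, bytes);
    cudaMalloc((void **)&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements;
    // the card's cores chew through the blocks in parallel.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);

    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);
    printf("y[0] = %f\n", hy[0]);        /* 2*1 + 2 = 4 */

    cudaFree(dx); cudaFree(dy);
    free(hx); free(hy);
    return 0;
}
```

Compile with `nvcc`; the point is just that the same 240 cores the article brags about are general enough to run arbitrary number-crunching like this, which is why the card gets pitched at oil exploration rather than games.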


----------



## net-cat (Dec 16, 2008)

how many fps's does it get lol

Nice card.

... though, is it just my imagination or is this less a "Graphics Card" and more a "High Performance Vector Processing Unit that happens to have a DVI output?"


----------



## Runefox (Dec 16, 2008)

It's a workstation card. Ergo, it's meant to provide an ultra-high-quality display, not an ultra-fast one. These things can crank anti-aliasing up to 32x and have very strict methods for texture filtering and overall image quality. They aren't meant to be performers at anything other than what they're designed for, and I'm pretty sure, actually, that it would suck at games.

The FireGL series from AMD/ATi are also beasts, though the highest I've seen those go is 2 GB. I believe the FireGL series was also "first" in having 1 GB on-card before the competition (nVidia), and honestly, the FireGLs have been around for much longer and, from what I've seen, offer more horsepower than the Quadros. That said, I haven't seen *many* of them. That kind of system build only comes around once in a while at our shop.


----------



## LizardKing (Dec 16, 2008)

I got one of these and clocked it 1 MHz faster. I now own the bestest card in the world, EVER!


----------



## Aden (Dec 16, 2008)

yak said:


> I'm going to guess it's a very fancy name for ray tracing, 3D modeling of sequential frames, etc... But even so, the usage of the term is incorrect.



I'm guessing typo.


----------



## DarkMettaur (Dec 17, 2008)

lol bottleneck


----------



## Magnus (Dec 17, 2008)

doesn't say "Quadro" enough?


----------



## freshmeat999 (Dec 18, 2008)

NewfDraggie said:


> 4gb's of power? Memory isn't power.
> This isn't for gaming, this is for servers; if this card is anything it's the "most powerful, consumer server Graphics Card", this thing will not be what you want for anything else.
> 
> 
> What is 4d modeling?


 
I've seen this thing demoed at NVISION 2008, and it fucking blew my mind to shreds. They had a 3D model of a Lamborghini, and they flew around it with a free camera, and there was no video lag or anything, and all the textures were beautiful. I almost believed it was a real Lamborghini. My dad works for this company, designing the chipsets that go into these cards. All of them are beautiful and will work wonders if you know how to tweak things.

Oh, and this card, the Quadro FX 5800, is 100% purely NOT made for gaming. Get a GTX 280 for that, or two; getting two will not even come close to half the price of this thing. Plus, when you get a 280, you get an extra graphics setting that all other cards except the 260 cannot access, because it would crash the game immediately.


----------



## Archibald Ironfist (Dec 18, 2008)

My PC can do that, too!  MPEG4 is nice.


----------



## Tilt (Dec 18, 2008)

I'd rather have a Tesla, once they update the CUDA instructions so it can be used for video encoding/decoding. Right now that's limited to the 9000 series of cards.

The C1060 Nvidia Tesla 

http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=4259469&CatId=4044


----------



## Neybulot (Dec 20, 2008)

freshmeat999 said:


> Get a GTX 280 for that, or 2. Getting 2 will not even come close to half the price of this thing. Plus when u get a 280, u get an extra graphics setting that all other cards except for the 260 cannot access due to the fact that it will crash the game immediately.



Don't get a GTX 280 yet if you're looking for the absolute best Nvidia card. They're putting out the 295 within the next 3-4 months, I believe?

EDIT: Never mind. January 8th.


----------



## LizardKing (Dec 20, 2008)

Neybulot said:


> Don't get a GTX 280 yet if you're looking for the absolute best Nvidia card. They're putting out the 295 within the next 3-4 months, I believe?



Don't get a 295 yet if you're looking for the absolute best Nvidia card. They're putting out the 310 within the next 3-4 months, I believe?


----------



## Magnus (Dec 20, 2008)

Geforce GTX 9001 should be out soon, believe it doods, OVER 9000 MEGABYTES OF RAMZ at the speed of over 9000 hertz 

:3


----------



## WarMocK (Dec 20, 2008)

Magnus said:


> Geforce GTX 9001 should be out soon, believe it doods, OVER 9000 MEGABYTES OF RAMZ at the speed of over 9000 hertz
> 
> :3



9000 hertz? 

9 kHz, wow, what a speed. *scnr*


----------



## Archibald Ironfist (Dec 20, 2008)

LizardKing said:


> Don't get a 295 yet if you're looking for the absolute best Nvidia card. They're putting out the 310 within the next 3-4 months, I believe?



Don't get a 310, there's going to be a 320 about a month after that.


----------



## whoadamn (Dec 22, 2008)

Jeez, good luck getting that hunk of plastic into your PCIe slot o.o


----------



## Neybulot (Dec 23, 2008)

LizardKing said:


> Don't get a 295 yet if you're looking for the absolute best Nvidia card. They're putting out the 310 within the next 3-4 months, I believe?



Pfft...Doubt it.


----------



## Oryxe (Dec 26, 2008)

I've learned not to get my hopes up, mainly since I'm running an octo-core Mac Pro.

Even if I could flash this card's ROM, I've already got two 8800 GTXs in SLI for Windows gaming. And NVIDIA is infamous for lagging behind ATI when it comes to pro-app graphics. (My ATI X1900 still kicks the shit out of any NVIDIA card I use when I run Shake, Motion, Adobe CS3...)


----------



## Oryxe (Dec 26, 2008)

Runefox said:


> It's a workstation card. Ergo, it's meant to be used to provide ultra-high quality display, not ultra-fast display - These things can crank anti-aliasing up to 32x and have very strict methods for texture filtering and overall image quality. They aren't meant to be performers at anything other than what they're designed for, and I'm pretty sure, actually, that it would suck at games.
> 
> The FireGL series from AMD/ATi are also beasts, though the highest I've seen those is 2GB. I believe the FireGL series was also a "first" in having 1GB on-card before the competition (nVidia), and honestly, the FireGL's have been around for much longer and from what I've seen offer more horsepower than the Quadros. That said, I haven't seen *many* of them. That kind of system build only comes around once in a while at our shop.



Yeah, I scored 45% higher in a 3DMark test using two ATIs with a total of 1 GB of onboard memory, compared to two NVIDIA GeForce cards with a total of 2 GB. Then again, the GeForces were gaming cards, so yeah...



Magnus said:


> doesn't say "Quadro" enough?



I am beginning to assume it's just a souped-up Quadro..


----------

