nVIDIA PhysX vs DX11 Physics

@Raghu: Okay... so I stand corrected. I guess I misinterpreted the PhysX libraries being free to use for commercial/non-commercial purposes as being open source.

Yes, I do feel nVidia is pushing more and more towards GPGPU applications and has made significant progress in that area. Although we are yet to see worthy GPGPU apps for day-to-day computing, nVidia has put a lot of effort into spreading CUDA widely.

But that is also the reason for my worries: at the end of the day, CUDA is a proprietary platform, just like Stream, and an application written for one will be tied to a single vendor. That's why I have high hopes for OpenCL, as it makes it easier to create vendor-agnostic solutions and should thus get more and more developers to use it.

I've been an ATi user for some time now, and I'm quite happy with its performance in games. But the whole idea of using the GPU for more than just games does sound enticing. Moreover, we now have good enough video players that can use the GPU for video decoding, and even our computing interfaces (Windows Aero, the OS X GUI, Ubuntu's Beryl(?) interface) are leveraging GPUs for responsiveness and acceleration. I would love to see more and more general computing apps tapping the idle power of the GPU. Recently I even bought a RAW file viewer (FastPictureViewer) that can use the GPU for image viewing and zooming operations. I'll be keeping a close watch on Fermi's development, but I just hope they price it right and at least have varied SKUs for different budget segments. Otherwise, I'll be picking up another ATi card. After all, we all want to play games. :)

@atiamd: I don't think looking better has anything to do with the physics engine being used. Physics engines such as PhysX or Havok add realism to a game by making environments behave as close to real life as possible. Stuff bouncing off surfaces like walls or floors the way it would in real life, or shooting an enemy and watching him fall like a real person, is the kind of thing a physics engine is supposed to take care of.

The key difference is who, or what, does all this physics processing. Running it off the CPU isn't the best way, as it has a significant impact on gameplay performance. GPUs, on the other hand, have a massive number of parallel processors which can run this task with minimal impact on framerates. At least, that's the basic idea behind running a GPU-bound physics engine. Having said that, a well-coded Havok game is going to look almost as good as a PhysX one, but while Havok will strain your CPU as the complexity of the physics processing increases, PhysX will be able to keep the performance hit minimal, as it will use your GPU for the job.
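To make the "massive parallelism" point concrete, here's a toy sketch in Python (nothing like real PhysX/Havok internals; every name and number below is made up): each object's per-frame update depends only on that object, which is exactly the shape of work a GPU's parallel processors are built for.

```python
# Toy rigid-body update: one bouncing ball per call.
# Because step() reads and writes only its own ball, thousands of these
# can run in parallel on a GPU; a CPU has to churn through them serially.

GRAVITY = -9.8       # m/s^2, pulling the ball down
DT = 1.0 / 60.0      # one frame at 60 fps

def step(pos, vel, restitution=0.8):
    """Advance one ball by one frame; bounce off the floor at y = 0."""
    vel += GRAVITY * DT
    pos += vel * DT
    if pos < 0.0:                   # went through the floor this frame
        pos = 0.0
        vel = -vel * restitution    # reverse, losing some energy
    return pos, vel

# Each ball is independent -- this loop is what a GPU would parallelize.
balls = [(10.0, 0.0)] * 1000
balls = [step(p, v) for p, v in balls]
```

A CPU engine runs that loop serially, a few balls at a time; a GPU engine hands each iteration to its own thread, which is why the performance hit stays small as the object count grows.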
 
A small clarification:

To write multi-platform code, there really isn't much you need to do other than ensure that the libraries are available on multiple platforms. This holds 100% true for OpenGL.

Also, OpenGL by itself is not Open Source in the literal sense. OpenGL is a STANDARD, the specification for which is open (not open source, since a specification does not have code in it).

The most commonly used implementations of OpenGL include the one by SGI (which was open-sourced back in the 90s) and Mesa3D, which is a truly open-source project.

PS - @Raghu, you really oughta think twice before picking products to defend :p
 
Raghunandan said:
Please state reasons to back this up. The whole idea of having a common library is to have extremely easy portability.

And how much of what you said have you 'backed up'? Should we always post a link to back up what we say? Honestly, tell me: have you done any programming? Your region shows "Bangalore", so I am assuming you have. It is not that easy to create libraries and then just "port" the code over. Isn't the hardware different across platforms, and the drivers too? We have seen ports, and they were really not that nice; RE4 and Dead Space (the TPS view), for instance. We also have to keep in mind that each platform uses a different CPU. Please do not tell me that the so-called common library can take care of this 100%. Controls are a big issue too. And since you like substantial proof, and not 'blatant copy-paste', here are some links for you to read up on.

Here. The last paragraph at the bottom of the page.
Here.
Here.
Here.

Raghunandan said:
If most of your post is going to be from Wikipedia, it might be easier to link to the article instead of writing it here. And is information about how the Xbox was named relevant to a physics engine comparison thread? Really?

Well, in my prior post I posted a Wikipedia extract; if you compare it to what I said, it is different. The crux is the same. It's better to summarize and synthesize for a forum than to just paste a link. I cannot assume that everyone understands things the way I do, or you do, or has the time to read up huge Wikipedia articles, comprehend what I am trying to say, and extract from them what I want to show here.

And please, next time, do not try to dictate my posting style or set a standard for this forum.

Raghunandan said:
How exactly are they forcing the world to use PhysX? I don't see how it's possible to force something on game developers. Maybe there isn't/wasn't any good alternative. Are there any other GPU/hardware-accelerated physics engines available now? PhysX has been available for quite a while. CPU-based engines are nowhere close in terms of realism, detail and efficiency.

Somehow you seem comfortable with Microsoft forcing everyone to use DX11, though. What are the reasons you prefer DX11 physics over PhysX?

Where in this whole thread have I said I prefer DX11 physics over Ageia PhysX? I was just debating both sides... AND... AND... hitting at the business practices of nVidia. It seems you are putting words in my mouth here, and leading me down a path where for sure you can kick my butt.

Raghunandan said:
Here's a short table comparing DX11 physics and NVIDIA PhysX. Please feel free to add to it or let me know if something's wrong.

Feature | DX11 Physics | NVIDIA PhysX
Standard | Proprietary | Open
Platforms Supported | Windows Vista, 7 | Windows XP, Vista, 7, Linux, Microsoft Xbox 360, Sony PS3, Nintendo Wii
SDK Availability | Free | Free
Source Code Availability | No | Yes

You still feel that DX11 physics is a good choice for game developers?

The table makes sense; I have nothing to add to it. I still feel DX11 physics may see the light of day, and on page one I mentioned the same about Havok. Not at one place did I hit at the technical feasibility of Ageia PhysX. Heck, if it can run on so many systems... WOW to the creators.

Raghunandan said:
Your only gripe is probably that ATI doesn't support PhysX. But don't blame NVIDIA for it. It is an open standard after all, and ATI is welcome to write its own software to support it on their cards or develop hardware for it.

I believe executing/skipping portions of code based on the presence/absence of a feature is called having different codepaths. It's OK if you want to call it conditioning, though.

Well, when it comes to the gaming world, 'codepath' usually refers to the DX9/10/11 paths. Probably a terminology gap here between us two. No problem. :)

I'm not griping, man, about ATI not supporting PhysX. I am perfectly happy running it off my CPU. Scorpion Disfigured and UT3 were the ones I played. Never was a fan of BA:AA.

iGo said:
Actually... Raghu is right. I have long thought about CUDA and PhysX and their close ties to nVidia-specific hardware, but the fact remains that of all the things nVidia did wrong, they did one thing right, and that was making CUDA/PhysX open source. Anyone can use them on a multitude of platforms, and that IS NOT as restrictive as many here are thinking.

Aa... err... it is actually multi-platform, not open source. Kudos to nVidia, for sure, for making an engine that can run on so many hardware platforms.

atiamd said:

And, asingh, you might wanna try Red Faction: Guerrilla for Havok, if you haven't already...

And Raghu, man, you are so right about everything you have said regarding PhysX and DX11. I haven't ever seen PhysX in the works as I don't have an nVidia setup. Does it really look better than Havok-based games like TimeShift, Red Faction: Guerrilla and BioShock?

Thanks, will try RFG. Have seen some PhysX off my CPU; it is quite good. And all green card owners swear by it. It is good, no doubt.

@Raghu:
This is a debatable topic, and that is why I am happy the OP opened it up. I was putting my side of the debate across, not making any 'set in stone' comments. Read all my posts: NEVER did I say Ageia PhysX is bad. I was dabbling with the idea of DirectX physics ramping up.

I am making my peace here. You might have thought my posts were BS; I think they were not. Your posts are/were good and put up a strong debate. Just getting personal leaves a sour taste in the mouth. You can choose.
 
Aces170 said:
Havok was bought by Intel in 2007 and will be amalgamated into the Larrabee project. DX11 is not an open-source API by any definition; OpenGL, on the other hand, is. Due to the $ power of M$, OpenGL has more or less been given a back seat.
In the end, both Nvidia and AMD are creating massively parallel processors, so they want to utilize them for more than just graphics rendering; physics is one such utilization.
That's a good thing...

Ati will have Havok as well as Bullet...double dhamaka.
 
You people can't compare DirectX 11 and PhysX. DirectX includes a lot more features than game physics, like tessellation. MS just made it easier for developers, so they don't have to PAY nVidia to use physics in games. Moreover, DX11 works on all compatible DX11 hardware, unlike PhysX, which works only on nVidia hardware.
nVidia is just a stupid company that's trying to recoup the investment it made in buying out Ageia, which in all senses was a bad move.
 
stalker said:
^^ Raghu works for nVidia

I surely -- 100% make my peace..!
:)

vijaysurendran said:
You people can't compare DirectX 11 and PhysX. DirectX includes a lot more features than game physics, like tessellation. MS just made it easier for developers, so they don't have to PAY nVidia to use physics in games. Moreover, DX11 works on all compatible DX11 hardware, unlike PhysX, which works only on nVidia hardware.
nVidia is just a stupid company that's trying to recoup the investment it made in buying out Ageia, which in all senses was a bad move.

Vijay, we were comparing them in order to debate the topic. It was the DirectX 11 physics component that we wanted to see vs Ageia PhysX. Ageia PhysX is multi-platform but not open source, so it can be harnessed on multiple hardware platforms. True, I also do not agree with their business practices, but they are far from stupid.
 
iGo said:
There is also a lot of talk about the Bullet physics library, but we are yet to see a mainstream implementation of it. ATi so far has not kept pace with GPGPU development, and their Stream implementation in general apps leaves a lot to be desired. The way I see it, the only savior as a truly vendor-agnostic physics engine will be the one with an OpenCL implementation. But once again, nVidia has been ahead of ATi in OpenCL adoption and driver releases. ATi really needs to get its act together in building rock-solid software support for the amazing hardware it is building.

Actually, ATI has better OpenCL drivers, and that shows in the SiSoft Sandra 2010 OpenCL GPGPU benchmark:

SiSoftware Zone
AMD Press

:)
 
Muzu... I'm not saying they are bad. In fact, the benchmark shows very well the massive number of stream processors they have packed into their GPUs.

But you have to agree that ATi has not been aggressive enough with GPGPU development. They have dabbled in different things, while nVidia has been pushing CUDA everywhere. Today you'll find more CUDA apps than ATi ones. I really want to see OpenCL adopted widely.
 
^^ The reason ATI is not gung-ho about GPGPU is that it is already in the x86 market; NVDA desperately wants to eat into the x86 pie, and GPGPU is a means to it.

Also, Raghu, I wouldn't bet on M$ not supporting DX11 or a variant in the next MS console. Look at it from a game developer's POV: the whole PC market combined with the Xbox market, versus nVidia video cards or consoles with nVidia-based chipsets. MS surely has a good hand out there.

AMD, on the other hand, is more concerned with integrating the CPU and GPU.
 
Aces170 said:
Also, Raghu, I wouldn't bet on M$ not supporting DX11 or a variant in the next MS console. Look at it from a game developer's POV: the whole PC market combined with the Xbox market, versus nVidia video cards or consoles with nVidia-based chipsets. MS surely has a good hand out there.

It would be a good move by MS if they had console support too. But NVIDIA PhysX is not restricted to their own chips. The Xbox 360, for instance, doesn't have *ANY* chip from NVIDIA, but NVIDIA PhysX still works on it. I doubt MS will support the PS3 or Wii, though.

vijaysurendran said:
developers so they don't have to PAY nvidia to use physics in games.

Oh god, here we go again!! Have you even read the thread?
 
It would be a good move by MS if they had console support too. But NVIDIA PhysX is not restricted to their own chips. The Xbox 360, for instance, doesn't have *ANY* chip from NVIDIA, but NVIDIA PhysX still works on it. I doubt MS will support the PS3 or Wii, though.

How does that benefit Nvidia? Why don't PhysX games work on AMD hardware on the PC?
 
It increases the adoption rate of PhysX by making it easier for developers to port games across platforms.

On a PC, they do work with AMD processors, because PhysX falls back to the x86 CPU when an NVIDIA graphics card is not the primary adapter. If you meant ATI, then I guess it comes down to simple competition and implementation efficiency. PhysX is an open standard, and ATI understands their GPU architecture the best. IMO, expecting NVIDIA to write an efficient PhysX implementation on Stream is a bit too much to ask for.
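The fallback Raghu describes is basically backend selection at startup. A minimal sketch of the idea in Python (purely illustrative; `detect_gpu` and the backend names are invented here, this is not the real PhysX API):

```python
# Illustrative backend dispatch: the game asks for physics, and the
# runtime decides where to run it based on what hardware it finds.

def detect_gpu():
    """Pretend hardware probe; a real engine queries the driver here."""
    return None  # could be "nvidia", "ati", or None in this sketch

def pick_physics_backend():
    """Use the GPU path only when a capable card is the primary adapter."""
    if detect_gpu() == "nvidia":
        return "gpu-accelerated"   # CUDA path on the NVIDIA card
    return "cpu-fallback"          # same API, simulated on the x86 CPU

backend = pick_physics_backend()
```

The part that matters for developers is that the game code above this dispatch never changes; only the runtime's answer does.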
 
Raghunandan said:
It would be a good move by MS if they had console support too. But NVIDIA PhysX is not restricted to their own chips. The Xbox 360, for instance, doesn't have *ANY* chip from NVIDIA, but NVIDIA PhysX still works on it. I doubt MS will support the PS3 or Wii, though.

Actually, if M$ starts to support the PS3 and Wii, then they are supporting direct competitors to one of their core products, the Xbox. I doubt this will happen.

Aces170 said:
How does that benefit Nvidia? Why don't PhysX games work on AMD hardware on the PC?

Because the driver detects an AMD GPU and does not run the physics on the GPU; it is, you could say, 'offloaded' to the CPU, and that takes a toll on the FPS. This check is in place only on the PC platform. That is why they are getting a lashing for it. But it's their product, so they make the business decisions. As Raghu said, Ageia PhysX runs well on the other platforms. Gaming companies will readily adopt it this way, since one aspect of the game (physics) will be common to all platforms.

Raghunandan said:
It increases the adoption rate of PhysX by making it easier for developers to port games across platforms.

On a PC, they do work with AMD processors, because PhysX falls back to the x86 CPU when an NVIDIA graphics card is not the primary adapter. If you meant ATI, then I guess it comes down to simple competition and implementation efficiency. PhysX is an open standard, and ATI understands their GPU architecture the best. IMO, expecting NVIDIA to write an efficient PhysX implementation on Stream is a bit too much to ask for.

Raghu, as of now... suppose that driver 'check' were removed, would Ageia PhysX run seamlessly on ATI graphics boards?
 
BTW, side note: anyone following the news of the US FTC looking at Intel? nVidia is probably waiting to get into the x86 market; they have ex-Transmeta engineers on their payroll too... they are well poised to launch their own unified/fusion computing platforms (handhelds, nettops, etc.)...
 
asingh said:
Actually, if M$ starts to support the PS3 and Wii, then they are supporting direct competitors to one of their core products, the Xbox. I doubt this will happen.

Yup, exactly! Which is why DX11 physics can never be truly cross-platform.

asingh said:
Raghu, as of now... suppose that driver 'check' were removed, would Ageia PhysX run seamlessly on ATI graphics boards?

NVIDIA PhysX will not run on ATI GPUs even if the check were not present, because it hasn't been written in Stream (ATI's equivalent of CUDA). You might have misunderstood what's happening.

Here's the actual situation: when an ATI and an NVIDIA card are both present, the NVIDIA driver disables GPU-accelerated physics. The current drivers require that the NVIDIA GPU doing PhysX also be the one doing the graphics. Apparently, previous versions of the driver didn't have this requirement.

This has been a reason for a lot of NVIDIA bashing recently. NVIDIA's excuse is that it does not test this type of ATI+NVIDIA setup while writing its drivers, so it doesn't want to face issues arising out of such a combo. I can understand why the customers are angry, though.

vishalrao said:
BTW, side note: anyone following the news of the US FTC looking at Intel? nVidia is probably waiting to get into the x86 market; they have ex-Transmeta engineers on their payroll too... they are well poised to launch their own unified/fusion computing platforms (handhelds, nettops, etc.)...

Not many sites reported it; the article is here. There is no ruling from the court yet. The FTC has a bunch of recommendations in its lawsuit to prevent Intel from misusing its dominance.
 
asingh said:
Actually, if M$ starts to support the PS3 and Wii, then they are supporting direct competitors to one of their core products, the Xbox. I doubt this will happen.

Because the driver detects an AMD GPU and does not run the physics on the GPU; it is, you could say, 'offloaded' to the CPU, and that takes a toll on the FPS. This check is in place only on the PC platform. That is why they are getting a lashing for it. But it's their product, so they make the business decisions. As Raghu said, Ageia PhysX runs well on the other platforms. Gaming companies will readily adopt it this way, since one aspect of the game (physics) will be common to all platforms.

Raghu, as of now... suppose that driver 'check' were removed, would Ageia PhysX run seamlessly on ATI graphics boards?

It's dumb on NVIDIA's part not to allow mixed cards for PhysX, as they'd be selling a card irrespective of what the other card is. Anyway, there are already hacks in place to fix it. However, asingh, you are comparing apples to oranges when you compare DirectCompute to PhysX.

Coming to the API itself, having seen both, I can safely say PhysX is a much better API, as it has a software fallback which DirectCompute doesn't. The latter is basically a general-purpose API with some calls that make the job of coding things like collisions easier, and one has to write their own physics engine in any case. The simulation bits can be done in more detail by offloading to DirectCompute, but one still has to write a lot of boilerplate code, which, to be honest, no game developer likes doing.

On the other hand, PhysX was written from the ground up to do only one thing: run physics simulations irrespective of hardware, and one can scale the simulations depending on what hardware exists on the host system. If a game uses the PhysX API and you don't have an NV card, it still uses a software fallback that runs the simulations at lower detail. You are *still using* the very same code path. So as far as ease of use is concerned, if I were a developer, I'd choose PhysX blindly, without blinking.

The question of why it doesn't run on an ATI card is ATI's problem, not NVIDIA's. The PhysX spec is public, and if they want, ATI can use the same OpenGL ICD model and create a PhysX driver for their cards.
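The "same code path, lower detail" behaviour described above boils down to scaling the simulation budget by backend. A rough sketch (the particle counts and all names here are invented for illustration, not taken from any real SDK):

```python
# Sketch of detail scaling: the game always calls simulate() the same way;
# only the per-backend particle budget changes. Numbers are made up.

PARTICLE_BUDGET = {"gpu": 10_000, "cpu": 500}

def simulate(requested_particles, backend):
    """Run one frame of an effect, clamped to what the backend can afford."""
    active = min(requested_particles, PARTICLE_BUDGET[backend])
    # ... integrate `active` particles here ...
    return active

# Same call site, different level of detail:
on_gpu = simulate(8_000, "gpu")   # all 8000 particles simulated
on_cpu = simulate(8_000, "cpu")   # clamped to 500, but the game still runs
```

This is why the software fallback matters so much to developers: the game ships one physics code path and degrades gracefully instead of requiring specific hardware.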
 
Raghunandan said:
NVIDIA PhysX will not run on ATI GPUs even if the check were not present, because it hasn't been written in Stream (ATI's equivalent of CUDA). You might have misunderstood what's happening.
Yes, makes sense.

Raghunandan said:
Here's the actual situation: when an ATI and an NVIDIA card are both present, the NVIDIA driver disables GPU-accelerated physics. The current drivers require that the NVIDIA GPU doing PhysX also be the one doing the graphics. Apparently, previous versions of the driver didn't have this requirement.

Yes, this is what the public is angry about. Sorry, I mixed the above up with this.
 
'The Witcher 2: Assassins of Kings' revealed by internal video leak
The upcoming Witcher 2 uses the Havok physics engine.
There are a huge number of games, both console and PC, that have made use of Havok physics: Titles using Havok. The list includes some awesome games like Uncharted 2, Far Cry 2, Dead Space, and Assassin's Creed II. Not everyone is going the nVidia PhysX way, which is a good thing, because being forced to buy an nVidia card just to experience physics in games is very lame :no: :no:. Havok is not limited by hardware the way nVidia's PhysX or DX11 cards are. IMHO it would be better if games used this rather than nVidia PhysX or DX11 physics.
 
Havok is not GPU-accelerated. I doubt it will come close to any GPU-accelerated engine (PhysX included) in terms of features, performance and scalability.
 