
I wrote a 2D fluid solver in OpenGL (code here) some time back.

While it runs flawlessly on my onboard Intel GPU:

[image: when running on Intel GPU]

the simulation very quickly "blows up" when the same code is run on an NVIDIA card:

[image: a mere half second into the program]

In the second picture the fluid is still being "added" to the system and "diffusing away", but unlike in the first picture there is no advection.

I would like to know what might be causing this. Could it be that different vendors interpret the standard differently?

PS: The red and green colors represent the magnitude of the vector field in the x and y directions, respectively.

asked by nilspin
  • There can definitely be differences between IHV implementations. It could be due to a driver bug, or different interpretations of ambiguities in the standard, or even possibly differences in how the compilers treat floating-point arithmetic, etc. Would need some more detailed debugging to understand what's going on. – Nathan Reed Aug 21 '15 at 20:20
  • @NathanReed, what kind of information could be helpful? – nilspin Aug 22 '15 at 00:20
  • Start by debugging it like any other graphics/shader problem. Isolate each pass and see in which pass the error is being introduced, then isolate where in that shader something is going wrong. – Nathan Reed Aug 22 '15 at 00:47 [a per-pass readback sketch follows the comment list]
  • This might be a vsync issue if you're using a variable timestep per frame. – Mokosha Aug 28 '15 at 04:09 [a timestep-clamping sketch follows the comment list]
  • Are you using any vendor-dependent GLSL functions like noise* (which, as far as I know, most vendors don't implement anyway)? – Sam Sep 08 '15 at 22:42
  • @Sam no. I am only using texture(). That shouldn't cause a problem, right? – nilspin Sep 09 '15 at 10:00
  • @nilspin I don't think so. They're both running on the same OpenGL version, right? – Sam Sep 09 '15 at 23:25
  • @Sam yes they are. – nilspin Sep 10 '15 at 04:36
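Regarding Nathan Reed's suggestion to isolate each pass: a minimal readback helper along these lines can show which pass first diverges between the two GPUs. This is only a sketch, assuming a desktop GL 3.x context and that the velocity field lives in a GL_RG32F texture (the red/green channels mentioned in the PS); dumpPass, velocityTex, and the advect/diffuse/project calls are hypothetical names, not the poster's code.

```cpp
// Sketch: read back a floating-point texture after a simulation pass and
// report its min/max and any non-finite values. Assumes the pass wrote to a
// GL_RG32F texture (velocity in the red/green channels).
#include <GL/glew.h>
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

void dumpPass(const char* passName, GLuint tex, int width, int height)
{
    std::vector<float> pixels(static_cast<size_t>(width) * height * 2); // RG pairs
    glBindTexture(GL_TEXTURE_2D, tex);
    glGetTexImage(GL_TEXTURE_2D, 0, GL_RG, GL_FLOAT, pixels.data());

    float minV =  1e30f, maxV = -1e30f;
    size_t nonFinite = 0;
    for (float v : pixels) {
        if (!std::isfinite(v)) { ++nonFinite; continue; }
        minV = std::min(minV, v);
        maxV = std::max(maxV, v);
    }
    std::printf("%-10s min=%g max=%g non-finite=%zu\n",
                passName, minV, maxV, nonFinite);
}

// Hypothetical usage inside the simulation loop:
//   advect(...);   dumpPass("advect",  velocityTex, N, N);
//   diffuse(...);  dumpPass("diffuse", velocityTex, N, N);
//   project(...);  dumpPass("project", velocityTex, N, N);
// Running this on both cards shows which pass first produces NaN/Inf or
// wildly different value ranges.
```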
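Regarding Mokosha's vsync comment: if the solver feeds the raw per-frame wall-clock time into the integration as dt, then a card running without vsync (or at a very different frame rate) integrates with very different timesteps, and a large dt can push an explicit scheme past its stability limit. A minimal sketch of clamping to a fixed timestep, with simulate() as a hypothetical stand-in for the solver's per-step update:

```cpp
// Sketch: decouple the simulation timestep from the rendering frame rate so
// that both GPUs integrate the fluid with the same dt.
#include <algorithm>

void simulate(double dt);   // hypothetical: advances the fluid by dt seconds

void runFrame(double frameSeconds)   // wall-clock duration of the last frame
{
    static double accumulator = 0.0;
    const double fixedDt = 1.0 / 60.0;            // fixed 60 Hz integration step
    accumulator += std::min(frameSeconds, 0.25);  // ignore huge spikes (e.g. a stall)

    while (accumulator >= fixedDt) {
        simulate(fixedDt);
        accumulator -= fixedDt;
    }
}
```

If the blow-up disappears with a fixed dt, the difference was the timestep rather than the GPU itself.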

0 Answers