[osg-users] running OSG in SLI mode

Robert Osfield robert.osfield at gmail.com
Wed Nov 28 02:42:48 PST 2007


On Nov 28, 2007 1:44 AM, Loong Hin <yonglh at stee.stengg.com> wrote:
> I am running OSG on dual Nvidia 8800 GTS 320 MB graphics cards in SLI mode, but I notice there isn't any improvement in update rate whether I am running in SLI or not. I wonder whether OSG is able to take advantage of SLI? Thanks.

SLI is all down to the OpenGL driver doing stuff behind the OSG's back;
the OSG doesn't have any control over it, it's all controlled by driver
settings.  Whether SLI can be enabled depends upon the type of OpenGL
features the application is using; you'll need to read up on the NVidia
docs for the limitations.

As for potential performance benefits, it all depends upon what your
bottleneck is. If it's the CPU, then no amount of extra GPU power will
help; if it's a bandwidth/memory limitation, then SLI could well make it
worse; if it's vertex processing throughput, then it's again unlikely to
make any difference; if you are fill limited/fragment shader limited,
then you might just see an improvement with SLI.

Frustratingly, the OSG is well conditioned for
multi-thread/multi-context usage, so it can use multiple CPU cores and
multiple GPUs efficiently, more efficiently than SLI can in fact. But
unfortunately NVidia haven't published APIs for controlling the
compositing backend used by their SLI implementation; if they had, we
could tweak osgViewer to use the dual cards to output to one display,
and we'd see much better scaling of performance than SLI typically
provides.

Alas, NVidia is too used to trying to make games appear a couple of fps
faster by whatever hacks are possible, rather than actually wanting to
work with the developer community to solve the scalability problem.


More information about the osg-users mailing list