[osg-users] Rendering to multiple graphics cards

Wojciech Lewandowski lewandowski at ai.com.pl
Tue Jul 1 03:38:58 PDT 2008


Hi Bob and Robert,

We managed to get our system to run stably. Intel Core 2 Duo E6600 + 2 x 
GeForce 8800 GTS 320 MB + 3 monitors. The earlier crashes were indeed power 
related. It turned out we had not connected the additional 12V/5V connector to 
the motherboard, which is needed when both 16x PCIe lanes are used.

We also observed performance issues. With our demo code we got around 20 
fps. It's not much, but I would not consider it a total disaster either. This 
was only a test for us. In the target solution we could probably use much better 
graphics cards and a better CPU, plus we may try some additional 
optimizations if needed.

So in the end I consider this a complete success. OSG works on Vista in 
multithreaded / multi-graphics-card / multi-monitor configurations. Thanks for the 
hints and all the support.

Cheers,
Wojtek


> I can draw successfully with slave cameras to all 4 screens in various 
> configs in Vista: either osgviewer across all screens, 4 independent 
> screens (i.e. screens 0-3, each set @ 1440x900), or 2 double-wide screens 
> (screens 0 & 2, each set to 2880x900). Note that in the Vista display 
> settings all 4 screens are set up as 1 big "extended desktop".
>
> But I'm having TROUBLE MAINTAINING 60HZ, even though my rendered scene 
> consists solely of a textured cube.
>
> For example, with osgviewer.exe (ver 2.4) spread across all 4 screens (4 
> graphics contexts):
> Threading model: CullDrawThreadPerContext
> Event, Update, & all 4 Cull/Draw/GPU values are small (0.xx)
> The scene achieves 60Hz at times, but feels choppy, and I notice the 
> middle two GPUs at times show long orange bars, which are killing the 60Hz 
> rate.
> All 4 CPUs are running @ approx 52% (osgviewer.exe @ 30% and dwm.exe @ 
> 20%)
> So I also "disabled desktop composition" for the osgviewer.exe task, which 
> sent dwm.exe (the desktop window manager) down to 0% (disabled). That made 
> the rotating cube less choppy, but now it can only achieve 45Hz???
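
For anyone reading along: a minimal sketch of the configuration Bob describes 
above, using a stock osgViewer::Viewer (my own sketch, not Bob's code; the 
scene file name is just a placeholder for his textured cube). With no windows 
set up explicitly, realize() creates one graphics context and slave camera per 
screen of the extended desktop, and the threading model is forced to the one 
he reports:

    #include <osgDB/ReadFile>
    #include <osgViewer/Viewer>

    int main(int, char**)
    {
        osgViewer::Viewer viewer;

        // Placeholder scene; Bob's test scene is a single textured cube.
        viewer.setSceneData(osgDB::readNodeFile("cube.osg"));

        // One cull+draw thread per graphics context (i.e. per screen/card).
        viewer.setThreadingModel(osgViewer::ViewerBase::CullDrawThreadPerContext);

        // No windows were configured, so realize() (called by run()) sets up
        // a view across all screens: one context and slave camera per monitor.
        return viewer.run();
    }
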
>
> I am experiencing similar results with my own osg2.4-based fourDscape 
> task, with 4 cameras rendered to screens 0-3 (each @ 1440x900), with the 
> same very small Cull/Draw/GPU numbers (all 4 CPUs @ 22%), but it can only 
> achieve 47Hz.
>
> Any suggestions as to where to look for who/what in Vista is stealing 
> resources and how to make it stop? With a single screen, 60Hz is no 
> problem.
>
> thanks.
>
> Bob.
>
> ---------------------------------------------------------------------------------------------
> Robert Osfield wrote:
>> Hi Wojtek,
>>
>> As a hardware sanity check you could just dual-boot the machine with
>> Linux. This itself will bring its own learning curve, but at least
>> you'll have a setup that others have known to work just fine.
>>
>> Robert.
>>
>> On Mon, Jun 30, 2008 at 3:24 PM, Wojciech Lewandowski
>> <lewandowski at ai.com.pl> wrote:
>>
>>> Hi Everyone,
>>>
>>> I have done some OSG testing with Vista 64-bit + 2 x GeForce 8800 GTS 
>>> 320 MB + 3 monitors. I have attached only 3 monitors, but I am sure 4 
>>> would work as well.
>>>
>>> Unfortunately the results are mixed. I was able to start osgviewer and our 
>>> demo app on 3 monitors. I got some visuals, but after a few frames the 
>>> driver died, and after a few moments Vista popped up a message box saying 
>>> that the graphics driver had failed but was now OK (or something like 
>>> that). My power supply may not be decent enough to provide stable current, 
>>> so this crash could be power related. I will try to arrange a heavy-duty 
>>> power supply and redo the tests later this week. Anyway, at this stage the 
>>> results look more promising than with XP and two different GeForces.
>>>
>>> Cheers,
>>> Wojtek
>>>
>>>
>>>
>>>
>>> ----- Original Message -----
>>> From: Wojciech Lewandowski
>>> To: OpenSceneGraph Users
>>> Sent: Monday, June 30, 2008 10:34 AM
>>> Subject: Re: [osg-users] Rendering to multiple graphics cards
>>> Thanks, Bob.
>>> This is a relief ;-).  I don't have access to a similar Vista setup
>>> yet. I will try to grab two boards and do the checks this week. I will
>>> post a message with the results when I am done.
>>>
>>> Thanks again.
>>> Wojtek
>>>
>>>
>>> ----- Original Message -----
>>> From: Bob Balfour
>>> To: OpenSceneGraph Users
>>> Sent: Sunday, June 29, 2008 2:32 AM
>>> Subject: Re: [osg-users] Rendering to multiple graphics cards
>>> I have two Nvidia 8800 cards on a Windows Vista platform (HP), and setting
>>> traits->screenNum to 0 or 1 as Robert indicated does send the rendering to
>>> the appropriate graphics card.  I used code very similar to the osgcamera
>>> example's multipleWindowMultipleCameras method.
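
As a concrete illustration of that approach, here is a minimal sketch along 
the lines of the osgcamera example (my own paraphrase, not Bob's actual code; 
the 1440x900 size, the helper name and the scene file are assumptions): each 
card/monitor gets its own graphics context via traits->screenNum, and each 
context is added as a slave camera showing one horizontal slice of the master 
view.

    #include <osg/GraphicsContext>
    #include <osgDB/ReadFile>
    #include <osgViewer/Viewer>

    // Open a full-screen context on the given screen and attach it to the
    // viewer as a slave camera rendering one slice of the overall view.
    void addSlaveOnScreen(osgViewer::Viewer& viewer,
                          unsigned int screenNum, unsigned int numScreens)
    {
        osg::ref_ptr<osg::GraphicsContext::Traits> traits =
            new osg::GraphicsContext::Traits;
        traits->screenNum = screenNum;       // selects the card/monitor
        traits->x = 0;  traits->y = 0;
        traits->width = 1440;  traits->height = 900;
        traits->windowDecoration = false;
        traits->doubleBuffer = true;

        osg::ref_ptr<osg::GraphicsContext> gc =
            osg::GraphicsContext::createGraphicsContext(traits.get());

        osg::ref_ptr<osg::Camera> camera = new osg::Camera;
        camera->setGraphicsContext(gc.get());
        camera->setViewport(new osg::Viewport(0, 0, traits->width, traits->height));
        camera->setDrawBuffer(GL_BACK);
        camera->setReadBuffer(GL_BACK);

        // Scale and shift the projection so each screen shows its own
        // horizontal slice of the master camera's view, as in osgcamera.
        double translate_x = double(numScreens) - 1.0 - 2.0 * double(screenNum);
        viewer.addSlave(camera.get(),
                        osg::Matrixd::scale(double(numScreens), 1.0, 1.0) *
                        osg::Matrixd::translate(translate_x, 0.0, 0.0),
                        osg::Matrixd());
    }

    int main(int, char**)
    {
        osgViewer::Viewer viewer;
        viewer.setSceneData(osgDB::readNodeFile("cow.osg"));   // placeholder scene
        for (unsigned int i = 0; i < 2; ++i)                    // one slave per card
            addSlaveOnScreen(viewer, i, 2);
        return viewer.run();
    }
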
>>>
>>> Does anyone have experience setting up a 4-monitor (each 1440x900) system 
>>> on Windows Vista? I have 2 osg cameras/contexts rendering the left half 
>>> and right half of the scene.  With 2 Nvidia graphics cards both in 
>>> dual-head mode, if I render each camera to a 2880x900 viewport I'm hoping 
>>> that the scene will be spread properly over the 4 monitors (will try it 
>>> shortly once I round up four HD monitors), without configuring the 
>>> "extend the desktop" option in Windows, which seems to take considerable 
>>> resources in the dwm.exe task (Desktop Window Manager) and appears to 
>>> have an OSG rendering performance impact.  Anyone have similar experience 
>>> here, or other configuration suggestions?
>>>
>>> Thanks.
>>>
>>> Bob.
>>>
>>> --------------------------------------------------------------------------------------
>>> Wojciech Lewandowski wrote:
>>>
>>> Hi Bob,
>>>
>>> Seems like you asked a question related to mine. I would be very 
>>> interested in your results. I tried to run osgviewer on two graphics 
>>> boards without success, but I was on XP and the boards were different. I 
>>> have not given up completely, hoping that Vista or identical cards may 
>>> work. Then I read your post and saw that you have exactly such a setup. 
>>> Does osgviewer output to monitors attached to both cards?
>>>
>>> Cheers,
>>> Wojtek
>>>
>>> Hi Bob,
>>>
>>> Setting screenNum to 0 or 1 should select the appropriate card.
>>> When you run osgviewer on your system it should open two
>>> windows/two slave cameras automatically; press 'f' to toggle off full
>>> screen to see the actual windows.
>>>
>>> Robert.
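
A minimal illustration of the above (my sketch, not Robert's code): the 
convenience call below fills in GraphicsContext::Traits::screenNum internally, 
which is what routes the rendering to a particular screen/card. Which physical 
card a given screen number maps to depends on the driver's screen ordering.

    #include <osgDB/ReadFile>
    #include <osgViewer/Viewer>

    int main(int, char**)
    {
        osgViewer::Viewer viewer;
        viewer.setSceneData(osgDB::readNodeFile("cow.osg"));   // placeholder scene

        // Put the whole view on screen 1 only (screen 0 would be the other).
        viewer.setUpViewOnSingleScreen(1);

        return viewer.run();
    }
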
>>>
>>> On Tue, Jun 24, 2008 at 8:56 PM, Bob Balfour <bob at bal4.com> wrote:
>>>
>>> I also have an HP Windows (Vista) box with 2 Nvidia graphics cards
>>> (independent, not SLI-configured). In order to configure two osg::Cameras,
>>> each one rendering to its own specific Nvidia card, how do you specify a
>>> camera's graphics context for a specific graphics card?  Is it simply a
>>> matter of specifying the appropriate traits->screenNum, or does it take
>>> more than that?  Has anyone done this in Windows Vista?
>>>
>>> Thanks.
>>>
>>> Bob.
>>> --
>>>
>>> Robert E. Balfour, Ph.D.
>>> Exec. V.P. & CTO,  BALFOUR Technologies LLC
>>> 960 So. Broadway, Suite 108, Hicksville NY 11801
>>> Phone: (516)513-0030  Fax: (516)513-0027  email: bob at BAL4.com
>>> "Solutions in four dimensions" with fourDscape(R)
>>>
>
> _______________________________________________
> osg-users mailing list
> osg-users at lists.openscenegraph.org
> http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org 



