[osg-users] render to texture only delivers black texture

Steffen Kim mailto.stk at web.de
Wed Jul 9 22:59:48 PDT 2008


Hi everyone.

I have some issues with rendering to a texture.

I am writing a plug-in for an application.
This application has a main viewer window where it displays a scene.
My plug-in has its own window with its own viewer.

The new window needs a texture showing what the main window is displaying at that very moment.
Later on I need this texture for an interlacing API, but for now I just want to show it in the plug-in window to check whether render-to-texture works as it should.

That is why the plug-in window shows a screen-aligned quad that displays this texture. If I use an image as the quad's texture everything works as it should, so the plug-in window itself should be fine the way it is.
But if I use the texture that I render with a camera in the main application's scene, I just get a black texture.
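
For completeness, the quad setup in the plug-in window looks roughly like this (simplified sketch, the real code has a bit more to it; "texture" is the render target I attach further below):

	osg::Geometry* quad = osg::createTexturedQuadGeometry(
		osg::Vec3(0.0f, 0.0f, 0.0f),   // lower left corner
		osg::Vec3(1.0f, 0.0f, 0.0f),   // width vector
		osg::Vec3(0.0f, 1.0f, 0.0f));  // height vector

	osg::Geode* geode = new osg::Geode();
	geode->addDrawable(quad);
	geode->getOrCreateStateSet()->setTextureAttributeAndModes(
		0, texture.get(), osg::StateAttribute::ON);
	geode->getOrCreateStateSet()->setMode(GL_LIGHTING, osg::StateAttribute::OFF);

	// shown screen-aligned via an orthographic camera in the plug-in viewer
	osg::Camera* hudCamera = new osg::Camera();
	hudCamera->setReferenceFrame(osg::Transform::ABSOLUTE_RF);
	hudCamera->setProjectionMatrix(osg::Matrix::ortho2D(0.0, 1.0, 0.0, 1.0));
	hudCamera->setViewMatrix(osg::Matrix::identity());
	hudCamera->setClearMask(GL_DEPTH_BUFFER_BIT);
	hudCamera->addChild(geode);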

I create the following camera using information from the main application's camera, which I get via an interface named "_billInterface", and attach the texture:

	osg::Camera* camera = new osg::CameraNode();
	
	camera->setCullingActive(false);
	camera->setClearColor(osg::Vec4(1.0f,1.0f,1.0f,1.0f));
	camera->setClearMask(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

	// Set camera properties according to the main camera of the main application
	osg::Camera* billCamera = _billInterface->getCamera();
	camera->setViewport(0,0,_billInterface->getGraphicsContext()->getTraits()->width,_billInterface->getGraphicsContext()->getTraits()->height);
	camera->setProjectionMatrix(billCamera->getProjectionMatrix());
	camera->setViewMatrix(billCamera->getViewMatrix());
	camera->setReferenceFrame(billCamera->getReferenceFrame());
	camera->setStateSet(billCamera->getStateSet());
	
	camera->setRenderOrder(osg::CameraNode::PRE_RENDER,1);
	camera->setRenderTargetImplementation(osg::CameraNode::FRAME_BUFFER_OBJECT);
	camera->attach(osg::CameraNode::COLOR_BUFFER, texture.get());
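
The texture I attach here was created beforehand, roughly like this (sketch; the size simply matches the main window, and the exact format/filter settings may differ in my real code):

	osg::ref_ptr<osg::Texture2D> texture = new osg::Texture2D();
	texture->setTextureSize(
		_billInterface->getGraphicsContext()->getTraits()->width,
		_billInterface->getGraphicsContext()->getTraits()->height);
	texture->setInternalFormat(GL_RGBA);
	texture->setFilter(osg::Texture2D::MIN_FILTER, osg::Texture2D::LINEAR);
	texture->setFilter(osg::Texture2D::MAG_FILTER, osg::Texture2D::LINEAR);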


Then I add the main application's scene to the camera so that it renders this scene.

	camera->addChild(_billInterface->getScene());

Now I take this camera and put it into the scene graph of the main application:

	_billInterface->getRootNode()->getParent(0)->addChild(camera);

(That way it sits at the same depth as the rootNode of the scene and the node containing the HUD. I put it there so that the camera's output is rendered into the texture every time the main application's viewer is updated.)
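
If I understand the graph correctly, the structure now looks like this (sketch):

	rootNode's parent
	 |-- rootNode (main scene)
	 |-- node containing the HUD
	 |-- my render-to-texture camera
	      |-- _billInterface->getScene()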


I know that the texture does get attached this way, because the quad that holds it is normally red, and after I create the camera and attach the texture it turns black.

I guess that either some camera properties are wrong or I have to put the camera somewhere else.
Can someone give me a hint on what I might be doing wrong? What are the main things I have to take care of when doing render-to-texture?

Is the texture automatically updated every time the scene containing my render-to-texture camera is rendered?


Thanks in advance for any help,
Steffen