Hello,

I am currently trying to get dual-screen output working: I want my application to be displayed on two framebuffer devices simultaneously.

What I have so far are two initializations of OpenGL ES (one for each screen/framebuffer) that I can switch between via eglMakeCurrent().

Both work fine separately when selected before the call to EwInitGraphicsEngine().

How can I initialize EwInitGraphicsEngine() so that it operates on both framebuffers simultaneously?

Help would be greatly appreciated.

Thank you,

Moritz Wagner

1 Answer


Displaying two framebuffers simultaneously is a very interesting application!

First of all, there is always only one Graphics Engine within a GUI application - as a result, only one call to EwInitGraphicsEngine() is necessary.

Since you already have both framebuffers initialized via EGL, you need to create two viewports (one for each framebuffer) and to update the content of the current GUI application into both viewports.
The prerequisite is that your EGL is configured so that both framebuffers are visible and accessible for drawing operations via OpenGL ES 2.0.

First, we need some variables to access both destinations:

static EGLDisplay EglDisplay1  = 0;
static EGLSurface EglSurface1  = 0;
static XViewport* Viewport1    = 0;
static GLint      Framebuffer1 = 0xFFFFFFFF;

static EGLDisplay EglDisplay2  = 0;
static EGLSurface EglSurface2  = 0;
static XViewport* Viewport2    = 0;
static GLint      Framebuffer2 = 0xFFFFFFFF;

static EGLContext EglContext = 0; 

The initialization should look like this:

int main( int argc, char** argv )
{
  ...

  /* initialize EGL to get access to display 1 and display 2 */
  EglInit( &EglDisplay1, &EglSurface1, &EglDisplay2, &EglSurface2, &EglContext );

  /* get the framebuffer 1 and its size in pixel */
  glGetIntegerv( GL_FRAMEBUFFER_BINDING, &Framebuffer1 );
  ...
  
  /* get the framebuffer 2 and its size in pixel */
  glGetIntegerv( GL_FRAMEBUFFER_BINDING, &Framebuffer2 );
  ...
  
  /* initialize the Graphics Engine as always */
  EwInitGraphicsEngine( 0 );

  /* create the applications root object and initialize it */
  rootObject = (CoreRoot)EwNewObjectIndirect( EwApplicationClass, 0 );
  EwLockObject( rootObject );
  CoreRoot__Initialize( rootObject, EwScreenSize );

  /* create viewport 1 object to provide uniform access to the framebuffer 1 */
  Viewport1 = EwInitViewport( EwScreenSize, EwNewRect( 0, 0, w, h ), 0, 255,
    &Framebuffer1, EglDisplay1, EglSurface1, ViewportProc );

  /* create viewport 2 object to provide uniform access to the framebuffer 2 */
  Viewport2 = EwInitViewport( EwScreenSize, EwNewRect( 0, 0, w, h ), 0, 255,
    &Framebuffer2, EglDisplay2, EglSurface2, ViewportProc );

  ...

In principle, the initialization of EGL is doubled in order to get a double set of EglDisplay and EglSurface - please take care that only one EglContext is created. The two EwInitViewport() calls create two independent drawing destinations (viewports).
The rest of the main loop is unchanged and the GUI application operates as always. However, the Update() method has to be enhanced in order to draw into both destinations:

static void Update( CoreRoot aApplication )
{
  XBitmap*       bitmap;
  GraphicsCanvas canvas = EwNewObject( GraphicsCanvas, 0 );
  XRect          updateRect = {{ 0, 0 }, { 0, 0 }};

  if ( !canvas )
    return;

  /* ensure that framebuffer 1 is current drawing destination for OpenGL */
  eglMakeCurrent( EglDisplay1, EglSurface1, EglSurface1, EglContext ); 

  /* begin update with viewport 1 */
  bitmap = EwBeginUpdate( Viewport1 );
  if ( !bitmap )
    return;

  /* redraw the dirty area of the screen */
  GraphicsCanvas__AttachBitmap( canvas, (XUInt32)bitmap );
  updateRect = CoreRoot__UpdateGE20( aApplication, canvas );
  GraphicsCanvas__DetachBitmap( canvas );

  /* complete the update of viewport1 */
  EwEndUpdate( Viewport1, updateRect );
    
  /* invalidate the previously drawn area again */
  CoreRoot__InvalidateArea( aApplication, updateRect ); 

  /* ensure that framebuffer 2 is current drawing destination for OpenGL */
  eglMakeCurrent( EglDisplay2, EglSurface2, EglSurface2, EglContext ); 

  /* begin update with viewport 2 */
  bitmap = EwBeginUpdate( Viewport2 );
  if ( !bitmap )
    return;

  /* redraw the dirty area of the screen */
  GraphicsCanvas__AttachBitmap( canvas, (XUInt32)bitmap );
  updateRect = CoreRoot__UpdateGE20( aApplication, canvas );
  GraphicsCanvas__DetachBitmap( canvas );

  /* complete the update of viewport2 */
  EwEndUpdate( Viewport2, updateRect );
}

The idea of this Update() function is: the update of the first framebuffer is done "as always", and the returned dirty area is used to invalidate that area again, forcing a second update. This second update is drawn into the second destination.

Please note that the above code is not compiled or tested - some minor adaptations may be necessary, depending on your particular EGL configuration.
In principle "it should work" - please let us know your results!


Hi! Sorry to dig out this post, but I'm trying to modify my Embedded Wizard main application to get it running on two screens (two framebuffers) at the same time.

I have tested both screens (HDMI and LVDS) and everything seems to be fine on the low-level side. My HDMI display is bound to /dev/fb0 and my LVDS display to /dev/fb1.

But I don't understand how/where to define the two framebuffers in the main.c code.

So far I have tried to follow the previous sample code without success. My main concern is how to initialize OpenGL ES correctly to work with two framebuffers and display the same application on both screens/framebuffers.

Things get a bit more confusing because the main display of my application seems to be inherited from Wayland. This is the function used to connect to the Wayland display:

static struct wl_display *    Display     = NULL;

...

static void ConnectDisplay( void )
{
  struct wl_registry * registry = NULL;

  Display = wl_display_connect( NULL );
  if ( Display == NULL )
  {
    fprintf( stderr, "Can't connect to display\n" );
    exit( 1 );
  }

  registry = wl_display_get_registry( Display );
  wl_registry_add_listener( registry, &RegistryListener, NULL );

  wl_display_dispatch( Display );
  wl_display_roundtrip( Display );

  if ( Compositor == NULL || Shell == NULL )
  {
    fprintf( stderr, "Can't find compositor or shell\n" );
    exit( 1 );
  }
  else
  {
    fprintf( stderr, "Found compositor and shell\n" );
  }
}

...

And then in the initialization I do as follows:

static EGLDisplay    eglDisplay1 = 0;
static EGLSurface    eglSurface1 = 0;
static EGLDisplay    eglDisplay2 = 0;
static EGLSurface    eglSurface2 = 0;
static XViewport*    viewport1   = 0;
static XViewport*    viewport2   = 0;
static EGLContext    eglContext  = 0; 

...


static int EglInit( struct wl_surface* aWlSurface )
{
  struct wl_egl_window* eglWindow;

  EGLConfig   eglConfig1 = 0;
  EGLConfig   eglConfig2 = 0;

  EGLint      contextAttribs[] = { EGL_CONTEXT_CLIENT_VERSION, 2, EGL_NONE };
  EGLint      major, minor, num_configs;
  EGLint      configAttribs[] =
  {
    EGL_SURFACE_TYPE,      EGL_WINDOW_BIT,
    EGL_RENDERABLE_TYPE,   EGL_OPENGL_ES2_BIT,
    EGL_BUFFER_SIZE,       32,
    EGL_COLOR_BUFFER_TYPE, EGL_RGB_BUFFER,
    EGL_RED_SIZE,          8,
    EGL_GREEN_SIZE,        8,
    EGL_BLUE_SIZE,         8,
    EGL_ALPHA_SIZE,        8,
    EGL_NONE
  };

  eglDisplay1 = eglGetDisplay( (EGLNativeDisplayType)Display );
  eglInitialize( eglDisplay1, &major, &minor );
  eglBindAPI( EGL_OPENGL_ES_API );
  eglChooseConfig( eglDisplay1, configAttribs, &eglConfig1, 1, &num_configs );

  if ( num_configs != 1 )
  {
    printf( "No of configs: %d\n", num_configs );
    return 0;
  }

  eglDisplay2 = eglGetDisplay( (EGLNativeDisplayType)Display );
  eglInitialize( eglDisplay2, &major, &minor );
  eglBindAPI( EGL_OPENGL_ES_API );
  eglChooseConfig( eglDisplay2, configAttribs, &eglConfig2, 1, &num_configs );

  if ( num_configs != 1 )
  {
    printf( "No of configs: %d\n", num_configs );
    return 0;
  }

  // 1920x780
  // 1280x480
  // 640x480

  eglWindow  = wl_egl_window_create( aWlSurface, APP_WIDTH, APP_HEIGHT );

  eglSurface1 = eglCreateWindowSurface( eglDisplay1, eglConfig1, eglWindow, 0 );
  eglContext = eglCreateContext( eglDisplay1, eglConfig1, 0, contextAttribs );

  eglSurface2 = eglCreateWindowSurface( eglDisplay2, eglConfig2, eglWindow, 0 );

  #if EW_PERFORM_FULLSCREEN_UPDATE == 0
    eglSurfaceAttrib( eglDisplay1, eglSurface1, EGL_SWAP_BEHAVIOR,
                      EGL_BUFFER_PRESERVED );
  #else
    eglSurfaceAttrib( eglDisplay1, eglSurface1, EGL_SWAP_BEHAVIOR,
                      EGL_BUFFER_DESTROYED );
  #endif

  #if EW_PERFORM_FULLSCREEN_UPDATE == 0
    eglSurfaceAttrib( eglDisplay2, eglSurface2, EGL_SWAP_BEHAVIOR,
                      EGL_BUFFER_PRESERVED );
  #else
    eglSurfaceAttrib( eglDisplay2, eglSurface2, EGL_SWAP_BEHAVIOR,
                      EGL_BUFFER_DESTROYED );
  #endif

  return 1;
}

I'm not very sure if this is correct.

Then, when it's time to create the two viewports, I'm not sure that the framebuffer objects are correctly initialized.

Any help would be really appreciated, as I'm pretty new to EW and OpenGL ES.

Thank you in advance !

Mathias

Hi Mathias,

I think the difference from the original post is that in your case the initialization of EGL to operate with two framebuffers is not successful. This is the prerequisite before adapting the Embedded Wizard main loop to operate with two viewports.

Unfortunately, EGL is very platform-specific - so the only way forward is to check every EGL function call for error codes.

Best regards,

Manfred.
