How to implement a dual-display setup with embedded Linux and Qt

A modern cabin display concept for an off-highway vehicle often contains several displays that are integrated into the cockpit. One display can serve as an instrument cluster, another is designed to be the central configuration and visualization screen, and a third may even be placed further up to show the live video of a digital IP camera, acting as a digital mirror or surveillance camera. In such a situation, the question arises of how to design the software and hardware architecture for a multi-display cabin concept. There are the following possibilities:

  • (1) Each display has its own CPU/GPU
  • (2) One CPU/GPU renders the pixels for multiple displays

In case of option (1), you need to implement separate processes that can run independently on their respective hardware. You do not need to worry about window managers, as you will probably just use EGLFS to run each application in fullscreen mode on its own CPU. This setup brings the advantage of a rather simple hardware design: all displays probably have the same size, so the reuse of hardware components can be maximized. Especially for low- to mid-volume projects, a homogeneous hardware architecture brings lower hardware costs and better serviceability.

In case of option (2), you need to decide how to configure the embedded Linux system in order to render pixels on multiple screens. In the context of using Qt to implement the UI software, there are the following possibilities:

  • (2.1) One application and one QML window that stretches over multiple displays
  • (2.2) One application, but several QML window instances
  • (2.3) Two applications that run independently, one on each screen

Option (2.1) has the problem that only one main event loop is used to render all the pixels. This means that a single component on either screen that takes longer to render causes both screens to stutter. Additionally, layout components and positioning are not really designed for such a setup, which results in rather clumsy layout code.

Option (2.2) has the challenge that you somehow have to define the position of the window instances on the display setup. For this, you need a way to interact with the window manager (probably Wayland/Weston). You can do this with the Qt Wayland Compositor, but that means adding another complex Qt module to your software solution, and you will probably need a commercial Qt license to use it (or you choose the GPL version). In addition, the drawback of a single main event loop is also present in this option; a minimal sketch of the one-process, multi-window approach follows below.
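
To illustrate option (2.2), here is a minimal sketch of a single application that creates one QML window per screen. The QML file names Screen1.qml and Screen2.qml are placeholders, and the sketch assumes that the platform plugin exposes both displays as separate QScreen objects; on Wayland the compositor ultimately decides the actual placement, which is exactly why the interaction described above is needed.

#include <QGuiApplication>
#include <QQuickView>
#include <QScreen>
#include <QUrl>

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);

    // One QQuickView per physical display, but still only one process
    // and therefore only one main event loop for both windows.
    const QList<QScreen *> screens = app.screens();

    QQuickView view1;
    view1.setSource(QUrl(QStringLiteral("qrc:/Screen1.qml"))); // placeholder QML file
    view1.setScreen(screens.at(0));
    view1.showFullScreen();

    QQuickView view2;
    view2.setSource(QUrl(QStringLiteral("qrc:/Screen2.qml"))); // placeholder QML file
    if (screens.size() > 1)
        view2.setScreen(screens.at(1));
    view2.showFullScreen();

    return app.exec();
}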

Option (2.3) also requires you to define the position of the window instances on the display setup. But since you have separate UI Linux processes, you can make use of the configuration files your window manager offers. As with a conventional multi-window desktop environment, you can define which Linux process should be rendered on which display. The challenge of this option is that you need to build a modular software architecture that allows inter-process communication via an embedded Linux middleware / business-logic layer. As we have had good experience with message brokers (such as MQTT) in this role, this could be handled very well by our software team; a rough sketch of such a broker link follows below.
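
As a rough illustration of such a message-broker link, the following sketch lets one UI process subscribe to values that another process or the business logic publishes. It uses the Qt MQTT module (available under GPL or a commercial license); the broker address, port, and the topic vehicle/speed are assumptions made for this sketch.

#include <QGuiApplication>
#include <QtMqtt/QMqttClient>
#include <QtMqtt/QMqttSubscription>
#include <QtMqtt/QMqttMessage>
#include <QtMqtt/QMqttTopicFilter>
#include <QDebug>

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);

    // Hypothetical broker running on the local middleware layer.
    QMqttClient client;
    client.setHostname(QStringLiteral("127.0.0.1"));
    client.setPort(1883);

    QObject::connect(&client, &QMqttClient::connected, [&client]() {
        // Example topic -- the real topic layout depends on your middleware design.
        auto *subscription = client.subscribe(QMqttTopicFilter(QStringLiteral("vehicle/speed")));
        QObject::connect(subscription, &QMqttSubscription::messageReceived,
                         [](const QMqttMessage &message) {
            qDebug() << "speed update:" << message.payload();
        });
    });

    client.connectToHost();
    return app.exec();
}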

Now let us go through one concrete example: the Toradex Verdin iMX8M Plus Quad offers the possibility to drive two displays with its two HDMI interfaces (1x native, 1x via the Verdin DSI to HDMI adapter). Using the kiosk-shell of the Wayland compositor Weston makes it possible to assign applications to a specific display. This is configured in weston.ini [https://man.archlinux.org/man/weston.ini.5]. If an application with a specified app-id is started, it is opened on the assigned display. In the simplest case, the app-id is simply the name of the executable file. Alternatively, it would also be possible to virtually increase the window size within the application so that it fills both displays; however, this method only works in Weston's default desktop shell.
A simple example of defining app-ids would look like this:


[core]
shell=kiosk-shell.so

[output]
name=HDMI-A-1
mode=1920x1080
app-ids=VerdinDualDisplayApplication1

[output]
name=HDMI-A-2
mode=1920x1080
app-ids=VerdinDualDisplayApplication2
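
For these app-ids to take effect, each Qt application must expose a matching Wayland app-id. By default this is derived from the executable name; depending on the Qt version it can also be set explicitly via the desktop file name, as in the following hedged sketch (the name VerdinDualDisplayApplication1 is taken from the configuration above):

#include <QGuiApplication>

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);
    // The Qt Wayland client derives the xdg-shell app-id from the desktop
    // file name (and falls back to the executable name if none is set).
    QGuiApplication::setDesktopFileName(QStringLiteral("VerdinDualDisplayApplication1"));
    // ... create and show the QML window here ...
    return app.exec();
}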

To achieve the most fluid and responsive user experience, it is recommended to actually launch two different applications. Each application then runs in its own process with its own event loop. This means that if, for example, one application performs a complex calculation, the other application still runs smoothly and does not stall. If you instead open two windows from one application, they influence each other, because only one process is running in the background.

In order to calibrate the touchscreens correctly and assign them to the correct screen, a custom udev rule must be defined. By means of the DEVPATH of the respective touchscreen, a USB input device can be specifically assigned to a display.
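
A hedged sketch of such a rule could look like the following. The DEVPATH patterns are placeholders that have to be replaced with the values reported by udevadm info for the respective touch controllers; Weston's libinput backend evaluates the WL_OUTPUT property to bind an input device to an output, and LIBINPUT_CALIBRATION_MATRIX can additionally be set if the touchscreen needs calibration.

# /etc/udev/rules.d/65-touch-display-mapping.rules (example path)
# The DEVPATH values below are placeholders for the real USB paths of the touch controllers.
SUBSYSTEM=="input", KERNEL=="event*", DEVPATH=="*usb1/1-1/1-1.1*", ENV{WL_OUTPUT}="HDMI-A-1"
SUBSYSTEM=="input", KERNEL=="event*", DEVPATH=="*usb1/1-1/1-1.2*", ENV{WL_OUTPUT}="HDMI-A-2"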

Here are some impressions of our test setup with the Toradex Verdin iMX8M Plus Quad on our YouTube channel: