I am trying to use an HDMI/USB touchscreen with a Qt application on Ubuntu. I can use the touchscreen to navigate the OS, but my Qt application is completely unresponsive to touch events. The application is a Desktop Qt 6.8.1 project with a simple button.
This is the content of my mainwindow.cpp:
#include "mainwindow.h"
#include "./ui_mainwindow.h"

#include <QDebug> // needed for qDebug()

MainWindow::MainWindow(QWidget *parent)
    : QMainWindow(parent)
    , ui(new Ui::MainWindow)
{
    ui->setupUi(this);
    setAttribute(Qt::WA_AcceptTouchEvents, true);
}

MainWindow::~MainWindow()
{
    delete ui;
}

void MainWindow::on_pushButton_pressed()
{
    qDebug() << "button pressed";
}

void MainWindow::on_pushButton_released()
{
    qDebug() << "button released";
}
bool MainWindow::event(QEvent* event)
{
    switch (event->type())
    {
    case QEvent::TouchBegin:
        qDebug() << "touch!";
        return true;
    case QEvent::TouchEnd:
        qDebug() << "touch end!";
        return true;
    case QEvent::MouseButtonDblClick:
        qDebug() << "double click";
        return true;
    case QEvent::MouseButtonPress:
        qDebug() << "pressed";
        return true;
    case QEvent::MouseButtonRelease:
        qDebug() << "released";
        return true;
    default:
        // call base implementation
        return QMainWindow::event(event);
    }
}
I am enabling touch events in the constructor. When I run the application, I see the debug messages for the events when using the mouse. However, when using the touchscreen, none of the events are triggered. Not even the mouse ones!
Is there a reason why the touch events are not recognized?
Note: touch events work completely fine in another UI framework (Flutter).
Comments:
- Does tapping on the button trigger those functions (assuming they're properly set as slots and that they also work when using the mouse)? – musicamante, Dec 31, 2024 at 18:03
- @musicamante Yes, but only with the mouse, not with the touchscreen. – ESD, Dec 31, 2024 at 18:05
- @AhmedAEK It should still print a debug message if the event is triggered at all, no? – ESD, Dec 31, 2024 at 18:06
- Note that you're using a QMainWindow with a UI set, which means that it has a central widget. Since you only set the attribute on the main window, the central widget will receive the event as a mouse event, and will then propagate it to the main window as such. As far as I know, in this case the event is not "restored" as a touch event, even if it wasn't accepted by the children. Try setting the attribute on the central widget and the button as well. – musicamante, Dec 31, 2024 at 18:07
- Is the touch screen functional otherwise? It needs a correct configuration; Desktop and X11 should react to that. Also, a HID-compatible touchscreen driver generates normal pointer events, including mouse ones, which go to the central widget, not to the main window. – Swift - Friday Pie, Dec 31, 2024 at 18:09
1 Answer
I had a similar problem (on Qt5, however), and found out that the Qt::WA_AcceptTouchEvents attribute is not recursively propagated to child widgets...
So I wrote this function:
void AddTouchEventSupport(QWidget* qw)
{
    // Accept touch events on the widget itself...
    qw->setAttribute(Qt::WA_AcceptTouchEvents, true);
    // ...and on every descendant widget, since the
    // attribute is not inherited by children.
    for (auto&& w : qw->findChildren<QWidget*>(QString(), Qt::FindChildrenRecursively))
        w->setAttribute(Qt::WA_AcceptTouchEvents, true);
}
Replace your setAttribute(Qt::WA_AcceptTouchEvents, true); with AddTouchEventSupport(this); and test whether it works.
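Applied to the question's constructor, the change would look like the sketch below. This is illustrative only: it assumes the AddTouchEventSupport helper above is visible in mainwindow.cpp (here it is simply defined as a static function in the same file), and it is not a definitive fix for every setup.

```cpp
#include "mainwindow.h"
#include "./ui_mainwindow.h"

// Sketch: the helper from the answer, defined locally for illustration.
static void AddTouchEventSupport(QWidget* qw)
{
    qw->setAttribute(Qt::WA_AcceptTouchEvents, true);
    for (auto&& w : qw->findChildren<QWidget*>(QString(), Qt::FindChildrenRecursively))
        w->setAttribute(Qt::WA_AcceptTouchEvents, true);
}

MainWindow::MainWindow(QWidget *parent)
    : QMainWindow(parent)
    , ui(new Ui::MainWindow)
{
    ui->setupUi(this);
    // Call AFTER setupUi(): findChildren() only sees widgets that
    // already exist, so the central widget and the push button must
    // have been created before the attribute is applied.
    AddTouchEventSupport(this);
}
```

Note that widgets created later (e.g. dialogs or dynamically added controls) would need the attribute set separately, since the helper only covers children that exist at the time of the call.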