Hello Everyone,

In my project, I have an Outline Box that contains multiple buttons, dropdown menus, and text editors. I used the Outline Box to make the page scrollable, but I am facing the following problem, mostly with the text editors:

When I scroll up or down (both in the target and simulation), if I start the swipe gesture with my finger or mouse inside a text editor, the virtual keyboard briefly opens and then instantly closes as the swipe gesture is recognized. This not only causes the screen to flash but also immediately blocks the scrolling effect, making user input almost useless.

I tried rearranging the Z-order of the components and using retarget conditions more intelligently, but this hinders normal usage, as the user would need to press and hold a button or menu item for a long time before it is triggered.
If you need more information, I can provide a small example project.


Thanks in advance,
Dario

1 Answer


Hello Dario,

Without knowing your implementation, it is difficult to deduce the cause of this behavior. My assumption is that it is related to the stacking order of the touch handlers, or that the handlers are configured incorrectly. Please note that the Text Editor itself contains touch handlers to scroll the text and to move the caret. In addition, you apparently have another touch handler to activate the Virtual Keyboard. When the user touches the screen, the topmost handler reacts first. This handler can either take over the interaction, or decline it and retarget the touch interaction to another handler lying behind it. Possibly these multiple handlers interfere with each other in your implementation.

Please see the section Combine several Touch Handlers together. It explains the effect of stacking multiple handlers and describes the property RetargetCondition, which allows touch interactions to be redirected between several touch handlers. The intention is to let multiple touch handlers work together.
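As a rough sketch of such a configuration (the handler name TouchHandler and the Init context are placeholders, not taken from your project): the handler lying above the Text Editor declines and redirects the interaction as soon as the user performs a vertical wipe gesture, so the scrolling mechanism behind it can take over.

    // Hypothetical initialization code, e.g. in the Init method of your component.
    // Decline and redirect the interaction to the handlers lying behind as soon
    // as the user performs a vertical wipe (swipe) gesture.
    TouchHandler.RetargetCondition = Core::RetargetReason[ WipeUp, WipeDown ];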

Also review the implementation of the handler responsible for displaying the keyboard. For example, if it is the Simple Touch Handler and the handler is configured to decline (redirect the interaction) when the user performs a gesture, you should display the keyboard when the user releases the finger. Performing the action at the moment the user presses the finger on the screen is too early: at that point the system does not yet know the intention of the user, i.e. that the user will eventually start to move the finger. In this case note the status variable AutoDeflected, which is true if the corresponding handler has declined and redirected the interaction. Don't show the keyboard if AutoDeflected is true.
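As a minimal sketch of this idea (the handler name TouchHandler and the method ShowVirtualKeyboard() are hypothetical placeholders for whatever your application actually uses), the slot method connected to the handler's OnRelease property could look like this:

    // Hypothetical slot method connected to TouchHandler.OnRelease.
    // Show the keyboard only if the handler has NOT declined and redirected
    // the interaction (e.g. because the user started a wipe gesture to scroll).
    if ( !TouchHandler.AutoDeflected )
      ShowVirtualKeyboard();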

Best regards

Paul Banach
