in System Integration by
Good day,

I have a simple touch handler on top of a slide handler. The simple touch handler has the following retarget specifications:

.RetargetCondition = Core::RetargetReason[WipeDown, WipeLeft, WipeRight, WipeUp]

.RetargetOffset = 10

.RetargetDelay = 1

In the Prototyper:

* A slide motion raises the slide event within the slide handler

* A press action raises the pressed event within the simple touch handler

 

But as soon as I take this to the hardware, the slide motion (even a fast motion) always raises the pressed event first and then the slide event.

Our hardware is healthy and services the EW application in a timely manner.

Is this a bug in EW, or is it just a case of things working out on paper while the real world has extra considerations I need to know about?

 

I have two videos, as well as the EW project if you like, but I will need an email address or something to send them to.

 

Kind Regards,

Rob
by
Hi Manfred,

Thanks for the prompt response. I'll have a look. I receive a single event for press, and then a single event for release.

Kind Regards,

Rob
by
Hi Manfred,

Just to confirm, it is as you say: a single press event, multiple move events and a single release event are fed into EW. I logged them to a buffer and verified this.

My code is from the STM32F469-DISCO example, and whilst it does not look exactly the same as yours, the result is the same.

Rob

1 Answer

by

Hi Rob,

in principle the different touch handlers and the retargeting work fine - also on real hardware.

Maybe the problem is more related to the system integration (on your particular hardware) or to the touch driver of your hardware. What target are you using?

You can modify the code sequence within your main loop in order to analyze the touch sequences that are fed into the Embedded Wizard GUI application:

    /* Step 3: Receive cursor inputs (mouse/touch events)... */
    touch = GetTouchPosition( &pos );

    /* ...and provide it to the application. */
    if ( touch > 0 )
    {
      /* Begin of touch cycle */
      if ( touch == 1 )
      {
        EwPrint( "Down\n" );
        events |= CoreRoot__DriveCursorHitting( rootObject, 1, 0, pos );
      }

      /* Movement during touch cycle */
      else if ( touch == 2 )
      {
        EwPrint( "Move\n" );
        events |= CoreRoot__DriveCursorMovement( rootObject, pos );
      }

      /* End of touch cycle */
      else if ( touch == 3 )
      {
        EwPrint( "Up\n" );
        events |= CoreRoot__DriveCursorHitting( rootObject, 0, 0, pos );
      }
    }

When you now make one slide gesture, do you get only one 'Down' - 'Move'....'Move' - 'Up' sequence or are there several 'Down' / 'Up' events?

Best regards,

Manfred.

by
Hi Manfred,

Can you suggest anything else I may try?

In my understanding, the touch events are faithfully injected into EW.

The example I recorded (it would be great if I could forward the project to you) was literally copied and pasted onto the hardware.

Rob
by
Hi Rob,

first of all, some questions:

Can you reproduce the problem within a modified 'HelloWorld' example?

Are you using the provided Build Environment? Which version?

Are you using an operating system?

A simple 'HelloWorld' project, just added with your code, would be great. Can you post it here?

Best regards,

Manfred.
by
Hi Manfred,

Replying to the email does not work, as I get an undelivered message. Does this forum allow one to attach files?

Kind Regards,

Rob
by
I have a modified HelloWorld app, which I will send as soon as you can let me know how.

To answer your other questions:

* We're using EW 8.30

* We're not using an operating system
by
Hi Rob,

of course, you can upload a file - just use the link button (chain symbol) and then select the upload page.

Looking forward to seeing the modified HelloWorld project.

Best regards,

Manfred.
by

Hi Rob,

thanks for providing the example - the result is: the behavior on the target is the same as within the Prototyper. First you get the press/release events of the simple touch handler, then you get the slide events of the slide touch handler. Within the Prototyper you sometimes cannot see the fast changes of your text label, due to the high update rate.

You can see this when you modify your example so that the simple touch handler just writes a trace message when it is pressed or released, and the slide touch handler does the same when sliding starts or ends:
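For example, the trace slots could look roughly like this (the handler names SimpleTouchHandler and SlideTouchHandler are just placeholders for this sketch):

    // OnPress slot of the simple touch handler
    trace "SimpleTouch: press";

    // OnRelease slot of the simple touch handler
    trace "SimpleTouch: release";

    // OnStart slot of the slide touch handler
    trace "Slide: start of sliding";

    // OnEnd slot of the slide touch handler
    trace "Slide: end of sliding";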

Within the Prototyper you see the following result in case of a fast swipe gesture:

And on the target you get the following sequence:

So from my point of view everything is fine...

Best regards,

Manfred.

 

  

by
Hi Manfred, I accept that. Can you say what I am doing incorrectly here? I want the simple touch handler to be retargeted due to the swipe motion.

.RetargetOffset = 10
.RetargetDelay = 1

 

Should .RetargetDelay be zero to avoid an instantaneous onPress?
by
Hi Rob,

please consider:

(1) You have a simple touch handler on top and you are touching the screen => Pressed.

(2) Now you slide on the screen => the simple touch handler is released and the slide touch cycle starts.

After (1) it cannot be foreseen whether you want to start a slide gesture (2) or simply release the screen.

In case you do not want to react to the 'press' event, you can use the 'hold' event and evaluate the HoldPeriod of the touch handler.
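For example, the 'hold' slot could look roughly like this (TouchHandler and ButtonAction are just placeholder names, and the 200 ms threshold is only an example value):

    // OnHold slot of the simple touch handler; OnHold is signaled
    // periodically as long as the finger stays pressed on the handler
    if ( TouchHandler.HoldPeriod >= 200 )
    {
      // the finger was held long enough -> trigger the action
      // (a complete implementation would guard against repeated triggering)
      postsignal ButtonAction;
    }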

Hope this helps...

Manfred.
by
Hi Manfred,

I've noted what you say about the prototyper raising the touch event.

A screen full of buttons, where the user must also be able to swipe to another screen, is a common requirement.

Would you recommend what you suggested in your last post as the optimal way (regarding usability) to go about this, or do you think it would be better to place the slide handler in front of the simple touch handler and retarget after a delay?

Kind Regards,

Rob
by
Hi Rob,

in case of buttons that are placed on a component that should be scrolled, you can choose between different implementations. In the simplest case, you place the buttons (including their simple touch handlers) over the panel with its slide touch handler. As a result, the user can press the buttons, or slide the panel as soon as he touches outside the buttons.

If it should be possible to slide the panel everywhere, you need to redirect the move gesture from the buttons to the panel. In this case:

Implement your buttons including a simple touch handler. As soon as the button is pressed, the button just appears pressed, without starting the button's action. The action of the button is signaled when the user releases the button and the cursor event has not been deflected.

Using the RetargetCondition of the button makes it possible to forward the touch event to the underlying panel.
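A rough sketch of this approach (ButtonTouchHandler, Pressed and OnAction are just placeholder names for the button's simple touch handler, its pressed appearance and its action slot):

    // setting of the button's simple touch handler, so that wipe
    // gestures are forwarded to the underlying slide touch handler
    .RetargetCondition = Core::RetargetReason[ WipeDown, WipeLeft, WipeRight, WipeUp ]

    // OnPress slot: only show the pressed appearance, do not start the action
    Pressed = true;

    // OnRelease slot: restore the appearance and signal the button's action;
    // a complete implementation would additionally verify that the touch
    // interaction was not retargeted to the panel before signaling the action
    Pressed = false;
    postsignal OnAction;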

Best regards,

Manfred.
by
Thanks Manfred, I got it sorted out.

The slide handler is placed behind the button touch handlers, so that the button touch handler services the event first. When a press is detected, a flag is set and a 50 ms timer is started. If the button's onDrag event occurs within that time frame, the flag is cleared and the press event is ignored. The button touch handler also retargets to the slide touch handler, so that it can service the movement. If no onDrag event is generated within the 50 ms, the button's action is executed.
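Roughly in code (pressPending, DelayTimer and OnAction are just placeholder names; DelayTimer is a one-shot Core::Timer with Begin = 50 and Period = 0):

    // OnPress slot of the button's touch handler: remember the press and
    // start the 50 ms window
    pressPending = true;
    DelayTimer.Enabled = true;

    // OnDrag slot of the button's touch handler: a movement arrived within
    // the window -> clear the flag, so the press is ignored
    pressPending = false;
    DelayTimer.Enabled = false;

    // OnTrigger slot of the timer: no drag occurred within the 50 ms ->
    // execute the button's action
    DelayTimer.Enabled = false;
    if ( pressPending )
      postsignal OnAction;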
