Notice (2018-05-24): bugzilla.xamarin.com is now in read-only mode.

Please join us on Visual Studio Developer Community and in the Xamarin and Mono organizations on GitHub to continue tracking issues. Bugzilla will remain available for reference in read-only mode. We will continue to work on open Bugzilla bugs, copy them to the new locations as needed for follow-up, and add the new items under Related Links.

Our sincere thanks to everyone who has contributed on this bug tracker over the years. Thanks also for your understanding as we make these adjustments and improvements for the future.

If you are hitting an issue that looks similar to this resolved bug and you do not yet see a matching new report, please create a new report on Developer Community or GitHub with your current version information, steps to reproduce, and relevant error messages or log files.
One of our customers is facing issues with TouchEvents in CustomRenderers
I wrote a custom page with custom renderers that work fine on Android and iOS.
They intercept touch events (mostly) as expected and dispatch them to the view; only Windows Phone seems to ignore them completely.
After placing some breakpoints, I can say that the Windows Phone renderer is called and initialized, so it's registering just fine.
The renderer extends PageRenderer and hooks into the "OnElementChanged" call, where I have tried various ways of attaching to the touch event handlers.
On Android, this was accomplished by simply overriding some methods (DispatchTouchEvent, for instance).
On iOS, it was slightly trickier because it too ignores "TouchesBegan" and the other touch methods.
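For reference, the Android approach described above can be sketched roughly as follows. The class, namespace, and page names are placeholders, and this assumes the Xamarin.Forms versions of the time, where the Android PageRenderer had a parameterless constructor:

```csharp
using Android.Views;
using Xamarin.Forms;
using Xamarin.Forms.Platform.Android;

// CustomPage and MyApp.* are hypothetical names used for illustration.
[assembly: ExportRenderer(typeof(MyApp.CustomPage), typeof(MyApp.Droid.CustomPageRenderer))]

namespace MyApp.Droid
{
    public class CustomPageRenderer : PageRenderer
    {
        // On Android the renderer is a ViewGroup, so overriding
        // DispatchTouchEvent sees every touch before the children do.
        public override bool DispatchTouchEvent(MotionEvent e)
        {
            if (e.Action == MotionEventActions.Down)
            {
                // Forward the event to the Forms page here.
            }
            return base.DispatchTouchEvent(e); // continue normal dispatch
        }
    }
}
```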
I managed to work around that by accessing the "NativeView" property and placing gesture recognizers on that.
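The iOS workaround via "NativeView" might look something like this sketch (renderer name is a placeholder; register it with the usual ExportRenderer attribute):

```csharp
using UIKit;
using Xamarin.Forms;
using Xamarin.Forms.Platform.iOS;

namespace MyApp.iOSApp
{
    public class CustomPageRenderer : PageRenderer
    {
        protected override void OnElementChanged(VisualElementChangedEventArgs e)
        {
            base.OnElementChanged(e);
            if (e.NewElement == null)
                return;

            // TouchesBegan and friends are never reached, so attach a
            // gesture recognizer to the native view instead.
            var tap = new UITapGestureRecognizer(() =>
            {
                // React to the touch here.
            });
            NativeView.AddGestureRecognizer(tap);
        }
    }
}
```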
On Windows Phone, however, I had no such luck:
* Adding listeners to the renderer's "ManipulationStarted", "ManipulationDelta" and "ManipulationCompleted" events yielded no results (placing breakpoints revealed they are never called)
* Adding listeners for those same events on the "ContainerElement" property made no difference
* The "Element" property doesn't have any events for touch interactions
* As far as I can tell, I cannot access the native view either
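The first failed attempt from the list above would look roughly like this sketch. The exact OnElementChanged signature of the WinPhone PageRenderer varies by Forms version, so this is an assumption, not the verified API:

```csharp
using Xamarin.Forms;
using Xamarin.Forms.Platform.WinPhone;

namespace MyApp.WinPhone
{
    public class CustomPageRenderer : PageRenderer
    {
        protected override void OnElementChanged(ElementChangedEventArgs<Page> e)
        {
            base.OnElementChanged(e);

            // The handlers attach without error, but breakpoints inside
            // them are never hit, as described in the report.
            ManipulationStarted += (s, args) => { /* never reached */ };
            ManipulationDelta += (s, args) => { /* never reached */ };
            ManipulationCompleted += (s, args) => { /* never reached */ };
        }
    }
}
```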
To the people at Xamarin: if you're testing the project, I suggest running it on Android first to see how it's supposed to look.
There seems to be an issue with capturing touch events on both iOS and Windows Phone, as described in more detail below.
Unable to intercept touch events in the renderer by overriding the relevant methods (TouchesBegan, TouchesMoved, TouchesEnded, etc.). These methods don't get called unless you set InputTransparent to true for the "detail" view (but that has some very annoying side effects).
A partial workaround is to add a UISwipeGestureRecognizer to the NativeView property underneath; that way you can do a swipe-like gesture left and right and the menu will open/close.
But this is far from ideal.
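The swipe workaround could be sketched like this, inside the iOS renderer's OnElementChanged after the base call (the menu open/close calls are placeholders):

```csharp
// Two recognizers, one per direction; open/close logic is hypothetical.
var swipeRight = new UISwipeGestureRecognizer(() => { /* open the menu */ })
{
    Direction = UISwipeGestureRecognizerDirection.Right
};
var swipeLeft = new UISwipeGestureRecognizer(() => { /* close the menu */ })
{
    Direction = UISwipeGestureRecognizerDirection.Left
};
NativeView.AddGestureRecognizer(swipeRight);
NativeView.AddGestureRecognizer(swipeLeft);
```

Note that a swipe recognizer only fires on a completed flick, which is why this falls short of real drag tracking.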
Unable to intercept the manipulation events by overriding them in the renderer (as with iOS, they are not called).
The difference here is that I have been unable to find a workaround: so far I have found no way either to listen to touch inputs on the view directly or to add gesture recognizers.
Here you will notice that neither flinging nor dragging will result in the menu opening.
WP8 is deprecated and as such this is being closed.