
I am trying to develop a simple drag/drop UI in my web application. An item can be dragged by a mouse or a finger and then dropped into one of several drop zones. When an item is dragged over a drop zone (but not yet released), that zone is highlighted, marking a safe landing location. That works perfectly fine with mouse events, but I'm stuck with the touchstart/touchmove/touchend family on the iPhone/iPad.

The problem is that when an item's ontouchmove event handler is called, its event.touches[0].target always points to the originating HTML element (the item) and not the element currently under the finger. Moreover, when an item is dragged by a finger over some drop zone, that drop zone's own touchmove handler isn't called at all. That essentially means I can't determine when a finger is above any of the drop zones, and therefore can't highlight them as needed. By contrast, when using a mouse, mousedown is correctly fired for all HTML elements under the cursor.

Some people confirm that it's supposed to work like that, for instance http://www.sitepen.com/blog/2008/07/10/touching-and-gesturing-on-the-iphone/: "For those of you coming from the normal web design world, in a normal mousemove event, the node passed in the target attribute is usually what the mouse is currently over. But in all iPhone touch events, the target is a reference to the originating node."

Question: is there any way to determine the actual element under a finger (NOT the initially touched element which can be different in many circumstances)?


6 Answers


That's certainly not how event targets are supposed to work. Yet another DOM inconsistency that we're probably all now stuck with forever, due to a vendor coming up with extensions behind closed doors without any review.

Use document.elementFromPoint to work around it.

document.elementFromPoint(event.clientX, event.clientY);
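For illustration, here is a sketch of how this workaround can be combined with the pointer-events trick mentioned in the comments. The item element's id and the "dropZone"/"active" class names are hypothetical, not part of the original answer:

```javascript
// Sketch only: "item", "dropZone" and "active" are placeholder names.
// Toggling pointer-events on the dragged element makes elementFromPoint()
// "fall through" it and return whatever is underneath the finger.
function targetUnderTouch(doc, dragged, x, y) {
  var saved = dragged.style.pointerEvents;
  dragged.style.pointerEvents = 'none';      // hide the item from hit-testing
  var under = doc.elementFromPoint(x, y);
  dragged.style.pointerEvents = saved;       // restore the original value
  return under;
}

// Browser-only wiring, guarded so the helper above stays self-contained:
if (typeof document !== 'undefined') {
  var item = document.getElementById('item');
  item.addEventListener('touchmove', function (e) {
    e.preventDefault();                      // stop the page from scrolling
    var t = e.touches[0];
    var under = targetUnderTouch(document, item, t.clientX, t.clientY);
    if (under && under.classList.contains('dropZone')) {
      under.classList.add('active');         // highlight the drop zone
    }
  });
}
```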
  • Thank you very much! That's what I needed. I would also like to share a little more magic for those who face the same problem: add "pointer-events: none" on your draggable element to make elementFromPoint fall through it and catch the real target. Commented Dec 12, 2010 at 16:07
  • I think it's pretty clear that this is a performance compromise. Consider the extra processing that the mobile device would clearly need to do in order to correctly determine the element under the point provided from hardware 60 times per second. It pretty much requires DOM tree traversal and AABB point checks (which admittedly could be made quite fast). My guess is that from Apple's perspective, whether the event target works like it is supposed to or not, sluggish & choppy event performance is less favorable than an easily overlooked bug.
    – Steven Lu
    Commented Jan 4, 2013 at 5:22
  • @StevenLu Oh really? What about mouseover events? Is that not the exact same performance compromise? We've had that for forever, why is this different?
    – B T
    Commented Aug 24, 2013 at 6:36
  • Mobile devices have 100x less compute and up to 10 touches, for an up to 3 orders of magnitude impediment.
    – Steven Lu
    Commented Aug 24, 2013 at 14:19
  • Furthermore, even iOS's native touch events don't track the view underneath the touch: the touch event is "owned" by the view it originally started on, and you must explicitly write code to pass the event up to the parent and/or override hit-testing to find the view the finger is actually over. This is because finding that information isn't necessary most of the time, so it isn't performed all of the time.
    – Steven Lu
    Commented Aug 24, 2013 at 14:28

The accepted answer from 2010 no longer works: touchmove does not have a clientX or clientY attribute. (I'm guessing it used to since the answer has a number of upvotes, but it doesn't currently.)

Current solution is:

var myLocation = event.originalEvent.changedTouches[0]; // originalEvent is added by jQuery; with native listeners use event.changedTouches
var realTarget = document.elementFromPoint(myLocation.clientX, myLocation.clientY);

Tested and works on:

  • Safari on iOS
  • Chrome on iOS
  • Chrome on Android
  • Chrome on touch-enabled Windows desktop
  • FF on touch-enabled Windows desktop

Does NOT work on:

  • IE on touch-enabled Windows desktop

Not tested on:

  • Windows Phone
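As the comments point out, originalEvent is a property jQuery adds to its wrapped events; with native listeners the touch lists live on the event itself. A feature-detecting helper (a sketch; the function name is made up) covers jQuery-wrapped events, native touch events, and plain mouse events:

```javascript
// Sketch: normalize mouse events, native touch events, and jQuery-wrapped
// events into one {x, y} viewport coordinate pair.
function touchPoint(event) {
  var e = event.originalEvent || event;               // unwrap jQuery if present
  var t = (e.changedTouches && e.changedTouches[0]) || // works on touchend too
          (e.touches && e.touches[0]) ||
          e;                                          // plain mouse event
  return { x: t.clientX, y: t.clientY };
}
```

The result can then be fed straight to document.elementFromPoint(touchPoint(event).x, touchPoint(event).y).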
  • this post needs more +1
    – romuleald
    Commented Dec 6, 2015 at 18:33
  • I'm not seeing originalEvent on TouchEvents, in Chrome v52 anyway. touches is available directly on the TouchEvent.
    – ericsoco
    Commented Sep 8, 2016 at 18:58
  • @ericsoco: Looks like some jquery (originalEvent) to me. Something like this should do it: (event.touches && event.touches.length) ? event.touches[0].clientX : event.clientX
    – Lain
    Commented Dec 1, 2016 at 9:45
  • This answer is amazing! It works perfectly and is much simpler than the other ones. I just added a "data-..." attribute to my tag and now I know exactly what was hovered! Commented Jan 13, 2020 at 21:30
  • 💯 Check event instanceof TouchEvent and use the code above. BTW, on Chrome I'm seeing changedTouches on TouchEvent object, even though React Typescript definition disagrees.
    – rpggio
    Commented Sep 21, 2020 at 16:01

Try using event.target.releasePointerCapture(event.pointerId) in the pointerdown handler.

As of 2022, this is intended and specified behavior: it's called "Implicit Pointer Capture".

See the W3 spec on Pointer Events

Direct manipulation devices should behave exactly as if setPointerCapture was called on the target element just before the invocation of any pointerdown listeners. The hasPointerCapture API may be used (eg. within any pointerdown listener) to determine whether this has occurred.

elementFromPoint is a possible solution, but it seems you can also use releasePointerCapture as shown in the following demo. Touching and holding on the green div will get mouse move events for targets outside of it, whereas the red div has the default behavior.

const outputDiv = document.getElementById('output-div');
const releaseDiv = document.getElementById('test-release-div');
const noreleaseDiv = document.getElementById('test-norelease-div');

releaseDiv.addEventListener('pointerdown', function(e) {
  outputDiv.innerHTML = "releaseDiv-pointerdown";
  if (e.target.hasPointerCapture(e.pointerId)) {
      e.target.releasePointerCapture(e.pointerId);
  }
});

noreleaseDiv.addEventListener('pointerdown', function(e) {
  outputDiv.innerHTML = "noreleaseDiv-pointerdown";
});

document.addEventListener('pointermove', function(e) {
  outputDiv.innerHTML = e.target.id;
});
<div id="output-div"></div>
<div id="test-release-div" style="width:300px;height:100px;background-color:green;touch-action:none;user-select:none">Touch down here and move around, this releases implicit pointer capture</div>

<div id="test-norelease-div" style="width:300px;height:100px;background-color:red;touch-action:none;user-select:none">Touch down here and move around, this doesn't release implicit pointer capture</div>

  • This should be the accepted answer. The current accepted answer is outdated and causes performance problems, while also being bug-prone (if there is another element on top of the current one). Commented Aug 23, 2023 at 23:23

Touch events have a different "philosophy" when it comes to how they interact:

  • Mouse moves = "hover" like behavior
  • Touch moves = "drags" like behavior

This difference comes from the fact that there cannot be a touchmove without a preceding touchstart event: a user has to touch the screen to start the interaction. With a mouse, of course, a user can mousemove all over the screen without ever pressing a button (a mousedown event).

This is why we basically can't expect hover effects like this to work with touch:

element:hover { 
    background-color: yellow;
}

And this is why, when the user touches the screen with one finger, the first event (touchstart) acquires the target element, and the subsequent events (touchmove) keep holding a reference to the original element where the touch started. It feels wrong, but there is some logic to it: you might need the original target info as well. So ideally, in the future, both (source target and current target) should be available.

So the common practice of today (2018), where screens can be mouse AND touch at the same time, is still to attach both listeners (mouse and touch), then "normalize" the event coordinates and use the above-mentioned browser API to find the element at those coordinates:

  // get coordinates depending on pointer type
  // (note: elementFromPoint expects viewport coordinates, i.e. clientX/clientY):
  var xcoord = event.touches ? event.touches[0].clientX : event.clientX;
  var ycoord = event.touches ? event.touches[0].clientY : event.clientY;
  // get the element at those coordinates:
  var targetElement = document.elementFromPoint(xcoord, ycoord);
  // validate that this is a valid element for our case:
  if (targetElement && targetElement.classList.contains("dropZone")) {
      // highlight the drop zone, remember it as the current drop target, etc.
  }
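Putting it together, a minimal (hypothetical) wiring might attach the same handler to both mousemove and touchmove; "dropZone" and "highlight" are placeholder class names, not part of the original answer:

```javascript
// Sketch of the practice described above: one shared handler that
// normalizes coordinates, then looks up the element at that point.
function dropZoneAt(doc, x, y) {
  var el = doc.elementFromPoint(x, y);   // expects viewport coordinates
  return (el && el.classList.contains('dropZone')) ? el : null;
}

// Browser-only wiring, guarded so the helper stays self-contained:
if (typeof document !== 'undefined') {
  var onMove = function (event) {
    var x = event.touches ? event.touches[0].clientX : event.clientX;
    var y = event.touches ? event.touches[0].clientY : event.clientY;
    var zone = dropZoneAt(document, x, y);
    if (zone) {
      zone.classList.add('highlight');   // mark the safe landing location
    }
  };
  document.addEventListener('mousemove', onMove);
  document.addEventListener('touchmove', onMove);
}
```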

I've encountered the same problem on Android (WebView + Phonegap). I want to be able to drag elements around and detect when they are being dragged over a certain other element. For some reason, touch events seem to ignore the pointer-events attribute value.

Mouse:

  • if pointer-events="visiblePainted" is set then event.target will point to the dragged element.
  • if pointer-events="none" is set then event.target will point to the element under the dragged element (my drag-over zone)

This is how things are supposed to work and why we have the pointer-events attribute in the first place.

Touch:

  • event.target always points to the dragged element, regardless of the pointer-events value, which is IMHO wrong.

My workaround is to create my own drag-event object (a common interface for both mouse and touch events) that holds the event coordinates and the target:

  • for mouse events I simply reuse the mouse event as is
  • for touch event I use:

    DragAndDrop.prototype.getDragEventFromTouch = function (event) {
        var touch = event.touches.item(0);
        return {
            screenX: touch.screenX,
            screenY: touch.screenY,
            clientX: touch.clientX,
            clientY: touch.clientY,
            pageX: touch.pageX,
            pageY: touch.pageY,
            // elementFromPoint expects viewport (client) coordinates,
            // not screen coordinates:
            target: document.elementFromPoint(touch.clientX, touch.clientY)
        };
    };
    

And then use that for processing (checking whether the dragged object is in my drag-over zone). For some reason document.elementFromPoint() seems to respect the pointer-events value even on Android.


JSP64's answer didn't fully work since event.originalEvent always returned undefined. A slight modification as follows works now.

var myLocation = event.touches[0];
var realTarget = document.elementFromPoint(myLocation.clientX, myLocation.clientY);
