
Compute Head Direction and Angular Head Velocity #238

Closed
b-peri opened this issue Jul 15, 2024 · 3 comments · Fixed by #276 · May be fixed by #315
Assignees
Labels
enhancement New optional feature

Comments

@b-peri
Collaborator

b-peri commented Jul 15, 2024

2-D head direction and angular head velocity are important navigation-related variables which can be computed directly from an animal's pose tracks. Since these are relatively low-hanging fruit, I think these variables would probably be a good start for a larger movement module focusing on spatial navigation.

Basic Implementation

From a cursory review of the literature, it seems like head direction is most commonly tracked using markers placed on either side of the head. Assuming that most users will take a similar approach, we can begin calculating the head direction by taking the 2-D vector from the right keypoint to the left keypoint at every frame. Note here that, because most pose-tracking packages place the origin in the top-left of the image, the y-values on our pose tracks will be "inverted" (at least, relative to the more traditional, cartesian reference frame, in which the origin would be at the bottom left). Thus, to ensure that we ultimately end up with angles that make sense to us, we can simply multiply the y-coordinate of these vectors by -1 at this point.

The head vector can then be found by computing the cross product of the ("corrected") right-to-left vector and the vector [0, 0, 1]. The result is a vector perpendicular to the right-to-left vector, facing in the direction of the animal's nose. We can then obtain the head direction as an angle by passing this vector to np.arctan2(), converting the result to degrees, and adding 360° to any negative values so that all angles fall within the 0–360° range (as in the figure below).

[Figure: head direction expressed as an angle between 0° and 360°]
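As a sanity check, the steps above can be sketched for a single frame with plain NumPy (the keypoint coordinates here are made up for illustration):

```python
import numpy as np

# Hypothetical (x, y) positions of the two head keypoints, in image
# coordinates (origin at the top-left, y-axis pointing down)
right_keypoint = np.array([10.0, 20.0])
left_keypoint = np.array([14.0, 20.0])

# Vector from the right to the left keypoint, with its y-coordinate
# negated to move from the image to a Cartesian reference frame
rl_vector = left_keypoint - right_keypoint
rl_vector[1] *= -1

# Cross product with the z-axis unit vector gives a vector
# perpendicular to the right-to-left vector, pointing rostrally;
# we keep only the x- and y-components
head_vector = np.cross(np.append(rl_vector, 0.0), [0.0, 0.0, 1.0])[:2]

# Convert to an angle in degrees, wrapped into the 0-360 range
head_direction = np.degrees(np.arctan2(head_vector[1], head_vector[0])) % 360
```

Taking the result modulo 360 is equivalent to adding 360° to negative values.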

An angular head velocity function can then be built on top of this: we simply multiply the frame-to-frame difference in head direction by the input DataArray's fps attribute.
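A minimal sketch of that step (the angle values and frame rate are invented). One wrinkle worth noting: the angles should be unwrapped before differencing, so that a crossing from 359° to 1° is not read as a -358° rotation:

```python
import numpy as np

# Hypothetical head direction trace (degrees) and frame rate
head_direction = np.array([350.0, 355.0, 2.0, 8.0])
fps = 50  # would come from the DataArray's `fps` attribute

# Unwrap to remove the artificial jump at the 0/360 boundary, then
# multiply the frame-to-frame difference by fps to get deg/s.
# Note that np.diff shortens the result by one frame.
unwrapped = np.unwrap(head_direction, period=360)
angular_head_velocity = np.diff(unwrapped) * fps
```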

Concerns

My primary concern here is that this implementation expects users to have tracked head direction in a very specific way (i.e. by tracking the left and right side of the head), and while this may cover the majority of users, this does not capture all possible approaches. While I think it's okay to just stick with a specific method for a first implementation, I'd be curious to hear if anyone has particular thoughts on how we might be able to make the head direction function more general/accommodate for different keypoint configurations (or whether this is important at all)!

@b-peri b-peri added the enhancement New optional feature label Jul 15, 2024
@b-peri b-peri self-assigned this Jul 15, 2024
@niksirbi niksirbi moved this from 🤔 Triage to 🚧 In Progress in movement progress tracker Jul 15, 2024
@niksirbi
Member

Thanks for working on this @b-peri, and for the nice description.

A note on calculating the angular velocity once head direction is known: you might be able to just use the _compute_approximate_derivative() function.
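That helper's signature isn't shown in this thread, but the same idea can be sketched with NumPy's np.gradient, which uses central differences and so keeps the output the same length as the input (values here are invented):

```python
import numpy as np

# Hypothetical head direction trace (degrees), uniformly sampled
head_direction = np.array([0.0, 2.0, 6.0, 12.0, 20.0])
fps = 25

# Central-difference derivative scaled by the frame rate (deg/s);
# unlike np.diff, the result has the same length as the input
angular_head_velocity = np.gradient(head_direction) * fps
```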

@niksirbi
Member

Also, it might be useful (especially for visualisation) to also save the 2D head direction vector itself, not just the resulting angle.

@niksirbi
Member

Hey @b-peri, following up from our meeting today:

You can find my example implementation of computing the head direction vector in this notebook.

That notebook does many other things, but if I were to abstract only the head direction vector into a separate function, I would do something like the below:

import numpy as np
import xarray as xr


def compute_head_direction(
    data: xr.DataArray,
    right_keypoint: str,
    left_keypoint: str,
) -> xr.DataArray:
    """
    Compute the head direction vector given two keypoints on the head.

    The head direction vector is computed as a vector perpendicular to the
    line connecting two keypoints on either side of the head, pointing
    forwards (in a rostral direction).

    Parameters
    ----------
    data : xarray.DataArray
        The input data representing position. It must contain the chosen
        right and left keypoints.
    right_keypoint : str
        The name of the right keypoint, e.g., 'right_ear'.
    left_keypoint : str
        The name of the left keypoint, e.g., 'left_ear'.

    Returns
    -------
    xarray.DataArray
        An xarray DataArray representing the head direction vector,
        with dimensions matching the input data array, but without the
        ``keypoints`` dimension.
    """
    # Select the right and left keypoints;
    # drop=True removes the `keypoints` dimension
    head_right = data.sel(keypoints=right_keypoint, drop=True)
    head_left = data.sel(keypoints=left_keypoint, drop=True)

    # Vector going from the right to the left ear
    inter_ear_vector = head_left - head_right
    # Unit vector along the negative z-axis; the -1 yields a
    # rostrally-pointing vector directly in image coordinates
    # (y-axis pointing down), without first negating the y-values
    z_vector = np.array([0, 0, -1])

    # Initialise the head vector as a copy of the array with the right
    # keypoint. This ensures it has the same dimensions (without
    # 'keypoints'), coordinates, and attributes as the input data array.
    head_vector = head_right.copy()
    # Replace the values with the cross product of the inter-ear vector
    # and the z-axis unit vector, discarding the z-component of the result
    head_vector.values = np.cross(inter_ear_vector.values, z_vector)[:, :, :-1]

    return head_vector
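A quick usage sketch with synthetic data (the function is condensed here so the snippet runs on its own; the dimension names follow movement's usual time/individuals/keypoints/space layout, and the keypoint names are arbitrary):

```python
import numpy as np
import xarray as xr


def compute_head_direction(data, right_keypoint, left_keypoint):
    # Condensed version of the function above
    head_right = data.sel(keypoints=right_keypoint, drop=True)
    head_left = data.sel(keypoints=left_keypoint, drop=True)
    inter_ear_vector = head_left - head_right
    head_vector = head_right.copy()
    head_vector.values = np.cross(
        inter_ear_vector.values, np.array([0, 0, -1])
    )[:, :, :-1]
    return head_vector


# Minimal synthetic position array: 10 frames, 1 individual,
# 2 keypoints, 2 spatial coordinates
rng = np.random.default_rng(0)
positions = xr.DataArray(
    rng.random((10, 1, 2, 2)),
    dims=["time", "individuals", "keypoints", "space"],
    coords={"keypoints": ["left_ear", "right_ear"], "space": ["x", "y"]},
)

head_vector = compute_head_direction(positions, "right_ear", "left_ear")
# The result keeps time/individuals/space but drops the keypoints dim
```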

@b-peri b-peri mentioned this issue Aug 22, 2024
@sfmig sfmig linked a pull request Aug 27, 2024 that will close this issue
@b-peri b-peri mentioned this issue Oct 3, 2024
@github-project-automation github-project-automation bot moved this from 🚧 In Progress to ✅ Done in movement progress tracker Oct 4, 2024