Model: MediaPipe Hands

A person holding their hands up towards webcam, with a green hand skeleton overlaid on top of the hands in 2D
handsfree.update({hands: true})
  • 21 2D hand landmarks per hand
  • Tracks up to 4 hands total

This model includes dozens of Pinch Events and helper styles to get you going quickly, along with a plugin for scrolling pages handsfree.

Usage

With defaults

const handsfree = new Handsfree({hands: true})
handsfree.start()

With config

const handsfree = new Handsfree({
  hands: {
    enabled: true,
    // The maximum number of hands to detect [0 - 4]
    maxNumHands: 2,

    // Minimum confidence [0 - 1] for a hand to be considered detected
    minDetectionConfidence: 0.5,

    // Minimum confidence [0 - 1] for the landmark tracker to be considered detected
    // Higher values are more robust at the expense of higher latency
    minTrackingConfidence: 0.5
  }
})

handsfree.start()

Data

Hand Landmarks

.landmarks and .landmarksVisible

You can access the landmarks for each hand through:

// An array of landmark points for each detected hand, indexed [0 - 3]
handsfree.data.hands.landmarks

// Left hand, person #1
handsfree.data.hands.landmarks[0]
// Right hand, person #1
handsfree.data.hands.landmarks[1]
// Left hand, person #2
handsfree.data.hands.landmarks[2]
// Right hand, person #2
handsfree.data.hands.landmarks[3]

Each of these has 21 {x, y} landmarks. To check whether a hand is detected, use handsfree.data.hands.landmarksVisible:

// Left hand, person #1
handsfree.data.hands.landmarksVisible[0]
// Right hand, person #1
handsfree.data.hands.landmarksVisible[1]
// Left hand, person #2
handsfree.data.hands.landmarksVisible[2]
// Right hand, person #2
handsfree.data.hands.landmarksVisible[3]
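Because the four slots always map to the same hand/person combination, you can iterate them directly. A minimal sketch, using mock data shaped like handsfree.data.hands (the describeVisibleHands helper is illustrative, not part of the library):

```javascript
// The four fixed slots: left/right hand for person #1 and person #2
const HAND_SLOTS = [
  'Left hand, person #1',
  'Right hand, person #1',
  'Left hand, person #2',
  'Right hand, person #2'
]

// Returns a label for every slot whose hand is currently visible
function describeVisibleHands (hands) {
  const visible = []
  HAND_SLOTS.forEach((label, handIndex) => {
    if (hands.landmarksVisible[handIndex]) {
      visible.push(label)
    }
  })
  return visible
}

// Mocked data shaped like handsfree.data.hands
const mockHands = {landmarksVisible: [true, false, false, true]}
describeVisibleHands(mockHands)
// → ['Left hand, person #1', 'Right hand, person #2']
```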

Original data

Using these directly is not recommended, as hands do not always appear at a consistent index. They are exposed here for backward compatibility with projects migrating to Handsfree.js from MediaPipe Hands.

// handIndex [0 - 3]: an array of landmark points for each detected hand
handsfree.data.hands.multiHandLandmarks[handIndex] == [
  // Landmark 0
  {x, y},
  // Landmark 1
  {x, y},
  // ...
  // Landmark 20
  {x, y}
]

// hand 0, landmark 0
handsfree.data.hands.multiHandLandmarks[0][0].x
handsfree.data.hands.multiHandLandmarks[0][0].y
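MediaPipe landmark coordinates are normalized to the [0, 1] range relative to the video frame, so to draw them you typically scale by the frame dimensions. A sketch (the toPixels helper is illustrative):

```javascript
// Convert a normalized {x, y} landmark to pixel coordinates
// for a video frame of the given size
function toPixels (landmark, width, height) {
  return {
    x: landmark.x * width,
    y: landmark.y * height
  }
}

// e.g. the wrist (landmark 0) of hand 0 on a 640x480 frame:
// const wrist = handsfree.data.hands.multiHandLandmarks[0][0]
toPixels({x: 0.5, y: 0.25}, 640, 480)
// → {x: 320, y: 120}
```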

Is it the right or left hand?

// handIndex [0 - 3]: handedness info for each detected hand
handsfree.data.hands.multiHandedness[handIndex] == {
  // "Right" or "Left"
  label,
  // The probability that it is "Right" or "Left"
  score
}

// hand 0
handsfree.data.hands.multiHandedness[0].label
handsfree.data.hands.multiHandedness[0].score
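The label/score pair above lets you route the raw hands yourself, e.g. to pick the most confidently detected hand of a given side. A sketch on mock data (the findHand helper is illustrative):

```javascript
// Given the raw multiHandedness array, return the index of the
// hand whose label matches, preferring the highest score;
// returns -1 if no hand with that label was detected
function findHand (multiHandedness, label) {
  let bestIndex = -1
  let bestScore = 0
  multiHandedness.forEach((hand, i) => {
    if (hand.label === label && hand.score > bestScore) {
      bestScore = hand.score
      bestIndex = i
    }
  })
  return bestIndex
}

// Mocked data shaped like handsfree.data.hands.multiHandedness
const handedness = [
  {label: 'Left', score: 0.98},
  {label: 'Right', score: 0.91}
]
findHand(handedness, 'Right') // → 1
```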

Examples of accessing the data

const handsfree = new Handsfree({hands: true})
handsfree.start()

// From anywhere
handsfree.data.hands.landmarks

// From inside a plugin
handsfree.use('logger', data => {
  if (!data.hands) return

  // Show a log whenever the left hand is visible
  if (data.hands.landmarksVisible[0]) {
    console.log(data.hands.landmarks[0])
  }
})

// From an event
document.addEventListener('handsfree-data', event => {
  const data = event.detail
  if (!data.hands) return

  // Show a log whenever the right hand for person #2 is visible
  if (data.hands.landmarksVisible[3]) {
    console.log(data.hands.landmarks[3])
  }
})
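As a fuller example, the per-frame data can drive custom gesture logic. The sketch below measures the distance between the thumb tip (landmark 4) and index fingertip (landmark 8) — standard indices in the MediaPipe hand model — and treats a small distance as a pinch; the threshold and helper names are illustrative, not part of the library:

```javascript
// Distance between two normalized {x, y} landmarks
function distance (a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y)
}

// True when the thumb tip (4) and index fingertip (8) are close together
function isPinching (handLandmarks, threshold = 0.05) {
  return distance(handLandmarks[4], handLandmarks[8]) < threshold
}

// Register as a plugin (browser only):
// handsfree.use('pinchLogger', data => {
//   if (!data.hands || !data.hands.landmarksVisible[0]) return
//   if (isPinching(data.hands.landmarks[0])) console.log('Pinch!')
// })
```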

Projects

The following projects all use MediaPipe Hands, though not all of them were built with Handsfree.js:
