

【visionOS/ARKit】Simplest sample code for hand tracking



ARKit's hand tracking API in visionOS provides data such as the position and rotation of each joint of the hand.

[Image: hand joints example, from the session video]

[Image: list of hand joint names, from the session video]

Here is sample code for ARKit's hand tracking API in visionOS that works in only 41 lines.

Basic Knowledge

First, watch this session video. It gives a good overview of ARKit and hand tracking in visionOS.

Meet ARKit for spatial computing - WWDC23 - Videos - Apple Developer

Hand tracking is explained from 15:05.

Two APIs

There are two APIs to get joint data. Both can be accessed from the HandTrackingProvider instance.

  • anchorUpdates: receives the latest values via AsyncSequence.
  • latestAnchors: contains the latest values.

Observing hand anchor data

In this case, I implemented it with anchorUpdates.
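For comparison, here is a minimal sketch of the polling approach with latestAnchors (assuming a running HandTrackingProvider named provider, called from some per-frame update):

let (leftHand, rightHand) = provider.latestAnchors
for case let anchor? in [leftHand, rightHand] {
    // Use the most recent anchor of each currently tracked hand.
    print(anchor.chirality, anchor.originFromAnchorTransform)
}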

Overview of the Sample Code

  • Starts hand tracking when the app launches.
  • Places simple sphere objects at all joints of both hands.
  • Updates each object’s position to match the latest joint position.
  • Ensures the objects are not hidden by the hand.

Example

[Demo: sphere objects tracking the joints of both hands]

Full Source Code

import SwiftUI
import RealityKit
import ARKit

@main
struct MyApp: App {
    private let session = ARKitSession()
    private let provider = HandTrackingProvider()
    private let rootEntity = Entity()

    var body: some SwiftUI.Scene {
        ImmersiveSpace {
            RealityView { content in
                content.add(rootEntity)
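                // Create a small sphere entity for every joint of both hands.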
                for chirality in [HandAnchor.Chirality.left, .right] {
                    for jointName in HandSkeleton.JointName.allCases {
                        let jointEntity = ModelEntity(mesh: .generateSphere(radius: 0.006),
                                                      materials: [SimpleMaterial()])
                        jointEntity.name = "\(jointName)\(chirality)"
                        rootEntity.addChild(jointEntity)
                    }
                }
            }
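            // Run the ARKit session; this presents the hand tracking permission prompt.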
            .task { try! await session.run([provider]) }
            .task {
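                // Receive hand anchor updates as an AsyncSequence.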
                for await update in provider.anchorUpdates {
                    let handAnchor = update.anchor
                    for jointName in HandSkeleton.JointName.allCases {
                        guard let joint = handAnchor.handSkeleton?.joint(jointName),
                              let jointEntity = rootEntity.findEntity(named: "\(jointName)\(handAnchor.chirality)") else {
                            continue
                        }
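                        // World transform of the joint: the anchor's transform
                        // in world space composed with the joint's local transform.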
                        jointEntity.setTransformMatrix(handAnchor.originFromAnchorTransform * joint.anchorFromJointTransform,
                                                       relativeTo: nil)
                    }
                }
            }
        }
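        // Hide the passthrough rendering of the hands so the spheres are not occluded.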
        .upperLimbVisibility(.hidden)
    }
}

Copy and paste this code to use it.

Additional Steps

  • Set any text for “NSHandsTrackingUsageDescription” in Info.plist.
  • Set “Preferred Default Scene Session Role” to “Immersive Space” in Info.plist.
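For reference, the corresponding raw Info.plist entries look roughly like this (the raw key and value names are the ones Xcode uses; verify them in your Xcode version):

<key>NSHandsTrackingUsageDescription</key>
<string>Hand tracking is used to place spheres on your hand joints.</string>
<key>UIApplicationPreferredDefaultSceneSessionRole</key>
<string>UISceneSessionRoleImmersiveSpaceApplication</string>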

[Image: Info.plist settings in Xcode]

Comments

Explanations given in the session video, as well as basic knowledge of SwiftUI and RealityKit, are omitted here.

Managing Each Entity

jointEntity.name = "\(jointName)\(chirality)"
let jointEntity = rootEntity.findEntity(named: "\(jointName)\(handAnchor.chirality)")

Each entity is identified by a name composed of HandSkeleton.JointName (the joint) and HandAnchor.Chirality (left or right hand).

Setting Access Permissions

To request access permission, you need to set a usage description string for “NSHandsTrackingUsageDescription” in Info.plist.

This key does not appear in the pull-down menu, so enter it directly.

[Image: Xcode's Info.plist key pull-down menu]

Launch the App in Full Space

@main
struct MyApp: App {
    ...
    var body: some SwiftUI.Scene {
        ImmersiveSpace {
            ...
        }
        ...
    }
}

When you create a new visionOS app project in Xcode, it generates code to launch in a window. For simplicity, I made it launch in full space.

Set “Preferred Default Scene Session Role” to “Immersive Space” in Info.plist. If you do not make this setting, the app will crash right after launch.
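If you keep the default window-based launch instead, a minimal sketch would open the immersive space manually from the window (the space identifier "Hands" and the view names are hypothetical):

import SwiftUI
import RealityKit

@main
struct MyWindowedApp: App {
    var body: some SwiftUI.Scene {
        // The default template launches into this window first.
        WindowGroup {
            StartView()
        }
        // The immersive space is opened on demand from the window.
        ImmersiveSpace(id: "Hands") {
            RealityView { _ in
                // Hand tracking setup goes here.
            }
        }
    }
}

struct StartView: View {
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace

    var body: some View {
        Button("Start hand tracking") {
            Task { _ = await openImmersiveSpace(id: "Hands") }
        }
    }
}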

Note: A Physical Device is Required

You need a physical device to test the ARKit Hand Tracking API. It does not work at all in the simulator.

Next Steps

  • Check the current authorization status: session.queryAuthorization(for:)
  • Explicitly request authorization: session.requestAuthorization(for:)
  • Check the state of anchors: AnchorUpdate.Event
  • Check if each anchor or joint is being tracked: TrackableAnchor.isTracked
  • Observe the session state: ARKitSession.Events
  • Check if the current runtime environment supports it: HandTrackingProvider.isSupported
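For example, here is a minimal sketch that combines the support and authorization checks before starting the session (the function name is illustrative):

import ARKit

func startHandTracking(_ session: ARKitSession,
                       _ provider: HandTrackingProvider) async throws {
    // Hand tracking is unavailable in some environments (e.g. the simulator).
    guard HandTrackingProvider.isSupported else { return }

    // Request authorization explicitly instead of relying on the
    // implicit prompt from session.run(_:).
    let status = await session.requestAuthorization(for: [.handTracking])
    guard status[.handTracking] == .allowed else { return }

    try await session.run([provider])
}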

Links

Meet ARKit for spatial computing - WWDC23 - Videos - Apple Developer

ARKit in visionOS | Apple Developer Documentation

upperLimbVisibility(_:) | Apple Developer Documentation

HandsRuler on the App Store

FlipByBlink/HandsRuler: Measure app by hand tracking for Apple Vision Pro

...


