WWDC24: Explore object tracking for visionOS | Apple

Find out how you can use object tracking to turn real-world objects into virtual anchors in your visionOS app. Learn how you can build spatial experiences with object tracking from start to finish. Find out how to create a reference object using machine learning in Create ML and attach content relative to your target object in Reality Composer Pro, RealityKit, or ARKit APIs.

Discuss this video on the Apple Developer Forums:
https://developer.apple.com/forums/topics/spatial-computing?cid=yt-w-0011

Explore related documentation, sample code, and more:
Exploring object tracking with ARKit: https://developer.apple.com/documentation/visionOS/exploring_object_tracking_with_arkit
Meet Object Capture for iOS: https://developer.apple.com/videos/play/wwdc2023/10191
Create enhanced spatial computing experiences with ARKit: https://developer.apple.com/videos/play/wwdc2024/10100
Build a spatial drawing app with RealityKit: https://developer.apple.com/videos/play/wwdc2024/10104
What’s new in Create ML: https://developer.apple.com/videos/play/wwdc2024/10183
Compose interactive 3D content in Reality Composer Pro: https://developer.apple.com/videos/play/wwdc2024/10102

Chapters:
00:00 - Introduction
05:07 - Create reference object
09:28 - Anchor virtual content

More Apple Developer resources:
Video sessions: https://apple.co/VideoSessions
Documentation: https://apple.co/DeveloperDocs
Forums: https://apple.co/DeveloperForums
App: https://apple.co/DeveloperApp
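For context, the ARKit path the session describes comes down to loading a Create ML-trained reference object and streaming object anchors from an ObjectTrackingProvider. The sketch below illustrates that flow on visionOS; the "MyObject.referenceobject" file name and the runObjectTracking function are placeholders, and the exact API surface should be confirmed against the "Exploring object tracking with ARKit" sample linked above.

```swift
import ARKit
import Foundation

/// Minimal sketch of the object-tracking flow, assuming a reference object
/// trained in Create ML is bundled with the app as "MyObject.referenceobject".
func runObjectTracking() async throws {
    guard let url = Bundle.main.url(forResource: "MyObject",
                                    withExtension: "referenceobject") else {
        return
    }

    // Load the reference object produced by Create ML.
    let referenceObject = try await ReferenceObject(from: url)

    // Start an ARKit session with an object-tracking provider.
    let session = ARKitSession()
    let objectTracking = ObjectTrackingProvider(referenceObjects: [referenceObject])
    try await session.run([objectTracking])

    // Receive object anchors as the target object is detected, moves, or is lost.
    for await update in objectTracking.anchorUpdates {
        let anchor = update.anchor
        switch update.event {
        case .added, .updated:
            // originFromAnchorTransform is the object's pose in world space;
            // position virtual content relative to it.
            print("Tracking \(anchor.id): \(anchor.originFromAnchorTransform)")
        case .removed:
            print("Lost \(anchor.id)")
        }
    }
}
```

The same anchors can drive content authored in Reality Composer Pro or placed with RealityKit, as covered in the "Anchor virtual content" chapter of the session.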