I made a machine-learning AR experience that labels arbitrary objects in the camera feed. It uses SwiftUI, RealityKit, ARKit, LiDAR, a custom RealityKit text Swift package that I made, CoreML, and Vision.
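For context, the usual shape of this pipeline is to hand each ARKit camera frame to Vision, which runs a CoreML classifier. A minimal sketch, assuming a bundled image classification model (MobileNetV2 here is a stand-in; any Xcode-generated classifier class works):

```swift
import ARKit
import Vision

final class FrameClassifier {
    private let request: VNCoreMLRequest

    init(onLabel: @escaping (String) -> Void) throws {
        // MobileNetV2 is an assumed bundled model; Xcode generates this
        // class from the .mlmodel file.
        let model = try VNCoreMLModel(for: MobileNetV2(configuration: .init()).model)
        request = VNCoreMLRequest(model: model) { request, _ in
            // Report the top classification, if any.
            guard let top = (request.results as? [VNClassificationObservation])?.first
            else { return }
            onLabel(top.identifier)
        }
        request.imageCropAndScaleOption = .centerCrop
    }

    // Call this from session(_:didUpdate:) with the latest ARFrame.
    func classify(_ frame: ARFrame) {
        // capturedImage is the camera buffer in its native (landscape)
        // orientation; .right maps it to a portrait UI.
        let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage,
                                            orientation: .right)
        try? handler.perform([request])
    }
}
```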
I rendered a LiDAR point cloud colored by elevation (world-space height) instead of depth.
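The coloring trick is a small one: instead of keying each point's color to its distance from the camera, key it to its world-space y. A minimal sketch of that mapping, where the minY/maxY bounds are my assumption (in practice you would track the cloud's running elevation range):

```swift
import simd
import UIKit

// Map a world-space point's height to a hue: blue at the bottom of the
// range, red at the top.
func elevationColor(for worldPoint: SIMD3<Float>,
                    minY: Float, maxY: Float) -> UIColor {
    // Normalize elevation into [0, 1], clamped at the bounds.
    let t = max(0, min(1, (worldPoint.y - minY) / (maxY - minY)))
    return UIColor(hue: CGFloat(0.66 * (1 - t)),
                   saturation: 1, brightness: 1, alpha: 1)
}
```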
I challenged myself to rig a custom 3D character to use as a BodyTrackedEntity.
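Once the rig is done, RealityKit drives the character from ARKit body tracking after it loads as a BodyTrackedEntity. A minimal loading sketch, assuming a USDZ named "character/robot" (Apple's sample asset name; a custom rig must match ARKit's joint hierarchy) and a session already running ARBodyTrackingConfiguration:

```swift
import RealityKit
import Combine

var cancellable: AnyCancellable?

func loadCharacter(into arView: ARView) {
    // An AnchorEntity targeting .body follows the tracked person.
    let bodyAnchor = AnchorEntity(.body)
    arView.scene.addAnchor(bodyAnchor)

    cancellable = Entity.loadBodyTrackedAsync(named: "character/robot")
        .sink(receiveCompletion: { completion in
            if case .failure(let error) = completion {
                print("Couldn't load character: \(error)")
            }
        }, receiveValue: { (character: BodyTrackedEntity) in
            // The character animates with the person's joints once parented.
            bodyAnchor.addChild(character)
        })
}
```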
I created a body tracking app in SwiftUI and RealityKit. It lets you select which joint groups to display, with both 2D and 3D options.
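A sketch of the joint-reading side, where grouping joints into named sets (arms, legs, spine, and so on) is my assumption on top of ARKit's flat joint list:

```swift
import ARKit

// Returns world-space positions for the requested joints of a tracked body.
func jointPositions(for bodyAnchor: ARBodyAnchor,
                    names: Set<String>) -> [String: SIMD3<Float>] {
    let skeleton = bodyAnchor.skeleton
    var positions: [String: SIMD3<Float>] = [:]
    for (index, jointName) in skeleton.definition.jointNames.enumerated()
    where names.contains(jointName) {
        // Joint transforms are relative to the body anchor, so multiply
        // by the anchor's transform to get world space (the 3D option;
        // the 2D option instead uses ARFrame.detectedBody's normalized
        // screen-space landmarks).
        let world = bodyAnchor.transform * skeleton.jointModelTransforms[index]
        positions[jointName] = SIMD3<Float>(world.columns.3.x,
                                            world.columns.3.y,
                                            world.columns.3.z)
    }
    return positions
}
```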
I worked with Metal and ARKit person segmentation to make this fun little demo!
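A sketch of the ARKit half of that setup; the Metal half (sampling the matte in a shader to composite against the camera image) is omitted here:

```swift
import ARKit

func startPersonSegmentation(on session: ARSession) {
    let config = ARWorldTrackingConfiguration()
    // Not all devices support person segmentation; check first.
    guard ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentation)
    else { return }
    config.frameSemantics.insert(.personSegmentation)
    session.run(config)
    // Each ARFrame then exposes frame.segmentationBuffer, a CVPixelBuffer
    // marking person pixels -- bind it as a Metal texture in your shader.
}
```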
I thought it would be cool to see if I could make an app for trying on an Apple Watch in AR, and it worked!
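One plausible way to build a try-on like this is body tracking: pin the watch model to the tracked hand joint every frame. A minimal sketch under that assumption (the asset and the wrist fit are placeholders, not how the demo necessarily did it):

```swift
import ARKit
import RealityKit

// Call from session(_:didUpdate:) whenever the ARBodyAnchor updates.
func updateWatch(_ watch: Entity, bodyAnchor: ARBodyAnchor) {
    // Joint transforms are relative to the body anchor's root.
    guard let hand = bodyAnchor.skeleton.modelTransform(for: .leftHand)
    else { return }
    // Pin the watch at the hand joint; a real fit would offset it
    // along the forearm toward the wrist.
    watch.setTransformMatrix(bodyAnchor.transform * hand, relativeTo: nil)
}
```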
This uses a video material and people occlusion to make it look like you are under the sea!
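A minimal sketch of the two pieces working together, assuming a bundled "underwater.mp4":

```swift
import ARKit
import AVFoundation
import RealityKit

func makeUnderwaterScene(in arView: ARView) {
    // People occlusion lets real people render in front of the video.
    let config = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        config.frameSemantics.insert(.personSegmentationWithDepth)
    }
    arView.session.run(config)

    guard let url = Bundle.main.url(forResource: "underwater", withExtension: "mp4")
    else { return }
    let player = AVPlayer(url: url)
    // VideoMaterial plays the AVPlayer's output as a surface texture.
    let screen = ModelEntity(mesh: .generatePlane(width: 4, height: 2.5),
                             materials: [VideoMaterial(avPlayer: player)])
    let anchor = AnchorEntity(world: [0, 0, -2]) // two meters ahead
    anchor.addChild(screen)
    arView.scene.addAnchor(anchor)
    player.play()
}
```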
Once I heard that ARKit had eye tracking, I had to try it out!
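For reference, the eye data comes from face tracking: ARFaceAnchor carries per-eye transforms and a convergence point. A minimal sketch of reading them in an ARSessionDelegate, assuming a session running ARFaceTrackingConfiguration:

```swift
import ARKit

func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    for case let faceAnchor as ARFaceAnchor in anchors {
        // Where the eyes converge, in the face anchor's coordinate space.
        let lookAt = faceAnchor.lookAtPoint
        // Per-eye transforms, also relative to the face anchor.
        let leftEye = faceAnchor.leftEyeTransform
        let rightEye = faceAnchor.rightEyeTransform
        print("lookAt: \(lookAt), eyes: \(leftEye.columns.3), \(rightEye.columns.3)")
    }
}
```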
Glad to get to try out ARAppClipCodeAnchor!
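A minimal sketch of detecting App Clip Codes (iOS 14.3+); the URL decodes asynchronously, so it arrives in an anchor update:

```swift
import ARKit

func startAppClipCodeTracking(on session: ARSession) {
    guard ARWorldTrackingConfiguration.supportsAppClipCodeTracking else { return }
    let config = ARWorldTrackingConfiguration()
    config.appClipCodeTrackingEnabled = true
    session.run(config)
}

// In ARSessionDelegate:
func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    for case let clipAnchor as ARAppClipCodeAnchor in anchors
    where clipAnchor.urlDecodingState == .decoded {
        // The anchor's transform also gives the code's position in space.
        print("App Clip Code URL: \(String(describing: clipAnchor.url))")
    }
}
```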
I made an app that uses world map persistence to automatically place a virtual portrait at the same position on a wall where you placed it before, even after you close the app and re-open it.
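A minimal sketch of the save/restore cycle behind that persistence, where mapURL is an assumed file location:

```swift
import ARKit

let mapURL = FileManager.default.urls(for: .documentDirectory,
                                      in: .userDomainMask)[0]
    .appendingPathComponent("worldMap")

// Save: ARWorldMap carries your anchors (the portrait's wall position)
// plus the feature points ARKit needs to relocalize later.
func saveWorldMap(from session: ARSession) {
    session.getCurrentWorldMap { worldMap, _ in
        guard let map = worldMap,
              let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                           requiringSecureCoding: true)
        else { return }
        try? data.write(to: mapURL)
    }
}

// Restore: relocalize against the saved map; saved anchors come back at
// the same real-world positions.
func restoreSession(on session: ARSession) {
    let config = ARWorldTrackingConfiguration()
    if let data = try? Data(contentsOf: mapURL),
       let map = try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                         from: data) {
        config.initialWorldMap = map
    }
    session.run(config, options: [.resetTracking, .removeExistingAnchors])
}
```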
I experimented with an app that detects when you start running and gives you your time when you cross the finish line.
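One way to sketch the timing logic (not necessarily how the demo did it): estimate speed from frame-to-frame camera movement to notice the start, then stop the clock near a finish anchor. The 2 m/s and 0.5 m thresholds are guesses:

```swift
import ARKit
import simd

final class RaceTimer {
    private var startTime: TimeInterval?
    private var lastPosition: SIMD3<Float>?
    private var lastTimestamp: TimeInterval?

    // Call from session(_:didUpdate:) with each new frame.
    func update(frame: ARFrame, finishAnchor: ARAnchor) {
        let c = frame.camera.transform.columns.3
        let position = SIMD3(c.x, c.y, c.z)
        defer { lastPosition = position; lastTimestamp = frame.timestamp }

        // Crude speed estimate from camera movement between frames.
        if startTime == nil, let last = lastPosition, let lastT = lastTimestamp,
           frame.timestamp > lastT {
            let speed = simd_distance(position, last) / Float(frame.timestamp - lastT)
            if speed > 2 { startTime = frame.timestamp } // "started running"
        }

        // Stop the clock when the runner reaches the finish anchor.
        let f = finishAnchor.transform.columns.3
        if let start = startTime,
           simd_distance(position, SIMD3(f.x, f.y, f.z)) < 0.5 {
            print("Time: \(frame.timestamp - start) s")
            startTime = nil
        }
    }
}
```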
Back to Top