Recently I've been helping other studios build AR applications for their final presentations.
This video shows the process of scanning point clouds with a depth-sensing iPhone.
This one is the first demo combining object tracking with environment occlusion. It tracks the object scanned in the first video, and includes buttons to toggle different layers on and off.
The new features in AR Foundation in Unity are quite nice: they're easy to use, and they make it easy to collaborate with my teammates.
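For anyone curious how the layer-toggle buttons might be wired up, here is a minimal sketch of a Unity script using AR Foundation. This is not the actual project code: the class name, the `layers` array, and the Inspector wiring are all assumptions for illustration. It assumes an `AROcclusionManager` sits on the AR camera and that each content layer is a separate GameObject.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Hypothetical example: hook these methods to UI Button OnClick events
// to toggle content layers and environment occlusion at runtime.
public class LayerToggle : MonoBehaviour
{
    // Assumed references, assigned in the Inspector.
    public AROcclusionManager occlusionManager; // on the AR Camera
    public GameObject[] layers;                 // one GameObject per content layer

    // Show or hide one content layer by index.
    public void ToggleLayer(int index)
    {
        layers[index].SetActive(!layers[index].activeSelf);
    }

    // Turn environment (depth-based) occlusion on or off.
    public void ToggleOcclusion()
    {
        bool isOn = occlusionManager.requestedEnvironmentDepthMode
                    != EnvironmentDepthMode.Disabled;
        occlusionManager.requestedEnvironmentDepthMode =
            isOn ? EnvironmentDepthMode.Disabled : EnvironmentDepthMode.Best;
    }
}
```

Keeping each layer as its own GameObject makes this kind of toggling trivial and is also convenient when teammates work on different layers in parallel.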