ARKit hide objects behind walls
How can I use the horizontal and vertical planes tracked by ARKit to hide objects behind walls or behind real objects? Currently the added 3D objects can be seen through walls when you leave a room, and/or in front of objects that they should be behind. So is it possible to use the data ARKit gives me to provide a more natural AR experience, without the objects appearing through walls?
You have two issues here.
How to create occlusion geometry for ARKit/SceneKit?
If you set a SceneKit material's colorBufferWriteMask to an empty value ([] in Swift), any objects using that material won't appear in the view, but they'll still write to the z-buffer during rendering, which affects the rendering of other objects. In effect, you get a "hole" shaped like your object, through which the background shows (the camera feed, in the case of ARSCNView), but which can still obscure other SceneKit objects.
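A minimal sketch of such an occlusion material, wrapped in a hypothetical helper (makeOccluder is not an ARKit API, just an illustrative function name):

```swift
import SceneKit

// Sketch: build a node that acts as occlusion geometry.
// Its material writes depth but no color, so virtual objects
// behind it are hidden while the camera feed shows through.
func makeOccluder(width: CGFloat, height: CGFloat) -> SCNNode {
    let material = SCNMaterial()
    material.colorBufferWriteMask = []   // write nothing to the color buffer
    material.isDoubleSided = true        // occlude from both sides of the plane

    let plane = SCNPlane(width: width, height: height)
    plane.materials = [material]

    let node = SCNNode(geometry: plane)
    node.renderingOrder = -1             // render before default-order (0) nodes
    return node
}
```

You would position the returned node where a detected plane (or a user-tagged wall) sits in the scene.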
You'll also need to make sure the occlusion geometry renders before any other nodes it's supposed to obscure. You can do this using node hierarchy (I can't remember offhand whether parent nodes render before their children or the other way around, but it's easy enough to test). Nodes that are peers in the hierarchy don't have a deterministic order, but you can force an order regardless of hierarchy with the renderingOrder property. That property defaults to zero, so setting it to -1 will render before everything. (Or, for finer control, set the renderingOrder values of several nodes to a sequence.)
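For example, a few peer nodes can be given an explicit sequence of renderingOrder values (the node names here are just illustrative):

```swift
import SceneKit

// Sketch: force an explicit render order across peer nodes,
// independent of their position in the node hierarchy.
let wallOccluder = SCNNode(geometry: SCNPlane(width: 3, height: 2))
let chair = SCNNode(geometry: SCNBox(width: 0.5, height: 1, length: 0.5, chamferRadius: 0))
let lamp = SCNNode(geometry: SCNSphere(radius: 0.2))

// Lower values render first; every node defaults to 0.
wallOccluder.renderingOrder = -1   // occluder draws before everything else
chair.renderingOrder = 0
lamp.renderingOrder = 1            // drawn last
```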
The example code project on the ARKit developer site does this — it has an option to make detected planes occlude virtual objects. (There's a lot of other code in that project, so it may take some work to find; look in the Plane class, IIRC.)
How to detect walls/etc so you know where to put occlusion geometry?
Sorry, this is one thing ARKit can't help with much. It detects only horizontal planes, plus a random-looking point cloud of feature points that has something to do with the scene.
You might be able to try inferring the positions of walls based on where plane detection finds a floor, or get the user to tag some walls using hit detection against feature points, or feed the camera image to some other software (an ML model maybe?) that can help you identify walls. As it is, out of the box ARKit only really provides easy help for making (one-)room-scale AR experiences — more is possible, but it’ll take more work on your end.
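One of those options — letting the user tag a wall by hit-testing against feature points — might look roughly like this (a sketch, not a complete solution: the fixed patch size and the direct use of the hit result's transform are simplifying assumptions, and in practice you'd want to orient the plane to face the camera or align it with nearby points):

```swift
import ARKit
import SceneKit

// Sketch: when the user taps on a wall in the camera view, hit-test
// against ARKit's feature points and drop a depth-only occluder there.
func placeWallOccluder(at point: CGPoint, in sceneView: ARSCNView) {
    guard let result = sceneView.hitTest(point, types: .featurePoint).first else {
        return  // no feature point near the tap
    }

    let material = SCNMaterial()
    material.colorBufferWriteMask = []          // invisible, but writes depth
    material.isDoubleSided = true

    let patch = SCNPlane(width: 2, height: 2)   // rough wall-sized patch (assumption)
    patch.materials = [material]

    let node = SCNNode(geometry: patch)
    node.simdTransform = result.worldTransform  // place at the hit location
    node.renderingOrder = -1                    // render before virtual content
    sceneView.scene.rootNode.addChildNode(node)
}
```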