ARKit hide objects behind walls
How can I use the horizontal and vertical planes tracked by ARKit to hide objects behind walls/ behind real objects? Currently the 3D added objects can be seen through walls when you leave a room and/ or in front of objects that they should be behind. So is it possible to use the data ARKit gives me to provide a more natural AR experience without the objects appearing through walls?
You have two issues here.
How to create occlusion geometry for ARKit/SceneKit?
If you set a SceneKit material's colorBufferWriteMask to an empty value ([] in Swift 4), any objects using that material won't appear in the view, but they'll still write to the z-buffer during rendering, which affects the rendering of other objects. In effect, you get a "hole" shaped like your object, through which the background shows (the camera feed, in the case of ARSCNView), but which can still obscure other SceneKit objects.
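The technique above can be sketched as follows. This is a minimal illustration, not code from the Apple sample; the plane dimensions and node names are placeholders you'd replace with geometry matched to the real-world surface.

```swift
import SceneKit

// Occlusion material: writes to the depth buffer but not the color buffer,
// so anything behind it is hidden while the camera feed shows through.
let occlusionMaterial = SCNMaterial()
occlusionMaterial.colorBufferWriteMask = []   // [] = write no color channels
occlusionMaterial.isDoubleSided = true        // occlude from either side

// Placeholder geometry standing in for a detected wall.
let occluderGeometry = SCNPlane(width: 2.0, height: 3.0)
occluderGeometry.materials = [occlusionMaterial]

let occluderNode = SCNNode(geometry: occluderGeometry)
occluderNode.renderingOrder = -1              // draw before regular content
```

Because the material writes depth but no color, the node punches a camera-feed-shaped hole wherever it sits in front of virtual content.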
You'll also need to make sure the occluder renders before any other nodes it's supposed to obscure. You can do this using node hierarchy (I can't remember offhand whether parent nodes render before their children or the other way around, but it's easy enough to test). Nodes that are peers in the hierarchy don't have a deterministic order, but you can force an order regardless of hierarchy with the renderingOrder property. That property defaults to zero, so setting it to -1 renders before everything. (Or for finer control, set the renderingOrder of several nodes to a sequence of values.)
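As a sketch of that ordering (node names here are hypothetical, not from the original answer):

```swift
import SceneKit

// Occluders draw first; virtual content draws afterward and gets
// depth-tested against the occluders' z-buffer writes.
let wallOccluderNode = SCNNode()
let floorOccluderNode = SCNNode()
let virtualObjectNode = SCNNode()

wallOccluderNode.renderingOrder = -2   // drawn earliest
floorOccluderNode.renderingOrder = -1
virtualObjectNode.renderingOrder = 0   // the default; drawn after occluders
```

Lower values render earlier, so giving every occluder a negative renderingOrder guarantees their depth writes land before any visible content is rasterized.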
The example code project on the ARKit developer site does this — it has an option to make detected planes occlude virtual objects. (There's kind of a lot of other code in that project, so it may take some work to find it. Look in the Plane class, IIRC.)
How to detect walls/etc so you know where to put occlusion geometry?
Sorry, this one ARKit can't help with much. It detects only horizontal planes, plus a random-looking point cloud of scene features.
You might be able to try inferring the positions of walls based on where plane detection finds a floor, or get the user to tag some walls using hit detection against feature points, or feed the camera image to some other software (an ML model maybe?) that can help you identify walls. As it is, out of the box ARKit only really provides easy help for making (one-)room-scale AR experiences — more is possible, but it’ll take more work on your end.
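The "get the user to tag some walls" idea could look something like this hedged sketch: on a tap, hit-test against feature points and drop a vertical occlusion plane at the result. The function name, gesture wiring, and plane size are all illustrative assumptions, and ARHitTestResult gives no surface orientation, so the plane's rotation would need manual adjustment or a second tap.

```swift
import ARKit
import SceneKit

// Illustrative handler: user taps a real wall in the ARSCNView, and we place
// an invisible occluder at the nearest feature-point hit.
func placeWallOccluder(at screenPoint: CGPoint, in sceneView: ARSCNView) {
    guard let result = sceneView.hitTest(screenPoint, types: .featurePoint).first
    else { return }

    // Plane size is a guess; a real app would let the user drag to size it.
    let occluder = SCNNode(geometry: SCNPlane(width: 3.0, height: 3.0))
    let material = SCNMaterial()
    material.colorBufferWriteMask = []     // depth-only: hides, shows camera feed
    material.isDoubleSided = true
    occluder.geometry?.firstMaterial = material
    occluder.renderingOrder = -1           // draw before virtual content

    // Position the occluder at the hit; orientation still needs user input.
    occluder.simdTransform = result.worldTransform
    sceneView.scene.rootNode.addChildNode(occluder)
}
```

This pairs the occlusion-material trick from the first half of the answer with feature-point hit testing, which is about as much wall detection as this version of ARKit offers out of the box.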