How can I export the ARMeshGeometry generated by the new SceneReconstruction API on the latest iPad Pro to an .obj file?

Here's the SceneReconstruction documentation.

Andy Jazz
zhou junhua

3 Answers


Starting with Apple's Visualizing Scene Semantics sample app, you can retrieve the ARMeshGeometry objects from the ARMeshAnchors in the frame.

The easiest approach to exporting the data is to first convert it to an MDLMesh:

extension ARMeshGeometry {
    func toMDLMesh(device: MTLDevice) -> MDLMesh {
        let allocator = MTKMeshBufferAllocator(device: device)

        // Copy the vertex data out of the ARKit-owned buffer
        let data = Data(bytes: vertices.buffer.contents(), count: vertices.stride * vertices.count)
        let vertexBuffer = allocator.newBuffer(with: data, type: .vertex)

        // Copy the triangle indices
        let indexData = Data(bytes: faces.buffer.contents(), count: faces.bytesPerIndex * faces.count * faces.indexCountPerPrimitive)
        let indexBuffer = allocator.newBuffer(with: indexData, type: .index)

        let submesh = MDLSubmesh(indexBuffer: indexBuffer,
                                 indexCount: faces.count * faces.indexCountPerPrimitive,
                                 indexType: .uInt32,
                                 geometryType: .triangles,
                                 material: nil)

        // Describe the layout: a single float3 position attribute
        let vertexDescriptor = MDLVertexDescriptor()
        vertexDescriptor.attributes[0] = MDLVertexAttribute(name: MDLVertexAttributePosition,
                                                            format: .float3,
                                                            offset: 0,
                                                            bufferIndex: 0)
        vertexDescriptor.layouts[0] = MDLVertexBufferLayout(stride: vertices.stride)

        return MDLMesh(vertexBuffer: vertexBuffer,
                       vertexCount: vertices.count,
                       descriptor: vertexDescriptor,
                       submeshes: [submesh])
    }
}

Once you have the MDLMesh, exporting to an OBJ file is a breeze:

    @IBAction func exportMesh(_ button: UIButton) {
        guard let meshAnchors = arView.session.currentFrame?.anchors.compactMap({ $0 as? ARMeshAnchor }) else {
            print("no mesh anchors available")
            return
        }

        DispatchQueue.global().async {

            let directory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
            let filename = directory.appendingPathComponent("MyFirstMesh.obj")

            guard let device = MTLCreateSystemDefaultDevice() else {
                print("metal device could not be created")
                return
            }

            let asset = MDLAsset()

            for anchor in meshAnchors {
                let mdlMesh = anchor.geometry.toMDLMesh(device: device)
                asset.add(mdlMesh)
            }

            do {
                try asset.export(to: filename)
            } catch {
                print("failed to write to file")
            }
        }
    }
swiftcoder
  • Hi @swiftcoder! Thank you for your answer. It looks convincing. Have you tested it? Does `OBJ` export work? I can't test it 'cause I have no iPad with a LiDAR scanner. – Andy Jazz Apr 11 '20 at 03:52
  • Yes, I used this code (added to the sample app), to scan objects in my apartment. Note that if you scan a large area, you will end up with multiple mesh anchors, so you need to run this code for each one, and add them all to the MDLAsset. – swiftcoder Apr 12 '20 at 04:19
  • Thanks @swiftcoder! Where do I place the code let mdlMesh = anchor.geometry.toMDLMesh()... in the example? Did you use an extra IBAction for that? – Florian Thürkow Apr 24 '20 at 08:50
  • Yes, I added a new IBAction (I've updated the answer to include it), and then wired that up to an "Export" button in the UI. – swiftcoder Apr 25 '20 at 14:50
  • Where exactly is the .obj saved? And how can I access it? – Kevin Oswaldo Jan 29 '21 at 21:19
  • The sample code saves the obj file in the documents directory. You should be able to find the documents directory in the iOS Files app. – swiftcoder Jan 30 '21 at 09:41
  • What sort of size do the files come out as for this? Are we talking a few MBs or GBs? – ThinkOutside Feb 23 '21 at 15:19
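A note on the Files-app question above: by default an app's documents directory is not browsable there. To make it visible, the app's Info.plist usually needs the standard iOS file-sharing keys (this is an assumption about the sample app's default configuration; the keys themselves are the standard ones):

```xml
<key>UIFileSharingEnabled</key>
<true/>
<key>LSSupportsOpeningDocumentsInPlace</key>
<true/>
```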

@swiftcoder's answer works great, but in the case of several anchors you need to convert the vertex coordinates to the world coordinate system using each anchor's transform. Otherwise all the meshes will be placed at the origin and you will end up with a mess.

The updated code looks like this:

extension ARMeshGeometry {
    func toMDLMesh(device: MTLDevice, transform: simd_float4x4) -> MDLMesh {
        let allocator = MTKMeshBufferAllocator(device: device)

        let data = Data.init(bytes: transformedVertexBuffer(transform), count: vertices.stride * vertices.count)
        let vertexBuffer = allocator.newBuffer(with: data, type: .vertex)

        let indexData = Data.init(bytes: faces.buffer.contents(), count: faces.bytesPerIndex * faces.count * faces.indexCountPerPrimitive)
        let indexBuffer = allocator.newBuffer(with: indexData, type: .index)

        let submesh = MDLSubmesh(indexBuffer: indexBuffer,
                                 indexCount: faces.count * faces.indexCountPerPrimitive,
                                 indexType: .uInt32,
                                 geometryType: .triangles,
                                 material: nil)

        let vertexDescriptor = MDLVertexDescriptor()
        vertexDescriptor.attributes[0] = MDLVertexAttribute(name: MDLVertexAttributePosition,
                                                            format: .float3,
                                                            offset: 0,
                                                            bufferIndex: 0)
        vertexDescriptor.layouts[0] = MDLVertexBufferLayout(stride: vertices.stride)

        return MDLMesh(vertexBuffer: vertexBuffer,
                       vertexCount: vertices.count,
                       descriptor: vertexDescriptor,
                       submeshes: [submesh])
    }

    func transformedVertexBuffer(_ transform: simd_float4x4) -> [Float] {
        var result = [Float]()
        for index in 0..<vertices.count {
            let vertexPointer = vertices.buffer.contents().advanced(by: vertices.offset + vertices.stride * index)
            let vertex = vertexPointer.assumingMemoryBound(to: (Float, Float, Float).self).pointee
            var vertexTransform = matrix_identity_float4x4
            vertexTransform.columns.3 = SIMD4<Float>(vertex.0, vertex.1, vertex.2, 1)
            let position = (transform * vertexTransform).position
            result.append(position.x)
            result.append(position.y)
            result.append(position.z)
        }
        return result
    }
}

extension simd_float4x4 {
    var position: SIMD3<Float> {
        return SIMD3<Float>(columns.3.x, columns.3.y, columns.3.z)
    }
}

extension Array where Element == ARMeshAnchor {
    func save(to fileURL: URL, device: MTLDevice) throws {
        let asset = MDLAsset()
        self.forEach {
            let mesh = $0.geometry.toMDLMesh(device: device, transform: $0.transform)
            asset.add(mesh)
        }
        try asset.export(to: fileURL)
    }
}

I am not a Model I/O expert, and maybe there is a simpler way to transform the vertex buffer :) But this code works for me.
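One simplification worth noting: instead of packing each vertex into its own 4×4 matrix, you can multiply the anchor transform by the homogeneous vertex vector directly. A minimal, self-contained sketch (`anchorTransform` and `localVertex` are made-up stand-ins for `ARMeshAnchor.transform` and one decoded vertex):

```swift
import simd

// Stand-in for ARMeshAnchor.transform: identity rotation plus a translation.
let anchorTransform = simd_float4x4(columns: (
    SIMD4<Float>(1, 0, 0, 0),
    SIMD4<Float>(0, 1, 0, 0),
    SIMD4<Float>(0, 0, 1, 0),
    SIMD4<Float>(2, 3, 4, 1)    // translation lives in the last column
))

let localVertex = SIMD3<Float>(0.5, 0.25, -1.0)

// matrix * homogeneous vector applies rotation and translation in one step
let world = anchorTransform * SIMD4<Float>(localVertex.x, localVertex.y, localVertex.z, 1)
let worldVertex = SIMD3<Float>(world.x, world.y, world.z)
// worldVertex == SIMD3<Float>(2.5, 3.25, 3.0)
```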


This code snippet allows you to save LiDAR geometry as USD and send it to a Mac via AirDrop. You can export not only .usd but also .usda, .usdc, .obj, .stl, .abc, and .ply file formats.

Additionally, you can use SceneKit's write(to:options:delegate:progressHandler:) method to save a .usdz version of the file.
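That SceneKit route could look roughly like this (an untested sketch; here an empty `MDLAsset` stands in for the asset built in the code below, and the file name is arbitrary):

```swift
import Foundation
import SceneKit
import SceneKit.ModelIO
import ModelIO

// Stand-in for the MDLAsset built in the main snippet below.
let asset = MDLAsset()
let documentsURL = FileManager.default.urls(for: .documentDirectory,
                                            in: .userDomainMask)[0]
let usdzURL = documentsURL.appendingPathComponent("model.usdz")

// Wrap the Model I/O asset in a SceneKit scene and write a .usdz archive.
let scene = SCNScene(mdlAsset: asset)

scene.write(to: usdzURL, options: nil, delegate: nil) { totalProgress, error, _ in
    if let error = error {
        print("USDZ export failed: \(error.localizedDescription)")
    } else {
        print("USDZ export progress: \(totalProgress)")
    }
}
```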

import RealityKit
import ARKit
import MetalKit
import ModelIO

@IBOutlet var arView: ARView!
var saveButton: UIButton!
let rect = CGRect(x: 50, y: 50, width: 100, height: 50)

override func viewDidLoad() {
    super.viewDidLoad()

    let tui = UIControl.Event.touchUpInside
    saveButton = UIButton(frame: rect)
    saveButton.setTitle("Save", for: [])
    saveButton.addTarget(self, action: #selector(saveButtonTapped), for: tui)
    self.view.addSubview(saveButton)
}

@objc func saveButtonTapped(sender: UIButton) {        
    print("Saving is executing...")
    
    guard let frame = arView.session.currentFrame
    else { fatalError("Can't get ARFrame") }
            
    guard let device = MTLCreateSystemDefaultDevice()
    else { fatalError("Can't create MTLDevice") }
    
    let allocator = MTKMeshBufferAllocator(device: device)        
    let asset = MDLAsset(bufferAllocator: allocator)       
    let meshAnchors = frame.anchors.compactMap { $0 as? ARMeshAnchor }
    
    for ma in meshAnchors {
        let geometry = ma.geometry
        let vertices = geometry.vertices
        let faces = geometry.faces
        let vertexPointer = vertices.buffer.contents()
        let facePointer = faces.buffer.contents()
        
        for vtxIndex in 0 ..< vertices.count {
            
            // `vertex(at:)` is the buffer-reading helper from Apple's
            // Visualizing Scene Semantics sample code
            let vertex = geometry.vertex(at: UInt32(vtxIndex))
            var vertexLocalTransform = matrix_identity_float4x4
            
            vertexLocalTransform.columns.3 = SIMD4<Float>(x: vertex.0,
                                                          y: vertex.1,
                                                          z: vertex.2,
                                                          w: 1.0)
            
            let vertexWorldTransform = (ma.transform * vertexLocalTransform).position                
            let vertexOffset = vertices.offset + vertices.stride * vtxIndex               
            let componentStride = vertices.stride / 3
            
            vertexPointer.storeBytes(of: vertexWorldTransform.x,
                           toByteOffset: vertexOffset,
                                     as: Float.self)
            
            vertexPointer.storeBytes(of: vertexWorldTransform.y,
                           toByteOffset: vertexOffset + componentStride,
                                     as: Float.self)
            
            vertexPointer.storeBytes(of: vertexWorldTransform.z,
                           toByteOffset: vertexOffset + (2 * componentStride),
                                     as: Float.self)
        }
        
        let byteCountVertices = vertices.count * vertices.stride            
        let byteCountFaces = faces.count * faces.indexCountPerPrimitive * faces.bytesPerIndex
        
        let vertexBuffer = allocator.newBuffer(with: Data(bytesNoCopy: vertexPointer, 
                                                                count: byteCountVertices, 
                                                          deallocator: .none), type: .vertex)
        
        let indexBuffer = allocator.newBuffer(with: Data(bytesNoCopy: facePointer, 
                                                               count: byteCountFaces, 
                                                         deallocator: .none), type: .index)
        
        let indexCount = faces.count * faces.indexCountPerPrimitive            
        let material = MDLMaterial(name: "material", 
                     scatteringFunction: MDLPhysicallyPlausibleScatteringFunction())
        
        let submesh = MDLSubmesh(indexBuffer: indexBuffer, 
                                  indexCount: indexCount, 
                                   indexType: .uInt32, 
                                geometryType: .triangles, 
                                    material: material)
        
        let vertexFormat = MTKModelIOVertexFormatFromMetal(vertices.format)
        
        let vertexDescriptor = MDLVertexDescriptor()
        
        vertexDescriptor.attributes[0] = MDLVertexAttribute(name: MDLVertexAttributePosition, 
                                                          format: vertexFormat, 
                                                          offset: 0, 
                                                     bufferIndex: 0)
        
        vertexDescriptor.layouts[0] = MDLVertexBufferLayout(stride: ma.geometry.vertices.stride)
        
        let mesh = MDLMesh(vertexBuffer: vertexBuffer, 
                            vertexCount: ma.geometry.vertices.count, 
                             descriptor: vertexDescriptor, 
                              submeshes: [submesh])

        asset.add(mesh)
    }

    let filePath = FileManager.default.urls(for: .documentDirectory, 
                                             in: .userDomainMask).first!
    
    let usd: URL = filePath.appendingPathComponent("model.usd")

    if MDLAsset.canExportFileExtension("usd") {
        do {
            try asset.export(to: usd)
            
            let controller = UIActivityViewController(activityItems: [usd],
                                              applicationActivities: nil)
            controller.popoverPresentationController?.sourceView = sender
            self.present(controller, animated: true, completion: nil)

        } catch let error {
            fatalError(error.localizedDescription)
        }
    } else {
        fatalError("Can't export USD")
    }
}

After pressing the Save button, choose More in the Activity View Controller and send the ready model to your Mac's Downloads folder via AirDrop.

P.S. And here you can find extra info on capturing a real-world texture.

Andy Jazz