[SWIFT] Optical camouflage with ARKit + SceneKit + Metal ②

Continuing from the previous article, "Optical camouflage with ARKit + SceneKit + Metal ①", this time I tried to express an optical camouflage that is malfunctioning.

demo2.gif

How to draw the noise texture

① Generate a block noise texture with a compute shader.
② Add a pass that draws the character using ① as its material.
③ Add ② to the final image generation pass created in the previous article:
・ draw either the optical camouflage image or the block noise image from ②,
・ switching between them at random timing.

If you run the app and take a Capture GPU Frame in Xcode, you can inspect the render passes as follows (checked with Xcode 12). The hand-drawn red outline marks the part added this time.

xcode2.png

This is convenient because you can check what color and depth each pass outputs. Tap the camera icon while debugging to capture a GPU frame.

xcode1.png
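As an aside (not from the original article), a capture can also be triggered programmatically with MTLCaptureManager, which helps when the interesting frame is hard to catch with the camera icon. A minimal sketch, assuming device is the MTLDevice in use (iOS 13+ / Xcode 11+):

import Metal

func captureOneFrame(device: MTLDevice) {
    let manager = MTLCaptureManager.shared()
    let descriptor = MTLCaptureDescriptor()
    descriptor.captureObject = device   //Capture all work on this device
    do {
        try manager.startCapture(with: descriptor)
        //...encode and commit the command buffers to inspect here...
        manager.stopCapture()
    } catch {
        print("GPU capture failed: \(error)")
    }
}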

Generating block noise with a compute shader and setting it on an SCNNode

All that is needed to generate the noise texture is time-varying information (timeParam) and the xy coordinates. timeParam is incremented every time a frame is drawn and passed to the shader, which determines the noise color from that value and the xy coordinates. The noise is generated in renderer(_:updateAtTime:).

・ Shader

shader.metal


//Pseudo-random number generation (the classic GLSL one-liner hash)
float rand(float2 co) {
    return fract(sin(dot(co.xy, float2(12.9898, 78.233))) * 43758.5453);
}

//Block noise image generation shader
kernel void blockNoise(const device float& time [[buffer(0)]],
                       texture2d<float, access::write> out [[texture(0)]],
                       uint2 id [[thread_position_in_grid]]) {
    //8px blocks: integer division truncates, so every 8x8 block of pixels shares the same uv (and color)
    float2 uv = float2(id.x / 8, id.y / 8);
    float noise = fract(rand(rand(float2(float(uv.x * 50 + time), float(uv.y * 50 + time) + time))));
    float4 color = float4(0.0, noise, 0.0, 1.0);
    
    out.write(color, id);
}

・ Swift (shader invocation)

ViewController.swift


private func setupMetal() {
(Omitted)
    //Compute shader for noise creation
    let noiseShader = library.makeFunction(name: "blockNoise")!
    self.computeState = try! self.device.makeComputePipelineState(function: noiseShader)
    //Buffer of time information to pass to shader
    self.timeParamBuffer = self.device.makeBuffer(length: MemoryLayout<Float>.size, options: .cpuCacheModeWriteCombined)
    self.timeParamPointer = UnsafeMutableRawPointer(self.timeParamBuffer.contents()).bindMemory(to: Float.self, capacity: 1)
    //Thread group grid
    self.threadgroupSize = MTLSizeMake(16, 16, 1)
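    //Round up (ceiling division) so the grid covers the whole texture even when its size isn't a multiple of 16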
    let threadCountW = (noiseTetureSize + self.threadgroupSize.width - 1) / self.threadgroupSize.width
    let threadCountH = (noiseTetureSize + self.threadgroupSize.height - 1) / self.threadgroupSize.height
    self.threadgroupCount = MTLSizeMake(threadCountW, threadCountH, 1)
}

func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
    //Incremented every frame
    self.timeParam += 1
    self.timeParamPointer.pointee = self.timeParam
    
    let commandBuffer = self.commandQueue.makeCommandBuffer()!
    let computeEncoder = commandBuffer.makeComputeCommandEncoder()!
    
    computeEncoder.setComputePipelineState(computeState)
    computeEncoder.setBuffer(self.timeParamBuffer, offset: 0, index: 0)
    computeEncoder.setTexture(noiseTexture, index: 0)
    computeEncoder.dispatchThreadgroups(threadgroupCount, threadsPerThreadgroup: threadgroupSize)
    computeEncoder.endEncoding()
    commandBuffer.commit()
    commandBuffer.waitUntilCompleted()
}
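Note that waitUntilCompleted() stalls the CPU every frame. As a possible alternative (my own untested assumption, not something the article does), the compute work could be encoded on SceneKit's own command queue, exposed as SCNSceneRenderer.commandQueue; command buffers on a single queue execute in commit order, so the noise texture should be written before SceneKit renders the frame, with no blocking wait:

func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
    self.timeParam += 1
    self.timeParamPointer.pointee = self.timeParam

    //Sketch (assumption): sharing the renderer's queue orders the compute dispatch before SceneKit's render pass
    guard let queue = renderer.commandQueue,
          let commandBuffer = queue.makeCommandBuffer(),
          let computeEncoder = commandBuffer.makeComputeCommandEncoder() else { return }

    computeEncoder.setComputePipelineState(computeState)
    computeEncoder.setBuffer(timeParamBuffer, offset: 0, index: 0)
    computeEncoder.setTexture(noiseTexture, index: 0)
    computeEncoder.dispatchThreadgroups(threadgroupCount, threadsPerThreadgroup: threadgroupSize)
    computeEncoder.endEncoding()
    commandBuffer.commit()   //No waitUntilCompleted() needed when the queue is shared
}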

The shader's output is written into an MTLTexture. The point is how to hand this texture to the character as its material.

ViewController.swift


//Texture to write noise
let textureDescriptor = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .rgba8Unorm,
                                                                 width: noiseTetureSize,
                                                                 height: noiseTetureSize,
                                                                 mipmapped: false)
textureDescriptor.usage = [.shaderWrite, .shaderRead]
self.noiseTexture = device.makeTexture(descriptor: textureDescriptor)!
//Set noise texture to Node material for optical camouflage
let node = self.rootNode.childNode(withName: "CamouflageNode", recursively: true)!
let material = SCNMaterial()
material.diffuse.contents = self.noiseTexture!
material.emission.contents = self.noiseTexture! //Set emission too so lighting doesn't darken the noise
node.geometry?.materials = [material]

All you have to do is set the generated noise texture as the diffuse.contents of an SCNMaterial and assign that material to the character node's geometry; SceneKit does the rest. I had been experimenting with an SCNProgram-based approach, but settled on the method described in this article.
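For reference, a minimal SCNProgram setup would look roughly like the sketch below (with SCNProgram, SceneKit's built-in shading is replaced entirely). The function names "myVertex" / "myFragment" and the "noiseTexture" key are hypothetical, not from this project:

let program = SCNProgram()
program.vertexFunctionName = "myVertex"      //Hypothetical Metal vertex function
program.fragmentFunctionName = "myFragment"  //Hypothetical Metal fragment function

let material = SCNMaterial()
material.program = program
//Custom textures are bound via key-value coding and sampled through the matching argument name in the Metal shader
material.setValue(SCNMaterialProperty(contents: noiseTexture!), forKey: "noiseTexture")
node.geometry?.materials = [material]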

Multi-pass rendering

The optical camouflage image produced in the previous article is now replaced, at random timings, with this article's noise-textured character to express flickering.

The pass added to the SCNTechnique is as follows.

technique.json


"pass_noise_node" : {
    "draw" : "DRAW_NODE",
    "includeCategoryMask" : 2,
    "outputs" : {
        "color" : "noise_color_node"
    }
},

That's all, since this pass just draws the character with the noise texture applied. Its color output goes to "color" : "noise_color_node".
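One prerequisite worth making explicit: for the passes declared with "includeCategoryMask" : 2 to draw the character at all, the node's categoryBitMask must have that bit set (presumably done in article ①; shown here as a reminder):

//The camouflage node must carry category bit 2 so that passes with "includeCategoryMask" : 2 render it
node.categoryBitMask = 2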

The final image generation shader was changed as follows. The noise_color_node output by the pass above is added as a new argument (noiseColorNode).

shader.metal

//Determines the noise (flicker) timing
bool spike(float time) {
    float flickering = 0.3;     //Flicker probability: larger values flicker more often
    float piriod = -0.8;        //Flicker duration threshold: smaller values make each flicker last longer
    if (rand(time * 0.1) > (1.0 - flickering) && sin(time) > piriod) {
        return true;
    } else {
        return false;
    }
}

//Fragment shader for the pass that composites the whole scene with the node normals
fragment half4 mix_fragment(MixColorInOut vert [[stage_in]],
                            constant SCNSceneBuffer& scn_frame [[buffer(0)]],  //Drawing frame information
                            texture2d<float, access::sample> colorScene [[texture(0)]],
                            depth2d<float,   access::sample> depthScene [[texture(1)]],
                            texture2d<float, access::sample> colorNode [[texture(2)]],
                            depth2d<float,   access::sample> depthNode [[texture(3)]],
                            texture2d<float, access::sample> noiseColorNode [[texture(4)]])
{
    float ds = depthScene.sample(s, vert.uv);    //Depth when drawing the entire scene
    float dn = depthNode.sample(s, vert.uv);     //Depth when drawing a node
    
    float4 fragment_color;
    if (dn > ds) {
        if (spike(scn_frame.time)) {
            //During flicker timing, use the noise texture color
            fragment_color = noiseColorNode.sample(s, fract(vert.uv));
            
        } else {
            //The camouflaged object is in front of the scene geometry, so apply the optical camouflage effect
(Omitted)
        }

spike() produces a random true/false value that switches the display between the noise-textured character and the optical camouflage effect. With flickering = 0.3 and piriod = -0.8, roughly 30% of rand() samples pass the first test and sin(time) > -0.8 holds through most of each cycle, so short bursts of noise appear at random.

Full source code

・ Multi-pass rendering definition

technique.json


{
    "targets" : {
        "color_scene" : { "type" : "color" },
        "depth_scene" : { "type" : "depth" },
        "color_node"  : { "type" : "color" },
        "depth_node"  : { "type" : "depth" },
        "noise_color_node"  : { "type" : "color" }
    },
    "passes" : {
        "pass_scene" : {
            "draw" : "DRAW_SCENE",
            "excludeCategoryMask" : 2,
            "outputs" : {
                "color" : "color_scene",
                "depth" : "depth_scene"
            },
            "colorStates" : {
                "clear" : true,
                "clearColor" : "sceneBackground"
            },
            "depthStates" : {
                "clear" : true,
                "func" : "less"
            }
        },
        "pass_node" : {
            "draw" : "DRAW_NODE",
            "includeCategoryMask" : 2,
            "metalVertexShader" : "node_vertex",
            "metalFragmentShader" : "node_fragment",
            "outputs" : {
                "color" : "color_node",
                "depth" : "depth_node"
            },
            "depthStates" : {
                "clear" : true,
                "func" : "less"
            }
        },
        "pass_noise_node" : {
            "draw" : "DRAW_NODE",
            "includeCategoryMask" : 2,
            "outputs" : {
                "color" : "noise_color_node"
            }
        },
        "pass_mix" : {
            "draw" : "DRAW_QUAD",
            "inputs" : {
                "colorScene" : "color_scene",
                "depthScene" : "depth_scene",
                "colorNode"  : "color_node",
                "depthNode"  : "depth_node",
                "noiseColorNode" : "noise_color_node"
            },
            "metalVertexShader" : "mix_vertex",
            "metalFragmentShader" : "mix_fragment",
            "outputs" : {
                "color" : "COLOR"
            },
            "colorStates" : {
                "clear" : "true"
            }
        }
    },
    "sequence" : [
        "pass_scene",
        "pass_node",
        "pass_noise_node",
        "pass_mix"
    ]
}

・ Shader

#include <metal_stdlib>
using namespace metal;
#include <SceneKit/scn_metal>

// Types passed from SceneKit to the shader
// For the definitions, see https://developer.apple.com/documentation/scenekit/scnprogram
struct VertexInput {
    float4 position [[attribute(SCNVertexSemanticPosition)]];   //Vertex coordinates
    float2 texCoords [[attribute(SCNVertexSemanticTexcoord0)]]; //Texture coordinates
    float3 normal [[attribute(SCNVertexSemanticNormal)]];       //Normal (float3: float2 would not compile in float4(in.normal, 1.0) below)
};

// Types passed from SceneKit to the shader (per node)
// For the definitions, see https://developer.apple.com/documentation/scenekit/scnprogram
struct PerNodeBuffer {
    float4x4 modelViewProjectionTransform;
};

struct NodeColorInOut {
    float4 position [[position]];
    float4 normal;
};

struct MixColorInOut {
    float4 position [[position]];
    float2 uv;
};

//Pseudo-random number generation (the classic GLSL one-liner hash)
float rand(float2 co) {
    return fract(sin(dot(co.xy, float2(12.9898, 78.233))) * 43758.5453);
}

//Determines the noise (flicker) timing
bool spike(float time) {
    float flickering = 0.3;     //Flicker probability: larger values flicker more often
    float piriod = -0.8;        //Flicker duration threshold: smaller values make each flicker last longer
    if (rand(time * 0.1) > (1.0 - flickering) && sin(time) > piriod) {
        return true;
    } else {
        return false;
    }
}

//Vertex shader for nodes
vertex NodeColorInOut node_vertex(VertexInput in [[stage_in]],
                                  constant SCNSceneBuffer& scn_frame [[buffer(0)]],  //Drawing frame information
                                  constant PerNodeBuffer& scn_node [[buffer(1)]])    //Information for each node
{
    NodeColorInOut out;
    out.position = scn_node.modelViewProjectionTransform * in.position;
    out.normal = scn_node.modelViewProjectionTransform * float4(in.normal, 1.0);
    return out;
}

//Fragment shader for nodes
fragment half4 node_fragment(NodeColorInOut vert [[stage_in]])
{
    //Only the x and y components of the normal are used. Convert from -1.0..1.0 to 0.0..1.0 so they can be treated as color
    float4 color =  float4((vert.normal.x + 1.0) * 0.5 , (vert.normal.y + 1.0) * 0.5, 0.0, 0.0);
    return half4(color);        //Output the normal as color information; it is used to distort the background behind the camouflaged object
}

//Vertex shader for the pass that composites the whole scene with the node normals
vertex MixColorInOut mix_vertex(VertexInput in [[stage_in]],
                                        constant SCNSceneBuffer& scn_frame [[buffer(0)]])
{
    MixColorInOut out;
    out.position = in.position;
    //Convert coordinates from -1.0..1.0 to 0.0..1.0 (the y-axis is flipped)
    out.uv = float2((in.position.x + 1.0) * 0.5 , (in.position.y + 1.0) * -0.5);
    return out;
}

constexpr sampler s = sampler(coord::normalized,
                              address::repeat,    // clamp_to_edge / clamp_to_border (iOS 14) didn't work
                              filter::nearest);

//Fragment shader for the pass that composites the whole scene with the node normals
fragment half4 mix_fragment(MixColorInOut vert [[stage_in]],
                            constant SCNSceneBuffer& scn_frame [[buffer(0)]],  //Drawing frame information
                            texture2d<float, access::sample> colorScene [[texture(0)]],
                            depth2d<float,   access::sample> depthScene [[texture(1)]],
                            texture2d<float, access::sample> colorNode [[texture(2)]],
                            depth2d<float,   access::sample> depthNode [[texture(3)]],
                            texture2d<float, access::sample> noiseColorNode [[texture(4)]])
{
    float ds = depthScene.sample(s, vert.uv);    //Depth when drawing the entire scene
    float dn = depthNode.sample(s, vert.uv);     //Depth when drawing a node
    
    float4 fragment_color;
    if (dn > ds) {
        if (spike(scn_frame.time)) {
            //During flicker timing, use the noise texture color
            fragment_color = noiseColorNode.sample(s, fract(vert.uv));
            
        } else {
            //The camouflaged object is in front of the scene geometry, so apply the optical camouflage effect
            float3 normal_map = colorNode.sample(s, vert.uv).rgb;
            //Convert back from 0.0..1.0 to -1.0..1.0 so it can be used as coordinates
            normal_map.xy = normal_map.xy * 2 - 1.0;
            //Shift the sampled background position along the node's normal (xy plane) to slightly distort the background
            float2 uv = vert.uv + normal_map.xy * 0.1;
            if (uv.x > 1.0 ||  uv.x < 0.0) {
                //Avoid sampling colors outside the screen (I wanted to solve this with sampler addressing, but it didn't work)
                fragment_color = colorScene.sample(s, fract(vert.uv));
            } else {
                fragment_color = colorScene.sample(s, fract(uv));
            }
        }
    } else {
        //The camouflaged object is behind the scene geometry, so use the scene color as-is
        fragment_color = colorScene.sample(s, fract(vert.uv));
    }
    
    return half4(fragment_color);
}

//Block noise image generation shader
kernel void blockNoise(const device float& time [[buffer(0)]],
                       texture2d<float, access::write> out [[texture(0)]],
                       uint2 id [[thread_position_in_grid]]) {
    //8px blocks: integer division truncates, so every 8x8 block of pixels shares the same uv (and color)
    float2 uv = float2(id.x / 8, id.y / 8);
    float noise = fract(rand(rand(float2(float(uv.x * 50 + time), float(uv.y * 50 + time) + time))));
    float4 color = float4(0.0, noise, 0.0, 1.0);
    
    out.write(color, id);
}

・ Swift

ViewController.swift


import ARKit
import SceneKit

class ViewController: UIViewController, ARSCNViewDelegate {

    @IBOutlet weak var scnView: ARSCNView!
    
    private var rootNode: SCNNode!

    private let device = MTLCreateSystemDefaultDevice()!
    private var commandQueue: MTLCommandQueue!
    
    private var computeState: MTLComputePipelineState! = nil
    private var noiseTexture: MTLTexture! = nil
    
    private let noiseTetureSize = 256
    private var threadgroupSize: MTLSize!
    private var threadgroupCount: MTLSize!
    private var timeParam: Float = 0
    private var timeParamBuffer: MTLBuffer!
    private var timeParamPointer: UnsafeMutablePointer<Float>!
    
    override func viewDidLoad() {
        super.viewDidLoad()
        //Load the character (borrowed from the WWDC2017 SceneKit demo: https://developer.apple.com/videos/play/wwdc2017/604/)
        guard let scene = SCNScene(named: "art.scnassets/max.scn"),
              let rootNode = scene.rootNode.childNode(withName: "root", recursively: true) else { return }
        self.rootNode = rootNode
        self.rootNode.isHidden = true
        
        //Metal setup
        self.setupMetal()
        //SCNTechnique setup
        self.setupSCNTechnique()
        
        //Start the AR session
        self.scnView.delegate = self
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]
        self.scnView.session.run(configuration, options: [.removeExistingAnchors, .resetTracking])
    }
    
    private func setupMetal() {
        self.commandQueue = self.device.makeCommandQueue()!
        let library = self.device.makeDefaultLibrary()!
        //Texture to write noise
        let textureDescriptor = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .rgba8Unorm,
                                                                         width: noiseTetureSize,
                                                                         height: noiseTetureSize,
                                                                         mipmapped: false)
        textureDescriptor.usage = [.shaderWrite, .shaderRead]
        self.noiseTexture = device.makeTexture(descriptor: textureDescriptor)!
        //Set noise texture to Node material for optical camouflage
        let node = self.rootNode.childNode(withName: "CamouflageNode", recursively: true)!
        let material = SCNMaterial()
        material.diffuse.contents = self.noiseTexture!
        material.emission.contents = self.noiseTexture! //Set emission too so lighting doesn't darken the noise
        node.geometry?.materials = [material]
        //Compute shader for noise creation
        let noiseShader = library.makeFunction(name: "blockNoise")!
        self.computeState = try! self.device.makeComputePipelineState(function: noiseShader)
        //Buffer of time information to pass to shader
        self.timeParamBuffer = self.device.makeBuffer(length: MemoryLayout<Float>.size, options: .cpuCacheModeWriteCombined)
        self.timeParamPointer = UnsafeMutableRawPointer(self.timeParamBuffer.contents()).bindMemory(to: Float.self, capacity: 1)
        //Thread group grid
        self.threadgroupSize = MTLSizeMake(16, 16, 1)
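        //Round up (ceiling division) so the grid covers the whole texture even when its size isn't a multiple of 16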
        let threadCountW = (noiseTetureSize + self.threadgroupSize.width - 1) / self.threadgroupSize.width
        let threadCountH = (noiseTetureSize + self.threadgroupSize.height - 1) / self.threadgroupSize.height
        self.threadgroupCount = MTLSizeMake(threadCountW, threadCountH, 1)
    }
    
    private func setupSCNTechnique() {
        guard let path = Bundle.main.path(forResource: "technique", ofType: "json") else { return }
        let url = URL(fileURLWithPath: path)
        guard let techniqueData = try? Data(contentsOf: url),
              let dict = try? JSONSerialization.jsonObject(with: techniqueData) as? [String: AnyObject] else { return }
        //Enable multi-pass rendering
        let technique = SCNTechnique(dictionary: dict)
        scnView.technique = technique
    }
    
    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        //Incremented every frame
        self.timeParam += 1
        self.timeParamPointer.pointee = self.timeParam
        
        let commandBuffer = self.commandQueue.makeCommandBuffer()!
        let computeEncoder = commandBuffer.makeComputeCommandEncoder()!
        
        computeEncoder.setComputePipelineState(computeState)
        computeEncoder.setBuffer(self.timeParamBuffer, offset: 0, index: 0)
        computeEncoder.setTexture(noiseTexture, index: 0)
        computeEncoder.dispatchThreadgroups(threadgroupCount, threadsPerThreadgroup: threadgroupSize)
        computeEncoder.endEncoding()
        commandBuffer.commit()
        commandBuffer.waitUntilCompleted()
    }
    
    func renderer(_: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor, self.rootNode.isHidden else { return }
        self.rootNode.simdPosition = planeAnchor.center
        self.rootNode.isHidden = false
        DispatchQueue.main.async {
            //Display objects on the detected plane
            node.addChildNode(self.rootNode)
        }
    }
}
