[Swift 5] Getting started with ARKit ~ Motion capture ~

Hello, this is Sato again. Yes, it's me. Recently I suddenly got interested in AR. I'd like to play with LiDAR, but I don't own a device that has a LiDAR scanner... So first, let's learn what ARKit can do! That's where I started studying. Why begin with motion capture? I figured something with movement would be easier to get into, so let's try it!


Development environment


ARKit has an official explanation from Apple, so please check it out ^^ Apple ARKit 4

ARKit 4 introduces a brand-new Depth API, creating a new way to access the detailed depth information collected by the LiDAR Scanner on iPhone 12 Pro, iPhone 12 Pro Max, and iPad Pro. Location Anchors leverage the higher-resolution data in Apple Maps to place AR experiences at specific geographic locations in iPhone and iPad apps. In addition, face tracking support has been extended to all devices with the Apple Neural Engine and a front-facing camera, so even more users can experience the fun of AR in photos and videos.


Let's make a project!


Create New

First, as usual, from Xcode's "Welcome to Xcode" window, select "Create a new Xcode project".


Template selection

Select the "Augmented Reality App" template.


Give the application a name

Name the app "MotionCapture" (this time we're building a sample motion-capture application), press "Next", and save the project anywhere you like.



Project template

Xcode opens with the template app created. Connect your iPhone to your Mac and hit "Run". Tap "OK" on the camera-access prompt, then move the iPhone around and look for the jet plane! Once you find it, you can view the jet from various angles by changing your position.

That was a long explanation of the basics... but just in case someone out there needs it, I wrote it all out (lol)


Diving into motion capture

The template already conforms to ARSCNViewDelegate. ARSCNViewDelegate is meant for handling events when an ARAnchor is added, updated, or removed via the scene view. Since we want to capture body movement this time, we'll use ARSessionDelegate instead. Change "ARSCNViewDelegate" to "ARSessionDelegate".

ViewController.swift


class ViewController: UIViewController, ARSCNViewDelegate {
↓
class ViewController: UIViewController, ARSessionDelegate {

The sceneView is already defined in the template, and of course it is also wired up in the Storyboard.

ViewController.swift


    @IBOutlet var sceneView: ARSCNView!

Remove the sceneView delegate assignment and instead set self as the delegate of the sceneView's session, so we can handle session events.

ViewController.swift


    sceneView.delegate = self
    ↓
    sceneView.session.delegate = self

Set an empty SCNScene as the sceneView's scene.

ViewController.swift


    sceneView.scene = SCNScene()

The following template code is unnecessary this time, so you can delete it:

ViewController.swift


        // Show statistics such as fps and timing information
        // On/off flag for the stats bar shown at the bottom of the screen
        // in the jet template. The default is false.
        sceneView.showsStatistics = true
        
        // Create a new scene
        //Load a jet 3D object
        let scene = SCNScene(named: "art.scnassets/ship.scn")!
        
        // Set the scene to the view
        //Display a jet plane in sceneView
        sceneView.scene = scene

Change the configuration to track people's movement

In viewWillAppear, change the original "ARWorldTrackingConfiguration" to "ARBodyTrackingConfiguration" and run it on the sceneView's session.

ARWorldTrackingConfiguration → ARBodyTrackingConfiguration

ViewController.swift



        guard ARBodyTrackingConfiguration.isSupported else {
            fatalError("This feature is only supported on devices with an A12 chip")
        }
        // Create a session configuration
        // let configuration = ARWorldTrackingConfiguration()
        // ↓
        let configuration = ARBodyTrackingConfiguration()

        // Run the view's session
        sceneView.session.run(configuration)

Implement the ARSessionDelegate delegate methods

These two are used this time:
- **session(_:didAdd:)**, called when an ARAnchor is added
- **session(_:didUpdate:)**, called when an ARAnchor is updated

https://developer.apple.com/documentation/arkit/arsessiondelegate


func session(ARSession, didAdd: [ARAnchor])

func session(ARSession, didUpdate: [ARAnchor])

Implement func session(_ session: ARSession, didAdd anchors: [ARAnchor])

Loop over anchors and check whether each ARAnchor is an ARBodyAnchor; anything other than an ARBodyAnchor is skipped this time.

ViewController.swift


    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for anchor in anchors {
            guard let bodyAnchor = anchor as? ARBodyAnchor else {
                continue
            }
            // Next
        }
    }
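A side note on the `guard` inside this loop: with `return`, hitting a non-body anchor exits the whole method and drops any remaining anchors in the same batch, while `continue` skips only the current one. Here is a minimal, framework-free sketch of that difference (the `Anchor`/`BodyAnchor` classes below are stand-ins for illustration, not the real ARKit types):

```swift
// Stand-in types for illustration only (not the real ARKit classes).
class Anchor {}
class BodyAnchor: Anchor {
    let id: Int
    init(id: Int) { self.id = id }
}

// Process only BodyAnchors, skipping everything else.
func processedIDs(_ anchors: [Anchor]) -> [Int] {
    var ids: [Int] = []
    for anchor in anchors {
        guard let body = anchor as? BodyAnchor else {
            continue  // `return` here would also drop the trailing BodyAnchor
        }
        ids.append(body.id)
    }
    return ids
}

// A plain Anchor in the middle must not stop processing:
let batch: [Anchor] = [BodyAnchor(id: 1), Anchor(), BodyAnchor(id: 2)]
print(processedIDs(batch))  // [1, 2]
```

In practice a body-tracking session usually delivers a single ARBodyAnchor at a time, but being defensive here costs nothing.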

We will implement func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) in the same way.

As with didAdd, check each ARAnchor and process only the target type.

ViewController.swift


    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for anchor in anchors {
            guard let bAnchor = anchor as? ARBodyAnchor else {
                continue
            }
            // Next
        }
    }

Extract information from ARBodyAnchor and draw

Let's use the ARBodyAnchor obtained in both delegate methods and display it on screen. First, get the skeleton from the ARBodyAnchor; skeleton holds the 3D body (joint) information.

ViewController.swift


        let skeleton = anchor.skeleton

Loop over the jointNames of the skeleton's ARSkeletonDefinition. The coordinates of the ARBodyAnchor are centered on the position of the "hips_joint". For each jointName, use modelTransform(for:) to get the translation and rotation relative to that center. modelTransform(for:) returns nil if you pass a jointName that doesn't exist.

ViewController.swift


        for jointName in skeleton.definition.jointNames {
            let jointType = ARSkeleton.JointName(rawValue: jointName)
            if let transform = skeleton.modelTransform(for: jointType) {

Remarks: jointName

I dumped the jointNames with the debugger: 91 parts. I wonder if there are more...?

- 0 : "root"
- 1 : "hips_joint"
- 2 : "left_upLeg_joint"
- 3 : "left_leg_joint"
- 4 : "left_foot_joint"
- 5 : "left_toes_joint"
- 6 : "left_toesEnd_joint"
- 7 : "right_upLeg_joint"
- 8 : "right_leg_joint"
- 9 : "right_foot_joint"
- 10 : "right_toes_joint"
- 11 : "right_toesEnd_joint"
- 12 : "spine_1_joint"
- 13 : "spine_2_joint"
- 14 : "spine_3_joint"
- 15 : "spine_4_joint"
- 16 : "spine_5_joint"
- 17 : "spine_6_joint"
- 18 : "spine_7_joint"
- 19 : "left_shoulder_1_joint"
- 20 : "left_arm_joint"
- 21 : "left_forearm_joint"
- 22 : "left_hand_joint"
- 23 : "left_handIndexStart_joint"
- 24 : "left_handIndex_1_joint"
- 25 : "left_handIndex_2_joint"
- 26 : "left_handIndex_3_joint"
- 27 : "left_handIndexEnd_joint"
- 28 : "left_handMidStart_joint"
- 29 : "left_handMid_1_joint"
- 30 : "left_handMid_2_joint"
- 31 : "left_handMid_3_joint"
- 32 : "left_handMidEnd_joint"
- 33 : "left_handPinkyStart_joint"
- 34 : "left_handPinky_1_joint"
- 35 : "left_handPinky_2_joint"
- 36 : "left_handPinky_3_joint"
- 37 : "left_handPinkyEnd_joint"
- 38 : "left_handRingStart_joint"
- 39 : "left_handRing_1_joint"
- 40 : "left_handRing_2_joint"
- 41 : "left_handRing_3_joint"
- 42 : "left_handRingEnd_joint"
- 43 : "left_handThumbStart_joint"
- 44 : "left_handThumb_1_joint"
- 45 : "left_handThumb_2_joint"
- 46 : "left_handThumbEnd_joint"
- 47 : "neck_1_joint"
- 48 : "neck_2_joint"
- 49 : "neck_3_joint"
- 50 : "neck_4_joint"
- 51 : "head_joint"
- 52 : "jaw_joint"
- 53 : "chin_joint"
- 54 : "left_eye_joint"
- 55 : "left_eyeLowerLid_joint"
- 56 : "left_eyeUpperLid_joint"
- 57 : "left_eyeball_joint"
- 58 : "nose_joint"
- 59 : "right_eye_joint"
- 60 : "right_eyeLowerLid_joint"
- 61 : "right_eyeUpperLid_joint"
- 62 : "right_eyeball_joint"
- 63 : "right_shoulder_1_joint"
- 64 : "right_arm_joint"
- 65 : "right_forearm_joint"
- 66 : "right_hand_joint"
- 67 : "right_handIndexStart_joint"
- 68 : "right_handIndex_1_joint"
- 69 : "right_handIndex_2_joint"
- 70 : "right_handIndex_3_joint"
- 71 : "right_handIndexEnd_joint"
- 72 : "right_handMidStart_joint"
- 73 : "right_handMid_1_joint"
- 74 : "right_handMid_2_joint"
- 75 : "right_handMid_3_joint"
- 76 : "right_handMidEnd_joint"
- 77 : "right_handPinkyStart_joint"
- 78 : "right_handPinky_1_joint"
- 79 : "right_handPinky_2_joint"
- 80 : "right_handPinky_3_joint"
- 81 : "right_handPinkyEnd_joint"
- 82 : "right_handRingStart_joint"
- 83 : "right_handRing_1_joint"
- 84 : "right_handRing_2_joint"
- 85 : "right_handRing_3_joint"
- 86 : "right_handRingEnd_joint"
- 87 : "right_handThumbStart_joint"
- 88 : "right_handThumb_1_joint"
- 89 : "right_handThumb_2_joint"
- 90 : "right_handThumbEnd_joint"
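If you only want to draw the major joints rather than all 91, you can filter the name list. A minimal sketch using a handful of names taken from the list above (the `majorJoints` selection is my own pick for illustration, not an ARKit constant):

```swift
// A few joint names from ARSkeletonDefinition (see the list above).
let jointNames = ["root", "hips_joint", "left_foot_joint", "left_hand_joint",
                  "right_foot_joint", "right_hand_joint", "head_joint",
                  "left_eyeball_joint", "right_eyeball_joint"]

// My own pick of joints worth drawing (not an ARKit constant).
let majorJoints: Set<String> = ["hips_joint", "head_joint",
                                "left_hand_joint", "right_hand_joint",
                                "left_foot_joint", "right_foot_joint"]

// Keep only the major joints, preserving the skeleton's order.
let drawn = jointNames.filter { majorJoints.contains($0) }
print(drawn.count)  // 6
```

The same filter would go inside the for-loop over skeleton.definition.jointNames, skipping any name not in the set.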

Multiply the acquired joint transform by the ARBodyAnchor's transform to get the joint's position and rotation in 3D world space.

ViewController.swift


        /// Position/rotation of the jointType, cast to SCNMatrix4
        let partsPoint = SCNMatrix4(transform)
        /// Position/rotation of the reference point (the hips), cast to SCNMatrix4
        let hipPoint = SCNMatrix4(anchor.transform)
        /// SCNMatrix4Mult(_ a: SCNMatrix4, _ b: SCNMatrix4) composes transforms:
        /// in SceneKit's row-major convention, a is applied first and b second.
        let matrix = SCNMatrix4Mult(partsPoint, hipPoint)
        let position = SCNVector3(matrix.m41, matrix.m42, matrix.m43)
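To see why SCNMatrix4Mult(partsPoint, hipPoint) and the m41/m42/m43 extraction work, here is a framework-free sketch of SceneKit's row-major layout using a tiny `Mat4` type of my own (an assumption for illustration, not SceneKit API). Pure translations commute, so this only demonstrates the row layout and position extraction; with rotations involved, the argument order would matter:

```swift
// A minimal row-major 4x4 matrix, mirroring SCNMatrix4's layout.
struct Mat4 {
    var m: [[Float]]  // m[3] is the translation row (m41, m42, m43, m44)

    static let identity = Mat4(m: (0..<4).map { i in
        (0..<4).map { j in i == j ? Float(1) : Float(0) }
    })

    static func translation(_ x: Float, _ y: Float, _ z: Float) -> Mat4 {
        var r = identity
        r.m[3] = [x, y, z, 1]  // the fourth row holds the translation
        return r
    }

    // mult(a, b) = a * b; with row vectors, a is applied first, then b.
    static func mult(_ a: Mat4, _ b: Mat4) -> Mat4 {
        var r = identity
        for i in 0..<4 {
            for j in 0..<4 {
                r.m[i][j] = (0..<4).map { k in a.m[i][k] * b.m[k][j] }.reduce(0, +)
            }
        }
        return r
    }

    // Equivalent of reading m41, m42, m43 from an SCNMatrix4.
    var position: (Float, Float, Float) { (m[3][0], m[3][1], m[3][2]) }
}

// Joint offset in body space (e.g. a hand 0.5 m above the hips)...
let partsPoint = Mat4.translation(0, 0.5, 0)
// ...composed with the body anchor's transform (hips 1 m in front of the origin).
let hipPoint = Mat4.translation(0, 0, -1)
let world = Mat4.mult(partsPoint, hipPoint)
print(world.position)  // (0.0, 0.5, -1.0)
```

The joint ends up at the hips' world position plus its local offset, which is exactly what the article's code computes per joint.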

Check whether a node with the same name already exists in the sceneView; update it if so, add a new one if not

Search the sceneView for a node by its name (the key). If it exists, set its position to update it. If it doesn't, create a sphere with SCNSphere(radius: 0.02), wrap it in an SCNNode, set its position and a name for later lookup, and addChildNode the created SCNNode to sceneView.scene.rootNode.

ViewController.swift


        if let nodeToUpdate = sceneView.scene.rootNode.childNode(withName: jointName, recursively: false) {
            /// Already added, so only update the position
            nodeToUpdate.isHidden = false
            nodeToUpdate.position = position
        } else {
            // GeoSphere
            // radius: the sphere's radius; the default is 1
            let sphereGeometry = SCNSphere(radius: 0.02)
            // If true, the surface is built from evenly sized triangular polygons; the default is false
            sphereGeometry.isGeodesic = true
            // Sphere color
            sphereGeometry.firstMaterial?.diffuse.contents = UIColor.green
            // Set the sphere geometry on a node
            let sphereNode = SCNNode(geometry: sphereGeometry)
            // Set the display position
            sphereNode.position = position
            // Set a name on the node for later lookup
            sphereNode.name = jointName
            // Add to the root node
            sceneView.scene.rootNode.addChildNode(sphereNode)
        }
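The update-or-create logic above is essentially a keyed cache of nodes. A framework-free sketch of the same pattern (the `Node` struct and `NodeStore` class are stand-ins of my own for SCNNode and the scene's root node, not SceneKit API):

```swift
// Stand-ins for SCNNode and the scene graph (not SceneKit API).
struct Node {
    var name: String
    var position: (Float, Float, Float)
}

final class NodeStore {
    private var nodes: [String: Node] = [:]
    var count: Int { nodes.count }

    // Update the node if it already exists, otherwise create it --
    // the same pattern as childNode(withName:) + addChildNode(_:).
    func upsert(name: String, position: (Float, Float, Float)) {
        if var node = nodes[name] {
            node.position = position  // already added: just move it
            nodes[name] = node
        } else {
            nodes[name] = Node(name: name, position: position)
        }
    }

    func position(of name: String) -> (Float, Float, Float)? {
        nodes[name]?.position
    }
}

let store = NodeStore()
store.upsert(name: "left_hand_joint", position: (0, 1.0, 0))
store.upsert(name: "left_hand_joint", position: (0, 1.2, 0))  // update, not duplicate
print(store.count)  // 1
```

In a real app you could keep such a dictionary alongside the scene: childNode(withName:recursively:) does a linear search of the node tree on every frame, while a dictionary lookup is constant-time.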

Now it's done.

Let's run it! If there's no one around to capture, try viewing a video of a person walking (on YouTube, for example) through your iPhone's screen.

All the source is pushed to GitHub, so feel free to grab it: https://github.com/Satoiosdevelop/ExampleMotionCapture
