Hello, it's me, your everyday Sato. Recently I suddenly became interested in AR. I'd love to play with LiDAR, but I don't own a device that has it... So first, let's find out what ARKit can do! That's where I started studying. Why start with motion capture? I figured something with movement would be an easier way in, so let's give it a try!
Apple has an official explanation of ARKit, so please check it out ^^ Apple ARKit 4
ARKit 4 introduces a brand-new Depth API that creates a new way to access the detailed depth information gathered by the LiDAR Scanner on iPhone 12 Pro, iPhone 12 Pro Max, and iPad Pro. Location Anchors leverage the higher-resolution data of Apple Maps to place AR experiences at specific real-world locations in iPhone and iPad apps. In addition, face tracking support has been extended to all devices with the Apple Neural Engine and a front-facing camera, so even more users can enjoy the fun of AR in photos and videos.
First, as usual, open Xcode and from "Welcome to Xcode" select "Create a new Xcode project".
Select "Augmented Reality App"
Name it "MotionCapture" (this time we'll build a sample motion capture app), press "Next", and save the project anywhere you like.
Xcode opens with the template app created. Connect your iPhone to your Mac and hit "Run". Tap "OK" when asked for camera access, then move the iPhone around and look for the jet plane! Once you find it, you can change your angle and check out the jet from all sides.
That was a long-winded explanation of the basics... but just in case someone needs it, there it is lol
ARSCNViewDelegate is already adopted in the template. ARSCNViewDelegate is there to handle events such as adding, updating, and removing ARAnchors via SceneKit nodes. Since we want to capture body movement this time, we'll use ARSessionDelegate instead. Change "ARSCNViewDelegate" to "ARSessionDelegate".
ViewController.swift
class ViewController: UIViewController, ARSCNViewDelegate {
↓
class ViewController: UIViewController, ARSessionDelegate {
The sceneView is already defined in the template. It is, of course, also wired up in the Storyboard.
ViewController.swift
@IBOutlet var sceneView: ARSCNView!
Remove the sceneView delegate assignment and instead set the delegate of sceneView's session to self so that we can handle the session events.
ViewController.swift
sceneView.delegate = self
↓
sceneView.session.delegate = self
Set an empty SCNScene as the sceneView's scene.
ViewController.swift
sceneView.scene = SCNScene()
The following template code is unnecessary this time, so you can delete it.
ViewController.swift
// Show statistics such as fps and timing information
// Flag that shows/hides the info (fps etc.) that was displayed at the bottom of the screen in the jet demo; the default is false
sceneView.showsStatistics = true
// Create a new scene
//Load a jet 3D object
let scene = SCNScene(named: "art.scnassets/ship.scn")!
// Set the scene to the view
//Display a jet plane in sceneView
sceneView.scene = scene
In viewWillAppear, change the original "ARWorldTrackingConfiguration" to "ARBodyTrackingConfiguration" and run it on the sceneView's session.
ARWorldTrackingConfiguration → ARBodyTrackingConfiguration
ViewController.swift
guard ARBodyTrackingConfiguration.isSupported else {
fatalError("This feature is only supported on devices with an A12 chip")
}
// Create a session configuration
// let configuration = ARWorldTrackingConfiguration()
// ↓
let configuration = ARBodyTrackingConfiguration()
// Run the view's session
sceneView.session.run(configuration)
These two delegate methods are used this time:
- **session(ARSession, didAdd: [ARAnchor])**, which is called when an ARAnchor is added
- **session(ARSession, didUpdate: [ARAnchor])**, which is called when an ARAnchor is updated
https://developer.apple.com/documentation/arkit/arsessiondelegate
func session(ARSession, didAdd: [ARAnchor])
func session(ARSession, didUpdate: [ARAnchor])
In a for loop over anchors, check whether each ARAnchor is an ARBodyAnchor; anything that is not an ARBodyAnchor is not processed this time, so return.
ViewController.swift
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
for anchor in anchors {
guard let bodyAnchor = anchor as? ARBodyAnchor else {
return
}
// Next
}
}
As with didAdd, check the type of the ARAnchor and implement it so that only the target (ARBodyAnchor) is processed.
ViewController.swift
func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
for anchor in anchors {
guard let bAnchor = anchor as? ARBodyAnchor else {
return
}
// Next
}
}
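By the way, didAdd and didUpdate end up doing exactly the same work in this sample, so one option is to forward both to a single method. This is just a sketch of that idea; the helper name updateBodyNodes is my own and does not appear in the sample project.

// Minimal sketch (assumption: a shared helper keeps the joint-drawing code in one place).
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    updateBodyNodes(for: anchors)
}
func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    updateBodyNodes(for: anchors)
}
private func updateBodyNodes(for anchors: [ARAnchor]) {
    for anchor in anchors {
        // Skip anything that is not a body anchor.
        guard let bodyAnchor = anchor as? ARBodyAnchor else { continue }
        // The skeleton handling described below goes here.
        _ = bodyAnchor
    }
}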
Now let's use the ARBodyAnchor obtained in both delegate methods and display it on screen. First, take the skeleton out of the ARBodyAnchor; the skeleton holds the 3D body (joint) information.
ViewController.swift
let skeleton = bodyAnchor.skeleton
Loop over the jointNames of the skeleton's ARSkeletonDefinition. The coordinate system of an ARBodyAnchor is centered on the position of "hips_joint". For each jointName obtained, use modelTransform(for:) to get the translation and rotation relative to that center. modelTransform(for:) returns nil when a jointName that does not exist is passed.
ViewController.swift
for jointName in skeleton.definition.jointNames {
let jointType = ARSkeleton.JointName(rawValue: jointName)
if let transform = skeleton.modelTransform(for: jointType) {
Note: jointName is the raw string name of each joint (for example, "hips_joint").
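If you are curious which joints the skeleton actually provides, a quick way to check (this is just my own debugging sketch, not part of the article) is to dump the names once, for example inside session(_:didAdd:):

// Debugging sketch: log every joint name the skeleton definition exposes.
// "hips_joint" is the joint the ARBodyAnchor coordinate system is centered on.
print(bodyAnchor.skeleton.definition.jointNames)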
Multiply the obtained joint transform by the ARBodyAnchor's center (hip) transform to get the joint's position and rotation in 3D space.
ViewController.swift
/// Convert the joint's position / rotation (relative to the hip) into an SCNMatrix4
let partsPoint = SCNMatrix4(transform)
/// Convert the position / rotation of the reference point (the hip) into an SCNMatrix4
let hipPoint = SCNMatrix4(anchor.transform)
/// SCNMatrix4Mult(_ a: SCNMatrix4, _ b: SCNMatrix4) composes two transforms; the left-hand a is applied first and the right-hand b after it, so the joint's local transform is combined with the hip's world transform
let matrix = SCNMatrix4Mult(partsPoint, hipPoint)
let position = SCNVector3(matrix.m41, matrix.m42, matrix.m43)
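As a side note, the same math can be written with simd types before ever touching SCNMatrix4, since anchor.transform and the value returned by modelTransform(for:) are both simd_float4x4. This is only an alternative sketch of the same composition, not how the sample project does it:

// Alternative sketch using simd (column-major): the joint's world transform is
// anchorTransform * jointModelTransform, and the translation lives in columns.3.
let jointWorld = simd_mul(anchor.transform, transform)
let positionFromSimd = SCNVector3(jointWorld.columns.3.x,
                                  jointWorld.columns.3.y,
                                  jointWorld.columns.3.z)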
In the sceneView, search for a node whose name matches the jointName key. If it exists, just update the node's position. If it does not exist, create a sphere with SCNSphere(radius: 0.02), wrap it in an SCNNode, set its position and its name (so it can be found later), and add the new SCNNode to sceneView.scene.rootNode with addChildNode.
ViewController.swift
if let nodeToUpdate = sceneView.scene.rootNode.childNode(withName: jointName, recursively: false) {
///Since it has already been added, only update the position
nodeToUpdate.isHidden = false
nodeToUpdate.position = position
} else {
// GeoSphere
// radius: the radius of the sphere; the default is 1
let sphereGeometry = SCNSphere(radius: 0.02)
// isGeodesic: if true, the surface is built from nearly uniform triangular polygons; the default is false
sphereGeometry.isGeodesic = true
//Sphere Color
sphereGeometry.firstMaterial?.diffuse.contents = UIColor.green
//Set sphere geometry for node
let sphereNode = SCNNode(geometry: sphereGeometry)
//Display position setting
sphereNode.position = position
//Name setting for node
sphereNode.name = jointName
//Add to root node
sceneView.scene.rootNode.addChildNode(sphereNode)
}
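Incidentally, the update branch above sets isHidden = false, which implies the spheres should get hidden somewhere once the body is no longer tracked. The article doesn't cover that part; if you want it, one possible place is ARSessionDelegate's session(_:didRemove:) callback. A rough sketch under that assumption:

// Sketch (not in the sample project): when a body anchor is removed,
// hide the joint spheres instead of leaving them floating in place.
func session(_ session: ARSession, didRemove anchors: [ARAnchor]) {
    for anchor in anchors where anchor is ARBodyAnchor {
        sceneView.scene.rootNode.childNodes
            .filter { $0.geometry is SCNSphere }
            .forEach { $0.isHidden = true }
    }
}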
Let's run it. If there's no one around to point the camera at, try holding your iPhone up to a video of a person walking, for example on YouTube.
All the source code is pushed to GitHub, so feel free to grab it. https://github.com/Satoiosdevelop/ExampleMotionCapture