This is the Day 20 article of the Second Dwango Advent Calendar 2020.
At Dwango, I mainly work on iOS app development.
This has nothing to do with my day-to-day work, but I tried verifying whether a streaming server can be run inside an iOS app using the features added to AVAssetWriter in iOS 14 and macOS 11.0, so this post is about that experiment.
The code is only an example, so please fill in the gaps as appropriate.
Environment used for verification
macOS 10.15.5
Xcode 12.2
Safari 13.1.1
iPhone 12 Pro iOS 14.2
Get CMSampleBuffer from AVCaptureDevice.
Since this is a common process I will omit the code, but it is fine as long as you can receive CMSampleBuffers via AVCaptureVideoDataOutputSampleBufferDelegate.
func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
....
}
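For reference, a minimal setup on the capture side might look like the following sketch; the preset, queue label, and helper name are assumptions, and error handling is omitted.
private let captureSession = AVCaptureSession()

private func setUpCaptureSession() {
    // `self` is assumed to adopt AVCaptureVideoDataOutputSampleBufferDelegate.
    captureSession.sessionPreset = .hd1280x720

    // Camera input
    let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back)!
    let cameraInput = try! AVCaptureDeviceInput(device: camera)
    captureSession.addInput(cameraInput)

    // Video data output that delivers CMSampleBuffers to the delegate method above
    let videoOutput = AVCaptureVideoDataOutput()
    videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "videoCapture.queue"))
    captureSession.addOutput(videoOutput)

    captureSession.startRunning()
}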
Also, to keep things simple this time, I handle only video and omit audio.
AVAssetWriter
Initialize AVAssetWriter. Specify .mpeg4AppleHLS for outputFileTypeProfile.
self.writer = AVAssetWriter(contentType: UTType(AVFileType.mp4.rawValue)!)
writer.delegate = self
writer.outputFileTypeProfile = .mpeg4AppleHLS
writer.preferredOutputSegmentInterval = CMTime(seconds: 1.0, preferredTimescale: 1)
writer.initialSegmentStartTime = CMTime.zero
let videoOutputSettings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.h264,
    AVVideoWidthKey: 360,
    AVVideoHeightKey: 640
]
self.videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoOutputSettings)
videoInput.expectsMediaDataInRealTime = true
writer.add(videoInput)
Next, write the processing for when a CMSampleBuffer is received. The first time one arrives, put the writer into the writing state and start the session. While it is writing, correct the PTS and append the CMSampleBuffer to the AVAssetWriterInput. Since the session is started at CMTime.zero, each buffer's PTS is rebased by subtracting the timestamp of the first received buffer (held in offset).
if writer.status == .unknown {
    writer.startWriting()
    writer.startSession(atSourceTime: CMTime.zero)
}
if writer.status == .writing {
    if let offset = offset {
        var copyBuffer: CMSampleBuffer?
        var count: CMItemCount = 1
        var info = CMSampleTimingInfo()
        CMSampleBufferGetSampleTimingInfoArray(sampleBuffer, entryCount: count, arrayToFill: &info, entriesNeededOut: &count)
        info.presentationTimeStamp = CMTimeSubtract(info.presentationTimeStamp, offset)
        CMSampleBufferCreateCopyWithNewTiming(allocator: kCFAllocatorDefault,
                                              sampleBuffer: sampleBuffer,
                                              sampleTimingEntryCount: 1,
                                              sampleTimingArray: &info,
                                              sampleBufferOut: &copyBuffer)
        if let copyBuffer = copyBuffer, videoInput.isReadyForMoreMediaData {
            videoInput.append(copyBuffer)
        }
    } else {
        offset = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    }
}
After a while, AVAssetWriterDelegate's didOutputSegmentData is called and the segment data is passed.
func assetWriter(_ writer: AVAssetWriter, didOutputSegmentData segmentData: Data, segmentType: AVAssetSegmentType, segmentReport: AVAssetSegmentReport?) {
...
}
Next, create segment files and an index file from the segment data. This time, create a new directory under the Documents directory and save the segment and index files there.
For the segment files, simply write the segmentData received in didOutputSegmentData to a file as is. This time I numbered them in the order they were received and saved them with file names such as segment1.m4s.
Since several segments are needed to build an index file, create or update it once a few segments have been produced. The index file must comply with the HLS specification, but generating it by referring to Apple's sample code seems to work without problems.
The index file is updated each time a segment is created and passed in.
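As a rough sketch of the above (outputDirectory, segmentIndex, and segmentNames are assumed properties, and the playlist attributes are an illustration rather than the exact file produced), the delegate method and the index-file update could look like this:
// Assumed properties:
//   outputDirectory: URL   - the new directory created under Documents
//   segmentIndex: Int      - starts at 0
//   segmentNames: [String] - names of the segments written so far

func assetWriter(_ writer: AVAssetWriter,
                 didOutputSegmentData segmentData: Data,
                 segmentType: AVAssetSegmentType,
                 segmentReport: AVAssetSegmentReport?) {
    switch segmentType {
    case .initialization:
        // The first callback delivers the initialization segment (the moov box).
        try? segmentData.write(to: outputDirectory.appendingPathComponent("init.mp4"))
    case .separable:
        segmentIndex += 1
        let name = "segment\(segmentIndex).m4s"
        try? segmentData.write(to: outputDirectory.appendingPathComponent(name))
        segmentNames.append(name)
        updateIndexFile()
    @unknown default:
        break
    }
}

private func updateIndexFile() {
    // Live playlist listing every segment produced so far.
    // The target duration matches preferredOutputSegmentInterval (1 second).
    var playlist = """
    #EXTM3U
    #EXT-X-VERSION:7
    #EXT-X-TARGETDURATION:1
    #EXT-X-MEDIA-SEQUENCE:1
    #EXT-X-MAP:URI="init.mp4"

    """
    for name in segmentNames {
        playlist += "#EXTINF:1.0,\n\(name)\n"
    }
    try? playlist.write(to: outputDirectory.appendingPathComponent("index.m3u8"),
                        atomically: true, encoding: .utf8)
}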
To serve the index and segment files over HTTP from within the app, I used swift-nio-transport-services, an extension of SwiftNIO that can be used on iOS, watchOS, and tvOS and is built on Network.framework.
Applications written with SwiftNIO work with SwiftNIO Transport Services with only a few changes.
In addition to providing first-class support for Apple platforms, NIO Transport Services takes advantage of the richer API of Network.framework to provide more insight into the behaviour of the network than is normally available to NIO applications. This includes the ability to wait for connectivity until a network route is available, as well as all of the extra proxy and VPN support that is built directly into Network.framework.
All regular NIO applications should work just fine with NIO Transport Services, simply by changing the event loops and bootstraps in use.
(From GitHub README)
SwiftNIO Transport Services supports CocoaPods, but it can also be installed with Swift Package Manager, so this time I will install it with Swift Package Manager.
In Swift Package Manager, add the SwiftNIO and SwiftNIO Transport Services packages, and add NIO, NIOHTTP1, and NIOTransportServices to your target's dependencies.
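If you manage dependencies with a Package.swift instead of Xcode's package UI, the equivalent would look roughly like this (the package name, platform, and version requirements are assumptions):
// swift-tools-version:5.3
import PackageDescription

let package = Package(
    name: "HLSStreamServer",
    platforms: [.iOS(.v14)],
    dependencies: [
        .package(url: "https://github.com/apple/swift-nio.git", from: "2.0.0"),
        .package(url: "https://github.com/apple/swift-nio-transport-services.git", from: "1.0.0")
    ],
    targets: [
        .target(
            name: "HLSStreamServer",
            dependencies: [
                .product(name: "NIO", package: "swift-nio"),
                .product(name: "NIOHTTP1", package: "swift-nio"),
                .product(name: "NIOTransportServices", package: "swift-nio-transport-services")
            ]
        )
    ]
)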
After adding them, implement the HTTP server by referring to the SwiftNIO Transport Services sample.
let group = NIOTSEventLoopGroup()
let channel = try! NIOTSListenerBootstrap(group: group)
    .childChannelInitializer { channel in
        channel.pipeline.configureHTTPServerPipeline(withPipeliningAssistance: true, withErrorHandling: true).flatMap {
            channel.pipeline.addHandler(HTTP1ServerHandler())
        }
    }
    .bind(host: "0.0.0.0", port: 8080)
    .wait()
try! channel.closeFuture.wait()
HTTP1ServerHandler conforms to ChannelInboundHandler and is implemented as follows.
final class HTTP1ServerHandler: ChannelInboundHandler {
    typealias InboundIn = HTTPServerRequestPart
    typealias OutboundOut = HTTPServerResponsePart

    func channelRead(context: ChannelHandlerContext, data: NIOAny) {
        let part = unwrapInboundIn(data)
        guard case .head(let headData) = part else {
            return
        }
        if headData.uri == "/" {
            // Processing that returns index.html as the response
        }
    }
}
This time, in response to a request for /, HTML containing the player is returned. Include it in the app bundle with the file name index.html.
<!DOCTYPE html>
<html lang="ja">
  <head>
    <meta charset="utf-8">
    <meta name="viewport" content="width=device-width, initial-scale=1">
    <title>HLS Stream Server</title>
  </head>
  <body>
    <header>
      <h1>HLS Stream Server</h1>
    </header>
    <div>
      <video width="360" height="640" src="index.m3u8" preload="none" onclick="this.play()" controls />
    </div>
  </body>
</html>
To return a response, implement and call the following method.
private func handleIndexPageRequest(context: ChannelHandlerContext, data: NIOAny) {
    do {
        let path = Bundle.main.path(forResource: "index", ofType: "html")!
        let data = try Data(contentsOf: URL(fileURLWithPath: path))
        let buffer = context.channel.allocator.buffer(data: data)
        var responseHeaders = HTTPHeaders()
        responseHeaders.add(name: "Content-Length", value: "\(data.count)")
        responseHeaders.add(name: "Content-Type", value: "text/html; charset=utf-8")
        let responseHead = HTTPResponseHead(version: .init(major: 1, minor: 1), status: .ok, headers: responseHeaders)
        context.write(wrapOutboundOut(.head(responseHead)), promise: nil)
        context.write(wrapOutboundOut(.body(.byteBuffer(buffer))), promise: nil)
        context.writeAndFlush(wrapOutboundOut(.end(nil)), promise: nil)
    } catch {
        let responseHead = HTTPResponseHead(version: .init(major: 1, minor: 1), status: .notFound)
        context.write(wrapOutboundOut(.head(responseHead)), promise: nil)
        context.writeAndFlush(wrapOutboundOut(.end(nil)), promise: nil)
    }
}
Implement the same kind of processing for requests for the index file and the segment files, as sketched below.
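The sketch below is one way to do it; the routing, the shared outputDirectory URL (the directory the segments were saved to), and the chosen MIME types are assumptions for illustration.
// Assumes the handler can see the same outputDirectory the segments were written to.
private func handleFileRequest(context: ChannelHandlerContext, uri: String) {
    let fileName = String(uri.dropFirst())   // e.g. "index.m3u8" or "segment1.m4s"
    let fileURL = outputDirectory.appendingPathComponent(fileName)
    do {
        let data = try Data(contentsOf: fileURL)
        let contentType: String
        if fileName.hasSuffix(".m3u8") {
            contentType = "application/vnd.apple.mpegurl"
        } else if fileName.hasSuffix(".m4s") {
            contentType = "video/iso.segment"
        } else {
            contentType = "video/mp4"        // init.mp4
        }
        var responseHeaders = HTTPHeaders()
        responseHeaders.add(name: "Content-Length", value: "\(data.count)")
        responseHeaders.add(name: "Content-Type", value: contentType)
        let responseHead = HTTPResponseHead(version: .init(major: 1, minor: 1), status: .ok, headers: responseHeaders)
        context.write(wrapOutboundOut(.head(responseHead)), promise: nil)
        context.write(wrapOutboundOut(.body(.byteBuffer(context.channel.allocator.buffer(data: data)))), promise: nil)
        context.writeAndFlush(wrapOutboundOut(.end(nil)), promise: nil)
    } catch {
        let responseHead = HTTPResponseHead(version: .init(major: 1, minor: 1), status: .notFound)
        context.write(wrapOutboundOut(.head(responseHead)), promise: nil)
        context.writeAndFlush(wrapOutboundOut(.end(nil)), promise: nil)
    }
}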
Finally, start the HTTP server, the camera capture, and the segment generation so that they all run at the same time.
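As a sketch (setUpWriter, setUpCaptureSession, and startHTTPServer are hypothetical helpers wrapping the code shown above), note that the NIO bootstrap blocks on wait(), so the server is started off the main thread here:
func startStreaming() {
    setUpWriter()              // configure AVAssetWriter as above
    setUpCaptureSession()      // begins delivering CMSampleBuffers
    DispatchQueue.global(qos: .userInitiated).async {
        self.startHTTPServer() // blocks on channel.closeFuture.wait()
    }
}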
From Safari, access the iPhone by specifying its local IP address and port. If you check the network in the developer tools, you can see that the index file is being updated and the segment files are being fetched.
References
Author fragmented MPEG-4 content with AVAssetWriter (Apple sample code)
Live Playlist (Sliding Window) Construction
[iOS] Making a Vine-style video shooting app with AVFoundation (AVCaptureVideoDataOutput / AVCaptureAudioDataOutput)
https://swiftreviewercom.wordpress.com/2020/03/27/import-swift-nio-to-ios-tvos-in-xcode-11/
https://www.process-one.net/blog/swiftnio-introduction-to-channels-channelhandlers-and-pipelines/