
Recording position in the time domain using AVFoundation

Updated: 2023-02-26 19:42:40

Certainly.


AVFoundation is a collection of higher- and lower-level libraries with many options for tapping into the processing pipeline at various stages. Assuming you want to capture from the camera, you're going to be using some combination of AVCaptureSession, its delegate AVCaptureVideoDataOutputSampleBufferDelegate (https://developer.apple.com/reference/avfoundation/avcapturevideodataoutputsamplebufferdelegate), and an AVAssetWriter.
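As a minimal sketch of that wiring, assuming you run this on iOS with camera permission granted (the class name, queue label, and omitted error handling are illustrative, not a definitive implementation):

```swift
import AVFoundation

// Sketch: connect a camera input and a video data output whose delegate
// will receive each frame as a CMSampleBuffer.
final class CaptureController: NSObject {
    let session = AVCaptureSession()
    let videoOutput = AVCaptureVideoDataOutput()
    let sampleQueue = DispatchQueue(label: "video.sample.queue")

    func configure() throws {
        session.beginConfiguration()
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }

        // Deliver sample buffers to the delegate on a background queue.
        videoOutput.setSampleBufferDelegate(self, queue: sampleQueue)
        if session.canAddOutput(videoOutput) { session.addOutput(videoOutput) }

        session.commitConfiguration()
        session.startRunning()
    }
}

extension CaptureController: AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Each callback vends one frame of video plus its timing information.
    }
}
```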


The AVCaptureVideoDataOutputSampleBufferDelegate captures the vended CMSampleBuffers, which combine a frame of video data with timing information. At the point you receive a buffer, you typically just "write out" the CMSampleBuffer to record the video, but you can also process it further to filter it in real time or, as you want to do, record additional information alongside the timing data (e.g. "at this point in the video, I had these coordinates").
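Inside that delegate callback, the step could look like the sketch below. The names `writerInput`, `currentCoordinates`, and `timedCoordinates` are hypothetical properties assumed to exist elsewhere in your class; only the CMSampleBuffer timing calls are real API:

```swift
import AVFoundation

func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    // Timing information travels with the buffer.
    let time = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)

    // "Write out" the frame to record the video
    // (writerInput is an assumed AVAssetWriterInput).
    if writerInput.isReadyForMoreMediaData {
        writerInput.append(sampleBuffer)
    }

    // Record additional information keyed to the timing data, e.g.
    // "at this point in the video, I had these coordinates"
    // (currentCoordinates and timedCoordinates are assumed app state).
    timedCoordinates.append((time: CMTimeGetSeconds(time),
                             coordinate: currentCoordinates))
}
```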


To get started, research how to write video from the camera on iOS using the delegate; you'll soon see where to hook into the code to achieve what you're after.