

iOS Video Recording, Compression and Export, Frame Extraction, and More

2019-11-09 14:19:01
Contributed by: netizen

Video Recording

First, we present the system's video-recording interface, which is implemented with a UIImagePickerController. Before presenting it we must check the user's authorization: only when the app has permission to record video can we proceed.

We also need to check whether UIImagePickerControllerSourceTypeCamera is available — the simulator, for example, does not support it, and while it is unclear whether any real devices lack support, checking first is the safer way to write this. The recording quality (i.e. resolution) is set through the videoQuality property, and the maximum recording duration through the videoMaximumDuration property — here we cap it at 5 minutes.

// Needs: #import <AVFoundation/AVFoundation.h>
// Needs: #import <MobileCoreServices/MobileCoreServices.h> (for kUTTypeMovie)
// AVAuthorizationStatus requires iOS 7.0+
AVAuthorizationStatus authStatus = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
if (authStatus == AVAuthorizationStatusRestricted || authStatus == AVAuthorizationStatusDenied) {
  NSLog(@"Camera access is disabled. You can enable it in the Settings app.");
  return;
}
if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
  UIImagePickerController *picker = [[UIImagePickerController alloc] init];
  picker.delegate = self;
  picker.allowsEditing = YES;
  picker.sourceType = UIImagePickerControllerSourceTypeCamera;
  picker.videoQuality = UIImagePickerControllerQualityType640x480; // recording quality
  picker.videoMaximumDuration = 5 * 60.0f; // limit recording to at most 5 minutes
  picker.mediaTypes = @[(NSString *)kUTTypeMovie];
  [self presentViewController:picker animated:YES completion:NULL];
  self.shouldAsync = YES;
} else {
  NSLog(@"This device does not support video recording");
}

Then implement the delegate methods, and you can obtain the recorded video.
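For reference, a minimal sketch of the did-finish delegate method (assuming the presenting controller adopts UIImagePickerControllerDelegate and UINavigationControllerDelegate, which the snippets in this article imply but do not show):

```objc
// Sketch: receive the recorded (or picked) video from the picker.
// Assumes the controller adopts UIImagePickerControllerDelegate and
// UINavigationControllerDelegate.
- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info {
  NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
  if ([mediaType isEqualToString:(NSString *)kUTTypeMovie]) {
    // The file URL of the recorded or selected video
    NSURL *videoURL = [info objectForKey:UIImagePickerControllerMediaURL];
    NSLog(@"Got video at: %@", videoURL);
  }
  [picker dismissViewControllerAnimated:YES completion:NULL];
}
```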

Selecting a Video from the Photo Library

Selecting a video from the photo library is almost the same as presenting the recording interface; only the sourceType differs. As before, check permission first — if the user does not grant access, there is nothing we can do.

Setting sourceType to UIImagePickerControllerSourceTypeSavedPhotosAlbum fetches media saved in the photo album. We also need to set mediaTypes; kUTTypeMovie is all that is required.

// Check photo-library authorization (not camera authorization) before
// reading from the album
ALAuthorizationStatus authStatus = [ALAssetsLibrary authorizationStatus];
if (authStatus == ALAuthorizationStatusRestricted || authStatus == ALAuthorizationStatusDenied) {
  NSLog(@"Photo library access is disabled. You can enable it in the Settings app.");
  return;
}
if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeSavedPhotosAlbum]) {
  UIImagePickerController *picker = [[UIImagePickerController alloc] init];
  picker.delegate = self;
  picker.allowsEditing = YES;
  picker.sourceType = UIImagePickerControllerSourceTypeSavedPhotosAlbum;
  picker.mediaTypes = @[(NSString *)kUTTypeMovie];
  [self presentViewController:picker animated:YES completion:NULL];
  self.shouldAsync = NO;
} else {
  NSLog(@"The photo album is not available on this device");
}

Again, implement the delegate methods to receive the selected video.

Saving a Video to the Photo Album

Writing to the photo album is done with the ALAssetsLibrary class, which provides an asynchronous write API; when the write completes, return to the main thread to update the UI:

NSURL *videoURL = [info objectForKey:UIImagePickerControllerMediaURL];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
dispatch_async(dispatch_get_global_queue(0, 0), ^{
  // Only videos that are compatible with the Saved Photos album can be saved to it
  if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:videoURL]) {
    [library writeVideoAtPathToSavedPhotosAlbum:videoURL completionBlock:^(NSURL *assetURL, NSError *error) {
      dispatch_async(dispatch_get_main_queue(), ^{
        if (error == nil) {
          NSLog(@"Saved to the photo album");
        } else {
          NSLog(@"Failed to save to the photo album");
        }
      });
    }];
  }
});

Extracting Video Frames

Extracting a Frame Synchronously

To grab the middle frame synchronously, specify the time of the frame you want. Note that the returned image object is CFRetained, so the caller must release it with CGImageRelease to avoid leaking memory. Access the video through an AVAsset, then generate the frame image with an AVAssetImageGenerator:

// Get the video's center frame to use as the video poster image
- (UIImage *)frameImageFromVideoURL:(NSURL *)videoURL {
  UIImage *image = nil;
  // Access the asset and create an image generator for it
  AVAsset *asset = [AVAsset assetWithURL:videoURL];
  AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
  imageGenerator.appliesPreferredTrackTransform = YES;
  // Calculate the midpoint time of the video.
  Float64 duration = CMTimeGetSeconds([asset duration]);
  // CMTimeMakeWithSeconds takes the time in seconds and a timescale
  // (units per second). 600 is a commonly used timescale; as Apple notes,
  // it is a multiple of 24 fps (film), 30 fps (NTSC, used for TV in North
  // America and Japan), and 25 fps (PAL, used for TV in Europe), so it can
  // exactly represent any number of frames in these systems.
  CMTime midpoint = CMTimeMakeWithSeconds(duration / 2.0, 600);
  NSError *error = nil;
  CMTime actualTime;
  // copyCGImageAtTime: returns a CFRetained CGImageRef for the asset at or
  // near the specified time, so we must release it manually.
  CGImageRef centerFrameImage = [imageGenerator copyCGImageAtTime:midpoint actualTime:&actualTime error:&error];
  if (centerFrameImage != NULL) {
    image = [[UIImage alloc] initWithCGImage:centerFrameImage];
    // Release the CFRetained image
    CGImageRelease(centerFrameImage);
  }
  return image;
}

Extracting Frames Asynchronously

Fetching a frame asynchronously differs from the synchronous version only in the API called: you can pass multiple time points, and the generator reports the actual time for each image it returns. The returned images do not need to be released manually. Since a frame may fail to generate, check that the result is AVAssetImageGeneratorSucceeded before converting the image:

// Asynchronously fetch frame images; multiple frames can be requested in one call
- (void)centerFrameImageWithVideoURL:(NSURL *)videoURL completion:(void (^)(UIImage *image))completion {
  AVAsset *asset = [AVAsset assetWithURL:videoURL];
  AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
  imageGenerator.appliesPreferredTrackTransform = YES;
  // Calculate the midpoint time of the video. As in the synchronous version,
  // 600 is used as the timescale because it is a multiple of the common
  // 24 fps (film), 30 fps (NTSC), and 25 fps (PAL) frame rates.
  Float64 duration = CMTimeGetSeconds([asset duration]);
  CMTime midpoint = CMTimeMakeWithSeconds(duration / 2.0, 600);
  // Request the frame asynchronously; pass more NSValue-wrapped CMTimes to get more frames
  NSValue *midTime = [NSValue valueWithCMTime:midpoint];
  [imageGenerator generateCGImagesAsynchronouslyForTimes:@[midTime] completionHandler:^(CMTime requestedTime, CGImageRef _Nullable image, CMTime actualTime, AVAssetImageGeneratorResult result, NSError * _Nullable error) {
    if (result == AVAssetImageGeneratorSucceeded && image != NULL) {
      UIImage *centerFrameImage = [[UIImage alloc] initWithCGImage:image];
      dispatch_async(dispatch_get_main_queue(), ^{
        if (completion) {
          completion(centerFrameImage);
        }
      });
    } else {
      dispatch_async(dispatch_get_main_queue(), ^{
        if (completion) {
          completion(nil);
        }
      });
    }
  }];
}

Compressing and Exporting Video

We compress video because high-resolution recordings produce very large files. Storage and memory on mobile devices are limited, and if the server does not support chunked, streaming, or file uploads — only form uploads — then the file size must be restricted, so the video has to be compressed.

For example, when we were uploading video to a certain platform, it still supported only form uploads — no streaming, no file uploads — so a slightly larger video would crash the app. A streamed upload would "succeed", but the backend did not recognize it; a direct file upload also reached 100%, yet was still treated as a failure on their side. That platform really burned us this time.

Back to the topic: compressing and exporting video is done with AVAssetExportSession. We need to specify a preset and verify that the asset supports it — only then can it be used.

Here the preset is AVAssetExportPreset640x480, which compresses quite aggressively; choose a preset according to what the server's video upload supports. Then call the export API to compress and export the video asynchronously:

- (void)compressVideoWithVideoURL:(NSURL *)videoURL
                        savedName:(NSString *)savedName
                       completion:(void (^)(NSString *savedPath))completion {
  // Access the video by URL
  AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:videoURL options:nil];
  // Find the export presets compatible with this asset
  NSArray *presets = [AVAssetExportSession exportPresetsCompatibleWithAsset:videoAsset];
  // Compress to a low resolution if it is supported. If the server doesn't
  // support uploading by streaming, you can compress to an even lower
  // resolution; otherwise you can pick a higher one.
  if ([presets containsObject:AVAssetExportPreset640x480]) {
    AVAssetExportSession *session = [[AVAssetExportSession alloc] initWithAsset:videoAsset presetName:AVAssetExportPreset640x480];
    // Create the output directory if it does not exist yet
    NSString *doc = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"];
    NSString *folder = [doc stringByAppendingPathComponent:@"HYBVideos"];
    BOOL isDir = NO;
    BOOL isExist = [[NSFileManager defaultManager] fileExistsAtPath:folder isDirectory:&isDir];
    if (!isExist || (isExist && !isDir)) {
      NSError *error = nil;
      [[NSFileManager defaultManager] createDirectoryAtPath:folder withIntermediateDirectories:YES attributes:nil error:&error];
      if (error == nil) {
        NSLog(@"Directory created");
      } else {
        NSLog(@"Failed to create directory");
      }
    }
    NSString *outPutPath = [folder stringByAppendingPathComponent:savedName];
    session.outputURL = [NSURL fileURLWithPath:outPutPath];
    // Optimize for network use
    session.shouldOptimizeForNetworkUse = YES;
    NSArray *supportedTypeArray = session.supportedFileTypes;
    if ([supportedTypeArray containsObject:AVFileTypeMPEG4]) {
      session.outputFileType = AVFileTypeMPEG4;
    } else if (supportedTypeArray.count == 0) {
      NSLog(@"No supported file types");
      return;
    } else {
      session.outputFileType = [supportedTypeArray objectAtIndex:0];
    }
    // Export the video to the output path asynchronously
    [session exportAsynchronouslyWithCompletionHandler:^{
      if ([session status] == AVAssetExportSessionStatusCompleted) {
        dispatch_async(dispatch_get_main_queue(), ^{
          if (completion) {
            completion([session.outputURL path]);
          }
        });
      } else {
        dispatch_async(dispatch_get_main_queue(), ^{
          if (completion) {
            completion(nil);
          }
        });
      }
    }];
  }
}

Fixing the View-Offset Bug Caused by Video Recording on iOS 8

On iOS 8 there is a bug: after presenting the video-recording screen and returning, the whole view has shifted downward. There are many fixes online; here is one of them:

[picker dismissViewControllerAnimated:YES completion:^{
  // Fix the iOS 8.0 issue where the frame changes after opening the camera
  // to record video
  self.tabBarController.view.frame = [[UIScreen mainScreen] bounds];
  [self.tabBarController.view layoutIfNeeded];
}];

Tip: remember to call this in both the did-finish and did-cancel delegate methods!
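For instance, the cancel callback would repeat the same dismissal block (a sketch, assuming the same tab-bar-based layout as the snippet above):

```objc
// Sketch: apply the same iOS 8 frame fix when the user cancels the picker
- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker {
  [picker dismissViewControllerAnimated:YES completion:^{
    self.tabBarController.view.frame = [[UIScreen mainScreen] bounds];
    [self.tabBarController.view layoutIfNeeded];
  }];
}
```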

