Keeping good scroll performance when using AVPlayer

I'm working on an application that has a collection view, and the cells of the collection view can contain video. Right now I'm displaying the video using AVPlayer and AVPlayerLayer. Unfortunately, the scrolling performance is terrible. AVPlayer, AVPlayerItem and AVPlayerLayer seem to do a lot of their work on the main thread: they are constantly taking out locks, waiting on semaphores, etc., which blocks the main thread and causes severe frame drops.

Is there any way to tell AVPlayer to stop doing so much work on the main thread? So far nothing I've tried has solved the problem.

I also tried building a simple video player using AVSampleBufferDisplayLayer. With that I can make sure everything happens off the main thread, and I can hit around 60fps while scrolling and playing video (a rough sketch of that approach is below). Unfortunately it is much lower level, and it doesn't provide things like audio playback and time scrubbing out of the box. Is there any way to get similar performance with AVPlayer? I'd much rather use it.
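
For context, the AVSampleBufferDisplayLayer approach boils down to reading decoded sample buffers yourself and enqueueing them on the layer whenever it asks for more data. Roughly like this sketch (the class name, URL handling and queue label are placeholders, and pacing, looping and audio are left out entirely, which is exactly the problem with this approach):

    import AVFoundation
    import UIKit

    final class SampleBufferVideoView: UIView {
        // AVSampleBufferDisplayLayer draws CMSampleBuffers we feed it; nothing here
        // has to touch the main thread except adding the layer to the view hierarchy.
        private let displayLayer = AVSampleBufferDisplayLayer()
        private let readerQueue = DispatchQueue(label: "video.reader")  // placeholder label

        func play(url: URL) {
            layer.addSublayer(displayLayer)
            displayLayer.frame = bounds

            readerQueue.async {
                let asset = AVURLAsset(url: url)
                guard let track = asset.tracks(withMediaType: .video).first,
                      let reader = try? AVAssetReader(asset: asset) else { return }

                // Ask for decoded frames in a pixel format the layer can display.
                let output = AVAssetReaderTrackOutput(
                    track: track,
                    outputSettings: [kCVPixelBufferPixelFormatTypeKey as String:
                                     kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange])
                reader.add(output)
                reader.startReading()

                // Enqueue buffers whenever the layer is ready for more data.
                // Real code also needs a control timebase for correct pacing,
                // plus audio, scrubbing, looping, error handling, etc.
                self.displayLayer.requestMediaDataWhenReady(on: self.readerQueue) {
                    while self.displayLayer.isReadyForMoreMediaData,
                          let buffer = output.copyNextSampleBuffer() {
                        self.displayLayer.enqueue(buffer)
                    }
                    if reader.status != .reading {
                        self.displayLayer.stopRequestingMediaData()
                    }
                }
            }
        }
    }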

Edit: After investigating further, it doesn't look like it's possible to achieve good scrolling performance with AVPlayer. Creating an AVPlayer and associating it with an AVPlayerItem instance kicks off a bunch of work that trampolines onto the main thread, where it waits on semaphores and tries to acquire a bunch of locks. The amount of time this stalls the main thread grows with the number of videos in the scroll view.

AVPlayer dealloc also seems to be a huge problem. Deallocating an AVPlayer also tries to synchronize a bunch of stuff. Again, this gets extremely bad as you create more players.
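
A common partial mitigation for the dealloc part is to make sure the final strong reference to the player is released on a background queue, so the synchronous teardown at least doesn't stall scrolling. A rough sketch, assuming the cell owns hypothetical `player` and `playerLayer` properties:

    import AVFoundation
    import UIKit

    final class VideoCell: UICollectionViewCell {
        // Hypothetical properties for illustration.
        var player: AVPlayer?
        let playerLayer = AVPlayerLayer()

        func tearDownPlayer() {
            playerLayer.player = nil                  // detach on the main thread
            guard let oldPlayer = player else { return }
            player = nil
            DispatchQueue.global(qos: .utility).async {
                oldPlayer.pause()
                // This closure now holds the last strong reference; when it returns,
                // the AVPlayer (and its item) are deallocated on this background
                // queue instead of blocking the main thread during scrolling.
            }
        }
    }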

This is pretty depressing, and it makes AVPlayer almost unusable for what I'm trying to do. Blocking the main thread like this is such an amateurish thing to do that it's hard to believe Apple engineers would have made this kind of mistake. Anyway, hopefully they can fix it soon.

Construct your AVPlayerItem on a background queue as much as possible (certain operations you have to do on the main thread, but you can do the setup work and wait for the video properties to load on background queues – read the docs very carefully). This involves a voodoo dance with KVO and is really not fun.

The hiccup happens while the AVPlayer is waiting for the AVPlayerItem's status to become AVPlayerItemStatusReadyToPlay. To reduce the length of the hiccup, get the AVPlayerItem as far along as you can on a background thread before assigning it to the AVPlayer (see the sketch below).
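
For illustration, the status part of that KVO dance can look roughly like this (a sketch using Swift key-path observation; the class and property names are placeholders):

    import AVFoundation

    final class PlayerPreparer {
        // Hypothetical properties for illustration.
        private var player: AVPlayer?
        private var statusObservation: NSKeyValueObservation?
        let playerLayer = AVPlayerLayer()

        func prepare(url: URL) {
            DispatchQueue.global(qos: .userInitiated).async {
                // Build the item and player off the main thread.
                let item = AVPlayerItem(url: url)
                let player = AVPlayer(playerItem: item)
                self.player = player

                // Wait for the item to report ready-to-play before touching the UI.
                self.statusObservation = item.observe(\.status, options: [.new]) { [weak self] item, _ in
                    guard item.status == .readyToPlay else { return }
                    DispatchQueue.main.async {
                        self?.playerLayer.player = player
                        player.play()
                    }
                }
            }
        }
    }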

It's been a while since I actually implemented this, but IIRC the main thread blocks are caused by the underlying AVURLAsset's properties being lazily loaded – if you don't load them yourself, they get busy-loaded on the main thread when the AVPlayer wants to play.

Check out the AVAsset documentation, especially AVAsynchronousKeyValueLoading. I think we needed to load the values for duration and tracks before using the asset with an AVPlayer to minimize the main thread blocks. It's possible we also had to walk through each of the tracks and do AVAsynchronousKeyValueLoading on each of the segments, but I don't remember 100%.

Not sure if this will help – but here's some code I use to load videos on a background queue that definitely helps with main thread blocking (apologies if it doesn't compile 1:1, I abstracted it from a larger code base I'm working on):

    func loadSource() {
        self.status = .Unknown

        let operation = NSBlockOperation()
        operation.addExecutionBlock { () -> Void in
            // create the asset
            let asset = AVURLAsset(URL: self.mediaUrl, options: nil)
            // load values for track keys
            let keys = ["tracks", "duration"]
            asset.loadValuesAsynchronouslyForKeys(keys, completionHandler: { () -> Void in
                // Loop through and check to make sure keys loaded
                var keyStatusError: NSError?
                for key in keys {
                    var error: NSError?
                    let keyStatus: AVKeyValueStatus = asset.statusOfValueForKey(key, error: &error)
                    if keyStatus == .Failed {
                        let userInfo = [NSUnderlyingErrorKey : key]
                        keyStatusError = NSError(domain: MovieSourceErrorDomain, code: MovieSourceAssetFailedToLoadKeyValueErrorCode, userInfo: userInfo)
                        println("Failed to load key: \(key), error: \(error)")
                    } else if keyStatus != .Loaded {
                        println("Warning: Ignoring key status: \(keyStatus), for key: \(key), error: \(error)")
                    }
                }
                if keyStatusError == nil {
                    if operation.cancelled == false {
                        let composition = self.createCompositionFromAsset(asset)
                        // register notifications
                        let playerItem = AVPlayerItem(asset: composition)
                        self.registerNotificationsForItem(playerItem)
                        self.playerItem = playerItem
                        // create the player
                        let player = AVPlayer(playerItem: playerItem)
                        self.player = player
                    }
                } else {
                    println("Failed to load asset: \(keyStatusError)")
                }
            })
        }
        // add operation to the queue
        SomeBackgroundQueue.addOperation(operation)
    }

    func createCompositionFromAsset(asset: AVAsset, repeatCount: UInt8 = 16) -> AVMutableComposition {
        let composition = AVMutableComposition()
        let timescale = asset.duration.timescale
        let duration = asset.duration.value
        let editRange = CMTimeRangeMake(CMTimeMake(0, timescale), CMTimeMake(duration, timescale))
        var error: NSError?
        let success = composition.insertTimeRange(editRange, ofAsset: asset, atTime: composition.duration, error: &error)
        if success {
            for _ in 0 ..< repeatCount - 1 {
                composition.insertTimeRange(editRange, ofAsset: asset, atTime: composition.duration, error: &error)
            }
        }
        return composition
    }

If you look at Facebook's AsyncDisplayKit (the engine behind the Facebook and Instagram feeds), you can render video mostly on background threads using its ASVideoNode. If you subnode that into an ASDisplayNode and add displayNode.view to whatever view you're scrolling (table/collection/scroll), you can achieve perfectly smooth scrolling (just make sure you create the node and the asset on a background thread). The only issue is changing the video item, since that forces itself onto the main thread. If you only have a few videos in that particular view, you're fine using this method!

    dispatch_async(dispatch_get_global_queue(QOS_CLASS_BACKGROUND, 0), {
        self.mainNode = ASDisplayNode()
        self.videoNode = ASVideoNode()
        self.videoNode!.asset = AVAsset(URL: self.videoUrl!)
        self.videoNode!.frame = CGRectMake(0.0, 0.0, self.bounds.width, self.bounds.height)
        self.videoNode!.gravity = AVLayerVideoGravityResizeAspectFill
        self.videoNode!.shouldAutoplay = true
        self.videoNode!.shouldAutorepeat = true
        self.videoNode!.muted = true
        self.videoNode!.playButton.hidden = true

        dispatch_async(dispatch_get_main_queue(), {
            self.mainNode!.addSubnode(self.videoNode!)
            self.addSubview(self.mainNode!.view)
        })
    })

Here's a working solution for displaying a "video wall" in a UICollectionView:

1) Store all of your cells in an NSMapTable (from here on, you will only access a cell object from the NSMapTable):

    self.cellCache = [[NSMapTable alloc] initWithKeyOptions:NSPointerFunctionsWeakMemory
                                               valueOptions:NSPointerFunctionsStrongMemory
                                                   capacity:AppDelegate.sharedAppDelegate.assetsFetchResults.count];
    for (NSInteger i = 0; i < AppDelegate.sharedAppDelegate.assetsFetchResults.count; i++) {
        [self.cellCache setObject:(AssetPickerCollectionViewCell *)[self.collectionView dequeueReusableCellWithReuseIdentifier:CellReuseIdentifier
                                                                                                                  forIndexPath:[NSIndexPath indexPathForItem:i inSection:0]]
                           forKey:[NSIndexPath indexPathForItem:i inSection:0]];
    }

2) Add this method to your UICollectionViewCell subclass:

    - (void)setupPlayer:(PHAsset *)phAsset {
        typedef void (^player) (void);
        player play = ^{
            NSString __autoreleasing *serialDispatchCellQueueDescription = ([NSString stringWithFormat:@"%@ serial cell queue", self]);
            dispatch_queue_t __autoreleasing serialDispatchCellQueue = dispatch_queue_create([serialDispatchCellQueueDescription UTF8String], DISPATCH_QUEUE_SERIAL);
            dispatch_async(serialDispatchCellQueue, ^{
                __weak typeof(self) weakSelf = self;
                __weak typeof(PHAsset) *weakPhAsset = phAsset;
                [[PHImageManager defaultManager] requestPlayerItemForVideo:weakPhAsset
                                                                   options:nil
                                                             resultHandler:^(AVPlayerItem * _Nullable playerItem, NSDictionary * _Nullable info) {
                    if (![[info objectForKey:PHImageResultIsInCloudKey] boolValue]) {
                        AVPlayer __autoreleasing *player = [AVPlayer playerWithPlayerItem:playerItem];
                        __block typeof(AVPlayerLayer) *weakPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
                        [weakPlayerLayer setFrame:weakSelf.contentView.bounds]; //CGRectMake(self.contentView.bounds.origin.x, self.contentView.bounds.origin.y, [[UIScreen mainScreen] bounds].size.width, [[UIScreen mainScreen] bounds].size.height * (9.0/16.0))];
                        [weakPlayerLayer setVideoGravity:AVLayerVideoGravityResizeAspect];
                        [weakPlayerLayer setBorderWidth:0.25f];
                        [weakPlayerLayer setBorderColor:[UIColor whiteColor].CGColor];
                        [player play];
                        dispatch_async(dispatch_get_main_queue(), ^{
                            [weakSelf.contentView.layer addSublayer:weakPlayerLayer];
                        });
                    }
                }];
            });
        };
        play();
    }

3) Call the method above from your UICollectionView delegate, like this:

    - (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView cellForItemAtIndexPath:(NSIndexPath *)indexPath {
        if ([[self.cellCache objectForKey:indexPath] isKindOfClass:[AssetPickerCollectionViewCell class]])
            [self.cellCache setObject:(AssetPickerCollectionViewCell *)[collectionView dequeueReusableCellWithReuseIdentifier:CellReuseIdentifier forIndexPath:indexPath]
                               forKey:indexPath];

        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^{
            NSInvocationOperation *invOp = [[NSInvocationOperation alloc]
                                            initWithTarget:(AssetPickerCollectionViewCell *)[self.cellCache objectForKey:indexPath]
                                                  selector:@selector(setupPlayer:)
                                                    object:AppDelegate.sharedAppDelegate.assetsFetchResults[indexPath.item]];
            [[NSOperationQueue mainQueue] addOperation:invOp];
        });

        return (AssetPickerCollectionViewCell *)[self.cellCache objectForKey:indexPath];
    }

By the way, here is how you would populate a PHFetchResult collection with all of the videos in the Videos folder of the Photos app:

    // Collect all videos in the Videos folder of the Photos app
    - (PHFetchResult *)assetsFetchResults {
        __block PHFetchResult *i = self->_assetsFetchResults;
        if (!i) {
            static dispatch_once_t onceToken;
            dispatch_once(&onceToken, ^{
                PHFetchResult *smartAlbums = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeSmartAlbum
                                                                                      subtype:PHAssetCollectionSubtypeSmartAlbumVideos
                                                                                      options:nil];
                PHAssetCollection *collection = smartAlbums.firstObject;
                if (![collection isKindOfClass:[PHAssetCollection class]])
                    collection = nil;
                PHFetchOptions *allPhotosOptions = [[PHFetchOptions alloc] init];
                allPhotosOptions.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:NO]];
                i = [PHAsset fetchAssetsInAssetCollection:collection options:allPhotosOptions];
                self->_assetsFetchResults = i;
            });
        }
        NSLog(@"assetsFetchResults (%ld)", self->_assetsFetchResults.count);
        return i;
    }

And if you want to filter for videos that are stored locally (not in iCloud), which is what I'd assume since you're looking for smooth scrolling:

    // Filter out videos that are stored in iCloud
    - (NSArray *)phAssets {
        NSMutableArray *assets = [NSMutableArray arrayWithCapacity:self.assetsFetchResults.count];
        [[self assetsFetchResults] enumerateObjectsUsingBlock:^(PHAsset *asset, NSUInteger idx, BOOL *stop) {
            if (asset.sourceType == PHAssetSourceTypeUserLibrary)
                [assets addObject:asset];
        }];
        return [NSArray arrayWithArray:(NSArray *)assets];
    }

I managed to create a horizontal feed-like view with an AVPlayer in each cell, like so:

1. Buffering – Create a manager so you can preload (buffer) the videos. The number of AVPlayers you want to buffer depends on the experience you're looking for. In my app I only manage 3 AVPlayers, so one player is playing now and the previous and next players are being buffered. All the buffering manager does is make sure the right video is being buffered at any given point (a minimal sketch of such a manager appears after the code below).

2. Reusing cells – Let the TableView / CollectionView reuse the cells in cellForRowAtIndexPath: ; all you have to do after dequeueing a cell is pass it its correct player (I just give the cell's index path to the buffering manager and it returns the correct player).

3. AVPlayer KVOs – Every time the buffering manager gets a call to load a new video to buffer, the AVPlayer creates all of its assets and notifications; just call them like so:

// player

    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0), ^{
        self.videoContainer.playerLayer.player = self.videoPlayer;
        self.asset = [AVURLAsset assetWithURL:[NSURL URLWithString:self.videoUrl]];
        NSString *tracksKey = @"tracks";
        dispatch_async(dispatch_get_main_queue(), ^{
            [self.asset loadValuesAsynchronouslyForKeys:@[tracksKey] completionHandler:^{
                dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0), ^{
                    NSError *error;
                    AVKeyValueStatus status = [self.asset statusOfValueForKey:tracksKey error:&error];
                    if (status == AVKeyValueStatusLoaded) {
                        self.playerItem = [AVPlayerItem playerItemWithAsset:self.asset];

                        // add the notifications on the video
                        // set the notifications we need to get at run time on the player & item

                        // a notification if the current item state has changed
                        [self.playerItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:contextItemStatus];
                        // a notification if the playing item has not yet started to buffer
                        [self.playerItem addObserver:self forKeyPath:@"playbackBufferEmpty" options:NSKeyValueObservingOptionNew context:contextPlaybackBufferEmpty];
                        // a notification if the playing item has fully buffered
                        [self.playerItem addObserver:self forKeyPath:@"playbackBufferFull" options:NSKeyValueObservingOptionNew context:contextPlaybackBufferFull];
                        // a notification if the playing item is likely to keep up with the current buffering rate
                        [self.playerItem addObserver:self forKeyPath:@"playbackLikelyToKeepUp" options:NSKeyValueObservingOptionNew context:contextPlaybackLikelyToKeepUp];
                        // a notification to get information about the duration of the playing item
                        [self.playerItem addObserver:self forKeyPath:@"duration" options:NSKeyValueObservingOptionNew context:contextDurationUpdate];
                        // a notification to know when the video has finished playing
                        [NotificationCenter addObserver:self selector:@selector(itemDidFinishedPlaying:) name:AVPlayerItemDidPlayToEndTimeNotification object:self.playerItem];
                        self.didRegisterWhenLoad = YES;

                        self.videoPlayer = [AVPlayer playerWithPlayerItem:self.playerItem];

                        // a notification if the player has changed its rate (play/pause)
                        [self.videoPlayer addObserver:self forKeyPath:@"rate" options:NSKeyValueObservingOptionNew context:contextRateDidChange];
                        // a notification to get the buffering rate of the current playing item
                        [self.videoPlayer addObserver:self forKeyPath:@"currentItem.loadedTimeRanges" options:NSKeyValueObservingOptionNew context:contextTimeRanges];
                    }
                });
            }];
        });
    });

Where: videoContainer – is the view you want to add the player to.
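
The buffering manager from step 1 doesn't need to be complicated. Something along these lines works, where the class name, the window size of 3 and `preferredForwardBufferDuration` are illustrative choices rather than a fixed API:

    import AVFoundation

    /// Minimal sketch of a buffering manager: it keeps AVPlayers alive only for the
    /// current index and its neighbours, and hands the right player to each cell.
    final class VideoBufferManager {
        private let urls: [URL]
        private var players: [Int: AVPlayer] = [:]   // index -> preloaded player

        init(urls: [URL]) {
            self.urls = urls
        }

        /// Call when the visible cell changes (e.g. from scrollViewDidEndDecelerating).
        func setCurrentIndex(_ index: Int) {
            let window = (index - 1)...(index + 1)

            // Drop players outside the window so their teardown cost stays small.
            for key in Array(players.keys) where !window.contains(key) {
                players.removeValue(forKey: key)
            }

            // Preload players for the previous, current and next items.
            for i in window where urls.indices.contains(i) && players[i] == nil {
                let item = AVPlayerItem(url: urls[i])
                item.preferredForwardBufferDuration = 5   // buffer a few seconds ahead
                players[i] = AVPlayer(playerItem: item)
            }
        }

        /// Cells ask for their player in cellForItemAtIndexPath.
        func player(at index: Int) -> AVPlayer? {
            return players[index]
        }
    }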

Let me know if you need any help or more explanation.

Good luck :)