Interpret XMP metadata in ALAssetRepresentation

When a user makes changes (cropping, red-eye removal, ...) to a photo in the built-in Photos.app on iOS, the changes are not applied to the fullResolutionImage returned by the corresponding ALAssetRepresentation.

However, the changes are applied to the thumbnail and the fullScreenImage returned by the ALAssetRepresentation. Furthermore, information about the applied changes can be found in the ALAssetRepresentation's metadata dictionary via the key @"AdjustmentXMP".
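For illustration, here is a minimal sketch of reading that key (assuming an existing ALAsset named asset; the variable names are only for this example):

    // Look up the edit information that Photos.app stores with the asset.
    ALAssetRepresentation *rep = [asset defaultRepresentation];
    NSString *xmpString = rep.metadata[@"AdjustmentXMP"];

    if (xmpString != nil) {
        // The photo has been edited: thumbnail/fullScreenImage reflect the edits,
        // while fullResolutionImage is still the unmodified original.
        NSLog(@"Adjustment XMP: %@", xmpString);
    }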

I would like to apply these changes to the fullResolutionImage myself for consistency. I found out that on iOS 6+, [CIFilter filterArrayFromSerializedXMP:inputImageExtent:error:] can convert this XMP metadata into an array of CIFilter objects:

    ALAssetRepresentation *rep;
    NSString *xmpString = rep.metadata[@"AdjustmentXMP"];
    NSData *xmpData = [xmpString dataUsingEncoding:NSUTF8StringEncoding];

    CIImage *image = [CIImage imageWithCGImage:rep.fullResolutionImage];

    NSError *error = nil;
    NSArray *filterArray = [CIFilter filterArrayFromSerializedXMP:xmpData
                                                 inputImageExtent:image.extent
                                                            error:&error];
    if (error) {
        NSLog(@"Error during CIFilter creation: %@", [error localizedDescription]);
    }

    CIContext *context = [CIContext contextWithOptions:nil];

    for (CIFilter *filter in filterArray) {
        [filter setValue:image forKey:kCIInputImageKey];
        image = [filter outputImage];
    }
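The CIContext above is not actually used in that snippet; to get pixels out of the filtered CIImage, it can render the result back into a CGImage. A minimal sketch (the names editedCGImage and editedImage are only illustrative; scale and orientation are taken from the representation):

    // Render the filtered CIImage into a CGImage using the context from above.
    CGImageRef editedCGImage = [context createCGImage:image fromRect:image.extent];

    // Wrap it in a UIImage, carrying over the asset's scale and orientation.
    UIImage *editedImage = [UIImage imageWithCGImage:editedCGImage
                                               scale:rep.scale
                                         orientation:(UIImageOrientation)rep.orientation];
    CGImageRelease(editedCGImage);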

However, this only works for some filters (cropping, auto-enhance) but not for others, such as red-eye removal. In those cases the CIFilters have no visible effect. So, my questions are:

  • Does anyone know of a way to create a red-eye removal CIFilter? (One that is consistent with Photos.app; a filter with the key kCIImageAutoAdjustRedEye is not enough, e.g. it does not take parameters for the positions of the eyes; see the sketch after this list.)
  • Is it possible to generate and apply these filters under iOS 5?
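For completeness, the auto-adjust route mentioned above looks roughly like this on iOS 5+. It is only a sketch (assuming an existing CIImage named image), and, as noted, Core Image performs its own face detection and exposes no eye-position parameters, which is why its result may not match Photos.app:

    // Ask Core Image only for red-eye correction filters (suppress the
    // general enhancement filters). Eye detection happens internally; there
    // is no public parameter for explicit eye positions.
    NSDictionary *options = @{ kCIImageAutoAdjustEnhance : @NO };
    NSArray *redEyeFilters = [image autoAdjustmentFiltersWithOptions:options];

    for (CIFilter *filter in redEyeFilters) {
        [filter setValue:image forKey:kCIInputImageKey];
        image = [filter outputImage];
    }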
    ALAssetRepresentation* representation = [[self assetAtIndex:index] defaultRepresentation];

    // Create a buffer to hold the data for the asset's image
    uint8_t *buffer = (Byte *)malloc(representation.size);

    // Copy the data from the asset into the buffer
    NSUInteger length = [representation getBytes:buffer fromOffset:0 length:representation.size error:nil];
    if (length == 0)
        return nil;

    // Convert the buffer into a NSData object, and free the buffer after.
    NSData *adata = [[NSData alloc] initWithBytesNoCopy:buffer length:representation.size freeWhenDone:YES];

    // Set up a dictionary with a UTI hint. The UTI hint identifies the type
    // of image we are dealing with (that is, a jpeg, png, or a possible
    // RAW file).
    // Specify the source hint.
    NSDictionary* sourceOptionsDict = [NSDictionary dictionaryWithObjectsAndKeys:
                                       (id)[representation UTI], kCGImageSourceTypeIdentifierHint,
                                       nil];

    // Create a CGImageSource with the NSData. An image source can
    // contain x number of thumbnails and full images.
    CGImageSourceRef sourceRef = CGImageSourceCreateWithData((CFDataRef)adata, (CFDictionaryRef)sourceOptionsDict);
    [adata release];

    CFDictionaryRef imagePropertiesDictionary;

    // Get a copy of the image properties from the CGImageSourceRef.
    imagePropertiesDictionary = CGImageSourceCopyPropertiesAtIndex(sourceRef, 0, NULL);

    CFNumberRef imageWidth  = (CFNumberRef)CFDictionaryGetValue(imagePropertiesDictionary, kCGImagePropertyPixelWidth);
    CFNumberRef imageHeight = (CFNumberRef)CFDictionaryGetValue(imagePropertiesDictionary, kCGImagePropertyPixelHeight);

    int w = 0;
    int h = 0;

    CFNumberGetValue(imageWidth, kCFNumberIntType, &w);
    CFNumberGetValue(imageHeight, kCFNumberIntType, &h);

    // Clean up memory
    CFRelease(imagePropertiesDictionary);
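If the goal is to then decode the image itself from that source, one possible continuation is sketched below; CGImageSourceCreateImageAtIndex is a standard Image I/O call, but the surrounding structure and names are illustrative. Note that sourceRef also needs to be released once it is no longer needed:

    // Decode the (full-size) image at index 0 from the image source above.
    CGImageRef fullImage = CGImageSourceCreateImageAtIndex(sourceRef, 0, NULL);

    if (fullImage) {
        // w and h from above give the pixel dimensions of this image.
        UIImage *result = [UIImage imageWithCGImage:fullImage];
        CGImageRelease(fullImage);
        // ... use result ...
    }

    // Release the image source when it is no longer needed.
    CFRelease(sourceRef);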