Support HDR video playback, editing, and export in your app
News category: Programming
Source: developer.apple.com
You can help people create more vivid and true-to-life video when you support high dynamic range (HDR) in your app. And when you support HDR with Dolby Vision, people with iPhone 12 or iPhone 12 Pro can go even further and shoot, edit, and play cinema-grade videos right from their device. Dolby Vision tuning is provided dynamically to each frame, preserving the intended look of the original shots.
Here's how you can provide the best HDR video playback, editing, and export experience.
Get started with HDR video
Your app needs to run on iOS 14.1 or later to take advantage of HDR video. To begin, we recommend reviewing a few WWDC sessions that provide a good overview of the process:
Export HDR media in your app with AVFoundation
Edit and play back HDR video with AVFoundation
Note: iPhone 12 and iPhone 12 Pro record HDR video in Dolby Vision Profile 8.4, Cross-compatibility ID 4 (HLG) format, using an HEVC (10-bit) codec. This format is designed to be backwards compatible with HLG, allowing existing HEVC decoders to decode as HLG. Video is recorded by the Camera app as a QuickTime File Format (QTFF) movie (.mov extension). Signaling for Dolby Vision in a QTFF movie is similar to signaling in Dolby Vision Streams within the ISO base media file format.
Learn more about Dolby Vision Profiles
Learn more about Dolby Vision Levels
Learn more about Dolby Vision Streams
Support HDR video playback in your app
Both iOS and macOS support HDR video playback on all eligible devices. Use eligibleForHDRPlayback on AVPlayer to check for HDR playback support on the current device. In general, the classes AVPlayer, AVPlayerLayer, or AVSampleBufferDisplayLayer can be used to play Dolby Vision video. If your app uses AVPlayer, you don't need to add anything additional to your code: The AVFoundation framework automatically sets up an HDR playback pipeline to handle Dolby Vision Profile 8.4 if it detects an asset in Dolby Vision and the device supports HDR playback.
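As a concrete example, here's a minimal sketch of gating HDR content on device capability; the fallback branch is hypothetical and depends on your app's available asset variants:

```swift
import AVFoundation

// eligibleForHDRPlayback is a class property on AVPlayer.
if AVPlayer.eligibleForHDRPlayback {
    // The device can render HDR; queue the Dolby Vision asset as-is.
} else {
    // Fall back to an SDR variant of the content.
}
```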
If your app uses AVSampleBufferDisplayLayer to render video, make sure any sample buffers passed to the sample buffer display layer are in formats suitable for HDR and carry Dolby Vision Profile 8.4 per-frame metadata. These sample buffers need to have 10-bit or higher bit depth. A commonly used 10-bit format is 4:2:0 Y'CbCr video range, represented by kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange. The associated OSType for this pixel format is 'x420'.
If your sample buffers are decoded using VTDecompressionSession, you can carry the Dolby Vision Profile 8.4 per-frame metadata in the buffers by using kVTDecompressionPropertyKey_PropagatePerFrameHDRDisplayMetadata. This property is true by default.
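Because the property defaults to true, you only need to set it when opting out; here's a sketch of setting it explicitly on an existing session (session creation is omitted, and `session` is assumed to be a previously created VTDecompressionSession):

```swift
import VideoToolbox

// Propagation of per-frame HDR metadata is on by default;
// this call simply makes the default explicit.
VTSessionSetProperty(
    session,
    key: kVTDecompressionPropertyKey_PropagatePerFrameHDRDisplayMetadata,
    value: kCFBooleanTrue
)
```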
Asset inspection
AVMediaCharacteristic provides options for specifying media type characteristics, including whether a video includes HDR metadata. You can use the Swift media characteristic containsHDRVideo to identify whether any segment of a track contains HDR so that your app can render it correctly. In Objective-C, you can use AVMediaCharacteristicContainsHDRVideo, defined in AVMediaFormat.h.
After loading the tracks property using the Swift method loadValuesAsynchronously(forKeys:completionHandler:), you can get HDR tracks using tracks(withMediaCharacteristic:). Here's how you might get all desired HDR tracks:
let hdrTracks = asset.tracks(withMediaCharacteristic: .containsHDRVideo)
In a similar fashion, you can use the Objective-C method loadValuesAsynchronouslyForKeys:completionHandler: to load the tracks property and obtain the HDR tracks with the method tracksWithMediaCharacteristic:, like so:
NSArray<AVAssetTrack *> *hdrTracks = [asset tracksWithMediaCharacteristic:AVMediaCharacteristicContainsHDRVideo];
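The Swift one-liner above assumes the tracks key is already loaded; a fuller sketch of the loading flow might look like this, where movieURL is a placeholder for your own media file:

```swift
import AVFoundation

let asset = AVAsset(url: movieURL) // movieURL: hypothetical local .mov file

asset.loadValuesAsynchronously(forKeys: ["tracks"]) {
    var error: NSError?
    guard asset.statusOfValue(forKey: "tracks", error: &error) == .loaded else {
        return // Handle the loading failure in your app.
    }
    let hdrTracks = asset.tracks(withMediaCharacteristic: .containsHDRVideo)
    // Hand hdrTracks to your HDR-aware rendering pipeline.
}
```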
The hasMediaCharacteristic(_:) method can be used to check a track for media characteristics, such as HDR media type, format descriptions, or explicit tagging. For example:
if track.hasMediaCharacteristic(.containsHDRVideo) {
    // Process HDR track
}
In Objective-C, you can use the same hasMediaCharacteristic: method for explicit tagging, as demonstrated here:
if ([track hasMediaCharacteristic:AVMediaCharacteristicContainsHDRVideo]) {
    // Process HDR track
}
Support HDR video editing and previewing in your app
To add HDR content editing to your application, use AVVideoComposition. If you're using the built-in compositor, you can also use the Swift initializer init(asset:applyingCIFiltersWithHandler:) or the Objective-C initializer videoCompositionWithAsset:applyingCIFiltersWithHandler: with built-in CIFilters to easily incorporate an HDR editing pipeline in your app.
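For instance, a filter-based HDR edit can be set up in a few lines. This sketch applies a built-in Gaussian blur and assumes `asset` is an HDR AVAsset you've already created:

```swift
import AVFoundation
import CoreImage

// The handler is called once per frame with the source image.
let composition = AVVideoComposition(asset: asset) { request in
    let output = request.sourceImage.applyingGaussianBlur(sigma: 2.0)
    request.finish(with: output, context: nil)
}
// Assign `composition` to an AVPlayerItem's videoComposition for preview,
// or to an export session for rendering.
```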
Custom compositors can support HDR content, too: You can use the supportsHDRSourceFrames property to indicate HDR capability. For Objective-C, the supportsHDRSourceFrames property is part of the AVVideoCompositing protocol defined in AVVideoCompositing.h.
If your custom compositor needs to operate in 10-bit HDR pixel formats, you'll need to select pixel buffer attributes that your compositor can accept as input by using the sourcePixelBufferAttributes property. For Objective-C, this property is found in AVVideoCompositing.h. The value of this property is a dictionary containing pixel buffer attribute keys defined in the CoreVideo header file CVPixelBuffer.h.
Additionally, to create new buffers for processing, you'll need the correct pixel buffer attributes required by the video compositor. For this purpose, use the requiredPixelBufferAttributesForRenderContext property.
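A skeleton of such a custom compositor might look like the following; the class name and the pass-through render logic are illustrative, not a complete implementation:

```swift
import AVFoundation
import CoreVideo

class HDRCompositor: NSObject, AVVideoCompositing {
    // Advertise that this compositor can receive HDR source frames.
    var supportsHDRSourceFrames: Bool { true }

    // Accept 10-bit 4:2:0 Y'CbCr ('x420') buffers as input.
    var sourcePixelBufferAttributes: [String: Any]? {
        [kCVPixelBufferPixelFormatTypeKey as String:
            [kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange]]
    }

    // Request the same 10-bit format for newly created render buffers.
    var requiredPixelBufferAttributesForRenderContext: [String: Any] {
        [kCVPixelBufferPixelFormatTypeKey as String:
            [kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange]]
    }

    func renderContextChanged(_ newContext: AVVideoCompositionRenderContext) {}

    func startRequest(_ request: AVAsynchronousVideoCompositionRequest) {
        // Pass-through sketch: return the first source frame unchanged.
        if let trackID = request.sourceTrackIDs.first?.int32Value,
           let frame = request.sourceFrame(byTrackID: trackID) {
            request.finish(withComposedVideoFrame: frame)
        } else {
            request.finish(with: NSError(domain: "HDRCompositor", code: -1))
        }
    }
}
```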
If your app offers video previewing during editing, modifying the pixel values may invalidate the video's existing dynamic metadata and its usage. Because the Dolby Vision Profile 8.4 metadata is completely transparent, you can use AVPlayerItem to drop any invalid metadata during preview-only scenarios, as well as update dynamic metadata during export to reflect changes in the video content.
To configure HDR settings, you can use the appliesPerFrameHDRDisplayMetadata property on AVPlayerItem, which defaults to true. In Objective-C, the property defaults to YES and can be found in AVPlayerItem.h.
By default, AVFoundation will attempt to use Dolby Vision metadata if present for a video, but you can tell your app to ignore it: Just set the appliesPerFrameHDRDisplayMetadata property on AVPlayerItem to false in Swift, or NO in Objective-C. If your application is using VTDecompressionSession APIs from VideoToolbox, you can turn off Dolby Vision tone mapping with kVTDecompressionPropertyKey_PropagatePerFrameHDRDisplayMetadata. To use this property in C or Objective-C, include the VideoToolbox framework header VTDecompressionProperties.h.
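In Swift, the opt-out described above is a one-line change (`item` is a hypothetical AVPlayerItem for your HDR asset):

```swift
import AVFoundation

let item = AVPlayerItem(asset: asset) // asset: your HDR AVAsset

// Defaults to true; disable to ignore Dolby Vision per-frame metadata,
// e.g. while previewing edits that would invalidate it.
item.appliesPerFrameHDRDisplayMetadata = false
```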
Support HDR export in your app
You can support HDR video export in your app when you work with AVAssetWriter and HEVC presets.
Discover presets and AVAssetExportSession
All HEVC presets have been upgraded to support HDR. The output format will match the source format, so if the source file is Dolby Vision Profile 8.4, the exported movie will also be Dolby Vision Profile 8.4. If you need to change the output format, you can use AVAssetWriter.
Note: H.264 presets will convert HDR to Standard Dynamic Range (SDR).
In order to preserve Dolby Vision Profile 8.4 during export using AVAssetWriter, you must choose a suitable output format, color properties that support Dolby Vision, and a 10-bit profile level.
To start, note that querying supportedOutputSettingsKeys(for:) in Swift or supportedOutputSettingsKeysForConnection: in Objective-C provides a list of output settings keys supported for the current device.
For Dolby Vision export, the video output settings dictionary key AVVideoCompressionPropertiesKey allows you to control bit rate, B-frame delivery, I-frame interval, and codec quality. The value associated with this key is an instance of NSDictionary. For Objective-C, this key is found in AVVideoSettings.h.
For example, a video output settings dictionary for Dolby Vision in Swift would contain these key/value pairs:
let videoOutputSettings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.hevc,
    AVVideoProfileLevelKey: kVTProfileLevel_HEVC_Main10_AutoLevel,
    AVVideoColorPropertiesKey: [
        AVVideoColorPrimariesKey: AVVideoColorPrimaries_ITU_R_2020,
        AVVideoTransferFunctionKey: AVVideoTransferFunction_ITU_R_2100_HLG,
        AVVideoYCbCrMatrixKey: AVVideoYCbCrMatrix_ITU_R_2020
    ],
    AVVideoCompressionPropertiesKey: [
        kVTCompressionPropertyKey_HDRMetadataInsertionMode: kVTHDRMetadataInsertionMode_Auto
    ]
]
With Objective-C, your video output settings dictionary would contain the same key/value pairs:
NSDictionary<NSString*, id>* videoOutputSettings = @{
    AVVideoCodecKey: AVVideoCodecTypeHEVC,
    AVVideoProfileLevelKey: (__bridge NSString*)kVTProfileLevel_HEVC_Main10_AutoLevel,
    AVVideoColorPropertiesKey: @{
        AVVideoColorPrimariesKey: AVVideoColorPrimaries_ITU_R_2020,
        AVVideoTransferFunctionKey: AVVideoTransferFunction_ITU_R_2100_HLG,
        AVVideoYCbCrMatrixKey: AVVideoYCbCrMatrix_ITU_R_2020
    },
    AVVideoCompressionPropertiesKey: @{
        (__bridge NSString*)kVTCompressionPropertyKey_HDRMetadataInsertionMode: (__bridge NSString*)kVTHDRMetadataInsertionMode_Auto
    }
};
In Objective-C, the key kVTCompressionPropertyKey_HDRMetadataInsertionMode and the value kVTHDRMetadataInsertionMode_Auto are found in VTCompressionProperties.h.
In addition to defining key/value pairs, make sure that the pixel buffers presented to AVAssetWriterInput are in the 10-bit 4:2:0 Y'CbCr video-range format represented by the 'x420' OSType.
You may elect to use a separate AVAssetReader and AVAssetWriter model for export. In that case, you can use the VideoToolbox property kVTCompressionPropertyKey_PreserveDynamicHDRMetadata and set it to kCFBooleanFalse for C/Objective-C or false for Swift. When you set the VideoToolbox property, AVAssetWriter will automatically recompute the Dolby Vision Profile 8.4 metadata for exporting the file. This should be done as your app modifies the output frames from the AVAssetReader.
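Putting the settings together, a minimal writer setup might look like the following sketch; outputURL and the frame dimensions are placeholders, and note that AVAssetWriterInput additionally requires width and height keys beyond the dictionary shown earlier:

```swift
import AVFoundation
import VideoToolbox

let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)

let settings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.hevc,
    AVVideoWidthKey: 3840,   // Placeholder dimensions.
    AVVideoHeightKey: 2160,
    AVVideoProfileLevelKey: kVTProfileLevel_HEVC_Main10_AutoLevel,
    AVVideoColorPropertiesKey: [
        AVVideoColorPrimariesKey: AVVideoColorPrimaries_ITU_R_2020,
        AVVideoTransferFunctionKey: AVVideoTransferFunction_ITU_R_2100_HLG,
        AVVideoYCbCrMatrixKey: AVVideoYCbCrMatrix_ITU_R_2020
    ],
    AVVideoCompressionPropertiesKey: [
        kVTCompressionPropertyKey_HDRMetadataInsertionMode as String:
            kVTHDRMetadataInsertionMode_Auto
    ]
]

let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
writer.add(input)
// Append 'x420' pixel buffers via an AVAssetWriterInputPixelBufferAdaptor.
```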
Resources
Learn more about Dolby Vision Profiles
...