WebKit Bugzilla
Attachment 368254 Details for Bug 197171: Create AVFoundationSoftLink.{h,mm} to reduce duplicate code
Description: Updated patch for landing
Filename:    bug-197171-20190425113702.patch
MIME Type:   text/plain
Creator:     Eric Carlson
Created:     2019-04-25 11:37:03 PDT
Size:        257.00 KB
Flags:       patch, obsolete
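The patch below is large but mostly mechanical: files that previously declared their own AVFoundation soft-link boilerplate now import the shared pal/cocoa/AVFoundationSoftLink.h and go through the PAL::getXXXClass() and PAL::get_AVFoundation_XXX() accessors it declares. As a rough sketch of the pattern (condensed from the QuickTimePluginReplacement.mm hunk in this patch, not a complete or buildable file), a typical call site changes like this:

    // Old pattern, repeated per file and removed by this patch:
    //     SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation)
    //     SOFT_LINK_CLASS(AVFoundation, AVMetadataItem)
    //     #define AVMetadataItem getAVMetadataItemClass()
    //
    // New pattern: one shared header provides the PAL-exported soft-link accessors.
    #import <pal/cocoa/AVFoundationSoftLink.h>

    static JSValue *jsValueWithAVMetadataItemInContext(AVMetadataItem *, JSContext *);

    static JSValue *jsValueWithValueInContext(id value, JSContext *context)
    {
        // The Objective-C class is still resolved through a soft-link getter at
        // runtime (so AVFoundation is loaded lazily), but the boilerplate now
        // lives in exactly one place.
        if ([value isKindOfClass:PAL::getAVMetadataItemClass()])
            return jsValueWithAVMetadataItemInContext(value, context);
        return nil;
    }

Constants follow the same scheme: AVFoundationSoftLink.h declares each NSString * constant with SOFT_LINK_CONSTANT_FOR_HEADER and then #defines the original name to the PAL getter (for example, AVMediaTypeVideo expands to PAL::get_AVFoundation_AVMediaTypeVideo()), so existing call sites compile unchanged.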
>Subversion Revision: 244645 >diff --git a/Source/WebCore/ChangeLog b/Source/WebCore/ChangeLog >index 5ac42cac99b1d94d81d6acd37fd2ec161914793a..3e047bde45c93d2002ac812883d261d7422143e0 100644 >--- a/Source/WebCore/ChangeLog >+++ b/Source/WebCore/ChangeLog >@@ -1,3 +1,184 @@ >+2019-04-22 Eric Carlson <eric.carlson@apple.com> >+ >+ Create AVFoundationSoftLink.{h,mm} to reduce duplicate code >+ https://bugs.webkit.org/show_bug.cgi?id=197171 >+ <rdar://problem/47454979> >+ >+ Reviewed by Youenn Fablet. >+ >+ Tests: TestWebKitAPI/Tests/WebCore/cocoa/AVFoundationSoftLinkTest.mm >+ >+ * Modules/plugins/QuickTimePluginReplacement.mm: >+ (WebCore::jsValueWithValueInContext): >+ (WebCore::jsValueWithAVMetadataItemInContext): >+ * WebCore.xcodeproj/project.pbxproj: >+ * platform/audio/ios/AudioSessionIOS.mm: >+ (WebCore::AudioSession::setCategory): >+ (WebCore::AudioSession::category const): >+ (WebCore::AudioSession::routeSharingPolicy const): >+ (WebCore::AudioSession::routingContextUID const): >+ (WebCore::AudioSession::sampleRate const): >+ (WebCore::AudioSession::bufferSize const): >+ (WebCore::AudioSession::numberOfOutputChannels const): >+ (WebCore::AudioSession::tryToSetActiveInternal): >+ (WebCore::AudioSession::preferredBufferSize const): >+ (WebCore::AudioSession::setPreferredBufferSize): >+ * platform/audio/ios/MediaSessionManagerIOS.mm: >+ (-[WebMediaSessionHelper initWithCallback:]): >+ (-[WebMediaSessionHelper startMonitoringAirPlayRoutes]): >+ * platform/graphics/avfoundation/AVTrackPrivateAVFObjCImpl.mm: >+ (WebCore::AVTrackPrivateAVFObjCImpl::audioKind const): >+ (WebCore::AVTrackPrivateAVFObjCImpl::videoKind const): >+ (WebCore::AVTrackPrivateAVFObjCImpl::label const): >+ * platform/graphics/avfoundation/AudioSourceProviderAVFObjC.mm: >+ (WebCore::AudioSourceProviderAVFObjC::createMix): >+ * platform/graphics/avfoundation/MediaPlaybackTargetMac.mm: >+ * platform/graphics/avfoundation/MediaSelectionGroupAVFObjC.mm: >+ (WebCore::MediaSelectionGroupAVFObjC::updateOptions): >+ * platform/graphics/avfoundation/objc/AVFoundationMIMETypeCache.mm: >+ (WebCore::AVFoundationMIMETypeCache::canDecodeType): >+ (WebCore::AVFoundationMIMETypeCache::loadMIMETypes): >+ * platform/graphics/avfoundation/objc/CDMInstanceFairPlayStreamingAVFObjC.mm: >+ (WebCore::CDMInstanceFairPlayStreamingAVFObjC::supportsPersistableState): >+ (WebCore::CDMInstanceFairPlayStreamingAVFObjC::supportsPersistentKeys): >+ (WebCore::CDMInstanceFairPlayStreamingAVFObjC::supportsMediaCapability): >+ (WebCore::CDMInstanceFairPlayStreamingAVFObjC::initializeWithConfiguration): >+ (WebCore::CDMInstanceSessionFairPlayStreamingAVFObjC::updateLicense): >+ (WebCore::CDMInstanceSessionFairPlayStreamingAVFObjC::loadSession): >+ (WebCore::CDMInstanceSessionFairPlayStreamingAVFObjC::removeSessionData): >+ (WebCore::CDMInstanceSessionFairPlayStreamingAVFObjC::ensureSession): >+ * platform/graphics/avfoundation/objc/CDMSessionAVContentKeySession.mm: >+ (WebCore::CDMSessionAVContentKeySession::isAvailable): >+ (WebCore::CDMSessionAVContentKeySession::releaseKeys): >+ (WebCore::CDMSessionAVContentKeySession::update): >+ (WebCore::CDMSessionAVContentKeySession::generateKeyReleaseMessage): >+ (WebCore::CDMSessionAVContentKeySession::contentKeySession): >+ * platform/graphics/avfoundation/objc/CDMSessionAVFoundationObjC.mm: >+ * platform/graphics/avfoundation/objc/CDMSessionAVStreamSession.mm: >+ (WebCore::CDMSessionAVStreamSession::releaseKeys): >+ (WebCore::CDMSessionAVStreamSession::update): >+ 
(WebCore::CDMSessionAVStreamSession::setStreamSession): >+ (WebCore::CDMSessionAVStreamSession::generateKeyReleaseMessage): >+ * platform/graphics/avfoundation/objc/ImageDecoderAVFObjC.mm: >+ (WebCore::imageDecoderAssetOptions): >+ (WebCore::ImageDecoderAVFObjC::ImageDecoderAVFObjC): >+ (WebCore::ImageDecoderAVFObjC::firstEnabledTrack): >+ (WebCore::ImageDecoderAVFObjC::readSamples): >+ (SOFT_LINK_CONSTANT_MAY_FAIL): Deleted. >+ * platform/graphics/avfoundation/objc/InbandTextTrackPrivateAVFObjC.mm: >+ (WebCore::InbandTextTrackPrivateAVFObjC::label const): >+ * platform/graphics/avfoundation/objc/InbandTextTrackPrivateLegacyAVFObjC.mm: >+ (WebCore::InbandTextTrackPrivateLegacyAVFObjC::label const): >+ * platform/graphics/avfoundation/objc/MediaPlaybackTargetPickerMac.mm: >+ (WebCore::MediaPlaybackTargetPickerMac::devicePicker): >+ * platform/graphics/avfoundation/objc/MediaPlayerPrivateAVFoundationObjC.mm: >+ (WebCore::assetCacheForPath): >+ (WebCore::MediaPlayerPrivateAVFoundationObjC::clearMediaCache): >+ (WebCore::MediaPlayerPrivateAVFoundationObjC::clearMediaCacheForOrigins): >+ (WebCore::MediaPlayerPrivateAVFoundationObjC::cancelLoad): >+ (WebCore::MediaPlayerPrivateAVFoundationObjC::createImageGenerator): >+ (WebCore::MediaPlayerPrivateAVFoundationObjC::createAVPlayerLayer): >+ (WebCore::MediaPlayerPrivateAVFoundationObjC::synchronizeTextTrackState): >+ (WebCore::MediaPlayerPrivateAVFoundationObjC::createAVAssetForURL): >+ (WebCore::MediaPlayerPrivateAVFoundationObjC::setAVPlayerItem): >+ (WebCore::MediaPlayerPrivateAVFoundationObjC::createAVPlayer): >+ (WebCore::MediaPlayerPrivateAVFoundationObjC::createAVPlayerItem): >+ (WebCore::MediaPlayerPrivateAVFoundationObjC::supportsType): >+ (WebCore::MediaPlayerPrivateAVFoundationObjC::isAvailable): >+ (WebCore::MediaPlayerPrivateAVFoundationObjC::tracksChanged): >+ (WebCore::determineChangedTracksFromNewTracksAndOldItems): >+ (WebCore::MediaPlayerPrivateAVFoundationObjC::updateAudioTracks): >+ (WebCore::MediaPlayerPrivateAVFoundationObjC::updateVideoTracks): >+ (WebCore::MediaPlayerPrivateAVFoundationObjC::createVideoOutput): >+ (WebCore::MediaPlayerPrivateAVFoundationObjC::outputMediaDataWillChange): >+ (WebCore::MediaPlayerPrivateAVFoundationObjC::safeMediaSelectionGroupForLegibleMedia): >+ (WebCore::MediaPlayerPrivateAVFoundationObjC::safeMediaSelectionGroupForAudibleMedia): >+ (WebCore::MediaPlayerPrivateAVFoundationObjC::safeMediaSelectionGroupForVisualMedia): >+ (WebCore::MediaPlayerPrivateAVFoundationObjC::processMediaSelectionOptions): >+ (WebCore::MediaPlayerPrivateAVFoundationObjC::setCurrentTextTrack): >+ (WebCore::MediaPlayerPrivateAVFoundationObjC::languageOfPrimaryAudioTrack const): >+ (WebCore::MediaPlayerPrivateAVFoundationObjC::wirelessPlaybackTargetType const): >+ (WebCore::exernalDeviceDisplayNameForPlayer): >+ (WebCore::metadataType): >+ (WebCore::MediaPlayerPrivateAVFoundationObjC::metadataDidArrive): >+ (-[WebCoreAVFMovieObserver observeValueForKeyPath:ofObject:change:context:]): >+ (-[WebCoreAVFPullDelegate outputMediaDataWillChange:]): >+ (-[WebCoreAVFPullDelegate outputSequenceWasFlushed:]): >+ (WebCore::MediaPlayerPrivateAVFoundationObjC::processLegacyClosedCaptionsTracks): Deleted. 
>+ * platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.mm: >+ (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::MediaPlayerPrivateMediaSourceAVFObjC): >+ (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::isAvailable): >+ (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::supportsType): >+ (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::ensureLayer): >+ (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::streamSession): >+ * platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm: >+ (-[WebAVSampleBufferStatusChangeListener observeValueForKeyPath:ofObject:change:context:]): >+ (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::isAvailable): >+ (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::ensureLayers): >+ * platform/graphics/avfoundation/objc/SourceBufferPrivateAVFObjC.mm: >+ (-[WebAVSampleBufferErrorListener observeValueForKeyPath:ofObject:change:context:]): >+ (WebCore::SourceBufferPrivateAVFObjC::SourceBufferPrivateAVFObjC): >+ (WebCore::SourceBufferPrivateAVFObjC::~SourceBufferPrivateAVFObjC): >+ (WebCore::SourceBufferPrivateAVFObjC::trackDidChangeEnabled): >+ (WebCore::SourceBufferPrivateAVFObjC::enqueueSample): >+ * platform/graphics/ca/cocoa/PlatformCALayerCocoa.mm: >+ (WebCore::PlatformCALayerCocoa::layerTypeForPlatformLayer): >+ (WebCore::PlatformCALayerCocoa::PlatformCALayerCocoa): >+ (WebCore::PlatformCALayerCocoa::clone const): >+ (WebCore::PlatformCALayerCocoa::avPlayerLayer const): >+ * platform/graphics/cocoa/HEVCUtilitiesCocoa.mm: >+ (WebCore::validateHEVCParameters): >+ * platform/ios/PlatformSpeechSynthesizerIOS.mm: >+ (getAVSpeechUtteranceDefaultSpeechRate): >+ (getAVSpeechUtteranceMaximumSpeechRate): >+ (-[WebSpeechSynthesisWrapper speakUtterance:]): >+ (WebCore::PlatformSpeechSynthesizer::initializeVoiceList): >+ (SOFT_LINK_CONSTANT): Deleted. 
>+ * platform/ios/VideoFullscreenInterfaceAVKit.mm: >+ (-[WebAVPlayerLayer init]): >+ (-[WebAVPlayerLayer layoutSublayers]): >+ (-[WebAVPlayerLayer setVideoGravity:]): >+ (-[WebAVPlayerLayer videoRect]): >+ (WebAVPlayerLayerView_startRoutingVideoToPictureInPicturePlayerLayerView): >+ * platform/mac/SerializedPlatformRepresentationMac.mm: >+ (WebCore::jsValueWithValueInContext): >+ (WebCore::jsValueWithAVMetadataItemInContext): >+ * platform/mediarecorder/cocoa/MediaRecorderPrivateWriterCocoa.mm: >+ (WebCore::getAVFormatIDKeyWithFallback): >+ (WebCore::getAVNumberOfChannelsKeyWithFallback): >+ (WebCore::getAVSampleRateKeyWithFallback): >+ (WebCore::getAVEncoderBitRateKeyWithFallback): >+ (WebCore::MediaRecorderPrivateWriter::create): >+ (WebCore::MediaRecorderPrivateWriter::setVideoInput): >+ (WebCore::MediaRecorderPrivateWriter::setAudioInput): >+ * platform/mediastream/RealtimeVideoSource.h: >+ * platform/mediastream/VideoPreset.h: >+ * platform/mediastream/ios/AVAudioSessionCaptureDeviceManager.mm: >+ (WebCore::AVAudioSessionCaptureDeviceManager::refreshAudioCaptureDevices): >+ * platform/mediastream/ios/CoreAudioCaptureSourceIOS.mm: >+ (-[WebCoreAudioCaptureSourceIOSListener initWithCallback:]): >+ (-[WebCoreAudioCaptureSourceIOSListener handleInterruption:]): >+ * platform/mediastream/mac/AVCaptureDeviceManager.mm: >+ (WebCore::deviceIsAvailable): >+ (WebCore::AVCaptureDeviceManager::updateCachedAVCaptureDevices): >+ (WebCore::AVCaptureDeviceManager::refreshCaptureDevices): >+ (WebCore::AVCaptureDeviceManager::isAvailable): >+ (WebCore::AVCaptureDeviceManager::~AVCaptureDeviceManager): >+ * platform/mediastream/mac/AVVideoCaptureSource.mm: >+ (WebCore::AVVideoPreset::create): >+ (WebCore::AVVideoPreset::AVVideoPreset): >+ (WebCore::AVVideoCaptureSource::create): >+ (WebCore::AVVideoCaptureSource::AVVideoCaptureSource): >+ (WebCore::AVVideoCaptureSource::capabilities): >+ (WebCore::sensorOrientationFromVideoOutput): >+ (WebCore::AVVideoCaptureSource::setupSession): >+ (WebCore::AVVideoCaptureSource::frameDurationForFrameRate): >+ (WebCore::AVVideoCaptureSource::setupCaptureSession): >+ (WebCore::AVVideoCaptureSource::captureOutputDidOutputSampleBufferFromConnection): >+ (WebCore::AVVideoCaptureSource::generatePresets): >+ (-[WebCoreAVVideoCaptureSourceObserver addNotificationObservers]): >+ (-[WebCoreAVVideoCaptureSourceObserver captureOutput:didOutputSampleBuffer:fromConnection:]): >+ > 2019-04-25 Commit Queue <commit-queue@webkit.org> > > Unreviewed, rolling out r244627. >diff --git a/Source/WebCore/PAL/ChangeLog b/Source/WebCore/PAL/ChangeLog >index 00a1d395f091159ec666a1901998d1274716e6f8..6d5ed7eb53ab3819ca0804455637759e8d6c731a 100644 >--- a/Source/WebCore/PAL/ChangeLog >+++ b/Source/WebCore/PAL/ChangeLog >@@ -1,3 +1,15 @@ >+2019-04-22 Eric Carlson <eric.carlson@apple.com> >+ >+ Create AVFoundationSoftLink.{h,mm} to reduce duplicate code >+ https://bugs.webkit.org/show_bug.cgi?id=197171 >+ <rdar://problem/47454979> >+ >+ Reviewed by Youenn Fablet. >+ >+ * PAL.xcodeproj/project.pbxproj: >+ * pal/cocoa/AVFoundationSoftLink.h: Added. >+ * pal/cocoa/AVFoundationSoftLink.mm: Added. >+ > 2019-04-25 Commit Queue <commit-queue@webkit.org> > > Unreviewed, rolling out r244627. 
>diff --git a/Source/WebKit/ChangeLog b/Source/WebKit/ChangeLog >index e32eb49e62072405510b89520fbefc7013e0f7cf..2e19b66ba9a28a150e7eb42d4c060f254aac2c81 100644 >--- a/Source/WebKit/ChangeLog >+++ b/Source/WebKit/ChangeLog >@@ -1,3 +1,21 @@ >+2019-04-22 Eric Carlson <eric.carlson@apple.com> >+ >+ Create AVFoundationSoftLink.{h,mm} to reduce duplicate code >+ https://bugs.webkit.org/show_bug.cgi?id=197171 >+ <rdar://problem/47454979> >+ >+ Reviewed by Youenn Fablet. >+ >+ * Shared/ios/WebIconUtilities.mm: >+ (WebKit::iconForVideoFile): >+ * Shared/mac/WebCoreArgumentCodersMac.mm: >+ (IPC::ArgumentCoder<WebCore::MediaPlaybackTargetContext>::encodePlatformData): >+ (IPC::ArgumentCoder<WebCore::MediaPlaybackTargetContext>::decodePlatformData): >+ * UIProcess/Cocoa/UIDelegate.mm: >+ (WebKit::UIDelegate::UIClient::decidePolicyForUserMediaPermissionRequest): >+ * WebProcess/WebPage/RemoteLayerTree/PlatformCALayerRemoteCustom.mm: >+ (WebKit::PlatformCALayerRemoteCustom::clone const): >+ > 2019-04-25 Youenn Fablet <youenn@apple.com> > > [Mac iOS WK2] Layout Test http/wpt/cache-storage/cache-quota-after-restart.any.html is a flaky failure >diff --git a/Source/WebKitLegacy/mac/ChangeLog b/Source/WebKitLegacy/mac/ChangeLog >index 8a30d748fe71f014e55d11aad4fb90a52b928aa5..557d3312e583329912594bfc36250c138ecfb97f 100644 >--- a/Source/WebKitLegacy/mac/ChangeLog >+++ b/Source/WebKitLegacy/mac/ChangeLog >@@ -1,3 +1,16 @@ >+2019-04-22 Eric Carlson <eric.carlson@apple.com> >+ >+ Create AVFoundationSoftLink.{h,mm} to reduce duplicate code >+ https://bugs.webkit.org/show_bug.cgi?id=197171 >+ <rdar://problem/47454979> >+ >+ Reviewed by Youenn Fablet. >+ >+ * WebView/WebVideoFullscreenController.mm: >+ (-[WebVideoFullscreenController setVideoElement:]): >+ (-[WebVideoFullscreenController windowDidExitFullscreen]): >+ (SOFT_LINK_CLASS): Deleted. >+ > 2019-04-25 Commit Queue <commit-queue@webkit.org> > > Unreviewed, rolling out r244627. 
>diff --git a/Source/WebCore/Modules/plugins/QuickTimePluginReplacement.mm b/Source/WebCore/Modules/plugins/QuickTimePluginReplacement.mm >index 263a2036a8d5d69df761aed9dad9af502be9ba72..42ea43957c80f3f8b99bbc599e741a729548fd09 100644 >--- a/Source/WebCore/Modules/plugins/QuickTimePluginReplacement.mm >+++ b/Source/WebCore/Modules/plugins/QuickTimePluginReplacement.mm >@@ -57,18 +57,14 @@ > #import <wtf/text/Base64.h> > > #import <pal/cf/CoreMediaSoftLink.h> >- >-typedef AVMetadataItem AVMetadataItemType; >-SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >-SOFT_LINK_CLASS(AVFoundation, AVMetadataItem) >-#define AVMetadataItem getAVMetadataItemClass() >+#import <pal/cocoa/AVFoundationSoftLink.h> > > namespace WebCore { > using namespace PAL; > > #if PLATFORM(IOS_FAMILY) > static JSValue *jsValueWithValueInContext(id, JSContext *); >-static JSValue *jsValueWithAVMetadataItemInContext(AVMetadataItemType *, JSContext *); >+static JSValue *jsValueWithAVMetadataItemInContext(AVMetadataItem *, JSContext *); > #endif > > static String quickTimePluginReplacementScript() >@@ -329,13 +325,13 @@ static JSValue *jsValueWithValueInContext(id value, JSContext *context) > return jsValueWithArrayInContext(value, context); > else if ([value isKindOfClass:[NSData class]]) > return jsValueWithDataInContext(value, emptyString(), context); >- else if ([value isKindOfClass:[AVMetadataItem class]]) >+ else if ([value isKindOfClass:PAL::getAVMetadataItemClass()]) > return jsValueWithAVMetadataItemInContext(value, context); > > return nil; > } > >-static JSValue *jsValueWithAVMetadataItemInContext(AVMetadataItemType *item, JSContext *context) >+static JSValue *jsValueWithAVMetadataItemInContext(AVMetadataItem *item, JSContext *context) > { > NSMutableDictionary* dictionary = [NSMutableDictionary dictionaryWithDictionary:[item extraAttributes]]; > >diff --git a/Source/WebCore/PAL/PAL.xcodeproj/project.pbxproj b/Source/WebCore/PAL/PAL.xcodeproj/project.pbxproj >index 4884ed31e3b4a41a4f1f954018efec39bc001cbf..365637835757f06da8d4ec6b777585e42d4eefd8 100644 >--- a/Source/WebCore/PAL/PAL.xcodeproj/project.pbxproj >+++ b/Source/WebCore/PAL/PAL.xcodeproj/project.pbxproj >@@ -21,6 +21,8 @@ > /* End PBXAggregateTarget section */ > > /* Begin PBXBuildFile section */ >+ 077E87B1226A460200A2AFF0 /* AVFoundationSoftLink.mm in Sources */ = {isa = PBXBuildFile; fileRef = 077E87AF226A460200A2AFF0 /* AVFoundationSoftLink.mm */; }; >+ 077E87B2226A460300A2AFF0 /* AVFoundationSoftLink.h in Headers */ = {isa = PBXBuildFile; fileRef = 077E87B0226A460200A2AFF0 /* AVFoundationSoftLink.h */; }; > 0C00CFD41F68CE4600AAC26D /* MediaTimeAVFoundation.h in Headers */ = {isa = PBXBuildFile; fileRef = 0C00CFD21F68CE4600AAC26D /* MediaTimeAVFoundation.h */; }; > 0C2D9E731EEF5AF600DBC317 /* ExportMacros.h in Headers */ = {isa = PBXBuildFile; fileRef = 0C2D9E721EEF5AF600DBC317 /* ExportMacros.h */; }; > 0C2DA06D1F33CA8400DBC317 /* CFLocaleSPI.h in Headers */ = {isa = PBXBuildFile; fileRef = 0C2DA0671F33CA8400DBC317 /* CFLocaleSPI.h */; }; >@@ -98,7 +100,6 @@ > 0C7785A01F45130F00F4EBB6 /* QuickLookMacSPI.h in Headers */ = {isa = PBXBuildFile; fileRef = 0C7785871F45130F00F4EBB6 /* QuickLookMacSPI.h */; }; > 0C7785A11F45130F00F4EBB6 /* TelephonyUtilitiesSPI.h in Headers */ = {isa = PBXBuildFile; fileRef = 0C7785881F45130F00F4EBB6 /* TelephonyUtilitiesSPI.h */; }; > 0CF99CA41F736375007EE793 /* MediaTimeAVFoundation.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 0C00CFD11F68CE4600AAC26D /* MediaTimeAVFoundation.cpp */; }; >- 7A36D0F9223AD9AB00B0522E /* 
CommonCryptoSPI.h in Headers */ = {isa = PBXBuildFile; fileRef = 7A36D0F8223AD9AB00B0522E /* CommonCryptoSPI.h */; }; > 0CF99CA81F738437007EE793 /* CoreMediaSoftLink.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 0CF99CA61F738436007EE793 /* CoreMediaSoftLink.cpp */; }; > 0CF99CA91F738437007EE793 /* CoreMediaSoftLink.h in Headers */ = {isa = PBXBuildFile; fileRef = 0CF99CA71F738437007EE793 /* CoreMediaSoftLink.h */; }; > 1C09D0531E31C44100725F18 /* CryptoDigest.h in Headers */ = {isa = PBXBuildFile; fileRef = 1C09D0521E31C44100725F18 /* CryptoDigest.h */; }; >@@ -121,6 +122,7 @@ > 570AB8F920AF6E3D00B8BE87 /* NSXPCConnectionSPI.h in Headers */ = {isa = PBXBuildFile; fileRef = 570AB8F820AF6E3D00B8BE87 /* NSXPCConnectionSPI.h */; }; > 63C7EDC721AFAE04006A7B99 /* NSProgressSPI.h in Headers */ = {isa = PBXBuildFile; fileRef = 63E369F921AFA83F001C14BC /* NSProgressSPI.h */; }; > 7A1656441F97B2B900BA3CE4 /* NSKeyedArchiverSPI.h in Headers */ = {isa = PBXBuildFile; fileRef = 7A1656431F97B2B800BA3CE4 /* NSKeyedArchiverSPI.h */; }; >+ 7A36D0F9223AD9AB00B0522E /* CommonCryptoSPI.h in Headers */ = {isa = PBXBuildFile; fileRef = 7A36D0F8223AD9AB00B0522E /* CommonCryptoSPI.h */; }; > 7A3A6A8020CADB4700317AAE /* NSImageSPI.h in Headers */ = {isa = PBXBuildFile; fileRef = 7A3A6A7F20CADB4600317AAE /* NSImageSPI.h */; }; > A10265891F56747A00B4C844 /* HIToolboxSPI.h in Headers */ = {isa = PBXBuildFile; fileRef = A10265881F56747A00B4C844 /* HIToolboxSPI.h */; }; > A102658E1F567E9D00B4C844 /* HIServicesSPI.h in Headers */ = {isa = PBXBuildFile; fileRef = A102658D1F567E9D00B4C844 /* HIServicesSPI.h */; }; >@@ -175,6 +177,8 @@ > /* End PBXContainerItemProxy section */ > > /* Begin PBXFileReference section */ >+ 077E87AF226A460200A2AFF0 /* AVFoundationSoftLink.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = AVFoundationSoftLink.mm; sourceTree = "<group>"; }; >+ 077E87B0226A460200A2AFF0 /* AVFoundationSoftLink.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = AVFoundationSoftLink.h; sourceTree = "<group>"; }; > 0C00CFD11F68CE4600AAC26D /* MediaTimeAVFoundation.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = MediaTimeAVFoundation.cpp; sourceTree = "<group>"; }; > 0C00CFD21F68CE4600AAC26D /* MediaTimeAVFoundation.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = MediaTimeAVFoundation.h; sourceTree = "<group>"; }; > 0C2D9E721EEF5AF600DBC317 /* ExportMacros.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = ExportMacros.h; sourceTree = "<group>"; }; >@@ -182,7 +186,6 @@ > 0C2DA0681F33CA8400DBC317 /* CFNetworkConnectionCacheSPI.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = CFNetworkConnectionCacheSPI.h; sourceTree = "<group>"; }; > 0C2DA0691F33CA8400DBC317 /* CFNetworkSPI.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = CFNetworkSPI.h; sourceTree = "<group>"; }; > 0C2DA06A1F33CA8400DBC317 /* CFUtilitiesSPI.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = CFUtilitiesSPI.h; sourceTree = "<group>"; }; >- 7A36D0F8223AD9AB00B0522E /* CommonCryptoSPI.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = CommonCryptoSPI.h; sourceTree = "<group>"; }; > 0C2DA06B1F33CA8400DBC317 /* CoreAudioSPI.h */ = {isa = PBXFileReference; 
fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = CoreAudioSPI.h; sourceTree = "<group>"; }; > 0C2DA06C1F33CA8400DBC317 /* CoreMediaSPI.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = CoreMediaSPI.h; sourceTree = "<group>"; }; > 0C2DA11C1F3BE9E000DBC317 /* CoreGraphicsSPI.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = CoreGraphicsSPI.h; sourceTree = "<group>"; }; >@@ -283,6 +286,7 @@ > 570AB8F820AF6E3D00B8BE87 /* NSXPCConnectionSPI.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = NSXPCConnectionSPI.h; sourceTree = "<group>"; }; > 63E369F921AFA83F001C14BC /* NSProgressSPI.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = NSProgressSPI.h; sourceTree = "<group>"; }; > 7A1656431F97B2B800BA3CE4 /* NSKeyedArchiverSPI.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = NSKeyedArchiverSPI.h; sourceTree = "<group>"; }; >+ 7A36D0F8223AD9AB00B0522E /* CommonCryptoSPI.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = CommonCryptoSPI.h; sourceTree = "<group>"; }; > 7A3A6A7F20CADB4600317AAE /* NSImageSPI.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = NSImageSPI.h; sourceTree = "<group>"; }; > 93E5909C1F93BF1E0067F8CF /* UnencodableHandling.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = UnencodableHandling.h; sourceTree = "<group>"; }; > A10265881F56747A00B4C844 /* HIToolboxSPI.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = HIToolboxSPI.h; sourceTree = "<group>"; }; >@@ -575,6 +579,8 @@ > 1C4876DE1F8D831300CCEEBD /* cocoa */ = { > isa = PBXGroup; > children = ( >+ 077E87B0226A460200A2AFF0 /* AVFoundationSoftLink.h */, >+ 077E87AF226A460200A2AFF0 /* AVFoundationSoftLink.mm */, > F44291661FA52705002CC93E /* FileSizeFormatterCocoa.mm */, > A1F63C9D21A4DBF7006FB43B /* PassKitSoftLink.h */, > A1F63C9E21A4DBF7006FB43B /* PassKitSoftLink.mm */, >@@ -672,6 +678,7 @@ > buildActionMask = 2147483647; > files = ( > 2D02E93C2056FAA700A13797 /* AudioToolboxSPI.h in Headers */, >+ 077E87B2226A460300A2AFF0 /* AVFoundationSoftLink.h in Headers */, > 0C7785891F45130F00F4EBB6 /* AVFoundationSPI.h in Headers */, > 0C2DA13E1F3BEB4900DBC317 /* AVKitSPI.h in Headers */, > CDF91113220E4EEC001EA39E /* CelestialSPI.h in Headers */, >@@ -877,6 +884,7 @@ > isa = PBXSourcesBuildPhase; > buildActionMask = 2147483647; > files = ( >+ 077E87B1226A460200A2AFF0 /* AVFoundationSoftLink.mm in Sources */, > 0C5FFF0F1F78D9DA009EFF1A /* ClockCM.mm in Sources */, > 0CF99CA81F738437007EE793 /* CoreMediaSoftLink.cpp in Sources */, > 1C09D0561E31C46500725F18 /* CryptoDigestCommonCrypto.cpp in Sources */, >diff --git a/Source/WebCore/PAL/pal/cocoa/AVFoundationSoftLink.h b/Source/WebCore/PAL/pal/cocoa/AVFoundationSoftLink.h >new file mode 100644 >index 0000000000000000000000000000000000000000..32e6771f6892d597306946645b88271366acb794 >--- /dev/null >+++ b/Source/WebCore/PAL/pal/cocoa/AVFoundationSoftLink.h >@@ -0,0 +1,287 @@ >+/* >+ * Copyright (C) 2019 Apple Inc. All rights reserved. >+ * >+ * Redistribution and use in source and binary forms, with or without >+ * modification, are permitted provided that the following conditions >+ * are met: >+ * 1. Redistributions of source code must retain the above copyright >+ * notice, this list of conditions and the following disclaimer. >+ * 2. 
Redistributions in binary form must reproduce the above copyright >+ * notice, this list of conditions and the following disclaimer in the >+ * documentation and/or other materials provided with the distribution. >+ * >+ * THIS SOFTWARE IS PROVIDED BY APPLE INC. AND ITS CONTRIBUTORS ``AS IS'' >+ * AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, >+ * THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR >+ * PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL APPLE INC. OR ITS CONTRIBUTORS >+ * BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR >+ * CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF >+ * SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS >+ * INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN >+ * CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) >+ * ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF >+ * THE POSSIBILITY OF SUCH DAMAGE. >+ */ >+ >+#pragma once >+ >+#if USE(AVFOUNDATION) >+ >+#import <AVFoundation/AVFoundation.h> >+#import <wtf/SoftLinking.h> >+ >+SOFT_LINK_FRAMEWORK_FOR_HEADER(PAL, AVFoundation) >+ >+// Note: We don't define accessor macros for classes (e.g. >+// #define AVAssetCache PAL::getAVAssetCacheClass() >+// because they make it difficult to use the class name in source code. >+ >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVAssetCache) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVAssetImageGenerator) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVAssetReader) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVAssetReaderSampleReferenceOutput) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVAssetResourceLoadingRequest) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVAssetWriter) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVAssetWriterInput) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVCaptureConnection) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVCaptureDevice) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVCaptureDeviceFormat) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVCaptureDeviceInput) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVCaptureOutput) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVCaptureSession) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVCaptureVideoDataOutput) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVContentKeyResponse) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVContentKeySession) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVFrameRateRange) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVMediaSelectionGroup) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVMediaSelectionOption) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVMetadataItem) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVMutableAudioMix) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVMutableAudioMixInputParameters) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVOutputContext) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVPlayer) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVPlayerItem) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVPlayerItemLegibleOutput) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVPlayerItemVideoOutput) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVPlayerLayer) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVSampleBufferAudioRenderer) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVSampleBufferDisplayLayer) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVSampleBufferRenderSynchronizer) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVStreamDataParser) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVURLAsset) >+ >+#if HAVE(AVSTREAMSESSION) && ENABLE(LEGACY_ENCRYPTED_MEDIA) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVStreamSession) >+#endif >+ >+#if PLATFORM(IOS_FAMILY) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVAudioSession) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVPersistableContentKeyRequest) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVSpeechSynthesisVoice) 
>+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVSpeechSynthesizer) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVSpeechUtterance) >+#endif >+ >+#if !PLATFORM(WATCHOS) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVRouteDetector) >+SOFT_LINK_CLASS_FOR_HEADER(PAL, AVVideoPerformanceMetrics) >+#endif >+ >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVAudioTimePitchAlgorithmSpectral, NSString *) >+#define AVAudioTimePitchAlgorithmSpectral PAL::get_AVFoundation_AVAudioTimePitchAlgorithmSpectral() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVAudioTimePitchAlgorithmVarispeed, NSString *) >+#define AVAudioTimePitchAlgorithmVarispeed PAL::get_AVFoundation_AVAudioTimePitchAlgorithmVarispeed() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMediaCharacteristicVisual, NSString *) >+#define AVMediaCharacteristicVisual PAL::get_AVFoundation_AVMediaCharacteristicVisual() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMediaCharacteristicAudible, NSString *) >+#define AVMediaCharacteristicAudible PAL::get_AVFoundation_AVMediaCharacteristicAudible() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMediaTypeClosedCaption, NSString *) >+#define AVMediaTypeClosedCaption PAL::get_AVFoundation_AVMediaTypeClosedCaption() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMediaTypeVideo, NSString *) >+#define AVMediaTypeVideo PAL::get_AVFoundation_AVMediaTypeVideo() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMediaTypeMuxed, NSString *) >+#define AVMediaTypeMuxed PAL::get_AVFoundation_AVMediaTypeMuxed() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMediaTypeAudio, NSString *) >+#define AVMediaTypeAudio PAL::get_AVFoundation_AVMediaTypeAudio() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMediaTypeMetadata, NSString *) >+#define AVMediaTypeMetadata PAL::get_AVFoundation_AVMediaTypeMetadata() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVURLAssetInheritURIQueryComponentFromReferencingURIKey, NSString *) >+#define AVURLAssetInheritURIQueryComponentFromReferencingURIKey PAL::get_AVFoundation_AVURLAssetInheritURIQueryComponentFromReferencingURIKey() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVAssetImageGeneratorApertureModeCleanAperture, NSString *) >+#define AVAssetImageGeneratorApertureModeCleanAperture PAL::get_AVFoundation_AVAssetImageGeneratorApertureModeCleanAperture() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVURLAssetReferenceRestrictionsKey, NSString *) >+#define AVURLAssetReferenceRestrictionsKey PAL::get_AVFoundation_AVURLAssetReferenceRestrictionsKey() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVLayerVideoGravityResizeAspect, NSString *) >+#define AVLayerVideoGravityResizeAspect PAL::get_AVFoundation_AVLayerVideoGravityResizeAspect() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVLayerVideoGravityResizeAspectFill, NSString *) >+#define AVLayerVideoGravityResizeAspectFill PAL::get_AVFoundation_AVLayerVideoGravityResizeAspectFill() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVLayerVideoGravityResize, NSString *) >+#define AVLayerVideoGravityResize PAL::get_AVFoundation_AVLayerVideoGravityResize() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVStreamingKeyDeliveryContentKeyType, NSString *) >+#define AVStreamingKeyDeliveryContentKeyType PAL::get_AVFoundation_AVStreamingKeyDeliveryContentKeyType() >+ >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVCaptureDeviceWasConnectedNotification, NSString *) >+#define AVCaptureDeviceWasConnectedNotification PAL::get_AVFoundation_AVCaptureDeviceWasConnectedNotification() 
>+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVCaptureDeviceWasDisconnectedNotification, NSString *) >+#define AVCaptureDeviceWasDisconnectedNotification PAL::get_AVFoundation_AVCaptureDeviceWasDisconnectedNotification() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVPlayerItemDidPlayToEndTimeNotification, NSString *) >+#define AVPlayerItemDidPlayToEndTimeNotification PAL::get_AVFoundation_AVPlayerItemDidPlayToEndTimeNotification() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVStreamSessionContentProtectionSessionIdentifierChangedNotification, NSString *) >+#define AVStreamSessionContentProtectionSessionIdentifierChangedNotification PAL::get_AVFoundation_AVStreamSessionContentProtectionSessionIdentifierChangedNotification() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVSampleBufferDisplayLayerFailedToDecodeNotification, NSString*) >+#define AVSampleBufferDisplayLayerFailedToDecodeNotification PAL::get_AVFoundation_AVSampleBufferDisplayLayerFailedToDecodeNotification() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVSampleBufferDisplayLayerFailedToDecodeNotificationErrorKey, NSString*) >+#define AVSampleBufferDisplayLayerFailedToDecodeNotificationErrorKey PAL::get_AVFoundation_AVSampleBufferDisplayLayerFailedToDecodeNotificationErrorKey() >+ >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMediaCharacteristicContainsOnlyForcedSubtitles, NSString *) >+#define AVMediaCharacteristicContainsOnlyForcedSubtitles PAL::get_AVFoundation_AVMediaCharacteristicContainsOnlyForcedSubtitles() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMediaCharacteristicLegible, NSString *) >+#define AVMediaCharacteristicLegible PAL::get_AVFoundation_AVMediaCharacteristicLegible() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVPlayerItemLegibleOutputTextStylingResolutionSourceAndRulesOnly, NSString *) >+#define AVPlayerItemLegibleOutputTextStylingResolutionSourceAndRulesOnly PAL::get_AVFoundation_AVPlayerItemLegibleOutputTextStylingResolutionSourceAndRulesOnly() >+ >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMetadataCommonKeyTitle, NSString *) >+#define AVMetadataCommonKeyTitle PAL::get_AVFoundation_AVMetadataCommonKeyTitle() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMetadataKeySpaceCommon, NSString *) >+#define AVMetadataKeySpaceCommon PAL::get_AVFoundation_AVMetadataKeySpaceCommon() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMediaTypeSubtitle, NSString *) >+#define AVMediaTypeSubtitle PAL::get_AVFoundation_AVMediaTypeSubtitle() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMediaCharacteristicIsMainProgramContent, NSString *) >+#define AVMediaCharacteristicIsMainProgramContent PAL::get_AVFoundation_AVMediaCharacteristicIsMainProgramContent() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMediaCharacteristicEasyToRead, NSString *) >+#define AVMediaCharacteristicEasyToRead PAL::get_AVFoundation_AVMediaCharacteristicEasyToRead() >+ >+SOFT_LINK_CONSTANT_MAY_FAIL_FOR_HEADER(PAL, AVFoundation, AVURLAssetOutOfBandMIMETypeKey, NSString *) >+#define AVURLAssetOutOfBandMIMETypeKey PAL::get_AVFoundation_AVURLAssetOutOfBandMIMETypeKey() >+SOFT_LINK_CONSTANT_MAY_FAIL_FOR_HEADER(PAL, AVFoundation, AVURLAssetUseClientURLLoadingExclusively, NSString *) >+#define AVURLAssetUseClientURLLoadingExclusively PAL::get_AVFoundation_AVURLAssetUseClientURLLoadingExclusively() >+ >+SOFT_LINK_CONSTANT_MAY_FAIL_FOR_HEADER(PAL, AVFoundation, AVContentKeySystemFairPlayStreaming, NSString*) >+#define AVContentKeySystemFairPlayStreaming 
PAL::get_AVFoundation_AVContentKeySystemFairPlayStreaming() >+SOFT_LINK_CONSTANT_MAY_FAIL_FOR_HEADER(PAL, AVFoundation, AVContentKeyRequestProtocolVersionsKey, NSString *) >+#define AVContentKeyRequestProtocolVersionsKey PAL::get_AVFoundation_AVContentKeyRequestProtocolVersionsKey() >+ >+SOFT_LINK_CONSTANT_MAY_FAIL_FOR_HEADER(PAL, AVFoundation, AVVideoCodecTypeHEVCWithAlpha, NSString *) >+#define AVVideoCodecTypeHEVCWithAlpha PAL::get_AVFoundation_AVVideoCodecTypeHEVCWithAlpha() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVFileTypeMPEG4, NSString *) >+#define AVFileTypeMPEG4 PAL::get_AVFoundation_AVFileTypeMPEG4() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVVideoCodecKey, NSString *) >+#define AVVideoCodecKey PAL::get_AVFoundation_AVVideoCodecKey() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVVideoCodecH264, NSString *) >+#define AVVideoCodecH264 PAL::get_AVFoundation_AVVideoCodecH264() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVVideoWidthKey, NSString *) >+#define AVVideoWidthKey PAL::get_AVFoundation_AVVideoWidthKey() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVVideoHeightKey, NSString *) >+#define AVVideoHeightKey PAL::get_AVFoundation_AVVideoHeightKey() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVVideoExpectedSourceFrameRateKey, NSString *) >+#define AVVideoExpectedSourceFrameRateKey PAL::get_AVFoundation_AVVideoExpectedSourceFrameRateKey() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVVideoProfileLevelKey, NSString *) >+#define AVVideoProfileLevelKey PAL::get_AVFoundation_AVVideoProfileLevelKey() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVVideoAverageBitRateKey, NSString *) >+#define AVVideoAverageBitRateKey PAL::get_AVFoundation_AVVideoAverageBitRateKey() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVVideoMaxKeyFrameIntervalKey, NSString *) >+#define AVVideoMaxKeyFrameIntervalKey PAL::get_AVFoundation_AVVideoMaxKeyFrameIntervalKey() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVVideoProfileLevelH264MainAutoLevel, NSString *) >+#define AVVideoProfileLevelH264MainAutoLevel PAL::get_AVFoundation_AVVideoProfileLevelH264MainAutoLevel() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVVideoCompressionPropertiesKey, NSString *) >+#define AVVideoCompressionPropertiesKey PAL::get_AVFoundation_AVVideoCompressionPropertiesKey() >+SOFT_LINK_CONSTANT_MAY_FAIL_FOR_HEADER(PAL, AVFoundation, AVEncoderBitRateKey, NSString *) >+#define AVEncoderBitRateKey PAL::get_AVFoundation_AVEncoderBitRateKey() >+SOFT_LINK_CONSTANT_MAY_FAIL_FOR_HEADER(PAL, AVFoundation, AVFormatIDKey, NSString *) >+#define AVFormatIDKey PAL::get_AVFoundation_AVFormatIDKey() >+SOFT_LINK_CONSTANT_MAY_FAIL_FOR_HEADER(PAL, AVFoundation, AVNumberOfChannelsKey, NSString *) >+#define AVNumberOfChannelsKey PAL::get_AVFoundation_AVNumberOfChannelsKey() >+SOFT_LINK_CONSTANT_MAY_FAIL_FOR_HEADER(PAL, AVFoundation, AVSampleRateKey, NSString *) >+#define AVSampleRateKey PAL::get_AVFoundation_AVSampleRateKey() >+ >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVURLAssetCacheKey, NSString *) >+#define AVURLAssetCacheKey PAL::get_AVFoundation_AVURLAssetCacheKey() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVURLAssetOutOfBandAlternateTracksKey, NSString *) >+#define AVURLAssetOutOfBandAlternateTracksKey PAL::get_AVFoundation_AVURLAssetOutOfBandAlternateTracksKey() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVURLAssetUsesNoPersistentCacheKey, NSString *) >+#define AVURLAssetUsesNoPersistentCacheKey 
PAL::get_AVFoundation_AVURLAssetUsesNoPersistentCacheKey() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVOutOfBandAlternateTrackDisplayNameKey, NSString *) >+#define AVOutOfBandAlternateTrackDisplayNameKey PAL::get_AVFoundation_AVOutOfBandAlternateTrackDisplayNameKey() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVOutOfBandAlternateTrackExtendedLanguageTagKey, NSString *) >+#define AVOutOfBandAlternateTrackExtendedLanguageTagKey PAL::get_AVFoundation_AVOutOfBandAlternateTrackExtendedLanguageTagKey() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVOutOfBandAlternateTrackIsDefaultKey, NSString *) >+#define AVOutOfBandAlternateTrackIsDefaultKey PAL::get_AVFoundation_AVOutOfBandAlternateTrackIsDefaultKey() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVOutOfBandAlternateTrackMediaCharactersticsKey, NSString *) >+#define AVOutOfBandAlternateTrackMediaCharactersticsKey PAL::get_AVFoundation_AVOutOfBandAlternateTrackMediaCharactersticsKey() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVOutOfBandAlternateTrackIdentifierKey, NSString *) >+#define AVOutOfBandAlternateTrackIdentifierKey PAL::get_AVFoundation_AVOutOfBandAlternateTrackIdentifierKey() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVOutOfBandAlternateTrackSourceKey, NSString *) >+#define AVOutOfBandAlternateTrackSourceKey PAL::get_AVFoundation_AVOutOfBandAlternateTrackSourceKey() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMediaCharacteristicDescribesMusicAndSoundForAccessibility, NSString *) >+#define AVMediaCharacteristicDescribesMusicAndSoundForAccessibility PAL::get_AVFoundation_AVMediaCharacteristicDescribesMusicAndSoundForAccessibility() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMediaCharacteristicTranscribesSpokenDialogForAccessibility, NSString *) >+#define AVMediaCharacteristicTranscribesSpokenDialogForAccessibility PAL::get_AVFoundation_AVMediaCharacteristicTranscribesSpokenDialogForAccessibility() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMediaCharacteristicIsAuxiliaryContent, NSString *) >+#define AVMediaCharacteristicIsAuxiliaryContent PAL::get_AVFoundation_AVMediaCharacteristicIsAuxiliaryContent() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMediaCharacteristicDescribesVideoForAccessibility, NSString *) >+#define AVMediaCharacteristicDescribesVideoForAccessibility PAL::get_AVFoundation_AVMediaCharacteristicDescribesVideoForAccessibility() >+ >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMetadataKeySpaceQuickTimeUserData, NSString *) >+#define AVMetadataKeySpaceQuickTimeUserData PAL::get_AVFoundation_AVMetadataKeySpaceQuickTimeUserData() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMetadataKeySpaceQuickTimeMetadata, NSString *) >+#define AVMetadataKeySpaceQuickTimeMetadata PAL::get_AVFoundation_AVMetadataKeySpaceQuickTimeMetadata() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMetadataKeySpaceiTunes, NSString *) >+#define AVMetadataKeySpaceiTunes PAL::get_AVFoundation_AVMetadataKeySpaceiTunes() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMetadataKeySpaceID3, NSString *) >+#define AVMetadataKeySpaceID3 PAL::get_AVFoundation_AVMetadataKeySpaceID3() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMetadataKeySpaceISOUserData, NSString *) >+#define AVMetadataKeySpaceISOUserData PAL::get_AVFoundation_AVMetadataKeySpaceISOUserData() >+ >+#if PLATFORM(MAC) >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVStreamDataParserContentKeyRequestProtocolVersionsKey, NSString *) >+#define 
AVStreamDataParserContentKeyRequestProtocolVersionsKey PAL::get_AVFoundation_AVStreamDataParserContentKeyRequestProtocolVersionsKey() >+#endif >+ >+#if PLATFORM(IOS_FAMILY) >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVURLAssetBoundNetworkInterfaceName, NSString *) >+#define AVURLAssetBoundNetworkInterfaceName PAL::get_AVFoundation_AVURLAssetBoundNetworkInterfaceName() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVURLAssetClientBundleIdentifierKey, NSString *) >+#define AVURLAssetClientBundleIdentifierKey PAL::get_AVFoundation_AVURLAssetClientBundleIdentifierKey() >+SOFT_LINK_CONSTANT_MAY_FAIL_FOR_HEADER(PAL, AVFoundation, AVURLAssetHTTPCookiesKey, NSString *) >+#define AVURLAssetHTTPCookiesKey PAL::get_AVFoundation_AVURLAssetHTTPCookiesKey() >+SOFT_LINK_CONSTANT_MAY_FAIL_FOR_HEADER(PAL, AVFoundation, AVURLAssetRequiresCustomURLLoadingKey, NSString *) >+#define AVURLAssetRequiresCustomURLLoadingKey PAL::get_AVFoundation_AVURLAssetRequiresCustomURLLoadingKey() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVCaptureSessionRuntimeErrorNotification, NSString *) >+#define AVCaptureSessionRuntimeErrorNotification PAL::get_AVFoundation_AVCaptureSessionRuntimeErrorNotification() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVCaptureSessionWasInterruptedNotification, NSString *) >+#define AVCaptureSessionWasInterruptedNotification PAL::get_AVFoundation_AVCaptureSessionWasInterruptedNotification() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVCaptureSessionInterruptionEndedNotification, NSString *) >+#define AVCaptureSessionInterruptionEndedNotification PAL::get_AVFoundation_AVCaptureSessionInterruptionEndedNotification() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVCaptureSessionInterruptionReasonKey, NSString *) >+#define AVCaptureSessionInterruptionReasonKey PAL::get_AVFoundation_AVCaptureSessionInterruptionReasonKey() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVCaptureSessionErrorKey, NSString *) >+#define AVCaptureSessionErrorKey PAL::get_AVFoundation_AVCaptureSessionErrorKey() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVAudioSessionCategoryAmbient, NSString *) >+#define AVAudioSessionCategoryAmbient PAL::get_AVFoundation_AVAudioSessionCategoryAmbient() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVAudioSessionCategorySoloAmbient, NSString *) >+#define AVAudioSessionCategorySoloAmbient PAL::get_AVFoundation_AVAudioSessionCategorySoloAmbient() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVAudioSessionCategoryPlayback, NSString *) >+#define AVAudioSessionCategoryPlayback PAL::get_AVFoundation_AVAudioSessionCategoryPlayback() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVAudioSessionCategoryRecord, NSString *) >+#define AVAudioSessionCategoryRecord PAL::get_AVFoundation_AVAudioSessionCategoryRecord() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVAudioSessionCategoryPlayAndRecord, NSString *) >+#define AVAudioSessionCategoryPlayAndRecord PAL::get_AVFoundation_AVAudioSessionCategoryPlayAndRecord() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVAudioSessionCategoryAudioProcessing, NSString *) >+#define AVAudioSessionCategoryAudioProcessing PAL::get_AVFoundation_AVAudioSessionCategoryAudioProcessing() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVAudioSessionModeDefault, NSString *) >+#define AVAudioSessionModeDefault PAL::get_AVFoundation_AVAudioSessionModeDefault() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVAudioSessionModeVideoChat, NSString *) >+#define AVAudioSessionModeVideoChat 
PAL::get_AVFoundation_AVAudioSessionModeVideoChat() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVAudioSessionInterruptionNotification, NSString *) >+#define AVAudioSessionInterruptionNotification PAL::get_AVFoundation_AVAudioSessionInterruptionNotification() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVAudioSessionInterruptionTypeKey, NSString *) >+#define AVAudioSessionInterruptionTypeKey PAL::get_AVFoundation_AVAudioSessionInterruptionTypeKey() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVAudioSessionInterruptionOptionKey, NSString *) >+#define AVAudioSessionInterruptionOptionKey PAL::get_AVFoundation_AVAudioSessionInterruptionOptionKey() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVRouteDetectorMultipleRoutesDetectedDidChangeNotification, NSString *) >+#define AVRouteDetectorMultipleRoutesDetectedDidChangeNotification PAL::get_AVFoundation_AVRouteDetectorMultipleRoutesDetectedDidChangeNotification() >+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVAudioSessionMediaServicesWereResetNotification, NSString *) >+#define AVAudioSessionMediaServicesWereResetNotification PAL::get_AVFoundation_AVAudioSessionMediaServicesWereResetNotification() >+#endif // PLATFORM(IOS_FAMILY) >+ >+#endif // USE(AVFOUNDATION) >diff --git a/Source/WebCore/PAL/pal/cocoa/AVFoundationSoftLink.mm b/Source/WebCore/PAL/pal/cocoa/AVFoundationSoftLink.mm >new file mode 100644 >index 0000000000000000000000000000000000000000..e571e11ae4413436c15304ce2cb9c8ee7fd77d18 >--- /dev/null >+++ b/Source/WebCore/PAL/pal/cocoa/AVFoundationSoftLink.mm >@@ -0,0 +1,183 @@ >+/* >+ * Copyright (C) 2019 Apple Inc. All rights reserved. >+ * >+ * Redistribution and use in source and binary forms, with or without >+ * modification, are permitted provided that the following conditions >+ * are met: >+ * 1. Redistributions of source code must retain the above copyright >+ * notice, this list of conditions and the following disclaimer. >+ * 2. Redistributions in binary form must reproduce the above copyright >+ * notice, this list of conditions and the following disclaimer in the >+ * documentation and/or other materials provided with the distribution. >+ * >+ * THIS SOFTWARE IS PROVIDED BY APPLE INC. AND ITS CONTRIBUTORS ``AS IS'' >+ * AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, >+ * THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR >+ * PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL APPLE INC. OR ITS CONTRIBUTORS >+ * BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR >+ * CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF >+ * SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS >+ * INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN >+ * CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE, PAL_EXPORT) >+ * ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF >+ * THE POSSIBILITY OF SUCH DAMAGE. 
>+ */ >+ >+#import "config.h" >+ >+#if USE(AVFOUNDATION) >+ >+#import <AVFoundation/AVFoundation.h> >+#import <wtf/SoftLinking.h> >+ >+SOFT_LINK_FRAMEWORK_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, PAL_EXPORT) >+ >+SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAssetCache, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAssetImageGenerator, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAssetReader, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAssetWriter, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAssetWriterInput, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVCaptureConnection, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVCaptureDevice, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVCaptureDeviceFormat, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVCaptureDeviceInput, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVCaptureOutput, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVCaptureSession, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVCaptureVideoDataOutput, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVFrameRateRange, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMediaSelectionGroup, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMediaSelectionOption, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMetadataItem, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMutableAudioMix, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMutableAudioMixInputParameters, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVOutputContext, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVPlayer, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVPlayerItem, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVPlayerItemLegibleOutput, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVPlayerItemVideoOutput, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVPlayerLayer, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVURLAsset, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_OPTIONAL_WITH_EXPORT(PAL, AVFoundation, AVAssetReaderSampleReferenceOutput, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_OPTIONAL_WITH_EXPORT(PAL, AVFoundation, AVAssetResourceLoadingRequest, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_OPTIONAL_WITH_EXPORT(PAL, AVFoundation, AVContentKeyResponse, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_OPTIONAL_WITH_EXPORT(PAL, AVFoundation, AVContentKeySession, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_OPTIONAL_WITH_EXPORT(PAL, AVFoundation, AVSampleBufferAudioRenderer, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_OPTIONAL_WITH_EXPORT(PAL, AVFoundation, AVSampleBufferDisplayLayer, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_OPTIONAL_WITH_EXPORT(PAL, AVFoundation, AVSampleBufferRenderSynchronizer, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_OPTIONAL_WITH_EXPORT(PAL, AVFoundation, AVStreamDataParser, PAL_EXPORT) >+ >+#if HAVE(AVSTREAMSESSION) && ENABLE(LEGACY_ENCRYPTED_MEDIA) >+SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVStreamSession, PAL_EXPORT) >+#endif >+ >+#if 
PLATFORM(IOS_FAMILY) >+SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAudioSession, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVPersistableContentKeyRequest, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVSpeechSynthesisVoice, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVSpeechSynthesizer, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVSpeechUtterance, PAL_EXPORT) >+#endif >+ >+#if !PLATFORM(WATCHOS) >+SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVRouteDetector, PAL_EXPORT) >+SOFT_LINK_CLASS_FOR_SOURCE_OPTIONAL_WITH_EXPORT(PAL, AVFoundation, AVVideoPerformanceMetrics, PAL_EXPORT) >+#endif >+ >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAssetImageGeneratorApertureModeCleanAperture, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAudioTimePitchAlgorithmSpectral, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAudioTimePitchAlgorithmVarispeed, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVCaptureDeviceWasConnectedNotification, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVCaptureDeviceWasDisconnectedNotification, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVFileTypeMPEG4, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVLayerVideoGravityResize, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVLayerVideoGravityResizeAspect, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVLayerVideoGravityResizeAspectFill, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMediaCharacteristicAudible, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMediaCharacteristicContainsOnlyForcedSubtitles, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMediaCharacteristicDescribesMusicAndSoundForAccessibility, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMediaCharacteristicDescribesVideoForAccessibility, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMediaCharacteristicEasyToRead, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMediaCharacteristicIsAuxiliaryContent, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMediaCharacteristicIsMainProgramContent, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMediaCharacteristicLegible, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMediaCharacteristicTranscribesSpokenDialogForAccessibility, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMediaCharacteristicVisual, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMediaTypeAudio, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMediaTypeClosedCaption, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMediaTypeMetadata, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMediaTypeMuxed, 
NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMediaTypeSubtitle, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMediaTypeVideo, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMetadataCommonKeyTitle, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMetadataKeySpaceCommon, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMetadataKeySpaceID3, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMetadataKeySpaceISOUserData, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMetadataKeySpaceQuickTimeMetadata, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMetadataKeySpaceQuickTimeUserData, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMetadataKeySpaceiTunes, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVOutOfBandAlternateTrackDisplayNameKey, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVOutOfBandAlternateTrackExtendedLanguageTagKey, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVOutOfBandAlternateTrackIdentifierKey, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVOutOfBandAlternateTrackIsDefaultKey, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVOutOfBandAlternateTrackMediaCharactersticsKey, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVOutOfBandAlternateTrackSourceKey, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVPlayerItemDidPlayToEndTimeNotification, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVPlayerItemLegibleOutputTextStylingResolutionSourceAndRulesOnly, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVSampleBufferDisplayLayerFailedToDecodeNotification, NSString*, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVSampleBufferDisplayLayerFailedToDecodeNotificationErrorKey, NSString*, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVStreamDataParserContentKeyRequestProtocolVersionsKey, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVStreamSessionContentProtectionSessionIdentifierChangedNotification, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVStreamingKeyDeliveryContentKeyType, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVURLAssetCacheKey, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVURLAssetInheritURIQueryComponentFromReferencingURIKey, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVURLAssetOutOfBandAlternateTracksKey, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVURLAssetReferenceRestrictionsKey, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVURLAssetUsesNoPersistentCacheKey, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVVideoAverageBitRateKey, NSString *, 
PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVVideoCodecH264, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVVideoCodecKey, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVVideoCompressionPropertiesKey, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVVideoExpectedSourceFrameRateKey, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVVideoHeightKey, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVVideoMaxKeyFrameIntervalKey, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVVideoProfileLevelH264MainAutoLevel, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVVideoProfileLevelKey, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVVideoWidthKey, NSString *, PAL_EXPORT) >+ >+SOFT_LINK_CONSTANT_MAY_FAIL_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVContentKeyRequestProtocolVersionsKey, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_MAY_FAIL_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVContentKeySystemFairPlayStreaming, NSString*, PAL_EXPORT) >+SOFT_LINK_CONSTANT_MAY_FAIL_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVEncoderBitRateKey, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_MAY_FAIL_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVFormatIDKey, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_MAY_FAIL_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVNumberOfChannelsKey, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_MAY_FAIL_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVSampleRateKey, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_MAY_FAIL_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVURLAssetOutOfBandMIMETypeKey, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_MAY_FAIL_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVURLAssetUseClientURLLoadingExclusively, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_MAY_FAIL_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVVideoCodecTypeHEVCWithAlpha, NSString *, PAL_EXPORT) >+ >+#if PLATFORM(IOS_FAMILY) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAudioSessionCategoryAmbient, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAudioSessionCategoryAudioProcessing, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAudioSessionCategoryPlayAndRecord, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAudioSessionCategoryPlayback, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAudioSessionCategoryRecord, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAudioSessionCategorySoloAmbient, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAudioSessionInterruptionNotification, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAudioSessionInterruptionOptionKey, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAudioSessionInterruptionTypeKey, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAudioSessionMediaServicesWereResetNotification, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAudioSessionModeDefault, NSString *, PAL_EXPORT) 
>+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAudioSessionModeVideoChat, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVCaptureSessionErrorKey, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVCaptureSessionInterruptionEndedNotification, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVCaptureSessionInterruptionReasonKey, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVCaptureSessionRuntimeErrorNotification, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVCaptureSessionWasInterruptedNotification, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVRouteDetectorMultipleRoutesDetectedDidChangeNotification, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVURLAssetBoundNetworkInterfaceName, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVURLAssetClientBundleIdentifierKey, NSString *, PAL_EXPORT) >+ >+SOFT_LINK_CONSTANT_MAY_FAIL_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVURLAssetHTTPCookiesKey, NSString *, PAL_EXPORT) >+SOFT_LINK_CONSTANT_MAY_FAIL_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVURLAssetRequiresCustomURLLoadingKey, NSString *, PAL_EXPORT) >+#endif >+ >+#endif // USE(AVFOUNDATION) >diff --git a/Source/WebCore/WebCore.xcodeproj/project.pbxproj b/Source/WebCore/WebCore.xcodeproj/project.pbxproj >index ee2f1b435374a097057828ee608fb41694e43370..d717275a8b79aaa462797c96125a5b87ddf62554 100644 >--- a/Source/WebCore/WebCore.xcodeproj/project.pbxproj >+++ b/Source/WebCore/WebCore.xcodeproj/project.pbxproj >@@ -100,6 +100,7 @@ > 0709FC4E1025DEE30059CDBA /* AccessibilitySlider.h in Headers */ = {isa = PBXBuildFile; fileRef = 0709FC4D1025DEE30059CDBA /* AccessibilitySlider.h */; }; > 070E09191875EEFC003A1D3C /* PlatformMediaSession.h in Headers */ = {isa = PBXBuildFile; fileRef = 070E09181875ED93003A1D3C /* PlatformMediaSession.h */; settings = {ATTRIBUTES = (Private, ); }; }; > 070E81D11BF27656001FDA48 /* VideoTrackPrivateMediaStream.h in Headers */ = {isa = PBXBuildFile; fileRef = 070E81D01BF27656001FDA48 /* VideoTrackPrivateMediaStream.h */; }; >+ CDA79827170A279100D45C55 /* AudioSessionIOS.mm in Sources */ = {isa = PBXBuildFile; fileRef = CDA79825170A279000D45C55 /* AudioSessionIOS.mm */; }; > 070F549817F12F6B00169E04 /* MediaStreamConstraintsValidationClient.h in Headers */ = {isa = PBXBuildFile; fileRef = 070F549717F12F6B00169E04 /* MediaStreamConstraintsValidationClient.h */; }; > 0719427F1D088F21002AA51D /* AVFoundationMIMETypeCache.mm in Sources */ = {isa = PBXBuildFile; fileRef = 07C8AD111D073D630087C5CE /* AVFoundationMIMETypeCache.mm */; }; > 071A9EC2168FBC43002629F9 /* TextTrackCueGeneric.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 071A9EC0168FB56C002629F9 /* TextTrackCueGeneric.cpp */; }; >@@ -4079,7 +4080,6 @@ > CDA29A321CC01A9500901CCF /* PlaybackSessionInterfaceAVKit.h in Headers */ = {isa = PBXBuildFile; fileRef = CDA29A2E1CBF73FC00901CCF /* PlaybackSessionInterfaceAVKit.h */; settings = {ATTRIBUTES = (Private, ); }; }; > CDA595932146DEC300A84185 /* HEVCUtilities.h in Headers */ = {isa = PBXBuildFile; fileRef = CDA595912146DEC300A84185 /* HEVCUtilities.h */; }; > CDA595982146DF7800A84185 /* HEVCUtilitiesCocoa.h in Headers */ = {isa = PBXBuildFile; fileRef = CDA595962146DF7800A84185 /* HEVCUtilitiesCocoa.h */; }; >- 
CDA79827170A279100D45C55 /* AudioSessionIOS.mm in Sources */ = {isa = PBXBuildFile; fileRef = CDA79825170A279000D45C55 /* AudioSessionIOS.mm */; }; > CDA7982A170A3D0000D45C55 /* AudioSession.h in Headers */ = {isa = PBXBuildFile; fileRef = CDA79821170A22DC00D45C55 /* AudioSession.h */; settings = {ATTRIBUTES = (Private, ); }; }; > CDA98E0B1603CD6000FEA3B1 /* LegacyCDM.cpp in Sources */ = {isa = PBXBuildFile; fileRef = CDA98E091603CD5900FEA3B1 /* LegacyCDM.cpp */; }; > CDAB6D2917C7DE6C00C60B34 /* MediaControlsHost.h in Headers */ = {isa = PBXBuildFile; fileRef = CDAB6D2717C7DE6C00C60B34 /* MediaControlsHost.h */; }; >diff --git a/Source/WebCore/platform/audio/ios/AudioSessionIOS.mm b/Source/WebCore/platform/audio/ios/AudioSessionIOS.mm >index 90924e6fa6f8205c9a9297a390536830ea9e9fd3..1b2f65efcda04159cad2f6f1639633d2ae3bd2b1 100644 >--- a/Source/WebCore/platform/audio/ios/AudioSessionIOS.mm >+++ b/Source/WebCore/platform/audio/ios/AudioSessionIOS.mm >@@ -34,29 +34,8 @@ > #import <pal/spi/mac/AVFoundationSPI.h> > #import <wtf/OSObjectPtr.h> > #import <wtf/RetainPtr.h> >-#import <wtf/SoftLinking.h> >- >-SOFT_LINK_FRAMEWORK(AVFoundation) >-SOFT_LINK_CLASS(AVFoundation, AVAudioSession) >- >-SOFT_LINK_CONSTANT(AVFoundation, AVAudioSessionCategoryAmbient, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVAudioSessionCategorySoloAmbient, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVAudioSessionCategoryPlayback, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVAudioSessionCategoryRecord, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVAudioSessionCategoryPlayAndRecord, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVAudioSessionCategoryAudioProcessing, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVAudioSessionModeDefault, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVAudioSessionModeVideoChat, NSString *) >- >-#define AVAudioSession getAVAudioSessionClass() >-#define AVAudioSessionCategoryAmbient getAVAudioSessionCategoryAmbient() >-#define AVAudioSessionCategorySoloAmbient getAVAudioSessionCategorySoloAmbient() >-#define AVAudioSessionCategoryPlayback getAVAudioSessionCategoryPlayback() >-#define AVAudioSessionCategoryRecord getAVAudioSessionCategoryRecord() >-#define AVAudioSessionCategoryPlayAndRecord getAVAudioSessionCategoryPlayAndRecord() >-#define AVAudioSessionCategoryAudioProcessing getAVAudioSessionCategoryAudioProcessing() >-#define AVAudioSessionModeDefault getAVAudioSessionModeDefault() >-#define AVAudioSessionModeVideoChat getAVAudioSessionModeVideoChat() >+ >+#import <pal/cocoa/AVFoundationSoftLink.h> > > namespace WebCore { > >@@ -145,7 +124,7 @@ void AudioSession::setCategory(CategoryType newCategory, RouteSharingPolicy poli > } > > NSError *error = nil; >- [[AVAudioSession sharedInstance] setCategory:categoryString mode:categoryMode routeSharingPolicy:static_cast<AVAudioSessionRouteSharingPolicy>(policy) options:options error:&error]; >+ [[PAL::getAVAudioSessionClass() sharedInstance] setCategory:categoryString mode:categoryMode routeSharingPolicy:static_cast<AVAudioSessionRouteSharingPolicy>(policy) options:options error:&error]; > #if !PLATFORM(IOS_FAMILY_SIMULATOR) && !PLATFORM(IOSMAC) > ASSERT(!error); > #endif >@@ -153,7 +132,7 @@ void AudioSession::setCategory(CategoryType newCategory, RouteSharingPolicy poli > > AudioSession::CategoryType AudioSession::category() const > { >- NSString *categoryString = [[AVAudioSession sharedInstance] category]; >+ NSString *categoryString = [[PAL::getAVAudioSessionClass() sharedInstance] category]; > if ([categoryString 
isEqual:AVAudioSessionCategoryAmbient]) > return AmbientSound; > if ([categoryString isEqual:AVAudioSessionCategorySoloAmbient]) >@@ -182,7 +161,7 @@ ALLOW_DEPRECATED_DECLARATIONS_END > #endif > static_assert(static_cast<size_t>(RouteSharingPolicy::Independent) == static_cast<size_t>(AVAudioSessionRouteSharingPolicyIndependent), "RouteSharingPolicy::Independent is not AVAudioSessionRouteSharingPolicyIndependent as expected"); > >- AVAudioSessionRouteSharingPolicy policy = [[AVAudioSession sharedInstance] routeSharingPolicy]; >+ AVAudioSessionRouteSharingPolicy policy = [[PAL::getAVAudioSessionClass() sharedInstance] routeSharingPolicy]; > ASSERT(static_cast<RouteSharingPolicy>(policy) <= RouteSharingPolicy::LongFormVideo); > return static_cast<RouteSharingPolicy>(policy); > } >@@ -190,7 +169,7 @@ ALLOW_DEPRECATED_DECLARATIONS_END > String AudioSession::routingContextUID() const > { > #if !PLATFORM(IOS_FAMILY_SIMULATOR) && !PLATFORM(IOSMAC) && !PLATFORM(WATCHOS) >- return [[AVAudioSession sharedInstance] routingContextUID]; >+ return [[PAL::getAVAudioSessionClass() sharedInstance] routingContextUID]; > #else > return emptyString(); > #endif >@@ -212,17 +191,17 @@ AudioSession::CategoryType AudioSession::categoryOverride() const > > float AudioSession::sampleRate() const > { >- return [[AVAudioSession sharedInstance] sampleRate]; >+ return [[PAL::getAVAudioSessionClass() sharedInstance] sampleRate]; > } > > size_t AudioSession::bufferSize() const > { >- return [[AVAudioSession sharedInstance] IOBufferDuration] * sampleRate(); >+ return [[PAL::getAVAudioSessionClass() sharedInstance] IOBufferDuration] * sampleRate(); > } > > size_t AudioSession::numberOfOutputChannels() const > { >- return [[AVAudioSession sharedInstance] outputNumberOfChannels]; >+ return [[PAL::getAVAudioSessionClass() sharedInstance] outputNumberOfChannels]; > } > > bool AudioSession::tryToSetActiveInternal(bool active) >@@ -237,14 +216,14 @@ bool AudioSession::tryToSetActiveInternal(bool active) > // returns, so do it synchronously on the same serial queue. 
> if (active) { > dispatch_sync(m_private->m_dispatchQueue.get(), ^{ >- [[AVAudioSession sharedInstance] setActive:YES withOptions:0 error:&error]; >+ [[PAL::getAVAudioSessionClass() sharedInstance] setActive:YES withOptions:0 error:&error]; > }); > > return !error; > } > > dispatch_async(m_private->m_dispatchQueue.get(), ^{ >- [[AVAudioSession sharedInstance] setActive:NO withOptions:AVAudioSessionSetActiveOptionNotifyOthersOnDeactivation error:&error]; >+ [[PAL::getAVAudioSessionClass() sharedInstance] setActive:NO withOptions:AVAudioSessionSetActiveOptionNotifyOthersOnDeactivation error:&error]; > }); > > return true; >@@ -252,14 +231,14 @@ bool AudioSession::tryToSetActiveInternal(bool active) > > size_t AudioSession::preferredBufferSize() const > { >- return [[AVAudioSession sharedInstance] preferredIOBufferDuration] * sampleRate(); >+ return [[PAL::getAVAudioSessionClass() sharedInstance] preferredIOBufferDuration] * sampleRate(); > } > > void AudioSession::setPreferredBufferSize(size_t bufferSize) > { > NSError *error = nil; > float duration = bufferSize / sampleRate(); >- [[AVAudioSession sharedInstance] setPreferredIOBufferDuration:duration error:&error]; >+ [[PAL::getAVAudioSessionClass() sharedInstance] setPreferredIOBufferDuration:duration error:&error]; > ASSERT(!error); > } > >diff --git a/Source/WebCore/platform/audio/ios/MediaSessionManagerIOS.mm b/Source/WebCore/platform/audio/ios/MediaSessionManagerIOS.mm >index d62593a3b55f786fab43a75657b1f84983cac44e..faf5b44db7d3a97da9d27e27da949be98332e449 100644 >--- a/Source/WebCore/platform/audio/ios/MediaSessionManagerIOS.mm >+++ b/Source/WebCore/platform/audio/ios/MediaSessionManagerIOS.mm >@@ -45,12 +45,12 @@ > #import <wtf/RAMSize.h> > #import <wtf/RetainPtr.h> > >-SOFT_LINK_FRAMEWORK(AVFoundation) >-SOFT_LINK_CLASS(AVFoundation, AVAudioSession) >-SOFT_LINK_CONSTANT(AVFoundation, AVAudioSessionInterruptionNotification, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVAudioSessionInterruptionTypeKey, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVAudioSessionInterruptionOptionKey, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVRouteDetectorMultipleRoutesDetectedDidChangeNotification, NSString *) >+#import <pal/cocoa/AVFoundationSoftLink.h> >+ >+WEBCORE_EXPORT NSString* WebUIApplicationWillResignActiveNotification = @"WebUIApplicationWillResignActiveNotification"; >+WEBCORE_EXPORT NSString* WebUIApplicationWillEnterForegroundNotification = @"WebUIApplicationWillEnterForegroundNotification"; >+WEBCORE_EXPORT NSString* WebUIApplicationDidBecomeActiveNotification = @"WebUIApplicationDidBecomeActiveNotification"; >+WEBCORE_EXPORT NSString* WebUIApplicationDidEnterBackgroundNotification = @"WebUIApplicationDidEnterBackgroundNotification"; > > #if HAVE(CELESTIAL) > SOFT_LINK_PRIVATE_FRAMEWORK_OPTIONAL(Celestial) >@@ -58,20 +58,6 @@ SOFT_LINK_CLASS_OPTIONAL(Celestial, AVSystemController) > SOFT_LINK_CONSTANT_MAY_FAIL(Celestial, AVSystemController_PIDToInheritApplicationStateFrom, NSString *) > #endif > >-#if HAVE(MEDIA_PLAYER) && !PLATFORM(WATCHOS) >-SOFT_LINK_CLASS(AVFoundation, AVRouteDetector) >-#endif >- >-#define AVAudioSession getAVAudioSessionClass() >-#define AVAudioSessionInterruptionNotification getAVAudioSessionInterruptionNotification() >-#define AVAudioSessionInterruptionTypeKey getAVAudioSessionInterruptionTypeKey() >-#define AVAudioSessionInterruptionOptionKey getAVAudioSessionInterruptionOptionKey() >- >-WEBCORE_EXPORT NSString* WebUIApplicationWillResignActiveNotification = 
@"WebUIApplicationWillResignActiveNotification"; >-WEBCORE_EXPORT NSString* WebUIApplicationWillEnterForegroundNotification = @"WebUIApplicationWillEnterForegroundNotification"; >-WEBCORE_EXPORT NSString* WebUIApplicationDidBecomeActiveNotification = @"WebUIApplicationDidBecomeActiveNotification"; >-WEBCORE_EXPORT NSString* WebUIApplicationDidEnterBackgroundNotification = @"WebUIApplicationDidEnterBackgroundNotification"; >- > using namespace WebCore; > > @interface WebMediaSessionHelper : NSObject { >@@ -217,7 +203,7 @@ void MediaSessionManageriOS::externalOutputDeviceAvailableDidChange() > _callback = callback; > > NSNotificationCenter *center = [NSNotificationCenter defaultCenter]; >- [center addObserver:self selector:@selector(interruption:) name:AVAudioSessionInterruptionNotification object:[AVAudioSession sharedInstance]]; >+ [center addObserver:self selector:@selector(interruption:) name:AVAudioSessionInterruptionNotification object:[PAL::getAVAudioSessionClass() sharedInstance]]; > > [center addObserver:self selector:@selector(applicationWillEnterForeground:) name:PAL::get_UIKit_UIApplicationWillEnterForegroundNotification() object:nil]; > [center addObserver:self selector:@selector(applicationWillEnterForeground:) name:WebUIApplicationWillEnterForegroundNotification object:nil]; >@@ -300,9 +286,9 @@ void MediaSessionManageriOS::externalOutputDeviceAvailableDidChange() > > if (protectedSelf->_callback) { > BEGIN_BLOCK_OBJC_EXCEPTIONS >- protectedSelf->_routeDetector = adoptNS([allocAVRouteDetectorInstance() init]); >+ protectedSelf->_routeDetector = adoptNS([PAL::allocAVRouteDetectorInstance() init]); > protectedSelf->_routeDetector.get().routeDetectionEnabled = protectedSelf->_monitoringAirPlayRoutes; >- [[NSNotificationCenter defaultCenter] addObserver:protectedSelf selector:@selector(wirelessRoutesAvailableDidChange:) name:getAVRouteDetectorMultipleRoutesDetectedDidChangeNotification() object:protectedSelf->_routeDetector.get()]; >+ [[NSNotificationCenter defaultCenter] addObserver:protectedSelf selector:@selector(wirelessRoutesAvailableDidChange:) name:AVRouteDetectorMultipleRoutesDetectedDidChangeNotification object:protectedSelf->_routeDetector.get()]; > END_BLOCK_OBJC_EXCEPTIONS > } > >diff --git a/Source/WebCore/platform/graphics/avfoundation/AVTrackPrivateAVFObjCImpl.mm b/Source/WebCore/platform/graphics/avfoundation/AVTrackPrivateAVFObjCImpl.mm >index eb6988840a1307157831fe38aab23f0e85a8be74..01502cdb0beabe3880bf33d5432f2f6526f36c7f 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/AVTrackPrivateAVFObjCImpl.mm >+++ b/Source/WebCore/platform/graphics/avfoundation/AVTrackPrivateAVFObjCImpl.mm >@@ -35,38 +35,14 @@ > #import <AVFoundation/AVPlayerItem.h> > #import <AVFoundation/AVPlayerItemTrack.h> > #import <objc/runtime.h> >-#import <wtf/SoftLinking.h> >+ >+#import <pal/cocoa/AVFoundationSoftLink.h> > > @class AVMediaSelectionOption; > @interface AVMediaSelectionOption (WebKitInternal) > - (id)optionID; > @end > >-SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >- >-SOFT_LINK_CLASS(AVFoundation, AVAssetTrack) >-SOFT_LINK_CLASS(AVFoundation, AVPlayerItem) >-SOFT_LINK_CLASS(AVFoundation, AVPlayerItemTrack) >-SOFT_LINK_CLASS(AVFoundation, AVMediaSelectionGroup) >-SOFT_LINK_CLASS(AVFoundation, AVMediaSelectionOption) >-SOFT_LINK_CLASS(AVFoundation, AVMetadataItem) >- >-SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVMediaCharacteristicIsMainProgramContent, NSString *) >-SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVMediaCharacteristicDescribesVideoForAccessibility, NSString *) 
>-SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVMediaCharacteristicIsAuxiliaryContent, NSString *) >-SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVMediaCharacteristicTranscribesSpokenDialogForAccessibility, NSString *) >-SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVMetadataCommonKeyTitle, NSString *) >-SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVMetadataKeySpaceCommon, NSString *) >- >-#define AVMetadataItem getAVMetadataItemClass() >- >-#define AVMediaCharacteristicIsMainProgramContent getAVMediaCharacteristicIsMainProgramContent() >-#define AVMediaCharacteristicDescribesVideoForAccessibility getAVMediaCharacteristicDescribesVideoForAccessibility() >-#define AVMediaCharacteristicIsAuxiliaryContent getAVMediaCharacteristicIsAuxiliaryContent() >-#define AVMediaCharacteristicTranscribesSpokenDialogForAccessibility getAVMediaCharacteristicTranscribesSpokenDialogForAccessibility() >-#define AVMetadataCommonKeyTitle getAVMetadataCommonKeyTitle() >-#define AVMetadataKeySpaceCommon getAVMetadataKeySpaceCommon() >- > namespace WebCore { > > AVTrackPrivateAVFObjCImpl::AVTrackPrivateAVFObjCImpl(AVPlayerItemTrack* track) >@@ -112,22 +88,22 @@ void AVTrackPrivateAVFObjCImpl::setEnabled(bool enabled) > AudioTrackPrivate::Kind AVTrackPrivateAVFObjCImpl::audioKind() const > { > if (m_assetTrack) { >- if (canLoadAVMediaCharacteristicIsAuxiliaryContent() && [m_assetTrack hasMediaCharacteristic:AVMediaCharacteristicIsAuxiliaryContent]) >+ if ([m_assetTrack hasMediaCharacteristic:AVMediaCharacteristicIsAuxiliaryContent]) > return AudioTrackPrivate::Alternative; >- if (canLoadAVMediaCharacteristicDescribesVideoForAccessibility() && [m_assetTrack hasMediaCharacteristic:AVMediaCharacteristicDescribesVideoForAccessibility]) >+ if ([m_assetTrack hasMediaCharacteristic:AVMediaCharacteristicDescribesVideoForAccessibility]) > return AudioTrackPrivate::Description; >- if (canLoadAVMediaCharacteristicIsMainProgramContent() && [m_assetTrack hasMediaCharacteristic:AVMediaCharacteristicIsMainProgramContent]) >+ if ([m_assetTrack hasMediaCharacteristic:AVMediaCharacteristicIsMainProgramContent]) > return AudioTrackPrivate::Main; > return AudioTrackPrivate::None; > } > > if (m_mediaSelectionOption) { > AVMediaSelectionOption *option = m_mediaSelectionOption->avMediaSelectionOption(); >- if (canLoadAVMediaCharacteristicIsAuxiliaryContent() && [option hasMediaCharacteristic:AVMediaCharacteristicIsAuxiliaryContent]) >+ if ([option hasMediaCharacteristic:AVMediaCharacteristicIsAuxiliaryContent]) > return AudioTrackPrivate::Alternative; >- if (canLoadAVMediaCharacteristicDescribesVideoForAccessibility() && [option hasMediaCharacteristic:AVMediaCharacteristicDescribesVideoForAccessibility]) >+ if ([option hasMediaCharacteristic:AVMediaCharacteristicDescribesVideoForAccessibility]) > return AudioTrackPrivate::Description; >- if (canLoadAVMediaCharacteristicIsMainProgramContent() && [option hasMediaCharacteristic:AVMediaCharacteristicIsMainProgramContent]) >+ if ([option hasMediaCharacteristic:AVMediaCharacteristicIsMainProgramContent]) > return AudioTrackPrivate::Main; > return AudioTrackPrivate::None; > } >@@ -139,26 +115,26 @@ AudioTrackPrivate::Kind AVTrackPrivateAVFObjCImpl::audioKind() const > VideoTrackPrivate::Kind AVTrackPrivateAVFObjCImpl::videoKind() const > { > if (m_assetTrack) { >- if (canLoadAVMediaCharacteristicDescribesVideoForAccessibility() && [m_assetTrack hasMediaCharacteristic:AVMediaCharacteristicDescribesVideoForAccessibility]) >+ if ([m_assetTrack 
hasMediaCharacteristic:AVMediaCharacteristicDescribesVideoForAccessibility]) > return VideoTrackPrivate::Sign; >- if (canLoadAVMediaCharacteristicTranscribesSpokenDialogForAccessibility() && [m_assetTrack hasMediaCharacteristic:AVMediaCharacteristicTranscribesSpokenDialogForAccessibility]) >+ if ([m_assetTrack hasMediaCharacteristic:AVMediaCharacteristicTranscribesSpokenDialogForAccessibility]) > return VideoTrackPrivate::Captions; >- if (canLoadAVMediaCharacteristicIsAuxiliaryContent() && [m_assetTrack hasMediaCharacteristic:AVMediaCharacteristicIsAuxiliaryContent]) >+ if ([m_assetTrack hasMediaCharacteristic:AVMediaCharacteristicIsAuxiliaryContent]) > return VideoTrackPrivate::Alternative; >- if (canLoadAVMediaCharacteristicIsMainProgramContent() && [m_assetTrack hasMediaCharacteristic:AVMediaCharacteristicIsMainProgramContent]) >+ if ([m_assetTrack hasMediaCharacteristic:AVMediaCharacteristicIsMainProgramContent]) > return VideoTrackPrivate::Main; > return VideoTrackPrivate::None; > } > > if (m_mediaSelectionOption) { > AVMediaSelectionOption *option = m_mediaSelectionOption->avMediaSelectionOption(); >- if (canLoadAVMediaCharacteristicDescribesVideoForAccessibility() && [option hasMediaCharacteristic:AVMediaCharacteristicDescribesVideoForAccessibility]) >+ if ([option hasMediaCharacteristic:AVMediaCharacteristicDescribesVideoForAccessibility]) > return VideoTrackPrivate::Sign; >- if (canLoadAVMediaCharacteristicTranscribesSpokenDialogForAccessibility() && [option hasMediaCharacteristic:AVMediaCharacteristicTranscribesSpokenDialogForAccessibility]) >+ if ([option hasMediaCharacteristic:AVMediaCharacteristicTranscribesSpokenDialogForAccessibility]) > return VideoTrackPrivate::Captions; >- if (canLoadAVMediaCharacteristicIsAuxiliaryContent() && [option hasMediaCharacteristic:AVMediaCharacteristicIsAuxiliaryContent]) >+ if ([option hasMediaCharacteristic:AVMediaCharacteristicIsAuxiliaryContent]) > return VideoTrackPrivate::Alternative; >- if (canLoadAVMediaCharacteristicIsMainProgramContent() && [option hasMediaCharacteristic:AVMediaCharacteristicIsMainProgramContent]) >+ if ([option hasMediaCharacteristic:AVMediaCharacteristicIsMainProgramContent]) > return VideoTrackPrivate::Main; > return VideoTrackPrivate::None; > } >@@ -189,9 +165,6 @@ AtomicString AVTrackPrivateAVFObjCImpl::id() const > > AtomicString AVTrackPrivateAVFObjCImpl::label() const > { >- if (!canLoadAVMetadataCommonKeyTitle() || !canLoadAVMetadataKeySpaceCommon()) >- return emptyAtom(); >- > NSArray *commonMetadata = nil; > if (m_assetTrack) > commonMetadata = [m_assetTrack commonMetadata]; >@@ -200,12 +173,12 @@ AtomicString AVTrackPrivateAVFObjCImpl::label() const > else > ASSERT_NOT_REACHED(); > >- NSArray *titles = [AVMetadataItem metadataItemsFromArray:commonMetadata withKey:AVMetadataCommonKeyTitle keySpace:AVMetadataKeySpaceCommon]; >+ NSArray *titles = [PAL::getAVMetadataItemClass() metadataItemsFromArray:commonMetadata withKey:AVMetadataCommonKeyTitle keySpace:AVMetadataKeySpaceCommon]; > if (![titles count]) > return emptyAtom(); > > // If possible, return a title in one of the user's preferred languages. 
>- NSArray *titlesForPreferredLanguages = [AVMetadataItem metadataItemsFromArray:titles filteredAndSortedAccordingToPreferredLanguages:[NSLocale preferredLanguages]]; >+ NSArray *titlesForPreferredLanguages = [PAL::getAVMetadataItemClass() metadataItemsFromArray:titles filteredAndSortedAccordingToPreferredLanguages:[NSLocale preferredLanguages]]; > if ([titlesForPreferredLanguages count]) > return [[titlesForPreferredLanguages objectAtIndex:0] stringValue]; > return [[titles objectAtIndex:0] stringValue]; >diff --git a/Source/WebCore/platform/graphics/avfoundation/AudioSourceProviderAVFObjC.mm b/Source/WebCore/platform/graphics/avfoundation/AudioSourceProviderAVFObjC.mm >index 46090561811ed1c4482721015e7ec41b2eb90f8d..a576b6b71149421932c1433c950de5897a4a5b07 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/AudioSourceProviderAVFObjC.mm >+++ b/Source/WebCore/platform/graphics/avfoundation/AudioSourceProviderAVFObjC.mm >@@ -48,15 +48,11 @@ > #endif > > #import <pal/cf/CoreMediaSoftLink.h> >+#import <pal/cocoa/AVFoundationSoftLink.h> > >-SOFT_LINK_FRAMEWORK(AVFoundation) > SOFT_LINK_FRAMEWORK(MediaToolbox) > SOFT_LINK_FRAMEWORK(AudioToolbox) > >-SOFT_LINK_CLASS(AVFoundation, AVPlayerItem) >-SOFT_LINK_CLASS(AVFoundation, AVMutableAudioMix) >-SOFT_LINK_CLASS(AVFoundation, AVMutableAudioMixInputParameters) >- > SOFT_LINK(AudioToolbox, AudioConverterConvertComplexBuffer, OSStatus, (AudioConverterRef inAudioConverter, UInt32 inNumberPCMFrames, const AudioBufferList* inInputData, AudioBufferList* outOutputData), (inAudioConverter, inNumberPCMFrames, inInputData, outOutputData)) > SOFT_LINK(AudioToolbox, AudioConverterNew, OSStatus, (const AudioStreamBasicDescription* inSourceFormat, const AudioStreamBasicDescription* inDestinationFormat, AudioConverterRef* outAudioConverter), (inSourceFormat, inDestinationFormat, outAudioConverter)) > >@@ -207,7 +203,7 @@ void AudioSourceProviderAVFObjC::createMix() > ASSERT(m_avPlayerItem); > ASSERT(m_client); > >- m_avAudioMix = adoptNS([allocAVMutableAudioMixInstance() init]); >+ m_avAudioMix = adoptNS([PAL::allocAVMutableAudioMixInstance() init]); > > MTAudioProcessingTapCallbacks callbacks = { > 0, >@@ -224,7 +220,7 @@ void AudioSourceProviderAVFObjC::createMix() > ASSERT(tap); > ASSERT(m_tap == tap); > >- RetainPtr<AVMutableAudioMixInputParameters> parameters = adoptNS([allocAVMutableAudioMixInputParametersInstance() init]); >+ RetainPtr<AVMutableAudioMixInputParameters> parameters = adoptNS([PAL::allocAVMutableAudioMixInputParametersInstance() init]); > [parameters setAudioTapProcessor:m_tap.get()]; > > CMPersistentTrackID trackID = m_avAssetTrack.get().trackID; >diff --git a/Source/WebCore/platform/graphics/avfoundation/MediaPlaybackTargetMac.mm b/Source/WebCore/platform/graphics/avfoundation/MediaPlaybackTargetMac.mm >index 5d5e1473d09450f3d629d6fd989031bd35f48c86..8e6ed7c8b2a1261ed74b939084752c93dd1e7f43 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/MediaPlaybackTargetMac.mm >+++ b/Source/WebCore/platform/graphics/avfoundation/MediaPlaybackTargetMac.mm >@@ -30,10 +30,8 @@ > > #import <objc/runtime.h> > #import <pal/spi/mac/AVFoundationSPI.h> >-#import <wtf/SoftLinking.h> > >-SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >-SOFT_LINK_CLASS(AVFoundation, AVOutputContext) >+#import <pal/cocoa/AVFoundationSoftLink.h> > > namespace WebCore { > >diff --git a/Source/WebCore/platform/graphics/avfoundation/MediaSelectionGroupAVFObjC.mm b/Source/WebCore/platform/graphics/avfoundation/MediaSelectionGroupAVFObjC.mm >index 
f782d8b7e0fd897d02baa5ad1f77331abb9fabcc..83b642172e19e67be9eabd9a6f99abc349048f20 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/MediaSelectionGroupAVFObjC.mm >+++ b/Source/WebCore/platform/graphics/avfoundation/MediaSelectionGroupAVFObjC.mm >@@ -33,19 +33,15 @@ > #import <AVFoundation/AVPlayerItem.h> > #import <objc/runtime.h> > #import <wtf/Language.h> >-#import <wtf/SoftLinking.h> > #import <wtf/text/WTFString.h> > >-SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >- >-SOFT_LINK_CLASS(AVFoundation, AVMediaSelectionGroup) >-SOFT_LINK_CLASS(AVFoundation, AVMediaSelectionOption) >- > #if HAVE(MEDIA_ACCESSIBILITY_FRAMEWORK) > #include <MediaAccessibility/MediaAccessibility.h> > #include "MediaAccessibilitySoftLink.h" > #endif > >+#import <pal/cocoa/AVFoundationSoftLink.h> >+ > namespace WebCore { > > Ref<MediaSelectionOptionAVFObjC> MediaSelectionOptionAVFObjC::create(MediaSelectionGroupAVFObjC& group, AVMediaSelectionOption *option) >@@ -106,7 +102,7 @@ MediaSelectionGroupAVFObjC::~MediaSelectionGroupAVFObjC() > > void MediaSelectionGroupAVFObjC::updateOptions(const Vector<String>& characteristics) > { >- RetainPtr<NSSet> newAVOptions = adoptNS([[NSSet alloc] initWithArray:[getAVMediaSelectionGroupClass() playableMediaSelectionOptionsFromArray:[m_mediaSelectionGroup options]]]); >+ RetainPtr<NSSet> newAVOptions = adoptNS([[NSSet alloc] initWithArray:[PAL::getAVMediaSelectionGroupClass() playableMediaSelectionOptionsFromArray:[m_mediaSelectionGroup options]]]); > RetainPtr<NSMutableSet> oldAVOptions = adoptNS([[NSMutableSet alloc] initWithCapacity:m_options.size()]); > for (auto& avOption : m_options.keys()) > [oldAVOptions addObject:(__bridge AVMediaSelectionOption *)avOption]; >@@ -139,7 +135,7 @@ void MediaSelectionGroupAVFObjC::updateOptions(const Vector<String>& characteris > RetainPtr<NSMutableArray> nsLanguages = adoptNS([[NSMutableArray alloc] initWithCapacity:userPreferredLanguages().size()]); > for (auto& language : userPreferredLanguages()) > [nsLanguages addObject:(NSString*)language]; >- NSArray* filteredOptions = [getAVMediaSelectionGroupClass() mediaSelectionOptionsFromArray:[m_mediaSelectionGroup options] filteredAndSortedAccordingToPreferredLanguages:nsLanguages.get()]; >+ NSArray* filteredOptions = [PAL::getAVMediaSelectionGroupClass() mediaSelectionOptionsFromArray:[m_mediaSelectionGroup options] filteredAndSortedAccordingToPreferredLanguages:nsLanguages.get()]; > > if (![filteredOptions count] && characteristics.isEmpty()) > return; >@@ -152,7 +148,7 @@ void MediaSelectionGroupAVFObjC::updateOptions(const Vector<String>& characteris > for (auto& characteristic : characteristics) > [nsCharacteristics addObject:(NSString *)characteristic]; > >- NSArray* optionsWithCharacteristics = [getAVMediaSelectionGroupClass() mediaSelectionOptionsFromArray:filteredOptions withMediaCharacteristics:nsCharacteristics.get()]; >+ NSArray* optionsWithCharacteristics = [PAL::getAVMediaSelectionGroupClass() mediaSelectionOptionsFromArray:filteredOptions withMediaCharacteristics:nsCharacteristics.get()]; > if (optionsWithCharacteristics && [optionsWithCharacteristics count]) > filteredOptions = optionsWithCharacteristics; > >diff --git a/Source/WebCore/platform/graphics/avfoundation/objc/AVFoundationMIMETypeCache.mm b/Source/WebCore/platform/graphics/avfoundation/objc/AVFoundationMIMETypeCache.mm >index 0b32f79f5e93368628015670bbd4475431ea12ad..5a7bf691dda95d7e912492548df93f61bae49643 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/objc/AVFoundationMIMETypeCache.mm >+++ 
b/Source/WebCore/platform/graphics/avfoundation/objc/AVFoundationMIMETypeCache.mm >@@ -29,18 +29,14 @@ > #if PLATFORM(COCOA) > > #import "ContentType.h" >-#import <AVFoundation/AVAsset.h> > #import <wtf/HashSet.h> > > #import <pal/cf/CoreMediaSoftLink.h> >+#import <pal/cocoa/AVFoundationSoftLink.h> > >-#if ENABLE(VIDEO) && USE(AVFOUNDATION) > #if !PLATFORM(IOSMAC) > SOFT_LINK_FRAMEWORK_OPTIONAL_PREFLIGHT(AVFoundation) > #endif >-SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >-SOFT_LINK_CLASS(AVFoundation, AVURLAsset) >-#endif > > namespace WebCore { > >@@ -85,7 +81,7 @@ bool AVFoundationMIMETypeCache::canDecodeType(const String& mimeType) > return false; > > #if ENABLE(VIDEO) && USE(AVFOUNDATION) >- return [getAVURLAssetClass() isPlayableExtendedMIMEType:mimeType]; >+ return [PAL::getAVURLAssetClass() isPlayableExtendedMIMEType:mimeType]; > #endif > > return false; >@@ -97,7 +93,7 @@ bool AVFoundationMIMETypeCache::isAvailable() const > #if PLATFORM(IOSMAC) > // FIXME: This should be using AVFoundationLibraryIsAvailable() instead, but doing so causes soft-linking > // to subsequently fail on certain symbols. See <rdar://problem/42224780> for more details. >- return AVFoundationLibrary(); >+ return PAL::AVFoundationLibrary(); > #else > return AVFoundationLibraryIsAvailable(); > #endif >@@ -113,10 +109,10 @@ void AVFoundationMIMETypeCache::loadMIMETypes() > #if ENABLE(VIDEO) && USE(AVFOUNDATION) > static std::once_flag onceFlag; > std::call_once(onceFlag, [this] { >- if (!AVFoundationLibrary()) >+ if (!PAL::AVFoundationLibrary()) > return; > >- for (NSString* type in [getAVURLAssetClass() audiovisualMIMETypes]) >+ for (NSString* type in [PAL::getAVURLAssetClass() audiovisualMIMETypes]) > m_cache->add(type); > > if (m_cacheTypeCallback) >diff --git a/Source/WebCore/platform/graphics/avfoundation/objc/CDMInstanceFairPlayStreamingAVFObjC.mm b/Source/WebCore/platform/graphics/avfoundation/objc/CDMInstanceFairPlayStreamingAVFObjC.mm >index 620014d90437b4dd25fc211ac56b2ddbe4640259..43ef2cd70004902653e431dd2b08ac342abe7ebd 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/objc/CDMInstanceFairPlayStreamingAVFObjC.mm >+++ b/Source/WebCore/platform/graphics/avfoundation/objc/CDMInstanceFairPlayStreamingAVFObjC.mm >@@ -39,18 +39,9 @@ > #import <pal/spi/mac/AVFoundationSPI.h> > #import <wtf/Algorithms.h> > #import <wtf/FileSystem.h> >-#import <wtf/SoftLinking.h> > #import <wtf/text/StringHash.h> > >-SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >-SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVContentKeySession); >-SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVContentKeyResponse); >-SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVURLAsset); >-SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVContentKeySystemFairPlayStreaming, NSString*) >- >-#if PLATFORM(IOS_FAMILY) >-SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVPersistableContentKeyRequest); >-#endif >+#import <pal/cocoa/AVFoundationSoftLink.h> > > static const NSString *PlaybackSessionIdKey = @"PlaybackSessionID"; > >@@ -138,13 +129,13 @@ namespace WebCore { > > bool CDMInstanceFairPlayStreamingAVFObjC::supportsPersistableState() > { >- return [getAVContentKeySessionClass() respondsToSelector:@selector(pendingExpiredSessionReportsWithAppIdentifier:storageDirectoryAtURL:)]; >+ return [PAL::getAVContentKeySessionClass() respondsToSelector:@selector(pendingExpiredSessionReportsWithAppIdentifier:storageDirectoryAtURL:)]; > } > > bool CDMInstanceFairPlayStreamingAVFObjC::supportsPersistentKeys() > { > #if PLATFORM(IOS_FAMILY) >- return getAVPersistableContentKeyRequestClass(); >+ 
return PAL::getAVPersistableContentKeyRequestClass(); > #else > return false; > #endif >@@ -152,7 +143,7 @@ bool CDMInstanceFairPlayStreamingAVFObjC::supportsPersistentKeys() > > bool CDMInstanceFairPlayStreamingAVFObjC::supportsMediaCapability(const CDMMediaCapability& capability) > { >- if (![getAVURLAssetClass() isPlayableExtendedMIMEType:capability.contentType]) >+ if (![PAL::getAVURLAssetClass() isPlayableExtendedMIMEType:capability.contentType]) > return false; > > // FairPlay only supports 'cbcs' encryption: >@@ -177,7 +168,7 @@ CDMInstance::SuccessValue CDMInstanceFairPlayStreamingAVFObjC::initializeWithCon > if (configuration.sessionTypes.contains(CDMSessionType::PersistentLicense) && !supportsPersistentKeys()) > return Failed; > >- if (!canLoadAVContentKeySystemFairPlayStreaming()) >+ if (!PAL::canLoad_AVFoundation_AVContentKeySystemFairPlayStreaming()) > return Failed; > > return Succeeded; >@@ -370,7 +361,7 @@ void CDMInstanceSessionFairPlayStreamingAVFObjC::updateLicense(const String&, Li > } > > RetainPtr<NSData> appIdentifier = certificate->createNSData(); >- [getAVContentKeySessionClass() removePendingExpiredSessionReports:expiredSessions.get() withAppIdentifier:appIdentifier.get() storageDirectoryAtURL:storageURL]; >+ [PAL::getAVContentKeySessionClass() removePendingExpiredSessionReports:expiredSessions.get() withAppIdentifier:appIdentifier.get() storageDirectoryAtURL:storageURL]; > callback(false, { }, WTF::nullopt, WTF::nullopt, Succeeded); > return; > } >@@ -391,7 +382,7 @@ void CDMInstanceSessionFairPlayStreamingAVFObjC::updateLicense(const String&, Li > return; > } > >- [m_currentRequest processContentKeyResponse:[getAVContentKeyResponseClass() contentKeyResponseWithFairPlayStreamingKeyResponseData:responseData.createNSData().get()]]; >+ [m_currentRequest processContentKeyResponse:[PAL::getAVContentKeyResponseClass() contentKeyResponseWithFairPlayStreamingKeyResponseData:responseData.createNSData().get()]]; > > // FIXME(rdar://problem/35592277): stash the callback and call it once AVContentKeyResponse supports a success callback. 
> struct objc_method_description method = protocol_getMethodDescription(@protocol(AVContentKeySessionDelegate), @selector(contentKeySession:contentKeyRequestDidSucceed:), NO, YES); >@@ -423,7 +414,7 @@ void CDMInstanceSessionFairPlayStreamingAVFObjC::loadSession(LicenseType license > > RetainPtr<NSData> appIdentifier = certificate->createNSData(); > KeyStatusVector changedKeys; >- for (NSData* expiredSessionData in [getAVContentKeySessionClass() pendingExpiredSessionReportsWithAppIdentifier:appIdentifier.get() storageDirectoryAtURL:storageURL]) { >+ for (NSData* expiredSessionData in [PAL::getAVContentKeySessionClass() pendingExpiredSessionReportsWithAppIdentifier:appIdentifier.get() storageDirectoryAtURL:storageURL]) { > NSDictionary *expiredSession = [NSPropertyListSerialization propertyListWithData:expiredSessionData options:kCFPropertyListImmutable format:nullptr error:nullptr]; > NSString *playbackSessionIdValue = (NSString *)[expiredSession objectForKey:PlaybackSessionIdKey]; > if (![playbackSessionIdValue isKindOfClass:[NSString class]]) >@@ -482,7 +473,7 @@ void CDMInstanceSessionFairPlayStreamingAVFObjC::removeSessionData(const String& > RetainPtr<NSData> appIdentifier = certificate->createNSData(); > RetainPtr<NSMutableArray> expiredSessionsArray = adoptNS([[NSMutableArray alloc] init]); > KeyStatusVector changedKeys; >- for (NSData* expiredSessionData in [getAVContentKeySessionClass() pendingExpiredSessionReportsWithAppIdentifier:appIdentifier.get() storageDirectoryAtURL:storageURL]) { >+ for (NSData* expiredSessionData in [PAL::getAVContentKeySessionClass() pendingExpiredSessionReportsWithAppIdentifier:appIdentifier.get() storageDirectoryAtURL:storageURL]) { > NSDictionary *expiredSession = [NSPropertyListSerialization propertyListWithData:expiredSessionData options:kCFPropertyListImmutable format:nullptr error:nullptr]; > NSString *playbackSessionIdValue = (NSString *)[expiredSession objectForKey:PlaybackSessionIdKey]; > if (![playbackSessionIdValue isKindOfClass:[NSString class]]) >@@ -715,9 +706,9 @@ AVContentKeySession* CDMInstanceSessionFairPlayStreamingAVFObjC::ensureSession() > > auto storageURL = m_instance->storageURL(); > if (!m_instance->persistentStateAllowed() || !storageURL) >- m_session = [getAVContentKeySessionClass() contentKeySessionWithKeySystem:getAVContentKeySystemFairPlayStreaming()]; >+ m_session = [PAL::getAVContentKeySessionClass() contentKeySessionWithKeySystem:AVContentKeySystemFairPlayStreaming]; > else >- m_session = [getAVContentKeySessionClass() contentKeySessionWithKeySystem:getAVContentKeySystemFairPlayStreaming() storageDirectoryAtURL:storageURL]; >+ m_session = [PAL::getAVContentKeySessionClass() contentKeySessionWithKeySystem:AVContentKeySystemFairPlayStreaming storageDirectoryAtURL:storageURL]; > > if (!m_session) > return nullptr; >diff --git a/Source/WebCore/platform/graphics/avfoundation/objc/CDMSessionAVContentKeySession.mm b/Source/WebCore/platform/graphics/avfoundation/objc/CDMSessionAVContentKeySession.mm >index 9839f353f82c1ac7a2b58cf765f0cdc7a6420eb8..1f468797700c05f39697c528b86d6471e83c313f 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/objc/CDMSessionAVContentKeySession.mm >+++ b/Source/WebCore/platform/graphics/avfoundation/objc/CDMSessionAVContentKeySession.mm >@@ -40,14 +40,8 @@ > #import <objc/objc-runtime.h> > #import <pal/spi/mac/AVFoundationSPI.h> > #import <wtf/FileSystem.h> >-#import <wtf/SoftLinking.h> > >-SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >-SOFT_LINK_CLASS(AVFoundation, AVStreamDataParser); 
>-SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVContentKeySession); >-SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVContentKeyResponse); >-SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVContentKeyRequestProtocolVersionsKey, NSString *) >-SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVContentKeySystemFairPlayStreaming, NSString *) >+#import <pal/cocoa/AVFoundationSoftLink.h> > > typedef NSString *AVContentKeySystem; > >@@ -127,7 +121,7 @@ CDMSessionAVContentKeySession::~CDMSessionAVContentKeySession() > > bool CDMSessionAVContentKeySession::isAvailable() > { >- return getAVContentKeySessionClass(); >+ return PAL::getAVContentKeySessionClass(); > } > > RefPtr<Uint8Array> CDMSessionAVContentKeySession::generateKeyRequest(const String& mimeType, Uint8Array* initData, String& destinationURL, unsigned short& errorCode, uint32_t& systemCode) >@@ -173,7 +167,7 @@ void CDMSessionAVContentKeySession::releaseKeys() > if (!m_certificate) > return; > >- if (![getAVContentKeySessionClass() respondsToSelector:@selector(pendingExpiredSessionReportsWithAppIdentifier:storageDirectoryAtURL:)]) >+ if (![PAL::getAVContentKeySessionClass() respondsToSelector:@selector(pendingExpiredSessionReportsWithAppIdentifier:storageDirectoryAtURL:)]) > return; > > auto storagePath = this->storagePath(); >@@ -181,7 +175,7 @@ void CDMSessionAVContentKeySession::releaseKeys() > return; > > RetainPtr<NSData> certificateData = adoptNS([[NSData alloc] initWithBytes:m_certificate->data() length:m_certificate->length()]); >- NSArray* expiredSessions = [getAVContentKeySessionClass() pendingExpiredSessionReportsWithAppIdentifier:certificateData.get() storageDirectoryAtURL:[NSURL fileURLWithPath:storagePath]]; >+ NSArray* expiredSessions = [PAL::getAVContentKeySessionClass() pendingExpiredSessionReportsWithAppIdentifier:certificateData.get() storageDirectoryAtURL:[NSURL fileURLWithPath:storagePath]]; > for (NSData* expiredSessionData in expiredSessions) { > NSDictionary *expiredSession = [NSPropertyListSerialization propertyListWithData:expiredSessionData options:kCFPropertyListImmutable format:nullptr error:nullptr]; > NSString *playbackSessionIdValue = (NSString *)[expiredSession objectForKey:PlaybackSessionIdKey]; >@@ -229,8 +223,8 @@ bool CDMSessionAVContentKeySession::update(Uint8Array* key, RefPtr<Uint8Array>& > > RetainPtr<NSData> certificateData = adoptNS([[NSData alloc] initWithBytes:m_certificate->data() length:m_certificate->length()]); > >- if ([getAVContentKeySessionClass() respondsToSelector:@selector(removePendingExpiredSessionReports:withAppIdentifier:storageDirectoryAtURL:)]) >- [getAVContentKeySessionClass() removePendingExpiredSessionReports:@[m_expiredSession.get()] withAppIdentifier:certificateData.get() storageDirectoryAtURL:[NSURL fileURLWithPath:storagePath]]; >+ if ([PAL::getAVContentKeySessionClass() respondsToSelector:@selector(removePendingExpiredSessionReports:withAppIdentifier:storageDirectoryAtURL:)]) >+ [PAL::getAVContentKeySessionClass() removePendingExpiredSessionReports:@[m_expiredSession.get()] withAppIdentifier:certificateData.get() storageDirectoryAtURL:[NSURL fileURLWithPath:storagePath]]; > m_expiredSession = nullptr; > return true; > } >@@ -284,7 +278,7 @@ bool CDMSessionAVContentKeySession::update(Uint8Array* key, RefPtr<Uint8Array>& > RetainPtr<NSData> certificateData = adoptNS([[NSData alloc] initWithBytes:m_certificate->data() length:m_certificate->length()]); > > RetainPtr<NSDictionary> options; >- if (!m_protocolVersions.isEmpty() && canLoadAVContentKeyRequestProtocolVersionsKey()) { >+ if 
(!m_protocolVersions.isEmpty() && PAL::canLoad_AVFoundation_AVContentKeyRequestProtocolVersionsKey()) { > RetainPtr<NSMutableArray> protocolVersionsOption = adoptNS([[NSMutableArray alloc] init]); > for (auto& version : m_protocolVersions) { > if (!version) >@@ -292,7 +286,7 @@ bool CDMSessionAVContentKeySession::update(Uint8Array* key, RefPtr<Uint8Array>& > [protocolVersionsOption addObject:@(version)]; > } > >- options = @{ getAVContentKeyRequestProtocolVersionsKey(): protocolVersionsOption.get() }; >+ options = @{ AVContentKeyRequestProtocolVersionsKey: protocolVersionsOption.get() }; > } > > errorCode = MediaPlayer::NoError; >@@ -316,8 +310,8 @@ bool CDMSessionAVContentKeySession::update(Uint8Array* key, RefPtr<Uint8Array>& > systemCode = 0; > RetainPtr<NSData> keyData = adoptNS([[NSData alloc] initWithBytes:key->data() length:key->length()]); > >- if ([m_keyRequest respondsToSelector:@selector(processContentKeyResponse:)] && [getAVContentKeyResponseClass() respondsToSelector:@selector(contentKeyResponseWithFairPlayStreamingKeyResponseData:)]) >- [m_keyRequest processContentKeyResponse:[getAVContentKeyResponseClass() contentKeyResponseWithFairPlayStreamingKeyResponseData:keyData.get()]]; >+ if ([m_keyRequest respondsToSelector:@selector(processContentKeyResponse:)] && [PAL::getAVContentKeyResponseClass() respondsToSelector:@selector(contentKeyResponseWithFairPlayStreamingKeyResponseData:)]) >+ [m_keyRequest processContentKeyResponse:[PAL::getAVContentKeyResponseClass() contentKeyResponseWithFairPlayStreamingKeyResponseData:keyData.get()]]; > else > [m_keyRequest processContentKeyResponseData:keyData.get()]; > >@@ -346,13 +340,13 @@ RefPtr<Uint8Array> CDMSessionAVContentKeySession::generateKeyReleaseMessage(unsi > RetainPtr<NSData> certificateData = adoptNS([[NSData alloc] initWithBytes:m_certificate->data() length:m_certificate->length()]); > > String storagePath = this->storagePath(); >- if (storagePath.isEmpty() || ![getAVContentKeySessionClass() respondsToSelector:@selector(pendingExpiredSessionReportsWithAppIdentifier:storageDirectoryAtURL:)]) { >+ if (storagePath.isEmpty() || ![PAL::getAVContentKeySessionClass() respondsToSelector:@selector(pendingExpiredSessionReportsWithAppIdentifier:storageDirectoryAtURL:)]) { > errorCode = MediaPlayer::KeySystemNotSupported; > systemCode = '!mor'; > return nullptr; > } > >- NSArray* expiredSessions = [getAVContentKeySessionClass() pendingExpiredSessionReportsWithAppIdentifier:certificateData.get() storageDirectoryAtURL:[NSURL fileURLWithPath:storagePath]]; >+ NSArray* expiredSessions = [PAL::getAVContentKeySessionClass() pendingExpiredSessionReportsWithAppIdentifier:certificateData.get() storageDirectoryAtURL:[NSURL fileURLWithPath:storagePath]]; > if (![expiredSessions count]) { > LOG(Media, "CDMSessionAVContentKeySession::generateKeyReleaseMessage(%p) - no expired sessions found", this); > >@@ -381,10 +375,10 @@ AVContentKeySession* CDMSessionAVContentKeySession::contentKeySession() > > String storagePath = this->storagePath(); > if (storagePath.isEmpty()) { >- if (![getAVContentKeySessionClass() respondsToSelector:@selector(contentKeySessionWithKeySystem:)] || !canLoadAVContentKeySystemFairPlayStreaming()) >+ if (![PAL::getAVContentKeySessionClass() respondsToSelector:@selector(contentKeySessionWithKeySystem:)] || !PAL::canLoad_AVFoundation_AVContentKeySystemFairPlayStreaming()) > return nil; > >- m_contentKeySession = [getAVContentKeySessionClass() contentKeySessionWithKeySystem:getAVContentKeySystemFairPlayStreaming()]; >+ 
m_contentKeySession = [PAL::getAVContentKeySessionClass() contentKeySessionWithKeySystem:AVContentKeySystemFairPlayStreaming]; > } else { > String storageDirectory = FileSystem::directoryName(storagePath); > >@@ -394,10 +388,10 @@ AVContentKeySession* CDMSessionAVContentKeySession::contentKeySession() > } > > auto url = [NSURL fileURLWithPath:storagePath]; >- if ([getAVContentKeySessionClass() respondsToSelector:@selector(contentKeySessionWithKeySystem:storageDirectoryAtURL:)] && canLoadAVContentKeySystemFairPlayStreaming()) >- m_contentKeySession = [getAVContentKeySessionClass() contentKeySessionWithKeySystem:getAVContentKeySystemFairPlayStreaming() storageDirectoryAtURL:url]; >+ if ([PAL::getAVContentKeySessionClass() respondsToSelector:@selector(contentKeySessionWithKeySystem:storageDirectoryAtURL:)] && PAL::canLoad_AVFoundation_AVContentKeySystemFairPlayStreaming()) >+ m_contentKeySession = [PAL::getAVContentKeySessionClass() contentKeySessionWithKeySystem:AVContentKeySystemFairPlayStreaming storageDirectoryAtURL:url]; > else >- m_contentKeySession = adoptNS([allocAVContentKeySessionInstance() initWithStorageDirectoryAtURL:url]); >+ m_contentKeySession = adoptNS([PAL::allocAVContentKeySessionInstance() initWithStorageDirectoryAtURL:url]); > } > > m_contentKeySession.get().delegate = m_contentKeySessionDelegate.get(); >diff --git a/Source/WebCore/platform/graphics/avfoundation/objc/CDMSessionAVFoundationObjC.mm b/Source/WebCore/platform/graphics/avfoundation/objc/CDMSessionAVFoundationObjC.mm >index ca19bae08fec5fccba57e49aa0f4eaf8713e5278..c1efd51c95c285dd9e3bf971ba9416276a40629e 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/objc/CDMSessionAVFoundationObjC.mm >+++ b/Source/WebCore/platform/graphics/avfoundation/objc/CDMSessionAVFoundationObjC.mm >@@ -41,12 +41,6 @@ > #import <wtf/SoftLinking.h> > #import <wtf/UUID.h> > >-SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >-SOFT_LINK_CLASS(AVFoundation, AVURLAsset) >-SOFT_LINK_CLASS(AVFoundation, AVAssetResourceLoadingRequest) >-#define AVURLAsset getAVURLAssetClass() >-#define AVAssetResourceLoadingRequest getAVAssetResourceLoadingRequest() >- > namespace WebCore { > > CDMSessionAVFoundationObjC::CDMSessionAVFoundationObjC(MediaPlayerPrivateAVFoundationObjC* parent, LegacyCDMSessionClient* client) >diff --git a/Source/WebCore/platform/graphics/avfoundation/objc/CDMSessionAVStreamSession.mm b/Source/WebCore/platform/graphics/avfoundation/objc/CDMSessionAVStreamSession.mm >index 75587c629f2f392bbe6cc2bb70dc8444e40b4933..99175a75e2ac39fa57487f077b61a84a12880ce1 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/objc/CDMSessionAVStreamSession.mm >+++ b/Source/WebCore/platform/graphics/avfoundation/objc/CDMSessionAVStreamSession.mm >@@ -40,14 +40,9 @@ > #import <objc/objc-runtime.h> > #import <pal/spi/mac/AVFoundationSPI.h> > #import <wtf/FileSystem.h> >-#import <wtf/SoftLinking.h> > #import <wtf/UUID.h> > >-SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >-SOFT_LINK_CLASS(AVFoundation, AVStreamDataParser); >-SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVStreamSession); >-SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVStreamDataParserContentKeyRequestProtocolVersionsKey, NSString *) >-SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVStreamSessionContentProtectionSessionIdentifierChangedNotification, NSString *) >+#import <pal/cocoa/AVFoundationSoftLink.h> > > @interface AVStreamSession : NSObject > - (void)addStreamDataParser:(AVStreamDataParser *)streamDataParser; >@@ -142,11 +137,11 @@ void CDMSessionAVStreamSession::releaseKeys() > return; 
> > String storagePath = this->storagePath(); >- if (storagePath.isEmpty() || ![getAVStreamSessionClass() respondsToSelector:@selector(pendingExpiredSessionReportsWithAppIdentifier:storageDirectoryAtURL:)]) >+ if (storagePath.isEmpty() || ![PAL::getAVStreamSessionClass() respondsToSelector:@selector(pendingExpiredSessionReportsWithAppIdentifier:storageDirectoryAtURL:)]) > return; > > RetainPtr<NSData> certificateData = adoptNS([[NSData alloc] initWithBytes:m_certificate->data() length:m_certificate->length()]); >- NSArray* expiredSessions = [getAVStreamSessionClass() pendingExpiredSessionReportsWithAppIdentifier:certificateData.get() storageDirectoryAtURL:[NSURL fileURLWithPath:storagePath]]; >+ NSArray* expiredSessions = [PAL::getAVStreamSessionClass() pendingExpiredSessionReportsWithAppIdentifier:certificateData.get() storageDirectoryAtURL:[NSURL fileURLWithPath:storagePath]]; > for (NSData* expiredSessionData in expiredSessions) { > NSDictionary *expiredSession = [NSPropertyListSerialization propertyListWithData:expiredSessionData options:kCFPropertyListImmutable format:nullptr error:nullptr]; > NSString *playbackSessionIdValue = (NSString *)[expiredSession objectForKey:PlaybackSessionIdKey]; >@@ -200,8 +195,8 @@ bool CDMSessionAVStreamSession::update(Uint8Array* key, RefPtr<Uint8Array>& next > > IGNORE_WARNINGS_BEGIN("objc-literal-conversion") > String storagePath = this->storagePath(); >- if (!storagePath.isEmpty() && [getAVStreamSessionClass() respondsToSelector:@selector(removePendingExpiredSessionReports:withAppIdentifier:storageDirectoryAtURL:)]) >- [getAVStreamSessionClass() removePendingExpiredSessionReports:@[m_expiredSession.get()] withAppIdentifier:certificateData.get() storageDirectoryAtURL:[NSURL fileURLWithPath:storagePath]]; >+ if (!storagePath.isEmpty() && [PAL::getAVStreamSessionClass() respondsToSelector:@selector(removePendingExpiredSessionReports:withAppIdentifier:storageDirectoryAtURL:)]) >+ [PAL::getAVStreamSessionClass() removePendingExpiredSessionReports:@[m_expiredSession.get()] withAppIdentifier:certificateData.get() storageDirectoryAtURL:[NSURL fileURLWithPath:storagePath]]; > IGNORE_WARNINGS_END > m_expiredSession = nullptr; > return true; >@@ -230,7 +225,7 @@ bool CDMSessionAVStreamSession::update(Uint8Array* key, RefPtr<Uint8Array>& next > RetainPtr<NSData> initData = adoptNS([[NSData alloc] initWithBytes:m_initData->data() length:m_initData->length()]); > > RetainPtr<NSDictionary> options; >- if (!m_protocolVersions.isEmpty() && canLoadAVStreamDataParserContentKeyRequestProtocolVersionsKey()) { >+ if (!m_protocolVersions.isEmpty()) { > RetainPtr<NSMutableArray> protocolVersionsOption = adoptNS([[NSMutableArray alloc] init]); > for (auto& version : m_protocolVersions) { > if (!version) >@@ -238,7 +233,7 @@ bool CDMSessionAVStreamSession::update(Uint8Array* key, RefPtr<Uint8Array>& next > [protocolVersionsOption addObject:@(version)]; > } > >- options = @{ getAVStreamDataParserContentKeyRequestProtocolVersionsKey(): protocolVersionsOption.get() }; >+ options = @{ AVStreamDataParserContentKeyRequestProtocolVersionsKey: protocolVersionsOption.get() }; > } > > NSError* error = nil; >@@ -280,16 +275,15 @@ bool CDMSessionAVStreamSession::update(Uint8Array* key, RefPtr<Uint8Array>& next > > void CDMSessionAVStreamSession::setStreamSession(AVStreamSession *streamSession) > { >- if (m_streamSession && canLoadAVStreamSessionContentProtectionSessionIdentifierChangedNotification()) >- [[NSNotificationCenter defaultCenter] removeObserver:m_dataParserObserver.get() 
name:getAVStreamSessionContentProtectionSessionIdentifierChangedNotification() object:m_streamSession.get()]; >+ if (m_streamSession) >+ [[NSNotificationCenter defaultCenter] removeObserver:m_dataParserObserver.get() name:AVStreamSessionContentProtectionSessionIdentifierChangedNotification object:m_streamSession.get()]; > > m_streamSession = streamSession; > > if (!m_streamSession) > return; > >- if (canLoadAVStreamSessionContentProtectionSessionIdentifierChangedNotification()) >- [[NSNotificationCenter defaultCenter] addObserver:m_dataParserObserver.get() selector:@selector(contentProtectionSessionIdentifierChanged:) name:getAVStreamSessionContentProtectionSessionIdentifierChangedNotification() object:m_streamSession.get()]; >+ [[NSNotificationCenter defaultCenter] addObserver:m_dataParserObserver.get() selector:@selector(contentProtectionSessionIdentifierChanged:) name:AVStreamSessionContentProtectionSessionIdentifierChangedNotification object:m_streamSession.get()]; > > NSData* identifier = [streamSession contentProtectionSessionIdentifier]; > RetainPtr<NSString> sessionIdentifierString = identifier ? adoptNS([[NSString alloc] initWithData:identifier encoding:(NSUTF8StringEncoding)]) : nil; >@@ -315,13 +309,13 @@ RefPtr<Uint8Array> CDMSessionAVStreamSession::generateKeyReleaseMessage(unsigned > RetainPtr<NSData> certificateData = adoptNS([[NSData alloc] initWithBytes:m_certificate->data() length:m_certificate->length()]); > > String storagePath = this->storagePath(); >- if (storagePath.isEmpty() || ![getAVStreamSessionClass() respondsToSelector:@selector(pendingExpiredSessionReportsWithAppIdentifier:storageDirectoryAtURL:)]) { >+ if (storagePath.isEmpty() || ![PAL::getAVStreamSessionClass() respondsToSelector:@selector(pendingExpiredSessionReportsWithAppIdentifier:storageDirectoryAtURL:)]) { > errorCode = MediaPlayer::KeySystemNotSupported; > systemCode = '!mor'; > return nullptr; > } > >- NSArray* expiredSessions = [getAVStreamSessionClass() pendingExpiredSessionReportsWithAppIdentifier:certificateData.get() storageDirectoryAtURL:[NSURL fileURLWithPath:storagePath]]; >+ NSArray* expiredSessions = [PAL::getAVStreamSessionClass() pendingExpiredSessionReportsWithAppIdentifier:certificateData.get() storageDirectoryAtURL:[NSURL fileURLWithPath:storagePath]]; > if (![expiredSessions count]) { > LOG(Media, "CDMSessionAVStreamSession::generateKeyReleaseMessage(%p) - no expired sessions found", this); > >diff --git a/Source/WebCore/platform/graphics/avfoundation/objc/ImageDecoderAVFObjC.mm b/Source/WebCore/platform/graphics/avfoundation/objc/ImageDecoderAVFObjC.mm >index 17dfd7a8f2c276c08db1f7f8aff7eb6d6d0dd2ae..d3f547f99c46f5de47165e05f441462b374eaa44 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/objc/ImageDecoderAVFObjC.mm >+++ b/Source/WebCore/platform/graphics/avfoundation/objc/ImageDecoderAVFObjC.mm >@@ -52,25 +52,12 @@ > #import <wtf/MediaTime.h> > #import <wtf/NeverDestroyed.h> > #import <wtf/Optional.h> >-#import <wtf/SoftLinking.h> > #import <wtf/Vector.h> > >-#import <pal/cf/CoreMediaSoftLink.h> > #import "CoreVideoSoftLink.h" > #import "VideoToolboxSoftLink.h" >- >-#pragma mark - Soft Linking >- >-SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >-SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVURLAsset) >-SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVAssetReader) >-SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVAssetReaderSampleReferenceOutput) >-SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVMediaCharacteristicVisual, NSString *) >-SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, 
AVURLAssetReferenceRestrictionsKey, NSString *) >-SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVURLAssetUsesNoPersistentCacheKey, NSString *) >-#define AVMediaCharacteristicVisual getAVMediaCharacteristicVisual() >-#define AVURLAssetReferenceRestrictionsKey getAVURLAssetReferenceRestrictionsKey() >-#define AVURLAssetUsesNoPersistentCacheKey getAVURLAssetUsesNoPersistentCacheKey() >+#import <pal/cf/CoreMediaSoftLink.h> >+#import <pal/cocoa/AVFoundationSoftLink.h> > > #pragma mark - > >@@ -238,9 +225,6 @@ static NSURL *customSchemeURL() > static NSDictionary *imageDecoderAssetOptions() > { > static NSDictionary *options = [] { >- // FIXME: Are these keys really optional? >- if (!canLoadAVURLAssetReferenceRestrictionsKey() || !canLoadAVURLAssetUsesNoPersistentCacheKey()) >- return [@{ } retain]; > return [@{ > AVURLAssetReferenceRestrictionsKey: @(AVAssetReferenceRestrictionForbidAll), > AVURLAssetUsesNoPersistentCacheKey: @YES, >@@ -366,7 +350,7 @@ ImageDecoderAVFObjC::ImageDecoderAVFObjC(SharedBuffer& data, const String& mimeT > : ImageDecoder() > , m_mimeType(mimeType) > , m_uti(WebCore::UTIFromMIMEType(mimeType)) >- , m_asset(adoptNS([allocAVURLAssetInstance() initWithURL:customSchemeURL() options:imageDecoderAssetOptions()])) >+ , m_asset(adoptNS([PAL::allocAVURLAssetInstance() initWithURL:customSchemeURL() options:imageDecoderAssetOptions()])) > , m_loader(adoptNS([[WebCoreSharedBufferResourceLoaderDelegate alloc] initWithParent:this])) > , m_decompressionSession(WebCoreDecompressionSession::createRGB()) > { >@@ -399,12 +383,6 @@ bool ImageDecoderAVFObjC::canDecodeType(const String& mimeType) > > AVAssetTrack *ImageDecoderAVFObjC::firstEnabledTrack() > { >- // FIXME: Is AVMediaCharacteristicVisual truly optional? >- if (!canLoadAVMediaCharacteristicVisual()) { >- LOG(Images, "ImageDecoderAVFObjC::firstEnabledTrack(%p) - AVMediaCharacteristicVisual is not supported", this); >- return nil; >- } >- > NSArray<AVAssetTrack *> *videoTracks = [m_asset tracksWithMediaCharacteristic:AVMediaCharacteristicVisual]; > NSUInteger firstEnabledIndex = [videoTracks indexOfObjectPassingTest:^(AVAssetTrack *track, NSUInteger, BOOL*) { > return track.enabled; >@@ -423,8 +401,8 @@ void ImageDecoderAVFObjC::readSamples() > if (!m_sampleData.empty()) > return; > >- auto assetReader = adoptNS([allocAVAssetReaderInstance() initWithAsset:m_asset.get() error:nil]); >- auto referenceOutput = adoptNS([allocAVAssetReaderSampleReferenceOutputInstance() initWithTrack:m_track.get()]); >+ auto assetReader = adoptNS([PAL::allocAVAssetReaderInstance() initWithAsset:m_asset.get() error:nil]); >+ auto referenceOutput = adoptNS([PAL::allocAVAssetReaderSampleReferenceOutputInstance() initWithTrack:m_track.get()]); > > referenceOutput.get().alwaysCopiesSampleData = NO; > [assetReader addOutput:referenceOutput.get()]; >diff --git a/Source/WebCore/platform/graphics/avfoundation/objc/InbandTextTrackPrivateAVFObjC.mm b/Source/WebCore/platform/graphics/avfoundation/objc/InbandTextTrackPrivateAVFObjC.mm >index d88f7c7479b2b4be07890f9096f7d94a0f7124a3..3067273ed62e5a1fa6fec4adbd228258f5c0b540 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/objc/InbandTextTrackPrivateAVFObjC.mm >+++ b/Source/WebCore/platform/graphics/avfoundation/objc/InbandTextTrackPrivateAVFObjC.mm >@@ -39,42 +39,8 @@ > #import <AVFoundation/AVPlayerItem.h> > #import <AVFoundation/AVPlayerItemOutput.h> > #import <objc/runtime.h> >-#import <wtf/SoftLinking.h> >- >-SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >- >-SOFT_LINK_CLASS(AVFoundation, AVPlayer) 
>-SOFT_LINK_CLASS(AVFoundation, AVPlayerItem) >-SOFT_LINK_CLASS(AVFoundation, AVMetadataItem) >-SOFT_LINK_CLASS(AVFoundation, AVPlayerItemLegibleOutput) >-#define AVMediaCharacteristicVisual getAVMediaCharacteristicVisual() >-#define AVMediaCharacteristicAudible getAVMediaCharacteristicAudible() >-#define AVMediaTypeClosedCaption getAVMediaTypeClosedCaption() >-#define AVMediaCharacteristicContainsOnlyForcedSubtitles getAVMediaCharacteristicContainsOnlyForcedSubtitles() >-#define AVMediaCharacteristicIsMainProgramContent getAVMediaCharacteristicIsMainProgramContent() >-#define AVMediaCharacteristicEasyToRead getAVMediaCharacteristicEasyToRead() >- >-SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeClosedCaption, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVMediaCharacteristicLegible, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVMetadataCommonKeyTitle, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVMetadataKeySpaceCommon, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeSubtitle, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVMediaCharacteristicTranscribesSpokenDialogForAccessibility, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVMediaCharacteristicDescribesMusicAndSoundForAccessibility, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVMediaCharacteristicContainsOnlyForcedSubtitles, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVMediaCharacteristicIsMainProgramContent, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVMediaCharacteristicEasyToRead, NSString *) >- >-#define AVPlayer getAVPlayerClass() >-#define AVPlayerItem getAVPlayerItemClass() >-#define AVMetadataItem getAVMetadataItemClass() >-#define AVPlayerItemLegibleOutput getAVPlayerItemLegibleOutputClass() >-#define AVMediaCharacteristicLegible getAVMediaCharacteristicLegible() >-#define AVMetadataCommonKeyTitle getAVMetadataCommonKeyTitle() >-#define AVMetadataKeySpaceCommon getAVMetadataKeySpaceCommon() >-#define AVMediaTypeSubtitle getAVMediaTypeSubtitle() >-#define AVMediaCharacteristicTranscribesSpokenDialogForAccessibility getAVMediaCharacteristicTranscribesSpokenDialogForAccessibility() >-#define AVMediaCharacteristicDescribesMusicAndSoundForAccessibility getAVMediaCharacteristicDescribesMusicAndSoundForAccessibility() >+ >+#import <pal/cocoa/AVFoundationSoftLink.h> > > namespace WebCore { > >@@ -170,10 +136,10 @@ AtomicString InbandTextTrackPrivateAVFObjC::label() const > > NSString *title = 0; > >- NSArray *titles = [AVMetadataItem metadataItemsFromArray:[m_mediaSelectionOption.get() commonMetadata] withKey:AVMetadataCommonKeyTitle keySpace:AVMetadataKeySpaceCommon]; >+ NSArray *titles = [PAL::getAVMetadataItemClass() metadataItemsFromArray:[m_mediaSelectionOption.get() commonMetadata] withKey:AVMetadataCommonKeyTitle keySpace:AVMetadataKeySpaceCommon]; > if ([titles count]) { > // If possible, return a title in one of the user's preferred languages. 
>- NSArray *titlesForPreferredLanguages = [AVMetadataItem metadataItemsFromArray:titles filteredAndSortedAccordingToPreferredLanguages:[NSLocale preferredLanguages]]; >+ NSArray *titlesForPreferredLanguages = [PAL::getAVMetadataItemClass() metadataItemsFromArray:titles filteredAndSortedAccordingToPreferredLanguages:[NSLocale preferredLanguages]]; > if ([titlesForPreferredLanguages count]) > title = [[titlesForPreferredLanguages objectAtIndex:0] stringValue]; > >diff --git a/Source/WebCore/platform/graphics/avfoundation/objc/InbandTextTrackPrivateLegacyAVFObjC.mm b/Source/WebCore/platform/graphics/avfoundation/objc/InbandTextTrackPrivateLegacyAVFObjC.mm >index 4c7a45f2d3079189880c736934b494d386a1794a..ba0dfdea3e4ab45d71ab790e931f4764a31ba427 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/objc/InbandTextTrackPrivateLegacyAVFObjC.mm >+++ b/Source/WebCore/platform/graphics/avfoundation/objc/InbandTextTrackPrivateLegacyAVFObjC.mm >@@ -33,24 +33,8 @@ > #import "Logging.h" > #import "MediaPlayerPrivateAVFoundationObjC.h" > #import <objc/runtime.h> >-#import <wtf/SoftLinking.h> > >-SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >- >-SOFT_LINK_CLASS(AVFoundation, AVPlayerItem) >-SOFT_LINK_CLASS(AVFoundation, AVMetadataItem) >-#define AVMediaTypeClosedCaption getAVMediaTypeClosedCaption() >- >-SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeClosedCaption, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVMediaCharacteristicLegible, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVMetadataCommonKeyTitle, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVMetadataKeySpaceCommon, NSString *) >- >-#define AVPlayerItem getAVPlayerItemClass() >-#define AVMetadataItem getAVMetadataItemClass() >-#define AVMediaCharacteristicLegible getAVMediaCharacteristicLegible() >-#define AVMetadataCommonKeyTitle getAVMetadataCommonKeyTitle() >-#define AVMetadataKeySpaceCommon getAVMetadataKeySpaceCommon() >+#import <pal/cocoa/AVFoundationSoftLink.h> > > namespace WebCore { > >@@ -101,10 +85,10 @@ AtomicString InbandTextTrackPrivateLegacyAVFObjC::label() const > > NSString *title = 0; > >- NSArray *titles = [AVMetadataItem metadataItemsFromArray:[[m_playerItemTrack assetTrack] commonMetadata] withKey:AVMetadataCommonKeyTitle keySpace:AVMetadataKeySpaceCommon]; >+ NSArray *titles = [PAL::getAVMetadataItemClass() metadataItemsFromArray:[[m_playerItemTrack assetTrack] commonMetadata] withKey:AVMetadataCommonKeyTitle keySpace:AVMetadataKeySpaceCommon]; > if ([titles count]) { > // If possible, return a title in one of the user's preferred languages. 
>- NSArray *titlesForPreferredLanguages = [AVMetadataItem metadataItemsFromArray:titles filteredAndSortedAccordingToPreferredLanguages:[NSLocale preferredLanguages]]; >+ NSArray *titlesForPreferredLanguages = [PAL::getAVMetadataItemClass() metadataItemsFromArray:titles filteredAndSortedAccordingToPreferredLanguages:[NSLocale preferredLanguages]]; > if ([titlesForPreferredLanguages count]) > title = [[titlesForPreferredLanguages objectAtIndex:0] stringValue]; > >diff --git a/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlaybackTargetPickerMac.mm b/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlaybackTargetPickerMac.mm >index 2763744ad0119b4d3f84f515876df5050484a483..9d03f98f090c1ad84cd9fed8d1e7d91f832926e9 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlaybackTargetPickerMac.mm >+++ b/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlaybackTargetPickerMac.mm >@@ -37,14 +37,9 @@ > #import <pal/spi/mac/AVFoundationSPI.h> > #import <wtf/MainThread.h> > >-typedef AVOutputContext AVOutputContextWKType; >-typedef AVOutputDeviceMenuController AVOutputDeviceMenuControllerWKType; >- >+#import <pal/cocoa/AVFoundationSoftLink.h> > > SOFTLINK_AVKIT_FRAMEWORK() >-SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >- >-SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVOutputContext) > SOFT_LINK_CLASS_OPTIONAL(AVKit, AVOutputDeviceMenuController) > > using namespace WebCore; >@@ -87,7 +82,7 @@ Ref<MediaPlaybackTarget> MediaPlaybackTargetPickerMac::playbackTarget() > return WebCore::MediaPlaybackTargetMac::create(context); > } > >-AVOutputDeviceMenuControllerWKType *MediaPlaybackTargetPickerMac::devicePicker() >+AVOutputDeviceMenuController *MediaPlaybackTargetPickerMac::devicePicker() > { > if (!getAVOutputDeviceMenuControllerClass()) > return nullptr; >@@ -95,7 +90,7 @@ AVOutputDeviceMenuControllerWKType *MediaPlaybackTargetPickerMac::devicePicker() > if (!m_outputDeviceMenuController) { > LOG(Media, "MediaPlaybackTargetPickerMac::devicePicker - allocating picker"); > >- RetainPtr<AVOutputContextWKType> context = adoptNS([allocAVOutputContextInstance() init]); >+ RetainPtr<AVOutputContext> context = adoptNS([PAL::allocAVOutputContextInstance() init]); > m_outputDeviceMenuController = adoptNS([allocAVOutputDeviceMenuControllerInstance() initWithOutputContext:context.get()]); > > [m_outputDeviceMenuController.get() addObserver:m_outputDeviceMenuControllerDelegate.get() forKeyPath:externalOutputDeviceAvailableKeyName options:NSKeyValueObservingOptionNew context:nullptr]; >diff --git a/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateAVFoundationObjC.mm b/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateAVFoundationObjC.mm >index 5e293ba0518be0e4fc8ba0da4abfbdee8fb4946e..4e5b82372e9b758c26d4a3e5d5801cfa7a1840bc 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateAVFoundationObjC.mm >+++ b/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateAVFoundationObjC.mm >@@ -136,168 +136,8 @@ template <> struct iterator_traits<HashSet<RefPtr<WebCore::MediaSelectionOptionA > @property (nonatomic, readonly) NSURL *resolvedURL; > @end > >-typedef AVPlayer AVPlayerType; >-typedef AVPlayerItem AVPlayerItemType; >-typedef AVPlayerItemLegibleOutput AVPlayerItemLegibleOutputType; >-typedef AVPlayerItemVideoOutput AVPlayerItemVideoOutputType; >-typedef AVMetadataItem AVMetadataItemType; >-typedef AVMediaSelectionGroup AVMediaSelectionGroupType; >-typedef AVMediaSelectionOption AVMediaSelectionOptionType; 
>-typedef AVAssetCache AVAssetCacheType; >- >-#pragma mark - Soft Linking >- >-// Soft-linking headers must be included last since they #define functions, constants, etc. > #import <pal/cf/CoreMediaSoftLink.h> >- >-SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >- >-SOFT_LINK_FRAMEWORK_OPTIONAL(CoreImage) >- >-SOFT_LINK_CLASS(AVFoundation, AVPlayer) >-SOFT_LINK_CLASS(AVFoundation, AVPlayerItem) >-SOFT_LINK_CLASS(AVFoundation, AVPlayerItemVideoOutput) >-SOFT_LINK_CLASS(AVFoundation, AVPlayerLayer) >-SOFT_LINK_CLASS(AVFoundation, AVURLAsset) >-SOFT_LINK_CLASS(AVFoundation, AVAssetImageGenerator) >-SOFT_LINK_CLASS(AVFoundation, AVMetadataItem) >-SOFT_LINK_CLASS(AVFoundation, AVAssetCache) >- >-SOFT_LINK_CLASS(CoreImage, CIContext) >-SOFT_LINK_CLASS(CoreImage, CIImage) >- >-SOFT_LINK_CONSTANT(AVFoundation, AVAudioTimePitchAlgorithmSpectral, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVAudioTimePitchAlgorithmVarispeed, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVMediaCharacteristicVisual, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVMediaCharacteristicAudible, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeClosedCaption, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeVideo, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeAudio, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeMetadata, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVPlayerItemDidPlayToEndTimeNotification, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVURLAssetInheritURIQueryComponentFromReferencingURIKey, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVAssetImageGeneratorApertureModeCleanAperture, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVURLAssetReferenceRestrictionsKey, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVLayerVideoGravityResizeAspect, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVLayerVideoGravityResizeAspectFill, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVLayerVideoGravityResize, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVStreamingKeyDeliveryContentKeyType, NSString *) >- >-SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVURLAssetOutOfBandMIMETypeKey, NSString *) >-SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVURLAssetUseClientURLLoadingExclusively, NSString *) >- >-#define AVPlayer initAVPlayer() >-#define AVPlayerItem initAVPlayerItem() >-#define AVPlayerLayer initAVPlayerLayer() >-#define AVURLAsset initAVURLAsset() >-#define AVAssetImageGenerator initAVAssetImageGenerator() >-#define AVPlayerItemVideoOutput initAVPlayerItemVideoOutput() >-#define AVMetadataItem initAVMetadataItem() >-#define AVAssetCache initAVAssetCache() >- >-#define AVAudioTimePitchAlgorithmSpectral getAVAudioTimePitchAlgorithmSpectral() >-#define AVAudioTimePitchAlgorithmVarispeed getAVAudioTimePitchAlgorithmVarispeed() >-#define AVMediaCharacteristicVisual getAVMediaCharacteristicVisual() >-#define AVMediaCharacteristicAudible getAVMediaCharacteristicAudible() >-#define AVMediaTypeClosedCaption getAVMediaTypeClosedCaption() >-#define AVMediaTypeVideo getAVMediaTypeVideo() >-#define AVMediaTypeAudio getAVMediaTypeAudio() >-#define AVMediaTypeMetadata getAVMediaTypeMetadata() >-#define AVPlayerItemDidPlayToEndTimeNotification getAVPlayerItemDidPlayToEndTimeNotification() >-#define AVURLAssetInheritURIQueryComponentFromReferencingURIKey getAVURLAssetInheritURIQueryComponentFromReferencingURIKey() >-#define AVURLAssetOutOfBandMIMETypeKey getAVURLAssetOutOfBandMIMETypeKey() >-#define AVURLAssetUseClientURLLoadingExclusively getAVURLAssetUseClientURLLoadingExclusively() 
>-#define AVAssetImageGeneratorApertureModeCleanAperture getAVAssetImageGeneratorApertureModeCleanAperture() >-#define AVURLAssetReferenceRestrictionsKey getAVURLAssetReferenceRestrictionsKey() >-#define AVLayerVideoGravityResizeAspect getAVLayerVideoGravityResizeAspect() >-#define AVLayerVideoGravityResizeAspectFill getAVLayerVideoGravityResizeAspectFill() >-#define AVLayerVideoGravityResize getAVLayerVideoGravityResize() >-#define AVStreamingKeyDeliveryContentKeyType getAVStreamingKeyDeliveryContentKeyType() >- >-#if HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) >- >-typedef AVMediaSelectionGroup AVMediaSelectionGroupType; >-typedef AVMediaSelectionOption AVMediaSelectionOptionType; >- >-SOFT_LINK_CLASS(AVFoundation, AVPlayerItemLegibleOutput) >-SOFT_LINK_CLASS(AVFoundation, AVMediaSelectionGroup) >-SOFT_LINK_CLASS(AVFoundation, AVMediaSelectionOption) >-SOFT_LINK_CLASS(AVFoundation, AVOutputContext) >- >-SOFT_LINK_CONSTANT(AVFoundation, AVMediaCharacteristicLegible, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeSubtitle, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVMediaCharacteristicContainsOnlyForcedSubtitles, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVPlayerItemLegibleOutputTextStylingResolutionSourceAndRulesOnly, NSString *) >- >-#define AVPlayerItemLegibleOutput getAVPlayerItemLegibleOutputClass() >-#define AVMediaSelectionGroup getAVMediaSelectionGroupClass() >-#define AVMediaSelectionOption getAVMediaSelectionOptionClass() >-#define AVMediaCharacteristicLegible getAVMediaCharacteristicLegible() >-#define AVMediaTypeSubtitle getAVMediaTypeSubtitle() >-#define AVMediaCharacteristicContainsOnlyForcedSubtitles getAVMediaCharacteristicContainsOnlyForcedSubtitles() >-#define AVPlayerItemLegibleOutputTextStylingResolutionSourceAndRulesOnly getAVPlayerItemLegibleOutputTextStylingResolutionSourceAndRulesOnly() >- >-#endif >- >-#if ENABLE(AVF_CAPTIONS) >- >-SOFT_LINK_CONSTANT(AVFoundation, AVURLAssetCacheKey, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVURLAssetOutOfBandAlternateTracksKey, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVURLAssetUsesNoPersistentCacheKey, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVOutOfBandAlternateTrackDisplayNameKey, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVOutOfBandAlternateTrackExtendedLanguageTagKey, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVOutOfBandAlternateTrackIsDefaultKey, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVOutOfBandAlternateTrackMediaCharactersticsKey, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVOutOfBandAlternateTrackIdentifierKey, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVOutOfBandAlternateTrackSourceKey, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVMediaCharacteristicDescribesMusicAndSoundForAccessibility, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVMediaCharacteristicTranscribesSpokenDialogForAccessibility, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVMediaCharacteristicIsAuxiliaryContent, NSString *) >- >-#define AVURLAssetOutOfBandAlternateTracksKey getAVURLAssetOutOfBandAlternateTracksKey() >-#define AVURLAssetCacheKey getAVURLAssetCacheKey() >-#define AVURLAssetUsesNoPersistentCacheKey getAVURLAssetUsesNoPersistentCacheKey() >-#define AVOutOfBandAlternateTrackDisplayNameKey getAVOutOfBandAlternateTrackDisplayNameKey() >-#define AVOutOfBandAlternateTrackExtendedLanguageTagKey getAVOutOfBandAlternateTrackExtendedLanguageTagKey() >-#define AVOutOfBandAlternateTrackIsDefaultKey getAVOutOfBandAlternateTrackIsDefaultKey() >-#define 
AVOutOfBandAlternateTrackMediaCharactersticsKey getAVOutOfBandAlternateTrackMediaCharactersticsKey() >-#define AVOutOfBandAlternateTrackIdentifierKey getAVOutOfBandAlternateTrackIdentifierKey() >-#define AVOutOfBandAlternateTrackSourceKey getAVOutOfBandAlternateTrackSourceKey() >-#define AVMediaCharacteristicDescribesMusicAndSoundForAccessibility getAVMediaCharacteristicDescribesMusicAndSoundForAccessibility() >-#define AVMediaCharacteristicTranscribesSpokenDialogForAccessibility getAVMediaCharacteristicTranscribesSpokenDialogForAccessibility() >-#define AVMediaCharacteristicIsAuxiliaryContent getAVMediaCharacteristicIsAuxiliaryContent() >- >-#endif >- >-#if ENABLE(DATACUE_VALUE) >- >-SOFT_LINK_CONSTANT(AVFoundation, AVMetadataKeySpaceQuickTimeUserData, NSString *) >-SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVMetadataKeySpaceISOUserData, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVMetadataKeySpaceQuickTimeMetadata, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVMetadataKeySpaceiTunes, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVMetadataKeySpaceID3, NSString *) >- >-#define AVMetadataKeySpaceQuickTimeUserData getAVMetadataKeySpaceQuickTimeUserData() >-#define AVMetadataKeySpaceISOUserData getAVMetadataKeySpaceISOUserData() >-#define AVMetadataKeySpaceQuickTimeMetadata getAVMetadataKeySpaceQuickTimeMetadata() >-#define AVMetadataKeySpaceiTunes getAVMetadataKeySpaceiTunes() >-#define AVMetadataKeySpaceID3 getAVMetadataKeySpaceID3() >- >-#endif >- >-#if PLATFORM(IOS_FAMILY) >- >-SOFT_LINK_CONSTANT(AVFoundation, AVURLAssetBoundNetworkInterfaceName, NSString *) >-SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVURLAssetClientBundleIdentifierKey, NSString *) >-SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVURLAssetHTTPCookiesKey, NSString *) >-SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVURLAssetRequiresCustomURLLoadingKey, NSString *) >- >-#define AVURLAssetBoundNetworkInterfaceName getAVURLAssetBoundNetworkInterfaceName() >-#define AVURLAssetClientBundleIdentifierKey getAVURLAssetClientBundleIdentifierKey() >-#define AVURLAssetHTTPCookiesKey getAVURLAssetHTTPCookiesKey() >-#define AVURLAssetRequiresCustomURLLoadingKey getAVURLAssetRequiresCustomURLLoadingKey() >- >-#endif >+#import <pal/cocoa/AVFoundationSoftLink.h> > > SOFT_LINK_FRAMEWORK(MediaToolbox) > SOFT_LINK_OPTIONAL(MediaToolbox, MTEnableCaption2015Behavior, Boolean, (), ()) >@@ -325,11 +165,7 @@ enum MediaPlayerAVFoundationObservationContext { > MediaPlayerAVFoundationObservationContextAVPlayerLayer, > }; > >-#if HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) && HAVE(AVFOUNDATION_LEGIBLE_OUTPUT_SUPPORT) > @interface WebCoreAVFMovieObserver : NSObject <AVPlayerItemLegibleOutputPushDelegate> >-#else >-@interface WebCoreAVFMovieObserver : NSObject >-#endif > { > WeakPtr<MediaPlayerPrivateAVFoundationObjC> m_player; > GenericTaskQueue<Timer, std::atomic<unsigned>> m_taskQueue; >@@ -340,10 +176,8 @@ enum MediaPlayerAVFoundationObservationContext { > -(void)metadataLoaded; > -(void)didEnd:(NSNotification *)notification; > -(void)observeValueForKeyPath:keyPath ofObject:(id)object change:(NSDictionary *)change context:(MediaPlayerAVFoundationObservationContext)context; >-#if HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) > - (void)legibleOutput:(id)output didOutputAttributedStrings:(NSArray *)strings nativeSampleBuffers:(NSArray *)nativeSamples forItemTime:(CMTime)itemTime; > - (void)outputSequenceWasFlushed:(id)output; >-#endif > @end > > #if HAVE(AVFOUNDATION_LOADER_DELEGATE) >@@ -409,7 +243,7 @@ void 
MediaPlayerPrivateAVFoundationObjC::registerMediaEngine(MediaEngineRegistra > ASSERT(AVFoundationMIMETypeCache::singleton().isAvailable()); > } > >-static AVAssetCacheType *assetCacheForPath(const String& path) >+static AVAssetCache *assetCacheForPath(const String& path) > { > NSURL *assetCacheURL; > >@@ -418,7 +252,7 @@ static AVAssetCacheType *assetCacheForPath(const String& path) > else > assetCacheURL = [NSURL fileURLWithPath:path isDirectory:YES]; > >- return [initAVAssetCache() assetCacheWithURL:assetCacheURL]; >+ return [PAL::getAVAssetCacheClass() assetCacheWithURL:assetCacheURL]; > } > > HashSet<RefPtr<SecurityOrigin>> MediaPlayerPrivateAVFoundationObjC::originsInMediaCache(const String& path) >@@ -440,7 +274,7 @@ static WallTime toSystemClockTime(NSDate *date) > > void MediaPlayerPrivateAVFoundationObjC::clearMediaCache(const String& path, WallTime modifiedSince) > { >- AVAssetCacheType* assetCache = assetCacheForPath(path); >+ AVAssetCache* assetCache = assetCacheForPath(path); > > for (NSString *key in [assetCache allKeys]) { > if (toSystemClockTime([assetCache lastModifiedDateOfEntryForKey:key]) > modifiedSince) >@@ -482,7 +316,7 @@ void MediaPlayerPrivateAVFoundationObjC::clearMediaCache(const String& path, Wal > > void MediaPlayerPrivateAVFoundationObjC::clearMediaCacheForOrigins(const String& path, const HashSet<RefPtr<SecurityOrigin>>& origins) > { >- AVAssetCacheType* assetCache = assetCacheForPath(path); >+ AVAssetCache* assetCache = assetCacheForPath(path); > for (NSString *key in [assetCache allKeys]) { > URL keyAsURL = URL(URL(), key); > if (keyAsURL.isValid()) { >@@ -560,13 +394,11 @@ void MediaPlayerPrivateAVFoundationObjC::cancelLoad() > > clearTextTracks(); > >-#if HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) && HAVE(AVFOUNDATION_LEGIBLE_OUTPUT_SUPPORT) > if (m_legibleOutput) { > if (m_avPlayerItem) > [m_avPlayerItem.get() removeOutput:m_legibleOutput.get()]; > m_legibleOutput = nil; > } >-#endif > > if (m_avPlayerItem) { > for (NSString *keyName in itemKVOProperties()) >@@ -645,7 +477,7 @@ void MediaPlayerPrivateAVFoundationObjC::createImageGenerator() > if (!m_avAsset || m_imageGenerator) > return; > >- m_imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:m_avAsset.get()]; >+ m_imageGenerator = [PAL::getAVAssetImageGeneratorClass() assetImageGeneratorWithAsset:m_avAsset.get()]; > > [m_imageGenerator.get() setApertureMode:AVAssetImageGeneratorApertureModeCleanAperture]; > [m_imageGenerator.get() setAppliesPreferredTrackTransform:YES]; >@@ -701,7 +533,7 @@ void MediaPlayerPrivateAVFoundationObjC::createAVPlayerLayer() > if (!m_avPlayer) > return; > >- m_videoLayer = adoptNS([[AVPlayerLayer alloc] init]); >+ m_videoLayer = adoptNS([PAL::allocAVPlayerLayerInstance() init]); > [m_videoLayer setPlayer:m_avPlayer.get()]; > > #ifndef NDEBUG >@@ -800,7 +632,7 @@ void MediaPlayerPrivateAVFoundationObjC::synchronizeTextTrackState() > continue; > > RefPtr<OutOfBandTextTrackPrivateAVF> trackPrivate = static_cast<OutOfBandTextTrackPrivateAVF*>(textTrack.get()); >- RetainPtr<AVMediaSelectionOptionType> currentOption = trackPrivate->mediaSelectionOption(); >+ RetainPtr<AVMediaSelectionOption> currentOption = trackPrivate->mediaSelectionOption(); > > for (auto& track : outOfBandTrackSources) { > RetainPtr<CFStringRef> uniqueID = String::number(track->uniqueId()).createCFString(); >@@ -890,19 +722,19 @@ void MediaPlayerPrivateAVFoundationObjC::createAVAssetForURL(const URL& url) > if (player()->doesHaveAttribute("x-itunes-inherit-uri-query-component")) > 
[options.get() setObject:@YES forKey: AVURLAssetInheritURIQueryComponentFromReferencingURIKey]; > >- if (canLoadAVURLAssetUseClientURLLoadingExclusively()) >+ if (PAL::canLoad_AVFoundation_AVURLAssetUseClientURLLoadingExclusively()) > [options setObject:@YES forKey:AVURLAssetUseClientURLLoadingExclusively]; > #if PLATFORM(IOS_FAMILY) >- else if (canLoadAVURLAssetRequiresCustomURLLoadingKey()) >+ else if (PAL::canLoad_AVFoundation_AVURLAssetRequiresCustomURLLoadingKey()) > [options setObject:@YES forKey:AVURLAssetRequiresCustomURLLoadingKey]; > // FIXME: rdar://problem/20354688 > String identifier = player()->sourceApplicationIdentifier(); >- if (!identifier.isEmpty() && canLoadAVURLAssetClientBundleIdentifierKey()) >+ if (!identifier.isEmpty()) > [options setObject:identifier forKey:AVURLAssetClientBundleIdentifierKey]; > #endif > > auto type = player()->contentMIMEType(); >- if (canLoadAVURLAssetOutOfBandMIMETypeKey() && !type.isEmpty() && !player()->contentMIMETypeWasInferredFromExtension()) { >+ if (PAL::canLoad_AVFoundation_AVURLAssetOutOfBandMIMETypeKey() && !type.isEmpty() && !player()->contentMIMETypeWasInferredFromExtension()) { > auto codecs = player()->contentTypeCodecs(); > if (!codecs.isEmpty()) { > NSString *typeString = [NSString stringWithFormat:@"%@; codecs=\"%@\"", (NSString *)type, (NSString *)codecs]; >@@ -947,7 +779,7 @@ void MediaPlayerPrivateAVFoundationObjC::createAVAssetForURL(const URL& url) > for (auto& cookie : cookies) > [nsCookies addObject:toNSHTTPCookie(cookie)]; > >- if (canLoadAVURLAssetHTTPCookiesKey()) >+ if (PAL::canLoad_AVFoundation_AVURLAssetHTTPCookiesKey()) > [options setObject:nsCookies.get() forKey:AVURLAssetHTTPCookiesKey]; > } > #endif >@@ -959,7 +791,7 @@ void MediaPlayerPrivateAVFoundationObjC::createAVAssetForURL(const URL& url) > [options setObject:assetCacheForPath(player()->client().mediaPlayerMediaCacheDirectory()) forKey:AVURLAssetCacheKey]; > > NSURL *cocoaURL = canonicalURL(url); >- m_avAsset = adoptNS([[AVURLAsset alloc] initWithURL:cocoaURL options:options.get()]); >+ m_avAsset = adoptNS([PAL::allocAVURLAssetInstance() initWithURL:cocoaURL options:options.get()]); > > #if HAVE(AVFOUNDATION_LOADER_DELEGATE) > AVAssetResourceLoader *resourceLoader = m_avAsset.get().resourceLoader; >@@ -981,7 +813,7 @@ void MediaPlayerPrivateAVFoundationObjC::createAVAssetForURL(const URL& url) > setDelayCallbacks(false); > } > >-void MediaPlayerPrivateAVFoundationObjC::setAVPlayerItem(AVPlayerItemType *item) >+void MediaPlayerPrivateAVFoundationObjC::setAVPlayerItem(AVPlayerItem *item) > { > if (!m_avPlayer) > return; >@@ -991,8 +823,8 @@ void MediaPlayerPrivateAVFoundationObjC::setAVPlayerItem(AVPlayerItemType *item) > return; > } > >- RetainPtr<AVPlayerType> strongPlayer = m_avPlayer.get(); >- RetainPtr<AVPlayerItemType> strongItem = item; >+ RetainPtr<AVPlayer> strongPlayer = m_avPlayer.get(); >+ RetainPtr<AVPlayerItem> strongItem = item; > dispatch_async(dispatch_get_main_queue(), [strongPlayer, strongItem] { > [strongPlayer replaceCurrentItemWithPlayerItem:strongItem.get()]; > }); >@@ -1007,15 +839,13 @@ void MediaPlayerPrivateAVFoundationObjC::createAVPlayer() > > setDelayCallbacks(true); > >- m_avPlayer = adoptNS([[AVPlayer alloc] init]); >+ m_avPlayer = adoptNS([PAL::allocAVPlayerInstance() init]); > for (NSString *keyName in playerKVOProperties()) > [m_avPlayer.get() addObserver:m_objcObserver.get() forKeyPath:keyName options:NSKeyValueObservingOptionNew context:(void *)MediaPlayerAVFoundationObservationContextPlayer]; > > 
setShouldObserveTimeControlStatus(true); > >-#if HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) && HAVE(AVFOUNDATION_LEGIBLE_OUTPUT_SUPPORT) > [m_avPlayer.get() setAppliesMediaSelectionCriteriaAutomatically:NO]; >-#endif > > #if ENABLE(WIRELESS_PLAYBACK_TARGET) > updateDisableExternalPlayback(); >@@ -1059,7 +889,7 @@ void MediaPlayerPrivateAVFoundationObjC::createAVPlayerItem() > setDelayCallbacks(true); > > // Create the player item so we can load media data. >- m_avPlayerItem = adoptNS([[AVPlayerItem alloc] initWithAsset:m_avAsset.get()]); >+ m_avPlayerItem = adoptNS([PAL::allocAVPlayerItemInstance() initWithAsset:m_avAsset.get()]); > > [[NSNotificationCenter defaultCenter] addObserver:m_objcObserver.get() selector:@selector(didEnd:) name:AVPlayerItemDidPlayToEndTimeNotification object:m_avPlayerItem.get()]; > >@@ -1072,18 +902,16 @@ void MediaPlayerPrivateAVFoundationObjC::createAVPlayerItem() > if (m_avPlayer) > setAVPlayerItem(m_avPlayerItem.get()); > >-#if HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) && HAVE(AVFOUNDATION_LEGIBLE_OUTPUT_SUPPORT) > const NSTimeInterval legibleOutputAdvanceInterval = 2; > > RetainPtr<NSArray> subtypes = adoptNS([[NSArray alloc] initWithObjects:[NSNumber numberWithUnsignedInt:kCMSubtitleFormatType_WebVTT], nil]); >- m_legibleOutput = adoptNS([[AVPlayerItemLegibleOutput alloc] initWithMediaSubtypesForNativeRepresentation:subtypes.get()]); >+ m_legibleOutput = adoptNS([PAL::allocAVPlayerItemLegibleOutputInstance() initWithMediaSubtypesForNativeRepresentation:subtypes.get()]); > [m_legibleOutput.get() setSuppressesPlayerRendering:YES]; > > [m_legibleOutput.get() setDelegate:m_objcObserver.get() queue:dispatch_get_main_queue()]; > [m_legibleOutput.get() setAdvanceIntervalForDelegateInvocation:legibleOutputAdvanceInterval]; > [m_legibleOutput.get() setTextStylingResolution:AVPlayerItemLegibleOutputTextStylingResolutionSourceAndRulesOnly]; > [m_avPlayerItem.get() addOutput:m_legibleOutput.get()]; >-#endif > > #if ENABLE(WEB_AUDIO) && USE(MEDIATOOLBOX) > if (m_provider) { >@@ -1722,7 +1550,7 @@ MediaPlayer::SupportsType MediaPlayerPrivateAVFoundationObjC::supportsType(const > return MediaPlayer::IsNotSupported; > > NSString *typeString = [NSString stringWithFormat:@"%@; codecs=\"%@\"", (NSString *)containerType, (NSString *)parameters.type.parameter(ContentType::codecsParameter())]; >- return [AVURLAsset isPlayableExtendedMIMEType:typeString] ? MediaPlayer::IsSupported : MediaPlayer::MayBeSupported; >+ return [PAL::getAVURLAssetClass() isPlayableExtendedMIMEType:typeString] ? 
MediaPlayer::IsSupported : MediaPlayer::MayBeSupported; > } > > bool MediaPlayerPrivateAVFoundationObjC::supportsKeySystem(const String& keySystem, const String& mimeType) >@@ -1868,7 +1696,7 @@ void MediaPlayerPrivateAVFoundationObjC::didStopLoadingRequest(AVAssetResourceLo > > bool MediaPlayerPrivateAVFoundationObjC::isAvailable() > { >- return AVFoundationLibrary() && isCoreMediaFrameworkAvailable(); >+ return PAL::AVFoundationLibrary() && isCoreMediaFrameworkAvailable(); > } > > MediaTime MediaPlayerPrivateAVFoundationObjC::mediaTimeForTimeValue(const MediaTime& timeValue) const >@@ -1933,9 +1761,6 @@ void MediaPlayerPrivateAVFoundationObjC::tracksChanged() > AVAssetTrack* firstEnabledVideoTrack = firstEnabledTrack([m_avAsset.get() tracksWithMediaCharacteristic:AVMediaCharacteristicVisual]); > setHasVideo(firstEnabledVideoTrack); > setHasAudio(firstEnabledTrack([m_avAsset.get() tracksWithMediaCharacteristic:AVMediaCharacteristicAudible])); >-#if !HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) >- hasCaptions = [[m_avAsset.get() tracksWithMediaType:AVMediaTypeClosedCaption] count]; >-#endif > auto size = firstEnabledVideoTrack ? FloatSize(CGSizeApplyAffineTransform([firstEnabledVideoTrack naturalSize], [firstEnabledVideoTrack preferredTransform])) : FloatSize(); > // For videos with rotation tag set, the transformation above might return a CGSize instance with negative width or height. > // See https://bugs.webkit.org/show_bug.cgi?id=172648. >@@ -1957,9 +1782,6 @@ void MediaPlayerPrivateAVFoundationObjC::tracksChanged() > else if ([mediaType isEqualToString:AVMediaTypeAudio]) > hasAudio = true; > else if ([mediaType isEqualToString:AVMediaTypeClosedCaption]) { >-#if !HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) >- hasCaptions = true; >-#endif > haveCCTrack = true; > } else if ([mediaType isEqualToString:AVMediaTypeMetadata]) { > hasMetaData = true; >@@ -1967,15 +1789,11 @@ void MediaPlayerPrivateAVFoundationObjC::tracksChanged() > } > } > >-#if ENABLE(VIDEO_TRACK) > updateAudioTracks(); > updateVideoTracks(); > >-#if HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) > hasAudio |= (m_audibleGroup && m_audibleGroup->selectedOption()); > hasVideo |= (m_visualGroup && m_visualGroup->selectedOption()); >-#endif >-#endif > > // Always says we have video if the AVPlayerLayer is ready for diaplay to work around > // an AVFoundation bug which causes it to sometimes claim a track is disabled even >@@ -1989,22 +1807,12 @@ void MediaPlayerPrivateAVFoundationObjC::tracksChanged() > #endif > } > >-#if HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) >- AVMediaSelectionGroupType *legibleGroup = safeMediaSelectionGroupForLegibleMedia(); >+ AVMediaSelectionGroup *legibleGroup = safeMediaSelectionGroupForLegibleMedia(); > if (legibleGroup && m_cachedTracks) { >- hasCaptions = [[AVMediaSelectionGroup playableMediaSelectionOptionsFromArray:[legibleGroup options]] count]; >+ hasCaptions = [[PAL::getAVMediaSelectionGroupClass() playableMediaSelectionOptionsFromArray:[legibleGroup options]] count]; > if (hasCaptions) > processMediaSelectionOptions(); > } >-#endif >- >-#if !HAVE(AVFOUNDATION_LEGIBLE_OUTPUT_SUPPORT) && HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) >- if (!hasCaptions && haveCCTrack) >- processLegacyClosedCaptionsTracks(); >-#elif !HAVE(AVFOUNDATION_LEGIBLE_OUTPUT_SUPPORT) >- if (haveCCTrack) >- processLegacyClosedCaptionsTracks(); >-#endif > > setHasClosedCaptions(hasCaptions); > >@@ -2069,8 +1877,6 @@ void determineChangedTracksFromNewTracksAndOldItems(NSArray* tracks, NSString* t > (player->*addedFunction)(*addedItem); > } > 
>-#if HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) >- > template <typename RefT, typename PassRefT> > void determineChangedTracksFromNewTracksAndOldItems(MediaSelectionGroupAVFObjC* group, Vector<RefT>& oldItems, const Vector<String>& characteristics, RefT (*itemFactory)(MediaSelectionOptionAVFObjC&), MediaPlayer* player, void (MediaPlayer::*removedFunction)(PassRefT), void (MediaPlayer::*addedFunction)(PassRefT)) > { >@@ -2080,7 +1886,7 @@ void determineChangedTracksFromNewTracksAndOldItems(MediaSelectionGroupAVFObjC* > for (auto& option : group->options()) { > if (!option) > continue; >- AVMediaSelectionOptionType* avOption = option->avMediaSelectionOption(); >+ AVMediaSelectionOption* avOption = option->avMediaSelectionOption(); > if (!avOption) > continue; > newSelectionOptions.add(option); >@@ -2131,25 +1937,21 @@ void determineChangedTracksFromNewTracksAndOldItems(MediaSelectionGroupAVFObjC* > (player->*addedFunction)(*addedItem); > } > >-#endif >- > void MediaPlayerPrivateAVFoundationObjC::updateAudioTracks() > { > #if !RELEASE_LOG_DISABLED > size_t count = m_audioTracks.size(); > #endif > >-#if HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) > Vector<String> characteristics = player()->preferredAudioCharacteristics(); > if (!m_audibleGroup) { >- if (AVMediaSelectionGroupType *group = safeMediaSelectionGroupForAudibleMedia()) >+ if (AVMediaSelectionGroup *group = safeMediaSelectionGroupForAudibleMedia()) > m_audibleGroup = MediaSelectionGroupAVFObjC::create(m_avPlayerItem.get(), group, characteristics); > } > > if (m_audibleGroup) > determineChangedTracksFromNewTracksAndOldItems(m_audibleGroup.get(), m_audioTracks, characteristics, &AudioTrackPrivateAVFObjC::create, player(), &MediaPlayer::removeAudioTrack, &MediaPlayer::addAudioTrack); > else >-#endif > determineChangedTracksFromNewTracksAndOldItems(m_cachedTracks.get(), AVMediaTypeAudio, m_audioTracks, &AudioTrackPrivateAVFObjC::create, player(), &MediaPlayer::removeAudioTrack, &MediaPlayer::addAudioTrack); > > for (auto& track : m_audioTracks) >@@ -2168,15 +1970,13 @@ void MediaPlayerPrivateAVFoundationObjC::updateVideoTracks() > > determineChangedTracksFromNewTracksAndOldItems(m_cachedTracks.get(), AVMediaTypeVideo, m_videoTracks, &VideoTrackPrivateAVFObjC::create, player(), &MediaPlayer::removeVideoTrack, &MediaPlayer::addVideoTrack); > >-#if HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) > if (!m_visualGroup) { >- if (AVMediaSelectionGroupType *group = safeMediaSelectionGroupForVisualMedia()) >+ if (AVMediaSelectionGroup *group = safeMediaSelectionGroupForVisualMedia()) > m_visualGroup = MediaSelectionGroupAVFObjC::create(m_avPlayerItem.get(), group, Vector<String>()); > } > > if (m_visualGroup) > determineChangedTracksFromNewTracksAndOldItems(m_visualGroup.get(), m_videoTracks, Vector<String>(), &VideoTrackPrivateAVFObjC::create, player(), &MediaPlayer::removeVideoTrack, &MediaPlayer::addVideoTrack); >-#endif > > for (auto& track : m_audioTracks) > track->resetPropertiesFromTrack(); >@@ -2272,7 +2072,7 @@ void MediaPlayerPrivateAVFoundationObjC::createVideoOutput() > #else > NSDictionary* attributes = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey, nil]; > #endif >- m_videoOutput = adoptNS([[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attributes]); >+ m_videoOutput = adoptNS([PAL::allocAVPlayerItemVideoOutputInstance() initWithPixelBufferAttributes:attributes]); > ASSERT(m_videoOutput); > > [m_videoOutput 
setDelegate:m_videoOutputDelegate.get() queue:globalPullDelegateQueue()]; >@@ -2422,7 +2222,7 @@ void MediaPlayerPrivateAVFoundationObjC::waitForVideoOutputMediaDataWillChange() > ERROR_LOG(LOGIDENTIFIER, "timed out"); > } > >-void MediaPlayerPrivateAVFoundationObjC::outputMediaDataWillChange(AVPlayerItemVideoOutputType *) >+void MediaPlayerPrivateAVFoundationObjC::outputMediaDataWillChange(AVPlayerItemVideoOutput *) > { > m_videoOutputSemaphore.signal(); > } >@@ -2551,45 +2351,6 @@ void MediaPlayerPrivateAVFoundationObjC::setWaitingForKey(bool waitingForKey) > } > #endif > >-#if !HAVE(AVFOUNDATION_LEGIBLE_OUTPUT_SUPPORT) >- >-void MediaPlayerPrivateAVFoundationObjC::processLegacyClosedCaptionsTracks() >-{ >-#if HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) >- [m_avPlayerItem.get() selectMediaOption:nil inMediaSelectionGroup:safeMediaSelectionGroupForLegibleMedia()]; >-#endif >- >- Vector<RefPtr<InbandTextTrackPrivateAVF>> removedTextTracks = m_textTracks; >- for (AVPlayerItemTrack *playerItemTrack in m_cachedTracks.get()) { >- >- AVAssetTrack *assetTrack = [playerItemTrack assetTrack]; >- if (![[assetTrack mediaType] isEqualToString:AVMediaTypeClosedCaption]) >- continue; >- >- bool newCCTrack = true; >- for (unsigned i = removedTextTracks.size(); i > 0; --i) { >- if (removedTextTracks[i - 1]->textTrackCategory() != InbandTextTrackPrivateAVF::LegacyClosedCaption) >- continue; >- >- RefPtr<InbandTextTrackPrivateLegacyAVFObjC> track = static_cast<InbandTextTrackPrivateLegacyAVFObjC*>(m_textTracks[i - 1].get()); >- if (track->avPlayerItemTrack() == playerItemTrack) { >- removedTextTracks.remove(i - 1); >- newCCTrack = false; >- break; >- } >- } >- >- if (!newCCTrack) >- continue; >- >- m_textTracks.append(InbandTextTrackPrivateLegacyAVFObjC::create(this, playerItemTrack)); >- } >- >- processNewAndRemovedTextTracks(removedTextTracks); >-} >- >-#endif >- > NSArray* MediaPlayerPrivateAVFoundationObjC::safeAVAssetTracksForAudibleMedia() > { > if (!m_avAsset) >@@ -2601,8 +2362,6 @@ NSArray* MediaPlayerPrivateAVFoundationObjC::safeAVAssetTracksForAudibleMedia() > return [m_avAsset tracksWithMediaCharacteristic:AVMediaCharacteristicAudible]; > } > >-#if HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) >- > bool MediaPlayerPrivateAVFoundationObjC::hasLoadedMediaSelectionGroups() > { > if (!m_avAsset) >@@ -2614,7 +2373,7 @@ bool MediaPlayerPrivateAVFoundationObjC::hasLoadedMediaSelectionGroups() > return true; > } > >-AVMediaSelectionGroupType* MediaPlayerPrivateAVFoundationObjC::safeMediaSelectionGroupForLegibleMedia() >+AVMediaSelectionGroup* MediaPlayerPrivateAVFoundationObjC::safeMediaSelectionGroupForLegibleMedia() > { > if (!hasLoadedMediaSelectionGroups()) > return nil; >@@ -2622,7 +2381,7 @@ AVMediaSelectionGroupType* MediaPlayerPrivateAVFoundationObjC::safeMediaSelectio > return [m_avAsset.get() mediaSelectionGroupForMediaCharacteristic:AVMediaCharacteristicLegible]; > } > >-AVMediaSelectionGroupType* MediaPlayerPrivateAVFoundationObjC::safeMediaSelectionGroupForAudibleMedia() >+AVMediaSelectionGroup* MediaPlayerPrivateAVFoundationObjC::safeMediaSelectionGroupForAudibleMedia() > { > if (!hasLoadedMediaSelectionGroups()) > return nil; >@@ -2630,7 +2389,7 @@ AVMediaSelectionGroupType* MediaPlayerPrivateAVFoundationObjC::safeMediaSelectio > return [m_avAsset.get() mediaSelectionGroupForMediaCharacteristic:AVMediaCharacteristicAudible]; > } > >-AVMediaSelectionGroupType* MediaPlayerPrivateAVFoundationObjC::safeMediaSelectionGroupForVisualMedia() >+AVMediaSelectionGroup* 
MediaPlayerPrivateAVFoundationObjC::safeMediaSelectionGroupForVisualMedia() > { > if (!hasLoadedMediaSelectionGroups()) > return nil; >@@ -2640,7 +2399,7 @@ AVMediaSelectionGroupType* MediaPlayerPrivateAVFoundationObjC::safeMediaSelectio > > void MediaPlayerPrivateAVFoundationObjC::processMediaSelectionOptions() > { >- AVMediaSelectionGroupType *legibleGroup = safeMediaSelectionGroupForLegibleMedia(); >+ AVMediaSelectionGroup *legibleGroup = safeMediaSelectionGroupForLegibleMedia(); > if (!legibleGroup) { > INFO_LOG(LOGIDENTIFIER, "no mediaSelectionGroup"); > return; >@@ -2652,14 +2411,14 @@ void MediaPlayerPrivateAVFoundationObjC::processMediaSelectionOptions() > [m_avPlayerItem.get() selectMediaOption:nil inMediaSelectionGroup:safeMediaSelectionGroupForLegibleMedia()]; > > Vector<RefPtr<InbandTextTrackPrivateAVF>> removedTextTracks = m_textTracks; >- NSArray *legibleOptions = [AVMediaSelectionGroup playableMediaSelectionOptionsFromArray:[legibleGroup options]]; >- for (AVMediaSelectionOptionType *option in legibleOptions) { >+ NSArray *legibleOptions = [PAL::getAVMediaSelectionGroupClass() playableMediaSelectionOptionsFromArray:[legibleGroup options]]; >+ for (AVMediaSelectionOption *option in legibleOptions) { > bool newTrack = true; > for (unsigned i = removedTextTracks.size(); i > 0; --i) { > if (removedTextTracks[i - 1]->textTrackCategory() == InbandTextTrackPrivateAVF::LegacyClosedCaption) > continue; > >- RetainPtr<AVMediaSelectionOptionType> currentOption; >+ RetainPtr<AVMediaSelectionOption> currentOption; > #if ENABLE(AVF_CAPTIONS) > if (removedTextTracks[i - 1]->textTrackCategory() == InbandTextTrackPrivateAVF::OutOfBand) { > RefPtr<OutOfBandTextTrackPrivateAVF> track = static_cast<OutOfBandTextTrackPrivateAVF*>(removedTextTracks[i - 1].get()); >@@ -2724,8 +2483,6 @@ void MediaPlayerPrivateAVFoundationObjC::flushCues() > m_currentTextTrack->resetCueValues(); > } > >-#endif // HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) >- > void MediaPlayerPrivateAVFoundationObjC::setCurrentTextTrack(InbandTextTrackPrivateAVF *track) > { > if (m_currentTextTrack == track) >@@ -2740,18 +2497,14 @@ void MediaPlayerPrivateAVFoundationObjC::setCurrentTextTrack(InbandTextTrackPriv > ALLOW_DEPRECATED_DECLARATIONS_BEGIN > [m_avPlayer.get() setClosedCaptionDisplayEnabled:YES]; > ALLOW_DEPRECATED_DECLARATIONS_END >-#if HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) > #if ENABLE(AVF_CAPTIONS) > else if (track->textTrackCategory() == InbandTextTrackPrivateAVF::OutOfBand) > [m_avPlayerItem.get() selectMediaOption:static_cast<OutOfBandTextTrackPrivateAVF*>(track)->mediaSelectionOption() inMediaSelectionGroup:safeMediaSelectionGroupForLegibleMedia()]; > #endif > else > [m_avPlayerItem.get() selectMediaOption:static_cast<InbandTextTrackPrivateAVFObjC*>(track)->mediaSelectionOption() inMediaSelectionGroup:safeMediaSelectionGroupForLegibleMedia()]; >-#endif > } else { >-#if HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) > [m_avPlayerItem.get() selectMediaOption:0 inMediaSelectionGroup:safeMediaSelectionGroupForLegibleMedia()]; >-#endif > ALLOW_DEPRECATED_DECLARATIONS_BEGIN > [m_avPlayer.get() setClosedCaptionDisplayEnabled:NO]; > ALLOW_DEPRECATED_DECLARATIONS_END >@@ -2767,11 +2520,10 @@ String MediaPlayerPrivateAVFoundationObjC::languageOfPrimaryAudioTrack() const > if (!m_avPlayerItem.get()) > return emptyString(); > >-#if HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) > // If AVFoundation has an audible group, return the language of the currently selected audible option. 
>- AVMediaSelectionGroupType *audibleGroup = [m_avAsset.get() mediaSelectionGroupForMediaCharacteristic:AVMediaCharacteristicAudible]; >+ AVMediaSelectionGroup *audibleGroup = [m_avAsset.get() mediaSelectionGroupForMediaCharacteristic:AVMediaCharacteristicAudible]; > ALLOW_DEPRECATED_DECLARATIONS_BEGIN >- AVMediaSelectionOptionType *currentlySelectedAudibleOption = [m_avPlayerItem.get() selectedMediaOptionInMediaSelectionGroup:audibleGroup]; >+ AVMediaSelectionOption *currentlySelectedAudibleOption = [m_avPlayerItem.get() selectedMediaOptionInMediaSelectionGroup:audibleGroup]; > ALLOW_DEPRECATED_DECLARATIONS_END > if (currentlySelectedAudibleOption) { > m_languageOfPrimaryAudioTrack = [[currentlySelectedAudibleOption locale] localeIdentifier]; >@@ -2779,7 +2531,6 @@ String MediaPlayerPrivateAVFoundationObjC::languageOfPrimaryAudioTrack() const > > return m_languageOfPrimaryAudioTrack; > } >-#endif // HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) > > // AVFoundation synthesizes an audible group when there is only one ungrouped audio track if there is also a legible group (one or > // more in-band text tracks). It doesn't know about out-of-band tracks, so if there is a single audio track return its language. >@@ -2825,7 +2576,7 @@ MediaPlayer::WirelessPlaybackTargetType MediaPlayerPrivateAVFoundationObjC::wire > return MediaPlayer::TargetTypeNone; > > #if PLATFORM(IOS_FAMILY) >- if (!AVFoundationLibrary()) >+ if (!PAL::AVFoundationLibrary()) > return MediaPlayer::TargetTypeNone; > > switch ([m_avPlayer externalPlaybackType]) { >@@ -2846,14 +2597,14 @@ MediaPlayer::WirelessPlaybackTargetType MediaPlayerPrivateAVFoundationObjC::wire > } > > #if PLATFORM(IOS_FAMILY) >-static NSString *exernalDeviceDisplayNameForPlayer(AVPlayerType *player) >+static NSString *exernalDeviceDisplayNameForPlayer(AVPlayer *player) > { > #if HAVE(CELESTIAL) >- if (!AVFoundationLibrary()) >+ if (!PAL::AVFoundationLibrary()) > return nil; > >- if ([getAVOutputContextClass() respondsToSelector:@selector(sharedAudioPresentationOutputContext)]) { >- AVOutputContext *outputContext = [getAVOutputContextClass() sharedAudioPresentationOutputContext]; >+ if ([PAL::getAVOutputContextClass() respondsToSelector:@selector(sharedAudioPresentationOutputContext)]) { >+ AVOutputContext *outputContext = [PAL::getAVOutputContextClass() sharedAudioPresentationOutputContext]; > > if (![outputContext respondsToSelector:@selector(supportsMultipleOutputDevices)] > || ![outputContext supportsMultipleOutputDevices] >@@ -3136,7 +2887,7 @@ static const AtomicString& metadataType(NSString *avMetadataKeySpace) > > if ([avMetadataKeySpace isEqualToString:AVMetadataKeySpaceQuickTimeUserData]) > return quickTimeUserData; >- if (canLoadAVMetadataKeySpaceISOUserData() && [avMetadataKeySpace isEqualToString:AVMetadataKeySpaceISOUserData]) >+ if ([avMetadataKeySpace isEqualToString:AVMetadataKeySpaceISOUserData]) > return isoUserData; > if ([avMetadataKeySpace isEqualToString:AVMetadataKeySpaceQuickTimeMetadata]) > return quickTimeMetadata; >@@ -3170,14 +2921,14 @@ void MediaPlayerPrivateAVFoundationObjC::metadataDidArrive(const RetainPtr<NSArr > > // Set the duration of all incomplete cues before adding new ones. 
> MediaTime earliestStartTime = MediaTime::positiveInfiniteTime(); >- for (AVMetadataItemType *item in m_currentMetaData.get()) { >+ for (AVMetadataItem *item in m_currentMetaData.get()) { > MediaTime start = std::max(PAL::toMediaTime(item.time), MediaTime::zeroTime()); > if (start < earliestStartTime) > earliestStartTime = start; > } > m_metadataTrack->updatePendingCueEndTimes(earliestStartTime); > >- for (AVMetadataItemType *item in m_currentMetaData.get()) { >+ for (AVMetadataItem *item in m_currentMetaData.get()) { > MediaTime start = std::max(PAL::toMediaTime(item.time), MediaTime::zeroTime()); > MediaTime end = MediaTime::positiveInfiniteTime(); > if (CMTIME_IS_VALID(item.duration)) >@@ -3523,7 +3274,7 @@ NSArray* playerKVOProperties() > player->durationDidChange(PAL::toMediaTime([newValue CMTimeValue])); > else if ([keyPath isEqualToString:@"timedMetadata"] && newValue) { > MediaTime now; >- CMTime itemTime = [(AVPlayerItemType *)object.get() currentTime]; >+ CMTime itemTime = [(AVPlayerItem *)object.get() currentTime]; > if (CMTIME_IS_NUMERIC(itemTime)) > now = std::max(PAL::toMediaTime(itemTime), MediaTime::zeroTime()); > player->metadataDidArrive(RetainPtr<NSArray>(newValue), now); >@@ -3568,8 +3319,6 @@ NSArray* playerKVOProperties() > }); > } > >-#if HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) >- > - (void)legibleOutput:(id)output didOutputAttributedStrings:(NSArray *)strings nativeSampleBuffers:(NSArray *)nativeSamples forItemTime:(CMTime)itemTime > { > UNUSED_PARAM(output); >@@ -3592,8 +3341,6 @@ NSArray* playerKVOProperties() > }); > } > >-#endif >- > @end > > #if HAVE(AVFOUNDATION_LOADER_DELEGATE) >@@ -3661,13 +3408,13 @@ NSArray* playerKVOProperties() > return self; > } > >-- (void)outputMediaDataWillChange:(AVPlayerItemVideoOutputType *)output >+- (void)outputMediaDataWillChange:(AVPlayerItemVideoOutput *)output > { > if (m_player) > m_player->outputMediaDataWillChange(output); > } > >-- (void)outputSequenceWasFlushed:(AVPlayerItemVideoOutputType *)output >+- (void)outputSequenceWasFlushed:(AVPlayerItemVideoOutput *)output > { > UNUSED_PARAM(output); > // No-op. 
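Every file touched in the hunks above follows the same conversion: the per-file SOFT_LINK_FRAMEWORK_OPTIONAL / SOFT_LINK_CLASS / SOFT_LINK_CONSTANT declarations and their #define shims are removed, the file imports <pal/cocoa/AVFoundationSoftLink.h> instead, and the soft-linked symbols are reached through the PAL namespace. The sketch below condenses that usage pattern for readers of the remaining hunks; it is illustrative only, the helper function and its name are not part of the patch, and it assumes the PAL accessors that the hunks above already call (PAL::AVFoundationLibrary(), PAL::allocAVPlayerInstance(), PAL::canLoad_AVFoundation_AVURLAssetOutOfBandMIMETypeKey()).

    // Minimal sketch of the post-patch soft-linking pattern (illustrative, not part of the patch).
    #import <AVFoundation/AVFoundation.h>
    #import <wtf/RetainPtr.h>
    #import <pal/cocoa/AVFoundationSoftLink.h> // replaces the per-file SOFT_LINK_* blocks

    static RetainPtr<AVPlayer> createPlayerIfAvailable()
    {
        // Framework availability check, formerly a locally soft-linked AVFoundationLibrary().
        if (!PAL::AVFoundationLibrary())
            return nil;

        // Soft-linked classes: PAL::getAVPlayerClass() / PAL::allocAVPlayerInstance()
        // replace the accessors each file used to generate with SOFT_LINK_CLASS(AVFoundation, AVPlayer).
        auto player = adoptNS([PAL::allocAVPlayerInstance() init]);

        // Constants that may be absent at runtime keep an availability check, now spelled
        // PAL::canLoad_AVFoundation_<Name>() instead of the old canLoad<Name>() macro.
        if (PAL::canLoad_AVFoundation_AVURLAssetOutOfBandMIMETypeKey())
            NSLog(@"out-of-band MIME type key: %@", AVURLAssetOutOfBandMIMETypeKey);

        return player;
    }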
>diff --git a/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.mm b/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.mm >index d831b05d180ebb78801d6ec6cf127a63a8255d4f..7fb1a5ec305a854f66ee207579814c6618d1bec9 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.mm >+++ b/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.mm >@@ -52,31 +52,9 @@ > #import <wtf/MainThread.h> > #import <wtf/NeverDestroyed.h> > >-#pragma mark - Soft Linking >- >-#import <pal/cf/CoreMediaSoftLink.h> > #import "CoreVideoSoftLink.h" >- >-SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >- >-SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVAsset) >-SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVURLAsset) >-ALLOW_NEW_API_WITHOUT_GUARDS_BEGIN >-SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVSampleBufferAudioRenderer) >-ALLOW_NEW_API_WITHOUT_GUARDS_END >-SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVSampleBufferDisplayLayer) >-ALLOW_NEW_API_WITHOUT_GUARDS_BEGIN >-SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVSampleBufferRenderSynchronizer) >-ALLOW_NEW_API_WITHOUT_GUARDS_END >-SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVStreamDataParser) >-SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVStreamSession); >-SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVVideoPerformanceMetrics) >- >-SOFT_LINK_CONSTANT(AVFoundation, AVAudioTimePitchAlgorithmSpectral, NSString*) >-SOFT_LINK_CONSTANT(AVFoundation, AVAudioTimePitchAlgorithmVarispeed, NSString*) >- >-#define AVAudioTimePitchAlgorithmSpectral getAVAudioTimePitchAlgorithmSpectral() >-#define AVAudioTimePitchAlgorithmVarispeed getAVAudioTimePitchAlgorithmVarispeed() >+#import <pal/cf/CoreMediaSoftLink.h> >+#import <pal/cocoa/AVFoundationSoftLink.h> > > #pragma mark - > #pragma mark AVStreamSession >@@ -117,7 +95,7 @@ static void CMTimebaseEffectiveRateChangedCallback(CMNotificationCenterRef, cons > > MediaPlayerPrivateMediaSourceAVFObjC::MediaPlayerPrivateMediaSourceAVFObjC(MediaPlayer* player) > : m_player(player) >- , m_synchronizer(adoptNS([allocAVSampleBufferRenderSynchronizerInstance() init])) >+ , m_synchronizer(adoptNS([PAL::allocAVSampleBufferRenderSynchronizerInstance() init])) > , m_seekTimer(*this, &MediaPlayerPrivateMediaSourceAVFObjC::seekInternal) > , m_networkState(MediaPlayer::Empty) > , m_readyState(MediaPlayer::HaveNothing) >@@ -201,7 +179,7 @@ void MediaPlayerPrivateMediaSourceAVFObjC::registerMediaEngine(MediaEngineRegist > > bool MediaPlayerPrivateMediaSourceAVFObjC::isAvailable() > { >- return AVFoundationLibrary() >+ return PAL::AVFoundationLibrary() > && isCoreMediaFrameworkAvailable() > && getAVStreamDataParserClass() > && getAVSampleBufferAudioRendererClass() >@@ -234,14 +212,14 @@ MediaPlayer::SupportsType MediaPlayerPrivateMediaSourceAVFObjC::supportsType(con > return MediaPlayer::MayBeSupported; > > NSString *outputCodecs = codecs; >- if ([getAVStreamDataParserClass() respondsToSelector:@selector(outputMIMECodecParameterForInputMIMECodecParameter:)]) >- outputCodecs = [getAVStreamDataParserClass() outputMIMECodecParameterForInputMIMECodecParameter:outputCodecs]; >+ if ([PAL::getAVStreamDataParserClass() respondsToSelector:@selector(outputMIMECodecParameterForInputMIMECodecParameter:)]) >+ outputCodecs = [PAL::getAVStreamDataParserClass() outputMIMECodecParameterForInputMIMECodecParameter:outputCodecs]; > > if (!contentTypeMeetsHardwareDecodeRequirements(parameters.type, parameters.contentTypesRequiringHardwareSupport)) > return 
MediaPlayer::IsNotSupported; > > NSString *typeString = [NSString stringWithFormat:@"%@; codecs=\"%@\"", (NSString *)parameters.type.containerType(), (NSString *)outputCodecs]; >- return [getAVURLAssetClass() isPlayableExtendedMIMEType:typeString] ? MediaPlayer::IsSupported : MediaPlayer::MayBeSupported;; >+ return [PAL::getAVURLAssetClass() isPlayableExtendedMIMEType:typeString] ? MediaPlayer::IsSupported : MediaPlayer::MayBeSupported;; > } > > #pragma mark - >@@ -719,7 +697,7 @@ void MediaPlayerPrivateMediaSourceAVFObjC::ensureLayer() > if (m_sampleBufferDisplayLayer) > return; > >- m_sampleBufferDisplayLayer = adoptNS([allocAVSampleBufferDisplayLayerInstance() init]); >+ m_sampleBufferDisplayLayer = adoptNS([PAL::allocAVSampleBufferDisplayLayerInstance() init]); > #ifndef NDEBUG > [m_sampleBufferDisplayLayer setName:@"MediaPlayerPrivateMediaSource AVSampleBufferDisplayLayer"]; > #endif >@@ -937,7 +915,7 @@ void MediaPlayerPrivateMediaSourceAVFObjC::flushPendingSizeChanges() > #if HAVE(AVSTREAMSESSION) > AVStreamSession* MediaPlayerPrivateMediaSourceAVFObjC::streamSession() > { >- if (!getAVStreamSessionClass() || ![getAVStreamSessionClass() instancesRespondToSelector:@selector(initWithStorageDirectoryAtURL:)]) >+ if (!getAVStreamSessionClass() || ![PAL::getAVStreamSessionClass() instancesRespondToSelector:@selector(initWithStorageDirectoryAtURL:)]) > return nil; > > if (!m_streamSession) { >@@ -951,7 +929,7 @@ AVStreamSession* MediaPlayerPrivateMediaSourceAVFObjC::streamSession() > } > > String storagePath = FileSystem::pathByAppendingComponent(storageDirectory, "SecureStop.plist"); >- m_streamSession = adoptNS([allocAVStreamSessionInstance() initWithStorageDirectoryAtURL:[NSURL fileURLWithPath:storagePath]]); >+ m_streamSession = adoptNS([PAL::allocAVStreamSessionInstance() initWithStorageDirectoryAtURL:[NSURL fileURLWithPath:storagePath]]); > } > return m_streamSession.get(); > } >diff --git a/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm b/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm >index d0e92b2a32c9101779939a444b7ba762214ae03b..cda97c91fda083def6acf5fac3ded2f448171d4c 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm >+++ b/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm >@@ -46,23 +46,9 @@ > #import <wtf/MainThread.h> > #import <wtf/NeverDestroyed.h> > >- >-#pragma mark - Soft Linking >- >-#import <pal/cf/CoreMediaSoftLink.h> > #import "CoreVideoSoftLink.h" >- >-SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >- >-SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVSampleBufferDisplayLayer) >- >-SOFT_LINK_CONSTANT(AVFoundation, AVLayerVideoGravityResizeAspect, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVLayerVideoGravityResizeAspectFill, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVLayerVideoGravityResize, NSString *) >- >-#define AVLayerVideoGravityResizeAspect getAVLayerVideoGravityResizeAspect() >-#define AVLayerVideoGravityResizeAspectFill getAVLayerVideoGravityResizeAspectFill() >-#define AVLayerVideoGravityResize getAVLayerVideoGravityResize() >+#import <pal/cf/CoreMediaSoftLink.h> >+#import <pal/cocoa/AVFoundationSoftLink.h> > > using namespace WebCore; > >@@ -133,7 +119,7 @@ using namespace WebCore; > if (!_parent) > return; > >- if ([object isKindOfClass:getAVSampleBufferDisplayLayerClass()]) { >+ if ([object isKindOfClass:PAL::getAVSampleBufferDisplayLayerClass()]) { > 
RetainPtr<AVSampleBufferDisplayLayer> layer = (AVSampleBufferDisplayLayer *)object; > ASSERT(layer.get() == _parent->displayLayer()); > >@@ -240,7 +226,7 @@ void MediaPlayerPrivateMediaStreamAVFObjC::registerMediaEngine(MediaEngineRegist > > bool MediaPlayerPrivateMediaStreamAVFObjC::isAvailable() > { >- return AVFoundationLibrary() && isCoreMediaFrameworkAvailable() && getAVSampleBufferDisplayLayerClass(); >+ return PAL::AVFoundationLibrary() && isCoreMediaFrameworkAvailable() && getAVSampleBufferDisplayLayerClass(); > } > > void MediaPlayerPrivateMediaStreamAVFObjC::getSupportedTypes(HashSet<String, ASCIICaseInsensitiveHash>& types) >@@ -484,7 +470,7 @@ void MediaPlayerPrivateMediaStreamAVFObjC::ensureLayers() > if (!m_mediaStreamPrivate || !m_mediaStreamPrivate->activeVideoTrack() || !m_mediaStreamPrivate->activeVideoTrack()->enabled()) > return; > >- m_sampleBufferDisplayLayer = adoptNS([allocAVSampleBufferDisplayLayerInstance() init]); >+ m_sampleBufferDisplayLayer = adoptNS([PAL::allocAVSampleBufferDisplayLayerInstance() init]); > if (!m_sampleBufferDisplayLayer) { > ERROR_LOG(LOGIDENTIFIER, "+[AVSampleBufferDisplayLayer alloc] failed."); > return; >diff --git a/Source/WebCore/platform/graphics/avfoundation/objc/MediaSampleAVFObjC.mm b/Source/WebCore/platform/graphics/avfoundation/objc/MediaSampleAVFObjC.mm >index ffc69d7785772dcad1a49ad7a4a1b07d6cb91eec..5c0f3dc061f05fc693a3787361efd8928f2209aa 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/objc/MediaSampleAVFObjC.mm >+++ b/Source/WebCore/platform/graphics/avfoundation/objc/MediaSampleAVFObjC.mm >@@ -32,8 +32,8 @@ > #import <wtf/PrintStream.h> > #import <wtf/cf/TypeCastsCF.h> > >-#import <pal/cf/CoreMediaSoftLink.h> > #import "CoreVideoSoftLink.h" >+#import <pal/cf/CoreMediaSoftLink.h> > > using namespace PAL; > >diff --git a/Source/WebCore/platform/graphics/avfoundation/objc/SourceBufferPrivateAVFObjC.mm b/Source/WebCore/platform/graphics/avfoundation/objc/SourceBufferPrivateAVFObjC.mm >index cb94d66592f5757d803d107d00a21801431858d7..72ed5c9cf2fd0998274d628cd2d6599bc232795d 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/objc/SourceBufferPrivateAVFObjC.mm >+++ b/Source/WebCore/platform/graphics/avfoundation/objc/SourceBufferPrivateAVFObjC.mm >@@ -65,29 +65,7 @@ > #pragma mark - Soft Linking > > #import <pal/cf/CoreMediaSoftLink.h> >- >-SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >- >-SOFT_LINK_CLASS(AVFoundation, AVAssetTrack) >-SOFT_LINK_CLASS(AVFoundation, AVStreamDataParser) >-ALLOW_NEW_API_WITHOUT_GUARDS_BEGIN >-SOFT_LINK_CLASS(AVFoundation, AVSampleBufferAudioRenderer) >-ALLOW_NEW_API_WITHOUT_GUARDS_END >-SOFT_LINK_CLASS(AVFoundation, AVSampleBufferDisplayLayer) >-SOFT_LINK_CLASS(AVFoundation, AVStreamSession) >- >-SOFT_LINK_CONSTANT(AVFoundation, AVMediaCharacteristicVisual, NSString*) >-SOFT_LINK_CONSTANT(AVFoundation, AVMediaCharacteristicAudible, NSString*) >-SOFT_LINK_CONSTANT(AVFoundation, AVMediaCharacteristicLegible, NSString*) >-SOFT_LINK_CONSTANT(AVFoundation, AVSampleBufferDisplayLayerFailedToDecodeNotification, NSString*) >-SOFT_LINK_CONSTANT(AVFoundation, AVSampleBufferDisplayLayerFailedToDecodeNotificationErrorKey, NSString*) >- >-#define AVSampleBufferDisplayLayerFailedToDecodeNotification getAVSampleBufferDisplayLayerFailedToDecodeNotification() >-#define AVSampleBufferDisplayLayerFailedToDecodeNotificationErrorKey getAVSampleBufferDisplayLayerFailedToDecodeNotificationErrorKey() >- >-#define AVMediaCharacteristicVisual getAVMediaCharacteristicVisual() >-#define 
AVMediaCharacteristicAudible getAVMediaCharacteristicAudible() >-#define AVMediaCharacteristicLegible getAVMediaCharacteristicLegible() >+#import <pal/cocoa/AVFoundationSoftLink.h> > > @interface AVSampleBufferDisplayLayer (WebCoreAVSampleBufferDisplayLayerQueueManagementPrivate) > - (void)prerollDecodeWithCompletionHandler:(void (^)(BOOL success))block; >@@ -354,7 +332,7 @@ ALLOW_NEW_API_WITHOUT_GUARDS_END > UNUSED_PARAM(keyPath); > ASSERT(_parent); > >- if ([object isKindOfClass:getAVSampleBufferDisplayLayerClass()]) { >+ if ([object isKindOfClass:PAL::getAVSampleBufferDisplayLayerClass()]) { > RetainPtr<AVSampleBufferDisplayLayer> layer = (AVSampleBufferDisplayLayer *)object; > ASSERT(_layers.contains(layer.get())); > >@@ -375,7 +353,7 @@ ALLOW_NEW_API_WITHOUT_GUARDS_END > } else > ASSERT_NOT_REACHED(); > >- } else if ([object isKindOfClass:getAVSampleBufferAudioRendererClass()]) { >+ } else if ([object isKindOfClass:PAL::getAVSampleBufferAudioRendererClass()]) { > ALLOW_NEW_API_WITHOUT_GUARDS_BEGIN > RetainPtr<AVSampleBufferAudioRenderer> renderer = (AVSampleBufferAudioRenderer *)object; > ALLOW_NEW_API_WITHOUT_GUARDS_END >@@ -486,7 +464,7 @@ Ref<SourceBufferPrivateAVFObjC> SourceBufferPrivateAVFObjC::create(MediaSourcePr > } > > SourceBufferPrivateAVFObjC::SourceBufferPrivateAVFObjC(MediaSourcePrivateAVFObjC* parent) >- : m_parser(adoptNS([allocAVStreamDataParserInstance() init])) >+ : m_parser(adoptNS([PAL::allocAVStreamDataParserInstance() init])) > , m_delegate(adoptNS([[WebAVStreamDataParserListener alloc] initWithParser:m_parser.get() parent:createWeakPtr()])) > , m_errorListener(adoptNS([[WebAVSampleBufferErrorListener alloc] initWithParent:createWeakPtr()])) > , m_isAppendingGroup(adoptOSObject(dispatch_group_create())) >@@ -498,8 +476,8 @@ SourceBufferPrivateAVFObjC::SourceBufferPrivateAVFObjC(MediaSourcePrivateAVFObjC > #endif > { > ALWAYS_LOG(LOGIDENTIFIER); >- >- if (![getAVSampleBufferDisplayLayerClass() instancesRespondToSelector:@selector(prerollDecodeWithCompletionHandler:)]) >+ >+ if (![PAL::getAVSampleBufferDisplayLayerClass() instancesRespondToSelector:@selector(prerollDecodeWithCompletionHandler:)]) > CMNotificationCenterAddListener(CMNotificationCenterGetDefaultLocalCenter(), reinterpret_cast<void*>(m_mapID), bufferWasConsumedCallback, kCMSampleBufferConsumerNotification_BufferConsumed, nullptr, 0); > > m_delegate.get().abortSemaphore = Box<Semaphore>::create(0); >@@ -516,7 +494,7 @@ SourceBufferPrivateAVFObjC::~SourceBufferPrivateAVFObjC() > destroyParser(); > destroyRenderers(); > >- if (![getAVSampleBufferDisplayLayerClass() instancesRespondToSelector:@selector(prerollDecodeWithCompletionHandler:)]) >+ if (![PAL::getAVSampleBufferDisplayLayerClass() instancesRespondToSelector:@selector(prerollDecodeWithCompletionHandler:)]) > CMNotificationCenterRemoveListener(CMNotificationCenterGetDefaultLocalCenter(), this, bufferWasConsumedCallback, kCMSampleBufferConsumerNotification_BufferConsumed, nullptr); > > if (m_hasSessionSemaphore) >@@ -909,7 +887,7 @@ void SourceBufferPrivateAVFObjC::trackDidChangeEnabled(AudioTrackPrivateMediaSou > RetainPtr<AVSampleBufferAudioRenderer> renderer; > ALLOW_NEW_API_WITHOUT_GUARDS_END > if (!m_audioRenderers.contains(trackID)) { >- renderer = adoptNS([allocAVSampleBufferAudioRendererInstance() init]); >+ renderer = adoptNS([PAL::allocAVSampleBufferAudioRendererInstance() init]); > auto weakThis = createWeakPtr(); > [renderer requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:^{ > if (weakThis) >@@ -1151,7 +1129,7 @@ 
void SourceBufferPrivateAVFObjC::enqueueSample(Ref<MediaSample>&& sample, const > if (m_mediaSource && !m_mediaSource->player()->hasAvailableVideoFrame() && !sample->isNonDisplaying()) { > DEBUG_LOG(LOGIDENTIFIER, "adding buffer attachment"); > >- bool havePrerollDecodeWithCompletionHandler = [getAVSampleBufferDisplayLayerClass() instancesRespondToSelector:@selector(prerollDecodeWithCompletionHandler:)]; >+ bool havePrerollDecodeWithCompletionHandler = [PAL::getAVSampleBufferDisplayLayerClass() instancesRespondToSelector:@selector(prerollDecodeWithCompletionHandler:)]; > > if (!havePrerollDecodeWithCompletionHandler) { > CMSampleBufferRef rawSampleCopy; >diff --git a/Source/WebCore/platform/graphics/ca/cocoa/PlatformCALayerCocoa.mm b/Source/WebCore/platform/graphics/ca/cocoa/PlatformCALayerCocoa.mm >index 7daa25952bcbf679c535d4b32960e3d3c210b072..a48523cf5176b31f2acc4984b2ecd8e2c874f607 100644 >--- a/Source/WebCore/platform/graphics/ca/cocoa/PlatformCALayerCocoa.mm >+++ b/Source/WebCore/platform/graphics/ca/cocoa/PlatformCALayerCocoa.mm >@@ -63,9 +63,7 @@ > #import "ThemeMac.h" > #endif > >-SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >- >-SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVPlayerLayer) >+#import <pal/cocoa/AVFoundationSoftLink.h> > > namespace WebCore { > >@@ -198,12 +196,12 @@ static NSString *toCAFilterType(PlatformCALayer::FilterType type) > > PlatformCALayer::LayerType PlatformCALayerCocoa::layerTypeForPlatformLayer(PlatformLayer* layer) > { >- if ([layer isKindOfClass:getAVPlayerLayerClass()]) >+ if ([layer isKindOfClass:PAL::getAVPlayerLayerClass()]) > return LayerTypeAVPlayerLayer; > > if ([layer isKindOfClass:objc_getClass("WebVideoContainerLayer")] > && layer.sublayers.count == 1 >- && [layer.sublayers[0] isKindOfClass:getAVPlayerLayerClass()]) >+ && [layer.sublayers[0] isKindOfClass:PAL::getAVPlayerLayerClass()]) > return LayerTypeAVPlayerLayer; > > if ([layer isKindOfClass:[WebGLLayer class]]) >@@ -263,7 +261,7 @@ PlatformCALayerCocoa::PlatformCALayerCocoa(LayerType layerType, PlatformCALayerC > layerClass = [WebTiledBackingLayer class]; > break; > case LayerTypeAVPlayerLayer: >- layerClass = getAVPlayerLayerClass(); >+ layerClass = PAL::getAVPlayerLayerClass(); > break; > case LayerTypeContentsProvidedLayer: > // We don't create PlatformCALayerCocoas wrapped around WebGLLayers or WebGPULayers. 
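For reference, the call pattern that the hunks above converge on looks roughly like this. This is a minimal sketch rather than code from the patch: the two wrapper functions are invented for illustration, while PAL::AVFoundationLibrary(), the class getter, and the alloc...Instance() helper are the shared soft-link accessors the call sites actually switch to.

    // Sketch only: the wrapper function names here are hypothetical;
    // the PAL helpers are the ones used in the hunks above.
    #import <AVFoundation/AVFoundation.h>
    #import <pal/cocoa/AVFoundationSoftLink.h>
    #import <wtf/RetainPtr.h>

    static bool sampleBufferDisplayLayerIsAvailable()
    {
        // The framework is resolved lazily; the class getter returns nil
        // when the class cannot be found at runtime.
        return PAL::AVFoundationLibrary() && PAL::getAVSampleBufferDisplayLayerClass();
    }

    static RetainPtr<AVSampleBufferDisplayLayer> createDisplayLayer()
    {
        if (!sampleBufferDisplayLayerIsAvailable())
            return nullptr;
        // allocAVSampleBufferDisplayLayerInstance() stands in for
        // [AVSampleBufferDisplayLayer alloc] so the call site does not
        // hard-link against AVFoundation.
        return adoptNS([PAL::allocAVSampleBufferDisplayLayerInstance() init]);
    }

Because every caller now gets these accessors from the single shared header, each file above can delete its local SOFT_LINK_FRAMEWORK_OPTIONAL / SOFT_LINK_CLASS_OPTIONAL / SOFT_LINK_CONSTANT preamble, which is where most of the removed lines in these hunks come from.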
>@@ -360,7 +358,7 @@ Ref<PlatformCALayer> PlatformCALayerCocoa::clone(PlatformCALayerClient* owner) c > newLayer->updateCustomAppearance(customAppearance()); > > if (type == LayerTypeAVPlayerLayer) { >- ASSERT([newLayer->platformLayer() isKindOfClass:getAVPlayerLayerClass()]); >+ ASSERT([newLayer->platformLayer() isKindOfClass:PAL::getAVPlayerLayerClass()]); > > AVPlayerLayer *destinationPlayerLayer = static_cast<PlatformCALayerCocoa&>(newLayer.get()).avPlayerLayer(); > AVPlayerLayer *sourcePlayerLayer = avPlayerLayer(); >@@ -1271,12 +1269,12 @@ AVPlayerLayer *PlatformCALayerCocoa::avPlayerLayer() const > if (layerType() != LayerTypeAVPlayerLayer) > return nil; > >- if ([platformLayer() isKindOfClass:getAVPlayerLayerClass()]) >+ if ([platformLayer() isKindOfClass:PAL::getAVPlayerLayerClass()]) > return static_cast<AVPlayerLayer *>(platformLayer()); > > if ([platformLayer() isKindOfClass:objc_getClass("WebVideoContainerLayer")]) { > ASSERT([platformLayer() sublayers].count == 1); >- ASSERT([[platformLayer() sublayers][0] isKindOfClass:getAVPlayerLayerClass()]); >+ ASSERT([[platformLayer() sublayers][0] isKindOfClass:PAL::getAVPlayerLayerClass()]); > return static_cast<AVPlayerLayer *>([platformLayer() sublayers][0]); > } > >diff --git a/Source/WebCore/platform/graphics/cocoa/HEVCUtilitiesCocoa.mm b/Source/WebCore/platform/graphics/cocoa/HEVCUtilitiesCocoa.mm >index 31c2d71b584c6c6f2a8b01c85ae7f0bcd0238916..50506676054828891c6c37c0771b60f158f71187 100644 >--- a/Source/WebCore/platform/graphics/cocoa/HEVCUtilitiesCocoa.mm >+++ b/Source/WebCore/platform/graphics/cocoa/HEVCUtilitiesCocoa.mm >@@ -23,19 +23,17 @@ > * THE POSSIBILITY OF SUCH DAMAGE. > */ > >-#include "config.h" >-#include "HEVCUtilitiesCocoa.h" >+#import "config.h" >+#import "HEVCUtilitiesCocoa.h" > > #if PLATFORM(COCOA) > >-#include "FourCC.h" >-#include "HEVCUtilities.h" >-#include "MediaCapabilitiesInfo.h" >+#import "FourCC.h" >+#import "HEVCUtilities.h" >+#import "MediaCapabilitiesInfo.h" > >-#include "VideoToolboxSoftLink.h" >- >-SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >-SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVVideoCodecTypeHEVCWithAlpha, NSString *) >+#import "VideoToolboxSoftLink.h" >+#import <pal/cocoa/AVFoundationSoftLink.h> > > namespace WebCore { > >@@ -43,10 +41,10 @@ bool validateHEVCParameters(HEVCParameterSet& parameters, MediaCapabilitiesInfo& > { > CMVideoCodecType codec = kCMVideoCodecType_HEVC; > if (hasAlphaChannel) { >- if (!AVFoundationLibrary() || !canLoadAVVideoCodecTypeHEVCWithAlpha()) >+ if (!PAL::AVFoundationLibrary() || !PAL::canLoad_AVFoundation_AVVideoCodecTypeHEVCWithAlpha()) > return false; > >- auto codecCode = FourCC::fromString(getAVVideoCodecTypeHEVCWithAlpha()); >+ auto codecCode = FourCC::fromString(AVVideoCodecTypeHEVCWithAlpha); > if (!codecCode) > return false; > >diff --git a/Source/WebCore/platform/ios/PlatformSpeechSynthesizerIOS.mm b/Source/WebCore/platform/ios/PlatformSpeechSynthesizerIOS.mm >index 5a0fe71aee9c4fdaca78c302cda72837bde1b01c..4b378fa17e6ef55dffedc58014d7f3feee625ca0 100644 >--- a/Source/WebCore/platform/ios/PlatformSpeechSynthesizerIOS.mm >+++ b/Source/WebCore/platform/ios/PlatformSpeechSynthesizerIOS.mm >@@ -32,22 +32,36 @@ > #import <AVFoundation/AVSpeechSynthesis.h> > #import <wtf/BlockObjCExceptions.h> > #import <wtf/RetainPtr.h> >-#import <wtf/SoftLinking.h> > >-SOFT_LINK_FRAMEWORK(AVFoundation) >-SOFT_LINK_CLASS(AVFoundation, AVSpeechSynthesizer) >-SOFT_LINK_CLASS(AVFoundation, AVSpeechUtterance) >-SOFT_LINK_CLASS(AVFoundation, AVSpeechSynthesisVoice) 
>+#import <pal/cocoa/AVFoundationSoftLink.h> > >-SOFT_LINK_CONSTANT(AVFoundation, AVSpeechUtteranceDefaultSpeechRate, float) >-SOFT_LINK_CONSTANT(AVFoundation, AVSpeechUtteranceMaximumSpeechRate, float) >+static float getAVSpeechUtteranceDefaultSpeechRate() >+{ >+ static float value; >+ static void* symbol; >+ if (!symbol) { >+ void* symbol = dlsym(PAL::AVFoundationLibrary(), "AVSpeechUtteranceDefaultSpeechRate"); >+ RELEASE_ASSERT_WITH_MESSAGE(symbol, "%s", dlerror()); >+ value = *static_cast<float const *>(symbol); >+ } >+ return value; >+} >+ >+static float getAVSpeechUtteranceMaximumSpeechRate() >+{ >+ static float value; >+ static void* symbol; >+ if (!symbol) { >+ void* symbol = dlsym(PAL::AVFoundationLibrary(), "AVSpeechUtteranceMaximumSpeechRate"); >+ RELEASE_ASSERT_WITH_MESSAGE(symbol, "%s", dlerror()); >+ value = *static_cast<float const *>(symbol); >+ } >+ return value; >+} > > #define AVSpeechUtteranceDefaultSpeechRate getAVSpeechUtteranceDefaultSpeechRate() > #define AVSpeechUtteranceMaximumSpeechRate getAVSpeechUtteranceMaximumSpeechRate() > >-#define AVSpeechUtteranceClass getAVSpeechUtteranceClass() >-#define AVSpeechSynthesisVoiceClass getAVSpeechSynthesisVoiceClass() >- > @interface WebSpeechSynthesisWrapper : NSObject<AVSpeechSynthesizerDelegate> > { > WebCore::PlatformSpeechSynthesizer* m_synthesizerObject; >@@ -96,7 +110,7 @@ SOFT_LINK_CONSTANT(AVFoundation, AVSpeechUtteranceMaximumSpeechRate, float) > > BEGIN_BLOCK_OBJC_EXCEPTIONS > if (!m_synthesizer) { >- m_synthesizer = adoptNS([allocAVSpeechSynthesizerInstance() init]); >+ m_synthesizer = adoptNS([PAL::allocAVSpeechSynthesizerInstance() init]); > [m_synthesizer setDelegate:self]; > } > >@@ -106,7 +120,7 @@ SOFT_LINK_CONSTANT(AVFoundation, AVSpeechUtteranceMaximumSpeechRate, float) > NSString *voiceLanguage = nil; > if (!utteranceVoice) { > if (utterance->lang().isEmpty()) >- voiceLanguage = [AVSpeechSynthesisVoiceClass currentLanguageCode]; >+ voiceLanguage = [PAL::getAVSpeechSynthesisVoiceClass() currentLanguageCode]; > else > voiceLanguage = utterance->lang(); > } else >@@ -114,9 +128,9 @@ SOFT_LINK_CONSTANT(AVFoundation, AVSpeechUtteranceMaximumSpeechRate, float) > > AVSpeechSynthesisVoice *avVoice = nil; > if (voiceLanguage) >- avVoice = [AVSpeechSynthesisVoiceClass voiceWithLanguage:voiceLanguage]; >+ avVoice = [PAL::getAVSpeechSynthesisVoiceClass() voiceWithLanguage:voiceLanguage]; > >- AVSpeechUtterance *avUtterance = [AVSpeechUtteranceClass speechUtteranceWithString:utterance->text()]; >+ AVSpeechUtterance *avUtterance = [PAL::getAVSpeechUtteranceClass() speechUtteranceWithString:utterance->text()]; > > [avUtterance setRate:[self mapSpeechRateToPlatformRate:utterance->rate()]]; > [avUtterance setVolume:utterance->volume()]; >@@ -244,7 +258,7 @@ PlatformSpeechSynthesizer::~PlatformSpeechSynthesizer() > void PlatformSpeechSynthesizer::initializeVoiceList() > { > BEGIN_BLOCK_OBJC_EXCEPTIONS >- for (AVSpeechSynthesisVoice *voice in [AVSpeechSynthesisVoiceClass speechVoices]) { >+ for (AVSpeechSynthesisVoice *voice in [PAL::getAVSpeechSynthesisVoiceClass() speechVoices]) { > NSString *language = [voice language]; > bool isDefault = true; > NSString *voiceURI = [voice identifier]; >diff --git a/Source/WebCore/platform/ios/VideoFullscreenInterfaceAVKit.mm b/Source/WebCore/platform/ios/VideoFullscreenInterfaceAVKit.mm >index 08724eb66c69b48bdfec0d236d720ac8defdc57b..03ce20c90273378f4e862a82edf9d18e6c12442b 100644 >--- a/Source/WebCore/platform/ios/VideoFullscreenInterfaceAVKit.mm >+++ 
b/Source/WebCore/platform/ios/VideoFullscreenInterfaceAVKit.mm >@@ -43,7 +43,6 @@ > #import <UIKit/UIWindow.h> > #import <objc/message.h> > #import <objc/runtime.h> >-#import <pal/ios/UIKitSoftLink.h> > #import <pal/spi/cocoa/AVKitSPI.h> > #import <pal/spi/ios/UIKitSPI.h> > #import <wtf/RetainPtr.h> >@@ -52,14 +51,9 @@ > > using namespace WebCore; > >-// Soft-linking headers must be included last since they #define functions, constants, etc. > #import <pal/cf/CoreMediaSoftLink.h> >- >-SOFT_LINK_FRAMEWORK(AVFoundation) >-SOFT_LINK_CLASS(AVFoundation, AVPlayerLayer) >-SOFT_LINK_CONSTANT(AVFoundation, AVLayerVideoGravityResize, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVLayerVideoGravityResizeAspect, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVLayerVideoGravityResizeAspectFill, NSString *) >+#import <pal/cocoa/AVFoundationSoftLink.h> >+#import <pal/ios/UIKitSoftLink.h> > > SOFTLINK_AVKIT_FRAMEWORK() > SOFT_LINK_CLASS_OPTIONAL(AVKit, AVPictureInPictureController) >@@ -210,7 +204,7 @@ static VideoFullscreenInterfaceAVKit::ExitFullScreenReason convertToExitFullScre > self = [super init]; > if (self) { > [self setMasksToBounds:YES]; >- _videoGravity = getAVLayerVideoGravityResizeAspect(); >+ _videoGravity = AVLayerVideoGravityResizeAspect; > } > return self; > } >@@ -270,13 +264,13 @@ static VideoFullscreenInterfaceAVKit::ExitFullScreenReason convertToExitFullScre > FloatRect targetVideoFrame; > float videoAspectRatio = self.videoDimensions.width / self.videoDimensions.height; > >- if ([getAVLayerVideoGravityResize() isEqualToString:self.videoGravity]) { >+ if ([AVLayerVideoGravityResize isEqualToString:self.videoGravity]) { > sourceVideoFrame = self.modelVideoLayerFrame; > targetVideoFrame = self.bounds; >- } else if ([getAVLayerVideoGravityResizeAspect() isEqualToString:self.videoGravity]) { >+ } else if ([AVLayerVideoGravityResizeAspect isEqualToString:self.videoGravity]) { > sourceVideoFrame = largestRectWithAspectRatioInsideRect(videoAspectRatio, self.modelVideoLayerFrame); > targetVideoFrame = largestRectWithAspectRatioInsideRect(videoAspectRatio, self.bounds); >- } else if ([getAVLayerVideoGravityResizeAspectFill() isEqualToString:self.videoGravity]) { >+ } else if ([AVLayerVideoGravityResizeAspectFill isEqualToString:self.videoGravity]) { > sourceVideoFrame = smallestRectWithAspectRatioAroundRect(videoAspectRatio, self.modelVideoLayerFrame); > self.modelVideoLayerFrame = CGRectMake(0, 0, sourceVideoFrame.width(), sourceVideoFrame.height()); > if (auto* model = _fullscreenInterface->videoFullscreenModel()) >@@ -328,7 +322,7 @@ static VideoFullscreenInterfaceAVKit::ExitFullScreenReason convertToExitFullScre > #if PLATFORM(IOSMAC) > // FIXME<rdar://46011230>: remove this #if once this radar lands. 
> if (!videoGravity) >- videoGravity = getAVLayerVideoGravityResizeAspect(); >+ videoGravity = AVLayerVideoGravityResizeAspect; > #endif > > _videoGravity = videoGravity; >@@ -337,11 +331,11 @@ static VideoFullscreenInterfaceAVKit::ExitFullScreenReason convertToExitFullScre > return; > > WebCore::MediaPlayerEnums::VideoGravity gravity = WebCore::MediaPlayerEnums::VideoGravityResizeAspect; >- if (videoGravity == getAVLayerVideoGravityResize()) >+ if (videoGravity == AVLayerVideoGravityResize) > gravity = WebCore::MediaPlayerEnums::VideoGravityResize; >- if (videoGravity == getAVLayerVideoGravityResizeAspect()) >+ if (videoGravity == AVLayerVideoGravityResizeAspect) > gravity = WebCore::MediaPlayerEnums::VideoGravityResizeAspect; >- else if (videoGravity == getAVLayerVideoGravityResizeAspectFill()) >+ else if (videoGravity == AVLayerVideoGravityResizeAspectFill) > gravity = WebCore::MediaPlayerEnums::VideoGravityResizeAspectFill; > else > ASSERT_NOT_REACHED(); >@@ -362,9 +356,9 @@ static VideoFullscreenInterfaceAVKit::ExitFullScreenReason convertToExitFullScre > > float videoAspectRatio = self.videoDimensions.width / self.videoDimensions.height; > >- if ([getAVLayerVideoGravityResizeAspect() isEqualToString:self.videoGravity]) >+ if ([AVLayerVideoGravityResizeAspect isEqualToString:self.videoGravity]) > return largestRectWithAspectRatioInsideRect(videoAspectRatio, self.bounds); >- if ([getAVLayerVideoGravityResizeAspectFill() isEqualToString:self.videoGravity]) >+ if ([AVLayerVideoGravityResizeAspectFill isEqualToString:self.videoGravity]) > return smallestRectWithAspectRatioAroundRect(videoAspectRatio, self.bounds); > > return self.bounds; >@@ -460,7 +454,7 @@ static void WebAVPlayerLayerView_startRoutingVideoToPictureInPicturePlayerLayerV > > WebAVPlayerLayer *playerLayer = (WebAVPlayerLayer *)[playerLayerView playerLayer]; > WebAVPlayerLayer *pipPlayerLayer = (WebAVPlayerLayer *)[pipView layer]; >- [playerLayer setVideoGravity:getAVLayerVideoGravityResizeAspect()]; >+ [playerLayer setVideoGravity:AVLayerVideoGravityResizeAspect]; > [pipPlayerLayer setVideoSublayer:playerLayer.videoSublayer]; > [pipPlayerLayer setVideoDimensions:playerLayer.videoDimensions]; > [pipPlayerLayer setVideoGravity:playerLayer.videoGravity]; >diff --git a/Source/WebCore/platform/mac/SerializedPlatformRepresentationMac.mm b/Source/WebCore/platform/mac/SerializedPlatformRepresentationMac.mm >index e997b36a7b4a50a1633ab0aeaaf6aa1b78f5db65..c2953ca03e805cfdc3050cf60a35d84cffc95879 100644 >--- a/Source/WebCore/platform/mac/SerializedPlatformRepresentationMac.mm >+++ b/Source/WebCore/platform/mac/SerializedPlatformRepresentationMac.mm >@@ -38,14 +38,9 @@ > #import <JavaScriptCore/JSObjectRef.h> > #import <JavaScriptCore/JavaScriptCore.h> > #import <objc/runtime.h> >-#import <wtf/SoftLinking.h> > #import <wtf/text/Base64.h> > >-typedef AVMetadataItem AVMetadataItemType; >-SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >-SOFT_LINK_CLASS(AVFoundation, AVMetadataItem) >-#define AVMetadataItem getAVMetadataItemClass() >- >+#import <pal/cocoa/AVFoundationSoftLink.h> > > namespace WebCore { > >@@ -53,7 +48,7 @@ namespace WebCore { > static JSValue *jsValueWithDataInContext(NSData *, JSContext *); > static JSValue *jsValueWithArrayInContext(NSArray *, JSContext *); > static JSValue *jsValueWithDictionaryInContext(NSDictionary *, JSContext *); >-static JSValue *jsValueWithAVMetadataItemInContext(AVMetadataItemType *, JSContext *); >+static JSValue *jsValueWithAVMetadataItemInContext(AVMetadataItem *, JSContext *); > static JSValue 
*jsValueWithValueInContext(id, JSContext *); > #endif > >@@ -136,7 +131,7 @@ static JSValue *jsValueWithValueInContext(id value, JSContext *context) > if ([value isKindOfClass:[NSData class]]) > return jsValueWithDataInContext(value, context); > >- if ([value isKindOfClass:[AVMetadataItem class]]) >+ if ([value isKindOfClass:PAL::getAVMetadataItemClass()]) > return jsValueWithAVMetadataItemInContext(value, context); > > return nil; >@@ -199,7 +194,7 @@ static JSValue *jsValueWithDictionaryInContext(NSDictionary *dictionary, JSConte > return result; > } > >-static JSValue *jsValueWithAVMetadataItemInContext(AVMetadataItemType *item, JSContext *context) >+static JSValue *jsValueWithAVMetadataItemInContext(AVMetadataItem *item, JSContext *context) > { > NSMutableDictionary *dictionary = [NSMutableDictionary dictionary]; > >diff --git a/Source/WebCore/platform/mediarecorder/cocoa/MediaRecorderPrivateWriterCocoa.mm b/Source/WebCore/platform/mediarecorder/cocoa/MediaRecorderPrivateWriterCocoa.mm >index 2b784a4deb6a198bd583006e539dff54edbb1f1a..75d46567869841dc04cd9dad39af9158bd717284 100644 >--- a/Source/WebCore/platform/mediarecorder/cocoa/MediaRecorderPrivateWriterCocoa.mm >+++ b/Source/WebCore/platform/mediarecorder/cocoa/MediaRecorderPrivateWriterCocoa.mm >@@ -37,52 +37,15 @@ > #include <pal/cf/CoreMediaSoftLink.h> > #include <wtf/FileSystem.h> > >-typedef AVAssetWriter AVAssetWriterType; >-typedef AVAssetWriterInput AVAssetWriterInputType; >- >-SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >- >-SOFT_LINK_CLASS(AVFoundation, AVAssetWriter) >-SOFT_LINK_CLASS(AVFoundation, AVAssetWriterInput) >- >-SOFT_LINK_CONSTANT(AVFoundation, AVFileTypeMPEG4, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVVideoCodecKey, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVVideoCodecH264, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVVideoWidthKey, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVVideoHeightKey, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeVideo, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeAudio, NSString *) >- >-SOFT_LINK_CONSTANT(AVFoundation, AVVideoExpectedSourceFrameRateKey, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVVideoProfileLevelKey, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVVideoAverageBitRateKey, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVVideoMaxKeyFrameIntervalKey, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVVideoProfileLevelH264MainAutoLevel, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVVideoCompressionPropertiesKey, NSString *) >- >-#define AVFileTypeMPEG4 getAVFileTypeMPEG4() >-#define AVMediaTypeAudio getAVMediaTypeAudio() >-#define AVMediaTypeVideo getAVMediaTypeVideo() >-#define AVVideoCodecKey getAVVideoCodecKey() >-#define AVVideoCodecH264 getAVVideoCodecH264() >-#define AVVideoWidthKey getAVVideoWidthKey() >-#define AVVideoHeightKey getAVVideoHeightKey() >- >-#define AVVideoExpectedSourceFrameRateKey getAVVideoExpectedSourceFrameRateKey() >-#define AVVideoProfileLevelKey getAVVideoProfileLevelKey() >-#define AVVideoAverageBitRateKey getAVVideoAverageBitRateKey() >-#define AVVideoMaxKeyFrameIntervalKey getAVVideoMaxKeyFrameIntervalKey() >-#define AVVideoProfileLevelH264MainAutoLevel getAVVideoProfileLevelH264MainAutoLevel() >-#define AVVideoCompressionPropertiesKey getAVVideoCompressionPropertiesKey() >- >-SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVEncoderBitRateKey, NSString *) >-SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVFormatIDKey, NSString *) >-SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVNumberOfChannelsKey, 
NSString *) >-SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVSampleRateKey, NSString *) >+#import <pal/cocoa/AVFoundationSoftLink.h> > >+#undef AVEncoderBitRateKey > #define AVEncoderBitRateKey getAVEncoderBitRateKeyWithFallback() >+#undef AVFormatIDKey > #define AVFormatIDKey getAVFormatIDKeyWithFallback() >+#undef AVNumberOfChannelsKey > #define AVNumberOfChannelsKey getAVNumberOfChannelsKeyWithFallback() >+#undef AVSampleRateKey > #define AVSampleRateKey getAVSampleRateKeyWithFallback() > > namespace WebCore { >@@ -91,8 +54,8 @@ using namespace PAL; > > static NSString *getAVFormatIDKeyWithFallback() > { >- if (canLoadAVFormatIDKey()) >- return getAVFormatIDKey(); >+ if (PAL::canLoad_AVFoundation_AVFormatIDKey()) >+ return PAL::get_AVFoundation_AVFormatIDKey(); > > RELEASE_LOG_ERROR(Media, "Failed to load AVFormatIDKey"); > return @"AVFormatIDKey"; >@@ -100,8 +63,8 @@ static NSString *getAVFormatIDKeyWithFallback() > > static NSString *getAVNumberOfChannelsKeyWithFallback() > { >- if (canLoadAVNumberOfChannelsKey()) >- return getAVNumberOfChannelsKey(); >+ if (PAL::canLoad_AVFoundation_AVNumberOfChannelsKey()) >+ return PAL::get_AVFoundation_AVNumberOfChannelsKey(); > > RELEASE_LOG_ERROR(Media, "Failed to load AVNumberOfChannelsKey"); > return @"AVNumberOfChannelsKey"; >@@ -109,8 +72,8 @@ static NSString *getAVNumberOfChannelsKeyWithFallback() > > static NSString *getAVSampleRateKeyWithFallback() > { >- if (canLoadAVSampleRateKey()) >- return getAVSampleRateKey(); >+ if (PAL::canLoad_AVFoundation_AVSampleRateKey()) >+ return PAL::get_AVFoundation_AVSampleRateKey(); > > RELEASE_LOG_ERROR(Media, "Failed to load AVSampleRateKey"); > return @"AVSampleRateKey"; >@@ -118,8 +81,8 @@ static NSString *getAVSampleRateKeyWithFallback() > > static NSString *getAVEncoderBitRateKeyWithFallback() > { >- if (canLoadAVEncoderBitRateKey()) >- return getAVEncoderBitRateKey(); >+ if (PAL::canLoad_AVFoundation_AVEncoderBitRateKey()) >+ return PAL::get_AVFoundation_AVEncoderBitRateKey(); > > RELEASE_LOG_ERROR(Media, "Failed to load AVEncoderBitRateKey"); > return @"AVEncoderBitRateKey"; >@@ -134,7 +97,7 @@ RefPtr<MediaRecorderPrivateWriter> MediaRecorderPrivateWriter::create(const Medi > NSURL *outputURL = [NSURL fileURLWithPath:path]; > String filePath = [path UTF8String]; > NSError *error = nil; >- auto avAssetWriter = adoptNS([allocAVAssetWriterInstance() initWithURL:outputURL fileType:AVFileTypeMPEG4 error:&error]); >+ auto avAssetWriter = adoptNS([PAL::allocAVAssetWriterInstance() initWithURL:outputURL fileType:AVFileTypeMPEG4 error:&error]); > if (error) { > RELEASE_LOG_ERROR(MediaStream, "create AVAssetWriter instance failed with error code %ld", (long)error.code); > return nullptr; >@@ -197,7 +160,7 @@ bool MediaRecorderPrivateWriter::setVideoInput(int width, int height) > AVVideoCompressionPropertiesKey: compressionProperties > }; > >- m_videoInput = adoptNS([allocAVAssetWriterInputInstance() initWithMediaType:AVMediaTypeVideo outputSettings:videoSettings sourceFormatHint:nil]); >+ m_videoInput = adoptNS([PAL::allocAVAssetWriterInputInstance() initWithMediaType:AVMediaTypeVideo outputSettings:videoSettings sourceFormatHint:nil]); > [m_videoInput setExpectsMediaDataInRealTime:true]; > > if (![m_writer canAddInput:m_videoInput.get()]) { >@@ -221,7 +184,7 @@ bool MediaRecorderPrivateWriter::setAudioInput() > AVSampleRateKey : @(22050) > }; > >- m_audioInput = adoptNS([allocAVAssetWriterInputInstance() initWithMediaType:AVMediaTypeAudio outputSettings:audioSettings sourceFormatHint:nil]); >+ m_audioInput = 
adoptNS([PAL::allocAVAssetWriterInputInstance() initWithMediaType:AVMediaTypeAudio outputSettings:audioSettings sourceFormatHint:nil]); > [m_audioInput setExpectsMediaDataInRealTime:true]; > > if (![m_writer canAddInput:m_audioInput.get()]) { >diff --git a/Source/WebCore/platform/mediastream/RealtimeVideoSource.h b/Source/WebCore/platform/mediastream/RealtimeVideoSource.h >index ac34400d5240a1036ef420af136956e0fd26fec1..99bacbe242399ba46459c3bfda8c9a0daff18ea0 100644 >--- a/Source/WebCore/platform/mediastream/RealtimeVideoSource.h >+++ b/Source/WebCore/platform/mediastream/RealtimeVideoSource.h >@@ -27,7 +27,6 @@ > > #if ENABLE(MEDIA_STREAM) > >-#include "FontCascade.h" > #include "ImageBuffer.h" > #include "MediaSample.h" > #include "RealtimeMediaSource.h" >diff --git a/Source/WebCore/platform/mediastream/VideoPreset.h b/Source/WebCore/platform/mediastream/VideoPreset.h >index f99ff0a38c744f7613d79ad9a5ae243da019fe2c..2033c76db65fa39e6adca39dd82c4d092c7586ff 100644 >--- a/Source/WebCore/platform/mediastream/VideoPreset.h >+++ b/Source/WebCore/platform/mediastream/VideoPreset.h >@@ -27,7 +27,6 @@ > > #if ENABLE(MEDIA_STREAM) > >-#include "FontCascade.h" > #include "ImageBuffer.h" > #include "MediaSample.h" > #include "RealtimeMediaSource.h" >diff --git a/Source/WebCore/platform/mediastream/ios/AVAudioSessionCaptureDeviceManager.mm b/Source/WebCore/platform/mediastream/ios/AVAudioSessionCaptureDeviceManager.mm >index 0406d6c9b531d5297a8938b274fb38a603f09440..46050d93834f3a937fee3755737deaa4e2c9c0a5 100644 >--- a/Source/WebCore/platform/mediastream/ios/AVAudioSessionCaptureDeviceManager.mm >+++ b/Source/WebCore/platform/mediastream/ios/AVAudioSessionCaptureDeviceManager.mm >@@ -23,20 +23,17 @@ > * THE POSSIBILITY OF SUCH DAMAGE. > */ > >-#include "config.h" >-#include "AVAudioSessionCaptureDeviceManager.h" >+#import "config.h" >+#import "AVAudioSessionCaptureDeviceManager.h" > > #if ENABLE(MEDIA_STREAM) && PLATFORM(IOS_FAMILY) > >-#include "AVAudioSessionCaptureDevice.h" >-#include "RealtimeMediaSourceCenter.h" >-#include <AVFoundation/AVAudioSession.h> >-#include <wtf/SoftLinking.h> >-#include <wtf/Vector.h> >+#import "AVAudioSessionCaptureDevice.h" >+#import "RealtimeMediaSourceCenter.h" >+#import <AVFoundation/AVAudioSession.h> >+#import <wtf/Vector.h> > >-SOFT_LINK_FRAMEWORK(AVFoundation) >-SOFT_LINK_CLASS(AVFoundation, AVAudioSession) >-#define AVAudioSession getAVAudioSessionClass() >+#import <pal/cocoa/AVFoundationSoftLink.h> > > void* AvailableInputsContext = &AvailableInputsContext; > >@@ -127,13 +124,13 @@ void AVAudioSessionCaptureDeviceManager::refreshAudioCaptureDevices() > m_listener = adoptNS([[WebAVAudioSessionAvailableInputsListener alloc] initWithCallback:[this] { > refreshAudioCaptureDevices(); > }]); >- [[AVAudioSession sharedInstance] addObserver:m_listener.get() forKeyPath:@"availableInputs" options:0 context:AvailableInputsContext]; >+ [[PAL::getAVAudioSessionClass() sharedInstance] addObserver:m_listener.get() forKeyPath:@"availableInputs" options:0 context:AvailableInputsContext]; > } > > Vector<AVAudioSessionCaptureDevice> newAudioDevices; > Vector<CaptureDevice> newDevices; > >- for (AVAudioSessionPortDescription *portDescription in [AVAudioSession sharedInstance].availableInputs) { >+ for (AVAudioSessionPortDescription *portDescription in [PAL::getAVAudioSessionClass() sharedInstance].availableInputs) { > auto audioDevice = AVAudioSessionCaptureDevice::create(portDescription); > newDevices.append(audioDevice); > newAudioDevices.append(WTFMove(audioDevice)); >diff 
--git a/Source/WebCore/platform/mediastream/ios/CoreAudioCaptureSourceIOS.mm b/Source/WebCore/platform/mediastream/ios/CoreAudioCaptureSourceIOS.mm >index 2ed832427412244f2d6f38604c2093a41c543a55..6fb3dad39c2d9e978dc521b7bb83f699a93253ff 100644 >--- a/Source/WebCore/platform/mediastream/ios/CoreAudioCaptureSourceIOS.mm >+++ b/Source/WebCore/platform/mediastream/ios/CoreAudioCaptureSourceIOS.mm >@@ -31,21 +31,8 @@ > #import "Logging.h" > #import <AVFoundation/AVAudioSession.h> > #import <wtf/MainThread.h> >-#import <wtf/SoftLinking.h> > >-typedef AVAudioSession AVAudioSessionType; >- >-SOFT_LINK_FRAMEWORK(AVFoundation) >-SOFT_LINK_CLASS(AVFoundation, AVAudioSession) >- >-SOFT_LINK_CONSTANT(AVFoundation, AVAudioSessionInterruptionNotification, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVAudioSessionInterruptionTypeKey, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVAudioSessionMediaServicesWereResetNotification, NSString *) >- >-#define AVAudioSession getAVAudioSessionClass() >-#define AVAudioSessionInterruptionNotification getAVAudioSessionInterruptionNotification() >-#define AVAudioSessionInterruptionTypeKey getAVAudioSessionInterruptionTypeKey() >-#define AVAudioSessionMediaServicesWereResetNotification getAVAudioSessionMediaServicesWereResetNotification() >+#import <pal/cocoa/AVFoundationSoftLink.h> > > using namespace WebCore; > >@@ -68,7 +55,7 @@ using namespace WebCore; > _callback = callback; > > NSNotificationCenter* center = [NSNotificationCenter defaultCenter]; >- AVAudioSessionType* session = [AVAudioSession sharedInstance]; >+ AVAudioSession* session = [PAL::getAVAudioSessionClass() sharedInstance]; > > [center addObserver:self selector:@selector(handleInterruption:) name:AVAudioSessionInterruptionNotification object:session]; > [center addObserver:self selector:@selector(sessionMediaServicesWereReset:) name:AVAudioSessionMediaServicesWereResetNotification object:session]; >@@ -95,7 +82,7 @@ using namespace WebCore; > > if ([[notification.userInfo valueForKey:AVAudioSessionInterruptionTypeKey] intValue] == AVAudioSessionInterruptionTypeEnded) { > NSError *error = nil; >- [[AVAudioSession sharedInstance] setActive:YES error:&error]; >+ [[PAL::getAVAudioSessionClass() sharedInstance] setActive:YES error:&error]; > > #if !LOG_DISABLED > if (error) >diff --git a/Source/WebCore/platform/mediastream/mac/AVCaptureDeviceManager.mm b/Source/WebCore/platform/mediastream/mac/AVCaptureDeviceManager.mm >index 10626aaef8aa0233213a6f53f9aab997a7055332..a9ea3e1b3ecaa3f93916e6d157462d2313e7cd52 100644 >--- a/Source/WebCore/platform/mediastream/mac/AVCaptureDeviceManager.mm >+++ b/Source/WebCore/platform/mediastream/mac/AVCaptureDeviceManager.mm >@@ -41,27 +41,8 @@ > #import <objc/runtime.h> > #import <wtf/MainThread.h> > #import <wtf/NeverDestroyed.h> >-#import <wtf/SoftLinking.h> > >-typedef AVCaptureDevice AVCaptureDeviceTypedef; >-typedef AVCaptureSession AVCaptureSessionType; >- >-SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >- >-SOFT_LINK_CLASS(AVFoundation, AVCaptureDevice) >-SOFT_LINK_CLASS(AVFoundation, AVCaptureSession) >- >-SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeAudio, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeMuxed, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeVideo, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVCaptureDeviceWasConnectedNotification, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVCaptureDeviceWasDisconnectedNotification, NSString *) >- >-#define AVMediaTypeAudio getAVMediaTypeAudio() >-#define AVMediaTypeMuxed 
getAVMediaTypeMuxed() >-#define AVMediaTypeVideo getAVMediaTypeVideo() >-#define AVCaptureDeviceWasConnectedNotification getAVCaptureDeviceWasConnectedNotification() >-#define AVCaptureDeviceWasDisconnectedNotification getAVCaptureDeviceWasDisconnectedNotification() >+#import <pal/cocoa/AVFoundationSoftLink.h> > > using namespace WebCore; > >@@ -99,7 +80,7 @@ const Vector<CaptureDevice>& AVCaptureDeviceManager::captureDevices() > return captureDevicesInternal(); > } > >-inline static bool deviceIsAvailable(AVCaptureDeviceTypedef *device) >+inline static bool deviceIsAvailable(AVCaptureDevice *device) > { > if (![device isConnected]) > return false; >@@ -114,20 +95,20 @@ inline static bool deviceIsAvailable(AVCaptureDeviceTypedef *device) > > void AVCaptureDeviceManager::updateCachedAVCaptureDevices() > { >- auto* currentDevices = [getAVCaptureDeviceClass() devices]; >+ auto* currentDevices = [PAL::getAVCaptureDeviceClass() devices]; > auto changedDevices = adoptNS([[NSMutableArray alloc] init]); >- for (AVCaptureDeviceTypedef *cachedDevice in m_avCaptureDevices.get()) { >+ for (AVCaptureDevice *cachedDevice in m_avCaptureDevices.get()) { > if (![currentDevices containsObject:cachedDevice]) > [changedDevices addObject:cachedDevice]; > } > > if ([changedDevices count]) { >- for (AVCaptureDeviceTypedef *device in changedDevices.get()) >+ for (AVCaptureDevice *device in changedDevices.get()) > [device removeObserver:m_objcObserver.get() forKeyPath:@"suspended"]; > [m_avCaptureDevices removeObjectsInArray:changedDevices.get()]; > } > >- for (AVCaptureDeviceTypedef *device in currentDevices) { >+ for (AVCaptureDevice *device in currentDevices) { > > if (![device hasMediaType:AVMediaTypeVideo] && ![device hasMediaType:AVMediaTypeMuxed]) > continue; >@@ -151,9 +132,9 @@ void AVCaptureDeviceManager::refreshCaptureDevices() > updateCachedAVCaptureDevices(); > > bool deviceHasChanged = false; >- auto* currentDevices = [getAVCaptureDeviceClass() devices]; >+ auto* currentDevices = [PAL::getAVCaptureDeviceClass() devices]; > Vector<CaptureDevice> deviceList; >- for (AVCaptureDeviceTypedef *platformDevice in currentDevices) { >+ for (AVCaptureDevice *platformDevice in currentDevices) { > > if (![platformDevice hasMediaType:AVMediaTypeVideo] && ![platformDevice hasMediaType:AVMediaTypeMuxed]) > continue; >@@ -179,7 +160,7 @@ void AVCaptureDeviceManager::refreshCaptureDevices() > > bool AVCaptureDeviceManager::isAvailable() > { >- return AVFoundationLibrary(); >+ return PAL::AVFoundationLibrary(); > } > > AVCaptureDeviceManager& AVCaptureDeviceManager::singleton() >@@ -197,7 +178,7 @@ AVCaptureDeviceManager::~AVCaptureDeviceManager() > { > [[NSNotificationCenter defaultCenter] removeObserver:m_objcObserver.get()]; > [m_objcObserver disconnect]; >- for (AVCaptureDeviceTypedef *device in m_avCaptureDevices.get()) >+ for (AVCaptureDevice *device in m_avCaptureDevices.get()) > [device removeObserver:m_objcObserver.get() forKeyPath:@"suspended"]; > } > >diff --git a/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm b/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm >index 7f3ec82b3a8f3cfab79d84fe52e4a23c8d40c8c8..85144adf4590a6d8c42fe05b813fc51fa8b3cb3b 100644 >--- a/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm >+++ b/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm >@@ -45,55 +45,9 @@ > #import <AVFoundation/AVError.h> > #import <objc/runtime.h> > >-#import <pal/cf/CoreMediaSoftLink.h> > #import "CoreVideoSoftLink.h" >- >-typedef 
AVCaptureConnection AVCaptureConnectionType; >-typedef AVCaptureDevice AVCaptureDeviceTypedef; >-typedef AVCaptureDeviceFormat AVCaptureDeviceFormatType; >-typedef AVCaptureDeviceInput AVCaptureDeviceInputType; >-typedef AVCaptureOutput AVCaptureOutputType; >-typedef AVCaptureVideoDataOutput AVCaptureVideoDataOutputType; >-typedef AVFrameRateRange AVFrameRateRangeType; >-typedef AVCaptureSession AVCaptureSessionType; >- >-SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >- >-SOFT_LINK_CLASS(AVFoundation, AVCaptureConnection) >-SOFT_LINK_CLASS(AVFoundation, AVCaptureDevice) >-SOFT_LINK_CLASS(AVFoundation, AVCaptureDeviceFormat) >-SOFT_LINK_CLASS(AVFoundation, AVCaptureDeviceInput) >-SOFT_LINK_CLASS(AVFoundation, AVCaptureOutput) >-SOFT_LINK_CLASS(AVFoundation, AVCaptureVideoDataOutput) >-SOFT_LINK_CLASS(AVFoundation, AVFrameRateRange) >-SOFT_LINK_CLASS(AVFoundation, AVCaptureSession) >- >-#define AVCaptureConnection getAVCaptureConnectionClass() >-#define AVCaptureDevice getAVCaptureDeviceClass() >-#define AVCaptureDeviceFormat getAVCaptureDeviceFormatClass() >-#define AVCaptureDeviceInput getAVCaptureDeviceInputClass() >-#define AVCaptureOutput getAVCaptureOutputClass() >-#define AVCaptureVideoDataOutput getAVCaptureVideoDataOutputClass() >-#define AVFrameRateRange getAVFrameRateRangeClass() >- >-SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeVideo, NSString *) >- >-SOFT_LINK_CONSTANT(AVFoundation, AVCaptureDeviceWasDisconnectedNotification, NSString *) >-#define AVCaptureDeviceWasDisconnectedNotification getAVCaptureDeviceWasDisconnectedNotification() >- >-#if PLATFORM(IOS_FAMILY) >-SOFT_LINK_POINTER_OPTIONAL(AVFoundation, AVCaptureSessionRuntimeErrorNotification, NSString *) >-SOFT_LINK_POINTER_OPTIONAL(AVFoundation, AVCaptureSessionWasInterruptedNotification, NSString *) >-SOFT_LINK_POINTER_OPTIONAL(AVFoundation, AVCaptureSessionInterruptionEndedNotification, NSString *) >-SOFT_LINK_POINTER_OPTIONAL(AVFoundation, AVCaptureSessionInterruptionReasonKey, NSString *) >-SOFT_LINK_POINTER_OPTIONAL(AVFoundation, AVCaptureSessionErrorKey, NSString *) >- >-#define AVCaptureSessionRuntimeErrorNotification getAVCaptureSessionRuntimeErrorNotification() >-#define AVCaptureSessionWasInterruptedNotification getAVCaptureSessionWasInterruptedNotification() >-#define AVCaptureSessionInterruptionEndedNotification getAVCaptureSessionInterruptionEndedNotification() >-#define AVCaptureSessionInterruptionReasonKey getAVCaptureSessionInterruptionReasonKey() >-#define AVCaptureSessionErrorKey getAVCaptureSessionErrorKey() >-#endif >+#import <pal/cocoa/AVFoundationSoftLink.h> >+#import <pal/cf/CoreMediaSoftLink.h> > > using namespace WebCore; > using namespace PAL; >@@ -106,7 +60,7 @@ using namespace PAL; > -(void)disconnect; > -(void)addNotificationObservers; > -(void)removeNotificationObservers; >--(void)captureOutput:(AVCaptureOutputType*)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnectionType*)connection; >+-(void)captureOutput:(AVCaptureOutput*)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection*)connection; > -(void)observeValueForKeyPath:keyPath ofObject:(id)object change:(NSDictionary*)change context:(void*)context; > #if PLATFORM(IOS_FAMILY) > -(void)sessionRuntimeError:(NSNotification*)notification; >@@ -140,23 +94,23 @@ static dispatch_queue_t globaVideoCaptureSerialQueue() > > class AVVideoPreset : public VideoPreset { > public: >- static Ref<AVVideoPreset> create(IntSize size, Vector<FrameRateRange>&& 
frameRateRanges, AVCaptureDeviceFormatType* format) >+ static Ref<AVVideoPreset> create(IntSize size, Vector<FrameRateRange>&& frameRateRanges, AVCaptureDeviceFormat* format) > { > return adoptRef(*new AVVideoPreset(size, WTFMove(frameRateRanges), format)); > } > >- AVVideoPreset(IntSize size, Vector<FrameRateRange>&& frameRateRanges, AVCaptureDeviceFormatType* format) >+ AVVideoPreset(IntSize size, Vector<FrameRateRange>&& frameRateRanges, AVCaptureDeviceFormat* format) > : VideoPreset(size, WTFMove(frameRateRanges), AVCapture) > , format(format) > { > } > >- RetainPtr<AVCaptureDeviceFormatType> format; >+ RetainPtr<AVCaptureDeviceFormat> format; > }; > > CaptureSourceOrError AVVideoCaptureSource::create(String&& id, String&& hashSalt, const MediaConstraints* constraints) > { >- AVCaptureDeviceTypedef *device = [getAVCaptureDeviceClass() deviceWithUniqueID:id]; >+ AVCaptureDevice *device = [PAL::getAVCaptureDeviceClass() deviceWithUniqueID:id]; > if (!device) > return { }; > >@@ -170,7 +124,7 @@ CaptureSourceOrError AVVideoCaptureSource::create(String&& id, String&& hashSalt > return CaptureSourceOrError(WTFMove(source)); > } > >-AVVideoCaptureSource::AVVideoCaptureSource(AVCaptureDeviceTypedef* device, String&& id, String&& hashSalt) >+AVVideoCaptureSource::AVVideoCaptureSource(AVCaptureDevice* device, String&& id, String&& hashSalt) > : RealtimeVideoSource(device.localizedName, WTFMove(id), WTFMove(hashSalt)) > , m_objcObserver(adoptNS([[WebCoreAVVideoCaptureSourceObserver alloc] initWithCallback:this])) > , m_device(device) >@@ -305,7 +259,7 @@ const RealtimeMediaSourceCapabilities& AVVideoCaptureSource::capabilities() > RealtimeMediaSourceCapabilities capabilities(settings().supportedConstraints()); > capabilities.setDeviceId(hashedId()); > >- AVCaptureDeviceTypedef *videoDevice = device(); >+ AVCaptureDevice *videoDevice = device(); > if ([videoDevice position] == AVCaptureDevicePositionFront) > capabilities.addFacingMode(RealtimeMediaSourceSettings::User); > if ([videoDevice position] == AVCaptureDevicePositionBack) >@@ -421,9 +375,9 @@ static inline int sensorOrientation(AVCaptureVideoOrientation videoOrientation) > #endif > } > >-static inline int sensorOrientationFromVideoOutput(AVCaptureVideoDataOutputType* videoOutput) >+static inline int sensorOrientationFromVideoOutput(AVCaptureVideoDataOutput* videoOutput) > { >- AVCaptureConnectionType* connection = [videoOutput connectionWithMediaType: getAVMediaTypeVideo()]; >+ AVCaptureConnection* connection = [videoOutput connectionWithMediaType:AVMediaTypeVideo]; > return connection ? 
sensorOrientation([connection videoOrientation]) : 0; > } > >@@ -434,7 +388,7 @@ bool AVVideoCaptureSource::setupSession() > > ALWAYS_LOG_IF(loggerPtr(), LOGIDENTIFIER); > >- m_session = adoptNS([allocAVCaptureSessionInstance() init]); >+ m_session = adoptNS([PAL::allocAVCaptureSessionInstance() init]); > [m_session addObserver:m_objcObserver.get() forKeyPath:@"running" options:NSKeyValueObservingOptionNew context:(void *)nil]; > > [m_session beginConfiguration]; >@@ -447,10 +401,10 @@ bool AVVideoCaptureSource::setupSession() > return success; > } > >-AVFrameRateRangeType* AVVideoCaptureSource::frameDurationForFrameRate(double rate) >+AVFrameRateRange* AVVideoCaptureSource::frameDurationForFrameRate(double rate) > { >- AVFrameRateRangeType *bestFrameRateRange = nil; >- for (AVFrameRateRangeType *frameRateRange in [[device() activeFormat] videoSupportedFrameRateRanges]) { >+ AVFrameRateRange *bestFrameRateRange = nil; >+ for (AVFrameRateRange *frameRateRange in [[device() activeFormat] videoSupportedFrameRateRanges]) { > if (frameRateRangeIncludesRate({ [frameRateRange minFrameRate], [frameRateRange maxFrameRate] }, rate)) { > if (!bestFrameRateRange || CMTIME_COMPARE_INLINE([frameRateRange minFrameDuration], >, [bestFrameRateRange minFrameDuration])) > bestFrameRateRange = frameRateRange; >@@ -472,7 +426,7 @@ bool AVVideoCaptureSource::setupCaptureSession() > #endif > > NSError *error = nil; >- RetainPtr<AVCaptureDeviceInputType> videoIn = adoptNS([allocAVCaptureDeviceInputInstance() initWithDevice:device() error:&error]); >+ RetainPtr<AVCaptureDeviceInput> videoIn = adoptNS([PAL::allocAVCaptureDeviceInputInstance() initWithDevice:device() error:&error]); > if (error) { > ERROR_LOG_IF(loggerPtr(), LOGIDENTIFIER, "failed to allocate AVCaptureDeviceInput ", [[error localizedDescription] UTF8String]); > return false; >@@ -484,7 +438,7 @@ bool AVVideoCaptureSource::setupCaptureSession() > } > [session() addInput:videoIn.get()]; > >- m_videoOutput = adoptNS([allocAVCaptureVideoDataOutputInstance() init]); >+ m_videoOutput = adoptNS([PAL::allocAVCaptureVideoDataOutputInstance() init]); > auto settingsDictionary = adoptNS([[NSMutableDictionary alloc] initWithObjectsAndKeys: [NSNumber numberWithInt:avVideoCapturePixelBufferFormat()], kCVPixelBufferPixelFormatTypeKey, nil]); > > [m_videoOutput setVideoSettings:settingsDictionary.get()]; >@@ -568,7 +522,7 @@ void AVVideoCaptureSource::processNewFrame(Ref<MediaSample>&& sample) > dispatchMediaSampleToObservers(WTFMove(sample)); > } > >-void AVVideoCaptureSource::captureOutputDidOutputSampleBufferFromConnection(AVCaptureOutputType*, CMSampleBufferRef sampleBuffer, AVCaptureConnectionType* captureConnection) >+void AVVideoCaptureSource::captureOutputDidOutputSampleBufferFromConnection(AVCaptureOutput*, CMSampleBufferRef sampleBuffer, AVCaptureConnection* captureConnection) > { > if (m_framesToDropAtStartup && m_framesToDropAtStartup--) > return; >@@ -619,7 +573,7 @@ bool AVVideoCaptureSource::interrupted() const > void AVVideoCaptureSource::generatePresets() > { > Vector<Ref<VideoPreset>> presets; >- for (AVCaptureDeviceFormatType* format in [device() formats]) { >+ for (AVCaptureDeviceFormat* format in [device() formats]) { > > CMVideoDimensions dimensions = CMVideoFormatDescriptionGetDimensions(format.formatDescription); > IntSize size = { dimensions.width, dimensions.height }; >@@ -630,7 +584,7 @@ void AVVideoCaptureSource::generatePresets() > continue; > > Vector<FrameRateRange> frameRates; >- for (AVFrameRateRangeType *range in [format 
videoSupportedFrameRateRanges]) >+ for (AVFrameRateRange* range in [format videoSupportedFrameRateRanges]) > frameRates.append({ range.minFrameRate, range.maxFrameRate}); > > presets.append(AVVideoPreset::create(size, WTFMove(frameRates), format)); >@@ -712,7 +666,7 @@ void AVVideoCaptureSource::deviceDisconnected(RetainPtr<NSNotification> notifica > [center addObserver:self selector:@selector(deviceConnectedDidChange:) name:AVCaptureDeviceWasDisconnectedNotification object:nil]; > > #if PLATFORM(IOS_FAMILY) >- AVCaptureSessionType* session = m_callback->session(); >+ AVCaptureSession* session = m_callback->session(); > [center addObserver:self selector:@selector(sessionRuntimeError:) name:AVCaptureSessionRuntimeErrorNotification object:session]; > [center addObserver:self selector:@selector(beginSessionInterrupted:) name:AVCaptureSessionWasInterruptedNotification object:session]; > [center addObserver:self selector:@selector(endSessionInterrupted:) name:AVCaptureSessionInterruptionEndedNotification object:session]; >@@ -724,7 +678,7 @@ void AVVideoCaptureSource::deviceDisconnected(RetainPtr<NSNotification> notifica > [[NSNotificationCenter defaultCenter] removeObserver:self]; > } > >-- (void)captureOutput:(AVCaptureOutputType*)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnectionType*)connection >+- (void)captureOutput:(AVCaptureOutput*)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection*)connection > { > if (!m_callback) > return; >diff --git a/Source/WebCore/platform/mediastream/mac/MockRealtimeVideoSourceMac.mm b/Source/WebCore/platform/mediastream/mac/MockRealtimeVideoSourceMac.mm >index 04759013b0c54914a13d9b0e79012eac1f3883b0..5c78987bb73a7868b2ac7b9bea0e3b4bf577c4b4 100644 >--- a/Source/WebCore/platform/mediastream/mac/MockRealtimeVideoSourceMac.mm >+++ b/Source/WebCore/platform/mediastream/mac/MockRealtimeVideoSourceMac.mm >@@ -46,8 +46,8 @@ > #import <QuartzCore/CATransaction.h> > #import <objc/runtime.h> > >-#import <pal/cf/CoreMediaSoftLink.h> > #import "CoreVideoSoftLink.h" >+#import <pal/cf/CoreMediaSoftLink.h> > > namespace WebCore { > using namespace PAL; >diff --git a/Source/WebKit/Shared/ios/WebIconUtilities.mm b/Source/WebKit/Shared/ios/WebIconUtilities.mm >index 470eca4567a0329273b3bf5c98b7dfd2da627416..792c7de9b8ddad6d59fbe0ff9d8334aa4574dd02 100644 >--- a/Source/WebKit/Shared/ios/WebIconUtilities.mm >+++ b/Source/WebKit/Shared/ios/WebIconUtilities.mm >@@ -34,14 +34,11 @@ > #import <CoreMedia/CoreMedia.h> > #import <ImageIO/ImageIO.h> > #import <MobileCoreServices/MobileCoreServices.h> >-#import <pal/cf/CoreMediaSoftLink.h> > #import <wtf/MathExtras.h> > #import <wtf/RetainPtr.h> >-#import <wtf/SoftLinking.h> > >-SOFT_LINK_FRAMEWORK(AVFoundation); >-SOFT_LINK_CLASS(AVFoundation, AVAssetImageGenerator); >-SOFT_LINK_CLASS(AVFoundation, AVURLAsset); >+#import <pal/cf/CoreMediaSoftLink.h> >+#import <pal/cocoa/AVFoundationSoftLink.h> > > namespace WebKit { > >@@ -120,8 +117,8 @@ UIImage* iconForVideoFile(NSURL *file) > { > ASSERT_ARG(file, [file isFileURL]); > >- RetainPtr<AVURLAsset> asset = adoptNS([allocAVURLAssetInstance() initWithURL:file options:nil]); >- RetainPtr<AVAssetImageGenerator> generator = adoptNS([allocAVAssetImageGeneratorInstance() initWithAsset:asset.get()]); >+ RetainPtr<AVURLAsset> asset = adoptNS([PAL::allocAVURLAssetInstance() initWithURL:file options:nil]); >+ RetainPtr<AVAssetImageGenerator> generator = 
adoptNS([PAL::allocAVAssetImageGeneratorInstance() initWithAsset:asset.get()]); > [generator setAppliesPreferredTrackTransform:YES]; > > NSError *error = nil; >diff --git a/Source/WebKit/Shared/mac/WebCoreArgumentCodersMac.mm b/Source/WebKit/Shared/mac/WebCoreArgumentCodersMac.mm >index 21c006776a66741ab94ae69180bf6b410f994930..5ba52c90aeb0fead6f448d7086f59a2ae6e9e952 100644 >--- a/Source/WebKit/Shared/mac/WebCoreArgumentCodersMac.mm >+++ b/Source/WebKit/Shared/mac/WebCoreArgumentCodersMac.mm >@@ -44,11 +44,8 @@ > #if ENABLE(WIRELESS_PLAYBACK_TARGET) > #import <WebCore/MediaPlaybackTargetContext.h> > #import <objc/runtime.h> >-#import <pal/spi/mac/AVFoundationSPI.h> >-#import <wtf/SoftLinking.h> > >-SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >-SOFT_LINK_CLASS(AVFoundation, AVOutputContext) >+#import <pal/cocoa/AVFoundationSoftLink.h> > #endif > > namespace IPC { >@@ -580,16 +577,16 @@ bool ArgumentCoder<WebCore::ContentFilterUnblockHandler>::decode(Decoder& decode > > void ArgumentCoder<WebCore::MediaPlaybackTargetContext>::encodePlatformData(Encoder& encoder, const WebCore::MediaPlaybackTargetContext& target) > { >- if ([getAVOutputContextClass() conformsToProtocol:@protocol(NSSecureCoding)]) >+ if ([PAL::getAVOutputContextClass() conformsToProtocol:@protocol(NSSecureCoding)]) > encoder << target.avOutputContext(); > } > > bool ArgumentCoder<WebCore::MediaPlaybackTargetContext>::decodePlatformData(Decoder& decoder, WebCore::MediaPlaybackTargetContext& target) > { >- if (![getAVOutputContextClass() conformsToProtocol:@protocol(NSSecureCoding)]) >+ if (![PAL::getAVOutputContextClass() conformsToProtocol:@protocol(NSSecureCoding)]) > return false; > >- auto context = IPC::decode<AVOutputContext>(decoder, getAVOutputContextClass()); >+ auto context = IPC::decode<AVOutputContext>(decoder, PAL::getAVOutputContextClass()); > if (!context) > return false; > >diff --git a/Source/WebKit/UIProcess/Cocoa/UIDelegate.mm b/Source/WebKit/UIProcess/Cocoa/UIDelegate.mm >index 5eaa27175f0f1ea455512b9f47c48908296728d0..a51a22362384151ab11d4894369f8a18580067da 100644 >--- a/Source/WebKit/UIProcess/Cocoa/UIDelegate.mm >+++ b/Source/WebKit/UIProcess/Cocoa/UIDelegate.mm >@@ -60,12 +60,8 @@ > #if HAVE(AUTHORIZATION_STATUS_FOR_MEDIA_TYPE) > #import <AVFoundation/AVCaptureDevice.h> > #import <AVFoundation/AVMediaFormat.h> >-#import <wtf/SoftLinking.h> > >-SOFT_LINK_FRAMEWORK(AVFoundation); >-SOFT_LINK_CLASS(AVFoundation, AVCaptureDevice); >-SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeAudio, NSString *); >-SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeVideo, NSString *); >+#import <pal/cocoa/AVFoundationSoftLink.h> > #endif > > namespace WebKit { >@@ -947,7 +943,7 @@ void UIDelegate::UIClient::decidePolicyForUserMediaPermissionRequest(WebPageProx > requestUserMediaAuthorizationForFrame(frame, topLevelOrigin, protectedRequest, (id <WKUIDelegatePrivate>)m_uiDelegate.m_delegate.get(), *webView.get()); > return; > } >- AVAuthorizationStatus cameraAuthorizationStatus = usingMockCaptureDevices ? AVAuthorizationStatusAuthorized : [getAVCaptureDeviceClass() authorizationStatusForMediaType:getAVMediaTypeVideo()]; >+ AVAuthorizationStatus cameraAuthorizationStatus = usingMockCaptureDevices ? 
AVAuthorizationStatusAuthorized : [PAL::getAVCaptureDeviceClass() authorizationStatusForMediaType:AVMediaTypeVideo]; > switch (cameraAuthorizationStatus) { > case AVAuthorizationStatusAuthorized: > requestUserMediaAuthorizationForFrame(frame, topLevelOrigin, protectedRequest, (id <WKUIDelegatePrivate>)m_uiDelegate.m_delegate.get(), *webView.get()); >@@ -965,13 +961,13 @@ void UIDelegate::UIClient::decidePolicyForUserMediaPermissionRequest(WebPageProx > requestUserMediaAuthorizationForFrame(frame, topLevelOrigin, protectedRequest, (id <WKUIDelegatePrivate>)m_uiDelegate.m_delegate.get(), *webView.get()); > }); > >- [getAVCaptureDeviceClass() requestAccessForMediaType:getAVMediaTypeVideo() completionHandler:decisionHandler.get()]; >+ [PAL::getAVCaptureDeviceClass() requestAccessForMediaType:AVMediaTypeVideo completionHandler:decisionHandler.get()]; > break; > } > }); > > if (requiresAudioCapture) { >- AVAuthorizationStatus microphoneAuthorizationStatus = usingMockCaptureDevices ? AVAuthorizationStatusAuthorized : [getAVCaptureDeviceClass() authorizationStatusForMediaType:getAVMediaTypeAudio()]; >+ AVAuthorizationStatus microphoneAuthorizationStatus = usingMockCaptureDevices ? AVAuthorizationStatusAuthorized : [PAL::getAVCaptureDeviceClass() authorizationStatusForMediaType:AVMediaTypeAudio]; > switch (microphoneAuthorizationStatus) { > case AVAuthorizationStatusAuthorized: > requestCameraAuthorization(); >@@ -989,7 +985,7 @@ void UIDelegate::UIClient::decidePolicyForUserMediaPermissionRequest(WebPageProx > requestCameraAuthorization(); > }); > >- [getAVCaptureDeviceClass() requestAccessForMediaType:getAVMediaTypeAudio() completionHandler:decisionHandler.get()]; >+ [PAL::getAVCaptureDeviceClass() requestAccessForMediaType:AVMediaTypeAudio completionHandler:decisionHandler.get()]; > break; > } > } else >diff --git a/Source/WebKit/WebProcess/WebPage/RemoteLayerTree/PlatformCALayerRemoteCustom.mm b/Source/WebKit/WebProcess/WebPage/RemoteLayerTree/PlatformCALayerRemoteCustom.mm >index 5bdfb6dddbf99142f4229214dfb0d5844f1f3dd1..52a9bd4dcc91758cdf000dbc0b4de727ac856cd8 100644 >--- a/Source/WebKit/WebProcess/WebPage/RemoteLayerTree/PlatformCALayerRemoteCustom.mm >+++ b/Source/WebKit/WebProcess/WebPage/RemoteLayerTree/PlatformCALayerRemoteCustom.mm >@@ -35,10 +35,8 @@ > #import <WebCore/PlatformCALayerCocoa.h> > #import <WebCore/WebCoreCALayerExtras.h> > #import <wtf/RetainPtr.h> >-#import <wtf/SoftLinking.h> > >-SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >-SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVPlayerLayer) >+#import <pal/cocoa/AVFoundationSoftLink.h> > > namespace WebKit { > using namespace WebCore; >@@ -105,8 +103,8 @@ Ref<WebCore::PlatformCALayer> PlatformCALayerRemoteCustom::clone(PlatformCALayer > > if (layerType() == LayerTypeAVPlayerLayer) { > >- if ([platformLayer() isKindOfClass:getAVPlayerLayerClass()]) { >- clonedLayer = adoptNS([allocAVPlayerLayerInstance() init]); >+ if ([platformLayer() isKindOfClass:PAL::getAVPlayerLayerClass()]) { >+ clonedLayer = adoptNS([PAL::allocAVPlayerLayerInstance() init]); > > AVPlayerLayer *destinationPlayerLayer = static_cast<AVPlayerLayer *>(clonedLayer.get()); > AVPlayerLayer *sourcePlayerLayer = static_cast<AVPlayerLayer *>(platformLayer()); >diff --git a/Source/WebKitLegacy/mac/WebView/WebVideoFullscreenController.mm b/Source/WebKitLegacy/mac/WebView/WebVideoFullscreenController.mm >index c6e07164caf05ca4819e30a786dcc8ad83d60533..23aac34426ba03ab93b4e963f1e82f851404a1cb 100644 >--- a/Source/WebKitLegacy/mac/WebView/WebVideoFullscreenController.mm >+++ 
b/Source/WebKitLegacy/mac/WebView/WebVideoFullscreenController.mm >@@ -36,12 +36,10 @@ > #import <objc/runtime.h> > #import <pal/system/SleepDisabler.h> > #import <wtf/RetainPtr.h> >-#import <wtf/SoftLinking.h> > >-ALLOW_DEPRECATED_DECLARATIONS_BEGIN >+#import <pal/cocoa/AVFoundationSoftLink.h> > >-SOFT_LINK_FRAMEWORK(AVFoundation) >-SOFT_LINK_CLASS(AVFoundation, AVPlayerLayer) >+ALLOW_DEPRECATED_DECLARATIONS_BEGIN > > @interface WebVideoFullscreenWindow : NSWindow<NSAnimationDelegate> { > SEL _controllerActionOnAnimationEnd; >@@ -118,7 +116,7 @@ SOFT_LINK_CLASS(AVFoundation, AVPlayerLayer) > > auto contentView = [[self fullscreenWindow] contentView]; > >- auto layer = adoptNS([allocAVPlayerLayerInstance() init]); >+ auto layer = adoptNS([PAL::allocAVPlayerLayerInstance() init]); > [layer setPlayer:player]; > > [contentView setLayer:layer.get()]; >@@ -147,7 +145,7 @@ SOFT_LINK_CLASS(AVFoundation, AVPlayerLayer) > - (void)windowDidExitFullscreen > { > CALayer *layer = [[[self window] contentView] layer]; >- if ([layer isKindOfClass:getAVPlayerLayerClass()]) >+ if ([layer isKindOfClass:PAL::getAVPlayerLayerClass()]) > [[(AVPlayerLayer *)layer player] removeObserver:self forKeyPath:@"rate"]; > > [self clearFadeAnimation]; >diff --git a/Tools/ChangeLog b/Tools/ChangeLog >index a2e60f5508b18a72a5c877d41925566868b59a75..4112d12a0a6a74aff0dc73a547c957ec88c2e291 100644 >--- a/Tools/ChangeLog >+++ b/Tools/ChangeLog >@@ -1,3 +1,15 @@ >+2019-04-22 Eric Carlson <eric.carlson@apple.com> >+ >+ Create AVFoundationSoftLink.{h,mm} to reduce duplicate code >+ https://bugs.webkit.org/show_bug.cgi?id=197171 >+ <rdar://problem/47454979> >+ >+ Reviewed by Youenn Fablet. >+ >+ * TestWebKitAPI/TestWebKitAPI.xcodeproj/project.pbxproj: >+ * TestWebKitAPI/Tests/WebCore/cocoa/AVFoundationSoftLinkTest.mm: Added. >+ (TestWebKitAPI::TEST): >+ > 2019-04-25 Commit Queue <commit-queue@webkit.org> > > Unreviewed, rolling out r244627. 
>diff --git a/Tools/TestWebKitAPI/TestWebKitAPI.xcodeproj/project.pbxproj b/Tools/TestWebKitAPI/TestWebKitAPI.xcodeproj/project.pbxproj >index 08f1661ac5084ec65d3dc8a3312ef447462c37fb..180f19bfa96c2bed54deb269e44fb07d15fd5ca5 100644 >--- a/Tools/TestWebKitAPI/TestWebKitAPI.xcodeproj/project.pbxproj >+++ b/Tools/TestWebKitAPI/TestWebKitAPI.xcodeproj/project.pbxproj >@@ -23,6 +23,7 @@ > > /* Begin PBXBuildFile section */ > 041A1E34216FFDBC00789E0A /* PublicSuffix.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 041A1E33216FFDBC00789E0A /* PublicSuffix.cpp */; }; >+ 0711DF52226A95FC003DD2F7 /* AVFoundationSoftLinkTest.mm in Sources */ = {isa = PBXBuildFile; fileRef = 0711DF51226A95FB003DD2F7 /* AVFoundationSoftLinkTest.mm */; }; > 07492B3B1DF8B14C00633DE1 /* EnumerateMediaDevices.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 07492B3A1DF8AE2D00633DE1 /* EnumerateMediaDevices.cpp */; }; > 07492B3C1DF8B86600633DE1 /* enumerateMediaDevices.html in Copy Resources */ = {isa = PBXBuildFile; fileRef = 07492B391DF8ADA400633DE1 /* enumerateMediaDevices.html */; }; > 074994421EA5034B000DA44E /* getUserMedia.html in Copy Resources */ = {isa = PBXBuildFile; fileRef = 4A410F4D19AF7BEF002EBAB5 /* getUserMedia.html */; }; >@@ -1338,6 +1339,7 @@ > /* Begin PBXFileReference section */ > 00CD9F6215BE312C002DA2CE /* BackForwardList.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = BackForwardList.mm; sourceTree = "<group>"; }; > 041A1E33216FFDBC00789E0A /* PublicSuffix.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = PublicSuffix.cpp; sourceTree = "<group>"; }; >+ 0711DF51226A95FB003DD2F7 /* AVFoundationSoftLinkTest.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = AVFoundationSoftLinkTest.mm; sourceTree = "<group>"; }; > 07492B391DF8ADA400633DE1 /* enumerateMediaDevices.html */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = text.html; path = enumerateMediaDevices.html; sourceTree = "<group>"; }; > 07492B3A1DF8AE2D00633DE1 /* EnumerateMediaDevices.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = EnumerateMediaDevices.cpp; sourceTree = "<group>"; }; > 0766DD1F1A5AD5200023E3BB /* PendingAPIRequestURL.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = PendingAPIRequestURL.cpp; sourceTree = "<group>"; }; >@@ -3687,6 +3689,7 @@ > CD89D0371C4EDB1300040A04 /* cocoa */ = { > isa = PBXGroup; > children = ( >+ 0711DF51226A95FB003DD2F7 /* AVFoundationSoftLinkTest.mm */, > 751B05D51F8EAC1A0028A09E /* DatabaseTrackerTest.mm */, > 5769C50A1D9B0001000847FB /* SerializedCryptoKeyWrap.mm */, > A17991861E1C994E00A505ED /* SharedBuffer.mm */, >@@ -4028,6 +4031,7 @@ > CDC8E48D1BC5CB4500594FEC /* AudioSessionCategoryIOS.mm in Sources */, > 7C83E0B91D0A64F100FEBCF3 /* AutoLayoutIntegration.mm in Sources */, > 07CD32F62065B5430064A4BE /* AVFoundationPreference.mm in Sources */, >+ 0711DF52226A95FC003DD2F7 /* AVFoundationSoftLinkTest.mm in Sources */, > 7CCE7EB51A411A7E00447C4C /* BackForwardList.mm in Sources */, > 1C7FEB20207C0F2E00D23278 /* BackgroundColor.mm in Sources */, > 374B7A601DF36EEE00ACCB6C /* BundleEditingDelegate.mm in Sources */, >diff --git a/Tools/TestWebKitAPI/Tests/WebCore/cocoa/AVFoundationSoftLinkTest.mm b/Tools/TestWebKitAPI/Tests/WebCore/cocoa/AVFoundationSoftLinkTest.mm >new file mode 100644 >index 
0000000000000000000000000000000000000000..39c8dfacb08bdecf3c72adb89f5644d9dc092e1e >--- /dev/null >+++ b/Tools/TestWebKitAPI/Tests/WebCore/cocoa/AVFoundationSoftLinkTest.mm >@@ -0,0 +1,194 @@ >+/* >+ * Copyright (C) 2019 Apple Inc. All rights reserved. >+ * >+ * Redistribution and use in source and binary forms, with or without >+ * modification, are permitted provided that the following conditions >+ * are met: >+ * 1. Redistributions of source code must retain the above copyright >+ * notice, this list of conditions and the following disclaimer. >+ * 2. Redistributions in binary form must reproduce the above copyright >+ * notice, this list of conditions and the following disclaimer in the >+ * documentation and/or other materials provided with the distribution. >+ * >+ * THIS SOFTWARE IS PROVIDED BY APPLE INC. AND ITS CONTRIBUTORS ``AS IS'' >+ * AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, >+ * THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR >+ * PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL APPLE INC. OR ITS CONTRIBUTORS >+ * BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR >+ * CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF >+ * SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS >+ * INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN >+ * CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) >+ * ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF >+ * THE POSSIBILITY OF SUCH DAMAGE. >+ */ >+ >+#include "config.h" >+ >+#if PLATFORM(COCOA) >+ >+#import <pal/cocoa/AVFoundationSoftLink.h> >+ >+namespace TestWebKitAPI { >+ >+TEST(AVFoundationSoftLink, Classes) >+{ >+ EXPECT_NE(PAL::getAVPlayerClass(), nullptr); >+ EXPECT_NE(PAL::getAVPlayerItemClass(), nullptr); >+ EXPECT_NE(PAL::getAVPlayerItemVideoOutputClass(), nullptr); >+ EXPECT_NE(PAL::getAVPlayerLayerClass(), nullptr); >+ EXPECT_NE(PAL::getAVURLAssetClass(), nullptr); >+ EXPECT_NE(PAL::getAVAssetImageGeneratorClass(), nullptr); >+ EXPECT_NE(PAL::getAVMetadataItemClass(), nullptr); >+ EXPECT_NE(PAL::getAVAssetCacheClass(), nullptr); >+ EXPECT_NE(PAL::getAVPlayerItemLegibleOutputClass(), nullptr); >+ EXPECT_NE(PAL::getAVMediaSelectionGroupClass(), nullptr); >+ EXPECT_NE(PAL::getAVMediaSelectionOptionClass(), nullptr); >+ EXPECT_NE(PAL::getAVOutputContextClass(), nullptr); >+ EXPECT_NE(PAL::getAVAssetReaderClass(), nullptr); >+ EXPECT_NE(PAL::getAVAssetWriterClass(), nullptr); >+ EXPECT_NE(PAL::getAVAssetWriterInputClass(), nullptr); >+ EXPECT_NE(PAL::getAVCaptureSessionClass(), nullptr); >+ EXPECT_NE(PAL::getAVCaptureConnectionClass(), nullptr); >+ EXPECT_NE(PAL::getAVCaptureDeviceClass(), nullptr); >+ EXPECT_NE(PAL::getAVCaptureDeviceFormatClass(), nullptr); >+ EXPECT_NE(PAL::getAVCaptureDeviceInputClass(), nullptr); >+ EXPECT_NE(PAL::getAVCaptureOutputClass(), nullptr); >+ EXPECT_NE(PAL::getAVCaptureVideoDataOutputClass(), nullptr); >+ EXPECT_NE(PAL::getAVFrameRateRangeClass(), nullptr); >+ EXPECT_NE(PAL::getAVMutableAudioMixClass(), nullptr); >+ EXPECT_NE(PAL::getAVMutableAudioMixInputParametersClass(), nullptr); >+ >+#if HAVE(AVSTREAMSESSION) && ENABLE(LEGACY_ENCRYPTED_MEDIA) >+ EXPECT_NE(PAL::getAVStreamSessionClass(), nullptr); >+ EXPECT_NE(PAL::getAVStreamDataParserClass(), nullptr); >+#endif >+ >+#if PLATFORM(IOS_FAMILY) >+ EXPECT_NE(PAL::getAVPersistableContentKeyRequestClass(), nullptr); >+ EXPECT_NE(PAL::getAVAudioSessionClass(), nullptr); >+ 
EXPECT_NE(PAL::getAVSpeechSynthesizerClass(), nullptr); >+ EXPECT_NE(PAL::getAVSpeechUtteranceClass(), nullptr); >+ EXPECT_NE(PAL::getAVSpeechSynthesisVoiceClass(), nullptr); >+#endif >+ >+#if HAVE(MEDIA_PLAYER) && !PLATFORM(WATCHOS) >+ EXPECT_NE(PAL::getAVRouteDetectorClass(), nullptr); >+#endif >+ >+ EXPECT_NE(PAL::getAVContentKeyResponseClass(), nullptr); >+ EXPECT_NE(PAL::getAVContentKeySessionClass(), nullptr); >+ EXPECT_NE(PAL::getAVAssetResourceLoadingRequestClass(), nullptr); >+ EXPECT_NE(PAL::getAVAssetReaderSampleReferenceOutputClass(), nullptr); >+ EXPECT_NE(PAL::getAVVideoPerformanceMetricsClass(), nullptr); >+ EXPECT_NE(PAL::getAVSampleBufferAudioRendererClass(), nullptr); >+ EXPECT_NE(PAL::getAVSampleBufferDisplayLayerClass(), nullptr); >+ EXPECT_NE(PAL::getAVSampleBufferRenderSynchronizerClass(), nullptr); >+} >+ >+ >+TEST(AVFoundationSoftLink, Constants) >+{ >+ EXPECT_TRUE([AVAudioTimePitchAlgorithmSpectral isEqualToString:@"Spectral"]); >+ EXPECT_TRUE([AVAudioTimePitchAlgorithmVarispeed isEqualToString:@"Varispeed"]); >+ EXPECT_TRUE([AVMediaTypeClosedCaption isEqualToString:@"clcp"]); >+ EXPECT_TRUE([AVMediaTypeVideo isEqualToString:@"vide"]); >+ EXPECT_TRUE([AVMediaTypeAudio isEqualToString:@"soun"]); >+ EXPECT_TRUE([AVMediaTypeMuxed isEqualToString:@"muxx"]); >+ EXPECT_TRUE([AVMediaTypeMetadata isEqualToString:@"meta"]); >+ EXPECT_TRUE([AVAssetImageGeneratorApertureModeCleanAperture isEqualToString:@"CleanAperture"]); >+ EXPECT_TRUE([AVStreamingKeyDeliveryContentKeyType isEqualToString:@"com.apple.streamingkeydelivery.contentkey"]); >+ EXPECT_TRUE([AVMediaCharacteristicContainsOnlyForcedSubtitles isEqualToString:@"public.subtitles.forced-only"]); >+ EXPECT_TRUE([AVMetadataCommonKeyTitle isEqualToString:@"title"]); >+ EXPECT_TRUE([AVMetadataKeySpaceCommon isEqualToString:@"comn"]); >+ EXPECT_TRUE([AVMediaTypeSubtitle isEqualToString:@"sbtl"]); >+ EXPECT_TRUE([AVMediaCharacteristicIsMainProgramContent isEqualToString:@"public.main-program-content"]); >+ EXPECT_TRUE([AVMediaCharacteristicEasyToRead isEqualToString:@"public.easy-to-read"]); >+ EXPECT_TRUE([AVFileTypeMPEG4 isEqualToString:@"public.mpeg-4"]); >+ EXPECT_TRUE([AVVideoCodecH264 isEqualToString:@"avc1"]); >+ EXPECT_TRUE([AVVideoExpectedSourceFrameRateKey isEqualToString:@"ExpectedFrameRate"]); >+ EXPECT_TRUE([AVVideoProfileLevelKey isEqualToString:@"ProfileLevel"]); >+ EXPECT_TRUE([AVVideoAverageBitRateKey isEqualToString:@"AverageBitRate"]); >+ EXPECT_TRUE([AVVideoMaxKeyFrameIntervalKey isEqualToString:@"MaxKeyFrameInterval"]); >+ EXPECT_TRUE([AVVideoProfileLevelH264MainAutoLevel isEqualToString:@"H264_Main_AutoLevel"]); >+ EXPECT_TRUE([AVOutOfBandAlternateTrackDisplayNameKey isEqualToString:@"MediaSelectionOptionsName"]); >+ EXPECT_TRUE([AVOutOfBandAlternateTrackExtendedLanguageTagKey isEqualToString:@"MediaSelectionOptionsExtendedLanguageTag"]); >+ EXPECT_TRUE([AVOutOfBandAlternateTrackIsDefaultKey isEqualToString:@"MediaSelectionOptionsIsDefault"]); >+ EXPECT_TRUE([AVOutOfBandAlternateTrackMediaCharactersticsKey isEqualToString:@"MediaSelectionOptionsTaggedMediaCharacteristics"]); >+ EXPECT_TRUE([AVOutOfBandAlternateTrackIdentifierKey isEqualToString:@"MediaSelectionOptionsClientIdentifier"]); >+ EXPECT_TRUE([AVOutOfBandAlternateTrackSourceKey isEqualToString:@"MediaSelectionOptionsURL"]); >+ EXPECT_TRUE([AVMediaCharacteristicDescribesMusicAndSoundForAccessibility isEqualToString:@"public.accessibility.describes-music-and-sound"]); >+ EXPECT_TRUE([AVMediaCharacteristicTranscribesSpokenDialogForAccessibility 
isEqualToString:@"public.accessibility.transcribes-spoken-dialog"]); >+ EXPECT_TRUE([AVMediaCharacteristicIsAuxiliaryContent isEqualToString:@"public.auxiliary-content"]); >+ EXPECT_TRUE([AVMediaCharacteristicDescribesVideoForAccessibility isEqualToString:@"public.accessibility.describes-video"]); >+ EXPECT_TRUE([AVMetadataKeySpaceQuickTimeUserData isEqualToString:@"udta"]); >+ EXPECT_TRUE([AVMetadataKeySpaceQuickTimeMetadata isEqualToString:@"mdta"]); >+ EXPECT_TRUE([AVMetadataKeySpaceiTunes isEqualToString:@"itsk"]); >+ EXPECT_TRUE([AVMetadataKeySpaceID3 isEqualToString:@"org.id3"]); >+ EXPECT_TRUE([AVMetadataKeySpaceISOUserData isEqualToString:@"uiso"]); >+ >+ if (PAL::canLoad_AVFoundation_AVEncoderBitRateKey()) >+ EXPECT_TRUE([AVEncoderBitRateKey isEqualToString:@"AVEncoderBitRateKey"]); >+ if (PAL::canLoad_AVFoundation_AVFormatIDKey()) >+ EXPECT_TRUE([AVFormatIDKey isEqualToString:@"AVFormatIDKey"]); >+ if (PAL::canLoad_AVFoundation_AVNumberOfChannelsKey()) >+ EXPECT_TRUE([AVNumberOfChannelsKey isEqualToString:@"AVNumberOfChannelsKey"]); >+ if (PAL::canLoad_AVFoundation_AVSampleRateKey()) >+ EXPECT_TRUE([AVSampleRateKey isEqualToString:@"AVSampleRateKey"]); >+ >+#if (PLATFORM(MAC) && __MAC_OS_X_VERSION_MIN_REQUIRED >= 101300) || (PLATFORM(IOS) && __IPHONE_OS_VERSION_MIN_REQUIRED >= 110000) || (PLATFORM(WATCHOS) && __WATCH_OS_VERSION_MIN_REQUIRED >= 40000) || (PLATFORM(APPLETV) && __TV_OS_VERSION_MIN_REQUIRED >= 110000) >+ EXPECT_TRUE(PAL::canLoad_AVFoundation_AVURLAssetOutOfBandMIMETypeKey()); >+ EXPECT_TRUE([AVURLAssetOutOfBandMIMETypeKey isEqualToString:@"AVURLAssetOutOfBandMIMETypeKey"]); >+#endif >+ >+#if (PLATFORM(MAC) && __MAC_OS_X_VERSION_MIN_REQUIRED >= 101400) || (PLATFORM(IOS) && __IPHONE_OS_VERSION_MIN_REQUIRED >= 120000) || (PLATFORM(WATCHOS) && __WATCH_OS_VERSION_MIN_REQUIRED >= 50000) || (PLATFORM(APPLETV) && __TV_OS_VERSION_MIN_REQUIRED >= 120000) >+ EXPECT_TRUE(PAL::canLoad_AVFoundation_AVURLAssetUseClientURLLoadingExclusively()); >+ EXPECT_TRUE([AVURLAssetUseClientURLLoadingExclusively isEqualToString:@"AVURLAssetUseClientURLLoadingExclusively"]); >+#endif >+ >+#if ENABLE(ENCRYPTED_MEDIA) && HAVE(AVCONTENTKEYSESSION) >+ EXPECT_TRUE(PAL::canLoad_AVFoundation_AVContentKeySystemFairPlayStreaming()); >+ EXPECT_TRUE([AVContentKeySystemFairPlayStreaming isEqualToString:@"FairPlayStreaming"]); >+#endif >+ >+#if ENABLE(LEGACY_ENCRYPTED_MEDIA) && ENABLE(MEDIA_SOURCE) >+ EXPECT_TRUE(PAL::canLoad_AVFoundation_AVContentKeyRequestProtocolVersionsKey()); >+ EXPECT_TRUE([AVContentKeyRequestProtocolVersionsKey isEqualToString:@"ProtocolVersionsKey"]); >+#endif >+ >+#if (PLATFORM(MAC) && __MAC_OS_X_VERSION_MIN_REQUIRED >= 101500) || (PLATFORM(IOS) && __IPHONE_OS_VERSION_MIN_REQUIRED >= 130000) || (PLATFORM(WATCHOS) && __WATCH_OS_VERSION_MIN_REQUIRED >= 60000) || (PLATFORM(APPLETV) && __TV_OS_VERSION_MIN_REQUIRED >= 130000) >+ EXPECT_TRUE(PAL::canLoad_AVFoundation_AVVideoCodecTypeHEVCWithAlpha()); >+ EXPECT_TRUE([AVVideoCodecTypeHEVCWithAlpha isEqualToString:@"muxa"]); >+#endif >+ >+#if PLATFORM(MAC) >+ EXPECT_TRUE([AVStreamDataParserContentKeyRequestProtocolVersionsKey isEqualToString:@"AVContentKeyRequestProtocolVersionsKey"]); >+#endif >+ >+#if PLATFORM(IOS_FAMILY) >+ EXPECT_TRUE([AVURLAssetBoundNetworkInterfaceName isEqualToString:@"AVURLAssetBoundNetworkInterfaceName"]); >+ EXPECT_TRUE([AVURLAssetClientBundleIdentifierKey isEqualToString:@"AVURLAssetClientBundleIdentifierKey"]); >+ EXPECT_TRUE([AVCaptureSessionRuntimeErrorNotification 
isEqualToString:@"AVCaptureSessionRuntimeErrorNotification"]); >+ EXPECT_TRUE([AVCaptureSessionWasInterruptedNotification isEqualToString:@"AVCaptureSessionWasInterruptedNotification"]); >+ EXPECT_TRUE([AVCaptureSessionInterruptionEndedNotification isEqualToString:@"AVCaptureSessionInterruptionEndedNotification"]); >+ EXPECT_TRUE([AVCaptureSessionInterruptionReasonKey isEqualToString:@"AVCaptureSessionInterruptionReasonKey"]); >+ EXPECT_TRUE([AVCaptureSessionErrorKey isEqualToString:@"AVCaptureSessionErrorKey"]); >+ EXPECT_TRUE([AVAudioSessionCategoryAmbient isEqualToString:@"AVAudioSessionCategoryAmbient"]); >+ EXPECT_TRUE([AVAudioSessionCategorySoloAmbient isEqualToString:@"AVAudioSessionCategorySoloAmbient"]); >+ EXPECT_TRUE([AVAudioSessionCategoryPlayback isEqualToString:@"AVAudioSessionCategoryPlayback"]); >+ EXPECT_TRUE([AVAudioSessionCategoryRecord isEqualToString:@"AVAudioSessionCategoryRecord"]); >+ EXPECT_TRUE([AVAudioSessionCategoryPlayAndRecord isEqualToString:@"AVAudioSessionCategoryPlayAndRecord"]); >+ EXPECT_TRUE([AVAudioSessionCategoryAudioProcessing isEqualToString:@"AVAudioSessionCategoryAudioProcessing"]); >+ EXPECT_TRUE([AVAudioSessionModeDefault isEqualToString:@"AVAudioSessionModeDefault"]); >+ EXPECT_TRUE([AVAudioSessionModeVideoChat isEqualToString:@"AVAudioSessionModeVideoChat"]); >+ EXPECT_TRUE([AVAudioSessionInterruptionNotification isEqualToString:@"AVAudioSessionInterruptionNotification"]); >+ EXPECT_TRUE([AVAudioSessionInterruptionTypeKey isEqualToString:@"AVAudioSessionInterruptionTypeKey"]); >+ EXPECT_TRUE([AVAudioSessionInterruptionOptionKey isEqualToString:@"AVAudioSessionInterruptionOptionKey"]); >+ EXPECT_TRUE([AVRouteDetectorMultipleRoutesDetectedDidChangeNotification isEqualToString:@"AVRouteDetectorMultipleRoutesDetectedDidChangeNotification"]); >+#endif >+} >+ >+#endif // PLATFORM(COCOA) >+ >+} // namespace TestWebKitAPI >+