WebKit Bugzilla
Attachment 368238 Details for Bug 197282: REGRESSION(r244627): Causing internal build failures (Requested by ShawnRoberts on #webkit)
[patch] ROLLOUT of r244627

Description: ROLLOUT of r244627
Filename: bug-197282-20190425084016.patch
MIME Type: text/plain
Creator: WebKit Commit Bot
Created: 2019-04-25 08:40:16 PDT
Size: 245.28 KB
Flags: patch, obsolete
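For readers skimming the diff: r244627 (reverted here) had consolidated AVFoundation soft-linking into a shared pal/cocoa/AVFoundationSoftLink.{h,mm} pair accessed through the PAL namespace; this rollout restores the previous per-file soft-link declarations. A minimal sketch of the restored pattern, modeled on the QuickTimePluginReplacement.mm and AudioSessionIOS.mm hunks below (illustrative only, not part of the attachment):

// Illustrative sketch (assumption: a .mm translation unit that needs one
// AVFoundation class and one constant). Each file declares its own
// soft links instead of importing <pal/cocoa/AVFoundationSoftLink.h>.
#import <wtf/SoftLinking.h>

SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation)
SOFT_LINK_CLASS(AVFoundation, AVAudioSession)
SOFT_LINK_CONSTANT(AVFoundation, AVAudioSessionCategoryAmbient, NSString *)

// The macros generate lazy accessors; #defines remap the real symbol
// names onto them so the rest of the file reads like normal AVFoundation code.
#define AVAudioSession getAVAudioSessionClass()
#define AVAudioSessionCategoryAmbient getAVAudioSessionCategoryAmbient()

// By contrast, the reverted change routed the same lookups through PAL,
// e.g. [[PAL::getAVAudioSessionClass() sharedInstance] category], which is
// what the "-" lines in the hunks below are removing.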
>Subversion Revision: 244643 >diff --git a/Source/WebCore/ChangeLog b/Source/WebCore/ChangeLog >index 84effb1f9b951a16150e4dac42ae777c2667120a..5ac42cac99b1d94d81d6acd37fd2ec161914793a 100644 >--- a/Source/WebCore/ChangeLog >+++ b/Source/WebCore/ChangeLog >@@ -1,3 +1,17 @@ >+2019-04-25 Commit Queue <commit-queue@webkit.org> >+ >+ Unreviewed, rolling out r244627. >+ https://bugs.webkit.org/show_bug.cgi?id=197282 >+ >+ Causing internal build failures (Requested by ShawnRoberts on >+ #webkit). >+ >+ Reverted changeset: >+ >+ "Create AVFoundationSoftLink.{h,mm} to reduce duplicate code" >+ https://bugs.webkit.org/show_bug.cgi?id=197171 >+ https://trac.webkit.org/changeset/244627 >+ > 2019-04-25 Antti Koivisto <antti@apple.com> > > redefinition of enumerator 'NSAttachmentCharacter' with Apple internal build >diff --git a/Source/WebCore/PAL/ChangeLog b/Source/WebCore/PAL/ChangeLog >index 23add791db6b351dce8e0fdc146ff7ac4118600e..00a1d395f091159ec666a1901998d1274716e6f8 100644 >--- a/Source/WebCore/PAL/ChangeLog >+++ b/Source/WebCore/PAL/ChangeLog >@@ -1,3 +1,17 @@ >+2019-04-25 Commit Queue <commit-queue@webkit.org> >+ >+ Unreviewed, rolling out r244627. >+ https://bugs.webkit.org/show_bug.cgi?id=197282 >+ >+ Causing internal build failures (Requested by ShawnRoberts on >+ #webkit). >+ >+ Reverted changeset: >+ >+ "Create AVFoundationSoftLink.{h,mm} to reduce duplicate code" >+ https://bugs.webkit.org/show_bug.cgi?id=197171 >+ https://trac.webkit.org/changeset/244627 >+ > 2019-04-24 Eric Carlson <eric.carlson@apple.com> > > Create AVFoundationSoftLink.{h,mm} to reduce duplicate code >diff --git a/Source/WebKit/ChangeLog b/Source/WebKit/ChangeLog >index 859560fb828f43ab54b88ff73cac169e900df2a3..ff09abede310c1084e2ab78a0dd2e7665c3ac229 100644 >--- a/Source/WebKit/ChangeLog >+++ b/Source/WebKit/ChangeLog >@@ -1,3 +1,17 @@ >+2019-04-25 Commit Queue <commit-queue@webkit.org> >+ >+ Unreviewed, rolling out r244627. >+ https://bugs.webkit.org/show_bug.cgi?id=197282 >+ >+ Causing internal build failures (Requested by ShawnRoberts on >+ #webkit). >+ >+ Reverted changeset: >+ >+ "Create AVFoundationSoftLink.{h,mm} to reduce duplicate code" >+ https://bugs.webkit.org/show_bug.cgi?id=197171 >+ https://trac.webkit.org/changeset/244627 >+ > 2019-04-24 Carlos Garcia Campos <cgarcia@igalia.com> > > [GTK] Hardcoded text color in input fields >diff --git a/Source/WebKitLegacy/mac/ChangeLog b/Source/WebKitLegacy/mac/ChangeLog >index a9a0b7e08b4f6f9f6fbbdfd8ba1e46abc2efcb34..8a30d748fe71f014e55d11aad4fb90a52b928aa5 100644 >--- a/Source/WebKitLegacy/mac/ChangeLog >+++ b/Source/WebKitLegacy/mac/ChangeLog >@@ -1,3 +1,17 @@ >+2019-04-25 Commit Queue <commit-queue@webkit.org> >+ >+ Unreviewed, rolling out r244627. >+ https://bugs.webkit.org/show_bug.cgi?id=197282 >+ >+ Causing internal build failures (Requested by ShawnRoberts on >+ #webkit). 
>+ >+ Reverted changeset: >+ >+ "Create AVFoundationSoftLink.{h,mm} to reduce duplicate code" >+ https://bugs.webkit.org/show_bug.cgi?id=197171 >+ https://trac.webkit.org/changeset/244627 >+ > 2019-04-24 Zalan Bujtas <zalan@apple.com> > > Regression (r244291): Broken API Test AutoLayoutRenderingProgressRelativeOrdering >diff --git a/Source/WebCore/Modules/plugins/QuickTimePluginReplacement.mm b/Source/WebCore/Modules/plugins/QuickTimePluginReplacement.mm >index 42ea43957c80f3f8b99bbc599e741a729548fd09..263a2036a8d5d69df761aed9dad9af502be9ba72 100644 >--- a/Source/WebCore/Modules/plugins/QuickTimePluginReplacement.mm >+++ b/Source/WebCore/Modules/plugins/QuickTimePluginReplacement.mm >@@ -57,14 +57,18 @@ > #import <wtf/text/Base64.h> > > #import <pal/cf/CoreMediaSoftLink.h> >-#import <pal/cocoa/AVFoundationSoftLink.h> >+ >+typedef AVMetadataItem AVMetadataItemType; >+SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >+SOFT_LINK_CLASS(AVFoundation, AVMetadataItem) >+#define AVMetadataItem getAVMetadataItemClass() > > namespace WebCore { > using namespace PAL; > > #if PLATFORM(IOS_FAMILY) > static JSValue *jsValueWithValueInContext(id, JSContext *); >-static JSValue *jsValueWithAVMetadataItemInContext(AVMetadataItem *, JSContext *); >+static JSValue *jsValueWithAVMetadataItemInContext(AVMetadataItemType *, JSContext *); > #endif > > static String quickTimePluginReplacementScript() >@@ -325,13 +329,13 @@ static JSValue *jsValueWithValueInContext(id value, JSContext *context) > return jsValueWithArrayInContext(value, context); > else if ([value isKindOfClass:[NSData class]]) > return jsValueWithDataInContext(value, emptyString(), context); >- else if ([value isKindOfClass:PAL::getAVMetadataItemClass()]) >+ else if ([value isKindOfClass:[AVMetadataItem class]]) > return jsValueWithAVMetadataItemInContext(value, context); > > return nil; > } > >-static JSValue *jsValueWithAVMetadataItemInContext(AVMetadataItem *item, JSContext *context) >+static JSValue *jsValueWithAVMetadataItemInContext(AVMetadataItemType *item, JSContext *context) > { > NSMutableDictionary* dictionary = [NSMutableDictionary dictionaryWithDictionary:[item extraAttributes]]; > >diff --git a/Source/WebCore/PAL/PAL.xcodeproj/project.pbxproj b/Source/WebCore/PAL/PAL.xcodeproj/project.pbxproj >index 365637835757f06da8d4ec6b777585e42d4eefd8..4884ed31e3b4a41a4f1f954018efec39bc001cbf 100644 >--- a/Source/WebCore/PAL/PAL.xcodeproj/project.pbxproj >+++ b/Source/WebCore/PAL/PAL.xcodeproj/project.pbxproj >@@ -21,8 +21,6 @@ > /* End PBXAggregateTarget section */ > > /* Begin PBXBuildFile section */ >- 077E87B1226A460200A2AFF0 /* AVFoundationSoftLink.mm in Sources */ = {isa = PBXBuildFile; fileRef = 077E87AF226A460200A2AFF0 /* AVFoundationSoftLink.mm */; }; >- 077E87B2226A460300A2AFF0 /* AVFoundationSoftLink.h in Headers */ = {isa = PBXBuildFile; fileRef = 077E87B0226A460200A2AFF0 /* AVFoundationSoftLink.h */; }; > 0C00CFD41F68CE4600AAC26D /* MediaTimeAVFoundation.h in Headers */ = {isa = PBXBuildFile; fileRef = 0C00CFD21F68CE4600AAC26D /* MediaTimeAVFoundation.h */; }; > 0C2D9E731EEF5AF600DBC317 /* ExportMacros.h in Headers */ = {isa = PBXBuildFile; fileRef = 0C2D9E721EEF5AF600DBC317 /* ExportMacros.h */; }; > 0C2DA06D1F33CA8400DBC317 /* CFLocaleSPI.h in Headers */ = {isa = PBXBuildFile; fileRef = 0C2DA0671F33CA8400DBC317 /* CFLocaleSPI.h */; }; >@@ -100,6 +98,7 @@ > 0C7785A01F45130F00F4EBB6 /* QuickLookMacSPI.h in Headers */ = {isa = PBXBuildFile; fileRef = 0C7785871F45130F00F4EBB6 /* QuickLookMacSPI.h */; }; > 0C7785A11F45130F00F4EBB6 /* 
TelephonyUtilitiesSPI.h in Headers */ = {isa = PBXBuildFile; fileRef = 0C7785881F45130F00F4EBB6 /* TelephonyUtilitiesSPI.h */; }; > 0CF99CA41F736375007EE793 /* MediaTimeAVFoundation.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 0C00CFD11F68CE4600AAC26D /* MediaTimeAVFoundation.cpp */; }; >+ 7A36D0F9223AD9AB00B0522E /* CommonCryptoSPI.h in Headers */ = {isa = PBXBuildFile; fileRef = 7A36D0F8223AD9AB00B0522E /* CommonCryptoSPI.h */; }; > 0CF99CA81F738437007EE793 /* CoreMediaSoftLink.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 0CF99CA61F738436007EE793 /* CoreMediaSoftLink.cpp */; }; > 0CF99CA91F738437007EE793 /* CoreMediaSoftLink.h in Headers */ = {isa = PBXBuildFile; fileRef = 0CF99CA71F738437007EE793 /* CoreMediaSoftLink.h */; }; > 1C09D0531E31C44100725F18 /* CryptoDigest.h in Headers */ = {isa = PBXBuildFile; fileRef = 1C09D0521E31C44100725F18 /* CryptoDigest.h */; }; >@@ -122,7 +121,6 @@ > 570AB8F920AF6E3D00B8BE87 /* NSXPCConnectionSPI.h in Headers */ = {isa = PBXBuildFile; fileRef = 570AB8F820AF6E3D00B8BE87 /* NSXPCConnectionSPI.h */; }; > 63C7EDC721AFAE04006A7B99 /* NSProgressSPI.h in Headers */ = {isa = PBXBuildFile; fileRef = 63E369F921AFA83F001C14BC /* NSProgressSPI.h */; }; > 7A1656441F97B2B900BA3CE4 /* NSKeyedArchiverSPI.h in Headers */ = {isa = PBXBuildFile; fileRef = 7A1656431F97B2B800BA3CE4 /* NSKeyedArchiverSPI.h */; }; >- 7A36D0F9223AD9AB00B0522E /* CommonCryptoSPI.h in Headers */ = {isa = PBXBuildFile; fileRef = 7A36D0F8223AD9AB00B0522E /* CommonCryptoSPI.h */; }; > 7A3A6A8020CADB4700317AAE /* NSImageSPI.h in Headers */ = {isa = PBXBuildFile; fileRef = 7A3A6A7F20CADB4600317AAE /* NSImageSPI.h */; }; > A10265891F56747A00B4C844 /* HIToolboxSPI.h in Headers */ = {isa = PBXBuildFile; fileRef = A10265881F56747A00B4C844 /* HIToolboxSPI.h */; }; > A102658E1F567E9D00B4C844 /* HIServicesSPI.h in Headers */ = {isa = PBXBuildFile; fileRef = A102658D1F567E9D00B4C844 /* HIServicesSPI.h */; }; >@@ -177,8 +175,6 @@ > /* End PBXContainerItemProxy section */ > > /* Begin PBXFileReference section */ >- 077E87AF226A460200A2AFF0 /* AVFoundationSoftLink.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = AVFoundationSoftLink.mm; sourceTree = "<group>"; }; >- 077E87B0226A460200A2AFF0 /* AVFoundationSoftLink.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = AVFoundationSoftLink.h; sourceTree = "<group>"; }; > 0C00CFD11F68CE4600AAC26D /* MediaTimeAVFoundation.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = MediaTimeAVFoundation.cpp; sourceTree = "<group>"; }; > 0C00CFD21F68CE4600AAC26D /* MediaTimeAVFoundation.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = MediaTimeAVFoundation.h; sourceTree = "<group>"; }; > 0C2D9E721EEF5AF600DBC317 /* ExportMacros.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = ExportMacros.h; sourceTree = "<group>"; }; >@@ -186,6 +182,7 @@ > 0C2DA0681F33CA8400DBC317 /* CFNetworkConnectionCacheSPI.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = CFNetworkConnectionCacheSPI.h; sourceTree = "<group>"; }; > 0C2DA0691F33CA8400DBC317 /* CFNetworkSPI.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = CFNetworkSPI.h; sourceTree = "<group>"; }; > 0C2DA06A1F33CA8400DBC317 /* CFUtilitiesSPI.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = 
sourcecode.c.h; path = CFUtilitiesSPI.h; sourceTree = "<group>"; }; >+ 7A36D0F8223AD9AB00B0522E /* CommonCryptoSPI.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = CommonCryptoSPI.h; sourceTree = "<group>"; }; > 0C2DA06B1F33CA8400DBC317 /* CoreAudioSPI.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = CoreAudioSPI.h; sourceTree = "<group>"; }; > 0C2DA06C1F33CA8400DBC317 /* CoreMediaSPI.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = CoreMediaSPI.h; sourceTree = "<group>"; }; > 0C2DA11C1F3BE9E000DBC317 /* CoreGraphicsSPI.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = CoreGraphicsSPI.h; sourceTree = "<group>"; }; >@@ -286,7 +283,6 @@ > 570AB8F820AF6E3D00B8BE87 /* NSXPCConnectionSPI.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = NSXPCConnectionSPI.h; sourceTree = "<group>"; }; > 63E369F921AFA83F001C14BC /* NSProgressSPI.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = NSProgressSPI.h; sourceTree = "<group>"; }; > 7A1656431F97B2B800BA3CE4 /* NSKeyedArchiverSPI.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = NSKeyedArchiverSPI.h; sourceTree = "<group>"; }; >- 7A36D0F8223AD9AB00B0522E /* CommonCryptoSPI.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = CommonCryptoSPI.h; sourceTree = "<group>"; }; > 7A3A6A7F20CADB4600317AAE /* NSImageSPI.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = NSImageSPI.h; sourceTree = "<group>"; }; > 93E5909C1F93BF1E0067F8CF /* UnencodableHandling.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = UnencodableHandling.h; sourceTree = "<group>"; }; > A10265881F56747A00B4C844 /* HIToolboxSPI.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = HIToolboxSPI.h; sourceTree = "<group>"; }; >@@ -579,8 +575,6 @@ > 1C4876DE1F8D831300CCEEBD /* cocoa */ = { > isa = PBXGroup; > children = ( >- 077E87B0226A460200A2AFF0 /* AVFoundationSoftLink.h */, >- 077E87AF226A460200A2AFF0 /* AVFoundationSoftLink.mm */, > F44291661FA52705002CC93E /* FileSizeFormatterCocoa.mm */, > A1F63C9D21A4DBF7006FB43B /* PassKitSoftLink.h */, > A1F63C9E21A4DBF7006FB43B /* PassKitSoftLink.mm */, >@@ -678,7 +672,6 @@ > buildActionMask = 2147483647; > files = ( > 2D02E93C2056FAA700A13797 /* AudioToolboxSPI.h in Headers */, >- 077E87B2226A460300A2AFF0 /* AVFoundationSoftLink.h in Headers */, > 0C7785891F45130F00F4EBB6 /* AVFoundationSPI.h in Headers */, > 0C2DA13E1F3BEB4900DBC317 /* AVKitSPI.h in Headers */, > CDF91113220E4EEC001EA39E /* CelestialSPI.h in Headers */, >@@ -884,7 +877,6 @@ > isa = PBXSourcesBuildPhase; > buildActionMask = 2147483647; > files = ( >- 077E87B1226A460200A2AFF0 /* AVFoundationSoftLink.mm in Sources */, > 0C5FFF0F1F78D9DA009EFF1A /* ClockCM.mm in Sources */, > 0CF99CA81F738437007EE793 /* CoreMediaSoftLink.cpp in Sources */, > 1C09D0561E31C46500725F18 /* CryptoDigestCommonCrypto.cpp in Sources */, >diff --git a/Source/WebCore/PAL/pal/cocoa/AVFoundationSoftLink.h b/Source/WebCore/PAL/pal/cocoa/AVFoundationSoftLink.h >deleted file mode 100644 >index c7b1547c59d4114cee76526526d963b7dd67cd0c..0000000000000000000000000000000000000000 >--- a/Source/WebCore/PAL/pal/cocoa/AVFoundationSoftLink.h >+++ /dev/null >@@ -1,288 +0,0 @@ >-/* >- * Copyright (C) 2019 Apple Inc. 
All rights reserved. >- * >- * Redistribution and use in source and binary forms, with or without >- * modification, are permitted provided that the following conditions >- * are met: >- * 1. Redistributions of source code must retain the above copyright >- * notice, this list of conditions and the following disclaimer. >- * 2. Redistributions in binary form must reproduce the above copyright >- * notice, this list of conditions and the following disclaimer in the >- * documentation and/or other materials provided with the distribution. >- * >- * THIS SOFTWARE IS PROVIDED BY APPLE INC. AND ITS CONTRIBUTORS ``AS IS'' >- * AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, >- * THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR >- * PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL APPLE INC. OR ITS CONTRIBUTORS >- * BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR >- * CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF >- * SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS >- * INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN >- * CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) >- * ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF >- * THE POSSIBILITY OF SUCH DAMAGE. >- */ >- >-#pragma once >- >-#if USE(AVFOUNDATION) >- >-#import <AVFoundation/AVFoundation.h> >-#import <pal/spi/mac/AVFoundationSPI.h> >-#import <wtf/SoftLinking.h> >- >-SOFT_LINK_FRAMEWORK_FOR_HEADER(PAL, AVFoundation) >- >-// Note: We don't define accessor macros for classes (e.g. >-// #define AVAssetCache PAL::getAVAssetCacheClass() >-// because they make it difficult to use the class name in source code. >- >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVAssetCache) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVAssetImageGenerator) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVAssetReader) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVAssetReaderSampleReferenceOutput) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVAssetResourceLoadingRequest) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVAssetWriter) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVAssetWriterInput) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVCaptureConnection) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVCaptureDevice) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVCaptureDeviceFormat) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVCaptureDeviceInput) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVCaptureOutput) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVCaptureSession) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVCaptureVideoDataOutput) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVContentKeyResponse) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVContentKeySession) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVFrameRateRange) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVMediaSelectionGroup) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVMediaSelectionOption) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVMetadataItem) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVMutableAudioMix) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVMutableAudioMixInputParameters) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVOutputContext) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVPlayer) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVPlayerItem) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVPlayerItemLegibleOutput) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVPlayerItemVideoOutput) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVPlayerLayer) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVSampleBufferAudioRenderer) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVSampleBufferDisplayLayer) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVSampleBufferRenderSynchronizer) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVStreamDataParser) 
>-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVURLAsset) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVVideoPerformanceMetrics) >- >-#if HAVE(AVSTREAMSESSION) && ENABLE(LEGACY_ENCRYPTED_MEDIA) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVStreamSession) >-#endif >- >-#if PLATFORM(IOS_FAMILY) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVAudioSession) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVPersistableContentKeyRequest) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVSpeechSynthesisVoice) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVSpeechSynthesizer) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVSpeechUtterance) >-#endif >- >-#if HAVE(MEDIA_PLAYER) && !PLATFORM(WATCHOS) >-SOFT_LINK_CLASS_FOR_HEADER(PAL, AVRouteDetector) >-#endif >- >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVAudioTimePitchAlgorithmSpectral, NSString *) >-#define AVAudioTimePitchAlgorithmSpectral PAL::get_AVFoundation_AVAudioTimePitchAlgorithmSpectral() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVAudioTimePitchAlgorithmVarispeed, NSString *) >-#define AVAudioTimePitchAlgorithmVarispeed PAL::get_AVFoundation_AVAudioTimePitchAlgorithmVarispeed() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMediaCharacteristicVisual, NSString *) >-#define AVMediaCharacteristicVisual PAL::get_AVFoundation_AVMediaCharacteristicVisual() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMediaCharacteristicAudible, NSString *) >-#define AVMediaCharacteristicAudible PAL::get_AVFoundation_AVMediaCharacteristicAudible() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMediaTypeClosedCaption, NSString *) >-#define AVMediaTypeClosedCaption PAL::get_AVFoundation_AVMediaTypeClosedCaption() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMediaTypeVideo, NSString *) >-#define AVMediaTypeVideo PAL::get_AVFoundation_AVMediaTypeVideo() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMediaTypeMuxed, NSString *) >-#define AVMediaTypeMuxed PAL::get_AVFoundation_AVMediaTypeMuxed() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMediaTypeAudio, NSString *) >-#define AVMediaTypeAudio PAL::get_AVFoundation_AVMediaTypeAudio() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMediaTypeMetadata, NSString *) >-#define AVMediaTypeMetadata PAL::get_AVFoundation_AVMediaTypeMetadata() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVURLAssetInheritURIQueryComponentFromReferencingURIKey, NSString *) >-#define AVURLAssetInheritURIQueryComponentFromReferencingURIKey PAL::get_AVFoundation_AVURLAssetInheritURIQueryComponentFromReferencingURIKey() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVAssetImageGeneratorApertureModeCleanAperture, NSString *) >-#define AVAssetImageGeneratorApertureModeCleanAperture PAL::get_AVFoundation_AVAssetImageGeneratorApertureModeCleanAperture() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVURLAssetReferenceRestrictionsKey, NSString *) >-#define AVURLAssetReferenceRestrictionsKey PAL::get_AVFoundation_AVURLAssetReferenceRestrictionsKey() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVLayerVideoGravityResizeAspect, NSString *) >-#define AVLayerVideoGravityResizeAspect PAL::get_AVFoundation_AVLayerVideoGravityResizeAspect() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVLayerVideoGravityResizeAspectFill, NSString *) >-#define AVLayerVideoGravityResizeAspectFill PAL::get_AVFoundation_AVLayerVideoGravityResizeAspectFill() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVLayerVideoGravityResize, NSString *) >-#define AVLayerVideoGravityResize PAL::get_AVFoundation_AVLayerVideoGravityResize() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, 
AVStreamingKeyDeliveryContentKeyType, NSString *) >-#define AVStreamingKeyDeliveryContentKeyType PAL::get_AVFoundation_AVStreamingKeyDeliveryContentKeyType() >- >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVCaptureDeviceWasConnectedNotification, NSString *) >-#define AVCaptureDeviceWasConnectedNotification PAL::get_AVFoundation_AVCaptureDeviceWasConnectedNotification() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVCaptureDeviceWasDisconnectedNotification, NSString *) >-#define AVCaptureDeviceWasDisconnectedNotification PAL::get_AVFoundation_AVCaptureDeviceWasDisconnectedNotification() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVPlayerItemDidPlayToEndTimeNotification, NSString *) >-#define AVPlayerItemDidPlayToEndTimeNotification PAL::get_AVFoundation_AVPlayerItemDidPlayToEndTimeNotification() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVStreamSessionContentProtectionSessionIdentifierChangedNotification, NSString *) >-#define AVStreamSessionContentProtectionSessionIdentifierChangedNotification PAL::get_AVFoundation_AVStreamSessionContentProtectionSessionIdentifierChangedNotification() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVSampleBufferDisplayLayerFailedToDecodeNotification, NSString*) >-#define AVSampleBufferDisplayLayerFailedToDecodeNotification PAL::get_AVFoundation_AVSampleBufferDisplayLayerFailedToDecodeNotification() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVSampleBufferDisplayLayerFailedToDecodeNotificationErrorKey, NSString*) >-#define AVSampleBufferDisplayLayerFailedToDecodeNotificationErrorKey PAL::get_AVFoundation_AVSampleBufferDisplayLayerFailedToDecodeNotificationErrorKey() >- >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMediaCharacteristicContainsOnlyForcedSubtitles, NSString *) >-#define AVMediaCharacteristicContainsOnlyForcedSubtitles PAL::get_AVFoundation_AVMediaCharacteristicContainsOnlyForcedSubtitles() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMediaCharacteristicLegible, NSString *) >-#define AVMediaCharacteristicLegible PAL::get_AVFoundation_AVMediaCharacteristicLegible() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVPlayerItemLegibleOutputTextStylingResolutionSourceAndRulesOnly, NSString *) >-#define AVPlayerItemLegibleOutputTextStylingResolutionSourceAndRulesOnly PAL::get_AVFoundation_AVPlayerItemLegibleOutputTextStylingResolutionSourceAndRulesOnly() >- >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMetadataCommonKeyTitle, NSString *) >-#define AVMetadataCommonKeyTitle PAL::get_AVFoundation_AVMetadataCommonKeyTitle() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMetadataKeySpaceCommon, NSString *) >-#define AVMetadataKeySpaceCommon PAL::get_AVFoundation_AVMetadataKeySpaceCommon() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMediaTypeSubtitle, NSString *) >-#define AVMediaTypeSubtitle PAL::get_AVFoundation_AVMediaTypeSubtitle() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMediaCharacteristicIsMainProgramContent, NSString *) >-#define AVMediaCharacteristicIsMainProgramContent PAL::get_AVFoundation_AVMediaCharacteristicIsMainProgramContent() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMediaCharacteristicEasyToRead, NSString *) >-#define AVMediaCharacteristicEasyToRead PAL::get_AVFoundation_AVMediaCharacteristicEasyToRead() >- >-SOFT_LINK_CONSTANT_MAY_FAIL_FOR_HEADER(PAL, AVFoundation, AVURLAssetOutOfBandMIMETypeKey, NSString *) >-#define AVURLAssetOutOfBandMIMETypeKey PAL::get_AVFoundation_AVURLAssetOutOfBandMIMETypeKey() 
>-SOFT_LINK_CONSTANT_MAY_FAIL_FOR_HEADER(PAL, AVFoundation, AVURLAssetUseClientURLLoadingExclusively, NSString *) >-#define AVURLAssetUseClientURLLoadingExclusively PAL::get_AVFoundation_AVURLAssetUseClientURLLoadingExclusively() >- >-SOFT_LINK_CONSTANT_MAY_FAIL_FOR_HEADER(PAL, AVFoundation, AVContentKeySystemFairPlayStreaming, NSString*) >-#define AVContentKeySystemFairPlayStreaming PAL::get_AVFoundation_AVContentKeySystemFairPlayStreaming() >-SOFT_LINK_CONSTANT_MAY_FAIL_FOR_HEADER(PAL, AVFoundation, AVContentKeyRequestProtocolVersionsKey, NSString *) >-#define AVContentKeyRequestProtocolVersionsKey PAL::get_AVFoundation_AVContentKeyRequestProtocolVersionsKey() >- >-SOFT_LINK_CONSTANT_MAY_FAIL_FOR_HEADER(PAL, AVFoundation, AVVideoCodecTypeHEVCWithAlpha, NSString *) >-#define AVVideoCodecTypeHEVCWithAlpha PAL::get_AVFoundation_AVVideoCodecTypeHEVCWithAlpha() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVFileTypeMPEG4, NSString *) >-#define AVFileTypeMPEG4 PAL::get_AVFoundation_AVFileTypeMPEG4() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVVideoCodecKey, NSString *) >-#define AVVideoCodecKey PAL::get_AVFoundation_AVVideoCodecKey() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVVideoCodecH264, NSString *) >-#define AVVideoCodecH264 PAL::get_AVFoundation_AVVideoCodecH264() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVVideoWidthKey, NSString *) >-#define AVVideoWidthKey PAL::get_AVFoundation_AVVideoWidthKey() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVVideoHeightKey, NSString *) >-#define AVVideoHeightKey PAL::get_AVFoundation_AVVideoHeightKey() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVVideoExpectedSourceFrameRateKey, NSString *) >-#define AVVideoExpectedSourceFrameRateKey PAL::get_AVFoundation_AVVideoExpectedSourceFrameRateKey() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVVideoProfileLevelKey, NSString *) >-#define AVVideoProfileLevelKey PAL::get_AVFoundation_AVVideoProfileLevelKey() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVVideoAverageBitRateKey, NSString *) >-#define AVVideoAverageBitRateKey PAL::get_AVFoundation_AVVideoAverageBitRateKey() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVVideoMaxKeyFrameIntervalKey, NSString *) >-#define AVVideoMaxKeyFrameIntervalKey PAL::get_AVFoundation_AVVideoMaxKeyFrameIntervalKey() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVVideoProfileLevelH264MainAutoLevel, NSString *) >-#define AVVideoProfileLevelH264MainAutoLevel PAL::get_AVFoundation_AVVideoProfileLevelH264MainAutoLevel() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVVideoCompressionPropertiesKey, NSString *) >-#define AVVideoCompressionPropertiesKey PAL::get_AVFoundation_AVVideoCompressionPropertiesKey() >-SOFT_LINK_CONSTANT_MAY_FAIL_FOR_HEADER(PAL, AVFoundation, AVEncoderBitRateKey, NSString *) >-#define AVEncoderBitRateKey PAL::get_AVFoundation_AVEncoderBitRateKey() >-SOFT_LINK_CONSTANT_MAY_FAIL_FOR_HEADER(PAL, AVFoundation, AVFormatIDKey, NSString *) >-#define AVFormatIDKey PAL::get_AVFoundation_AVFormatIDKey() >-SOFT_LINK_CONSTANT_MAY_FAIL_FOR_HEADER(PAL, AVFoundation, AVNumberOfChannelsKey, NSString *) >-#define AVNumberOfChannelsKey PAL::get_AVFoundation_AVNumberOfChannelsKey() >-SOFT_LINK_CONSTANT_MAY_FAIL_FOR_HEADER(PAL, AVFoundation, AVSampleRateKey, NSString *) >-#define AVSampleRateKey PAL::get_AVFoundation_AVSampleRateKey() >- >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVURLAssetCacheKey, NSString *) >-#define AVURLAssetCacheKey PAL::get_AVFoundation_AVURLAssetCacheKey() 
>-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVURLAssetOutOfBandAlternateTracksKey, NSString *) >-#define AVURLAssetOutOfBandAlternateTracksKey PAL::get_AVFoundation_AVURLAssetOutOfBandAlternateTracksKey() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVURLAssetUsesNoPersistentCacheKey, NSString *) >-#define AVURLAssetUsesNoPersistentCacheKey PAL::get_AVFoundation_AVURLAssetUsesNoPersistentCacheKey() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVOutOfBandAlternateTrackDisplayNameKey, NSString *) >-#define AVOutOfBandAlternateTrackDisplayNameKey PAL::get_AVFoundation_AVOutOfBandAlternateTrackDisplayNameKey() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVOutOfBandAlternateTrackExtendedLanguageTagKey, NSString *) >-#define AVOutOfBandAlternateTrackExtendedLanguageTagKey PAL::get_AVFoundation_AVOutOfBandAlternateTrackExtendedLanguageTagKey() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVOutOfBandAlternateTrackIsDefaultKey, NSString *) >-#define AVOutOfBandAlternateTrackIsDefaultKey PAL::get_AVFoundation_AVOutOfBandAlternateTrackIsDefaultKey() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVOutOfBandAlternateTrackMediaCharactersticsKey, NSString *) >-#define AVOutOfBandAlternateTrackMediaCharactersticsKey PAL::get_AVFoundation_AVOutOfBandAlternateTrackMediaCharactersticsKey() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVOutOfBandAlternateTrackIdentifierKey, NSString *) >-#define AVOutOfBandAlternateTrackIdentifierKey PAL::get_AVFoundation_AVOutOfBandAlternateTrackIdentifierKey() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVOutOfBandAlternateTrackSourceKey, NSString *) >-#define AVOutOfBandAlternateTrackSourceKey PAL::get_AVFoundation_AVOutOfBandAlternateTrackSourceKey() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMediaCharacteristicDescribesMusicAndSoundForAccessibility, NSString *) >-#define AVMediaCharacteristicDescribesMusicAndSoundForAccessibility PAL::get_AVFoundation_AVMediaCharacteristicDescribesMusicAndSoundForAccessibility() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMediaCharacteristicTranscribesSpokenDialogForAccessibility, NSString *) >-#define AVMediaCharacteristicTranscribesSpokenDialogForAccessibility PAL::get_AVFoundation_AVMediaCharacteristicTranscribesSpokenDialogForAccessibility() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMediaCharacteristicIsAuxiliaryContent, NSString *) >-#define AVMediaCharacteristicIsAuxiliaryContent PAL::get_AVFoundation_AVMediaCharacteristicIsAuxiliaryContent() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMediaCharacteristicDescribesVideoForAccessibility, NSString *) >-#define AVMediaCharacteristicDescribesVideoForAccessibility PAL::get_AVFoundation_AVMediaCharacteristicDescribesVideoForAccessibility() >- >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMetadataKeySpaceQuickTimeUserData, NSString *) >-#define AVMetadataKeySpaceQuickTimeUserData PAL::get_AVFoundation_AVMetadataKeySpaceQuickTimeUserData() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMetadataKeySpaceQuickTimeMetadata, NSString *) >-#define AVMetadataKeySpaceQuickTimeMetadata PAL::get_AVFoundation_AVMetadataKeySpaceQuickTimeMetadata() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMetadataKeySpaceiTunes, NSString *) >-#define AVMetadataKeySpaceiTunes PAL::get_AVFoundation_AVMetadataKeySpaceiTunes() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMetadataKeySpaceID3, NSString *) >-#define AVMetadataKeySpaceID3 PAL::get_AVFoundation_AVMetadataKeySpaceID3() 
>-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVMetadataKeySpaceISOUserData, NSString *) >-#define AVMetadataKeySpaceISOUserData PAL::get_AVFoundation_AVMetadataKeySpaceISOUserData() >- >-#if PLATFORM(MAC) >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVStreamDataParserContentKeyRequestProtocolVersionsKey, NSString *) >-#define AVStreamDataParserContentKeyRequestProtocolVersionsKey PAL::get_AVFoundation_AVStreamDataParserContentKeyRequestProtocolVersionsKey() >-#endif >- >-#if PLATFORM(IOS_FAMILY) >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVURLAssetBoundNetworkInterfaceName, NSString *) >-#define AVURLAssetBoundNetworkInterfaceName PAL::get_AVFoundation_AVURLAssetBoundNetworkInterfaceName() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVURLAssetClientBundleIdentifierKey, NSString *) >-#define AVURLAssetClientBundleIdentifierKey PAL::get_AVFoundation_AVURLAssetClientBundleIdentifierKey() >-SOFT_LINK_CONSTANT_MAY_FAIL_FOR_HEADER(PAL, AVFoundation, AVURLAssetHTTPCookiesKey, NSString *) >-#define AVURLAssetHTTPCookiesKey PAL::get_AVFoundation_AVURLAssetHTTPCookiesKey() >-SOFT_LINK_CONSTANT_MAY_FAIL_FOR_HEADER(PAL, AVFoundation, AVURLAssetRequiresCustomURLLoadingKey, NSString *) >-#define AVURLAssetRequiresCustomURLLoadingKey PAL::get_AVFoundation_AVURLAssetRequiresCustomURLLoadingKey() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVCaptureSessionRuntimeErrorNotification, NSString *) >-#define AVCaptureSessionRuntimeErrorNotification PAL::get_AVFoundation_AVCaptureSessionRuntimeErrorNotification() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVCaptureSessionWasInterruptedNotification, NSString *) >-#define AVCaptureSessionWasInterruptedNotification PAL::get_AVFoundation_AVCaptureSessionWasInterruptedNotification() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVCaptureSessionInterruptionEndedNotification, NSString *) >-#define AVCaptureSessionInterruptionEndedNotification PAL::get_AVFoundation_AVCaptureSessionInterruptionEndedNotification() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVCaptureSessionInterruptionReasonKey, NSString *) >-#define AVCaptureSessionInterruptionReasonKey PAL::get_AVFoundation_AVCaptureSessionInterruptionReasonKey() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVCaptureSessionErrorKey, NSString *) >-#define AVCaptureSessionErrorKey PAL::get_AVFoundation_AVCaptureSessionErrorKey() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVAudioSessionCategoryAmbient, NSString *) >-#define AVAudioSessionCategoryAmbient PAL::get_AVFoundation_AVAudioSessionCategoryAmbient() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVAudioSessionCategorySoloAmbient, NSString *) >-#define AVAudioSessionCategorySoloAmbient PAL::get_AVFoundation_AVAudioSessionCategorySoloAmbient() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVAudioSessionCategoryPlayback, NSString *) >-#define AVAudioSessionCategoryPlayback PAL::get_AVFoundation_AVAudioSessionCategoryPlayback() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVAudioSessionCategoryRecord, NSString *) >-#define AVAudioSessionCategoryRecord PAL::get_AVFoundation_AVAudioSessionCategoryRecord() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVAudioSessionCategoryPlayAndRecord, NSString *) >-#define AVAudioSessionCategoryPlayAndRecord PAL::get_AVFoundation_AVAudioSessionCategoryPlayAndRecord() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVAudioSessionCategoryAudioProcessing, NSString *) >-#define AVAudioSessionCategoryAudioProcessing 
PAL::get_AVFoundation_AVAudioSessionCategoryAudioProcessing() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVAudioSessionModeDefault, NSString *) >-#define AVAudioSessionModeDefault PAL::get_AVFoundation_AVAudioSessionModeDefault() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVAudioSessionModeVideoChat, NSString *) >-#define AVAudioSessionModeVideoChat PAL::get_AVFoundation_AVAudioSessionModeVideoChat() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVAudioSessionInterruptionNotification, NSString *) >-#define AVAudioSessionInterruptionNotification PAL::get_AVFoundation_AVAudioSessionInterruptionNotification() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVAudioSessionInterruptionTypeKey, NSString *) >-#define AVAudioSessionInterruptionTypeKey PAL::get_AVFoundation_AVAudioSessionInterruptionTypeKey() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVAudioSessionInterruptionOptionKey, NSString *) >-#define AVAudioSessionInterruptionOptionKey PAL::get_AVFoundation_AVAudioSessionInterruptionOptionKey() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVRouteDetectorMultipleRoutesDetectedDidChangeNotification, NSString *) >-#define AVRouteDetectorMultipleRoutesDetectedDidChangeNotification PAL::get_AVFoundation_AVRouteDetectorMultipleRoutesDetectedDidChangeNotification() >-SOFT_LINK_CONSTANT_FOR_HEADER(PAL, AVFoundation, AVAudioSessionMediaServicesWereResetNotification, NSString *) >-#define AVAudioSessionMediaServicesWereResetNotification PAL::get_AVFoundation_AVAudioSessionMediaServicesWereResetNotification() >-#endif // PLATFORM(IOS_FAMILY) >- >-#endif // USE(AVFOUNDATION) >diff --git a/Source/WebCore/PAL/pal/cocoa/AVFoundationSoftLink.mm b/Source/WebCore/PAL/pal/cocoa/AVFoundationSoftLink.mm >deleted file mode 100644 >index 131b04adfd14bbc042494bdc686bb4b7422a22e9..0000000000000000000000000000000000000000 >--- a/Source/WebCore/PAL/pal/cocoa/AVFoundationSoftLink.mm >+++ /dev/null >@@ -1,183 +0,0 @@ >-/* >- * Copyright (C) 2019 Apple Inc. All rights reserved. >- * >- * Redistribution and use in source and binary forms, with or without >- * modification, are permitted provided that the following conditions >- * are met: >- * 1. Redistributions of source code must retain the above copyright >- * notice, this list of conditions and the following disclaimer. >- * 2. Redistributions in binary form must reproduce the above copyright >- * notice, this list of conditions and the following disclaimer in the >- * documentation and/or other materials provided with the distribution. >- * >- * THIS SOFTWARE IS PROVIDED BY APPLE INC. AND ITS CONTRIBUTORS ``AS IS'' >- * AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, >- * THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR >- * PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL APPLE INC. OR ITS CONTRIBUTORS >- * BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR >- * CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF >- * SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS >- * INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN >- * CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE, PAL_EXPORT) >- * ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF >- * THE POSSIBILITY OF SUCH DAMAGE. 
>- */ >- >-#import "config.h" >- >-#if USE(AVFOUNDATION) >- >-#import <AVFoundation/AVFoundation.h> >-#import <wtf/SoftLinking.h> >- >-SOFT_LINK_FRAMEWORK_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, PAL_EXPORT) >- >-SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAssetCache, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAssetImageGenerator, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAssetReader, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAssetWriter, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAssetWriterInput, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVCaptureConnection, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVCaptureDevice, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVCaptureDeviceFormat, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVCaptureDeviceInput, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVCaptureOutput, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVCaptureSession, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVCaptureVideoDataOutput, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVFrameRateRange, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMediaSelectionGroup, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMediaSelectionOption, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMetadataItem, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMutableAudioMix, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMutableAudioMixInputParameters, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVOutputContext, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVPlayer, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVPlayerItem, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVPlayerItemLegibleOutput, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVPlayerItemVideoOutput, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVPlayerLayer, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVURLAsset, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_OPTIONAL_WITH_EXPORT(PAL, AVFoundation, AVAssetReaderSampleReferenceOutput, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_OPTIONAL_WITH_EXPORT(PAL, AVFoundation, AVAssetResourceLoadingRequest, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_OPTIONAL_WITH_EXPORT(PAL, AVFoundation, AVContentKeyResponse, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_OPTIONAL_WITH_EXPORT(PAL, AVFoundation, AVContentKeySession, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_OPTIONAL_WITH_EXPORT(PAL, AVFoundation, AVSampleBufferAudioRenderer, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_OPTIONAL_WITH_EXPORT(PAL, AVFoundation, AVSampleBufferDisplayLayer, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_OPTIONAL_WITH_EXPORT(PAL, AVFoundation, AVSampleBufferRenderSynchronizer, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_OPTIONAL_WITH_EXPORT(PAL, AVFoundation, AVStreamDataParser, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_OPTIONAL_WITH_EXPORT(PAL, AVFoundation, AVVideoPerformanceMetrics, PAL_EXPORT) >- >-#if HAVE(AVSTREAMSESSION) && ENABLE(LEGACY_ENCRYPTED_MEDIA) 
>-SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVStreamSession, PAL_EXPORT) >-#endif >- >-#if PLATFORM(IOS_FAMILY) >-SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAudioSession, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVPersistableContentKeyRequest, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVSpeechSynthesisVoice, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVSpeechSynthesizer, PAL_EXPORT) >-SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVSpeechUtterance, PAL_EXPORT) >-#endif >- >-#if HAVE(MEDIA_PLAYER) && !PLATFORM(WATCHOS) >-SOFT_LINK_CLASS_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVRouteDetector, PAL_EXPORT) >-#endif >- >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAssetImageGeneratorApertureModeCleanAperture, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAudioTimePitchAlgorithmSpectral, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAudioTimePitchAlgorithmVarispeed, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVCaptureDeviceWasConnectedNotification, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVCaptureDeviceWasDisconnectedNotification, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVFileTypeMPEG4, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVLayerVideoGravityResize, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVLayerVideoGravityResizeAspect, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVLayerVideoGravityResizeAspectFill, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMediaCharacteristicAudible, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMediaCharacteristicContainsOnlyForcedSubtitles, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMediaCharacteristicDescribesMusicAndSoundForAccessibility, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMediaCharacteristicDescribesVideoForAccessibility, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMediaCharacteristicEasyToRead, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMediaCharacteristicIsAuxiliaryContent, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMediaCharacteristicIsMainProgramContent, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMediaCharacteristicLegible, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMediaCharacteristicTranscribesSpokenDialogForAccessibility, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMediaCharacteristicVisual, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMediaTypeAudio, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMediaTypeClosedCaption, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMediaTypeMetadata, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, 
AVMediaTypeMuxed, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMediaTypeSubtitle, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMediaTypeVideo, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMetadataCommonKeyTitle, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMetadataKeySpaceCommon, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMetadataKeySpaceID3, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMetadataKeySpaceISOUserData, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMetadataKeySpaceQuickTimeMetadata, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMetadataKeySpaceQuickTimeUserData, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVMetadataKeySpaceiTunes, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVOutOfBandAlternateTrackDisplayNameKey, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVOutOfBandAlternateTrackExtendedLanguageTagKey, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVOutOfBandAlternateTrackIdentifierKey, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVOutOfBandAlternateTrackIsDefaultKey, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVOutOfBandAlternateTrackMediaCharactersticsKey, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVOutOfBandAlternateTrackSourceKey, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVPlayerItemDidPlayToEndTimeNotification, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVPlayerItemLegibleOutputTextStylingResolutionSourceAndRulesOnly, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVSampleBufferDisplayLayerFailedToDecodeNotification, NSString*, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVSampleBufferDisplayLayerFailedToDecodeNotificationErrorKey, NSString*, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVStreamDataParserContentKeyRequestProtocolVersionsKey, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVStreamSessionContentProtectionSessionIdentifierChangedNotification, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVStreamingKeyDeliveryContentKeyType, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVURLAssetCacheKey, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVURLAssetInheritURIQueryComponentFromReferencingURIKey, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVURLAssetOutOfBandAlternateTracksKey, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVURLAssetReferenceRestrictionsKey, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVURLAssetUsesNoPersistentCacheKey, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVVideoAverageBitRateKey, 
NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVVideoCodecH264, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVVideoCodecKey, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVVideoCompressionPropertiesKey, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVVideoExpectedSourceFrameRateKey, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVVideoHeightKey, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVVideoMaxKeyFrameIntervalKey, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVVideoProfileLevelH264MainAutoLevel, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVVideoProfileLevelKey, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVVideoWidthKey, NSString *, PAL_EXPORT) >- >-SOFT_LINK_CONSTANT_MAY_FAIL_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVContentKeyRequestProtocolVersionsKey, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_MAY_FAIL_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVContentKeySystemFairPlayStreaming, NSString*, PAL_EXPORT) >-SOFT_LINK_CONSTANT_MAY_FAIL_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVEncoderBitRateKey, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_MAY_FAIL_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVFormatIDKey, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_MAY_FAIL_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVNumberOfChannelsKey, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_MAY_FAIL_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVSampleRateKey, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_MAY_FAIL_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVURLAssetOutOfBandMIMETypeKey, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_MAY_FAIL_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVURLAssetUseClientURLLoadingExclusively, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_MAY_FAIL_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVVideoCodecTypeHEVCWithAlpha, NSString *, PAL_EXPORT) >- >-#if PLATFORM(IOS_FAMILY) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAudioSessionCategoryAmbient, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAudioSessionCategoryAudioProcessing, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAudioSessionCategoryPlayAndRecord, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAudioSessionCategoryPlayback, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAudioSessionCategoryRecord, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAudioSessionCategorySoloAmbient, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAudioSessionInterruptionNotification, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAudioSessionInterruptionOptionKey, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAudioSessionInterruptionTypeKey, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAudioSessionMediaServicesWereResetNotification, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAudioSessionModeDefault, NSString *, PAL_EXPORT) 
>-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVAudioSessionModeVideoChat, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVCaptureSessionErrorKey, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVCaptureSessionInterruptionEndedNotification, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVCaptureSessionInterruptionReasonKey, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVCaptureSessionRuntimeErrorNotification, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVCaptureSessionWasInterruptedNotification, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVRouteDetectorMultipleRoutesDetectedDidChangeNotification, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVURLAssetBoundNetworkInterfaceName, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVURLAssetClientBundleIdentifierKey, NSString *, PAL_EXPORT) >- >-SOFT_LINK_CONSTANT_MAY_FAIL_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVURLAssetHTTPCookiesKey, NSString *, PAL_EXPORT) >-SOFT_LINK_CONSTANT_MAY_FAIL_FOR_SOURCE_WITH_EXPORT(PAL, AVFoundation, AVURLAssetRequiresCustomURLLoadingKey, NSString *, PAL_EXPORT) >-#endif >- >-#endif // USE(AVFOUNDATION) >diff --git a/Source/WebCore/WebCore.xcodeproj/project.pbxproj b/Source/WebCore/WebCore.xcodeproj/project.pbxproj >index d717275a8b79aaa462797c96125a5b87ddf62554..ee2f1b435374a097057828ee608fb41694e43370 100644 >--- a/Source/WebCore/WebCore.xcodeproj/project.pbxproj >+++ b/Source/WebCore/WebCore.xcodeproj/project.pbxproj >@@ -100,7 +100,6 @@ > 0709FC4E1025DEE30059CDBA /* AccessibilitySlider.h in Headers */ = {isa = PBXBuildFile; fileRef = 0709FC4D1025DEE30059CDBA /* AccessibilitySlider.h */; }; > 070E09191875EEFC003A1D3C /* PlatformMediaSession.h in Headers */ = {isa = PBXBuildFile; fileRef = 070E09181875ED93003A1D3C /* PlatformMediaSession.h */; settings = {ATTRIBUTES = (Private, ); }; }; > 070E81D11BF27656001FDA48 /* VideoTrackPrivateMediaStream.h in Headers */ = {isa = PBXBuildFile; fileRef = 070E81D01BF27656001FDA48 /* VideoTrackPrivateMediaStream.h */; }; >- CDA79827170A279100D45C55 /* AudioSessionIOS.mm in Sources */ = {isa = PBXBuildFile; fileRef = CDA79825170A279000D45C55 /* AudioSessionIOS.mm */; }; > 070F549817F12F6B00169E04 /* MediaStreamConstraintsValidationClient.h in Headers */ = {isa = PBXBuildFile; fileRef = 070F549717F12F6B00169E04 /* MediaStreamConstraintsValidationClient.h */; }; > 0719427F1D088F21002AA51D /* AVFoundationMIMETypeCache.mm in Sources */ = {isa = PBXBuildFile; fileRef = 07C8AD111D073D630087C5CE /* AVFoundationMIMETypeCache.mm */; }; > 071A9EC2168FBC43002629F9 /* TextTrackCueGeneric.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 071A9EC0168FB56C002629F9 /* TextTrackCueGeneric.cpp */; }; >@@ -4080,6 +4079,7 @@ > CDA29A321CC01A9500901CCF /* PlaybackSessionInterfaceAVKit.h in Headers */ = {isa = PBXBuildFile; fileRef = CDA29A2E1CBF73FC00901CCF /* PlaybackSessionInterfaceAVKit.h */; settings = {ATTRIBUTES = (Private, ); }; }; > CDA595932146DEC300A84185 /* HEVCUtilities.h in Headers */ = {isa = PBXBuildFile; fileRef = CDA595912146DEC300A84185 /* HEVCUtilities.h */; }; > CDA595982146DF7800A84185 /* HEVCUtilitiesCocoa.h in Headers */ = {isa = PBXBuildFile; fileRef = CDA595962146DF7800A84185 /* HEVCUtilitiesCocoa.h */; }; >+ 
CDA79827170A279100D45C55 /* AudioSessionIOS.mm in Sources */ = {isa = PBXBuildFile; fileRef = CDA79825170A279000D45C55 /* AudioSessionIOS.mm */; }; > CDA7982A170A3D0000D45C55 /* AudioSession.h in Headers */ = {isa = PBXBuildFile; fileRef = CDA79821170A22DC00D45C55 /* AudioSession.h */; settings = {ATTRIBUTES = (Private, ); }; }; > CDA98E0B1603CD6000FEA3B1 /* LegacyCDM.cpp in Sources */ = {isa = PBXBuildFile; fileRef = CDA98E091603CD5900FEA3B1 /* LegacyCDM.cpp */; }; > CDAB6D2917C7DE6C00C60B34 /* MediaControlsHost.h in Headers */ = {isa = PBXBuildFile; fileRef = CDAB6D2717C7DE6C00C60B34 /* MediaControlsHost.h */; }; >diff --git a/Source/WebCore/platform/audio/ios/AudioSessionIOS.mm b/Source/WebCore/platform/audio/ios/AudioSessionIOS.mm >index 1b2f65efcda04159cad2f6f1639633d2ae3bd2b1..90924e6fa6f8205c9a9297a390536830ea9e9fd3 100644 >--- a/Source/WebCore/platform/audio/ios/AudioSessionIOS.mm >+++ b/Source/WebCore/platform/audio/ios/AudioSessionIOS.mm >@@ -34,8 +34,29 @@ > #import <pal/spi/mac/AVFoundationSPI.h> > #import <wtf/OSObjectPtr.h> > #import <wtf/RetainPtr.h> >- >-#import <pal/cocoa/AVFoundationSoftLink.h> >+#import <wtf/SoftLinking.h> >+ >+SOFT_LINK_FRAMEWORK(AVFoundation) >+SOFT_LINK_CLASS(AVFoundation, AVAudioSession) >+ >+SOFT_LINK_CONSTANT(AVFoundation, AVAudioSessionCategoryAmbient, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVAudioSessionCategorySoloAmbient, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVAudioSessionCategoryPlayback, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVAudioSessionCategoryRecord, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVAudioSessionCategoryPlayAndRecord, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVAudioSessionCategoryAudioProcessing, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVAudioSessionModeDefault, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVAudioSessionModeVideoChat, NSString *) >+ >+#define AVAudioSession getAVAudioSessionClass() >+#define AVAudioSessionCategoryAmbient getAVAudioSessionCategoryAmbient() >+#define AVAudioSessionCategorySoloAmbient getAVAudioSessionCategorySoloAmbient() >+#define AVAudioSessionCategoryPlayback getAVAudioSessionCategoryPlayback() >+#define AVAudioSessionCategoryRecord getAVAudioSessionCategoryRecord() >+#define AVAudioSessionCategoryPlayAndRecord getAVAudioSessionCategoryPlayAndRecord() >+#define AVAudioSessionCategoryAudioProcessing getAVAudioSessionCategoryAudioProcessing() >+#define AVAudioSessionModeDefault getAVAudioSessionModeDefault() >+#define AVAudioSessionModeVideoChat getAVAudioSessionModeVideoChat() > > namespace WebCore { > >@@ -124,7 +145,7 @@ void AudioSession::setCategory(CategoryType newCategory, RouteSharingPolicy poli > } > > NSError *error = nil; >- [[PAL::getAVAudioSessionClass() sharedInstance] setCategory:categoryString mode:categoryMode routeSharingPolicy:static_cast<AVAudioSessionRouteSharingPolicy>(policy) options:options error:&error]; >+ [[AVAudioSession sharedInstance] setCategory:categoryString mode:categoryMode routeSharingPolicy:static_cast<AVAudioSessionRouteSharingPolicy>(policy) options:options error:&error]; > #if !PLATFORM(IOS_FAMILY_SIMULATOR) && !PLATFORM(IOSMAC) > ASSERT(!error); > #endif >@@ -132,7 +153,7 @@ void AudioSession::setCategory(CategoryType newCategory, RouteSharingPolicy poli > > AudioSession::CategoryType AudioSession::category() const > { >- NSString *categoryString = [[PAL::getAVAudioSessionClass() sharedInstance] category]; >+ NSString *categoryString = [[AVAudioSession sharedInstance] category]; > if ([categoryString 
isEqual:AVAudioSessionCategoryAmbient]) > return AmbientSound; > if ([categoryString isEqual:AVAudioSessionCategorySoloAmbient]) >@@ -161,7 +182,7 @@ ALLOW_DEPRECATED_DECLARATIONS_END > #endif > static_assert(static_cast<size_t>(RouteSharingPolicy::Independent) == static_cast<size_t>(AVAudioSessionRouteSharingPolicyIndependent), "RouteSharingPolicy::Independent is not AVAudioSessionRouteSharingPolicyIndependent as expected"); > >- AVAudioSessionRouteSharingPolicy policy = [[PAL::getAVAudioSessionClass() sharedInstance] routeSharingPolicy]; >+ AVAudioSessionRouteSharingPolicy policy = [[AVAudioSession sharedInstance] routeSharingPolicy]; > ASSERT(static_cast<RouteSharingPolicy>(policy) <= RouteSharingPolicy::LongFormVideo); > return static_cast<RouteSharingPolicy>(policy); > } >@@ -169,7 +190,7 @@ ALLOW_DEPRECATED_DECLARATIONS_END > String AudioSession::routingContextUID() const > { > #if !PLATFORM(IOS_FAMILY_SIMULATOR) && !PLATFORM(IOSMAC) && !PLATFORM(WATCHOS) >- return [[PAL::getAVAudioSessionClass() sharedInstance] routingContextUID]; >+ return [[AVAudioSession sharedInstance] routingContextUID]; > #else > return emptyString(); > #endif >@@ -191,17 +212,17 @@ AudioSession::CategoryType AudioSession::categoryOverride() const > > float AudioSession::sampleRate() const > { >- return [[PAL::getAVAudioSessionClass() sharedInstance] sampleRate]; >+ return [[AVAudioSession sharedInstance] sampleRate]; > } > > size_t AudioSession::bufferSize() const > { >- return [[PAL::getAVAudioSessionClass() sharedInstance] IOBufferDuration] * sampleRate(); >+ return [[AVAudioSession sharedInstance] IOBufferDuration] * sampleRate(); > } > > size_t AudioSession::numberOfOutputChannels() const > { >- return [[PAL::getAVAudioSessionClass() sharedInstance] outputNumberOfChannels]; >+ return [[AVAudioSession sharedInstance] outputNumberOfChannels]; > } > > bool AudioSession::tryToSetActiveInternal(bool active) >@@ -216,14 +237,14 @@ bool AudioSession::tryToSetActiveInternal(bool active) > // returns, so do it synchronously on the same serial queue. 
> if (active) { > dispatch_sync(m_private->m_dispatchQueue.get(), ^{ >- [[PAL::getAVAudioSessionClass() sharedInstance] setActive:YES withOptions:0 error:&error]; >+ [[AVAudioSession sharedInstance] setActive:YES withOptions:0 error:&error]; > }); > > return !error; > } > > dispatch_async(m_private->m_dispatchQueue.get(), ^{ >- [[PAL::getAVAudioSessionClass() sharedInstance] setActive:NO withOptions:AVAudioSessionSetActiveOptionNotifyOthersOnDeactivation error:&error]; >+ [[AVAudioSession sharedInstance] setActive:NO withOptions:AVAudioSessionSetActiveOptionNotifyOthersOnDeactivation error:&error]; > }); > > return true; >@@ -231,14 +252,14 @@ bool AudioSession::tryToSetActiveInternal(bool active) > > size_t AudioSession::preferredBufferSize() const > { >- return [[PAL::getAVAudioSessionClass() sharedInstance] preferredIOBufferDuration] * sampleRate(); >+ return [[AVAudioSession sharedInstance] preferredIOBufferDuration] * sampleRate(); > } > > void AudioSession::setPreferredBufferSize(size_t bufferSize) > { > NSError *error = nil; > float duration = bufferSize / sampleRate(); >- [[PAL::getAVAudioSessionClass() sharedInstance] setPreferredIOBufferDuration:duration error:&error]; >+ [[AVAudioSession sharedInstance] setPreferredIOBufferDuration:duration error:&error]; > ASSERT(!error); > } > >diff --git a/Source/WebCore/platform/audio/ios/MediaSessionManagerIOS.mm b/Source/WebCore/platform/audio/ios/MediaSessionManagerIOS.mm >index faf5b44db7d3a97da9d27e27da949be98332e449..d62593a3b55f786fab43a75657b1f84983cac44e 100644 >--- a/Source/WebCore/platform/audio/ios/MediaSessionManagerIOS.mm >+++ b/Source/WebCore/platform/audio/ios/MediaSessionManagerIOS.mm >@@ -45,12 +45,12 @@ > #import <wtf/RAMSize.h> > #import <wtf/RetainPtr.h> > >-#import <pal/cocoa/AVFoundationSoftLink.h> >- >-WEBCORE_EXPORT NSString* WebUIApplicationWillResignActiveNotification = @"WebUIApplicationWillResignActiveNotification"; >-WEBCORE_EXPORT NSString* WebUIApplicationWillEnterForegroundNotification = @"WebUIApplicationWillEnterForegroundNotification"; >-WEBCORE_EXPORT NSString* WebUIApplicationDidBecomeActiveNotification = @"WebUIApplicationDidBecomeActiveNotification"; >-WEBCORE_EXPORT NSString* WebUIApplicationDidEnterBackgroundNotification = @"WebUIApplicationDidEnterBackgroundNotification"; >+SOFT_LINK_FRAMEWORK(AVFoundation) >+SOFT_LINK_CLASS(AVFoundation, AVAudioSession) >+SOFT_LINK_CONSTANT(AVFoundation, AVAudioSessionInterruptionNotification, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVAudioSessionInterruptionTypeKey, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVAudioSessionInterruptionOptionKey, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVRouteDetectorMultipleRoutesDetectedDidChangeNotification, NSString *) > > #if HAVE(CELESTIAL) > SOFT_LINK_PRIVATE_FRAMEWORK_OPTIONAL(Celestial) >@@ -58,6 +58,20 @@ SOFT_LINK_CLASS_OPTIONAL(Celestial, AVSystemController) > SOFT_LINK_CONSTANT_MAY_FAIL(Celestial, AVSystemController_PIDToInheritApplicationStateFrom, NSString *) > #endif > >+#if HAVE(MEDIA_PLAYER) && !PLATFORM(WATCHOS) >+SOFT_LINK_CLASS(AVFoundation, AVRouteDetector) >+#endif >+ >+#define AVAudioSession getAVAudioSessionClass() >+#define AVAudioSessionInterruptionNotification getAVAudioSessionInterruptionNotification() >+#define AVAudioSessionInterruptionTypeKey getAVAudioSessionInterruptionTypeKey() >+#define AVAudioSessionInterruptionOptionKey getAVAudioSessionInterruptionOptionKey() >+ >+WEBCORE_EXPORT NSString* WebUIApplicationWillResignActiveNotification = 
@"WebUIApplicationWillResignActiveNotification"; >+WEBCORE_EXPORT NSString* WebUIApplicationWillEnterForegroundNotification = @"WebUIApplicationWillEnterForegroundNotification"; >+WEBCORE_EXPORT NSString* WebUIApplicationDidBecomeActiveNotification = @"WebUIApplicationDidBecomeActiveNotification"; >+WEBCORE_EXPORT NSString* WebUIApplicationDidEnterBackgroundNotification = @"WebUIApplicationDidEnterBackgroundNotification"; >+ > using namespace WebCore; > > @interface WebMediaSessionHelper : NSObject { >@@ -203,7 +217,7 @@ void MediaSessionManageriOS::externalOutputDeviceAvailableDidChange() > _callback = callback; > > NSNotificationCenter *center = [NSNotificationCenter defaultCenter]; >- [center addObserver:self selector:@selector(interruption:) name:AVAudioSessionInterruptionNotification object:[PAL::getAVAudioSessionClass() sharedInstance]]; >+ [center addObserver:self selector:@selector(interruption:) name:AVAudioSessionInterruptionNotification object:[AVAudioSession sharedInstance]]; > > [center addObserver:self selector:@selector(applicationWillEnterForeground:) name:PAL::get_UIKit_UIApplicationWillEnterForegroundNotification() object:nil]; > [center addObserver:self selector:@selector(applicationWillEnterForeground:) name:WebUIApplicationWillEnterForegroundNotification object:nil]; >@@ -286,9 +300,9 @@ void MediaSessionManageriOS::externalOutputDeviceAvailableDidChange() > > if (protectedSelf->_callback) { > BEGIN_BLOCK_OBJC_EXCEPTIONS >- protectedSelf->_routeDetector = adoptNS([PAL::allocAVRouteDetectorInstance() init]); >+ protectedSelf->_routeDetector = adoptNS([allocAVRouteDetectorInstance() init]); > protectedSelf->_routeDetector.get().routeDetectionEnabled = protectedSelf->_monitoringAirPlayRoutes; >- [[NSNotificationCenter defaultCenter] addObserver:protectedSelf selector:@selector(wirelessRoutesAvailableDidChange:) name:AVRouteDetectorMultipleRoutesDetectedDidChangeNotification object:protectedSelf->_routeDetector.get()]; >+ [[NSNotificationCenter defaultCenter] addObserver:protectedSelf selector:@selector(wirelessRoutesAvailableDidChange:) name:getAVRouteDetectorMultipleRoutesDetectedDidChangeNotification() object:protectedSelf->_routeDetector.get()]; > END_BLOCK_OBJC_EXCEPTIONS > } > >diff --git a/Source/WebCore/platform/graphics/avfoundation/AVTrackPrivateAVFObjCImpl.mm b/Source/WebCore/platform/graphics/avfoundation/AVTrackPrivateAVFObjCImpl.mm >index 01502cdb0beabe3880bf33d5432f2f6526f36c7f..eb6988840a1307157831fe38aab23f0e85a8be74 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/AVTrackPrivateAVFObjCImpl.mm >+++ b/Source/WebCore/platform/graphics/avfoundation/AVTrackPrivateAVFObjCImpl.mm >@@ -35,14 +35,38 @@ > #import <AVFoundation/AVPlayerItem.h> > #import <AVFoundation/AVPlayerItemTrack.h> > #import <objc/runtime.h> >- >-#import <pal/cocoa/AVFoundationSoftLink.h> >+#import <wtf/SoftLinking.h> > > @class AVMediaSelectionOption; > @interface AVMediaSelectionOption (WebKitInternal) > - (id)optionID; > @end > >+SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >+ >+SOFT_LINK_CLASS(AVFoundation, AVAssetTrack) >+SOFT_LINK_CLASS(AVFoundation, AVPlayerItem) >+SOFT_LINK_CLASS(AVFoundation, AVPlayerItemTrack) >+SOFT_LINK_CLASS(AVFoundation, AVMediaSelectionGroup) >+SOFT_LINK_CLASS(AVFoundation, AVMediaSelectionOption) >+SOFT_LINK_CLASS(AVFoundation, AVMetadataItem) >+ >+SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVMediaCharacteristicIsMainProgramContent, NSString *) >+SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVMediaCharacteristicDescribesVideoForAccessibility, NSString *) 
>+SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVMediaCharacteristicIsAuxiliaryContent, NSString *) >+SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVMediaCharacteristicTranscribesSpokenDialogForAccessibility, NSString *) >+SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVMetadataCommonKeyTitle, NSString *) >+SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVMetadataKeySpaceCommon, NSString *) >+ >+#define AVMetadataItem getAVMetadataItemClass() >+ >+#define AVMediaCharacteristicIsMainProgramContent getAVMediaCharacteristicIsMainProgramContent() >+#define AVMediaCharacteristicDescribesVideoForAccessibility getAVMediaCharacteristicDescribesVideoForAccessibility() >+#define AVMediaCharacteristicIsAuxiliaryContent getAVMediaCharacteristicIsAuxiliaryContent() >+#define AVMediaCharacteristicTranscribesSpokenDialogForAccessibility getAVMediaCharacteristicTranscribesSpokenDialogForAccessibility() >+#define AVMetadataCommonKeyTitle getAVMetadataCommonKeyTitle() >+#define AVMetadataKeySpaceCommon getAVMetadataKeySpaceCommon() >+ > namespace WebCore { > > AVTrackPrivateAVFObjCImpl::AVTrackPrivateAVFObjCImpl(AVPlayerItemTrack* track) >@@ -88,22 +112,22 @@ void AVTrackPrivateAVFObjCImpl::setEnabled(bool enabled) > AudioTrackPrivate::Kind AVTrackPrivateAVFObjCImpl::audioKind() const > { > if (m_assetTrack) { >- if ([m_assetTrack hasMediaCharacteristic:AVMediaCharacteristicIsAuxiliaryContent]) >+ if (canLoadAVMediaCharacteristicIsAuxiliaryContent() && [m_assetTrack hasMediaCharacteristic:AVMediaCharacteristicIsAuxiliaryContent]) > return AudioTrackPrivate::Alternative; >- if ([m_assetTrack hasMediaCharacteristic:AVMediaCharacteristicDescribesVideoForAccessibility]) >+ if (canLoadAVMediaCharacteristicDescribesVideoForAccessibility() && [m_assetTrack hasMediaCharacteristic:AVMediaCharacteristicDescribesVideoForAccessibility]) > return AudioTrackPrivate::Description; >- if ([m_assetTrack hasMediaCharacteristic:AVMediaCharacteristicIsMainProgramContent]) >+ if (canLoadAVMediaCharacteristicIsMainProgramContent() && [m_assetTrack hasMediaCharacteristic:AVMediaCharacteristicIsMainProgramContent]) > return AudioTrackPrivate::Main; > return AudioTrackPrivate::None; > } > > if (m_mediaSelectionOption) { > AVMediaSelectionOption *option = m_mediaSelectionOption->avMediaSelectionOption(); >- if ([option hasMediaCharacteristic:AVMediaCharacteristicIsAuxiliaryContent]) >+ if (canLoadAVMediaCharacteristicIsAuxiliaryContent() && [option hasMediaCharacteristic:AVMediaCharacteristicIsAuxiliaryContent]) > return AudioTrackPrivate::Alternative; >- if ([option hasMediaCharacteristic:AVMediaCharacteristicDescribesVideoForAccessibility]) >+ if (canLoadAVMediaCharacteristicDescribesVideoForAccessibility() && [option hasMediaCharacteristic:AVMediaCharacteristicDescribesVideoForAccessibility]) > return AudioTrackPrivate::Description; >- if ([option hasMediaCharacteristic:AVMediaCharacteristicIsMainProgramContent]) >+ if (canLoadAVMediaCharacteristicIsMainProgramContent() && [option hasMediaCharacteristic:AVMediaCharacteristicIsMainProgramContent]) > return AudioTrackPrivate::Main; > return AudioTrackPrivate::None; > } >@@ -115,26 +139,26 @@ AudioTrackPrivate::Kind AVTrackPrivateAVFObjCImpl::audioKind() const > VideoTrackPrivate::Kind AVTrackPrivateAVFObjCImpl::videoKind() const > { > if (m_assetTrack) { >- if ([m_assetTrack hasMediaCharacteristic:AVMediaCharacteristicDescribesVideoForAccessibility]) >+ if (canLoadAVMediaCharacteristicDescribesVideoForAccessibility() && [m_assetTrack 
hasMediaCharacteristic:AVMediaCharacteristicDescribesVideoForAccessibility]) > return VideoTrackPrivate::Sign; >- if ([m_assetTrack hasMediaCharacteristic:AVMediaCharacteristicTranscribesSpokenDialogForAccessibility]) >+ if (canLoadAVMediaCharacteristicTranscribesSpokenDialogForAccessibility() && [m_assetTrack hasMediaCharacteristic:AVMediaCharacteristicTranscribesSpokenDialogForAccessibility]) > return VideoTrackPrivate::Captions; >- if ([m_assetTrack hasMediaCharacteristic:AVMediaCharacteristicIsAuxiliaryContent]) >+ if (canLoadAVMediaCharacteristicIsAuxiliaryContent() && [m_assetTrack hasMediaCharacteristic:AVMediaCharacteristicIsAuxiliaryContent]) > return VideoTrackPrivate::Alternative; >- if ([m_assetTrack hasMediaCharacteristic:AVMediaCharacteristicIsMainProgramContent]) >+ if (canLoadAVMediaCharacteristicIsMainProgramContent() && [m_assetTrack hasMediaCharacteristic:AVMediaCharacteristicIsMainProgramContent]) > return VideoTrackPrivate::Main; > return VideoTrackPrivate::None; > } > > if (m_mediaSelectionOption) { > AVMediaSelectionOption *option = m_mediaSelectionOption->avMediaSelectionOption(); >- if ([option hasMediaCharacteristic:AVMediaCharacteristicDescribesVideoForAccessibility]) >+ if (canLoadAVMediaCharacteristicDescribesVideoForAccessibility() && [option hasMediaCharacteristic:AVMediaCharacteristicDescribesVideoForAccessibility]) > return VideoTrackPrivate::Sign; >- if ([option hasMediaCharacteristic:AVMediaCharacteristicTranscribesSpokenDialogForAccessibility]) >+ if (canLoadAVMediaCharacteristicTranscribesSpokenDialogForAccessibility() && [option hasMediaCharacteristic:AVMediaCharacteristicTranscribesSpokenDialogForAccessibility]) > return VideoTrackPrivate::Captions; >- if ([option hasMediaCharacteristic:AVMediaCharacteristicIsAuxiliaryContent]) >+ if (canLoadAVMediaCharacteristicIsAuxiliaryContent() && [option hasMediaCharacteristic:AVMediaCharacteristicIsAuxiliaryContent]) > return VideoTrackPrivate::Alternative; >- if ([option hasMediaCharacteristic:AVMediaCharacteristicIsMainProgramContent]) >+ if (canLoadAVMediaCharacteristicIsMainProgramContent() && [option hasMediaCharacteristic:AVMediaCharacteristicIsMainProgramContent]) > return VideoTrackPrivate::Main; > return VideoTrackPrivate::None; > } >@@ -165,6 +189,9 @@ AtomicString AVTrackPrivateAVFObjCImpl::id() const > > AtomicString AVTrackPrivateAVFObjCImpl::label() const > { >+ if (!canLoadAVMetadataCommonKeyTitle() || !canLoadAVMetadataKeySpaceCommon()) >+ return emptyAtom(); >+ > NSArray *commonMetadata = nil; > if (m_assetTrack) > commonMetadata = [m_assetTrack commonMetadata]; >@@ -173,12 +200,12 @@ AtomicString AVTrackPrivateAVFObjCImpl::label() const > else > ASSERT_NOT_REACHED(); > >- NSArray *titles = [PAL::getAVMetadataItemClass() metadataItemsFromArray:commonMetadata withKey:AVMetadataCommonKeyTitle keySpace:AVMetadataKeySpaceCommon]; >+ NSArray *titles = [AVMetadataItem metadataItemsFromArray:commonMetadata withKey:AVMetadataCommonKeyTitle keySpace:AVMetadataKeySpaceCommon]; > if (![titles count]) > return emptyAtom(); > > // If possible, return a title in one of the user's preferred languages. 
>- NSArray *titlesForPreferredLanguages = [PAL::getAVMetadataItemClass() metadataItemsFromArray:titles filteredAndSortedAccordingToPreferredLanguages:[NSLocale preferredLanguages]]; >+ NSArray *titlesForPreferredLanguages = [AVMetadataItem metadataItemsFromArray:titles filteredAndSortedAccordingToPreferredLanguages:[NSLocale preferredLanguages]]; > if ([titlesForPreferredLanguages count]) > return [[titlesForPreferredLanguages objectAtIndex:0] stringValue]; > return [[titles objectAtIndex:0] stringValue]; >diff --git a/Source/WebCore/platform/graphics/avfoundation/AudioSourceProviderAVFObjC.mm b/Source/WebCore/platform/graphics/avfoundation/AudioSourceProviderAVFObjC.mm >index a576b6b71149421932c1433c950de5897a4a5b07..46090561811ed1c4482721015e7ec41b2eb90f8d 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/AudioSourceProviderAVFObjC.mm >+++ b/Source/WebCore/platform/graphics/avfoundation/AudioSourceProviderAVFObjC.mm >@@ -48,11 +48,15 @@ > #endif > > #import <pal/cf/CoreMediaSoftLink.h> >-#import <pal/cocoa/AVFoundationSoftLink.h> > >+SOFT_LINK_FRAMEWORK(AVFoundation) > SOFT_LINK_FRAMEWORK(MediaToolbox) > SOFT_LINK_FRAMEWORK(AudioToolbox) > >+SOFT_LINK_CLASS(AVFoundation, AVPlayerItem) >+SOFT_LINK_CLASS(AVFoundation, AVMutableAudioMix) >+SOFT_LINK_CLASS(AVFoundation, AVMutableAudioMixInputParameters) >+ > SOFT_LINK(AudioToolbox, AudioConverterConvertComplexBuffer, OSStatus, (AudioConverterRef inAudioConverter, UInt32 inNumberPCMFrames, const AudioBufferList* inInputData, AudioBufferList* outOutputData), (inAudioConverter, inNumberPCMFrames, inInputData, outOutputData)) > SOFT_LINK(AudioToolbox, AudioConverterNew, OSStatus, (const AudioStreamBasicDescription* inSourceFormat, const AudioStreamBasicDescription* inDestinationFormat, AudioConverterRef* outAudioConverter), (inSourceFormat, inDestinationFormat, outAudioConverter)) > >@@ -203,7 +207,7 @@ void AudioSourceProviderAVFObjC::createMix() > ASSERT(m_avPlayerItem); > ASSERT(m_client); > >- m_avAudioMix = adoptNS([PAL::allocAVMutableAudioMixInstance() init]); >+ m_avAudioMix = adoptNS([allocAVMutableAudioMixInstance() init]); > > MTAudioProcessingTapCallbacks callbacks = { > 0, >@@ -220,7 +224,7 @@ void AudioSourceProviderAVFObjC::createMix() > ASSERT(tap); > ASSERT(m_tap == tap); > >- RetainPtr<AVMutableAudioMixInputParameters> parameters = adoptNS([PAL::allocAVMutableAudioMixInputParametersInstance() init]); >+ RetainPtr<AVMutableAudioMixInputParameters> parameters = adoptNS([allocAVMutableAudioMixInputParametersInstance() init]); > [parameters setAudioTapProcessor:m_tap.get()]; > > CMPersistentTrackID trackID = m_avAssetTrack.get().trackID; >diff --git a/Source/WebCore/platform/graphics/avfoundation/MediaPlaybackTargetMac.mm b/Source/WebCore/platform/graphics/avfoundation/MediaPlaybackTargetMac.mm >index 8e6ed7c8b2a1261ed74b939084752c93dd1e7f43..5d5e1473d09450f3d629d6fd989031bd35f48c86 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/MediaPlaybackTargetMac.mm >+++ b/Source/WebCore/platform/graphics/avfoundation/MediaPlaybackTargetMac.mm >@@ -30,8 +30,10 @@ > > #import <objc/runtime.h> > #import <pal/spi/mac/AVFoundationSPI.h> >+#import <wtf/SoftLinking.h> > >-#import <pal/cocoa/AVFoundationSoftLink.h> >+SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >+SOFT_LINK_CLASS(AVFoundation, AVOutputContext) > > namespace WebCore { > >diff --git a/Source/WebCore/platform/graphics/avfoundation/MediaSelectionGroupAVFObjC.mm b/Source/WebCore/platform/graphics/avfoundation/MediaSelectionGroupAVFObjC.mm >index 
83b642172e19e67be9eabd9a6f99abc349048f20..f782d8b7e0fd897d02baa5ad1f77331abb9fabcc 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/MediaSelectionGroupAVFObjC.mm >+++ b/Source/WebCore/platform/graphics/avfoundation/MediaSelectionGroupAVFObjC.mm >@@ -33,15 +33,19 @@ > #import <AVFoundation/AVPlayerItem.h> > #import <objc/runtime.h> > #import <wtf/Language.h> >+#import <wtf/SoftLinking.h> > #import <wtf/text/WTFString.h> > >+SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >+ >+SOFT_LINK_CLASS(AVFoundation, AVMediaSelectionGroup) >+SOFT_LINK_CLASS(AVFoundation, AVMediaSelectionOption) >+ > #if HAVE(MEDIA_ACCESSIBILITY_FRAMEWORK) > #include <MediaAccessibility/MediaAccessibility.h> > #include "MediaAccessibilitySoftLink.h" > #endif > >-#import <pal/cocoa/AVFoundationSoftLink.h> >- > namespace WebCore { > > Ref<MediaSelectionOptionAVFObjC> MediaSelectionOptionAVFObjC::create(MediaSelectionGroupAVFObjC& group, AVMediaSelectionOption *option) >@@ -102,7 +106,7 @@ MediaSelectionGroupAVFObjC::~MediaSelectionGroupAVFObjC() > > void MediaSelectionGroupAVFObjC::updateOptions(const Vector<String>& characteristics) > { >- RetainPtr<NSSet> newAVOptions = adoptNS([[NSSet alloc] initWithArray:[PAL::getAVMediaSelectionGroupClass() playableMediaSelectionOptionsFromArray:[m_mediaSelectionGroup options]]]); >+ RetainPtr<NSSet> newAVOptions = adoptNS([[NSSet alloc] initWithArray:[getAVMediaSelectionGroupClass() playableMediaSelectionOptionsFromArray:[m_mediaSelectionGroup options]]]); > RetainPtr<NSMutableSet> oldAVOptions = adoptNS([[NSMutableSet alloc] initWithCapacity:m_options.size()]); > for (auto& avOption : m_options.keys()) > [oldAVOptions addObject:(__bridge AVMediaSelectionOption *)avOption]; >@@ -135,7 +139,7 @@ void MediaSelectionGroupAVFObjC::updateOptions(const Vector<String>& characteris > RetainPtr<NSMutableArray> nsLanguages = adoptNS([[NSMutableArray alloc] initWithCapacity:userPreferredLanguages().size()]); > for (auto& language : userPreferredLanguages()) > [nsLanguages addObject:(NSString*)language]; >- NSArray* filteredOptions = [PAL::getAVMediaSelectionGroupClass() mediaSelectionOptionsFromArray:[m_mediaSelectionGroup options] filteredAndSortedAccordingToPreferredLanguages:nsLanguages.get()]; >+ NSArray* filteredOptions = [getAVMediaSelectionGroupClass() mediaSelectionOptionsFromArray:[m_mediaSelectionGroup options] filteredAndSortedAccordingToPreferredLanguages:nsLanguages.get()]; > > if (![filteredOptions count] && characteristics.isEmpty()) > return; >@@ -148,7 +152,7 @@ void MediaSelectionGroupAVFObjC::updateOptions(const Vector<String>& characteris > for (auto& characteristic : characteristics) > [nsCharacteristics addObject:(NSString *)characteristic]; > >- NSArray* optionsWithCharacteristics = [PAL::getAVMediaSelectionGroupClass() mediaSelectionOptionsFromArray:filteredOptions withMediaCharacteristics:nsCharacteristics.get()]; >+ NSArray* optionsWithCharacteristics = [getAVMediaSelectionGroupClass() mediaSelectionOptionsFromArray:filteredOptions withMediaCharacteristics:nsCharacteristics.get()]; > if (optionsWithCharacteristics && [optionsWithCharacteristics count]) > filteredOptions = optionsWithCharacteristics; > >diff --git a/Source/WebCore/platform/graphics/avfoundation/objc/AVFoundationMIMETypeCache.mm b/Source/WebCore/platform/graphics/avfoundation/objc/AVFoundationMIMETypeCache.mm >index be47667ccd372fbad6aee0ed58eb8436752db887..0b32f79f5e93368628015670bbd4475431ea12ad 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/objc/AVFoundationMIMETypeCache.mm >+++ 
b/Source/WebCore/platform/graphics/avfoundation/objc/AVFoundationMIMETypeCache.mm >@@ -29,14 +29,18 @@ > #if PLATFORM(COCOA) > > #import "ContentType.h" >+#import <AVFoundation/AVAsset.h> > #import <wtf/HashSet.h> > > #import <pal/cf/CoreMediaSoftLink.h> >-#import <pal/cocoa/AVFoundationSoftLink.h> > >+#if ENABLE(VIDEO) && USE(AVFOUNDATION) > #if !PLATFORM(IOSMAC) > SOFT_LINK_FRAMEWORK_OPTIONAL_PREFLIGHT(AVFoundation) > #endif >+SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >+SOFT_LINK_CLASS(AVFoundation, AVURLAsset) >+#endif > > namespace WebCore { > >@@ -81,7 +85,7 @@ bool AVFoundationMIMETypeCache::canDecodeType(const String& mimeType) > return false; > > #if ENABLE(VIDEO) && USE(AVFOUNDATION) >- return [PAL::getAVURLAssetClass() isPlayableExtendedMIMEType:mimeType]; >+ return [getAVURLAssetClass() isPlayableExtendedMIMEType:mimeType]; > #endif > > return false; >@@ -109,10 +113,10 @@ void AVFoundationMIMETypeCache::loadMIMETypes() > #if ENABLE(VIDEO) && USE(AVFOUNDATION) > static std::once_flag onceFlag; > std::call_once(onceFlag, [this] { >- if (!PAL::AVFoundationLibrary()) >+ if (!AVFoundationLibrary()) > return; > >- for (NSString* type in [PAL::getAVURLAssetClass() audiovisualMIMETypes]) >+ for (NSString* type in [getAVURLAssetClass() audiovisualMIMETypes]) > m_cache->add(type); > > if (m_cacheTypeCallback) >diff --git a/Source/WebCore/platform/graphics/avfoundation/objc/CDMInstanceFairPlayStreamingAVFObjC.mm b/Source/WebCore/platform/graphics/avfoundation/objc/CDMInstanceFairPlayStreamingAVFObjC.mm >index 43ef2cd70004902653e431dd2b08ac342abe7ebd..620014d90437b4dd25fc211ac56b2ddbe4640259 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/objc/CDMInstanceFairPlayStreamingAVFObjC.mm >+++ b/Source/WebCore/platform/graphics/avfoundation/objc/CDMInstanceFairPlayStreamingAVFObjC.mm >@@ -39,9 +39,18 @@ > #import <pal/spi/mac/AVFoundationSPI.h> > #import <wtf/Algorithms.h> > #import <wtf/FileSystem.h> >+#import <wtf/SoftLinking.h> > #import <wtf/text/StringHash.h> > >-#import <pal/cocoa/AVFoundationSoftLink.h> >+SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >+SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVContentKeySession); >+SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVContentKeyResponse); >+SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVURLAsset); >+SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVContentKeySystemFairPlayStreaming, NSString*) >+ >+#if PLATFORM(IOS_FAMILY) >+SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVPersistableContentKeyRequest); >+#endif > > static const NSString *PlaybackSessionIdKey = @"PlaybackSessionID"; > >@@ -129,13 +138,13 @@ namespace WebCore { > > bool CDMInstanceFairPlayStreamingAVFObjC::supportsPersistableState() > { >- return [PAL::getAVContentKeySessionClass() respondsToSelector:@selector(pendingExpiredSessionReportsWithAppIdentifier:storageDirectoryAtURL:)]; >+ return [getAVContentKeySessionClass() respondsToSelector:@selector(pendingExpiredSessionReportsWithAppIdentifier:storageDirectoryAtURL:)]; > } > > bool CDMInstanceFairPlayStreamingAVFObjC::supportsPersistentKeys() > { > #if PLATFORM(IOS_FAMILY) >- return PAL::getAVPersistableContentKeyRequestClass(); >+ return getAVPersistableContentKeyRequestClass(); > #else > return false; > #endif >@@ -143,7 +152,7 @@ bool CDMInstanceFairPlayStreamingAVFObjC::supportsPersistentKeys() > > bool CDMInstanceFairPlayStreamingAVFObjC::supportsMediaCapability(const CDMMediaCapability& capability) > { >- if (![PAL::getAVURLAssetClass() isPlayableExtendedMIMEType:capability.contentType]) >+ if (![getAVURLAssetClass() 
isPlayableExtendedMIMEType:capability.contentType]) > return false; > > // FairPlay only supports 'cbcs' encryption: >@@ -168,7 +177,7 @@ CDMInstance::SuccessValue CDMInstanceFairPlayStreamingAVFObjC::initializeWithCon > if (configuration.sessionTypes.contains(CDMSessionType::PersistentLicense) && !supportsPersistentKeys()) > return Failed; > >- if (!PAL::canLoad_AVFoundation_AVContentKeySystemFairPlayStreaming()) >+ if (!canLoadAVContentKeySystemFairPlayStreaming()) > return Failed; > > return Succeeded; >@@ -361,7 +370,7 @@ void CDMInstanceSessionFairPlayStreamingAVFObjC::updateLicense(const String&, Li > } > > RetainPtr<NSData> appIdentifier = certificate->createNSData(); >- [PAL::getAVContentKeySessionClass() removePendingExpiredSessionReports:expiredSessions.get() withAppIdentifier:appIdentifier.get() storageDirectoryAtURL:storageURL]; >+ [getAVContentKeySessionClass() removePendingExpiredSessionReports:expiredSessions.get() withAppIdentifier:appIdentifier.get() storageDirectoryAtURL:storageURL]; > callback(false, { }, WTF::nullopt, WTF::nullopt, Succeeded); > return; > } >@@ -382,7 +391,7 @@ void CDMInstanceSessionFairPlayStreamingAVFObjC::updateLicense(const String&, Li > return; > } > >- [m_currentRequest processContentKeyResponse:[PAL::getAVContentKeyResponseClass() contentKeyResponseWithFairPlayStreamingKeyResponseData:responseData.createNSData().get()]]; >+ [m_currentRequest processContentKeyResponse:[getAVContentKeyResponseClass() contentKeyResponseWithFairPlayStreamingKeyResponseData:responseData.createNSData().get()]]; > > // FIXME(rdar://problem/35592277): stash the callback and call it once AVContentKeyResponse supports a success callback. > struct objc_method_description method = protocol_getMethodDescription(@protocol(AVContentKeySessionDelegate), @selector(contentKeySession:contentKeyRequestDidSucceed:), NO, YES); >@@ -414,7 +423,7 @@ void CDMInstanceSessionFairPlayStreamingAVFObjC::loadSession(LicenseType license > > RetainPtr<NSData> appIdentifier = certificate->createNSData(); > KeyStatusVector changedKeys; >- for (NSData* expiredSessionData in [PAL::getAVContentKeySessionClass() pendingExpiredSessionReportsWithAppIdentifier:appIdentifier.get() storageDirectoryAtURL:storageURL]) { >+ for (NSData* expiredSessionData in [getAVContentKeySessionClass() pendingExpiredSessionReportsWithAppIdentifier:appIdentifier.get() storageDirectoryAtURL:storageURL]) { > NSDictionary *expiredSession = [NSPropertyListSerialization propertyListWithData:expiredSessionData options:kCFPropertyListImmutable format:nullptr error:nullptr]; > NSString *playbackSessionIdValue = (NSString *)[expiredSession objectForKey:PlaybackSessionIdKey]; > if (![playbackSessionIdValue isKindOfClass:[NSString class]]) >@@ -473,7 +482,7 @@ void CDMInstanceSessionFairPlayStreamingAVFObjC::removeSessionData(const String& > RetainPtr<NSData> appIdentifier = certificate->createNSData(); > RetainPtr<NSMutableArray> expiredSessionsArray = adoptNS([[NSMutableArray alloc] init]); > KeyStatusVector changedKeys; >- for (NSData* expiredSessionData in [PAL::getAVContentKeySessionClass() pendingExpiredSessionReportsWithAppIdentifier:appIdentifier.get() storageDirectoryAtURL:storageURL]) { >+ for (NSData* expiredSessionData in [getAVContentKeySessionClass() pendingExpiredSessionReportsWithAppIdentifier:appIdentifier.get() storageDirectoryAtURL:storageURL]) { > NSDictionary *expiredSession = [NSPropertyListSerialization propertyListWithData:expiredSessionData options:kCFPropertyListImmutable format:nullptr error:nullptr]; > 
NSString *playbackSessionIdValue = (NSString *)[expiredSession objectForKey:PlaybackSessionIdKey]; > if (![playbackSessionIdValue isKindOfClass:[NSString class]]) >@@ -706,9 +715,9 @@ AVContentKeySession* CDMInstanceSessionFairPlayStreamingAVFObjC::ensureSession() > > auto storageURL = m_instance->storageURL(); > if (!m_instance->persistentStateAllowed() || !storageURL) >- m_session = [PAL::getAVContentKeySessionClass() contentKeySessionWithKeySystem:AVContentKeySystemFairPlayStreaming]; >+ m_session = [getAVContentKeySessionClass() contentKeySessionWithKeySystem:getAVContentKeySystemFairPlayStreaming()]; > else >- m_session = [PAL::getAVContentKeySessionClass() contentKeySessionWithKeySystem:AVContentKeySystemFairPlayStreaming storageDirectoryAtURL:storageURL]; >+ m_session = [getAVContentKeySessionClass() contentKeySessionWithKeySystem:getAVContentKeySystemFairPlayStreaming() storageDirectoryAtURL:storageURL]; > > if (!m_session) > return nullptr; >diff --git a/Source/WebCore/platform/graphics/avfoundation/objc/CDMSessionAVContentKeySession.mm b/Source/WebCore/platform/graphics/avfoundation/objc/CDMSessionAVContentKeySession.mm >index 1f468797700c05f39697c528b86d6471e83c313f..9839f353f82c1ac7a2b58cf765f0cdc7a6420eb8 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/objc/CDMSessionAVContentKeySession.mm >+++ b/Source/WebCore/platform/graphics/avfoundation/objc/CDMSessionAVContentKeySession.mm >@@ -40,8 +40,14 @@ > #import <objc/objc-runtime.h> > #import <pal/spi/mac/AVFoundationSPI.h> > #import <wtf/FileSystem.h> >+#import <wtf/SoftLinking.h> > >-#import <pal/cocoa/AVFoundationSoftLink.h> >+SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >+SOFT_LINK_CLASS(AVFoundation, AVStreamDataParser); >+SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVContentKeySession); >+SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVContentKeyResponse); >+SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVContentKeyRequestProtocolVersionsKey, NSString *) >+SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVContentKeySystemFairPlayStreaming, NSString *) > > typedef NSString *AVContentKeySystem; > >@@ -121,7 +127,7 @@ CDMSessionAVContentKeySession::~CDMSessionAVContentKeySession() > > bool CDMSessionAVContentKeySession::isAvailable() > { >- return PAL::getAVContentKeySessionClass(); >+ return getAVContentKeySessionClass(); > } > > RefPtr<Uint8Array> CDMSessionAVContentKeySession::generateKeyRequest(const String& mimeType, Uint8Array* initData, String& destinationURL, unsigned short& errorCode, uint32_t& systemCode) >@@ -167,7 +173,7 @@ void CDMSessionAVContentKeySession::releaseKeys() > if (!m_certificate) > return; > >- if (![PAL::getAVContentKeySessionClass() respondsToSelector:@selector(pendingExpiredSessionReportsWithAppIdentifier:storageDirectoryAtURL:)]) >+ if (![getAVContentKeySessionClass() respondsToSelector:@selector(pendingExpiredSessionReportsWithAppIdentifier:storageDirectoryAtURL:)]) > return; > > auto storagePath = this->storagePath(); >@@ -175,7 +181,7 @@ void CDMSessionAVContentKeySession::releaseKeys() > return; > > RetainPtr<NSData> certificateData = adoptNS([[NSData alloc] initWithBytes:m_certificate->data() length:m_certificate->length()]); >- NSArray* expiredSessions = [PAL::getAVContentKeySessionClass() pendingExpiredSessionReportsWithAppIdentifier:certificateData.get() storageDirectoryAtURL:[NSURL fileURLWithPath:storagePath]]; >+ NSArray* expiredSessions = [getAVContentKeySessionClass() pendingExpiredSessionReportsWithAppIdentifier:certificateData.get() storageDirectoryAtURL:[NSURL fileURLWithPath:storagePath]]; > 
for (NSData* expiredSessionData in expiredSessions) { > NSDictionary *expiredSession = [NSPropertyListSerialization propertyListWithData:expiredSessionData options:kCFPropertyListImmutable format:nullptr error:nullptr]; > NSString *playbackSessionIdValue = (NSString *)[expiredSession objectForKey:PlaybackSessionIdKey]; >@@ -223,8 +229,8 @@ bool CDMSessionAVContentKeySession::update(Uint8Array* key, RefPtr<Uint8Array>& > > RetainPtr<NSData> certificateData = adoptNS([[NSData alloc] initWithBytes:m_certificate->data() length:m_certificate->length()]); > >- if ([PAL::getAVContentKeySessionClass() respondsToSelector:@selector(removePendingExpiredSessionReports:withAppIdentifier:storageDirectoryAtURL:)]) >- [PAL::getAVContentKeySessionClass() removePendingExpiredSessionReports:@[m_expiredSession.get()] withAppIdentifier:certificateData.get() storageDirectoryAtURL:[NSURL fileURLWithPath:storagePath]]; >+ if ([getAVContentKeySessionClass() respondsToSelector:@selector(removePendingExpiredSessionReports:withAppIdentifier:storageDirectoryAtURL:)]) >+ [getAVContentKeySessionClass() removePendingExpiredSessionReports:@[m_expiredSession.get()] withAppIdentifier:certificateData.get() storageDirectoryAtURL:[NSURL fileURLWithPath:storagePath]]; > m_expiredSession = nullptr; > return true; > } >@@ -278,7 +284,7 @@ bool CDMSessionAVContentKeySession::update(Uint8Array* key, RefPtr<Uint8Array>& > RetainPtr<NSData> certificateData = adoptNS([[NSData alloc] initWithBytes:m_certificate->data() length:m_certificate->length()]); > > RetainPtr<NSDictionary> options; >- if (!m_protocolVersions.isEmpty() && PAL::canLoad_AVFoundation_AVContentKeyRequestProtocolVersionsKey()) { >+ if (!m_protocolVersions.isEmpty() && canLoadAVContentKeyRequestProtocolVersionsKey()) { > RetainPtr<NSMutableArray> protocolVersionsOption = adoptNS([[NSMutableArray alloc] init]); > for (auto& version : m_protocolVersions) { > if (!version) >@@ -286,7 +292,7 @@ bool CDMSessionAVContentKeySession::update(Uint8Array* key, RefPtr<Uint8Array>& > [protocolVersionsOption addObject:@(version)]; > } > >- options = @{ AVContentKeyRequestProtocolVersionsKey: protocolVersionsOption.get() }; >+ options = @{ getAVContentKeyRequestProtocolVersionsKey(): protocolVersionsOption.get() }; > } > > errorCode = MediaPlayer::NoError; >@@ -310,8 +316,8 @@ bool CDMSessionAVContentKeySession::update(Uint8Array* key, RefPtr<Uint8Array>& > systemCode = 0; > RetainPtr<NSData> keyData = adoptNS([[NSData alloc] initWithBytes:key->data() length:key->length()]); > >- if ([m_keyRequest respondsToSelector:@selector(processContentKeyResponse:)] && [PAL::getAVContentKeyResponseClass() respondsToSelector:@selector(contentKeyResponseWithFairPlayStreamingKeyResponseData:)]) >- [m_keyRequest processContentKeyResponse:[PAL::getAVContentKeyResponseClass() contentKeyResponseWithFairPlayStreamingKeyResponseData:keyData.get()]]; >+ if ([m_keyRequest respondsToSelector:@selector(processContentKeyResponse:)] && [getAVContentKeyResponseClass() respondsToSelector:@selector(contentKeyResponseWithFairPlayStreamingKeyResponseData:)]) >+ [m_keyRequest processContentKeyResponse:[getAVContentKeyResponseClass() contentKeyResponseWithFairPlayStreamingKeyResponseData:keyData.get()]]; > else > [m_keyRequest processContentKeyResponseData:keyData.get()]; > >@@ -340,13 +346,13 @@ RefPtr<Uint8Array> CDMSessionAVContentKeySession::generateKeyReleaseMessage(unsi > RetainPtr<NSData> certificateData = adoptNS([[NSData alloc] initWithBytes:m_certificate->data() length:m_certificate->length()]); > > String 
storagePath = this->storagePath(); >- if (storagePath.isEmpty() || ![PAL::getAVContentKeySessionClass() respondsToSelector:@selector(pendingExpiredSessionReportsWithAppIdentifier:storageDirectoryAtURL:)]) { >+ if (storagePath.isEmpty() || ![getAVContentKeySessionClass() respondsToSelector:@selector(pendingExpiredSessionReportsWithAppIdentifier:storageDirectoryAtURL:)]) { > errorCode = MediaPlayer::KeySystemNotSupported; > systemCode = '!mor'; > return nullptr; > } > >- NSArray* expiredSessions = [PAL::getAVContentKeySessionClass() pendingExpiredSessionReportsWithAppIdentifier:certificateData.get() storageDirectoryAtURL:[NSURL fileURLWithPath:storagePath]]; >+ NSArray* expiredSessions = [getAVContentKeySessionClass() pendingExpiredSessionReportsWithAppIdentifier:certificateData.get() storageDirectoryAtURL:[NSURL fileURLWithPath:storagePath]]; > if (![expiredSessions count]) { > LOG(Media, "CDMSessionAVContentKeySession::generateKeyReleaseMessage(%p) - no expired sessions found", this); > >@@ -375,10 +381,10 @@ AVContentKeySession* CDMSessionAVContentKeySession::contentKeySession() > > String storagePath = this->storagePath(); > if (storagePath.isEmpty()) { >- if (![PAL::getAVContentKeySessionClass() respondsToSelector:@selector(contentKeySessionWithKeySystem:)] || !PAL::canLoad_AVFoundation_AVContentKeySystemFairPlayStreaming()) >+ if (![getAVContentKeySessionClass() respondsToSelector:@selector(contentKeySessionWithKeySystem:)] || !canLoadAVContentKeySystemFairPlayStreaming()) > return nil; > >- m_contentKeySession = [PAL::getAVContentKeySessionClass() contentKeySessionWithKeySystem:AVContentKeySystemFairPlayStreaming]; >+ m_contentKeySession = [getAVContentKeySessionClass() contentKeySessionWithKeySystem:getAVContentKeySystemFairPlayStreaming()]; > } else { > String storageDirectory = FileSystem::directoryName(storagePath); > >@@ -388,10 +394,10 @@ AVContentKeySession* CDMSessionAVContentKeySession::contentKeySession() > } > > auto url = [NSURL fileURLWithPath:storagePath]; >- if ([PAL::getAVContentKeySessionClass() respondsToSelector:@selector(contentKeySessionWithKeySystem:storageDirectoryAtURL:)] && PAL::canLoad_AVFoundation_AVContentKeySystemFairPlayStreaming()) >- m_contentKeySession = [PAL::getAVContentKeySessionClass() contentKeySessionWithKeySystem:AVContentKeySystemFairPlayStreaming storageDirectoryAtURL:url]; >+ if ([getAVContentKeySessionClass() respondsToSelector:@selector(contentKeySessionWithKeySystem:storageDirectoryAtURL:)] && canLoadAVContentKeySystemFairPlayStreaming()) >+ m_contentKeySession = [getAVContentKeySessionClass() contentKeySessionWithKeySystem:getAVContentKeySystemFairPlayStreaming() storageDirectoryAtURL:url]; > else >- m_contentKeySession = adoptNS([PAL::allocAVContentKeySessionInstance() initWithStorageDirectoryAtURL:url]); >+ m_contentKeySession = adoptNS([allocAVContentKeySessionInstance() initWithStorageDirectoryAtURL:url]); > } > > m_contentKeySession.get().delegate = m_contentKeySessionDelegate.get(); >diff --git a/Source/WebCore/platform/graphics/avfoundation/objc/CDMSessionAVFoundationObjC.mm b/Source/WebCore/platform/graphics/avfoundation/objc/CDMSessionAVFoundationObjC.mm >index c1efd51c95c285dd9e3bf971ba9416276a40629e..ca19bae08fec5fccba57e49aa0f4eaf8713e5278 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/objc/CDMSessionAVFoundationObjC.mm >+++ b/Source/WebCore/platform/graphics/avfoundation/objc/CDMSessionAVFoundationObjC.mm >@@ -41,6 +41,12 @@ > #import <wtf/SoftLinking.h> > #import <wtf/UUID.h> > 
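[Editor's aside, not part of the patch: throughout this rollout, the shared #import <pal/cocoa/AVFoundationSoftLink.h> is removed and each .mm file gets back its own SOFT_LINK_FRAMEWORK / SOFT_LINK_CLASS / SOFT_LINK_CONSTANT(_MAY_FAIL) block plus #define aliases, as in the hunks above and below. The sketch that follows is only a rough illustration of what that pattern amounts to at run time; the helper names ending in "Illustrative", the hard-coded framework path, and the assumption of manual retain/release (no ARC) are mine, not WebKit API.]

    #import <Foundation/Foundation.h>
    #import <dlfcn.h>

    // Load the framework once; dlopen may return NULL when it is unavailable,
    // which is why the "optional" soft-link variants exist.
    static void* AVFoundationLibraryIllustrative()
    {
        static void* library;
        static dispatch_once_t once;
        dispatch_once(&once, ^{
            library = dlopen("/System/Library/Frameworks/AVFoundation.framework/AVFoundation", RTLD_NOW);
        });
        return library;
    }

    // Class soft link: look the class up by name after the framework is loaded.
    // With an alias like "#define AVURLAsset getAVURLAssetClassIllustrative()",
    // a call site such as [AVURLAsset isPlayableExtendedMIMEType:type] keeps its
    // normal spelling while avoiding a hard link-time dependency on AVFoundation.
    static Class getAVURLAssetClassIllustrative()
    {
        static Class cls;
        static dispatch_once_t once;
        dispatch_once(&once, ^{
            if (AVFoundationLibraryIllustrative())
                cls = NSClassFromString(@"AVURLAsset");
        });
        return cls;
    }

    // "May fail" constant soft link: resolve the exported NSString* lazily and
    // expose a canLoad*() predicate, mirroring the SOFT_LINK_CONSTANT_MAY_FAIL
    // declarations and canLoadAV...() guards restored elsewhere in this patch.
    // (The dereference below assumes non-ARC compilation.)
    static NSString *getAVURLAssetRequiresCustomURLLoadingKeyIllustrative()
    {
        static NSString *constant;
        static dispatch_once_t once;
        dispatch_once(&once, ^{
            if (void* library = AVFoundationLibraryIllustrative()) {
                if (void* symbol = dlsym(library, "AVURLAssetRequiresCustomURLLoadingKey"))
                    constant = *static_cast<NSString * const *>(symbol);
            }
        });
        return constant;
    }

    static bool canLoadAVURLAssetRequiresCustomURLLoadingKeyIllustrative()
    {
        return getAVURLAssetRequiresCustomURLLoadingKeyIllustrative();
    }

[End of aside; the patch resumes below with the soft-link block restored to CDMSessionAVFoundationObjC.mm.]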
>+SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >+SOFT_LINK_CLASS(AVFoundation, AVURLAsset) >+SOFT_LINK_CLASS(AVFoundation, AVAssetResourceLoadingRequest) >+#define AVURLAsset getAVURLAssetClass() >+#define AVAssetResourceLoadingRequest getAVAssetResourceLoadingRequest() >+ > namespace WebCore { > > CDMSessionAVFoundationObjC::CDMSessionAVFoundationObjC(MediaPlayerPrivateAVFoundationObjC* parent, LegacyCDMSessionClient* client) >diff --git a/Source/WebCore/platform/graphics/avfoundation/objc/CDMSessionAVStreamSession.mm b/Source/WebCore/platform/graphics/avfoundation/objc/CDMSessionAVStreamSession.mm >index 99175a75e2ac39fa57487f077b61a84a12880ce1..75587c629f2f392bbe6cc2bb70dc8444e40b4933 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/objc/CDMSessionAVStreamSession.mm >+++ b/Source/WebCore/platform/graphics/avfoundation/objc/CDMSessionAVStreamSession.mm >@@ -40,9 +40,14 @@ > #import <objc/objc-runtime.h> > #import <pal/spi/mac/AVFoundationSPI.h> > #import <wtf/FileSystem.h> >+#import <wtf/SoftLinking.h> > #import <wtf/UUID.h> > >-#import <pal/cocoa/AVFoundationSoftLink.h> >+SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >+SOFT_LINK_CLASS(AVFoundation, AVStreamDataParser); >+SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVStreamSession); >+SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVStreamDataParserContentKeyRequestProtocolVersionsKey, NSString *) >+SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVStreamSessionContentProtectionSessionIdentifierChangedNotification, NSString *) > > @interface AVStreamSession : NSObject > - (void)addStreamDataParser:(AVStreamDataParser *)streamDataParser; >@@ -137,11 +142,11 @@ void CDMSessionAVStreamSession::releaseKeys() > return; > > String storagePath = this->storagePath(); >- if (storagePath.isEmpty() || ![PAL::getAVStreamSessionClass() respondsToSelector:@selector(pendingExpiredSessionReportsWithAppIdentifier:storageDirectoryAtURL:)]) >+ if (storagePath.isEmpty() || ![getAVStreamSessionClass() respondsToSelector:@selector(pendingExpiredSessionReportsWithAppIdentifier:storageDirectoryAtURL:)]) > return; > > RetainPtr<NSData> certificateData = adoptNS([[NSData alloc] initWithBytes:m_certificate->data() length:m_certificate->length()]); >- NSArray* expiredSessions = [PAL::getAVStreamSessionClass() pendingExpiredSessionReportsWithAppIdentifier:certificateData.get() storageDirectoryAtURL:[NSURL fileURLWithPath:storagePath]]; >+ NSArray* expiredSessions = [getAVStreamSessionClass() pendingExpiredSessionReportsWithAppIdentifier:certificateData.get() storageDirectoryAtURL:[NSURL fileURLWithPath:storagePath]]; > for (NSData* expiredSessionData in expiredSessions) { > NSDictionary *expiredSession = [NSPropertyListSerialization propertyListWithData:expiredSessionData options:kCFPropertyListImmutable format:nullptr error:nullptr]; > NSString *playbackSessionIdValue = (NSString *)[expiredSession objectForKey:PlaybackSessionIdKey]; >@@ -195,8 +200,8 @@ bool CDMSessionAVStreamSession::update(Uint8Array* key, RefPtr<Uint8Array>& next > > IGNORE_WARNINGS_BEGIN("objc-literal-conversion") > String storagePath = this->storagePath(); >- if (!storagePath.isEmpty() && [PAL::getAVStreamSessionClass() respondsToSelector:@selector(removePendingExpiredSessionReports:withAppIdentifier:storageDirectoryAtURL:)]) >- [PAL::getAVStreamSessionClass() removePendingExpiredSessionReports:@[m_expiredSession.get()] withAppIdentifier:certificateData.get() storageDirectoryAtURL:[NSURL fileURLWithPath:storagePath]]; >+ if (!storagePath.isEmpty() && [getAVStreamSessionClass() 
respondsToSelector:@selector(removePendingExpiredSessionReports:withAppIdentifier:storageDirectoryAtURL:)]) >+ [getAVStreamSessionClass() removePendingExpiredSessionReports:@[m_expiredSession.get()] withAppIdentifier:certificateData.get() storageDirectoryAtURL:[NSURL fileURLWithPath:storagePath]]; > IGNORE_WARNINGS_END > m_expiredSession = nullptr; > return true; >@@ -225,7 +230,7 @@ bool CDMSessionAVStreamSession::update(Uint8Array* key, RefPtr<Uint8Array>& next > RetainPtr<NSData> initData = adoptNS([[NSData alloc] initWithBytes:m_initData->data() length:m_initData->length()]); > > RetainPtr<NSDictionary> options; >- if (!m_protocolVersions.isEmpty()) { >+ if (!m_protocolVersions.isEmpty() && canLoadAVStreamDataParserContentKeyRequestProtocolVersionsKey()) { > RetainPtr<NSMutableArray> protocolVersionsOption = adoptNS([[NSMutableArray alloc] init]); > for (auto& version : m_protocolVersions) { > if (!version) >@@ -233,7 +238,7 @@ bool CDMSessionAVStreamSession::update(Uint8Array* key, RefPtr<Uint8Array>& next > [protocolVersionsOption addObject:@(version)]; > } > >- options = @{ AVStreamDataParserContentKeyRequestProtocolVersionsKey: protocolVersionsOption.get() }; >+ options = @{ getAVStreamDataParserContentKeyRequestProtocolVersionsKey(): protocolVersionsOption.get() }; > } > > NSError* error = nil; >@@ -275,15 +280,16 @@ bool CDMSessionAVStreamSession::update(Uint8Array* key, RefPtr<Uint8Array>& next > > void CDMSessionAVStreamSession::setStreamSession(AVStreamSession *streamSession) > { >- if (m_streamSession) >- [[NSNotificationCenter defaultCenter] removeObserver:m_dataParserObserver.get() name:AVStreamSessionContentProtectionSessionIdentifierChangedNotification object:m_streamSession.get()]; >+ if (m_streamSession && canLoadAVStreamSessionContentProtectionSessionIdentifierChangedNotification()) >+ [[NSNotificationCenter defaultCenter] removeObserver:m_dataParserObserver.get() name:getAVStreamSessionContentProtectionSessionIdentifierChangedNotification() object:m_streamSession.get()]; > > m_streamSession = streamSession; > > if (!m_streamSession) > return; > >- [[NSNotificationCenter defaultCenter] addObserver:m_dataParserObserver.get() selector:@selector(contentProtectionSessionIdentifierChanged:) name:AVStreamSessionContentProtectionSessionIdentifierChangedNotification object:m_streamSession.get()]; >+ if (canLoadAVStreamSessionContentProtectionSessionIdentifierChangedNotification()) >+ [[NSNotificationCenter defaultCenter] addObserver:m_dataParserObserver.get() selector:@selector(contentProtectionSessionIdentifierChanged:) name:getAVStreamSessionContentProtectionSessionIdentifierChangedNotification() object:m_streamSession.get()]; > > NSData* identifier = [streamSession contentProtectionSessionIdentifier]; > RetainPtr<NSString> sessionIdentifierString = identifier ? 
adoptNS([[NSString alloc] initWithData:identifier encoding:(NSUTF8StringEncoding)]) : nil; >@@ -309,13 +315,13 @@ RefPtr<Uint8Array> CDMSessionAVStreamSession::generateKeyReleaseMessage(unsigned > RetainPtr<NSData> certificateData = adoptNS([[NSData alloc] initWithBytes:m_certificate->data() length:m_certificate->length()]); > > String storagePath = this->storagePath(); >- if (storagePath.isEmpty() || ![PAL::getAVStreamSessionClass() respondsToSelector:@selector(pendingExpiredSessionReportsWithAppIdentifier:storageDirectoryAtURL:)]) { >+ if (storagePath.isEmpty() || ![getAVStreamSessionClass() respondsToSelector:@selector(pendingExpiredSessionReportsWithAppIdentifier:storageDirectoryAtURL:)]) { > errorCode = MediaPlayer::KeySystemNotSupported; > systemCode = '!mor'; > return nullptr; > } > >- NSArray* expiredSessions = [PAL::getAVStreamSessionClass() pendingExpiredSessionReportsWithAppIdentifier:certificateData.get() storageDirectoryAtURL:[NSURL fileURLWithPath:storagePath]]; >+ NSArray* expiredSessions = [getAVStreamSessionClass() pendingExpiredSessionReportsWithAppIdentifier:certificateData.get() storageDirectoryAtURL:[NSURL fileURLWithPath:storagePath]]; > if (![expiredSessions count]) { > LOG(Media, "CDMSessionAVStreamSession::generateKeyReleaseMessage(%p) - no expired sessions found", this); > >diff --git a/Source/WebCore/platform/graphics/avfoundation/objc/ImageDecoderAVFObjC.mm b/Source/WebCore/platform/graphics/avfoundation/objc/ImageDecoderAVFObjC.mm >index d3f547f99c46f5de47165e05f441462b374eaa44..17dfd7a8f2c276c08db1f7f8aff7eb6d6d0dd2ae 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/objc/ImageDecoderAVFObjC.mm >+++ b/Source/WebCore/platform/graphics/avfoundation/objc/ImageDecoderAVFObjC.mm >@@ -52,12 +52,25 @@ > #import <wtf/MediaTime.h> > #import <wtf/NeverDestroyed.h> > #import <wtf/Optional.h> >+#import <wtf/SoftLinking.h> > #import <wtf/Vector.h> > >+#import <pal/cf/CoreMediaSoftLink.h> > #import "CoreVideoSoftLink.h" > #import "VideoToolboxSoftLink.h" >-#import <pal/cf/CoreMediaSoftLink.h> >-#import <pal/cocoa/AVFoundationSoftLink.h> >+ >+#pragma mark - Soft Linking >+ >+SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >+SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVURLAsset) >+SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVAssetReader) >+SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVAssetReaderSampleReferenceOutput) >+SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVMediaCharacteristicVisual, NSString *) >+SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVURLAssetReferenceRestrictionsKey, NSString *) >+SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVURLAssetUsesNoPersistentCacheKey, NSString *) >+#define AVMediaCharacteristicVisual getAVMediaCharacteristicVisual() >+#define AVURLAssetReferenceRestrictionsKey getAVURLAssetReferenceRestrictionsKey() >+#define AVURLAssetUsesNoPersistentCacheKey getAVURLAssetUsesNoPersistentCacheKey() > > #pragma mark - > >@@ -225,6 +238,9 @@ static NSURL *customSchemeURL() > static NSDictionary *imageDecoderAssetOptions() > { > static NSDictionary *options = [] { >+ // FIXME: Are these keys really optional? 
>+ if (!canLoadAVURLAssetReferenceRestrictionsKey() || !canLoadAVURLAssetUsesNoPersistentCacheKey()) >+ return [@{ } retain]; > return [@{ > AVURLAssetReferenceRestrictionsKey: @(AVAssetReferenceRestrictionForbidAll), > AVURLAssetUsesNoPersistentCacheKey: @YES, >@@ -350,7 +366,7 @@ ImageDecoderAVFObjC::ImageDecoderAVFObjC(SharedBuffer& data, const String& mimeT > : ImageDecoder() > , m_mimeType(mimeType) > , m_uti(WebCore::UTIFromMIMEType(mimeType)) >- , m_asset(adoptNS([PAL::allocAVURLAssetInstance() initWithURL:customSchemeURL() options:imageDecoderAssetOptions()])) >+ , m_asset(adoptNS([allocAVURLAssetInstance() initWithURL:customSchemeURL() options:imageDecoderAssetOptions()])) > , m_loader(adoptNS([[WebCoreSharedBufferResourceLoaderDelegate alloc] initWithParent:this])) > , m_decompressionSession(WebCoreDecompressionSession::createRGB()) > { >@@ -383,6 +399,12 @@ bool ImageDecoderAVFObjC::canDecodeType(const String& mimeType) > > AVAssetTrack *ImageDecoderAVFObjC::firstEnabledTrack() > { >+ // FIXME: Is AVMediaCharacteristicVisual truly optional? >+ if (!canLoadAVMediaCharacteristicVisual()) { >+ LOG(Images, "ImageDecoderAVFObjC::firstEnabledTrack(%p) - AVMediaCharacteristicVisual is not supported", this); >+ return nil; >+ } >+ > NSArray<AVAssetTrack *> *videoTracks = [m_asset tracksWithMediaCharacteristic:AVMediaCharacteristicVisual]; > NSUInteger firstEnabledIndex = [videoTracks indexOfObjectPassingTest:^(AVAssetTrack *track, NSUInteger, BOOL*) { > return track.enabled; >@@ -401,8 +423,8 @@ void ImageDecoderAVFObjC::readSamples() > if (!m_sampleData.empty()) > return; > >- auto assetReader = adoptNS([PAL::allocAVAssetReaderInstance() initWithAsset:m_asset.get() error:nil]); >- auto referenceOutput = adoptNS([PAL::allocAVAssetReaderSampleReferenceOutputInstance() initWithTrack:m_track.get()]); >+ auto assetReader = adoptNS([allocAVAssetReaderInstance() initWithAsset:m_asset.get() error:nil]); >+ auto referenceOutput = adoptNS([allocAVAssetReaderSampleReferenceOutputInstance() initWithTrack:m_track.get()]); > > referenceOutput.get().alwaysCopiesSampleData = NO; > [assetReader addOutput:referenceOutput.get()]; >diff --git a/Source/WebCore/platform/graphics/avfoundation/objc/InbandTextTrackPrivateAVFObjC.mm b/Source/WebCore/platform/graphics/avfoundation/objc/InbandTextTrackPrivateAVFObjC.mm >index 3067273ed62e5a1fa6fec4adbd228258f5c0b540..d88f7c7479b2b4be07890f9096f7d94a0f7124a3 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/objc/InbandTextTrackPrivateAVFObjC.mm >+++ b/Source/WebCore/platform/graphics/avfoundation/objc/InbandTextTrackPrivateAVFObjC.mm >@@ -39,8 +39,42 @@ > #import <AVFoundation/AVPlayerItem.h> > #import <AVFoundation/AVPlayerItemOutput.h> > #import <objc/runtime.h> >- >-#import <pal/cocoa/AVFoundationSoftLink.h> >+#import <wtf/SoftLinking.h> >+ >+SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >+ >+SOFT_LINK_CLASS(AVFoundation, AVPlayer) >+SOFT_LINK_CLASS(AVFoundation, AVPlayerItem) >+SOFT_LINK_CLASS(AVFoundation, AVMetadataItem) >+SOFT_LINK_CLASS(AVFoundation, AVPlayerItemLegibleOutput) >+#define AVMediaCharacteristicVisual getAVMediaCharacteristicVisual() >+#define AVMediaCharacteristicAudible getAVMediaCharacteristicAudible() >+#define AVMediaTypeClosedCaption getAVMediaTypeClosedCaption() >+#define AVMediaCharacteristicContainsOnlyForcedSubtitles getAVMediaCharacteristicContainsOnlyForcedSubtitles() >+#define AVMediaCharacteristicIsMainProgramContent getAVMediaCharacteristicIsMainProgramContent() >+#define AVMediaCharacteristicEasyToRead 
getAVMediaCharacteristicEasyToRead() >+ >+SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeClosedCaption, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVMediaCharacteristicLegible, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVMetadataCommonKeyTitle, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVMetadataKeySpaceCommon, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeSubtitle, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVMediaCharacteristicTranscribesSpokenDialogForAccessibility, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVMediaCharacteristicDescribesMusicAndSoundForAccessibility, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVMediaCharacteristicContainsOnlyForcedSubtitles, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVMediaCharacteristicIsMainProgramContent, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVMediaCharacteristicEasyToRead, NSString *) >+ >+#define AVPlayer getAVPlayerClass() >+#define AVPlayerItem getAVPlayerItemClass() >+#define AVMetadataItem getAVMetadataItemClass() >+#define AVPlayerItemLegibleOutput getAVPlayerItemLegibleOutputClass() >+#define AVMediaCharacteristicLegible getAVMediaCharacteristicLegible() >+#define AVMetadataCommonKeyTitle getAVMetadataCommonKeyTitle() >+#define AVMetadataKeySpaceCommon getAVMetadataKeySpaceCommon() >+#define AVMediaTypeSubtitle getAVMediaTypeSubtitle() >+#define AVMediaCharacteristicTranscribesSpokenDialogForAccessibility getAVMediaCharacteristicTranscribesSpokenDialogForAccessibility() >+#define AVMediaCharacteristicDescribesMusicAndSoundForAccessibility getAVMediaCharacteristicDescribesMusicAndSoundForAccessibility() > > namespace WebCore { > >@@ -136,10 +170,10 @@ AtomicString InbandTextTrackPrivateAVFObjC::label() const > > NSString *title = 0; > >- NSArray *titles = [PAL::getAVMetadataItemClass() metadataItemsFromArray:[m_mediaSelectionOption.get() commonMetadata] withKey:AVMetadataCommonKeyTitle keySpace:AVMetadataKeySpaceCommon]; >+ NSArray *titles = [AVMetadataItem metadataItemsFromArray:[m_mediaSelectionOption.get() commonMetadata] withKey:AVMetadataCommonKeyTitle keySpace:AVMetadataKeySpaceCommon]; > if ([titles count]) { > // If possible, return a title in one of the user's preferred languages. 
>- NSArray *titlesForPreferredLanguages = [PAL::getAVMetadataItemClass() metadataItemsFromArray:titles filteredAndSortedAccordingToPreferredLanguages:[NSLocale preferredLanguages]]; >+ NSArray *titlesForPreferredLanguages = [AVMetadataItem metadataItemsFromArray:titles filteredAndSortedAccordingToPreferredLanguages:[NSLocale preferredLanguages]]; > if ([titlesForPreferredLanguages count]) > title = [[titlesForPreferredLanguages objectAtIndex:0] stringValue]; > >diff --git a/Source/WebCore/platform/graphics/avfoundation/objc/InbandTextTrackPrivateLegacyAVFObjC.mm b/Source/WebCore/platform/graphics/avfoundation/objc/InbandTextTrackPrivateLegacyAVFObjC.mm >index ba0dfdea3e4ab45d71ab790e931f4764a31ba427..4c7a45f2d3079189880c736934b494d386a1794a 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/objc/InbandTextTrackPrivateLegacyAVFObjC.mm >+++ b/Source/WebCore/platform/graphics/avfoundation/objc/InbandTextTrackPrivateLegacyAVFObjC.mm >@@ -33,8 +33,24 @@ > #import "Logging.h" > #import "MediaPlayerPrivateAVFoundationObjC.h" > #import <objc/runtime.h> >+#import <wtf/SoftLinking.h> > >-#import <pal/cocoa/AVFoundationSoftLink.h> >+SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >+ >+SOFT_LINK_CLASS(AVFoundation, AVPlayerItem) >+SOFT_LINK_CLASS(AVFoundation, AVMetadataItem) >+#define AVMediaTypeClosedCaption getAVMediaTypeClosedCaption() >+ >+SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeClosedCaption, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVMediaCharacteristicLegible, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVMetadataCommonKeyTitle, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVMetadataKeySpaceCommon, NSString *) >+ >+#define AVPlayerItem getAVPlayerItemClass() >+#define AVMetadataItem getAVMetadataItemClass() >+#define AVMediaCharacteristicLegible getAVMediaCharacteristicLegible() >+#define AVMetadataCommonKeyTitle getAVMetadataCommonKeyTitle() >+#define AVMetadataKeySpaceCommon getAVMetadataKeySpaceCommon() > > namespace WebCore { > >@@ -85,10 +101,10 @@ AtomicString InbandTextTrackPrivateLegacyAVFObjC::label() const > > NSString *title = 0; > >- NSArray *titles = [PAL::getAVMetadataItemClass() metadataItemsFromArray:[[m_playerItemTrack assetTrack] commonMetadata] withKey:AVMetadataCommonKeyTitle keySpace:AVMetadataKeySpaceCommon]; >+ NSArray *titles = [AVMetadataItem metadataItemsFromArray:[[m_playerItemTrack assetTrack] commonMetadata] withKey:AVMetadataCommonKeyTitle keySpace:AVMetadataKeySpaceCommon]; > if ([titles count]) { > // If possible, return a title in one of the user's preferred languages. 
>- NSArray *titlesForPreferredLanguages = [PAL::getAVMetadataItemClass() metadataItemsFromArray:titles filteredAndSortedAccordingToPreferredLanguages:[NSLocale preferredLanguages]]; >+ NSArray *titlesForPreferredLanguages = [AVMetadataItem metadataItemsFromArray:titles filteredAndSortedAccordingToPreferredLanguages:[NSLocale preferredLanguages]]; > if ([titlesForPreferredLanguages count]) > title = [[titlesForPreferredLanguages objectAtIndex:0] stringValue]; > >diff --git a/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlaybackTargetPickerMac.mm b/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlaybackTargetPickerMac.mm >index 9d03f98f090c1ad84cd9fed8d1e7d91f832926e9..2763744ad0119b4d3f84f515876df5050484a483 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlaybackTargetPickerMac.mm >+++ b/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlaybackTargetPickerMac.mm >@@ -37,9 +37,14 @@ > #import <pal/spi/mac/AVFoundationSPI.h> > #import <wtf/MainThread.h> > >-#import <pal/cocoa/AVFoundationSoftLink.h> >+typedef AVOutputContext AVOutputContextWKType; >+typedef AVOutputDeviceMenuController AVOutputDeviceMenuControllerWKType; >+ > > SOFTLINK_AVKIT_FRAMEWORK() >+SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >+ >+SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVOutputContext) > SOFT_LINK_CLASS_OPTIONAL(AVKit, AVOutputDeviceMenuController) > > using namespace WebCore; >@@ -82,7 +87,7 @@ Ref<MediaPlaybackTarget> MediaPlaybackTargetPickerMac::playbackTarget() > return WebCore::MediaPlaybackTargetMac::create(context); > } > >-AVOutputDeviceMenuController *MediaPlaybackTargetPickerMac::devicePicker() >+AVOutputDeviceMenuControllerWKType *MediaPlaybackTargetPickerMac::devicePicker() > { > if (!getAVOutputDeviceMenuControllerClass()) > return nullptr; >@@ -90,7 +95,7 @@ AVOutputDeviceMenuController *MediaPlaybackTargetPickerMac::devicePicker() > if (!m_outputDeviceMenuController) { > LOG(Media, "MediaPlaybackTargetPickerMac::devicePicker - allocating picker"); > >- RetainPtr<AVOutputContext> context = adoptNS([PAL::allocAVOutputContextInstance() init]); >+ RetainPtr<AVOutputContextWKType> context = adoptNS([allocAVOutputContextInstance() init]); > m_outputDeviceMenuController = adoptNS([allocAVOutputDeviceMenuControllerInstance() initWithOutputContext:context.get()]); > > [m_outputDeviceMenuController.get() addObserver:m_outputDeviceMenuControllerDelegate.get() forKeyPath:externalOutputDeviceAvailableKeyName options:NSKeyValueObservingOptionNew context:nullptr]; >diff --git a/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateAVFoundationObjC.mm b/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateAVFoundationObjC.mm >index 4e5b82372e9b758c26d4a3e5d5801cfa7a1840bc..5e293ba0518be0e4fc8ba0da4abfbdee8fb4946e 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateAVFoundationObjC.mm >+++ b/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateAVFoundationObjC.mm >@@ -136,8 +136,168 @@ template <> struct iterator_traits<HashSet<RefPtr<WebCore::MediaSelectionOptionA > @property (nonatomic, readonly) NSURL *resolvedURL; > @end > >+typedef AVPlayer AVPlayerType; >+typedef AVPlayerItem AVPlayerItemType; >+typedef AVPlayerItemLegibleOutput AVPlayerItemLegibleOutputType; >+typedef AVPlayerItemVideoOutput AVPlayerItemVideoOutputType; >+typedef AVMetadataItem AVMetadataItemType; >+typedef AVMediaSelectionGroup AVMediaSelectionGroupType; >+typedef AVMediaSelectionOption AVMediaSelectionOptionType; 
>+typedef AVAssetCache AVAssetCacheType; >+ >+#pragma mark - Soft Linking >+ >+// Soft-linking headers must be included last since they #define functions, constants, etc. > #import <pal/cf/CoreMediaSoftLink.h> >-#import <pal/cocoa/AVFoundationSoftLink.h> >+ >+SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >+ >+SOFT_LINK_FRAMEWORK_OPTIONAL(CoreImage) >+ >+SOFT_LINK_CLASS(AVFoundation, AVPlayer) >+SOFT_LINK_CLASS(AVFoundation, AVPlayerItem) >+SOFT_LINK_CLASS(AVFoundation, AVPlayerItemVideoOutput) >+SOFT_LINK_CLASS(AVFoundation, AVPlayerLayer) >+SOFT_LINK_CLASS(AVFoundation, AVURLAsset) >+SOFT_LINK_CLASS(AVFoundation, AVAssetImageGenerator) >+SOFT_LINK_CLASS(AVFoundation, AVMetadataItem) >+SOFT_LINK_CLASS(AVFoundation, AVAssetCache) >+ >+SOFT_LINK_CLASS(CoreImage, CIContext) >+SOFT_LINK_CLASS(CoreImage, CIImage) >+ >+SOFT_LINK_CONSTANT(AVFoundation, AVAudioTimePitchAlgorithmSpectral, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVAudioTimePitchAlgorithmVarispeed, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVMediaCharacteristicVisual, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVMediaCharacteristicAudible, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeClosedCaption, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeVideo, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeAudio, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeMetadata, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVPlayerItemDidPlayToEndTimeNotification, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVURLAssetInheritURIQueryComponentFromReferencingURIKey, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVAssetImageGeneratorApertureModeCleanAperture, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVURLAssetReferenceRestrictionsKey, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVLayerVideoGravityResizeAspect, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVLayerVideoGravityResizeAspectFill, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVLayerVideoGravityResize, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVStreamingKeyDeliveryContentKeyType, NSString *) >+ >+SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVURLAssetOutOfBandMIMETypeKey, NSString *) >+SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVURLAssetUseClientURLLoadingExclusively, NSString *) >+ >+#define AVPlayer initAVPlayer() >+#define AVPlayerItem initAVPlayerItem() >+#define AVPlayerLayer initAVPlayerLayer() >+#define AVURLAsset initAVURLAsset() >+#define AVAssetImageGenerator initAVAssetImageGenerator() >+#define AVPlayerItemVideoOutput initAVPlayerItemVideoOutput() >+#define AVMetadataItem initAVMetadataItem() >+#define AVAssetCache initAVAssetCache() >+ >+#define AVAudioTimePitchAlgorithmSpectral getAVAudioTimePitchAlgorithmSpectral() >+#define AVAudioTimePitchAlgorithmVarispeed getAVAudioTimePitchAlgorithmVarispeed() >+#define AVMediaCharacteristicVisual getAVMediaCharacteristicVisual() >+#define AVMediaCharacteristicAudible getAVMediaCharacteristicAudible() >+#define AVMediaTypeClosedCaption getAVMediaTypeClosedCaption() >+#define AVMediaTypeVideo getAVMediaTypeVideo() >+#define AVMediaTypeAudio getAVMediaTypeAudio() >+#define AVMediaTypeMetadata getAVMediaTypeMetadata() >+#define AVPlayerItemDidPlayToEndTimeNotification getAVPlayerItemDidPlayToEndTimeNotification() >+#define AVURLAssetInheritURIQueryComponentFromReferencingURIKey getAVURLAssetInheritURIQueryComponentFromReferencingURIKey() >+#define AVURLAssetOutOfBandMIMETypeKey getAVURLAssetOutOfBandMIMETypeKey() >+#define AVURLAssetUseClientURLLoadingExclusively 
getAVURLAssetUseClientURLLoadingExclusively() >+#define AVAssetImageGeneratorApertureModeCleanAperture getAVAssetImageGeneratorApertureModeCleanAperture() >+#define AVURLAssetReferenceRestrictionsKey getAVURLAssetReferenceRestrictionsKey() >+#define AVLayerVideoGravityResizeAspect getAVLayerVideoGravityResizeAspect() >+#define AVLayerVideoGravityResizeAspectFill getAVLayerVideoGravityResizeAspectFill() >+#define AVLayerVideoGravityResize getAVLayerVideoGravityResize() >+#define AVStreamingKeyDeliveryContentKeyType getAVStreamingKeyDeliveryContentKeyType() >+ >+#if HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) >+ >+typedef AVMediaSelectionGroup AVMediaSelectionGroupType; >+typedef AVMediaSelectionOption AVMediaSelectionOptionType; >+ >+SOFT_LINK_CLASS(AVFoundation, AVPlayerItemLegibleOutput) >+SOFT_LINK_CLASS(AVFoundation, AVMediaSelectionGroup) >+SOFT_LINK_CLASS(AVFoundation, AVMediaSelectionOption) >+SOFT_LINK_CLASS(AVFoundation, AVOutputContext) >+ >+SOFT_LINK_CONSTANT(AVFoundation, AVMediaCharacteristicLegible, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeSubtitle, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVMediaCharacteristicContainsOnlyForcedSubtitles, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVPlayerItemLegibleOutputTextStylingResolutionSourceAndRulesOnly, NSString *) >+ >+#define AVPlayerItemLegibleOutput getAVPlayerItemLegibleOutputClass() >+#define AVMediaSelectionGroup getAVMediaSelectionGroupClass() >+#define AVMediaSelectionOption getAVMediaSelectionOptionClass() >+#define AVMediaCharacteristicLegible getAVMediaCharacteristicLegible() >+#define AVMediaTypeSubtitle getAVMediaTypeSubtitle() >+#define AVMediaCharacteristicContainsOnlyForcedSubtitles getAVMediaCharacteristicContainsOnlyForcedSubtitles() >+#define AVPlayerItemLegibleOutputTextStylingResolutionSourceAndRulesOnly getAVPlayerItemLegibleOutputTextStylingResolutionSourceAndRulesOnly() >+ >+#endif >+ >+#if ENABLE(AVF_CAPTIONS) >+ >+SOFT_LINK_CONSTANT(AVFoundation, AVURLAssetCacheKey, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVURLAssetOutOfBandAlternateTracksKey, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVURLAssetUsesNoPersistentCacheKey, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVOutOfBandAlternateTrackDisplayNameKey, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVOutOfBandAlternateTrackExtendedLanguageTagKey, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVOutOfBandAlternateTrackIsDefaultKey, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVOutOfBandAlternateTrackMediaCharactersticsKey, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVOutOfBandAlternateTrackIdentifierKey, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVOutOfBandAlternateTrackSourceKey, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVMediaCharacteristicDescribesMusicAndSoundForAccessibility, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVMediaCharacteristicTranscribesSpokenDialogForAccessibility, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVMediaCharacteristicIsAuxiliaryContent, NSString *) >+ >+#define AVURLAssetOutOfBandAlternateTracksKey getAVURLAssetOutOfBandAlternateTracksKey() >+#define AVURLAssetCacheKey getAVURLAssetCacheKey() >+#define AVURLAssetUsesNoPersistentCacheKey getAVURLAssetUsesNoPersistentCacheKey() >+#define AVOutOfBandAlternateTrackDisplayNameKey getAVOutOfBandAlternateTrackDisplayNameKey() >+#define AVOutOfBandAlternateTrackExtendedLanguageTagKey getAVOutOfBandAlternateTrackExtendedLanguageTagKey() >+#define AVOutOfBandAlternateTrackIsDefaultKey 
getAVOutOfBandAlternateTrackIsDefaultKey() >+#define AVOutOfBandAlternateTrackMediaCharactersticsKey getAVOutOfBandAlternateTrackMediaCharactersticsKey() >+#define AVOutOfBandAlternateTrackIdentifierKey getAVOutOfBandAlternateTrackIdentifierKey() >+#define AVOutOfBandAlternateTrackSourceKey getAVOutOfBandAlternateTrackSourceKey() >+#define AVMediaCharacteristicDescribesMusicAndSoundForAccessibility getAVMediaCharacteristicDescribesMusicAndSoundForAccessibility() >+#define AVMediaCharacteristicTranscribesSpokenDialogForAccessibility getAVMediaCharacteristicTranscribesSpokenDialogForAccessibility() >+#define AVMediaCharacteristicIsAuxiliaryContent getAVMediaCharacteristicIsAuxiliaryContent() >+ >+#endif >+ >+#if ENABLE(DATACUE_VALUE) >+ >+SOFT_LINK_CONSTANT(AVFoundation, AVMetadataKeySpaceQuickTimeUserData, NSString *) >+SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVMetadataKeySpaceISOUserData, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVMetadataKeySpaceQuickTimeMetadata, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVMetadataKeySpaceiTunes, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVMetadataKeySpaceID3, NSString *) >+ >+#define AVMetadataKeySpaceQuickTimeUserData getAVMetadataKeySpaceQuickTimeUserData() >+#define AVMetadataKeySpaceISOUserData getAVMetadataKeySpaceISOUserData() >+#define AVMetadataKeySpaceQuickTimeMetadata getAVMetadataKeySpaceQuickTimeMetadata() >+#define AVMetadataKeySpaceiTunes getAVMetadataKeySpaceiTunes() >+#define AVMetadataKeySpaceID3 getAVMetadataKeySpaceID3() >+ >+#endif >+ >+#if PLATFORM(IOS_FAMILY) >+ >+SOFT_LINK_CONSTANT(AVFoundation, AVURLAssetBoundNetworkInterfaceName, NSString *) >+SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVURLAssetClientBundleIdentifierKey, NSString *) >+SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVURLAssetHTTPCookiesKey, NSString *) >+SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVURLAssetRequiresCustomURLLoadingKey, NSString *) >+ >+#define AVURLAssetBoundNetworkInterfaceName getAVURLAssetBoundNetworkInterfaceName() >+#define AVURLAssetClientBundleIdentifierKey getAVURLAssetClientBundleIdentifierKey() >+#define AVURLAssetHTTPCookiesKey getAVURLAssetHTTPCookiesKey() >+#define AVURLAssetRequiresCustomURLLoadingKey getAVURLAssetRequiresCustomURLLoadingKey() >+ >+#endif > > SOFT_LINK_FRAMEWORK(MediaToolbox) > SOFT_LINK_OPTIONAL(MediaToolbox, MTEnableCaption2015Behavior, Boolean, (), ()) >@@ -165,7 +325,11 @@ enum MediaPlayerAVFoundationObservationContext { > MediaPlayerAVFoundationObservationContextAVPlayerLayer, > }; > >+#if HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) && HAVE(AVFOUNDATION_LEGIBLE_OUTPUT_SUPPORT) > @interface WebCoreAVFMovieObserver : NSObject <AVPlayerItemLegibleOutputPushDelegate> >+#else >+@interface WebCoreAVFMovieObserver : NSObject >+#endif > { > WeakPtr<MediaPlayerPrivateAVFoundationObjC> m_player; > GenericTaskQueue<Timer, std::atomic<unsigned>> m_taskQueue; >@@ -176,8 +340,10 @@ enum MediaPlayerAVFoundationObservationContext { > -(void)metadataLoaded; > -(void)didEnd:(NSNotification *)notification; > -(void)observeValueForKeyPath:keyPath ofObject:(id)object change:(NSDictionary *)change context:(MediaPlayerAVFoundationObservationContext)context; >+#if HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) > - (void)legibleOutput:(id)output didOutputAttributedStrings:(NSArray *)strings nativeSampleBuffers:(NSArray *)nativeSamples forItemTime:(CMTime)itemTime; > - (void)outputSequenceWasFlushed:(id)output; >+#endif > @end > > #if HAVE(AVFOUNDATION_LOADER_DELEGATE) >@@ -243,7 +409,7 @@ void 
MediaPlayerPrivateAVFoundationObjC::registerMediaEngine(MediaEngineRegistra > ASSERT(AVFoundationMIMETypeCache::singleton().isAvailable()); > } > >-static AVAssetCache *assetCacheForPath(const String& path) >+static AVAssetCacheType *assetCacheForPath(const String& path) > { > NSURL *assetCacheURL; > >@@ -252,7 +418,7 @@ static AVAssetCache *assetCacheForPath(const String& path) > else > assetCacheURL = [NSURL fileURLWithPath:path isDirectory:YES]; > >- return [PAL::getAVAssetCacheClass() assetCacheWithURL:assetCacheURL]; >+ return [initAVAssetCache() assetCacheWithURL:assetCacheURL]; > } > > HashSet<RefPtr<SecurityOrigin>> MediaPlayerPrivateAVFoundationObjC::originsInMediaCache(const String& path) >@@ -274,7 +440,7 @@ static WallTime toSystemClockTime(NSDate *date) > > void MediaPlayerPrivateAVFoundationObjC::clearMediaCache(const String& path, WallTime modifiedSince) > { >- AVAssetCache* assetCache = assetCacheForPath(path); >+ AVAssetCacheType* assetCache = assetCacheForPath(path); > > for (NSString *key in [assetCache allKeys]) { > if (toSystemClockTime([assetCache lastModifiedDateOfEntryForKey:key]) > modifiedSince) >@@ -316,7 +482,7 @@ void MediaPlayerPrivateAVFoundationObjC::clearMediaCache(const String& path, Wal > > void MediaPlayerPrivateAVFoundationObjC::clearMediaCacheForOrigins(const String& path, const HashSet<RefPtr<SecurityOrigin>>& origins) > { >- AVAssetCache* assetCache = assetCacheForPath(path); >+ AVAssetCacheType* assetCache = assetCacheForPath(path); > for (NSString *key in [assetCache allKeys]) { > URL keyAsURL = URL(URL(), key); > if (keyAsURL.isValid()) { >@@ -394,11 +560,13 @@ void MediaPlayerPrivateAVFoundationObjC::cancelLoad() > > clearTextTracks(); > >+#if HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) && HAVE(AVFOUNDATION_LEGIBLE_OUTPUT_SUPPORT) > if (m_legibleOutput) { > if (m_avPlayerItem) > [m_avPlayerItem.get() removeOutput:m_legibleOutput.get()]; > m_legibleOutput = nil; > } >+#endif > > if (m_avPlayerItem) { > for (NSString *keyName in itemKVOProperties()) >@@ -477,7 +645,7 @@ void MediaPlayerPrivateAVFoundationObjC::createImageGenerator() > if (!m_avAsset || m_imageGenerator) > return; > >- m_imageGenerator = [PAL::getAVAssetImageGeneratorClass() assetImageGeneratorWithAsset:m_avAsset.get()]; >+ m_imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:m_avAsset.get()]; > > [m_imageGenerator.get() setApertureMode:AVAssetImageGeneratorApertureModeCleanAperture]; > [m_imageGenerator.get() setAppliesPreferredTrackTransform:YES]; >@@ -533,7 +701,7 @@ void MediaPlayerPrivateAVFoundationObjC::createAVPlayerLayer() > if (!m_avPlayer) > return; > >- m_videoLayer = adoptNS([PAL::allocAVPlayerLayerInstance() init]); >+ m_videoLayer = adoptNS([[AVPlayerLayer alloc] init]); > [m_videoLayer setPlayer:m_avPlayer.get()]; > > #ifndef NDEBUG >@@ -632,7 +800,7 @@ void MediaPlayerPrivateAVFoundationObjC::synchronizeTextTrackState() > continue; > > RefPtr<OutOfBandTextTrackPrivateAVF> trackPrivate = static_cast<OutOfBandTextTrackPrivateAVF*>(textTrack.get()); >- RetainPtr<AVMediaSelectionOption> currentOption = trackPrivate->mediaSelectionOption(); >+ RetainPtr<AVMediaSelectionOptionType> currentOption = trackPrivate->mediaSelectionOption(); > > for (auto& track : outOfBandTrackSources) { > RetainPtr<CFStringRef> uniqueID = String::number(track->uniqueId()).createCFString(); >@@ -722,19 +890,19 @@ void MediaPlayerPrivateAVFoundationObjC::createAVAssetForURL(const URL& url) > if (player()->doesHaveAttribute("x-itunes-inherit-uri-query-component")) > [options.get() 
setObject:@YES forKey: AVURLAssetInheritURIQueryComponentFromReferencingURIKey]; > >- if (PAL::canLoad_AVFoundation_AVURLAssetUseClientURLLoadingExclusively()) >+ if (canLoadAVURLAssetUseClientURLLoadingExclusively()) > [options setObject:@YES forKey:AVURLAssetUseClientURLLoadingExclusively]; > #if PLATFORM(IOS_FAMILY) >- else if (PAL::canLoad_AVFoundation_AVURLAssetRequiresCustomURLLoadingKey()) >+ else if (canLoadAVURLAssetRequiresCustomURLLoadingKey()) > [options setObject:@YES forKey:AVURLAssetRequiresCustomURLLoadingKey]; > // FIXME: rdar://problem/20354688 > String identifier = player()->sourceApplicationIdentifier(); >- if (!identifier.isEmpty()) >+ if (!identifier.isEmpty() && canLoadAVURLAssetClientBundleIdentifierKey()) > [options setObject:identifier forKey:AVURLAssetClientBundleIdentifierKey]; > #endif > > auto type = player()->contentMIMEType(); >- if (PAL::canLoad_AVFoundation_AVURLAssetOutOfBandMIMETypeKey() && !type.isEmpty() && !player()->contentMIMETypeWasInferredFromExtension()) { >+ if (canLoadAVURLAssetOutOfBandMIMETypeKey() && !type.isEmpty() && !player()->contentMIMETypeWasInferredFromExtension()) { > auto codecs = player()->contentTypeCodecs(); > if (!codecs.isEmpty()) { > NSString *typeString = [NSString stringWithFormat:@"%@; codecs=\"%@\"", (NSString *)type, (NSString *)codecs]; >@@ -779,7 +947,7 @@ void MediaPlayerPrivateAVFoundationObjC::createAVAssetForURL(const URL& url) > for (auto& cookie : cookies) > [nsCookies addObject:toNSHTTPCookie(cookie)]; > >- if (PAL::canLoad_AVFoundation_AVURLAssetHTTPCookiesKey()) >+ if (canLoadAVURLAssetHTTPCookiesKey()) > [options setObject:nsCookies.get() forKey:AVURLAssetHTTPCookiesKey]; > } > #endif >@@ -791,7 +959,7 @@ void MediaPlayerPrivateAVFoundationObjC::createAVAssetForURL(const URL& url) > [options setObject:assetCacheForPath(player()->client().mediaPlayerMediaCacheDirectory()) forKey:AVURLAssetCacheKey]; > > NSURL *cocoaURL = canonicalURL(url); >- m_avAsset = adoptNS([PAL::allocAVURLAssetInstance() initWithURL:cocoaURL options:options.get()]); >+ m_avAsset = adoptNS([[AVURLAsset alloc] initWithURL:cocoaURL options:options.get()]); > > #if HAVE(AVFOUNDATION_LOADER_DELEGATE) > AVAssetResourceLoader *resourceLoader = m_avAsset.get().resourceLoader; >@@ -813,7 +981,7 @@ void MediaPlayerPrivateAVFoundationObjC::createAVAssetForURL(const URL& url) > setDelayCallbacks(false); > } > >-void MediaPlayerPrivateAVFoundationObjC::setAVPlayerItem(AVPlayerItem *item) >+void MediaPlayerPrivateAVFoundationObjC::setAVPlayerItem(AVPlayerItemType *item) > { > if (!m_avPlayer) > return; >@@ -823,8 +991,8 @@ void MediaPlayerPrivateAVFoundationObjC::setAVPlayerItem(AVPlayerItem *item) > return; > } > >- RetainPtr<AVPlayer> strongPlayer = m_avPlayer.get(); >- RetainPtr<AVPlayerItem> strongItem = item; >+ RetainPtr<AVPlayerType> strongPlayer = m_avPlayer.get(); >+ RetainPtr<AVPlayerItemType> strongItem = item; > dispatch_async(dispatch_get_main_queue(), [strongPlayer, strongItem] { > [strongPlayer replaceCurrentItemWithPlayerItem:strongItem.get()]; > }); >@@ -839,13 +1007,15 @@ void MediaPlayerPrivateAVFoundationObjC::createAVPlayer() > > setDelayCallbacks(true); > >- m_avPlayer = adoptNS([PAL::allocAVPlayerInstance() init]); >+ m_avPlayer = adoptNS([[AVPlayer alloc] init]); > for (NSString *keyName in playerKVOProperties()) > [m_avPlayer.get() addObserver:m_objcObserver.get() forKeyPath:keyName options:NSKeyValueObservingOptionNew context:(void *)MediaPlayerAVFoundationObservationContextPlayer]; > > setShouldObserveTimeControlStatus(true); 
> >+#if HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) && HAVE(AVFOUNDATION_LEGIBLE_OUTPUT_SUPPORT) > [m_avPlayer.get() setAppliesMediaSelectionCriteriaAutomatically:NO]; >+#endif > > #if ENABLE(WIRELESS_PLAYBACK_TARGET) > updateDisableExternalPlayback(); >@@ -889,7 +1059,7 @@ void MediaPlayerPrivateAVFoundationObjC::createAVPlayerItem() > setDelayCallbacks(true); > > // Create the player item so we can load media data. >- m_avPlayerItem = adoptNS([PAL::allocAVPlayerItemInstance() initWithAsset:m_avAsset.get()]); >+ m_avPlayerItem = adoptNS([[AVPlayerItem alloc] initWithAsset:m_avAsset.get()]); > > [[NSNotificationCenter defaultCenter] addObserver:m_objcObserver.get() selector:@selector(didEnd:) name:AVPlayerItemDidPlayToEndTimeNotification object:m_avPlayerItem.get()]; > >@@ -902,16 +1072,18 @@ void MediaPlayerPrivateAVFoundationObjC::createAVPlayerItem() > if (m_avPlayer) > setAVPlayerItem(m_avPlayerItem.get()); > >+#if HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) && HAVE(AVFOUNDATION_LEGIBLE_OUTPUT_SUPPORT) > const NSTimeInterval legibleOutputAdvanceInterval = 2; > > RetainPtr<NSArray> subtypes = adoptNS([[NSArray alloc] initWithObjects:[NSNumber numberWithUnsignedInt:kCMSubtitleFormatType_WebVTT], nil]); >- m_legibleOutput = adoptNS([PAL::allocAVPlayerItemLegibleOutputInstance() initWithMediaSubtypesForNativeRepresentation:subtypes.get()]); >+ m_legibleOutput = adoptNS([[AVPlayerItemLegibleOutput alloc] initWithMediaSubtypesForNativeRepresentation:subtypes.get()]); > [m_legibleOutput.get() setSuppressesPlayerRendering:YES]; > > [m_legibleOutput.get() setDelegate:m_objcObserver.get() queue:dispatch_get_main_queue()]; > [m_legibleOutput.get() setAdvanceIntervalForDelegateInvocation:legibleOutputAdvanceInterval]; > [m_legibleOutput.get() setTextStylingResolution:AVPlayerItemLegibleOutputTextStylingResolutionSourceAndRulesOnly]; > [m_avPlayerItem.get() addOutput:m_legibleOutput.get()]; >+#endif > > #if ENABLE(WEB_AUDIO) && USE(MEDIATOOLBOX) > if (m_provider) { >@@ -1550,7 +1722,7 @@ MediaPlayer::SupportsType MediaPlayerPrivateAVFoundationObjC::supportsType(const > return MediaPlayer::IsNotSupported; > > NSString *typeString = [NSString stringWithFormat:@"%@; codecs=\"%@\"", (NSString *)containerType, (NSString *)parameters.type.parameter(ContentType::codecsParameter())]; >- return [PAL::getAVURLAssetClass() isPlayableExtendedMIMEType:typeString] ? MediaPlayer::IsSupported : MediaPlayer::MayBeSupported; >+ return [AVURLAsset isPlayableExtendedMIMEType:typeString] ? 
MediaPlayer::IsSupported : MediaPlayer::MayBeSupported; > } > > bool MediaPlayerPrivateAVFoundationObjC::supportsKeySystem(const String& keySystem, const String& mimeType) >@@ -1696,7 +1868,7 @@ void MediaPlayerPrivateAVFoundationObjC::didStopLoadingRequest(AVAssetResourceLo > > bool MediaPlayerPrivateAVFoundationObjC::isAvailable() > { >- return PAL::AVFoundationLibrary() && isCoreMediaFrameworkAvailable(); >+ return AVFoundationLibrary() && isCoreMediaFrameworkAvailable(); > } > > MediaTime MediaPlayerPrivateAVFoundationObjC::mediaTimeForTimeValue(const MediaTime& timeValue) const >@@ -1761,6 +1933,9 @@ void MediaPlayerPrivateAVFoundationObjC::tracksChanged() > AVAssetTrack* firstEnabledVideoTrack = firstEnabledTrack([m_avAsset.get() tracksWithMediaCharacteristic:AVMediaCharacteristicVisual]); > setHasVideo(firstEnabledVideoTrack); > setHasAudio(firstEnabledTrack([m_avAsset.get() tracksWithMediaCharacteristic:AVMediaCharacteristicAudible])); >+#if !HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) >+ hasCaptions = [[m_avAsset.get() tracksWithMediaType:AVMediaTypeClosedCaption] count]; >+#endif > auto size = firstEnabledVideoTrack ? FloatSize(CGSizeApplyAffineTransform([firstEnabledVideoTrack naturalSize], [firstEnabledVideoTrack preferredTransform])) : FloatSize(); > // For videos with rotation tag set, the transformation above might return a CGSize instance with negative width or height. > // See https://bugs.webkit.org/show_bug.cgi?id=172648. >@@ -1782,6 +1957,9 @@ void MediaPlayerPrivateAVFoundationObjC::tracksChanged() > else if ([mediaType isEqualToString:AVMediaTypeAudio]) > hasAudio = true; > else if ([mediaType isEqualToString:AVMediaTypeClosedCaption]) { >+#if !HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) >+ hasCaptions = true; >+#endif > haveCCTrack = true; > } else if ([mediaType isEqualToString:AVMediaTypeMetadata]) { > hasMetaData = true; >@@ -1789,11 +1967,15 @@ void MediaPlayerPrivateAVFoundationObjC::tracksChanged() > } > } > >+#if ENABLE(VIDEO_TRACK) > updateAudioTracks(); > updateVideoTracks(); > >+#if HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) > hasAudio |= (m_audibleGroup && m_audibleGroup->selectedOption()); > hasVideo |= (m_visualGroup && m_visualGroup->selectedOption()); >+#endif >+#endif > > // Always says we have video if the AVPlayerLayer is ready for diaplay to work around > // an AVFoundation bug which causes it to sometimes claim a track is disabled even >@@ -1807,12 +1989,22 @@ void MediaPlayerPrivateAVFoundationObjC::tracksChanged() > #endif > } > >- AVMediaSelectionGroup *legibleGroup = safeMediaSelectionGroupForLegibleMedia(); >+#if HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) >+ AVMediaSelectionGroupType *legibleGroup = safeMediaSelectionGroupForLegibleMedia(); > if (legibleGroup && m_cachedTracks) { >- hasCaptions = [[PAL::getAVMediaSelectionGroupClass() playableMediaSelectionOptionsFromArray:[legibleGroup options]] count]; >+ hasCaptions = [[AVMediaSelectionGroup playableMediaSelectionOptionsFromArray:[legibleGroup options]] count]; > if (hasCaptions) > processMediaSelectionOptions(); > } >+#endif >+ >+#if !HAVE(AVFOUNDATION_LEGIBLE_OUTPUT_SUPPORT) && HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) >+ if (!hasCaptions && haveCCTrack) >+ processLegacyClosedCaptionsTracks(); >+#elif !HAVE(AVFOUNDATION_LEGIBLE_OUTPUT_SUPPORT) >+ if (haveCCTrack) >+ processLegacyClosedCaptionsTracks(); >+#endif > > setHasClosedCaptions(hasCaptions); > >@@ -1877,6 +2069,8 @@ void determineChangedTracksFromNewTracksAndOldItems(NSArray* tracks, NSString* t > (player->*addedFunction)(*addedItem); > } > 
>+#if HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) >+ > template <typename RefT, typename PassRefT> > void determineChangedTracksFromNewTracksAndOldItems(MediaSelectionGroupAVFObjC* group, Vector<RefT>& oldItems, const Vector<String>& characteristics, RefT (*itemFactory)(MediaSelectionOptionAVFObjC&), MediaPlayer* player, void (MediaPlayer::*removedFunction)(PassRefT), void (MediaPlayer::*addedFunction)(PassRefT)) > { >@@ -1886,7 +2080,7 @@ void determineChangedTracksFromNewTracksAndOldItems(MediaSelectionGroupAVFObjC* > for (auto& option : group->options()) { > if (!option) > continue; >- AVMediaSelectionOption* avOption = option->avMediaSelectionOption(); >+ AVMediaSelectionOptionType* avOption = option->avMediaSelectionOption(); > if (!avOption) > continue; > newSelectionOptions.add(option); >@@ -1937,21 +2131,25 @@ void determineChangedTracksFromNewTracksAndOldItems(MediaSelectionGroupAVFObjC* > (player->*addedFunction)(*addedItem); > } > >+#endif >+ > void MediaPlayerPrivateAVFoundationObjC::updateAudioTracks() > { > #if !RELEASE_LOG_DISABLED > size_t count = m_audioTracks.size(); > #endif > >+#if HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) > Vector<String> characteristics = player()->preferredAudioCharacteristics(); > if (!m_audibleGroup) { >- if (AVMediaSelectionGroup *group = safeMediaSelectionGroupForAudibleMedia()) >+ if (AVMediaSelectionGroupType *group = safeMediaSelectionGroupForAudibleMedia()) > m_audibleGroup = MediaSelectionGroupAVFObjC::create(m_avPlayerItem.get(), group, characteristics); > } > > if (m_audibleGroup) > determineChangedTracksFromNewTracksAndOldItems(m_audibleGroup.get(), m_audioTracks, characteristics, &AudioTrackPrivateAVFObjC::create, player(), &MediaPlayer::removeAudioTrack, &MediaPlayer::addAudioTrack); > else >+#endif > determineChangedTracksFromNewTracksAndOldItems(m_cachedTracks.get(), AVMediaTypeAudio, m_audioTracks, &AudioTrackPrivateAVFObjC::create, player(), &MediaPlayer::removeAudioTrack, &MediaPlayer::addAudioTrack); > > for (auto& track : m_audioTracks) >@@ -1970,13 +2168,15 @@ void MediaPlayerPrivateAVFoundationObjC::updateVideoTracks() > > determineChangedTracksFromNewTracksAndOldItems(m_cachedTracks.get(), AVMediaTypeVideo, m_videoTracks, &VideoTrackPrivateAVFObjC::create, player(), &MediaPlayer::removeVideoTrack, &MediaPlayer::addVideoTrack); > >+#if HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) > if (!m_visualGroup) { >- if (AVMediaSelectionGroup *group = safeMediaSelectionGroupForVisualMedia()) >+ if (AVMediaSelectionGroupType *group = safeMediaSelectionGroupForVisualMedia()) > m_visualGroup = MediaSelectionGroupAVFObjC::create(m_avPlayerItem.get(), group, Vector<String>()); > } > > if (m_visualGroup) > determineChangedTracksFromNewTracksAndOldItems(m_visualGroup.get(), m_videoTracks, Vector<String>(), &VideoTrackPrivateAVFObjC::create, player(), &MediaPlayer::removeVideoTrack, &MediaPlayer::addVideoTrack); >+#endif > > for (auto& track : m_audioTracks) > track->resetPropertiesFromTrack(); >@@ -2072,7 +2272,7 @@ void MediaPlayerPrivateAVFoundationObjC::createVideoOutput() > #else > NSDictionary* attributes = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey, nil]; > #endif >- m_videoOutput = adoptNS([PAL::allocAVPlayerItemVideoOutputInstance() initWithPixelBufferAttributes:attributes]); >+ m_videoOutput = adoptNS([[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attributes]); > ASSERT(m_videoOutput); > > [m_videoOutput 
setDelegate:m_videoOutputDelegate.get() queue:globalPullDelegateQueue()]; >@@ -2222,7 +2422,7 @@ void MediaPlayerPrivateAVFoundationObjC::waitForVideoOutputMediaDataWillChange() > ERROR_LOG(LOGIDENTIFIER, "timed out"); > } > >-void MediaPlayerPrivateAVFoundationObjC::outputMediaDataWillChange(AVPlayerItemVideoOutput *) >+void MediaPlayerPrivateAVFoundationObjC::outputMediaDataWillChange(AVPlayerItemVideoOutputType *) > { > m_videoOutputSemaphore.signal(); > } >@@ -2351,6 +2551,45 @@ void MediaPlayerPrivateAVFoundationObjC::setWaitingForKey(bool waitingForKey) > } > #endif > >+#if !HAVE(AVFOUNDATION_LEGIBLE_OUTPUT_SUPPORT) >+ >+void MediaPlayerPrivateAVFoundationObjC::processLegacyClosedCaptionsTracks() >+{ >+#if HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) >+ [m_avPlayerItem.get() selectMediaOption:nil inMediaSelectionGroup:safeMediaSelectionGroupForLegibleMedia()]; >+#endif >+ >+ Vector<RefPtr<InbandTextTrackPrivateAVF>> removedTextTracks = m_textTracks; >+ for (AVPlayerItemTrack *playerItemTrack in m_cachedTracks.get()) { >+ >+ AVAssetTrack *assetTrack = [playerItemTrack assetTrack]; >+ if (![[assetTrack mediaType] isEqualToString:AVMediaTypeClosedCaption]) >+ continue; >+ >+ bool newCCTrack = true; >+ for (unsigned i = removedTextTracks.size(); i > 0; --i) { >+ if (removedTextTracks[i - 1]->textTrackCategory() != InbandTextTrackPrivateAVF::LegacyClosedCaption) >+ continue; >+ >+ RefPtr<InbandTextTrackPrivateLegacyAVFObjC> track = static_cast<InbandTextTrackPrivateLegacyAVFObjC*>(m_textTracks[i - 1].get()); >+ if (track->avPlayerItemTrack() == playerItemTrack) { >+ removedTextTracks.remove(i - 1); >+ newCCTrack = false; >+ break; >+ } >+ } >+ >+ if (!newCCTrack) >+ continue; >+ >+ m_textTracks.append(InbandTextTrackPrivateLegacyAVFObjC::create(this, playerItemTrack)); >+ } >+ >+ processNewAndRemovedTextTracks(removedTextTracks); >+} >+ >+#endif >+ > NSArray* MediaPlayerPrivateAVFoundationObjC::safeAVAssetTracksForAudibleMedia() > { > if (!m_avAsset) >@@ -2362,6 +2601,8 @@ NSArray* MediaPlayerPrivateAVFoundationObjC::safeAVAssetTracksForAudibleMedia() > return [m_avAsset tracksWithMediaCharacteristic:AVMediaCharacteristicAudible]; > } > >+#if HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) >+ > bool MediaPlayerPrivateAVFoundationObjC::hasLoadedMediaSelectionGroups() > { > if (!m_avAsset) >@@ -2373,7 +2614,7 @@ bool MediaPlayerPrivateAVFoundationObjC::hasLoadedMediaSelectionGroups() > return true; > } > >-AVMediaSelectionGroup* MediaPlayerPrivateAVFoundationObjC::safeMediaSelectionGroupForLegibleMedia() >+AVMediaSelectionGroupType* MediaPlayerPrivateAVFoundationObjC::safeMediaSelectionGroupForLegibleMedia() > { > if (!hasLoadedMediaSelectionGroups()) > return nil; >@@ -2381,7 +2622,7 @@ AVMediaSelectionGroup* MediaPlayerPrivateAVFoundationObjC::safeMediaSelectionGro > return [m_avAsset.get() mediaSelectionGroupForMediaCharacteristic:AVMediaCharacteristicLegible]; > } > >-AVMediaSelectionGroup* MediaPlayerPrivateAVFoundationObjC::safeMediaSelectionGroupForAudibleMedia() >+AVMediaSelectionGroupType* MediaPlayerPrivateAVFoundationObjC::safeMediaSelectionGroupForAudibleMedia() > { > if (!hasLoadedMediaSelectionGroups()) > return nil; >@@ -2389,7 +2630,7 @@ AVMediaSelectionGroup* MediaPlayerPrivateAVFoundationObjC::safeMediaSelectionGro > return [m_avAsset.get() mediaSelectionGroupForMediaCharacteristic:AVMediaCharacteristicAudible]; > } > >-AVMediaSelectionGroup* MediaPlayerPrivateAVFoundationObjC::safeMediaSelectionGroupForVisualMedia() >+AVMediaSelectionGroupType* 
MediaPlayerPrivateAVFoundationObjC::safeMediaSelectionGroupForVisualMedia() > { > if (!hasLoadedMediaSelectionGroups()) > return nil; >@@ -2399,7 +2640,7 @@ AVMediaSelectionGroup* MediaPlayerPrivateAVFoundationObjC::safeMediaSelectionGro > > void MediaPlayerPrivateAVFoundationObjC::processMediaSelectionOptions() > { >- AVMediaSelectionGroup *legibleGroup = safeMediaSelectionGroupForLegibleMedia(); >+ AVMediaSelectionGroupType *legibleGroup = safeMediaSelectionGroupForLegibleMedia(); > if (!legibleGroup) { > INFO_LOG(LOGIDENTIFIER, "no mediaSelectionGroup"); > return; >@@ -2411,14 +2652,14 @@ void MediaPlayerPrivateAVFoundationObjC::processMediaSelectionOptions() > [m_avPlayerItem.get() selectMediaOption:nil inMediaSelectionGroup:safeMediaSelectionGroupForLegibleMedia()]; > > Vector<RefPtr<InbandTextTrackPrivateAVF>> removedTextTracks = m_textTracks; >- NSArray *legibleOptions = [PAL::getAVMediaSelectionGroupClass() playableMediaSelectionOptionsFromArray:[legibleGroup options]]; >- for (AVMediaSelectionOption *option in legibleOptions) { >+ NSArray *legibleOptions = [AVMediaSelectionGroup playableMediaSelectionOptionsFromArray:[legibleGroup options]]; >+ for (AVMediaSelectionOptionType *option in legibleOptions) { > bool newTrack = true; > for (unsigned i = removedTextTracks.size(); i > 0; --i) { > if (removedTextTracks[i - 1]->textTrackCategory() == InbandTextTrackPrivateAVF::LegacyClosedCaption) > continue; > >- RetainPtr<AVMediaSelectionOption> currentOption; >+ RetainPtr<AVMediaSelectionOptionType> currentOption; > #if ENABLE(AVF_CAPTIONS) > if (removedTextTracks[i - 1]->textTrackCategory() == InbandTextTrackPrivateAVF::OutOfBand) { > RefPtr<OutOfBandTextTrackPrivateAVF> track = static_cast<OutOfBandTextTrackPrivateAVF*>(removedTextTracks[i - 1].get()); >@@ -2483,6 +2724,8 @@ void MediaPlayerPrivateAVFoundationObjC::flushCues() > m_currentTextTrack->resetCueValues(); > } > >+#endif // HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) >+ > void MediaPlayerPrivateAVFoundationObjC::setCurrentTextTrack(InbandTextTrackPrivateAVF *track) > { > if (m_currentTextTrack == track) >@@ -2497,14 +2740,18 @@ void MediaPlayerPrivateAVFoundationObjC::setCurrentTextTrack(InbandTextTrackPriv > ALLOW_DEPRECATED_DECLARATIONS_BEGIN > [m_avPlayer.get() setClosedCaptionDisplayEnabled:YES]; > ALLOW_DEPRECATED_DECLARATIONS_END >+#if HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) > #if ENABLE(AVF_CAPTIONS) > else if (track->textTrackCategory() == InbandTextTrackPrivateAVF::OutOfBand) > [m_avPlayerItem.get() selectMediaOption:static_cast<OutOfBandTextTrackPrivateAVF*>(track)->mediaSelectionOption() inMediaSelectionGroup:safeMediaSelectionGroupForLegibleMedia()]; > #endif > else > [m_avPlayerItem.get() selectMediaOption:static_cast<InbandTextTrackPrivateAVFObjC*>(track)->mediaSelectionOption() inMediaSelectionGroup:safeMediaSelectionGroupForLegibleMedia()]; >+#endif > } else { >+#if HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) > [m_avPlayerItem.get() selectMediaOption:0 inMediaSelectionGroup:safeMediaSelectionGroupForLegibleMedia()]; >+#endif > ALLOW_DEPRECATED_DECLARATIONS_BEGIN > [m_avPlayer.get() setClosedCaptionDisplayEnabled:NO]; > ALLOW_DEPRECATED_DECLARATIONS_END >@@ -2520,10 +2767,11 @@ String MediaPlayerPrivateAVFoundationObjC::languageOfPrimaryAudioTrack() const > if (!m_avPlayerItem.get()) > return emptyString(); > >+#if HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) > // If AVFoundation has an audible group, return the language of the currently selected audible option. 
>- AVMediaSelectionGroup *audibleGroup = [m_avAsset.get() mediaSelectionGroupForMediaCharacteristic:AVMediaCharacteristicAudible]; >+ AVMediaSelectionGroupType *audibleGroup = [m_avAsset.get() mediaSelectionGroupForMediaCharacteristic:AVMediaCharacteristicAudible]; > ALLOW_DEPRECATED_DECLARATIONS_BEGIN >- AVMediaSelectionOption *currentlySelectedAudibleOption = [m_avPlayerItem.get() selectedMediaOptionInMediaSelectionGroup:audibleGroup]; >+ AVMediaSelectionOptionType *currentlySelectedAudibleOption = [m_avPlayerItem.get() selectedMediaOptionInMediaSelectionGroup:audibleGroup]; > ALLOW_DEPRECATED_DECLARATIONS_END > if (currentlySelectedAudibleOption) { > m_languageOfPrimaryAudioTrack = [[currentlySelectedAudibleOption locale] localeIdentifier]; >@@ -2531,6 +2779,7 @@ String MediaPlayerPrivateAVFoundationObjC::languageOfPrimaryAudioTrack() const > > return m_languageOfPrimaryAudioTrack; > } >+#endif // HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) > > // AVFoundation synthesizes an audible group when there is only one ungrouped audio track if there is also a legible group (one or > // more in-band text tracks). It doesn't know about out-of-band tracks, so if there is a single audio track return its language. >@@ -2576,7 +2825,7 @@ MediaPlayer::WirelessPlaybackTargetType MediaPlayerPrivateAVFoundationObjC::wire > return MediaPlayer::TargetTypeNone; > > #if PLATFORM(IOS_FAMILY) >- if (!PAL::AVFoundationLibrary()) >+ if (!AVFoundationLibrary()) > return MediaPlayer::TargetTypeNone; > > switch ([m_avPlayer externalPlaybackType]) { >@@ -2597,14 +2846,14 @@ MediaPlayer::WirelessPlaybackTargetType MediaPlayerPrivateAVFoundationObjC::wire > } > > #if PLATFORM(IOS_FAMILY) >-static NSString *exernalDeviceDisplayNameForPlayer(AVPlayer *player) >+static NSString *exernalDeviceDisplayNameForPlayer(AVPlayerType *player) > { > #if HAVE(CELESTIAL) >- if (!PAL::AVFoundationLibrary()) >+ if (!AVFoundationLibrary()) > return nil; > >- if ([PAL::getAVOutputContextClass() respondsToSelector:@selector(sharedAudioPresentationOutputContext)]) { >- AVOutputContext *outputContext = [PAL::getAVOutputContextClass() sharedAudioPresentationOutputContext]; >+ if ([getAVOutputContextClass() respondsToSelector:@selector(sharedAudioPresentationOutputContext)]) { >+ AVOutputContext *outputContext = [getAVOutputContextClass() sharedAudioPresentationOutputContext]; > > if (![outputContext respondsToSelector:@selector(supportsMultipleOutputDevices)] > || ![outputContext supportsMultipleOutputDevices] >@@ -2887,7 +3136,7 @@ static const AtomicString& metadataType(NSString *avMetadataKeySpace) > > if ([avMetadataKeySpace isEqualToString:AVMetadataKeySpaceQuickTimeUserData]) > return quickTimeUserData; >- if ([avMetadataKeySpace isEqualToString:AVMetadataKeySpaceISOUserData]) >+ if (canLoadAVMetadataKeySpaceISOUserData() && [avMetadataKeySpace isEqualToString:AVMetadataKeySpaceISOUserData]) > return isoUserData; > if ([avMetadataKeySpace isEqualToString:AVMetadataKeySpaceQuickTimeMetadata]) > return quickTimeMetadata; >@@ -2921,14 +3170,14 @@ void MediaPlayerPrivateAVFoundationObjC::metadataDidArrive(const RetainPtr<NSArr > > // Set the duration of all incomplete cues before adding new ones. 
> MediaTime earliestStartTime = MediaTime::positiveInfiniteTime(); >- for (AVMetadataItem *item in m_currentMetaData.get()) { >+ for (AVMetadataItemType *item in m_currentMetaData.get()) { > MediaTime start = std::max(PAL::toMediaTime(item.time), MediaTime::zeroTime()); > if (start < earliestStartTime) > earliestStartTime = start; > } > m_metadataTrack->updatePendingCueEndTimes(earliestStartTime); > >- for (AVMetadataItem *item in m_currentMetaData.get()) { >+ for (AVMetadataItemType *item in m_currentMetaData.get()) { > MediaTime start = std::max(PAL::toMediaTime(item.time), MediaTime::zeroTime()); > MediaTime end = MediaTime::positiveInfiniteTime(); > if (CMTIME_IS_VALID(item.duration)) >@@ -3274,7 +3523,7 @@ NSArray* playerKVOProperties() > player->durationDidChange(PAL::toMediaTime([newValue CMTimeValue])); > else if ([keyPath isEqualToString:@"timedMetadata"] && newValue) { > MediaTime now; >- CMTime itemTime = [(AVPlayerItem *)object.get() currentTime]; >+ CMTime itemTime = [(AVPlayerItemType *)object.get() currentTime]; > if (CMTIME_IS_NUMERIC(itemTime)) > now = std::max(PAL::toMediaTime(itemTime), MediaTime::zeroTime()); > player->metadataDidArrive(RetainPtr<NSArray>(newValue), now); >@@ -3319,6 +3568,8 @@ NSArray* playerKVOProperties() > }); > } > >+#if HAVE(AVFOUNDATION_MEDIA_SELECTION_GROUP) >+ > - (void)legibleOutput:(id)output didOutputAttributedStrings:(NSArray *)strings nativeSampleBuffers:(NSArray *)nativeSamples forItemTime:(CMTime)itemTime > { > UNUSED_PARAM(output); >@@ -3341,6 +3592,8 @@ NSArray* playerKVOProperties() > }); > } > >+#endif >+ > @end > > #if HAVE(AVFOUNDATION_LOADER_DELEGATE) >@@ -3408,13 +3661,13 @@ NSArray* playerKVOProperties() > return self; > } > >-- (void)outputMediaDataWillChange:(AVPlayerItemVideoOutput *)output >+- (void)outputMediaDataWillChange:(AVPlayerItemVideoOutputType *)output > { > if (m_player) > m_player->outputMediaDataWillChange(output); > } > >-- (void)outputSequenceWasFlushed:(AVPlayerItemVideoOutput *)output >+- (void)outputSequenceWasFlushed:(AVPlayerItemVideoOutputType *)output > { > UNUSED_PARAM(output); > // No-op. 
>diff --git a/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.mm b/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.mm >index 7fb1a5ec305a854f66ee207579814c6618d1bec9..d831b05d180ebb78801d6ec6cf127a63a8255d4f 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.mm >+++ b/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.mm >@@ -52,9 +52,31 @@ > #import <wtf/MainThread.h> > #import <wtf/NeverDestroyed.h> > >-#import "CoreVideoSoftLink.h" >+#pragma mark - Soft Linking >+ > #import <pal/cf/CoreMediaSoftLink.h> >-#import <pal/cocoa/AVFoundationSoftLink.h> >+#import "CoreVideoSoftLink.h" >+ >+SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >+ >+SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVAsset) >+SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVURLAsset) >+ALLOW_NEW_API_WITHOUT_GUARDS_BEGIN >+SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVSampleBufferAudioRenderer) >+ALLOW_NEW_API_WITHOUT_GUARDS_END >+SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVSampleBufferDisplayLayer) >+ALLOW_NEW_API_WITHOUT_GUARDS_BEGIN >+SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVSampleBufferRenderSynchronizer) >+ALLOW_NEW_API_WITHOUT_GUARDS_END >+SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVStreamDataParser) >+SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVStreamSession); >+SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVVideoPerformanceMetrics) >+ >+SOFT_LINK_CONSTANT(AVFoundation, AVAudioTimePitchAlgorithmSpectral, NSString*) >+SOFT_LINK_CONSTANT(AVFoundation, AVAudioTimePitchAlgorithmVarispeed, NSString*) >+ >+#define AVAudioTimePitchAlgorithmSpectral getAVAudioTimePitchAlgorithmSpectral() >+#define AVAudioTimePitchAlgorithmVarispeed getAVAudioTimePitchAlgorithmVarispeed() > > #pragma mark - > #pragma mark AVStreamSession >@@ -95,7 +117,7 @@ static void CMTimebaseEffectiveRateChangedCallback(CMNotificationCenterRef, cons > > MediaPlayerPrivateMediaSourceAVFObjC::MediaPlayerPrivateMediaSourceAVFObjC(MediaPlayer* player) > : m_player(player) >- , m_synchronizer(adoptNS([PAL::allocAVSampleBufferRenderSynchronizerInstance() init])) >+ , m_synchronizer(adoptNS([allocAVSampleBufferRenderSynchronizerInstance() init])) > , m_seekTimer(*this, &MediaPlayerPrivateMediaSourceAVFObjC::seekInternal) > , m_networkState(MediaPlayer::Empty) > , m_readyState(MediaPlayer::HaveNothing) >@@ -179,7 +201,7 @@ void MediaPlayerPrivateMediaSourceAVFObjC::registerMediaEngine(MediaEngineRegist > > bool MediaPlayerPrivateMediaSourceAVFObjC::isAvailable() > { >- return PAL::AVFoundationLibrary() >+ return AVFoundationLibrary() > && isCoreMediaFrameworkAvailable() > && getAVStreamDataParserClass() > && getAVSampleBufferAudioRendererClass() >@@ -212,14 +234,14 @@ MediaPlayer::SupportsType MediaPlayerPrivateMediaSourceAVFObjC::supportsType(con > return MediaPlayer::MayBeSupported; > > NSString *outputCodecs = codecs; >- if ([PAL::getAVStreamDataParserClass() respondsToSelector:@selector(outputMIMECodecParameterForInputMIMECodecParameter:)]) >- outputCodecs = [PAL::getAVStreamDataParserClass() outputMIMECodecParameterForInputMIMECodecParameter:outputCodecs]; >+ if ([getAVStreamDataParserClass() respondsToSelector:@selector(outputMIMECodecParameterForInputMIMECodecParameter:)]) >+ outputCodecs = [getAVStreamDataParserClass() outputMIMECodecParameterForInputMIMECodecParameter:outputCodecs]; > > if (!contentTypeMeetsHardwareDecodeRequirements(parameters.type, parameters.contentTypesRequiringHardwareSupport)) > return 
MediaPlayer::IsNotSupported; > > NSString *typeString = [NSString stringWithFormat:@"%@; codecs=\"%@\"", (NSString *)parameters.type.containerType(), (NSString *)outputCodecs]; >- return [PAL::getAVURLAssetClass() isPlayableExtendedMIMEType:typeString] ? MediaPlayer::IsSupported : MediaPlayer::MayBeSupported;; >+ return [getAVURLAssetClass() isPlayableExtendedMIMEType:typeString] ? MediaPlayer::IsSupported : MediaPlayer::MayBeSupported;; > } > > #pragma mark - >@@ -697,7 +719,7 @@ void MediaPlayerPrivateMediaSourceAVFObjC::ensureLayer() > if (m_sampleBufferDisplayLayer) > return; > >- m_sampleBufferDisplayLayer = adoptNS([PAL::allocAVSampleBufferDisplayLayerInstance() init]); >+ m_sampleBufferDisplayLayer = adoptNS([allocAVSampleBufferDisplayLayerInstance() init]); > #ifndef NDEBUG > [m_sampleBufferDisplayLayer setName:@"MediaPlayerPrivateMediaSource AVSampleBufferDisplayLayer"]; > #endif >@@ -915,7 +937,7 @@ void MediaPlayerPrivateMediaSourceAVFObjC::flushPendingSizeChanges() > #if HAVE(AVSTREAMSESSION) > AVStreamSession* MediaPlayerPrivateMediaSourceAVFObjC::streamSession() > { >- if (!getAVStreamSessionClass() || ![PAL::getAVStreamSessionClass() instancesRespondToSelector:@selector(initWithStorageDirectoryAtURL:)]) >+ if (!getAVStreamSessionClass() || ![getAVStreamSessionClass() instancesRespondToSelector:@selector(initWithStorageDirectoryAtURL:)]) > return nil; > > if (!m_streamSession) { >@@ -929,7 +951,7 @@ AVStreamSession* MediaPlayerPrivateMediaSourceAVFObjC::streamSession() > } > > String storagePath = FileSystem::pathByAppendingComponent(storageDirectory, "SecureStop.plist"); >- m_streamSession = adoptNS([PAL::allocAVStreamSessionInstance() initWithStorageDirectoryAtURL:[NSURL fileURLWithPath:storagePath]]); >+ m_streamSession = adoptNS([allocAVStreamSessionInstance() initWithStorageDirectoryAtURL:[NSURL fileURLWithPath:storagePath]]); > } > return m_streamSession.get(); > } >diff --git a/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm b/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm >index cda97c91fda083def6acf5fac3ded2f448171d4c..d0e92b2a32c9101779939a444b7ba762214ae03b 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm >+++ b/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm >@@ -46,9 +46,23 @@ > #import <wtf/MainThread.h> > #import <wtf/NeverDestroyed.h> > >-#import "CoreVideoSoftLink.h" >+ >+#pragma mark - Soft Linking >+ > #import <pal/cf/CoreMediaSoftLink.h> >-#import <pal/cocoa/AVFoundationSoftLink.h> >+#import "CoreVideoSoftLink.h" >+ >+SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >+ >+SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVSampleBufferDisplayLayer) >+ >+SOFT_LINK_CONSTANT(AVFoundation, AVLayerVideoGravityResizeAspect, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVLayerVideoGravityResizeAspectFill, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVLayerVideoGravityResize, NSString *) >+ >+#define AVLayerVideoGravityResizeAspect getAVLayerVideoGravityResizeAspect() >+#define AVLayerVideoGravityResizeAspectFill getAVLayerVideoGravityResizeAspectFill() >+#define AVLayerVideoGravityResize getAVLayerVideoGravityResize() > > using namespace WebCore; > >@@ -119,7 +133,7 @@ using namespace WebCore; > if (!_parent) > return; > >- if ([object isKindOfClass:PAL::getAVSampleBufferDisplayLayerClass()]) { >+ if ([object isKindOfClass:getAVSampleBufferDisplayLayerClass()]) { > 
RetainPtr<AVSampleBufferDisplayLayer> layer = (AVSampleBufferDisplayLayer *)object; > ASSERT(layer.get() == _parent->displayLayer()); > >@@ -226,7 +240,7 @@ void MediaPlayerPrivateMediaStreamAVFObjC::registerMediaEngine(MediaEngineRegist > > bool MediaPlayerPrivateMediaStreamAVFObjC::isAvailable() > { >- return PAL::AVFoundationLibrary() && isCoreMediaFrameworkAvailable() && getAVSampleBufferDisplayLayerClass(); >+ return AVFoundationLibrary() && isCoreMediaFrameworkAvailable() && getAVSampleBufferDisplayLayerClass(); > } > > void MediaPlayerPrivateMediaStreamAVFObjC::getSupportedTypes(HashSet<String, ASCIICaseInsensitiveHash>& types) >@@ -470,7 +484,7 @@ void MediaPlayerPrivateMediaStreamAVFObjC::ensureLayers() > if (!m_mediaStreamPrivate || !m_mediaStreamPrivate->activeVideoTrack() || !m_mediaStreamPrivate->activeVideoTrack()->enabled()) > return; > >- m_sampleBufferDisplayLayer = adoptNS([PAL::allocAVSampleBufferDisplayLayerInstance() init]); >+ m_sampleBufferDisplayLayer = adoptNS([allocAVSampleBufferDisplayLayerInstance() init]); > if (!m_sampleBufferDisplayLayer) { > ERROR_LOG(LOGIDENTIFIER, "+[AVSampleBufferDisplayLayer alloc] failed."); > return; >diff --git a/Source/WebCore/platform/graphics/avfoundation/objc/MediaSampleAVFObjC.mm b/Source/WebCore/platform/graphics/avfoundation/objc/MediaSampleAVFObjC.mm >index 5c0f3dc061f05fc693a3787361efd8928f2209aa..ffc69d7785772dcad1a49ad7a4a1b07d6cb91eec 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/objc/MediaSampleAVFObjC.mm >+++ b/Source/WebCore/platform/graphics/avfoundation/objc/MediaSampleAVFObjC.mm >@@ -32,8 +32,8 @@ > #import <wtf/PrintStream.h> > #import <wtf/cf/TypeCastsCF.h> > >-#import "CoreVideoSoftLink.h" > #import <pal/cf/CoreMediaSoftLink.h> >+#import "CoreVideoSoftLink.h" > > using namespace PAL; > >diff --git a/Source/WebCore/platform/graphics/avfoundation/objc/SourceBufferPrivateAVFObjC.mm b/Source/WebCore/platform/graphics/avfoundation/objc/SourceBufferPrivateAVFObjC.mm >index 72ed5c9cf2fd0998274d628cd2d6599bc232795d..cb94d66592f5757d803d107d00a21801431858d7 100644 >--- a/Source/WebCore/platform/graphics/avfoundation/objc/SourceBufferPrivateAVFObjC.mm >+++ b/Source/WebCore/platform/graphics/avfoundation/objc/SourceBufferPrivateAVFObjC.mm >@@ -65,7 +65,29 @@ > #pragma mark - Soft Linking > > #import <pal/cf/CoreMediaSoftLink.h> >-#import <pal/cocoa/AVFoundationSoftLink.h> >+ >+SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >+ >+SOFT_LINK_CLASS(AVFoundation, AVAssetTrack) >+SOFT_LINK_CLASS(AVFoundation, AVStreamDataParser) >+ALLOW_NEW_API_WITHOUT_GUARDS_BEGIN >+SOFT_LINK_CLASS(AVFoundation, AVSampleBufferAudioRenderer) >+ALLOW_NEW_API_WITHOUT_GUARDS_END >+SOFT_LINK_CLASS(AVFoundation, AVSampleBufferDisplayLayer) >+SOFT_LINK_CLASS(AVFoundation, AVStreamSession) >+ >+SOFT_LINK_CONSTANT(AVFoundation, AVMediaCharacteristicVisual, NSString*) >+SOFT_LINK_CONSTANT(AVFoundation, AVMediaCharacteristicAudible, NSString*) >+SOFT_LINK_CONSTANT(AVFoundation, AVMediaCharacteristicLegible, NSString*) >+SOFT_LINK_CONSTANT(AVFoundation, AVSampleBufferDisplayLayerFailedToDecodeNotification, NSString*) >+SOFT_LINK_CONSTANT(AVFoundation, AVSampleBufferDisplayLayerFailedToDecodeNotificationErrorKey, NSString*) >+ >+#define AVSampleBufferDisplayLayerFailedToDecodeNotification getAVSampleBufferDisplayLayerFailedToDecodeNotification() >+#define AVSampleBufferDisplayLayerFailedToDecodeNotificationErrorKey getAVSampleBufferDisplayLayerFailedToDecodeNotificationErrorKey() >+ >+#define AVMediaCharacteristicVisual 
getAVMediaCharacteristicVisual() >+#define AVMediaCharacteristicAudible getAVMediaCharacteristicAudible() >+#define AVMediaCharacteristicLegible getAVMediaCharacteristicLegible() > > @interface AVSampleBufferDisplayLayer (WebCoreAVSampleBufferDisplayLayerQueueManagementPrivate) > - (void)prerollDecodeWithCompletionHandler:(void (^)(BOOL success))block; >@@ -332,7 +354,7 @@ ALLOW_NEW_API_WITHOUT_GUARDS_END > UNUSED_PARAM(keyPath); > ASSERT(_parent); > >- if ([object isKindOfClass:PAL::getAVSampleBufferDisplayLayerClass()]) { >+ if ([object isKindOfClass:getAVSampleBufferDisplayLayerClass()]) { > RetainPtr<AVSampleBufferDisplayLayer> layer = (AVSampleBufferDisplayLayer *)object; > ASSERT(_layers.contains(layer.get())); > >@@ -353,7 +375,7 @@ ALLOW_NEW_API_WITHOUT_GUARDS_END > } else > ASSERT_NOT_REACHED(); > >- } else if ([object isKindOfClass:PAL::getAVSampleBufferAudioRendererClass()]) { >+ } else if ([object isKindOfClass:getAVSampleBufferAudioRendererClass()]) { > ALLOW_NEW_API_WITHOUT_GUARDS_BEGIN > RetainPtr<AVSampleBufferAudioRenderer> renderer = (AVSampleBufferAudioRenderer *)object; > ALLOW_NEW_API_WITHOUT_GUARDS_END >@@ -464,7 +486,7 @@ Ref<SourceBufferPrivateAVFObjC> SourceBufferPrivateAVFObjC::create(MediaSourcePr > } > > SourceBufferPrivateAVFObjC::SourceBufferPrivateAVFObjC(MediaSourcePrivateAVFObjC* parent) >- : m_parser(adoptNS([PAL::allocAVStreamDataParserInstance() init])) >+ : m_parser(adoptNS([allocAVStreamDataParserInstance() init])) > , m_delegate(adoptNS([[WebAVStreamDataParserListener alloc] initWithParser:m_parser.get() parent:createWeakPtr()])) > , m_errorListener(adoptNS([[WebAVSampleBufferErrorListener alloc] initWithParent:createWeakPtr()])) > , m_isAppendingGroup(adoptOSObject(dispatch_group_create())) >@@ -476,8 +498,8 @@ SourceBufferPrivateAVFObjC::SourceBufferPrivateAVFObjC(MediaSourcePrivateAVFObjC > #endif > { > ALWAYS_LOG(LOGIDENTIFIER); >- >- if (![PAL::getAVSampleBufferDisplayLayerClass() instancesRespondToSelector:@selector(prerollDecodeWithCompletionHandler:)]) >+ >+ if (![getAVSampleBufferDisplayLayerClass() instancesRespondToSelector:@selector(prerollDecodeWithCompletionHandler:)]) > CMNotificationCenterAddListener(CMNotificationCenterGetDefaultLocalCenter(), reinterpret_cast<void*>(m_mapID), bufferWasConsumedCallback, kCMSampleBufferConsumerNotification_BufferConsumed, nullptr, 0); > > m_delegate.get().abortSemaphore = Box<Semaphore>::create(0); >@@ -494,7 +516,7 @@ SourceBufferPrivateAVFObjC::~SourceBufferPrivateAVFObjC() > destroyParser(); > destroyRenderers(); > >- if (![PAL::getAVSampleBufferDisplayLayerClass() instancesRespondToSelector:@selector(prerollDecodeWithCompletionHandler:)]) >+ if (![getAVSampleBufferDisplayLayerClass() instancesRespondToSelector:@selector(prerollDecodeWithCompletionHandler:)]) > CMNotificationCenterRemoveListener(CMNotificationCenterGetDefaultLocalCenter(), this, bufferWasConsumedCallback, kCMSampleBufferConsumerNotification_BufferConsumed, nullptr); > > if (m_hasSessionSemaphore) >@@ -887,7 +909,7 @@ void SourceBufferPrivateAVFObjC::trackDidChangeEnabled(AudioTrackPrivateMediaSou > RetainPtr<AVSampleBufferAudioRenderer> renderer; > ALLOW_NEW_API_WITHOUT_GUARDS_END > if (!m_audioRenderers.contains(trackID)) { >- renderer = adoptNS([PAL::allocAVSampleBufferAudioRendererInstance() init]); >+ renderer = adoptNS([allocAVSampleBufferAudioRendererInstance() init]); > auto weakThis = createWeakPtr(); > [renderer requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:^{ > if (weakThis) >@@ -1129,7 +1151,7 @@ 
void SourceBufferPrivateAVFObjC::enqueueSample(Ref<MediaSample>&& sample, const > if (m_mediaSource && !m_mediaSource->player()->hasAvailableVideoFrame() && !sample->isNonDisplaying()) { > DEBUG_LOG(LOGIDENTIFIER, "adding buffer attachment"); > >- bool havePrerollDecodeWithCompletionHandler = [PAL::getAVSampleBufferDisplayLayerClass() instancesRespondToSelector:@selector(prerollDecodeWithCompletionHandler:)]; >+ bool havePrerollDecodeWithCompletionHandler = [getAVSampleBufferDisplayLayerClass() instancesRespondToSelector:@selector(prerollDecodeWithCompletionHandler:)]; > > if (!havePrerollDecodeWithCompletionHandler) { > CMSampleBufferRef rawSampleCopy; >diff --git a/Source/WebCore/platform/graphics/ca/cocoa/PlatformCALayerCocoa.mm b/Source/WebCore/platform/graphics/ca/cocoa/PlatformCALayerCocoa.mm >index a48523cf5176b31f2acc4984b2ecd8e2c874f607..7daa25952bcbf679c535d4b32960e3d3c210b072 100644 >--- a/Source/WebCore/platform/graphics/ca/cocoa/PlatformCALayerCocoa.mm >+++ b/Source/WebCore/platform/graphics/ca/cocoa/PlatformCALayerCocoa.mm >@@ -63,7 +63,9 @@ > #import "ThemeMac.h" > #endif > >-#import <pal/cocoa/AVFoundationSoftLink.h> >+SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >+ >+SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVPlayerLayer) > > namespace WebCore { > >@@ -196,12 +198,12 @@ static NSString *toCAFilterType(PlatformCALayer::FilterType type) > > PlatformCALayer::LayerType PlatformCALayerCocoa::layerTypeForPlatformLayer(PlatformLayer* layer) > { >- if ([layer isKindOfClass:PAL::getAVPlayerLayerClass()]) >+ if ([layer isKindOfClass:getAVPlayerLayerClass()]) > return LayerTypeAVPlayerLayer; > > if ([layer isKindOfClass:objc_getClass("WebVideoContainerLayer")] > && layer.sublayers.count == 1 >- && [layer.sublayers[0] isKindOfClass:PAL::getAVPlayerLayerClass()]) >+ && [layer.sublayers[0] isKindOfClass:getAVPlayerLayerClass()]) > return LayerTypeAVPlayerLayer; > > if ([layer isKindOfClass:[WebGLLayer class]]) >@@ -261,7 +263,7 @@ PlatformCALayerCocoa::PlatformCALayerCocoa(LayerType layerType, PlatformCALayerC > layerClass = [WebTiledBackingLayer class]; > break; > case LayerTypeAVPlayerLayer: >- layerClass = PAL::getAVPlayerLayerClass(); >+ layerClass = getAVPlayerLayerClass(); > break; > case LayerTypeContentsProvidedLayer: > // We don't create PlatformCALayerCocoas wrapped around WebGLLayers or WebGPULayers. 
>@@ -358,7 +360,7 @@ Ref<PlatformCALayer> PlatformCALayerCocoa::clone(PlatformCALayerClient* owner) c > newLayer->updateCustomAppearance(customAppearance()); > > if (type == LayerTypeAVPlayerLayer) { >- ASSERT([newLayer->platformLayer() isKindOfClass:PAL::getAVPlayerLayerClass()]); >+ ASSERT([newLayer->platformLayer() isKindOfClass:getAVPlayerLayerClass()]); > > AVPlayerLayer *destinationPlayerLayer = static_cast<PlatformCALayerCocoa&>(newLayer.get()).avPlayerLayer(); > AVPlayerLayer *sourcePlayerLayer = avPlayerLayer(); >@@ -1269,12 +1271,12 @@ AVPlayerLayer *PlatformCALayerCocoa::avPlayerLayer() const > if (layerType() != LayerTypeAVPlayerLayer) > return nil; > >- if ([platformLayer() isKindOfClass:PAL::getAVPlayerLayerClass()]) >+ if ([platformLayer() isKindOfClass:getAVPlayerLayerClass()]) > return static_cast<AVPlayerLayer *>(platformLayer()); > > if ([platformLayer() isKindOfClass:objc_getClass("WebVideoContainerLayer")]) { > ASSERT([platformLayer() sublayers].count == 1); >- ASSERT([[platformLayer() sublayers][0] isKindOfClass:PAL::getAVPlayerLayerClass()]); >+ ASSERT([[platformLayer() sublayers][0] isKindOfClass:getAVPlayerLayerClass()]); > return static_cast<AVPlayerLayer *>([platformLayer() sublayers][0]); > } > >diff --git a/Source/WebCore/platform/graphics/cocoa/HEVCUtilitiesCocoa.mm b/Source/WebCore/platform/graphics/cocoa/HEVCUtilitiesCocoa.mm >index 50506676054828891c6c37c0771b60f158f71187..31c2d71b584c6c6f2a8b01c85ae7f0bcd0238916 100644 >--- a/Source/WebCore/platform/graphics/cocoa/HEVCUtilitiesCocoa.mm >+++ b/Source/WebCore/platform/graphics/cocoa/HEVCUtilitiesCocoa.mm >@@ -23,17 +23,19 @@ > * THE POSSIBILITY OF SUCH DAMAGE. > */ > >-#import "config.h" >-#import "HEVCUtilitiesCocoa.h" >+#include "config.h" >+#include "HEVCUtilitiesCocoa.h" > > #if PLATFORM(COCOA) > >-#import "FourCC.h" >-#import "HEVCUtilities.h" >-#import "MediaCapabilitiesInfo.h" >+#include "FourCC.h" >+#include "HEVCUtilities.h" >+#include "MediaCapabilitiesInfo.h" > >-#import "VideoToolboxSoftLink.h" >-#import <pal/cocoa/AVFoundationSoftLink.h> >+#include "VideoToolboxSoftLink.h" >+ >+SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >+SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVVideoCodecTypeHEVCWithAlpha, NSString *) > > namespace WebCore { > >@@ -41,10 +43,10 @@ bool validateHEVCParameters(HEVCParameterSet& parameters, MediaCapabilitiesInfo& > { > CMVideoCodecType codec = kCMVideoCodecType_HEVC; > if (hasAlphaChannel) { >- if (!PAL::AVFoundationLibrary() || !PAL::canLoad_AVFoundation_AVVideoCodecTypeHEVCWithAlpha()) >+ if (!AVFoundationLibrary() || !canLoadAVVideoCodecTypeHEVCWithAlpha()) > return false; > >- auto codecCode = FourCC::fromString(AVVideoCodecTypeHEVCWithAlpha); >+ auto codecCode = FourCC::fromString(getAVVideoCodecTypeHEVCWithAlpha()); > if (!codecCode) > return false; > >diff --git a/Source/WebCore/platform/ios/PlatformSpeechSynthesizerIOS.mm b/Source/WebCore/platform/ios/PlatformSpeechSynthesizerIOS.mm >index 4b378fa17e6ef55dffedc58014d7f3feee625ca0..5a0fe71aee9c4fdaca78c302cda72837bde1b01c 100644 >--- a/Source/WebCore/platform/ios/PlatformSpeechSynthesizerIOS.mm >+++ b/Source/WebCore/platform/ios/PlatformSpeechSynthesizerIOS.mm >@@ -32,36 +32,22 @@ > #import <AVFoundation/AVSpeechSynthesis.h> > #import <wtf/BlockObjCExceptions.h> > #import <wtf/RetainPtr.h> >+#import <wtf/SoftLinking.h> > >-#import <pal/cocoa/AVFoundationSoftLink.h> >+SOFT_LINK_FRAMEWORK(AVFoundation) >+SOFT_LINK_CLASS(AVFoundation, AVSpeechSynthesizer) >+SOFT_LINK_CLASS(AVFoundation, AVSpeechUtterance) 
>+SOFT_LINK_CLASS(AVFoundation, AVSpeechSynthesisVoice) > >-static float getAVSpeechUtteranceDefaultSpeechRate() >-{ >- static float value; >- static void* symbol; >- if (!symbol) { >- void* symbol = dlsym(PAL::AVFoundationLibrary(), "AVSpeechUtteranceDefaultSpeechRate"); >- RELEASE_ASSERT_WITH_MESSAGE(symbol, "%s", dlerror()); >- value = *static_cast<float const *>(symbol); >- } >- return value; >-} >- >-static float getAVSpeechUtteranceMaximumSpeechRate() >-{ >- static float value; >- static void* symbol; >- if (!symbol) { >- void* symbol = dlsym(PAL::AVFoundationLibrary(), "AVSpeechUtteranceMaximumSpeechRate"); >- RELEASE_ASSERT_WITH_MESSAGE(symbol, "%s", dlerror()); >- value = *static_cast<float const *>(symbol); >- } >- return value; >-} >+SOFT_LINK_CONSTANT(AVFoundation, AVSpeechUtteranceDefaultSpeechRate, float) >+SOFT_LINK_CONSTANT(AVFoundation, AVSpeechUtteranceMaximumSpeechRate, float) > > #define AVSpeechUtteranceDefaultSpeechRate getAVSpeechUtteranceDefaultSpeechRate() > #define AVSpeechUtteranceMaximumSpeechRate getAVSpeechUtteranceMaximumSpeechRate() > >+#define AVSpeechUtteranceClass getAVSpeechUtteranceClass() >+#define AVSpeechSynthesisVoiceClass getAVSpeechSynthesisVoiceClass() >+ > @interface WebSpeechSynthesisWrapper : NSObject<AVSpeechSynthesizerDelegate> > { > WebCore::PlatformSpeechSynthesizer* m_synthesizerObject; >@@ -110,7 +96,7 @@ static float getAVSpeechUtteranceMaximumSpeechRate() > > BEGIN_BLOCK_OBJC_EXCEPTIONS > if (!m_synthesizer) { >- m_synthesizer = adoptNS([PAL::allocAVSpeechSynthesizerInstance() init]); >+ m_synthesizer = adoptNS([allocAVSpeechSynthesizerInstance() init]); > [m_synthesizer setDelegate:self]; > } > >@@ -120,7 +106,7 @@ static float getAVSpeechUtteranceMaximumSpeechRate() > NSString *voiceLanguage = nil; > if (!utteranceVoice) { > if (utterance->lang().isEmpty()) >- voiceLanguage = [PAL::getAVSpeechSynthesisVoiceClass() currentLanguageCode]; >+ voiceLanguage = [AVSpeechSynthesisVoiceClass currentLanguageCode]; > else > voiceLanguage = utterance->lang(); > } else >@@ -128,9 +114,9 @@ static float getAVSpeechUtteranceMaximumSpeechRate() > > AVSpeechSynthesisVoice *avVoice = nil; > if (voiceLanguage) >- avVoice = [PAL::getAVSpeechSynthesisVoiceClass() voiceWithLanguage:voiceLanguage]; >+ avVoice = [AVSpeechSynthesisVoiceClass voiceWithLanguage:voiceLanguage]; > >- AVSpeechUtterance *avUtterance = [PAL::getAVSpeechUtteranceClass() speechUtteranceWithString:utterance->text()]; >+ AVSpeechUtterance *avUtterance = [AVSpeechUtteranceClass speechUtteranceWithString:utterance->text()]; > > [avUtterance setRate:[self mapSpeechRateToPlatformRate:utterance->rate()]]; > [avUtterance setVolume:utterance->volume()]; >@@ -258,7 +244,7 @@ PlatformSpeechSynthesizer::~PlatformSpeechSynthesizer() > void PlatformSpeechSynthesizer::initializeVoiceList() > { > BEGIN_BLOCK_OBJC_EXCEPTIONS >- for (AVSpeechSynthesisVoice *voice in [PAL::getAVSpeechSynthesisVoiceClass() speechVoices]) { >+ for (AVSpeechSynthesisVoice *voice in [AVSpeechSynthesisVoiceClass speechVoices]) { > NSString *language = [voice language]; > bool isDefault = true; > NSString *voiceURI = [voice identifier]; >diff --git a/Source/WebCore/platform/ios/VideoFullscreenInterfaceAVKit.mm b/Source/WebCore/platform/ios/VideoFullscreenInterfaceAVKit.mm >index 03ce20c90273378f4e862a82edf9d18e6c12442b..08724eb66c69b48bdfec0d236d720ac8defdc57b 100644 >--- a/Source/WebCore/platform/ios/VideoFullscreenInterfaceAVKit.mm >+++ b/Source/WebCore/platform/ios/VideoFullscreenInterfaceAVKit.mm >@@ -43,6 +43,7 @@ > 
#import <UIKit/UIWindow.h> > #import <objc/message.h> > #import <objc/runtime.h> >+#import <pal/ios/UIKitSoftLink.h> > #import <pal/spi/cocoa/AVKitSPI.h> > #import <pal/spi/ios/UIKitSPI.h> > #import <wtf/RetainPtr.h> >@@ -51,9 +52,14 @@ > > using namespace WebCore; > >+// Soft-linking headers must be included last since they #define functions, constants, etc. > #import <pal/cf/CoreMediaSoftLink.h> >-#import <pal/cocoa/AVFoundationSoftLink.h> >-#import <pal/ios/UIKitSoftLink.h> >+ >+SOFT_LINK_FRAMEWORK(AVFoundation) >+SOFT_LINK_CLASS(AVFoundation, AVPlayerLayer) >+SOFT_LINK_CONSTANT(AVFoundation, AVLayerVideoGravityResize, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVLayerVideoGravityResizeAspect, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVLayerVideoGravityResizeAspectFill, NSString *) > > SOFTLINK_AVKIT_FRAMEWORK() > SOFT_LINK_CLASS_OPTIONAL(AVKit, AVPictureInPictureController) >@@ -204,7 +210,7 @@ static VideoFullscreenInterfaceAVKit::ExitFullScreenReason convertToExitFullScre > self = [super init]; > if (self) { > [self setMasksToBounds:YES]; >- _videoGravity = AVLayerVideoGravityResizeAspect; >+ _videoGravity = getAVLayerVideoGravityResizeAspect(); > } > return self; > } >@@ -264,13 +270,13 @@ static VideoFullscreenInterfaceAVKit::ExitFullScreenReason convertToExitFullScre > FloatRect targetVideoFrame; > float videoAspectRatio = self.videoDimensions.width / self.videoDimensions.height; > >- if ([AVLayerVideoGravityResize isEqualToString:self.videoGravity]) { >+ if ([getAVLayerVideoGravityResize() isEqualToString:self.videoGravity]) { > sourceVideoFrame = self.modelVideoLayerFrame; > targetVideoFrame = self.bounds; >- } else if ([AVLayerVideoGravityResizeAspect isEqualToString:self.videoGravity]) { >+ } else if ([getAVLayerVideoGravityResizeAspect() isEqualToString:self.videoGravity]) { > sourceVideoFrame = largestRectWithAspectRatioInsideRect(videoAspectRatio, self.modelVideoLayerFrame); > targetVideoFrame = largestRectWithAspectRatioInsideRect(videoAspectRatio, self.bounds); >- } else if ([AVLayerVideoGravityResizeAspectFill isEqualToString:self.videoGravity]) { >+ } else if ([getAVLayerVideoGravityResizeAspectFill() isEqualToString:self.videoGravity]) { > sourceVideoFrame = smallestRectWithAspectRatioAroundRect(videoAspectRatio, self.modelVideoLayerFrame); > self.modelVideoLayerFrame = CGRectMake(0, 0, sourceVideoFrame.width(), sourceVideoFrame.height()); > if (auto* model = _fullscreenInterface->videoFullscreenModel()) >@@ -322,7 +328,7 @@ static VideoFullscreenInterfaceAVKit::ExitFullScreenReason convertToExitFullScre > #if PLATFORM(IOSMAC) > // FIXME<rdar://46011230>: remove this #if once this radar lands. 
> if (!videoGravity) >- videoGravity = AVLayerVideoGravityResizeAspect; >+ videoGravity = getAVLayerVideoGravityResizeAspect(); > #endif > > _videoGravity = videoGravity; >@@ -331,11 +337,11 @@ static VideoFullscreenInterfaceAVKit::ExitFullScreenReason convertToExitFullScre > return; > > WebCore::MediaPlayerEnums::VideoGravity gravity = WebCore::MediaPlayerEnums::VideoGravityResizeAspect; >- if (videoGravity == AVLayerVideoGravityResize) >+ if (videoGravity == getAVLayerVideoGravityResize()) > gravity = WebCore::MediaPlayerEnums::VideoGravityResize; >- if (videoGravity == AVLayerVideoGravityResizeAspect) >+ if (videoGravity == getAVLayerVideoGravityResizeAspect()) > gravity = WebCore::MediaPlayerEnums::VideoGravityResizeAspect; >- else if (videoGravity == AVLayerVideoGravityResizeAspectFill) >+ else if (videoGravity == getAVLayerVideoGravityResizeAspectFill()) > gravity = WebCore::MediaPlayerEnums::VideoGravityResizeAspectFill; > else > ASSERT_NOT_REACHED(); >@@ -356,9 +362,9 @@ static VideoFullscreenInterfaceAVKit::ExitFullScreenReason convertToExitFullScre > > float videoAspectRatio = self.videoDimensions.width / self.videoDimensions.height; > >- if ([AVLayerVideoGravityResizeAspect isEqualToString:self.videoGravity]) >+ if ([getAVLayerVideoGravityResizeAspect() isEqualToString:self.videoGravity]) > return largestRectWithAspectRatioInsideRect(videoAspectRatio, self.bounds); >- if ([AVLayerVideoGravityResizeAspectFill isEqualToString:self.videoGravity]) >+ if ([getAVLayerVideoGravityResizeAspectFill() isEqualToString:self.videoGravity]) > return smallestRectWithAspectRatioAroundRect(videoAspectRatio, self.bounds); > > return self.bounds; >@@ -454,7 +460,7 @@ static void WebAVPlayerLayerView_startRoutingVideoToPictureInPicturePlayerLayerV > > WebAVPlayerLayer *playerLayer = (WebAVPlayerLayer *)[playerLayerView playerLayer]; > WebAVPlayerLayer *pipPlayerLayer = (WebAVPlayerLayer *)[pipView layer]; >- [playerLayer setVideoGravity:AVLayerVideoGravityResizeAspect]; >+ [playerLayer setVideoGravity:getAVLayerVideoGravityResizeAspect()]; > [pipPlayerLayer setVideoSublayer:playerLayer.videoSublayer]; > [pipPlayerLayer setVideoDimensions:playerLayer.videoDimensions]; > [pipPlayerLayer setVideoGravity:playerLayer.videoGravity]; >diff --git a/Source/WebCore/platform/mac/SerializedPlatformRepresentationMac.mm b/Source/WebCore/platform/mac/SerializedPlatformRepresentationMac.mm >index c2953ca03e805cfdc3050cf60a35d84cffc95879..e997b36a7b4a50a1633ab0aeaaf6aa1b78f5db65 100644 >--- a/Source/WebCore/platform/mac/SerializedPlatformRepresentationMac.mm >+++ b/Source/WebCore/platform/mac/SerializedPlatformRepresentationMac.mm >@@ -38,9 +38,14 @@ > #import <JavaScriptCore/JSObjectRef.h> > #import <JavaScriptCore/JavaScriptCore.h> > #import <objc/runtime.h> >+#import <wtf/SoftLinking.h> > #import <wtf/text/Base64.h> > >-#import <pal/cocoa/AVFoundationSoftLink.h> >+typedef AVMetadataItem AVMetadataItemType; >+SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >+SOFT_LINK_CLASS(AVFoundation, AVMetadataItem) >+#define AVMetadataItem getAVMetadataItemClass() >+ > > namespace WebCore { > >@@ -48,7 +53,7 @@ namespace WebCore { > static JSValue *jsValueWithDataInContext(NSData *, JSContext *); > static JSValue *jsValueWithArrayInContext(NSArray *, JSContext *); > static JSValue *jsValueWithDictionaryInContext(NSDictionary *, JSContext *); >-static JSValue *jsValueWithAVMetadataItemInContext(AVMetadataItem *, JSContext *); >+static JSValue *jsValueWithAVMetadataItemInContext(AVMetadataItemType *, JSContext *); > static JSValue 
*jsValueWithValueInContext(id, JSContext *); > #endif > >@@ -131,7 +136,7 @@ static JSValue *jsValueWithValueInContext(id value, JSContext *context) > if ([value isKindOfClass:[NSData class]]) > return jsValueWithDataInContext(value, context); > >- if ([value isKindOfClass:PAL::getAVMetadataItemClass()]) >+ if ([value isKindOfClass:[AVMetadataItem class]]) > return jsValueWithAVMetadataItemInContext(value, context); > > return nil; >@@ -194,7 +199,7 @@ static JSValue *jsValueWithDictionaryInContext(NSDictionary *dictionary, JSConte > return result; > } > >-static JSValue *jsValueWithAVMetadataItemInContext(AVMetadataItem *item, JSContext *context) >+static JSValue *jsValueWithAVMetadataItemInContext(AVMetadataItemType *item, JSContext *context) > { > NSMutableDictionary *dictionary = [NSMutableDictionary dictionary]; > >diff --git a/Source/WebCore/platform/mediarecorder/cocoa/MediaRecorderPrivateWriterCocoa.mm b/Source/WebCore/platform/mediarecorder/cocoa/MediaRecorderPrivateWriterCocoa.mm >index 75d46567869841dc04cd9dad39af9158bd717284..2b784a4deb6a198bd583006e539dff54edbb1f1a 100644 >--- a/Source/WebCore/platform/mediarecorder/cocoa/MediaRecorderPrivateWriterCocoa.mm >+++ b/Source/WebCore/platform/mediarecorder/cocoa/MediaRecorderPrivateWriterCocoa.mm >@@ -37,15 +37,52 @@ > #include <pal/cf/CoreMediaSoftLink.h> > #include <wtf/FileSystem.h> > >-#import <pal/cocoa/AVFoundationSoftLink.h> >+typedef AVAssetWriter AVAssetWriterType; >+typedef AVAssetWriterInput AVAssetWriterInputType; >+ >+SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >+ >+SOFT_LINK_CLASS(AVFoundation, AVAssetWriter) >+SOFT_LINK_CLASS(AVFoundation, AVAssetWriterInput) >+ >+SOFT_LINK_CONSTANT(AVFoundation, AVFileTypeMPEG4, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVVideoCodecKey, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVVideoCodecH264, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVVideoWidthKey, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVVideoHeightKey, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeVideo, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeAudio, NSString *) >+ >+SOFT_LINK_CONSTANT(AVFoundation, AVVideoExpectedSourceFrameRateKey, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVVideoProfileLevelKey, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVVideoAverageBitRateKey, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVVideoMaxKeyFrameIntervalKey, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVVideoProfileLevelH264MainAutoLevel, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVVideoCompressionPropertiesKey, NSString *) >+ >+#define AVFileTypeMPEG4 getAVFileTypeMPEG4() >+#define AVMediaTypeAudio getAVMediaTypeAudio() >+#define AVMediaTypeVideo getAVMediaTypeVideo() >+#define AVVideoCodecKey getAVVideoCodecKey() >+#define AVVideoCodecH264 getAVVideoCodecH264() >+#define AVVideoWidthKey getAVVideoWidthKey() >+#define AVVideoHeightKey getAVVideoHeightKey() >+ >+#define AVVideoExpectedSourceFrameRateKey getAVVideoExpectedSourceFrameRateKey() >+#define AVVideoProfileLevelKey getAVVideoProfileLevelKey() >+#define AVVideoAverageBitRateKey getAVVideoAverageBitRateKey() >+#define AVVideoMaxKeyFrameIntervalKey getAVVideoMaxKeyFrameIntervalKey() >+#define AVVideoProfileLevelH264MainAutoLevel getAVVideoProfileLevelH264MainAutoLevel() >+#define AVVideoCompressionPropertiesKey getAVVideoCompressionPropertiesKey() >+ >+SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVEncoderBitRateKey, NSString *) >+SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVFormatIDKey, NSString *) 
>+SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVNumberOfChannelsKey, NSString *) >+SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVSampleRateKey, NSString *) > >-#undef AVEncoderBitRateKey > #define AVEncoderBitRateKey getAVEncoderBitRateKeyWithFallback() >-#undef AVFormatIDKey > #define AVFormatIDKey getAVFormatIDKeyWithFallback() >-#undef AVNumberOfChannelsKey > #define AVNumberOfChannelsKey getAVNumberOfChannelsKeyWithFallback() >-#undef AVSampleRateKey > #define AVSampleRateKey getAVSampleRateKeyWithFallback() > > namespace WebCore { >@@ -54,8 +91,8 @@ using namespace PAL; > > static NSString *getAVFormatIDKeyWithFallback() > { >- if (PAL::canLoad_AVFoundation_AVFormatIDKey()) >- return PAL::get_AVFoundation_AVFormatIDKey(); >+ if (canLoadAVFormatIDKey()) >+ return getAVFormatIDKey(); > > RELEASE_LOG_ERROR(Media, "Failed to load AVFormatIDKey"); > return @"AVFormatIDKey"; >@@ -63,8 +100,8 @@ static NSString *getAVFormatIDKeyWithFallback() > > static NSString *getAVNumberOfChannelsKeyWithFallback() > { >- if (PAL::canLoad_AVFoundation_AVNumberOfChannelsKey()) >- return PAL::get_AVFoundation_AVNumberOfChannelsKey(); >+ if (canLoadAVNumberOfChannelsKey()) >+ return getAVNumberOfChannelsKey(); > > RELEASE_LOG_ERROR(Media, "Failed to load AVNumberOfChannelsKey"); > return @"AVNumberOfChannelsKey"; >@@ -72,8 +109,8 @@ static NSString *getAVNumberOfChannelsKeyWithFallback() > > static NSString *getAVSampleRateKeyWithFallback() > { >- if (PAL::canLoad_AVFoundation_AVSampleRateKey()) >- return PAL::get_AVFoundation_AVSampleRateKey(); >+ if (canLoadAVSampleRateKey()) >+ return getAVSampleRateKey(); > > RELEASE_LOG_ERROR(Media, "Failed to load AVSampleRateKey"); > return @"AVSampleRateKey"; >@@ -81,8 +118,8 @@ static NSString *getAVSampleRateKeyWithFallback() > > static NSString *getAVEncoderBitRateKeyWithFallback() > { >- if (PAL::canLoad_AVFoundation_AVEncoderBitRateKey()) >- return PAL::get_AVFoundation_AVEncoderBitRateKey(); >+ if (canLoadAVEncoderBitRateKey()) >+ return getAVEncoderBitRateKey(); > > RELEASE_LOG_ERROR(Media, "Failed to load AVEncoderBitRateKey"); > return @"AVEncoderBitRateKey"; >@@ -97,7 +134,7 @@ RefPtr<MediaRecorderPrivateWriter> MediaRecorderPrivateWriter::create(const Medi > NSURL *outputURL = [NSURL fileURLWithPath:path]; > String filePath = [path UTF8String]; > NSError *error = nil; >- auto avAssetWriter = adoptNS([PAL::allocAVAssetWriterInstance() initWithURL:outputURL fileType:AVFileTypeMPEG4 error:&error]); >+ auto avAssetWriter = adoptNS([allocAVAssetWriterInstance() initWithURL:outputURL fileType:AVFileTypeMPEG4 error:&error]); > if (error) { > RELEASE_LOG_ERROR(MediaStream, "create AVAssetWriter instance failed with error code %ld", (long)error.code); > return nullptr; >@@ -160,7 +197,7 @@ bool MediaRecorderPrivateWriter::setVideoInput(int width, int height) > AVVideoCompressionPropertiesKey: compressionProperties > }; > >- m_videoInput = adoptNS([PAL::allocAVAssetWriterInputInstance() initWithMediaType:AVMediaTypeVideo outputSettings:videoSettings sourceFormatHint:nil]); >+ m_videoInput = adoptNS([allocAVAssetWriterInputInstance() initWithMediaType:AVMediaTypeVideo outputSettings:videoSettings sourceFormatHint:nil]); > [m_videoInput setExpectsMediaDataInRealTime:true]; > > if (![m_writer canAddInput:m_videoInput.get()]) { >@@ -184,7 +221,7 @@ bool MediaRecorderPrivateWriter::setAudioInput() > AVSampleRateKey : @(22050) > }; > >- m_audioInput = adoptNS([PAL::allocAVAssetWriterInputInstance() initWithMediaType:AVMediaTypeAudio outputSettings:audioSettings 
sourceFormatHint:nil]); >+ m_audioInput = adoptNS([allocAVAssetWriterInputInstance() initWithMediaType:AVMediaTypeAudio outputSettings:audioSettings sourceFormatHint:nil]); > [m_audioInput setExpectsMediaDataInRealTime:true]; > > if (![m_writer canAddInput:m_audioInput.get()]) { >diff --git a/Source/WebCore/platform/mediastream/RealtimeVideoSource.h b/Source/WebCore/platform/mediastream/RealtimeVideoSource.h >index 99bacbe242399ba46459c3bfda8c9a0daff18ea0..ac34400d5240a1036ef420af136956e0fd26fec1 100644 >--- a/Source/WebCore/platform/mediastream/RealtimeVideoSource.h >+++ b/Source/WebCore/platform/mediastream/RealtimeVideoSource.h >@@ -27,6 +27,7 @@ > > #if ENABLE(MEDIA_STREAM) > >+#include "FontCascade.h" > #include "ImageBuffer.h" > #include "MediaSample.h" > #include "RealtimeMediaSource.h" >diff --git a/Source/WebCore/platform/mediastream/VideoPreset.h b/Source/WebCore/platform/mediastream/VideoPreset.h >index 2033c76db65fa39e6adca39dd82c4d092c7586ff..f99ff0a38c744f7613d79ad9a5ae243da019fe2c 100644 >--- a/Source/WebCore/platform/mediastream/VideoPreset.h >+++ b/Source/WebCore/platform/mediastream/VideoPreset.h >@@ -27,6 +27,7 @@ > > #if ENABLE(MEDIA_STREAM) > >+#include "FontCascade.h" > #include "ImageBuffer.h" > #include "MediaSample.h" > #include "RealtimeMediaSource.h" >diff --git a/Source/WebCore/platform/mediastream/ios/AVAudioSessionCaptureDeviceManager.mm b/Source/WebCore/platform/mediastream/ios/AVAudioSessionCaptureDeviceManager.mm >index 46050d93834f3a937fee3755737deaa4e2c9c0a5..0406d6c9b531d5297a8938b274fb38a603f09440 100644 >--- a/Source/WebCore/platform/mediastream/ios/AVAudioSessionCaptureDeviceManager.mm >+++ b/Source/WebCore/platform/mediastream/ios/AVAudioSessionCaptureDeviceManager.mm >@@ -23,17 +23,20 @@ > * THE POSSIBILITY OF SUCH DAMAGE. 
> */ > >-#import "config.h" >-#import "AVAudioSessionCaptureDeviceManager.h" >+#include "config.h" >+#include "AVAudioSessionCaptureDeviceManager.h" > > #if ENABLE(MEDIA_STREAM) && PLATFORM(IOS_FAMILY) > >-#import "AVAudioSessionCaptureDevice.h" >-#import "RealtimeMediaSourceCenter.h" >-#import <AVFoundation/AVAudioSession.h> >-#import <wtf/Vector.h> >+#include "AVAudioSessionCaptureDevice.h" >+#include "RealtimeMediaSourceCenter.h" >+#include <AVFoundation/AVAudioSession.h> >+#include <wtf/SoftLinking.h> >+#include <wtf/Vector.h> > >-#import <pal/cocoa/AVFoundationSoftLink.h> >+SOFT_LINK_FRAMEWORK(AVFoundation) >+SOFT_LINK_CLASS(AVFoundation, AVAudioSession) >+#define AVAudioSession getAVAudioSessionClass() > > void* AvailableInputsContext = &AvailableInputsContext; > >@@ -124,13 +127,13 @@ void AVAudioSessionCaptureDeviceManager::refreshAudioCaptureDevices() > m_listener = adoptNS([[WebAVAudioSessionAvailableInputsListener alloc] initWithCallback:[this] { > refreshAudioCaptureDevices(); > }]); >- [[PAL::getAVAudioSessionClass() sharedInstance] addObserver:m_listener.get() forKeyPath:@"availableInputs" options:0 context:AvailableInputsContext]; >+ [[AVAudioSession sharedInstance] addObserver:m_listener.get() forKeyPath:@"availableInputs" options:0 context:AvailableInputsContext]; > } > > Vector<AVAudioSessionCaptureDevice> newAudioDevices; > Vector<CaptureDevice> newDevices; > >- for (AVAudioSessionPortDescription *portDescription in [PAL::getAVAudioSessionClass() sharedInstance].availableInputs) { >+ for (AVAudioSessionPortDescription *portDescription in [AVAudioSession sharedInstance].availableInputs) { > auto audioDevice = AVAudioSessionCaptureDevice::create(portDescription); > newDevices.append(audioDevice); > newAudioDevices.append(WTFMove(audioDevice)); >diff --git a/Source/WebCore/platform/mediastream/ios/CoreAudioCaptureSourceIOS.mm b/Source/WebCore/platform/mediastream/ios/CoreAudioCaptureSourceIOS.mm >index 6fb3dad39c2d9e978dc521b7bb83f699a93253ff..2ed832427412244f2d6f38604c2093a41c543a55 100644 >--- a/Source/WebCore/platform/mediastream/ios/CoreAudioCaptureSourceIOS.mm >+++ b/Source/WebCore/platform/mediastream/ios/CoreAudioCaptureSourceIOS.mm >@@ -31,8 +31,21 @@ > #import "Logging.h" > #import <AVFoundation/AVAudioSession.h> > #import <wtf/MainThread.h> >+#import <wtf/SoftLinking.h> > >-#import <pal/cocoa/AVFoundationSoftLink.h> >+typedef AVAudioSession AVAudioSessionType; >+ >+SOFT_LINK_FRAMEWORK(AVFoundation) >+SOFT_LINK_CLASS(AVFoundation, AVAudioSession) >+ >+SOFT_LINK_CONSTANT(AVFoundation, AVAudioSessionInterruptionNotification, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVAudioSessionInterruptionTypeKey, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVAudioSessionMediaServicesWereResetNotification, NSString *) >+ >+#define AVAudioSession getAVAudioSessionClass() >+#define AVAudioSessionInterruptionNotification getAVAudioSessionInterruptionNotification() >+#define AVAudioSessionInterruptionTypeKey getAVAudioSessionInterruptionTypeKey() >+#define AVAudioSessionMediaServicesWereResetNotification getAVAudioSessionMediaServicesWereResetNotification() > > using namespace WebCore; > >@@ -55,7 +68,7 @@ using namespace WebCore; > _callback = callback; > > NSNotificationCenter* center = [NSNotificationCenter defaultCenter]; >- AVAudioSession* session = [PAL::getAVAudioSessionClass() sharedInstance]; >+ AVAudioSessionType* session = [AVAudioSession sharedInstance]; > > [center addObserver:self selector:@selector(handleInterruption:) name:AVAudioSessionInterruptionNotification 
object:session]; > [center addObserver:self selector:@selector(sessionMediaServicesWereReset:) name:AVAudioSessionMediaServicesWereResetNotification object:session]; >@@ -82,7 +95,7 @@ using namespace WebCore; > > if ([[notification.userInfo valueForKey:AVAudioSessionInterruptionTypeKey] intValue] == AVAudioSessionInterruptionTypeEnded) { > NSError *error = nil; >- [[PAL::getAVAudioSessionClass() sharedInstance] setActive:YES error:&error]; >+ [[AVAudioSession sharedInstance] setActive:YES error:&error]; > > #if !LOG_DISABLED > if (error) >diff --git a/Source/WebCore/platform/mediastream/mac/AVCaptureDeviceManager.mm b/Source/WebCore/platform/mediastream/mac/AVCaptureDeviceManager.mm >index a9ea3e1b3ecaa3f93916e6d157462d2313e7cd52..10626aaef8aa0233213a6f53f9aab997a7055332 100644 >--- a/Source/WebCore/platform/mediastream/mac/AVCaptureDeviceManager.mm >+++ b/Source/WebCore/platform/mediastream/mac/AVCaptureDeviceManager.mm >@@ -41,8 +41,27 @@ > #import <objc/runtime.h> > #import <wtf/MainThread.h> > #import <wtf/NeverDestroyed.h> >+#import <wtf/SoftLinking.h> > >-#import <pal/cocoa/AVFoundationSoftLink.h> >+typedef AVCaptureDevice AVCaptureDeviceTypedef; >+typedef AVCaptureSession AVCaptureSessionType; >+ >+SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >+ >+SOFT_LINK_CLASS(AVFoundation, AVCaptureDevice) >+SOFT_LINK_CLASS(AVFoundation, AVCaptureSession) >+ >+SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeAudio, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeMuxed, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeVideo, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVCaptureDeviceWasConnectedNotification, NSString *) >+SOFT_LINK_CONSTANT(AVFoundation, AVCaptureDeviceWasDisconnectedNotification, NSString *) >+ >+#define AVMediaTypeAudio getAVMediaTypeAudio() >+#define AVMediaTypeMuxed getAVMediaTypeMuxed() >+#define AVMediaTypeVideo getAVMediaTypeVideo() >+#define AVCaptureDeviceWasConnectedNotification getAVCaptureDeviceWasConnectedNotification() >+#define AVCaptureDeviceWasDisconnectedNotification getAVCaptureDeviceWasDisconnectedNotification() > > using namespace WebCore; > >@@ -80,7 +99,7 @@ const Vector<CaptureDevice>& AVCaptureDeviceManager::captureDevices() > return captureDevicesInternal(); > } > >-inline static bool deviceIsAvailable(AVCaptureDevice *device) >+inline static bool deviceIsAvailable(AVCaptureDeviceTypedef *device) > { > if (![device isConnected]) > return false; >@@ -95,20 +114,20 @@ inline static bool deviceIsAvailable(AVCaptureDevice *device) > > void AVCaptureDeviceManager::updateCachedAVCaptureDevices() > { >- auto* currentDevices = [PAL::getAVCaptureDeviceClass() devices]; >+ auto* currentDevices = [getAVCaptureDeviceClass() devices]; > auto changedDevices = adoptNS([[NSMutableArray alloc] init]); >- for (AVCaptureDevice *cachedDevice in m_avCaptureDevices.get()) { >+ for (AVCaptureDeviceTypedef *cachedDevice in m_avCaptureDevices.get()) { > if (![currentDevices containsObject:cachedDevice]) > [changedDevices addObject:cachedDevice]; > } > > if ([changedDevices count]) { >- for (AVCaptureDevice *device in changedDevices.get()) >+ for (AVCaptureDeviceTypedef *device in changedDevices.get()) > [device removeObserver:m_objcObserver.get() forKeyPath:@"suspended"]; > [m_avCaptureDevices removeObjectsInArray:changedDevices.get()]; > } > >- for (AVCaptureDevice *device in currentDevices) { >+ for (AVCaptureDeviceTypedef *device in currentDevices) { > > if (![device hasMediaType:AVMediaTypeVideo] && ![device hasMediaType:AVMediaTypeMuxed]) > continue; >@@ 
-132,9 +151,9 @@ void AVCaptureDeviceManager::refreshCaptureDevices() > updateCachedAVCaptureDevices(); > > bool deviceHasChanged = false; >- auto* currentDevices = [PAL::getAVCaptureDeviceClass() devices]; >+ auto* currentDevices = [getAVCaptureDeviceClass() devices]; > Vector<CaptureDevice> deviceList; >- for (AVCaptureDevice *platformDevice in currentDevices) { >+ for (AVCaptureDeviceTypedef *platformDevice in currentDevices) { > > if (![platformDevice hasMediaType:AVMediaTypeVideo] && ![platformDevice hasMediaType:AVMediaTypeMuxed]) > continue; >@@ -160,7 +179,7 @@ void AVCaptureDeviceManager::refreshCaptureDevices() > > bool AVCaptureDeviceManager::isAvailable() > { >- return PAL::AVFoundationLibrary(); >+ return AVFoundationLibrary(); > } > > AVCaptureDeviceManager& AVCaptureDeviceManager::singleton() >@@ -178,7 +197,7 @@ AVCaptureDeviceManager::~AVCaptureDeviceManager() > { > [[NSNotificationCenter defaultCenter] removeObserver:m_objcObserver.get()]; > [m_objcObserver disconnect]; >- for (AVCaptureDevice *device in m_avCaptureDevices.get()) >+ for (AVCaptureDeviceTypedef *device in m_avCaptureDevices.get()) > [device removeObserver:m_objcObserver.get() forKeyPath:@"suspended"]; > } > >diff --git a/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm b/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm >index 85144adf4590a6d8c42fe05b813fc51fa8b3cb3b..7f3ec82b3a8f3cfab79d84fe52e4a23c8d40c8c8 100644 >--- a/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm >+++ b/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm >@@ -45,9 +45,55 @@ > #import <AVFoundation/AVError.h> > #import <objc/runtime.h> > >-#import "CoreVideoSoftLink.h" >-#import <pal/cocoa/AVFoundationSoftLink.h> > #import <pal/cf/CoreMediaSoftLink.h> >+#import "CoreVideoSoftLink.h" >+ >+typedef AVCaptureConnection AVCaptureConnectionType; >+typedef AVCaptureDevice AVCaptureDeviceTypedef; >+typedef AVCaptureDeviceFormat AVCaptureDeviceFormatType; >+typedef AVCaptureDeviceInput AVCaptureDeviceInputType; >+typedef AVCaptureOutput AVCaptureOutputType; >+typedef AVCaptureVideoDataOutput AVCaptureVideoDataOutputType; >+typedef AVFrameRateRange AVFrameRateRangeType; >+typedef AVCaptureSession AVCaptureSessionType; >+ >+SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >+ >+SOFT_LINK_CLASS(AVFoundation, AVCaptureConnection) >+SOFT_LINK_CLASS(AVFoundation, AVCaptureDevice) >+SOFT_LINK_CLASS(AVFoundation, AVCaptureDeviceFormat) >+SOFT_LINK_CLASS(AVFoundation, AVCaptureDeviceInput) >+SOFT_LINK_CLASS(AVFoundation, AVCaptureOutput) >+SOFT_LINK_CLASS(AVFoundation, AVCaptureVideoDataOutput) >+SOFT_LINK_CLASS(AVFoundation, AVFrameRateRange) >+SOFT_LINK_CLASS(AVFoundation, AVCaptureSession) >+ >+#define AVCaptureConnection getAVCaptureConnectionClass() >+#define AVCaptureDevice getAVCaptureDeviceClass() >+#define AVCaptureDeviceFormat getAVCaptureDeviceFormatClass() >+#define AVCaptureDeviceInput getAVCaptureDeviceInputClass() >+#define AVCaptureOutput getAVCaptureOutputClass() >+#define AVCaptureVideoDataOutput getAVCaptureVideoDataOutputClass() >+#define AVFrameRateRange getAVFrameRateRangeClass() >+ >+SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeVideo, NSString *) >+ >+SOFT_LINK_CONSTANT(AVFoundation, AVCaptureDeviceWasDisconnectedNotification, NSString *) >+#define AVCaptureDeviceWasDisconnectedNotification getAVCaptureDeviceWasDisconnectedNotification() >+ >+#if PLATFORM(IOS_FAMILY) >+SOFT_LINK_POINTER_OPTIONAL(AVFoundation, AVCaptureSessionRuntimeErrorNotification, NSString *) 
>+SOFT_LINK_POINTER_OPTIONAL(AVFoundation, AVCaptureSessionWasInterruptedNotification, NSString *) >+SOFT_LINK_POINTER_OPTIONAL(AVFoundation, AVCaptureSessionInterruptionEndedNotification, NSString *) >+SOFT_LINK_POINTER_OPTIONAL(AVFoundation, AVCaptureSessionInterruptionReasonKey, NSString *) >+SOFT_LINK_POINTER_OPTIONAL(AVFoundation, AVCaptureSessionErrorKey, NSString *) >+ >+#define AVCaptureSessionRuntimeErrorNotification getAVCaptureSessionRuntimeErrorNotification() >+#define AVCaptureSessionWasInterruptedNotification getAVCaptureSessionWasInterruptedNotification() >+#define AVCaptureSessionInterruptionEndedNotification getAVCaptureSessionInterruptionEndedNotification() >+#define AVCaptureSessionInterruptionReasonKey getAVCaptureSessionInterruptionReasonKey() >+#define AVCaptureSessionErrorKey getAVCaptureSessionErrorKey() >+#endif > > using namespace WebCore; > using namespace PAL; >@@ -60,7 +106,7 @@ using namespace PAL; > -(void)disconnect; > -(void)addNotificationObservers; > -(void)removeNotificationObservers; >--(void)captureOutput:(AVCaptureOutput*)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection*)connection; >+-(void)captureOutput:(AVCaptureOutputType*)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnectionType*)connection; > -(void)observeValueForKeyPath:keyPath ofObject:(id)object change:(NSDictionary*)change context:(void*)context; > #if PLATFORM(IOS_FAMILY) > -(void)sessionRuntimeError:(NSNotification*)notification; >@@ -94,23 +140,23 @@ static dispatch_queue_t globaVideoCaptureSerialQueue() > > class AVVideoPreset : public VideoPreset { > public: >- static Ref<AVVideoPreset> create(IntSize size, Vector<FrameRateRange>&& frameRateRanges, AVCaptureDeviceFormat* format) >+ static Ref<AVVideoPreset> create(IntSize size, Vector<FrameRateRange>&& frameRateRanges, AVCaptureDeviceFormatType* format) > { > return adoptRef(*new AVVideoPreset(size, WTFMove(frameRateRanges), format)); > } > >- AVVideoPreset(IntSize size, Vector<FrameRateRange>&& frameRateRanges, AVCaptureDeviceFormat* format) >+ AVVideoPreset(IntSize size, Vector<FrameRateRange>&& frameRateRanges, AVCaptureDeviceFormatType* format) > : VideoPreset(size, WTFMove(frameRateRanges), AVCapture) > , format(format) > { > } > >- RetainPtr<AVCaptureDeviceFormat> format; >+ RetainPtr<AVCaptureDeviceFormatType> format; > }; > > CaptureSourceOrError AVVideoCaptureSource::create(String&& id, String&& hashSalt, const MediaConstraints* constraints) > { >- AVCaptureDevice *device = [PAL::getAVCaptureDeviceClass() deviceWithUniqueID:id]; >+ AVCaptureDeviceTypedef *device = [getAVCaptureDeviceClass() deviceWithUniqueID:id]; > if (!device) > return { }; > >@@ -124,7 +170,7 @@ CaptureSourceOrError AVVideoCaptureSource::create(String&& id, String&& hashSalt > return CaptureSourceOrError(WTFMove(source)); > } > >-AVVideoCaptureSource::AVVideoCaptureSource(AVCaptureDevice* device, String&& id, String&& hashSalt) >+AVVideoCaptureSource::AVVideoCaptureSource(AVCaptureDeviceTypedef* device, String&& id, String&& hashSalt) > : RealtimeVideoSource(device.localizedName, WTFMove(id), WTFMove(hashSalt)) > , m_objcObserver(adoptNS([[WebCoreAVVideoCaptureSourceObserver alloc] initWithCallback:this])) > , m_device(device) >@@ -259,7 +305,7 @@ const RealtimeMediaSourceCapabilities& AVVideoCaptureSource::capabilities() > RealtimeMediaSourceCapabilities capabilities(settings().supportedConstraints()); > capabilities.setDeviceId(hashedId()); > >- 
AVCaptureDevice *videoDevice = device(); >+ AVCaptureDeviceTypedef *videoDevice = device(); > if ([videoDevice position] == AVCaptureDevicePositionFront) > capabilities.addFacingMode(RealtimeMediaSourceSettings::User); > if ([videoDevice position] == AVCaptureDevicePositionBack) >@@ -375,9 +421,9 @@ static inline int sensorOrientation(AVCaptureVideoOrientation videoOrientation) > #endif > } > >-static inline int sensorOrientationFromVideoOutput(AVCaptureVideoDataOutput* videoOutput) >+static inline int sensorOrientationFromVideoOutput(AVCaptureVideoDataOutputType* videoOutput) > { >- AVCaptureConnection* connection = [videoOutput connectionWithMediaType:AVMediaTypeVideo]; >+ AVCaptureConnectionType* connection = [videoOutput connectionWithMediaType: getAVMediaTypeVideo()]; > return connection ? sensorOrientation([connection videoOrientation]) : 0; > } > >@@ -388,7 +434,7 @@ bool AVVideoCaptureSource::setupSession() > > ALWAYS_LOG_IF(loggerPtr(), LOGIDENTIFIER); > >- m_session = adoptNS([PAL::allocAVCaptureSessionInstance() init]); >+ m_session = adoptNS([allocAVCaptureSessionInstance() init]); > [m_session addObserver:m_objcObserver.get() forKeyPath:@"running" options:NSKeyValueObservingOptionNew context:(void *)nil]; > > [m_session beginConfiguration]; >@@ -401,10 +447,10 @@ bool AVVideoCaptureSource::setupSession() > return success; > } > >-AVFrameRateRange* AVVideoCaptureSource::frameDurationForFrameRate(double rate) >+AVFrameRateRangeType* AVVideoCaptureSource::frameDurationForFrameRate(double rate) > { >- AVFrameRateRange *bestFrameRateRange = nil; >- for (AVFrameRateRange *frameRateRange in [[device() activeFormat] videoSupportedFrameRateRanges]) { >+ AVFrameRateRangeType *bestFrameRateRange = nil; >+ for (AVFrameRateRangeType *frameRateRange in [[device() activeFormat] videoSupportedFrameRateRanges]) { > if (frameRateRangeIncludesRate({ [frameRateRange minFrameRate], [frameRateRange maxFrameRate] }, rate)) { > if (!bestFrameRateRange || CMTIME_COMPARE_INLINE([frameRateRange minFrameDuration], >, [bestFrameRateRange minFrameDuration])) > bestFrameRateRange = frameRateRange; >@@ -426,7 +472,7 @@ bool AVVideoCaptureSource::setupCaptureSession() > #endif > > NSError *error = nil; >- RetainPtr<AVCaptureDeviceInput> videoIn = adoptNS([PAL::allocAVCaptureDeviceInputInstance() initWithDevice:device() error:&error]); >+ RetainPtr<AVCaptureDeviceInputType> videoIn = adoptNS([allocAVCaptureDeviceInputInstance() initWithDevice:device() error:&error]); > if (error) { > ERROR_LOG_IF(loggerPtr(), LOGIDENTIFIER, "failed to allocate AVCaptureDeviceInput ", [[error localizedDescription] UTF8String]); > return false; >@@ -438,7 +484,7 @@ bool AVVideoCaptureSource::setupCaptureSession() > } > [session() addInput:videoIn.get()]; > >- m_videoOutput = adoptNS([PAL::allocAVCaptureVideoDataOutputInstance() init]); >+ m_videoOutput = adoptNS([allocAVCaptureVideoDataOutputInstance() init]); > auto settingsDictionary = adoptNS([[NSMutableDictionary alloc] initWithObjectsAndKeys: [NSNumber numberWithInt:avVideoCapturePixelBufferFormat()], kCVPixelBufferPixelFormatTypeKey, nil]); > > [m_videoOutput setVideoSettings:settingsDictionary.get()]; >@@ -522,7 +568,7 @@ void AVVideoCaptureSource::processNewFrame(Ref<MediaSample>&& sample) > dispatchMediaSampleToObservers(WTFMove(sample)); > } > >-void AVVideoCaptureSource::captureOutputDidOutputSampleBufferFromConnection(AVCaptureOutput*, CMSampleBufferRef sampleBuffer, AVCaptureConnection* captureConnection) >+void 
AVVideoCaptureSource::captureOutputDidOutputSampleBufferFromConnection(AVCaptureOutputType*, CMSampleBufferRef sampleBuffer, AVCaptureConnectionType* captureConnection) > { > if (m_framesToDropAtStartup && m_framesToDropAtStartup--) > return; >@@ -573,7 +619,7 @@ bool AVVideoCaptureSource::interrupted() const > void AVVideoCaptureSource::generatePresets() > { > Vector<Ref<VideoPreset>> presets; >- for (AVCaptureDeviceFormat* format in [device() formats]) { >+ for (AVCaptureDeviceFormatType* format in [device() formats]) { > > CMVideoDimensions dimensions = CMVideoFormatDescriptionGetDimensions(format.formatDescription); > IntSize size = { dimensions.width, dimensions.height }; >@@ -584,7 +630,7 @@ void AVVideoCaptureSource::generatePresets() > continue; > > Vector<FrameRateRange> frameRates; >- for (AVFrameRateRange* range in [format videoSupportedFrameRateRanges]) >+ for (AVFrameRateRangeType *range in [format videoSupportedFrameRateRanges]) > frameRates.append({ range.minFrameRate, range.maxFrameRate}); > > presets.append(AVVideoPreset::create(size, WTFMove(frameRates), format)); >@@ -666,7 +712,7 @@ void AVVideoCaptureSource::deviceDisconnected(RetainPtr<NSNotification> notifica > [center addObserver:self selector:@selector(deviceConnectedDidChange:) name:AVCaptureDeviceWasDisconnectedNotification object:nil]; > > #if PLATFORM(IOS_FAMILY) >- AVCaptureSession* session = m_callback->session(); >+ AVCaptureSessionType* session = m_callback->session(); > [center addObserver:self selector:@selector(sessionRuntimeError:) name:AVCaptureSessionRuntimeErrorNotification object:session]; > [center addObserver:self selector:@selector(beginSessionInterrupted:) name:AVCaptureSessionWasInterruptedNotification object:session]; > [center addObserver:self selector:@selector(endSessionInterrupted:) name:AVCaptureSessionInterruptionEndedNotification object:session]; >@@ -678,7 +724,7 @@ void AVVideoCaptureSource::deviceDisconnected(RetainPtr<NSNotification> notifica > [[NSNotificationCenter defaultCenter] removeObserver:self]; > } > >-- (void)captureOutput:(AVCaptureOutput*)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection*)connection >+- (void)captureOutput:(AVCaptureOutputType*)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnectionType*)connection > { > if (!m_callback) > return; >diff --git a/Source/WebCore/platform/mediastream/mac/MockRealtimeVideoSourceMac.mm b/Source/WebCore/platform/mediastream/mac/MockRealtimeVideoSourceMac.mm >index 5c78987bb73a7868b2ac7b9bea0e3b4bf577c4b4..04759013b0c54914a13d9b0e79012eac1f3883b0 100644 >--- a/Source/WebCore/platform/mediastream/mac/MockRealtimeVideoSourceMac.mm >+++ b/Source/WebCore/platform/mediastream/mac/MockRealtimeVideoSourceMac.mm >@@ -46,8 +46,8 @@ > #import <QuartzCore/CATransaction.h> > #import <objc/runtime.h> > >-#import "CoreVideoSoftLink.h" > #import <pal/cf/CoreMediaSoftLink.h> >+#import "CoreVideoSoftLink.h" > > namespace WebCore { > using namespace PAL; >diff --git a/Source/WebKit/Shared/ios/WebIconUtilities.mm b/Source/WebKit/Shared/ios/WebIconUtilities.mm >index 792c7de9b8ddad6d59fbe0ff9d8334aa4574dd02..470eca4567a0329273b3bf5c98b7dfd2da627416 100644 >--- a/Source/WebKit/Shared/ios/WebIconUtilities.mm >+++ b/Source/WebKit/Shared/ios/WebIconUtilities.mm >@@ -34,11 +34,14 @@ > #import <CoreMedia/CoreMedia.h> > #import <ImageIO/ImageIO.h> > #import <MobileCoreServices/MobileCoreServices.h> >+#import <pal/cf/CoreMediaSoftLink.h> > #import 
<wtf/MathExtras.h> > #import <wtf/RetainPtr.h> >+#import <wtf/SoftLinking.h> > >-#import <pal/cf/CoreMediaSoftLink.h> >-#import <pal/cocoa/AVFoundationSoftLink.h> >+SOFT_LINK_FRAMEWORK(AVFoundation); >+SOFT_LINK_CLASS(AVFoundation, AVAssetImageGenerator); >+SOFT_LINK_CLASS(AVFoundation, AVURLAsset); > > namespace WebKit { > >@@ -117,8 +120,8 @@ UIImage* iconForVideoFile(NSURL *file) > { > ASSERT_ARG(file, [file isFileURL]); > >- RetainPtr<AVURLAsset> asset = adoptNS([PAL::allocAVURLAssetInstance() initWithURL:file options:nil]); >- RetainPtr<AVAssetImageGenerator> generator = adoptNS([PAL::allocAVAssetImageGeneratorInstance() initWithAsset:asset.get()]); >+ RetainPtr<AVURLAsset> asset = adoptNS([allocAVURLAssetInstance() initWithURL:file options:nil]); >+ RetainPtr<AVAssetImageGenerator> generator = adoptNS([allocAVAssetImageGeneratorInstance() initWithAsset:asset.get()]); > [generator setAppliesPreferredTrackTransform:YES]; > > NSError *error = nil; >diff --git a/Source/WebKit/Shared/mac/WebCoreArgumentCodersMac.mm b/Source/WebKit/Shared/mac/WebCoreArgumentCodersMac.mm >index 5ba52c90aeb0fead6f448d7086f59a2ae6e9e952..21c006776a66741ab94ae69180bf6b410f994930 100644 >--- a/Source/WebKit/Shared/mac/WebCoreArgumentCodersMac.mm >+++ b/Source/WebKit/Shared/mac/WebCoreArgumentCodersMac.mm >@@ -44,8 +44,11 @@ > #if ENABLE(WIRELESS_PLAYBACK_TARGET) > #import <WebCore/MediaPlaybackTargetContext.h> > #import <objc/runtime.h> >+#import <pal/spi/mac/AVFoundationSPI.h> >+#import <wtf/SoftLinking.h> > >-#import <pal/cocoa/AVFoundationSoftLink.h> >+SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >+SOFT_LINK_CLASS(AVFoundation, AVOutputContext) > #endif > > namespace IPC { >@@ -577,16 +580,16 @@ bool ArgumentCoder<WebCore::ContentFilterUnblockHandler>::decode(Decoder& decode > > void ArgumentCoder<WebCore::MediaPlaybackTargetContext>::encodePlatformData(Encoder& encoder, const WebCore::MediaPlaybackTargetContext& target) > { >- if ([PAL::getAVOutputContextClass() conformsToProtocol:@protocol(NSSecureCoding)]) >+ if ([getAVOutputContextClass() conformsToProtocol:@protocol(NSSecureCoding)]) > encoder << target.avOutputContext(); > } > > bool ArgumentCoder<WebCore::MediaPlaybackTargetContext>::decodePlatformData(Decoder& decoder, WebCore::MediaPlaybackTargetContext& target) > { >- if (![PAL::getAVOutputContextClass() conformsToProtocol:@protocol(NSSecureCoding)]) >+ if (![getAVOutputContextClass() conformsToProtocol:@protocol(NSSecureCoding)]) > return false; > >- auto context = IPC::decode<AVOutputContext>(decoder, PAL::getAVOutputContextClass()); >+ auto context = IPC::decode<AVOutputContext>(decoder, getAVOutputContextClass()); > if (!context) > return false; > >diff --git a/Source/WebKit/UIProcess/Cocoa/UIDelegate.mm b/Source/WebKit/UIProcess/Cocoa/UIDelegate.mm >index a51a22362384151ab11d4894369f8a18580067da..5eaa27175f0f1ea455512b9f47c48908296728d0 100644 >--- a/Source/WebKit/UIProcess/Cocoa/UIDelegate.mm >+++ b/Source/WebKit/UIProcess/Cocoa/UIDelegate.mm >@@ -60,8 +60,12 @@ > #if HAVE(AUTHORIZATION_STATUS_FOR_MEDIA_TYPE) > #import <AVFoundation/AVCaptureDevice.h> > #import <AVFoundation/AVMediaFormat.h> >+#import <wtf/SoftLinking.h> > >-#import <pal/cocoa/AVFoundationSoftLink.h> >+SOFT_LINK_FRAMEWORK(AVFoundation); >+SOFT_LINK_CLASS(AVFoundation, AVCaptureDevice); >+SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeAudio, NSString *); >+SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeVideo, NSString *); > #endif > > namespace WebKit { >@@ -943,7 +947,7 @@ void 
UIDelegate::UIClient::decidePolicyForUserMediaPermissionRequest(WebPageProx > requestUserMediaAuthorizationForFrame(frame, topLevelOrigin, protectedRequest, (id <WKUIDelegatePrivate>)m_uiDelegate.m_delegate.get(), *webView.get()); > return; > } >- AVAuthorizationStatus cameraAuthorizationStatus = usingMockCaptureDevices ? AVAuthorizationStatusAuthorized : [PAL::getAVCaptureDeviceClass() authorizationStatusForMediaType:AVMediaTypeVideo]; >+ AVAuthorizationStatus cameraAuthorizationStatus = usingMockCaptureDevices ? AVAuthorizationStatusAuthorized : [getAVCaptureDeviceClass() authorizationStatusForMediaType:getAVMediaTypeVideo()]; > switch (cameraAuthorizationStatus) { > case AVAuthorizationStatusAuthorized: > requestUserMediaAuthorizationForFrame(frame, topLevelOrigin, protectedRequest, (id <WKUIDelegatePrivate>)m_uiDelegate.m_delegate.get(), *webView.get()); >@@ -961,13 +965,13 @@ void UIDelegate::UIClient::decidePolicyForUserMediaPermissionRequest(WebPageProx > requestUserMediaAuthorizationForFrame(frame, topLevelOrigin, protectedRequest, (id <WKUIDelegatePrivate>)m_uiDelegate.m_delegate.get(), *webView.get()); > }); > >- [PAL::getAVCaptureDeviceClass() requestAccessForMediaType:AVMediaTypeVideo completionHandler:decisionHandler.get()]; >+ [getAVCaptureDeviceClass() requestAccessForMediaType:getAVMediaTypeVideo() completionHandler:decisionHandler.get()]; > break; > } > }); > > if (requiresAudioCapture) { >- AVAuthorizationStatus microphoneAuthorizationStatus = usingMockCaptureDevices ? AVAuthorizationStatusAuthorized : [PAL::getAVCaptureDeviceClass() authorizationStatusForMediaType:AVMediaTypeAudio]; >+ AVAuthorizationStatus microphoneAuthorizationStatus = usingMockCaptureDevices ? AVAuthorizationStatusAuthorized : [getAVCaptureDeviceClass() authorizationStatusForMediaType:getAVMediaTypeAudio()]; > switch (microphoneAuthorizationStatus) { > case AVAuthorizationStatusAuthorized: > requestCameraAuthorization(); >@@ -985,7 +989,7 @@ void UIDelegate::UIClient::decidePolicyForUserMediaPermissionRequest(WebPageProx > requestCameraAuthorization(); > }); > >- [PAL::getAVCaptureDeviceClass() requestAccessForMediaType:AVMediaTypeAudio completionHandler:decisionHandler.get()]; >+ [getAVCaptureDeviceClass() requestAccessForMediaType:getAVMediaTypeAudio() completionHandler:decisionHandler.get()]; > break; > } > } else >diff --git a/Source/WebKit/WebProcess/WebPage/RemoteLayerTree/PlatformCALayerRemoteCustom.mm b/Source/WebKit/WebProcess/WebPage/RemoteLayerTree/PlatformCALayerRemoteCustom.mm >index 52a9bd4dcc91758cdf000dbc0b4de727ac856cd8..5bdfb6dddbf99142f4229214dfb0d5844f1f3dd1 100644 >--- a/Source/WebKit/WebProcess/WebPage/RemoteLayerTree/PlatformCALayerRemoteCustom.mm >+++ b/Source/WebKit/WebProcess/WebPage/RemoteLayerTree/PlatformCALayerRemoteCustom.mm >@@ -35,8 +35,10 @@ > #import <WebCore/PlatformCALayerCocoa.h> > #import <WebCore/WebCoreCALayerExtras.h> > #import <wtf/RetainPtr.h> >+#import <wtf/SoftLinking.h> > >-#import <pal/cocoa/AVFoundationSoftLink.h> >+SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) >+SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVPlayerLayer) > > namespace WebKit { > using namespace WebCore; >@@ -103,8 +105,8 @@ Ref<WebCore::PlatformCALayer> PlatformCALayerRemoteCustom::clone(PlatformCALayer > > if (layerType() == LayerTypeAVPlayerLayer) { > >- if ([platformLayer() isKindOfClass:PAL::getAVPlayerLayerClass()]) { >- clonedLayer = adoptNS([PAL::allocAVPlayerLayerInstance() init]); >+ if ([platformLayer() isKindOfClass:getAVPlayerLayerClass()]) { >+ clonedLayer = 
adoptNS([allocAVPlayerLayerInstance() init]); > > AVPlayerLayer *destinationPlayerLayer = static_cast<AVPlayerLayer *>(clonedLayer.get()); > AVPlayerLayer *sourcePlayerLayer = static_cast<AVPlayerLayer *>(platformLayer()); >diff --git a/Source/WebKitLegacy/mac/WebView/WebVideoFullscreenController.mm b/Source/WebKitLegacy/mac/WebView/WebVideoFullscreenController.mm >index 23aac34426ba03ab93b4e963f1e82f851404a1cb..c6e07164caf05ca4819e30a786dcc8ad83d60533 100644 >--- a/Source/WebKitLegacy/mac/WebView/WebVideoFullscreenController.mm >+++ b/Source/WebKitLegacy/mac/WebView/WebVideoFullscreenController.mm >@@ -36,11 +36,13 @@ > #import <objc/runtime.h> > #import <pal/system/SleepDisabler.h> > #import <wtf/RetainPtr.h> >- >-#import <pal/cocoa/AVFoundationSoftLink.h> >+#import <wtf/SoftLinking.h> > > ALLOW_DEPRECATED_DECLARATIONS_BEGIN > >+SOFT_LINK_FRAMEWORK(AVFoundation) >+SOFT_LINK_CLASS(AVFoundation, AVPlayerLayer) >+ > @interface WebVideoFullscreenWindow : NSWindow<NSAnimationDelegate> { > SEL _controllerActionOnAnimationEnd; > WebWindowScaleAnimation *_fullscreenAnimation; // (retain) >@@ -116,7 +118,7 @@ ALLOW_DEPRECATED_DECLARATIONS_BEGIN > > auto contentView = [[self fullscreenWindow] contentView]; > >- auto layer = adoptNS([PAL::allocAVPlayerLayerInstance() init]); >+ auto layer = adoptNS([allocAVPlayerLayerInstance() init]); > [layer setPlayer:player]; > > [contentView setLayer:layer.get()]; >@@ -145,7 +147,7 @@ ALLOW_DEPRECATED_DECLARATIONS_BEGIN > - (void)windowDidExitFullscreen > { > CALayer *layer = [[[self window] contentView] layer]; >- if ([layer isKindOfClass:PAL::getAVPlayerLayerClass()]) >+ if ([layer isKindOfClass:getAVPlayerLayerClass()]) > [[(AVPlayerLayer *)layer player] removeObserver:self forKeyPath:@"rate"]; > > [self clearFadeAnimation]; >diff --git a/Tools/ChangeLog b/Tools/ChangeLog >index ad921a5b3e2bd992a3a14c85c7aff2d923d06ece..a2e60f5508b18a72a5c877d41925566868b59a75 100644 >--- a/Tools/ChangeLog >+++ b/Tools/ChangeLog >@@ -1,3 +1,17 @@ >+2019-04-25 Commit Queue <commit-queue@webkit.org> >+ >+ Unreviewed, rolling out r244627. >+ https://bugs.webkit.org/show_bug.cgi?id=197282 >+ >+ Causing internal build failures (Requested by ShawnRoberts on >+ #webkit). 
>+ >+ Reverted changeset: >+ >+ "Create AVFoundationSoftLink.{h,mm} to reduce duplicate code" >+ https://bugs.webkit.org/show_bug.cgi?id=197171 >+ https://trac.webkit.org/changeset/244627 >+ > 2019-04-24 Carlos Garcia Campos <cgarcia@igalia.com> > > [GTK] Hardcoded text color in input fields >diff --git a/Tools/TestWebKitAPI/TestWebKitAPI.xcodeproj/project.pbxproj b/Tools/TestWebKitAPI/TestWebKitAPI.xcodeproj/project.pbxproj >index 180f19bfa96c2bed54deb269e44fb07d15fd5ca5..08f1661ac5084ec65d3dc8a3312ef447462c37fb 100644 >--- a/Tools/TestWebKitAPI/TestWebKitAPI.xcodeproj/project.pbxproj >+++ b/Tools/TestWebKitAPI/TestWebKitAPI.xcodeproj/project.pbxproj >@@ -23,7 +23,6 @@ > > /* Begin PBXBuildFile section */ > 041A1E34216FFDBC00789E0A /* PublicSuffix.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 041A1E33216FFDBC00789E0A /* PublicSuffix.cpp */; }; >- 0711DF52226A95FC003DD2F7 /* AVFoundationSoftLinkTest.mm in Sources */ = {isa = PBXBuildFile; fileRef = 0711DF51226A95FB003DD2F7 /* AVFoundationSoftLinkTest.mm */; }; > 07492B3B1DF8B14C00633DE1 /* EnumerateMediaDevices.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 07492B3A1DF8AE2D00633DE1 /* EnumerateMediaDevices.cpp */; }; > 07492B3C1DF8B86600633DE1 /* enumerateMediaDevices.html in Copy Resources */ = {isa = PBXBuildFile; fileRef = 07492B391DF8ADA400633DE1 /* enumerateMediaDevices.html */; }; > 074994421EA5034B000DA44E /* getUserMedia.html in Copy Resources */ = {isa = PBXBuildFile; fileRef = 4A410F4D19AF7BEF002EBAB5 /* getUserMedia.html */; }; >@@ -1339,7 +1338,6 @@ > /* Begin PBXFileReference section */ > 00CD9F6215BE312C002DA2CE /* BackForwardList.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = BackForwardList.mm; sourceTree = "<group>"; }; > 041A1E33216FFDBC00789E0A /* PublicSuffix.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = PublicSuffix.cpp; sourceTree = "<group>"; }; >- 0711DF51226A95FB003DD2F7 /* AVFoundationSoftLinkTest.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = AVFoundationSoftLinkTest.mm; sourceTree = "<group>"; }; > 07492B391DF8ADA400633DE1 /* enumerateMediaDevices.html */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = text.html; path = enumerateMediaDevices.html; sourceTree = "<group>"; }; > 07492B3A1DF8AE2D00633DE1 /* EnumerateMediaDevices.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = EnumerateMediaDevices.cpp; sourceTree = "<group>"; }; > 0766DD1F1A5AD5200023E3BB /* PendingAPIRequestURL.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = PendingAPIRequestURL.cpp; sourceTree = "<group>"; }; >@@ -3689,7 +3687,6 @@ > CD89D0371C4EDB1300040A04 /* cocoa */ = { > isa = PBXGroup; > children = ( >- 0711DF51226A95FB003DD2F7 /* AVFoundationSoftLinkTest.mm */, > 751B05D51F8EAC1A0028A09E /* DatabaseTrackerTest.mm */, > 5769C50A1D9B0001000847FB /* SerializedCryptoKeyWrap.mm */, > A17991861E1C994E00A505ED /* SharedBuffer.mm */, >@@ -4031,7 +4028,6 @@ > CDC8E48D1BC5CB4500594FEC /* AudioSessionCategoryIOS.mm in Sources */, > 7C83E0B91D0A64F100FEBCF3 /* AutoLayoutIntegration.mm in Sources */, > 07CD32F62065B5430064A4BE /* AVFoundationPreference.mm in Sources */, >- 0711DF52226A95FC003DD2F7 /* AVFoundationSoftLinkTest.mm in Sources */, > 7CCE7EB51A411A7E00447C4C /* BackForwardList.mm in Sources */, > 1C7FEB20207C0F2E00D23278 /* BackgroundColor.mm in Sources */, > 
374B7A601DF36EEE00ACCB6C /* BundleEditingDelegate.mm in Sources */, >diff --git a/Tools/TestWebKitAPI/Tests/WebCore/cocoa/AVFoundationSoftLinkTest.mm b/Tools/TestWebKitAPI/Tests/WebCore/cocoa/AVFoundationSoftLinkTest.mm >deleted file mode 100644 >index 39c8dfacb08bdecf3c72adb89f5644d9dc092e1e..0000000000000000000000000000000000000000 >--- a/Tools/TestWebKitAPI/Tests/WebCore/cocoa/AVFoundationSoftLinkTest.mm >+++ /dev/null >@@ -1,194 +0,0 @@ >-/* >- * Copyright (C) 2019 Apple Inc. All rights reserved. >- * >- * Redistribution and use in source and binary forms, with or without >- * modification, are permitted provided that the following conditions >- * are met: >- * 1. Redistributions of source code must retain the above copyright >- * notice, this list of conditions and the following disclaimer. >- * 2. Redistributions in binary form must reproduce the above copyright >- * notice, this list of conditions and the following disclaimer in the >- * documentation and/or other materials provided with the distribution. >- * >- * THIS SOFTWARE IS PROVIDED BY APPLE INC. AND ITS CONTRIBUTORS ``AS IS'' >- * AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, >- * THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR >- * PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL APPLE INC. OR ITS CONTRIBUTORS >- * BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR >- * CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF >- * SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS >- * INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN >- * CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) >- * ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF >- * THE POSSIBILITY OF SUCH DAMAGE. 
>- */ >- >-#include "config.h" >- >-#if PLATFORM(COCOA) >- >-#import <pal/cocoa/AVFoundationSoftLink.h> >- >-namespace TestWebKitAPI { >- >-TEST(AVFoundationSoftLink, Classes) >-{ >- EXPECT_NE(PAL::getAVPlayerClass(), nullptr); >- EXPECT_NE(PAL::getAVPlayerItemClass(), nullptr); >- EXPECT_NE(PAL::getAVPlayerItemVideoOutputClass(), nullptr); >- EXPECT_NE(PAL::getAVPlayerLayerClass(), nullptr); >- EXPECT_NE(PAL::getAVURLAssetClass(), nullptr); >- EXPECT_NE(PAL::getAVAssetImageGeneratorClass(), nullptr); >- EXPECT_NE(PAL::getAVMetadataItemClass(), nullptr); >- EXPECT_NE(PAL::getAVAssetCacheClass(), nullptr); >- EXPECT_NE(PAL::getAVPlayerItemLegibleOutputClass(), nullptr); >- EXPECT_NE(PAL::getAVMediaSelectionGroupClass(), nullptr); >- EXPECT_NE(PAL::getAVMediaSelectionOptionClass(), nullptr); >- EXPECT_NE(PAL::getAVOutputContextClass(), nullptr); >- EXPECT_NE(PAL::getAVAssetReaderClass(), nullptr); >- EXPECT_NE(PAL::getAVAssetWriterClass(), nullptr); >- EXPECT_NE(PAL::getAVAssetWriterInputClass(), nullptr); >- EXPECT_NE(PAL::getAVCaptureSessionClass(), nullptr); >- EXPECT_NE(PAL::getAVCaptureConnectionClass(), nullptr); >- EXPECT_NE(PAL::getAVCaptureDeviceClass(), nullptr); >- EXPECT_NE(PAL::getAVCaptureDeviceFormatClass(), nullptr); >- EXPECT_NE(PAL::getAVCaptureDeviceInputClass(), nullptr); >- EXPECT_NE(PAL::getAVCaptureOutputClass(), nullptr); >- EXPECT_NE(PAL::getAVCaptureVideoDataOutputClass(), nullptr); >- EXPECT_NE(PAL::getAVFrameRateRangeClass(), nullptr); >- EXPECT_NE(PAL::getAVMutableAudioMixClass(), nullptr); >- EXPECT_NE(PAL::getAVMutableAudioMixInputParametersClass(), nullptr); >- >-#if HAVE(AVSTREAMSESSION) && ENABLE(LEGACY_ENCRYPTED_MEDIA) >- EXPECT_NE(PAL::getAVStreamSessionClass(), nullptr); >- EXPECT_NE(PAL::getAVStreamDataParserClass(), nullptr); >-#endif >- >-#if PLATFORM(IOS_FAMILY) >- EXPECT_NE(PAL::getAVPersistableContentKeyRequestClass(), nullptr); >- EXPECT_NE(PAL::getAVAudioSessionClass(), nullptr); >- EXPECT_NE(PAL::getAVSpeechSynthesizerClass(), nullptr); >- EXPECT_NE(PAL::getAVSpeechUtteranceClass(), nullptr); >- EXPECT_NE(PAL::getAVSpeechSynthesisVoiceClass(), nullptr); >-#endif >- >-#if HAVE(MEDIA_PLAYER) && !PLATFORM(WATCHOS) >- EXPECT_NE(PAL::getAVRouteDetectorClass(), nullptr); >-#endif >- >- EXPECT_NE(PAL::getAVContentKeyResponseClass(), nullptr); >- EXPECT_NE(PAL::getAVContentKeySessionClass(), nullptr); >- EXPECT_NE(PAL::getAVAssetResourceLoadingRequestClass(), nullptr); >- EXPECT_NE(PAL::getAVAssetReaderSampleReferenceOutputClass(), nullptr); >- EXPECT_NE(PAL::getAVVideoPerformanceMetricsClass(), nullptr); >- EXPECT_NE(PAL::getAVSampleBufferAudioRendererClass(), nullptr); >- EXPECT_NE(PAL::getAVSampleBufferDisplayLayerClass(), nullptr); >- EXPECT_NE(PAL::getAVSampleBufferRenderSynchronizerClass(), nullptr); >-} >- >- >-TEST(AVFoundationSoftLink, Constants) >-{ >- EXPECT_TRUE([AVAudioTimePitchAlgorithmSpectral isEqualToString:@"Spectral"]); >- EXPECT_TRUE([AVAudioTimePitchAlgorithmVarispeed isEqualToString:@"Varispeed"]); >- EXPECT_TRUE([AVMediaTypeClosedCaption isEqualToString:@"clcp"]); >- EXPECT_TRUE([AVMediaTypeVideo isEqualToString:@"vide"]); >- EXPECT_TRUE([AVMediaTypeAudio isEqualToString:@"soun"]); >- EXPECT_TRUE([AVMediaTypeMuxed isEqualToString:@"muxx"]); >- EXPECT_TRUE([AVMediaTypeMetadata isEqualToString:@"meta"]); >- EXPECT_TRUE([AVAssetImageGeneratorApertureModeCleanAperture isEqualToString:@"CleanAperture"]); >- EXPECT_TRUE([AVStreamingKeyDeliveryContentKeyType isEqualToString:@"com.apple.streamingkeydelivery.contentkey"]); >- 
EXPECT_TRUE([AVMediaCharacteristicContainsOnlyForcedSubtitles isEqualToString:@"public.subtitles.forced-only"]); >- EXPECT_TRUE([AVMetadataCommonKeyTitle isEqualToString:@"title"]); >- EXPECT_TRUE([AVMetadataKeySpaceCommon isEqualToString:@"comn"]); >- EXPECT_TRUE([AVMediaTypeSubtitle isEqualToString:@"sbtl"]); >- EXPECT_TRUE([AVMediaCharacteristicIsMainProgramContent isEqualToString:@"public.main-program-content"]); >- EXPECT_TRUE([AVMediaCharacteristicEasyToRead isEqualToString:@"public.easy-to-read"]); >- EXPECT_TRUE([AVFileTypeMPEG4 isEqualToString:@"public.mpeg-4"]); >- EXPECT_TRUE([AVVideoCodecH264 isEqualToString:@"avc1"]); >- EXPECT_TRUE([AVVideoExpectedSourceFrameRateKey isEqualToString:@"ExpectedFrameRate"]); >- EXPECT_TRUE([AVVideoProfileLevelKey isEqualToString:@"ProfileLevel"]); >- EXPECT_TRUE([AVVideoAverageBitRateKey isEqualToString:@"AverageBitRate"]); >- EXPECT_TRUE([AVVideoMaxKeyFrameIntervalKey isEqualToString:@"MaxKeyFrameInterval"]); >- EXPECT_TRUE([AVVideoProfileLevelH264MainAutoLevel isEqualToString:@"H264_Main_AutoLevel"]); >- EXPECT_TRUE([AVOutOfBandAlternateTrackDisplayNameKey isEqualToString:@"MediaSelectionOptionsName"]); >- EXPECT_TRUE([AVOutOfBandAlternateTrackExtendedLanguageTagKey isEqualToString:@"MediaSelectionOptionsExtendedLanguageTag"]); >- EXPECT_TRUE([AVOutOfBandAlternateTrackIsDefaultKey isEqualToString:@"MediaSelectionOptionsIsDefault"]); >- EXPECT_TRUE([AVOutOfBandAlternateTrackMediaCharactersticsKey isEqualToString:@"MediaSelectionOptionsTaggedMediaCharacteristics"]); >- EXPECT_TRUE([AVOutOfBandAlternateTrackIdentifierKey isEqualToString:@"MediaSelectionOptionsClientIdentifier"]); >- EXPECT_TRUE([AVOutOfBandAlternateTrackSourceKey isEqualToString:@"MediaSelectionOptionsURL"]); >- EXPECT_TRUE([AVMediaCharacteristicDescribesMusicAndSoundForAccessibility isEqualToString:@"public.accessibility.describes-music-and-sound"]); >- EXPECT_TRUE([AVMediaCharacteristicTranscribesSpokenDialogForAccessibility isEqualToString:@"public.accessibility.transcribes-spoken-dialog"]); >- EXPECT_TRUE([AVMediaCharacteristicIsAuxiliaryContent isEqualToString:@"public.auxiliary-content"]); >- EXPECT_TRUE([AVMediaCharacteristicDescribesVideoForAccessibility isEqualToString:@"public.accessibility.describes-video"]); >- EXPECT_TRUE([AVMetadataKeySpaceQuickTimeUserData isEqualToString:@"udta"]); >- EXPECT_TRUE([AVMetadataKeySpaceQuickTimeMetadata isEqualToString:@"mdta"]); >- EXPECT_TRUE([AVMetadataKeySpaceiTunes isEqualToString:@"itsk"]); >- EXPECT_TRUE([AVMetadataKeySpaceID3 isEqualToString:@"org.id3"]); >- EXPECT_TRUE([AVMetadataKeySpaceISOUserData isEqualToString:@"uiso"]); >- >- if (PAL::canLoad_AVFoundation_AVEncoderBitRateKey()) >- EXPECT_TRUE([AVEncoderBitRateKey isEqualToString:@"AVEncoderBitRateKey"]); >- if (PAL::canLoad_AVFoundation_AVFormatIDKey()) >- EXPECT_TRUE([AVFormatIDKey isEqualToString:@"AVFormatIDKey"]); >- if (PAL::canLoad_AVFoundation_AVNumberOfChannelsKey()) >- EXPECT_TRUE([AVNumberOfChannelsKey isEqualToString:@"AVNumberOfChannelsKey"]); >- if (PAL::canLoad_AVFoundation_AVSampleRateKey()) >- EXPECT_TRUE([AVSampleRateKey isEqualToString:@"AVSampleRateKey"]); >- >-#if (PLATFORM(MAC) && __MAC_OS_X_VERSION_MIN_REQUIRED >= 101300) || (PLATFORM(IOS) && __IPHONE_OS_VERSION_MIN_REQUIRED >= 110000) || (PLATFORM(WATCHOS) && __WATCH_OS_VERSION_MIN_REQUIRED >= 40000) || (PLATFORM(APPLETV) && __TV_OS_VERSION_MIN_REQUIRED >= 110000) >- EXPECT_TRUE(PAL::canLoad_AVFoundation_AVURLAssetOutOfBandMIMETypeKey()); >- EXPECT_TRUE([AVURLAssetOutOfBandMIMETypeKey 
isEqualToString:@"AVURLAssetOutOfBandMIMETypeKey"]); >-#endif >- >-#if (PLATFORM(MAC) && __MAC_OS_X_VERSION_MIN_REQUIRED >= 101400) || (PLATFORM(IOS) && __IPHONE_OS_VERSION_MIN_REQUIRED >= 120000) || (PLATFORM(WATCHOS) && __WATCH_OS_VERSION_MIN_REQUIRED >= 50000) || (PLATFORM(APPLETV) && __TV_OS_VERSION_MIN_REQUIRED >= 120000) >- EXPECT_TRUE(PAL::canLoad_AVFoundation_AVURLAssetUseClientURLLoadingExclusively()); >- EXPECT_TRUE([AVURLAssetUseClientURLLoadingExclusively isEqualToString:@"AVURLAssetUseClientURLLoadingExclusively"]); >-#endif >- >-#if ENABLE(ENCRYPTED_MEDIA) && HAVE(AVCONTENTKEYSESSION) >- EXPECT_TRUE(PAL::canLoad_AVFoundation_AVContentKeySystemFairPlayStreaming()); >- EXPECT_TRUE([AVContentKeySystemFairPlayStreaming isEqualToString:@"FairPlayStreaming"]); >-#endif >- >-#if ENABLE(LEGACY_ENCRYPTED_MEDIA) && ENABLE(MEDIA_SOURCE) >- EXPECT_TRUE(PAL::canLoad_AVFoundation_AVContentKeyRequestProtocolVersionsKey()); >- EXPECT_TRUE([AVContentKeyRequestProtocolVersionsKey isEqualToString:@"ProtocolVersionsKey"]); >-#endif >- >-#if (PLATFORM(MAC) && __MAC_OS_X_VERSION_MIN_REQUIRED >= 101500) || (PLATFORM(IOS) && __IPHONE_OS_VERSION_MIN_REQUIRED >= 130000) || (PLATFORM(WATCHOS) && __WATCH_OS_VERSION_MIN_REQUIRED >= 60000) || (PLATFORM(APPLETV) && __TV_OS_VERSION_MIN_REQUIRED >= 130000) >- EXPECT_TRUE(PAL::canLoad_AVFoundation_AVVideoCodecTypeHEVCWithAlpha()); >- EXPECT_TRUE([AVVideoCodecTypeHEVCWithAlpha isEqualToString:@"muxa"]); >-#endif >- >-#if PLATFORM(MAC) >- EXPECT_TRUE([AVStreamDataParserContentKeyRequestProtocolVersionsKey isEqualToString:@"AVContentKeyRequestProtocolVersionsKey"]); >-#endif >- >-#if PLATFORM(IOS_FAMILY) >- EXPECT_TRUE([AVURLAssetBoundNetworkInterfaceName isEqualToString:@"AVURLAssetBoundNetworkInterfaceName"]); >- EXPECT_TRUE([AVURLAssetClientBundleIdentifierKey isEqualToString:@"AVURLAssetClientBundleIdentifierKey"]); >- EXPECT_TRUE([AVCaptureSessionRuntimeErrorNotification isEqualToString:@"AVCaptureSessionRuntimeErrorNotification"]); >- EXPECT_TRUE([AVCaptureSessionWasInterruptedNotification isEqualToString:@"AVCaptureSessionWasInterruptedNotification"]); >- EXPECT_TRUE([AVCaptureSessionInterruptionEndedNotification isEqualToString:@"AVCaptureSessionInterruptionEndedNotification"]); >- EXPECT_TRUE([AVCaptureSessionInterruptionReasonKey isEqualToString:@"AVCaptureSessionInterruptionReasonKey"]); >- EXPECT_TRUE([AVCaptureSessionErrorKey isEqualToString:@"AVCaptureSessionErrorKey"]); >- EXPECT_TRUE([AVAudioSessionCategoryAmbient isEqualToString:@"AVAudioSessionCategoryAmbient"]); >- EXPECT_TRUE([AVAudioSessionCategorySoloAmbient isEqualToString:@"AVAudioSessionCategorySoloAmbient"]); >- EXPECT_TRUE([AVAudioSessionCategoryPlayback isEqualToString:@"AVAudioSessionCategoryPlayback"]); >- EXPECT_TRUE([AVAudioSessionCategoryRecord isEqualToString:@"AVAudioSessionCategoryRecord"]); >- EXPECT_TRUE([AVAudioSessionCategoryPlayAndRecord isEqualToString:@"AVAudioSessionCategoryPlayAndRecord"]); >- EXPECT_TRUE([AVAudioSessionCategoryAudioProcessing isEqualToString:@"AVAudioSessionCategoryAudioProcessing"]); >- EXPECT_TRUE([AVAudioSessionModeDefault isEqualToString:@"AVAudioSessionModeDefault"]); >- EXPECT_TRUE([AVAudioSessionModeVideoChat isEqualToString:@"AVAudioSessionModeVideoChat"]); >- EXPECT_TRUE([AVAudioSessionInterruptionNotification isEqualToString:@"AVAudioSessionInterruptionNotification"]); >- EXPECT_TRUE([AVAudioSessionInterruptionTypeKey isEqualToString:@"AVAudioSessionInterruptionTypeKey"]); >- EXPECT_TRUE([AVAudioSessionInterruptionOptionKey 
isEqualToString:@"AVAudioSessionInterruptionOptionKey"]); >- EXPECT_TRUE([AVRouteDetectorMultipleRoutesDetectedDidChangeNotification isEqualToString:@"AVRouteDetectorMultipleRoutesDetectedDidChangeNotification"]); >-#endif >-} >- >-#endif // PLATFORM(COCOA) >- >-} // namespace TestWebKitAPI >-