| Summary: | Video elements do not work as sources for TexImage2D calls in GPU Process |
|---|---|
| Product: | WebKit |
| Component: | WebGL |
| Reporter: | Kimmo Kinnunen <kkinnunen> |
| Assignee: | Peng Liu <peng.liu6> |
| Status: | RESOLVED FIXED |
| Severity: | Normal |
| Priority: | P2 |
| Keywords: | InRadar |
| Version: | WebKit Local Build |
| Hardware: | Mac |
| OS: | Unspecified |
| CC: | aakash_jain, cdumez, changseok, dino, don.olmstead, eric.carlson, esprehn+autocc, ews-watchlist, glenn, graouts, gyuyoung.kim, jer.noble, jonlee, kondapallykalyan, peng.liu6, philipj, sergio, stephan.szabo, webkit-bot-watchers-bugzilla, webkit-bug-importer |
| See Also: | https://bugs.webkit.org/show_bug.cgi?id=217211, https://bugs.webkit.org/show_bug.cgi?id=217339 |
| Bug Depends on: | 219641 |
| Bug Blocks: | 217211 |
Description

Kimmo Kinnunen 2020-10-26 06:21:22 PDT

> This is needed to support drawing web process video to remote webgl context. (transitional feature if needed)

Actually meant the inverse: this is needed to support remote video in a web process WebGL context (transitional feature, if needed).

Created attachment 420602 [details]
WIP patch
Comment on attachment 420602 [details] WIP patch

View in context: https://bugs.webkit.org/attachment.cgi?id=420602&action=review

Looks like a good start. However: I think the method is not a media player method, it is a graphics context method. Thus it might be better to organise the code around having the message in RemoteGraphicsContextGL.messages.in. For this to work, though, the WebCore part of the interface needs some existing type that it can use as the polymorphic type to the media player. IIUC, canvas2d is implemented as

    canvas2d->drawMediaPlayer(player)

so this should follow suit, e.g.

    webgl->drawMediaPlayer(player)

> Source/WebCore/platform/graphics/GraphicsContextGL.h:66
> +    virtual bool isRemoteGraphicsContextGLProxy() const { return false; }

I don't think this is such a good idea, since RemoteGraphicsContextGLProxy is not a WebCore concept. I think it needs to be something like the CV interface. The tricky part is to figure out what to pass as the communication object (e.g. the texture you now pass).

> Source/WebKit/GPUProcess/GPUConnectionToWebProcess.cpp:356
> +    ASSERT(m_remoteGraphicsContextGLMap.contains(graphicsContextGLIdentifier));

This assert (probably?) cannot hold in all cases, especially if we don't want to assert when something attacks the GPU Process (during fuzzing, for example).

> Source/WebKit/WebProcess/GPU/GPUProcessConnection.cpp:158
> +GraphicsContextGLIdentifier GPUProcessConnection::graphicsContextGLIdentifier(RemoteGraphicsContextGLProxy& remoteGraphicsContextGLProxy) const

This does not look correct. If the caller has access to the proxy, it can ask it for the identifier.

(In reply to Kimmo Kinnunen from comment #4)
> Looks like a good start. However:
> I think the method is not media player method, it is graphics context method.
> Thus it might be better to organise the code around having the message in
> the RemoteGraphicsContextGL.messages.in.
> For this to work though, the WebCore part of the interface needs some
> existing type that it can use as the polymorphic type to the media player.
> IIUC, canvas2d is implemented as
>
>     canvas2d->drawMediaPlayer(player)
>
> so this should follow suit, e.g.
>
>     webgl->drawMediaPlayer(player)

Thanks for the comments. Well, this is just a different implementation "flavor". In the current implementation (when the GPU Process is disabled), to use a video texture for WebGL surfaces, a media player provides a function copyVideoTextureToPlatformTexture() to "paint" the video texture to a given graphics context. So I chose to follow that flavor to support using video textures for surfaces while running WebGL in the GPU process.

In r268145, Wenson introduced the approach you mentioned for 2D canvas. Essentially, that approach lets the graphics context coordinate the drawing. It sounds like a good idea to have a consistent implementation for both canvas and WebGL. I will upload a new patch with that approach.

(In reply to Peng Liu from comment #5)
> Well, this is just a different implementation "flavor". In the current
> implementation (when the GPU Process is disabled), to use video texture for
> WebGL surfaces, a media player provides a function
> copyVideoTextureToPlatformTexture() to "paint" the video texture to a given
> graphics context.

Yes and no. The call visits the media player, but the eventual "paint", e.g. the copy from the media player IOSurface to the texture, is done by the graphics context (in GraphicsContextGLCV::...).

In the GPU process, this needs to be followed, and for simplicity's sake it should be done on the WebProcess side, as the RemoteMediaPlayer command sequence will be different from the RemoteGraphicsContextGL command sequence.

Created attachment 420883 [details]
The implementation based on a new IPC message in RemoteGraphicsContextGL
The patch is not finished yet.
Created attachment 420912 [details]
Patch
(In reply to Kimmo Kinnunen from comment #6)
> Yes and no. The call visits media player, but the eventual "paint", e.g.
> copy from media player iosurface to the texture is done by the graphics
> context (in GraphicsContextGLCV::...).
>
> In GPU process, this needs to be followed, and for simplicity sake, it
> should be done in the WebProcess side, as the RemoteMediaPlayer command
> sequence will be different to RemoteGraphicsContextGL command sequence.

Yes, the "painting" is done by the GraphicsContext. The patch implements MediaPlayerPrivateRemote::copyVideoTextureToPlatformTexture() in the web process with a new IPC message, but the actual painting is done in RemoteMediaPlayerProxy::copyVideoTextureToPlatformTexture() in the GPU process.

Created attachment 420918 [details]
Version A
Created attachment 420924 [details]
Version B
Kimmo, I guess you prefer patch B? :-)

Created attachment 420926 [details]
Version B
Comment on attachment 420926 [details] Version B

View in context: https://bugs.webkit.org/attachment.cgi?id=420926&action=review

Very nice :) (not a reviewer)

> Source/WebCore/html/HTMLVideoElement.cpp:303
> bool HTMLVideoElement::copyVideoTextureToPlatformTexture(GraphicsContextGL* context, PlatformGLObject texture, GCGLenum target, GCGLint level, GCGLenum internalFormat, GCGLenum format, GCGLenum type, bool premultiplyAlpha, bool flipY)

You can now remove this function from the hierarchy as it's not doing any work. So at the call site you can do:

    - if (video->copyVideoTextureToPlatformTexture(m_context.get(), texture->object(), target, level, internalformat, format, type, m_unpackPremultiplyAlpha, m_unpackFlipY)) {
    + if (video->player() && m_context->copyVideoTextureToPlatformTexture(*video->player(), texture->object(), target, level, internalformat, format, type, m_unpackPremultiplyAlpha, m_unpackFlipY)) {

(and use whatever function name you see appropriate, such as copyPlatformTextureFromMedia...)

Unfortunately for you, I got a review on bug 219641, which moves RemoteGraphicsContextGL to its own thread. I discussed with Jon and he said that we should merge that first and then adapt this media work to it. So what remains to be done in this patch is:

1) Make RemoteGraphicsContextGL ref the RemoteMediaPlayerManagerProxy. Currently the manager proxy is not a RefCountedThreadSafe, but it probably should be. I've tried to avoid reffing the GPUConnectionToWebProcess in RemoteGraphicsContextGL, since that ref is dropped when the connection is broken, but the processing thread still needs to process the commands to the end.

2) Make MediaPlayer thread safe. Currently RemoteRenderingBackend calls into MediaPlayer in a non-thread-safe manner in this call:

    context.paintFrameForMedia(*player, mediaItem.destination());

I'd perhaps not want to see that done in GraphicsContextGL. Rather, the MediaPlayer should at least mutex itself so that paintFrameForMedia() and private()->copyVideoTextureToPlatformTexture() are thread-safe.

3) Remove the passing expectations from gpu-process/TestExpectations.

Comment on attachment 420926 [details] Version B

View in context: https://bugs.webkit.org/attachment.cgi?id=420926&action=review

>> Source/WebCore/html/HTMLVideoElement.cpp:303
>> bool HTMLVideoElement::copyVideoTextureToPlatformTexture(GraphicsContextGL* context, PlatformGLObject texture, GCGLenum target, GCGLint level, GCGLenum internalFormat, GCGLenum format, GCGLenum type, bool premultiplyAlpha, bool flipY)
>
> You can now remove this function from the hierarchy as it's not doing any work.

Right! A nice cleanup.

Comment on attachment 420926 [details] Version B

View in context: https://bugs.webkit.org/attachment.cgi?id=420926&action=review

> Source/WebCore/platform/graphics/opengl/GraphicsContextGLOpenGL.cpp:259
> +    return player.playerPrivate()->copyVideoTextureToPlatformTexture(this, texture, target, level, internalFormat, format, type, premultiplyAlpha, flipY);

`playerPrivate()` *should* be a private method; the private player should be an implementation detail. Please add a method to MediaPlayer that calls the private player.
Comment on attachment 420926 [details] Version B

View in context: https://bugs.webkit.org/attachment.cgi?id=420926&action=review

>> Source/WebCore/platform/graphics/opengl/GraphicsContextGLOpenGL.cpp:259
>> +    return player.playerPrivate()->copyVideoTextureToPlatformTexture(this, texture, target, level, internalFormat, format, type, premultiplyAlpha, flipY);
>
> `playerPrivate()` *should* be a private method; the private player should be an implementation detail. Please add a method to MediaPlayer that calls the private player.

Oops, this is a mistake. MediaPlayer already has the method, so I should not call the private player here.

Created attachment 421034 [details]
WIP, prepare to merge with the patch for bug 219641

Comment on attachment 421034 [details] WIP, prepare to merge with the patch for bug 219641

View in context: https://bugs.webkit.org/attachment.cgi?id=421034&action=review

The media-specific parts of this look fine to me modulo my comments, but someone who has worked on the GL context stuff should review those changes.

> Source/WebCore/html/HTMLVideoElement.cpp:-303
> -bool HTMLVideoElement::copyVideoTextureToPlatformTexture(GraphicsContextGL* context, PlatformGLObject texture, GCGLenum target, GCGLint level, GCGLenum internalFormat, GCGLenum format, GCGLenum type, bool premultiplyAlpha, bool flipY)

Isn't this used by other ports?

> Source/WebCore/platform/graphics/MediaPlayer.h:65
> +typedef struct __CVBuffer* CVPixelBufferRef;

This is platform specific, so I would put this in an `#if PLATFORM` block.

> Source/WebCore/platform/graphics/MediaPlayer.h:444
> +    CVPixelBufferRef pixelBufferForCurrentTime();

Ditto.

> Source/WebCore/platform/graphics/MediaPlayerPrivate.h:165
> +    virtual CVPixelBufferRef pixelBufferForCurrentTime() { return nullptr; }

Ditto.

> Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.mm:665
> +    // We have been asked to paint into a WebGL canvas, so take that as a signal to create
> +    // a decompression session, even if that means the native video can't also be displayed
> +    // in page.
> +    if (!m_hasBeenAskedToPaintGL) {
> +        m_hasBeenAskedToPaintGL = true;
> +        acceleratedRenderingStateChanged();
> +    }
> +
> +    if (updateLastPixelBuffer()) {
> +        if (!m_lastPixelBuffer)
> +            return nullptr;
> +    }
> +
> +    return m_lastPixelBuffer.get();

If `MediaPlayerPrivateMediaSourceAVFObjC::copyVideoTextureToPlatformTexture` is still needed, you can remove most of the code in it and call this method to get the current pixel buffer.

Created attachment 421047 [details]
Revise the patch based on Eric's comments
Created attachment 421051 [details]
WIP, fix build failures on gtk/wpe ports
Comment on attachment 421051 [details] WIP, fix build failures on gtk/wpe ports

View in context: https://bugs.webkit.org/attachment.cgi?id=421051&action=review

> Source/WebKit/GPUProcess/graphics/RemoteGraphicsContextGL.cpp:220
> +#if USE(AVFOUNDATION)
> +    UNUSED_VARIABLE(premultiplyAlpha);
> +    ASSERT_UNUSED(target, target == GraphicsContextGL::TEXTURE_2D);
> +
> +    CVPixelBufferRef pixelBuffer = nullptr;
> +
> +    // --- TODO: To be deleted
> +    if (isMainThread()) {
> +        if (!m_gpuConnectionToWebProcess) {
> +            completionHandler(false);
> +            return;
> +        }
> +
> +        auto mediaPlayer = m_gpuConnectionToWebProcess->remoteMediaPlayerManagerProxy().mediaPlayer(mediaPlayerIdentifier);
> +        if (!mediaPlayer) {
> +            completionHandler(false);
> +            return;
> +        }
> +
> +        pixelBuffer = mediaPlayer->pixelBufferForCurrentTime();
> +        if (!pixelBuffer) {
> +            completionHandler(false);
> +            return;
> +        }
> +
> +        auto contextCV = m_context->asCV();
> +        if (!contextCV) {
> +            completionHandler(false);
> +            return;
> +        }
> +
> +        completionHandler(contextCV->copyPixelBufferToTexture(pixelBuffer, texture, level, internalFormat, format, type, GraphicsContextGL::FlipY(flipY)));
> +        return;
> +    }
> +    // ---

I think you could clean this up by doing the call to pixelBufferForCurrentTime() in a lambda, calling the lambda inline if you're on the main thread, but sync-dispatching to the main thread if you're called from the background. That would reduce the amount of duplicated code.
Revise the patch based on Jer's suggestion
Comment on attachment 421051 [details] WIP, fix build failures on gtk/wpe ports

View in context: https://bugs.webkit.org/attachment.cgi?id=421051&action=review

>> Source/WebKit/GPUProcess/graphics/RemoteGraphicsContextGL.cpp:220
>> +    // ---
>
> I think you could clean this up by doing the call to pixelBufferForCurrentTime() in a lambda, and calling the lambda inline if you're on the main thread, but sync dispatching to the main thread if you're called from the background. That would reduce the amount of duplicated code.

Great idea! Fixed.

Comment on attachment 421034 [details] WIP, prepare to merge with the patch for bug 219641

View in context: https://bugs.webkit.org/attachment.cgi?id=421034&action=review

>> Source/WebCore/html/HTMLVideoElement.cpp:-303
>> -bool HTMLVideoElement::copyVideoTextureToPlatformTexture(GraphicsContextGL* context, PlatformGLObject texture, GCGLenum target, GCGLint level, GCGLenum internalFormat, GCGLenum format, GCGLenum type, bool premultiplyAlpha, bool flipY)
>
> Isn't this used by other ports?

After the refactoring, this method is no longer used by any port. But a similar function is still used in MediaPlayer on some ports, e.g., the GStreamer-based port.

>> Source/WebCore/platform/graphics/MediaPlayer.h:65
>> +typedef struct __CVBuffer* CVPixelBufferRef;
>
> This is platform specific, so I would put this in an `#if PLATFORM` block.

Right! Added `#if ENABLE(AVFOUNDATION)`.

>> Source/WebCore/platform/graphics/MediaPlayer.h:444
>> +    CVPixelBufferRef pixelBufferForCurrentTime();
>
> Ditto

Fixed.

>> Source/WebCore/platform/graphics/MediaPlayerPrivate.h:165
>> +    virtual CVPixelBufferRef pixelBufferForCurrentTime() { return nullptr; }
>
> Ditto.

Fixed.

>> Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.mm:665
>> +    return m_lastPixelBuffer.get();
>
> if `MediaPlayerPrivateMediaSourceAVFObjC::copyVideoTextureToPlatformTexture` is still needed, you can remove most of the code in it and call this method to get the current pixel buffer.

`MediaPlayerPrivateMediaSourceAVFObjC::copyVideoTextureToPlatformTexture` is removed.
Created attachment 421109 [details]
Rebase the patch
Committed r273213: <https://commits.webkit.org/r273213>

All reviewed patches have been landed. Closing bug and clearing flags on attachment 421109 [details].

(In reply to EWS from comment #29)
> Committed r273213: <https://commits.webkit.org/r273213>

This seems to have broken the Windows build.

r273213 failed in https://build.webkit.org/#/builders/67/builds/456
r273212 passed in https://build.webkit.org/#/builders/67/builds/455

This seems to affect AppleWin and not WinCairo, but I wonder if maybe AppleWin didn't enable WEBGL?
Early in the MediaPlayer.h file it says:
+#if ENABLE(WEBGL) && USE(AVFOUNDATION)
+typedef struct __CVBuffer* CVPixelBufferRef;
+#endif
but the usage later is guarded only by USE(AVFOUNDATION):
#if !USE(AVFOUNDATION)
bool copyVideoTextureToPlatformTexture(GraphicsContextGL*, PlatformGLObject texture, GCGLenum target, GCGLint level, GCGLenum internalFormat, GCGLenum format, GCGLenum type, bool premultiplyAlpha, bool flipY);
#else
CVPixelBufferRef pixelBufferForCurrentTime();
#endif
If that's the case, it's unclear to me which condition would require a change though.
I think we can remove "ENABLE(WEBGL)" here:

+#if ENABLE(WEBGL) && USE(AVFOUNDATION)
+typedef struct __CVBuffer* CVPixelBufferRef;
+#endif

Testing a patch locally now.

Reopening to attach new patch.
A follow-up patch to fix build failures on the windows port
Committed r273272: <https://commits.webkit.org/r273272>

All reviewed patches have been landed. Closing bug and clearing flags on attachment 421207 [details].