
Does not work in Firefox 79+ because SharedArrayBuffer is undefined #106

Open
amn opened this issue Nov 6, 2020 · 29 comments

@amn

amn commented Nov 6, 2020

Describe the bug
The front page says:

Your browser doesn't support SharedArrayBuffer, thus ffmpeg.wasm cannot execute. Please use latest version of Chromium or any other browser supports SharedArrayBuffer.

To Reproduce
Steps to reproduce the behavior:

  1. Go to https://ffmpegwasm.github.io/
  2. Scroll down to 'Demo'
  3. See quoted error

Expected behavior
As SharedArrayBuffer is available in the aforementioned versions of Firefox (under certain conditions), I think it is fair that we don't fall into the "this only works in Chrome" bane of Web applications made today, especially since the application may in fact work in Firefox, and the latter only refuses to provide the object for security reasons: https://developer.mozilla.org/en-US/docs/Mozilla/Firefox/Releases/79#JavaScript

Desktop (please complete the following information):

  • OS: Windows 10 x86-64
  • Browser Firefox 64-bit
  • Version 81.0.1
@n0x3u5

n0x3u5 commented Nov 9, 2020

Could it be because of missing COOP/COEP headers?

I faced the same issue in the ffmpeg.wasm demo on its GitHub page. Some casual searching led me to this issue in Bugzilla, which led me to this page on web.dev. The Firefox 79 for developers page which @amn linked also says the same thing.

I guess the GitHub page needs to be configured to send the correct headers?

@jeromewu
Collaborator

Yes, it is a restriction of GitHub Pages; please see this issue for more details: #102

@jeromewu
Collaborator

I have tried hosting the page on Google App Engine; in Firefox, SharedArrayBuffer is detected, but there is an OOM error in the end. For those who are interested, please check this link: https://ffmpegwasm.et.r.appspot.com

@jeromewu jeromewu pinned this issue Nov 11, 2020
@diegocr

diegocr commented Nov 11, 2020

video.src = URL.createObjectURL(new Blob([data.buffer], { type: 'video/mp4' }));

This kind of usage will always raise an OOM with any relatively large video; instead, the transcoded chunks should be streamed into an MSE container.

@jeromewu
Collaborator

The interesting part is that exactly the same code works fine on Chrome, so I think there is some limitation inside Firefox that makes it OOM.

@amn
Author

amn commented Nov 12, 2020

This OOM detour isn't FFmpeg specific per se, but I'd still want to point out the following.

When in doubt, look at the spec -- https://w3c.github.io/FileAPI/#blob-section -- does the spec mandate anything that would, say, force a user agent vendor to implement Blob in a way where a large enough Blob would always bring about OOM error(s)? I don't see anything in the spec pointing to that. Certainly, if I were making a user agent, then for construction of blobs from other blobs, or even byte arrays, I could implement some sort of copy-on-write strategy, where a blob would share memory with the data that was passed to it during construction -- a blob is immutable, after all, which makes it all the more appealing not to copy data that would certainly cause memory consumption to balloon given enough blobs of large enough size.

My point is that this is browser specific, since the spec does not say anything of the "you must allocate memory for each blob" sort. A naive implementation certainly would, but there are ways to avoid that. For instance, say you already have a large number of smaller blobs, and you then construct a large blob from an array of the smaller ones. How much additional memory must a user agent allocate? There is no definite answer to that -- beyond the allocation of an array of what amounts to trivial references to existing blobs, even the large blob constructed may simply reference the existing blobs in a way that is transparent to the Blob API consumer, including other facilities in the user agent itself that use blobs.

Practically, and exactly because the spec also does not mandate that blobs reuse references to the data they were constructed with (no mandated copy-on-write), you cannot guarantee there won't be OOM errors. Most likely there will be, even in Chrome, given a sufficient number and size of blobs. Memory is a finite resource. The only way out is another API -- a ReadableStream perhaps, but now I am being very general here.
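As an illustration of that composition case, here is a minimal sketch (plain JavaScript; Blob is global in browsers and Node 18+). The observable size is the same whether or not the engine copies the parts -- only the memory cost of the construction differs between implementations:

```javascript
// Composing a large Blob out of smaller ones. Whether the parts are
// copied or merely referenced is an implementation detail the File API
// spec leaves open.
const part1 = new Blob(['hello, ']);
const part2 = new Blob(['world']);

// The combined Blob reports the summed size either way; what differs
// between engines is how much memory this construction actually costs.
const combined = new Blob([part1, part2]);

console.log(combined.size); // 12 (7 + 5 bytes)
```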

@amn
Author

amn commented Nov 12, 2020

If you "stream chunks to an MSE container" (MediaSource.appendBuffer, I think, is what is being hinted at here), versus allocating a resource (which you can download in its entirety) using a Blob and a URL, you won't ever have the entire resource if it's sufficiently large -- the MSE API deals with buffers differently. Earlier data may be evicted while you're appending more data, meaning that only playback of some subset of the resource is guaranteed, as with live streaming -- the video element may not provide you with everything you ever appended to it, so don't count on being able to "Save As" what's playing from start to finish. It will only work for preview (although it will indeed). See https://w3c.github.io/media-source/#sourcebuffer-coded-frame-eviction.

What you could do, and this is speculation on my part, is look into the Cache API to see if you could construct a "cached" version of the resulting resource, allocated on disk (the Cache API uses persistent storage), and then construct a URL to the cached resource and/or fetch the resource (such fetches can be intercepted and served from the cache).

@diegocr

diegocr commented Nov 12, 2020

Blobs in Chrome may be disk-based, so using them may not hit an OOM the way it does in Firefox. Still, memory is finite, just like persistent/disk storage, and browser vendors may impose restrictions of their own (beyond what is stated in the specification), hence any browser, including Chrome, can face an OOM depending on the amount of data you feed it. Has anybody tested this with a 3GB+ video?

Here MSE should be used if the end purpose is streaming the video, as in the demo. In a nutshell, that means appending the transcoded chunks in real time until a certain threshold is reached, to prevent MSE's SourceBuffer from hitting a quota error (i.e. OOM-like). This approach should certainly be much more cross-browser reliable and UX-friendly than loading/transcoding the whole video in memory and then starting playback; by using MSE it could start playing within a few seconds.
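A minimal sketch of the sequential-append part of this suggestion (SourceBuffer, appendBuffer, and the 'updateend' event are the real MSE interfaces; the helper name and the chunk source are assumptions for illustration):

```javascript
// Append transcoded chunks to an MSE SourceBuffer one at a time, waiting
// for 'updateend' between appends, so the browser can apply its own
// eviction/quota logic instead of the page holding the whole file in one Blob.
async function appendSequentially(sourceBuffer, chunks) {
  for (const chunk of chunks) {
    await new Promise((resolve, reject) => {
      // A SourceBuffer only allows one append at a time; wait for the
      // previous append to finish before queueing the next chunk.
      sourceBuffer.addEventListener('updateend', resolve, { once: true });
      sourceBuffer.addEventListener('error', reject, { once: true });
      try {
        sourceBuffer.appendBuffer(chunk); // throws QuotaExceededError when full
      } catch (err) {
        reject(err);
      }
    });
  }
}
```

In a real page the SourceBuffer would come from `mediaSource.addSourceBuffer(mimeType)` with a codec string matching the transcoder output.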

@jeromewu
Collaborator

jeromewu commented Nov 12, 2020

I managed to run it in Firefox 79 on Kali OS; here is a screenshot for your reference:

[Screenshot: Screenshot_2020-11-12_22-54-53]

@thijstriemstra

See #102 (comment) on how to fix the header issue in Firefox.

@AE1020

AE1020 commented Dec 3, 2020

Hello, I am trying http://ffmpegwasm.et.r.appspot.com/ with Firefox 83 and it is saying that SharedArrayBuffer is undefined, as in the issue title.

I can confirm from the inspector tools that the headers mentioned in #102 are set appropriately. They are included in the lower right of the screenshot below:

[Screenshot: firefox-83-not-working]

I see the last comment on this issue is recent (18 days ago)...so I'm wondering if it's something that just broke, or if it's just something weird about my Firefox?

My settings in about:config are the default (I think). I have messed with them in the past but this is a fresh Firefox install:

[Screenshot: firefox-shared-config]

@AE1020

AE1020 commented Dec 3, 2020

I think I figured out the problem...it is because of having an http link instead of https.

(It seems that demo URL is not automatically forwarding to the secure site if you have a non-secure link.)

@thijstriemstra

The header issue in Firefox is also coming to Chrome: https://blog.chromium.org/2021/02/restriction-on-sharedarraybuffers.html

@anesuc

anesuc commented Mar 5, 2021

Yep, can confirm. I get a warning about this in Chrome dev tools:

[Deprecation] SharedArrayBuffer will require cross-origin isolation as of M91, around May 2021. See <URL> for more details.

@AndrewDubya

I've managed to add the headers necessary to get ffmpeg.wasm running in Firefox, but now I need to serve the files locally (because the headers are very restrictive).

Is there a distribution that can be served locally? It'd be nice to have a tar.gz or .zip to serve them directly.

@NeoCoderMatrix86

Also, the SharedArrayBuffer behaviour of Firefox will come to Chrome and (since it depends on it) to Edge, blocking all major browsers. Maybe @AndrewDubya's fix is helpful?!

@anesuc

anesuc commented Mar 10, 2021

You can upload it to your website (but it needs to have HTTPS enabled). But yeah, no local usage will be possible, I guess (or non-HTTPS).

@AndrewDubya

Sorry, I should've been clearer. When I said locally, I meant serving them locally as opposed to from unpkg.com. Starting in May, the unpkg CDN will be broken for everyone, because the required headers restrict including assets from other URLs.

Here's the hack I came up with to serve it from the same domain: #166 (comment)

I totally get that this is a tech demo and bleeding edge. I think the long-term solution would be to fall back to a single thread without SharedArrayBuffer. But I have no idea how complex that is in practice :)

A non-shared-array fallback would make it work in most browsers, and in more cases where developers don't have access to web server configs.

@anesuc

anesuc commented Mar 10, 2021

Actually you are right. For some reason it worked on mine before I looked at the code lol

@agektmr

agektmr commented Apr 6, 2021

To enable SharedArrayBuffer, the website needs to adopt "cross-origin isolation". Here's a guide to enable it: https://web.dev/cross-origin-isolation-guide/

The downside of this is that all resources loaded onto the same page need to opt in by sending a Cross-Origin-Resource-Policy: cross-origin header or providing CORS. Otherwise the resource will be blocked from loading. There are some other nuances too. To learn more details: https://web.dev/coop-coep/

You can try emulating the headers on this demo site: https://first-party-test.glitch.me/

What I would recommend is to determine whether the page is cross-origin isolated by examining self.crossOriginIsolated. Use SharedArrayBuffer when it returns true, and do not use it when it returns false.
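That recommendation can be sketched as a small feature check (the function name and the fallback comment are ours; `globalThis.crossOriginIsolated` is the standard property, and using globalThis makes the same check work in both window and worker contexts):

```javascript
// Only use SharedArrayBuffer when the page is actually cross-origin
// isolated. crossOriginIsolated is true only when COOP/COEP are in effect;
// elsewhere it is false or undefined.
function canUseSharedArrayBuffer() {
  return typeof SharedArrayBuffer !== 'undefined' &&
         globalThis.crossOriginIsolated === true;
}

// Example gate (hypothetical application code):
// if (!canUseSharedArrayBuffer()) {
//   // fall back to a single-threaded path, or explain the COOP/COEP
//   // requirement to the user instead of failing silently
// }
```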

@julienbeisel

Chrome 92 was just released and does require cross-origin isolation.

I think the doc has to be updated: https://ffmpegwasm.github.io/

@jeromewu
Collaborator

jeromewu commented Jul 26, 2021

Yes, we are now using a new URL: https://ffmpegwasm.netlify.app

I think it should work in Firefox as well; please let me know if it works for you. 😄

@GavHern

GavHern commented Aug 6, 2021

I'm getting this same issue in Chrome under local development using the Vite bundler. I tried adding the headers, with no luck. The ffmpeg.load() promise never resolves.

@agektmr

agektmr commented Aug 6, 2021

For local development, I recommend using --enable-features=SharedArrayBuffer to launch Chrome.
https://web.dev/cross-origin-isolation-guide/#:~:text=gotchas

@trybick

trybick commented Aug 7, 2021

I was able to fix this in Chrome for a React project on Netlify by creating a netlify.toml with these headers:

[[headers]]
  for = "/*"
  [headers.values]
    Cross-Origin-Opener-Policy = "same-origin"
    Cross-Origin-Embedder-Policy = "require-corp"

@anilsoniaurionpro

After enabling cross-origin isolation, Firefox desktop works, but Firefox 90.1.3 on Android throws this error (you can't open about:config there to enable SharedArrayBuffer).

@gzuidhof

For those who want to host a demo/app on a domain where you can't control the headers: with a service worker you can actually set the required COOP and COEP headers. See the coi-serviceworker repo, and this blog post that explains the approach.

It's a bit crazy, but it's a solution for demos hosted on GitHub Pages.
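The core of that trick can be sketched as a function that re-wraps each fetched Response with the two headers added (a simplified sketch only; the real coi-serviceworker project also handles worker registration and the reload needed for isolation to take effect):

```javascript
// Re-serve a Response with COOP/COEP headers added, so a host that cannot
// set headers (e.g. GitHub Pages) still ends up cross-origin isolated.
function withIsolationHeaders(response) {
  const headers = new Headers(response.headers);
  headers.set('Cross-Origin-Embedder-Policy', 'require-corp');
  headers.set('Cross-Origin-Opener-Policy', 'same-origin');
  return new Response(response.body, {
    status: response.status,
    statusText: response.statusText,
    headers,
  });
}

// Inside the service worker script:
// self.addEventListener('fetch', (event) => {
//   event.respondWith(fetch(event.request).then(withIsolationHeaders));
// });
```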

@saojun1024

You need to add the HTTP headers on the server; here is a Node (Express) example:

res.set('Cross-Origin-Embedder-Policy', 'require-corp')
res.set('Cross-Origin-Opener-Policy', 'same-origin')

@Fanna1119

If someone is using Vite as a dev environment, I managed with this plugin: https://github.com/chaosprint/vite-plugin-cross-origin-isolation
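As an alternative to the plugin, recent Vite versions can send the same headers through the dev server's own `server.headers` option (a sketch; verify the option exists in your Vite version):

```javascript
// vite.config.js -- dev-server-only; production hosting still needs the
// headers configured on the real server or CDN.
export default {
  server: {
    headers: {
      'Cross-Origin-Opener-Policy': 'same-origin',
      'Cross-Origin-Embedder-Policy': 'require-corp',
    },
  },
};
```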
