How do I output to a file I can read from while FFmpeg is still decoding? #486
-
I'm trying to play unsupported audio formats by using FFmpeg to convert them. Converting a whole file before playing it takes too long, so I was trying to convert it incrementally and feed the output to an audio tag.
I tried setting up NodeJS streams to read the original file, converting each chunk with FFmpeg, and writing the result to a decoded file. This didn't work; the audio wouldn't start playing. Creating a new FFmpeg instance for each chunk also takes far longer than converting the whole file at once.
Is there a way I can output the conversion to a file I can immediately access?
Or is there a way I can simply decode the file without converting it and use it in an audio tag?
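For illustration, one possible shape of the chunked approach described here is sketched below. This is not code from this thread; it assumes a browser context, the current ffmpeg.wasm FFmpeg API (writeFile/exec/readFile), an AAC/fragmented-MP4 target, and a known total duration.

// Sketch: convert an audio file in time-based chunks with ffmpeg.wasm and
// append each converted chunk to a MediaSource-backed <audio> element.
// Chunk length, codec string, and totalSeconds are assumptions.
import { FFmpeg } from '@ffmpeg/ffmpeg';

const ffmpeg = new FFmpeg();
await ffmpeg.load();

async function playWhileConverting(file, audioEl, chunkSeconds = 10, totalSeconds = 60) {
  await ffmpeg.writeFile('input', new Uint8Array(await file.arrayBuffer()));

  const mediaSource = new MediaSource();
  audioEl.src = URL.createObjectURL(mediaSource);
  await new Promise((r) => mediaSource.addEventListener('sourceopen', r, { once: true }));

  // Fragmented MP4/AAC so each chunk can be appended as soon as it is ready.
  const sourceBuffer = mediaSource.addSourceBuffer('audio/mp4; codecs="mp4a.40.2"');
  sourceBuffer.mode = 'sequence';

  for (let start = 0; start < totalSeconds; start += chunkSeconds) {
    const out = `chunk${start}.mp4`;
    await ffmpeg.exec([
      '-i', 'input',
      '-ss', `${start}`, '-t', `${chunkSeconds}`,
      '-c:a', 'aac',
      '-movflags', 'frag_keyframe+empty_moov+default_base_moof',
      out,
    ]);
    const data = await ffmpeg.readFile(out);
    // Wait for the previous append to finish before appending the next chunk.
    await new Promise((r) => {
      sourceBuffer.addEventListener('updateend', r, { once: true });
      sourceBuffer.appendBuffer(data);
    });
    await ffmpeg.deleteFile(out);
  }
  mediaSource.endOfStream();
}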
-
Hey @Konglomneshued, did you find a way to solve your problem?
I have a similar requirement and want to know if there is a good way to do it.
-
Unfortunately, no. I spent a week Googling and trying to figure out how to achieve it, and eventually gave up. I decided to come back to this problem later and work on other parts of my application.
-
In-browser transcoding of video files with FFmpeg and WebAssembly
The method described in the article linked above seems applicable, specifically the section quoted below. I plan on trying this method soon.
Creating a Streaming Transcoder
Transcoding large files can take a little while. For a bit of fun, let’s take a look at how you can transcode the file into segments, incrementally adding them to the video buffer.
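To make the quoted idea concrete, here is a minimal sketch of incrementally appending transcoded segments to a MediaSource-backed video element. It is not taken from the article; the codec string and the onSegment hook are assumptions.

// Sketch: queue transcoded fMP4 segments and append them to a SourceBuffer
// one at a time as they become available.
const videoEl = document.querySelector('video');
const mediaSource = new MediaSource();
videoEl.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', () => {
  const sb = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.64001f"');
  sb.mode = 'sequence';

  const queue = [];   // Uint8Array segments produced by the transcoder
  let done = false;

  sb.addEventListener('updateend', appendNext);

  function appendNext() {
    if (queue.length) {
      sb.appendBuffer(queue.shift());
    } else if (done && mediaSource.readyState === 'open') {
      mediaSource.endOfStream();
    }
  }

  // Hypothetical hook, called by whatever produces each transcoded segment.
  window.onSegment = (data, isLast) => {
    queue.push(data);
    if (isLast) done = true;
    if (!sb.updating) appendNext();
  };
}, { once: true });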
-
I tested a method similar to the one in the link I posted. Unfortunately, I could not access the ffmpeg filesystem at all while an ffmpeg.exec() command is running.
-
Same problem. I can produce segments, but the readFile call(s) do not resolve until the ffmpeg command has finished executing. It's almost like it holds a lock on the FS.
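For reference, this is roughly the access pattern that fails (assumed code, not the commenter's); the readFile() promise only settles after exec() has returned.

// Assumed setup: `ffmpeg` is a loaded ffmpeg.wasm FFmpeg instance and
// input.mkv has already been written to its filesystem.
// Start a long-running transcode without awaiting it...
const execPromise = ffmpeg.exec([
  '-i', 'input.mkv',
  '-movflags', 'frag_keyframe+empty_moov',
  'output.mp4',
]);

// ...then try to read the partially written output. In practice this promise
// does not settle until exec() has returned, presumably because all calls are
// handled one at a time by the same worker that is busy running ffmpeg.
const partial = await ffmpeg.readFile('output.mp4');
console.log('partial bytes:', partial.length);

await execPromise;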
-
I am not sure what your reason is for accessing the output while transcoding is in progress, but I have found an interesting, if somewhat limited, way of doing it.
I have managed to play the output on a video element while transcoding is still in progress. Speed could be better, and I tried a few things to speed it up. The code can use multiple ffmpeg instances in parallel to process the video in chunks and output fragmented MP4 segments for a MediaSource SourceBuffer. It can also use WORKERFS if available, so all of the ffmpeg instances share access to the source video without multiple copies.
I did have some issues getting it to work with some videos if I did not disable audio output (the -an argument in the ffmpeg command). I am assuming I just need to find a compatible format to transcode to, but I am still looking into it.
The transcoding speed means playback pauses often while waiting for new segments.
Source code
<html>
  <head>
    <script src="transcode.parallel.js"></script>
  </head>
  <body>
    <video autoplay muted id="video-result" controls></video><br />
    <button disabled id="load-button">Load ffmpeg-core (~31 MB)</button><br />
    <!-- <button style="display: none;" disabled id="select-input-button">Transcode local file</button> -->
    <input disabled type="file" id="local-file" accept=".mp4,.m4v,.webm,.avi,.mkv,.mov,.wmv" /><br />
    <p id="log-div"></p>
  </body>
</html>
"use strict"; var ffmpegCount = 2; var chunkDurationSize = 5; var useMultiThreadIfAvailable = false; var useWorkerFSIfAvailable = true; var ffmpegs = []; var loadBtn = null; var logDiv = null; var videoEl = null; var localFileInput = null; const baseURLFFMPEG = 'ffmpeg-wasm/ffmpeg'; const baseURLCore = 'ffmpeg-wasm/core'; const baseURLCoreMT = 'ffmpeg-wasm/core-mt'; const workerBFSLoaderURL = 'worker.loader.js'; const toBlobURL = async (url, mimeType) => { var resp = await fetch(url); var body = await resp.blob(); var blob = new Blob([body], { type: mimeType }); return URL.createObjectURL(blob); }; const load = async () => { loadBtn.setAttribute('disabled', true); const ffmpegBlobURL = await toBlobURL(`${baseURLFFMPEG}/ffmpeg.js`, 'text/javascript'); await import(ffmpegBlobURL); var loadConfig = null; if (useMultiThreadIfAvailable && window.crossOriginIsolated) { loadConfig = { workerLoadURL: await toBlobURL(`${baseURLFFMPEG}/814.ffmpeg.js`, 'text/javascript'), coreURL: await toBlobURL(`${baseURLCoreMT}/ffmpeg-core.js`, 'text/javascript'), wasmURL: await toBlobURL(`${baseURLCoreMT}/ffmpeg-core.wasm`, 'application/wasm'), workerURL: await toBlobURL(`${baseURLCoreMT}/ffmpeg-core.worker.js`, 'application/javascript'), }; } else { loadConfig = { workerLoadURL: await toBlobURL(`${baseURLFFMPEG}/814.ffmpeg.js`, 'text/javascript'), coreURL: await toBlobURL(`${baseURLCore}/ffmpeg-core.js`, 'text/javascript'), wasmURL: await toBlobURL(`${baseURLCore}/ffmpeg-core.wasm`, 'application/wasm'), }; } var tasks = []; while (ffmpegs.length < ffmpegCount) { let ffmpeg = new FFmpegWASM.FFmpeg(); ffmpegs.push(ffmpeg); tasks.push(ffmpeg.load(loadConfig)); } await Promise.all(tasks); console.log('ffmpeg cores loaded:', ffmpegCount); localFileInput.removeAttribute('disabled'); window._ffmpeg0 = ffmpegs[0]; } const getMetadata = (inputFile) => { let ffmpeg = ffmpegs[0]; return new Promise((resolve) => { var log = ''; var metadataLogger = ({ message }) => { log += message; if (message.indexOf('Aborted()') > -1) { ffmpeg.off('log', metadataLogger); resolve(log); } }; ffmpeg.on('log', metadataLogger); ffmpeg.exec(["-i", inputFile]); }); }; const getDuration = async (inputFile) => { var metadata = await getMetadata(inputFile); var patt = /Duration:\s*([0-9]{2}):([0-9]{2}):([0-9]{2}.[0-9]{0,2})/gm var m = patt.exec(metadata); return !m ? 0 : (m[1] * 3600) + (m[2] * 60) + (m[3] * 1); }; const transcodeLocalFileInputToMediaSource = async () => { let files = localFileInput.files; let file = files.length ? 
files[0] : null; if (!file) return; localFileInput.setAttribute('disabled', true); await transcodeFileToMediaSource(file); localFileInput.removeAttribute('disabled'); }; const transcodeFileToMediaSource = async (file) => { console.log('file', file); const inputDir = '/input'; const inputFile = `${inputDir}/${file.name}`; console.log('inputFile', inputFile); // mount the input file in each ffmpeg instance // (custom ffmpeg build with WORKERFS enabled) var useWorkerFS = ffmpegs[0].mount && ffmpegs[0].unmount && useWorkerFSIfAvailable; await Promise.all(ffmpegs.map(async (ffmpeg) => { await ffmpeg.createDir(inputDir); if (useWorkerFS) { await ffmpeg.mount('WORKERFS', { files: [file] }, inputDir); } else { await ffmpeg.writeFile(inputFile, new Uint8Array(await file.arrayBuffer())) } })); var duration = await getDuration(inputFile); if (duration > 0) { const mimeCodec = 'video/mp4; codecs="avc1.64001f"'; const mediaSource = new MediaSource(); var mediaSourceURL = ''; var jobs = []; const getCompletedJob = (i) => { if (i >= jobs.length) return null; var job = jobs[i]; if (job.state != 'done') { return new Promise((resolve) => { job.oncomplete = () => resolve(job); }) } else { return Promise.resolve(job); } }; mediaSource.addEventListener('sourceopen', async (e) => { console.log('sourceopen', mediaSource.readyState); // if (mediaSource.readyState != 'open') { return; } URL.revokeObjectURL(mediaSourceURL); mediaSource.duration = duration; var sourceBuffer = mediaSource.addSourceBuffer(mimeCodec); sourceBuffer.mode = 'sequence'; var ii = 0; sourceBuffer.addEventListener("updateend", async () => { console.log('updateend', mediaSource.readyState); // ended if (mediaSource.readyState != 'open') { return; } var job = await getCompletedJob(ii++); if (!job) { mediaSource.endOfStream(); } else { sourceBuffer.appendBuffer(job.outputData); } }); var job = await getCompletedJob(ii++); sourceBuffer.appendBuffer(job.outputData); }, { once: true }); var index = 0; var durationLeft = duration; var chunkStart = 0; while (chunkStart < duration) { let chunkDuration = durationLeft > chunkDurationSize ? 
chunkDurationSize : durationLeft; jobs.push({ id: index, chunkStart: chunkStart, chunkDuration: chunkDuration, state: 'queued', // queued, running, done outputData: null, oncomplete: null, }); chunkStart += chunkDuration; index++; } mediaSourceURL = URL.createObjectURL(mediaSource); videoEl.src = mediaSourceURL; var jobQueue = []; jobs.map((job) => jobQueue.push(job)); await Promise.all(ffmpegs.map(async (ffmpeg) => { let job = null; const onprogress = (ev) => { if (!job) return; job.progress = ev.progress; console.log(`Segment progress: ${job.id} ${job.progress}`); }; const onlog = (ev) => { if (!job) return; logDiv.innerHTML = ev.message; console.log(`Segment log: ${job.id}`, ev.message); }; ffmpeg.on('progress', onprogress); ffmpeg.on('log', onlog); while (jobQueue.length) { job = jobQueue.shift(); job.state = 'running'; console.log(`Segment start: ${job.id} ${job.chunkStart} ${job.chunkDuration}`); //await new Promise((r) => setTimeout(r, 1000)); const outputFile = `/output.${job.id}.mp4`; await ffmpeg.exec([ "-nostats", "-loglevel", "error", "-i", inputFile, //"-vf", "scale=iw/4:ih/4", "-an", //"-movflags", "frag_keyframe+empty_moov+default_base_moof", "-movflags", "faststart+frag_every_frame+empty_moov+default_base_moof", "-ss", `${job.chunkStart}`, "-t", `${job.chunkDuration}`, "-preset", "ultrafast", outputFile, ]); try { job.outputData = await ffmpeg.readFile(outputFile); } catch { console.log('Error reading output video'); } job.state = 'done'; console.log(`Segment done: ${job.id} ${job.chunkStart} ${job.chunkDuration}`); if (job.oncomplete) job.oncomplete(); try { await ffmpeg.deleteFile(outputFile); } catch { console.log('Error deleting output video'); } } ffmpeg.off('progress', onprogress); ffmpeg.off('log', onlog); })); } await Promise.all(ffmpegs.map(async (ffmpeg) => { if (useWorkerFS){ await ffmpeg.unmount(inputDir); } await ffmpeg.deleteDir(inputDir); })); }; addEventListener("load", async (event) => { localFileInput = document.querySelector('#local-file'); localFileInput.addEventListener('change', async () => await transcodeLocalFileInputToMediaSource()); loadBtn = document.querySelector('#load-button'); loadBtn.addEventListener('click', async () => await load()); loadBtn.removeAttribute('disabled'); logDiv = document.querySelector('#log-div'); videoEl = document.querySelector('#video-result'); console.log('window loaded'); });
-
It lets you mount the same file multiple times in parallel with ffmpeg.mount('WORKERFS', { files: [file] }, ...)? I thought the file was Transferable, such that the original object is no longer accessible after transferring it?
Transferable objects are objects that own resources that can be transferred from one context to another, ensuring that the resources are only available in one context at a time. Following a transfer, the original object is no longer usable; it no longer points to the transferred resource, and any attempt to read or write the object will throw an exception.
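For comparison, a small sketch of what an actual transfer looks like (the worker.js script is hypothetical). An ArrayBuffer, unlike a File, can be placed in a transfer list.

// An ArrayBuffer is Transferable: after posting it with a transfer list,
// the sender's copy is detached and unusable.
const worker = new Worker('worker.js');   // hypothetical worker script
const buf = new Uint8Array([1, 2, 3]).buffer;
worker.postMessage({ buf }, [buf]);       // second argument = transfer list
console.log(buf.byteLength);              // 0 (the buffer is detached)

// A File posted without a transfer list is structured-cloned instead; the
// clone shares the same immutable underlying data, so the original File in
// the page stays usable. That would be consistent with mounting the same
// File into several ffmpeg instances via WORKERFS.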