I'm trying to read any local file and write out its binary content as a string. It loads and reads small files very fast.
How can I modify it to read large files (greater than 1 GB) without crashing the browser or slowing down the system?
<html>
<head>
    <title>Read File</title>
</head>
<body>
    <input type="file" id="myFile">
    <hr>
    <textarea style="width: 500px; height: 400px" id="output"></textarea>
    <script>
        var input = document.getElementById("myFile");
        var output = document.getElementById("output");

        input.addEventListener("change", function () {
            if (this.files && this.files[0]) {
                var myFile = this.files[0];
                var reader = new FileReader();
                reader.addEventListener('load', function (e) {
                    output.textContent = e.target.result;
                });
                // Reads the entire file into memory in one go
                reader.readAsBinaryString(myFile);
            }
        });
    </script>
</body>
</html>
Comments:

- Rene Saarsoo (May 9, 2018): This question is better suited for StackOverflow, where you can already find an answer to a very similar question: stackoverflow.com/questions/25810051/…
- Simon Forsberg (May 9, 2018): @ReneSaarsoo There's nothing wrong with having this question here.
- Simon Forsberg (May 9, 2018): Close-voters, please read: codereview.meta.stackexchange.com/q/5482/31562
1 Answer
Any memory-limited reading of a large file pretty much involves looping over the file in chunks, typically some multiple of the standard 4 KB memory page size. In JavaScript, with its queued I/O, this would probably be done with some kind of loop over slice(), reading perhaps 64 KB at a time. However, if you end up accumulating all of the file content into an in-memory variable, you will crash your system regardless of how optimized the reading is.
Some JavaScript for the loop (untested, so there may be off-by-one errors):

const CHUNK_SIZE = 64 * 1024;

for (let chunkIndex = 0; chunkIndex * CHUNK_SIZE < file.size; chunkIndex++) {
    const offset = chunkIndex * CHUNK_SIZE;
    // slice() only creates a reference to the byte range; nothing is read yet
    const chunk = file.slice(offset, offset + CHUNK_SIZE);
    // do something with chunk here (don't accumulate it in an in-memory variable)
}
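To make that concrete, here is a minimal sketch of how the chunked read could look with actual asynchronous I/O. It assumes a modern browser that supports Blob.arrayBuffer(); readInChunks and processChunk are illustrative names, not part of any API:

// Sketch: read a File in fixed-size chunks, handing each chunk to a callback
// as a binary string (one character per byte, like readAsBinaryString).
// Memory use stays around CHUNK_SIZE no matter how large the file is.
async function readInChunks(file, processChunk) {
    const CHUNK_SIZE = 64 * 1024; // 64 KB per read
    for (let offset = 0; offset < file.size; offset += CHUNK_SIZE) {
        const blob = file.slice(offset, offset + CHUNK_SIZE); // reference only, no I/O yet
        const buffer = await blob.arrayBuffer();              // reads just this slice
        const bytes = new Uint8Array(buffer);
        let binaryString = "";
        for (const b of bytes) {
            binaryString += String.fromCharCode(b); // byte-for-byte, avoids UTF-8 decoding
        }
        processChunk(binaryString); // use the chunk, then let it be garbage-collected
    }
}

On older browsers without Blob.arrayBuffer(), you would wrap a FileReader in a Promise per chunk instead; the looping logic stays the same.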
Comments:

- king amada (May 9, 2018): Exactly what I'm looking for; unfortunately I don't know JavaScript that well, so if you could help with code, that would be helpful. Check this question I asked on StackOverflow: [How to read any local file by chunks using JavaScript?](stackoverflow.com/questions/50254537/…) "How can I read any large file (greater than 1 gigabyte) locally by chunks (2 KB or more), convert each chunk to a string, process the string, then get the next chunk, and so on until the end of the file?"
- Srdjan Grubor (May 9, 2018): This link seems to have a couple of reasonable implementations: gist.github.com/alediaferia/cfb3a7503039f9278381
- king amada (May 9, 2018): Thanks once again, but where can I get the string from the code? I have been reading it for some minutes, but it's confusing. Like I said, I'm really a beginner with JavaScript.
- Srdjan Grubor (May 9, 2018): The point of reading the file in chunks is precisely not to get it back as a full string, since that string would take the same amount of space in RAM as the file and would most likely crash your browser. If you do that, there is absolutely no point in chunk-reading the file. Beyond that, and to be rather blunt, I hope there wasn't an expectation for people to write the code for you here.
- king amada (May 9, 2018): Not expecting someone to write the code for me, but it would be helpful. What I want to do is get each chunk as a string, process it, and discard it, then get the next chunk and repeat, so the only RAM used will be about 64 KB, since that's the chunk size I will be reading and discarding after use.
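For what it's worth, the flow described in this last comment maps directly onto the readInChunks sketch in the answer; a hypothetical wiring to the question's file input could look like:

// Hypothetical usage: process a chosen file chunk by chunk.
// Only one ~64 KB chunk string is alive at a time.
document.getElementById("myFile").addEventListener("change", async function () {
    if (this.files && this.files[0]) {
        await readInChunks(this.files[0], function (chunkString) {
            // process chunkString here, then discard it
            console.log("got chunk of length", chunkString.length);
        });
    }
});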