Read Local File
I'm trying to read a file locally and write out its binary content as a string; it loads and reads small files very fast.
How can I modify it to read large files (greater than 1 GB) without crashing the browser or slowing down the system?
<html>
<head>
<title>Read File</title>
</head>
<body>
<input type="file" id="myFile">
<hr>
<textarea style="width:500px;height:400px" id="output"></textarea>
<script>
var input = document.getElementById("myFile");
var output = document.getElementById("output");
input.addEventListener("change", function () {
    if (this.files && this.files[0]) {
        var myFile = this.files[0];
        var reader = new FileReader();
        reader.addEventListener('load', function (e) {
            output.textContent = e.target.result;
        });
        reader.readAsBinaryString(myFile);
    }
});
</script>
</body>
</html>
Tags: javascript, performance, html, file, html5
This question is better suited for StackOverflow, where you can already find an answer to a very similar question: stackoverflow.com/questions/25810051/…
– Rene Saarsoo
May 9 at 12:01

@ReneSaarsoo There's nothing wrong with having this question here.
– Simon Forsberg
May 9 at 13:19

Close-voters, please read: codereview.meta.stackexchange.com/q/5482/31562
– Simon Forsberg
May 9 at 14:18
asked May 9 at 10:57 by king amada (225 rep)
1 Answer
Any memory-limited reading of large files pretty much involves looping over the file in "chunks" of some multiple of the standard memory page size (4 KB). In JavaScript, with I/O queuing, this would probably be done with some kind of loop over slice(), where each read might be 64 KB at a time. However, if you end up reading all of the file content into an in-memory variable, you will crash your system regardless of how optimized the reading is.
Some pseudocode - untested, so there may be off-by-one errors:
const CHUNK_SIZE = 64 * 1024;
for (let chunkIndex = 0; chunkIndex * CHUNK_SIZE < file.size; chunkIndex++) {
    const offset = chunkIndex * CHUNK_SIZE;
    const chunk = file.slice(offset, offset + CHUNK_SIZE);
    // do something with chunk here (don't accumulate it in an in-memory variable, though)
}
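To make the loop above concrete, here is one possible runnable version - my own sketch, not code tested by the answerer. It relies on `Blob.prototype.slice()` and the promise-based `Blob.prototype.text()` supported by modern browsers; the names `processFileInChunks` and `handleChunk` are made up for illustration.

```javascript
// Read a File/Blob in fixed-size chunks so that only one chunk's
// worth of data is ever held in memory at a time.
const CHUNK_SIZE = 64 * 1024; // 64 KB per read

async function processFileInChunks(file, handleChunk) {
  for (let offset = 0; offset < file.size; offset += CHUNK_SIZE) {
    // slice() is cheap: it only creates a view; no bytes are read yet.
    const chunk = file.slice(offset, offset + CHUNK_SIZE);
    // text() actually reads the bytes and decodes them as UTF-8.
    const text = await chunk.text();
    handleChunk(text, offset);
    // chunk and text go out of scope here, so they can be
    // garbage-collected before the next iteration reads more data.
  }
}

// Usage sketch with an in-memory Blob standing in for a picked file:
const demo = new Blob(["a".repeat(200 * 1024)]); // ~200 KB of data
let totalChars = 0;
processFileInChunks(demo, (text) => { totalChars += text.length; })
  .then(() => console.log("read", totalChars, "characters"));
```

Because each decoded string is dropped after `handleChunk` returns, peak memory stays around one chunk (plus decoder overhead) no matter how large the file is.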
Exactly what I'm looking for; unfortunately I don't know JavaScript that well, so if you could help with code, that would be helpful. Check this question I asked on StackOverflow, [How to read any local file by chunks using JavaScript?](stackoverflow.com/questions/50254537/…): "How can I read any large file (greater than 1 gigabyte) locally by chunks (2 KB or more), then convert the chunk to a string, process the string, then get the next chunk, and so on until the end of the file?"
– king amada
May 9 at 14:13

This link seems to have a couple of reasonable implementations: gist.github.com/alediaferia/cfb3a7503039f9278381
– Srdjan Grubor
May 9 at 14:14

Thanks once again, but where can I get the string from the code? I have been reading the code for some minutes, but it's confusing; as I have said, I'm really a beginner with JavaScript.
– king amada
May 9 at 14:40

The point of reading the file in chunks is precisely not to get it back as a full string, since that string will take the same amount of space in RAM as the file and will most likely crash your browser. If you do this, there is absolutely no point in chunk-reading the file. Beyond that, though, and to be rather blunt, I hope there wasn't an expectation for people to write the code for you here.
– Srdjan Grubor
May 9 at 14:45

Not expecting someone to write the code for me, but it would be helpful. What I want to do is get the chunk as a string, process the string and discard it, then get the next chunk and do that again, so the only space that will affect the RAM will be 64 KB, since that's the size of the chunk I will be reading and discarding after use.
– king amada
May 9 at 14:56
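For the process-and-discard pattern described in the last comment, another option - my addition, not something suggested in the thread - is `Blob.prototype.stream()`, which modern browsers expose; using `TextDecoder` with `stream: true` keeps multi-byte characters that straddle chunk boundaries intact. The name `streamFileAsText` is made up for illustration.

```javascript
// Stream a File/Blob instead of slicing it manually: the runtime
// hands over one chunk of bytes at a time, and each decoded string
// can be processed and dropped before the next chunk arrives.
async function streamFileAsText(file, handleText) {
  const decoder = new TextDecoder();
  const reader = file.stream().getReader();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    // stream: true buffers any multi-byte sequence split across
    // chunk boundaries instead of emitting replacement characters.
    handleText(decoder.decode(value, { stream: true }));
  }
  handleText(decoder.decode()); // flush any buffered trailing bytes
}

// Usage sketch (accumulating into a string here only because the
// demo input is tiny; a real handler would process and discard):
const sample = new Blob(["hello, streamed world"]);
let collected = "";
streamFileAsText(sample, (piece) => { collected += piece; })
  .then(() => console.log(collected));
```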
answered May 9 at 13:41 by Srdjan Grubor (26615 rep)