Read Local File

I'm trying to read any file locally and write out its binary content as a string. It loads and reads small files very quickly.

How can I modify it to read large files (greater than 1 GB) without crashing the browser or slowing down the system?



<html>
<head>
<title>Read File</title>
</head>
<body>
<input type="file" id="myFile">
<hr>
<textarea style="width: 500px; height: 400px" id="output"></textarea>
<script>
// Read the selected file with a FileReader and dump its contents into the textarea.
var input = document.getElementById("myFile");
var output = document.getElementById("output");
input.addEventListener("change", function () {
    if (this.files && this.files[0]) {
        var myFile = this.files[0];
        var reader = new FileReader();
        reader.addEventListener('load', function (e) {
            output.textContent = e.target.result;
        });
        reader.readAsBinaryString(myFile);
    }
});
</script>
</body>
</html>






asked May 9 at 10:57 by king amada

  • This question is better suited for StackOverflow, where you can already find an answer to a very similar question: stackoverflow.com/questions/25810051/…
    – Rene Saarsoo, May 9 at 12:01
  • @ReneSaarsoo There's nothing wrong with having this question here.
    – Simon Forsberg♦, May 9 at 13:19
  • Close-voters, please read: codereview.meta.stackexchange.com/q/5482/31562
    – Simon Forsberg♦, May 9 at 14:18

1 Answer
So any memory-limited reading of a large file pretty much involves looping over the file in "chunks" of some multiple of the standard memory page size (4 KB). In JavaScript, with I/O queuing, this would probably be done with some kind of loop over slice(), where each read might be 64 KB at a time. However, if you end up reading all of the file content into a single in-memory variable, you will crash your system regardless of how optimized the reading is.

Some pseudocode - untested, so there may be off-by-one errors:

const CHUNK_SIZE = 64 * 1024;
for (let chunkIndex = 0; chunkIndex * CHUNK_SIZE < file.size; chunkIndex++) {
    const offset = chunkIndex * CHUNK_SIZE;
    const chunk = file.slice(offset, offset + CHUNK_SIZE);
    // do something with the chunk here (but don't append it all to an in-memory variable)
}
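
A minimal sketch of how that loop might be wired up with the question's FileReader, assuming each chunk can be processed independently and then discarded; readChunks, processChunk, and onDone are illustrative names rather than any standard API, and readAsText assumes the file is meaningful as text:

const CHUNK_SIZE = 64 * 1024;

// Read `file` one slice at a time; each slice is handed to `processChunk`
// as a string and then dropped before the next slice is read.
function readChunks(file, processChunk, onDone) {
    let offset = 0;

    function readNext() {
        if (offset >= file.size) {
            onDone();
            return;
        }
        const reader = new FileReader();
        reader.addEventListener('load', function (e) {
            processChunk(e.target.result, offset); // work on this chunk only
            offset += CHUNK_SIZE;
            readNext();                            // then queue the next slice
        });
        reader.readAsText(file.slice(offset, offset + CHUNK_SIZE));
    }

    readNext();
}

// Example usage with the question's file input:
document.getElementById("myFile").addEventListener("change", function () {
    if (this.files && this.files[0]) {
        readChunks(this.files[0], function (text, offset) {
            console.log("chunk at byte " + offset + ": " + text.length + " characters");
        }, function () {
            console.log("finished reading file");
        });
    }
});

Because each FileReader result goes out of scope once processChunk returns, only one 64 KB chunk needs to be held in memory at a time.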






answered May 9 at 13:41 by Srdjan Grubor

  • Exactly what I'm looking for, unfortunately I don't know javascript that much, if you could help with code that would be helpful. Check this question I asked on StackOverflow [How to read any local file by chunks using JavaScript? ](stackoverflow.com/questions/50254537/…) "How can I read any large file(greater than 1 gigabytes) locally by chunks(2kb or more),and then convert the chunk to a string, process the string and then get the next chunk and so on until the end of the file?"
    – king amada, May 9 at 14:13
  • This link seems to have a couple of reasonable implementations: gist.github.com/alediaferia/cfb3a7503039f9278381
    – Srdjan Grubor, May 9 at 14:14
  • Thanks once again, but where can I get the string from the code, I have been reading the code for some minutes, but its confusing, like I have said I'm really a beginner with javascript?
    – king amada, May 9 at 14:40
  • The point of reading the file in chunks is precisely not to get it back as a full string since that string will take the same amount of space in RAM as the file and most likely crash your browser. If you do this, there is absolutely no point in chunk-reading the file. Beyond that though, and to be rather blunt, I hope there wasn't an expectation for people to write the code for you here.
    – Srdjan Grubor, May 9 at 14:45
  • Not expecting someone to write the code for me, but would be helpful. What I want to do is, get the chunk in string, process the string and discard it, then get the next chunk and do that again, so the only space that will affect the RAM will be 64kb since thats the amount of chunk I will be reading and discard after use.
    – king amada, May 9 at 14:56
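
The process-and-discard workflow described in the last comment could look roughly like the sketch below, assuming a browser that supports Blob.text() and async/await; countLines is a hypothetical stand-in for the real per-chunk processing, and handling of a line or multi-byte character that straddles a chunk boundary is deliberately omitted:

const CHUNK_SIZE = 64 * 1024;

// Sketch: read the file 64 KB at a time, turn each slice into a string,
// extract what is needed from it, and let it be garbage-collected before
// the next slice is read. Only one chunk is ever held in memory.
async function countLines(file) {
    let total = 0;
    for (let offset = 0; offset < file.size; offset += CHUNK_SIZE) {
        const text = await file.slice(offset, offset + CHUNK_SIZE).text();
        total += text.split("\n").length - 1;  // process the string...
    }                                          // ...then discard it and move on
    return total;
}

// Usage with the question's file input:
document.getElementById("myFile").addEventListener("change", async function () {
    if (this.files && this.files[0]) {
        console.log("lines: " + await countLines(this.files[0]));
    }
});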









