Arrange lines of a text file
This program orders the lines of a text file alphabetically and prints them out to a new text file.
Usage:
java Arranger input.txt output.txt
Is this the best approach, and in particular the most performant one?
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;
import java.util.ArrayList;

/*
 * This program arranges the lines of a text file in alphabetical order.
 * Usage: java Arranger input.txt output.txt
 */
public class Arranger {
    public static void main(String[] args) {
        File inputFile = new File(args[0]);
        File outputFile = new File(args[1]);
        ArrayList<String> words = new ArrayList<>();
        try (BufferedReader reader = new BufferedReader(new FileReader(inputFile))) {
            // read lines of the file and order them
            String line;
            while ((line = reader.readLine()) != null) {
                words.add(line);
            }
            words.sort(String::compareToIgnoreCase);
        } catch (IOException e) {
            e.printStackTrace();
        }
        // write the ordered list of lines to the output file
        try (BufferedWriter writer = new BufferedWriter(new FileWriter(outputFile))) {
            for (String word : words) {
                writer.write(word + "\n");
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
Tags: java, strings, file
edited Apr 27 at 17:12 by rolfl; asked Apr 27 at 17:05 by Dexter Thorn
4 Answers
Accepted answer (score 4)
The most significant issues in your code are:
- the handling of inputs.
- the use of helper functions from the standard libraries.
The algorithm you have chosen is fine, and both other answers recommend using streams, but I disagree (hence my answer).
So, your user inputs are not validated, and your code can throw more than just IOException, but also NoSuchElementException and so on if the file name arguments are not given on the command line. You need to validate these and throw appropriate exceptions.
Further, you just print stack traces for IOException, but the program exits with code 0 - a success condition. If you're just going to print the stack trace, it makes more sense to declare that the exception is thrown from the main method (and that will automagically print the trace and return with a non-zero code).
Other answers have pointed to the Files.lines(...) method, but I think you should consider Files.readAllLines instead. Note that the Files.lines(...) and Files.readAllLines(...) methods will both trim the whitespace at the end-of-line. This may be a problem.
Regardless, Files is a good class to know about.
Consider this code:
public static void main(String[] args) throws IOException {
    if (args.length < 2) {
        throw new IllegalArgumentException("Expect 2 file-name command-line arguments");
    }
    Path source = Paths.get(args[0]);
    Path target = Paths.get(args[1]);

    List<String> lines = Files.readAllLines(source);
    lines.sort(String::compareToIgnoreCase);
    Files.write(target, lines);
}
I prefer the read-the-whole-file concept to the stream concept. It makes it clear that there are memory requirements. Additionally, it makes the logic clear.
I would also consider a mechanism for handling raw lines without messing with the line-termination padding and characters. Your code strips the newline/carriage-return characters and replaces them with just newline characters. I would prefer to see each line's end-of-line sequence unaltered in the transform. Doing this requires a more careful consideration of which methods to use: none of the Files methods, nor the default BufferedReader.readLine() or Scanner methods, preserve the terminators. You have to override these things. Consider the code:
private static final Pattern EOL = Pattern.compile("$", Pattern.MULTILINE);

private static final List<String> getLines(Path source) throws IOException {
    try (Scanner scanner = new Scanner(Files.newBufferedReader(source))) {
        scanner.useDelimiter(EOL);
        List<String> lines = new ArrayList<>();
        while (scanner.hasNext()) {
            lines.add(scanner.next());
        }
        return lines;
    }
}

private static final void writeLines(Path target, List<String> lines) throws IOException {
    try (BufferedWriter writer = Files.newBufferedWriter(target)) {
        for (String line : lines) {
            writer.write(line);
        }
    }
}

public static void main(String[] args) throws IOException {
    if (args.length < 2) {
        throw new IllegalArgumentException("Expect 2 file-name command-line arguments");
    }
    Path source = Paths.get(args[0]);
    Path target = Paths.get(args[1]);

    List<String> lines = getLines(source);
    lines.sort(String::compareToIgnoreCase);
    writeLines(target, lines);
}
Note how the IOException is still thrown out of the main method, but also note that I am using try-with-resources blocks to ensure the files/streams are closed. The above code does not strip any line terminators, and writes the output with the same termination as the input.
What about the performance aspect? Is your solution more performant?
— Dexter Thorn, Apr 28 at 8:55
Where did you get the information that Files.lines(Path) and Files.readAllLines(Path) will remove whitespace from the end of the lines? I just tried it with space characters at the beginning and the end of a line, and the spaces were still there in the string. It would also be strange if the methods did that, because their documentation doesn't say anything about removing leading or trailing whitespace.
— Stingy, Apr 28 at 17:25
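On the performance question in the comments, the only honest answer is to measure. Below is a rough, single-run sketch, not a proper benchmark (a harness such as JMH would give more trustworthy numbers); the class name, workload size, and line contents are all made up for illustration:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;
import java.util.Random;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class SortTiming {
    // Sorts via Files.readAllLines, returning the sorted list.
    static List<String> viaReadAll(Path in) throws IOException {
        List<String> lines = Files.readAllLines(in);
        lines.sort(String::compareToIgnoreCase);
        return lines;
    }

    // Sorts via the Files.lines stream pipeline.
    static List<String> viaStream(Path in) throws IOException {
        try (Stream<String> s = Files.lines(in)) {
            return s.sorted(String::compareToIgnoreCase).collect(Collectors.toList());
        }
    }

    public static void main(String[] args) throws IOException {
        // Throwaway input: 100_000 pseudo-random lines (arbitrary workload).
        Path input = Files.createTempFile("lines", ".txt");
        Random rnd = new Random(42);
        List<String> data = new ArrayList<>();
        for (int i = 0; i < 100_000; i++) {
            data.add("line-" + rnd.nextInt(1_000_000));
        }
        Files.write(input, data);

        long t0 = System.nanoTime();
        List<String> a = viaReadAll(input);
        long readAllMs = (System.nanoTime() - t0) / 1_000_000;

        long t1 = System.nanoTime();
        List<String> b = viaStream(input);
        long streamMs = (System.nanoTime() - t1) / 1_000_000;

        System.out.println("readAllLines + sort : " + readAllMs + " ms");
        System.out.println("Files.lines + sorted: " + streamMs + " ms");
        System.out.println("identical output    : " + a.equals(b));
        Files.delete(input);
    }
}
```

Both approaches are O(n log n) in the sort, so any difference will come from I/O and stream overhead, which is exactly what a measurement like this can surface.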
Answer (score 4)
You can probably do something like this as well:
Files.lines(path)
It gets all the lines from the file as a stream; then you can sort the strings based on your logic, collect them in a list, and write them to the output file.
This answer gives you a functional-programming approach. Most of the method is self-explanatory.
Files.write(
    Paths.get(outFile),
    Files.lines(Paths.get(path))
        .sorted(String::compareToIgnoreCase)
        .collect(Collectors.toList())
);
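One caveat worth noting with this approach: Files.lines(...) keeps the underlying file open until the stream is closed, so it is safer to wrap the stream in try-with-resources. A sketch of a complete program along these lines (the class name and structure are my own, not from the answer):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class SortFile {
    public static void main(String[] args) throws IOException {
        Path in = Paths.get(args[0]);
        Path out = Paths.get(args[1]);
        List<String> sorted;
        // Close the stream explicitly so the input file handle is released
        // before we write the output.
        try (Stream<String> lines = Files.lines(in)) {
            sorted = lines.sorted(String::compareToIgnoreCase)
                          .collect(Collectors.toList());
        }
        Files.write(out, sorted);
    }
}
```

Collecting into a list before writing also avoids reading and writing the same path if the input and output files happen to be identical.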
Thanks @Jamal for editing my post. I am new here and was unaware about it.
— Joydeep Bhattacharya, Apr 29 at 4:36
Answer (score 3)
I can't think of a way with better time complexity; you have to know all the values before you can sort. Though with the Java 8 stream capability added to the BufferedReader class, you could implement it quite cleanly as follows:
public class Arranger {
    public static void main(String[] args) throws IOException {
        File inputFile = new File(args[0]);
        File outputFile = new File(args[1]);
        try (BufferedReader reader = new BufferedReader(new FileReader(inputFile));
             PrintWriter writer = new PrintWriter(outputFile)) {
            reader.lines()
                  .sorted(String::compareToIgnoreCase)
                  .forEachOrdered(writer::println);
        }
    }
}
This is a simple example. Purely performance wise I do not know the implications. Measure!
I believe it falls back to Arrays.sort, which means complexity-wise it'll be the same. You'll have a little extra overhead from the streams, but I'd say the readability is well worth it. Might be hard to remember using forEachOrdered if you're not used to streams yet, though.
— Imus, Apr 27 at 17:54
Oh no, a FileReader with the platform's default encoding. What about java.nio.file.Files.readAllLines? That one always uses UTF-8 and buffers automatically.
— Roland Illig, Apr 27 at 18:23
Hmm, it was only supposed to be a simple example... :P
— Koekje, Apr 27 at 23:42
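For what it's worth, the encoding concern raised above is easy to address while keeping the same streaming shape. An encoding-explicit variant (a sketch; the class name is my own, and it relies on Files.newBufferedReader/newBufferedWriter, which default to UTF-8, unlike FileReader, which uses the platform default charset):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.PrintWriter;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class ArrangerUtf8 {
    public static void main(String[] args) throws IOException {
        Path input = Paths.get(args[0]);
        Path output = Paths.get(args[1]);
        // Both reader and writer are buffered and explicitly UTF-8,
        // so the result does not depend on the platform charset.
        try (BufferedReader reader = Files.newBufferedReader(input, StandardCharsets.UTF_8);
             PrintWriter writer = new PrintWriter(
                     Files.newBufferedWriter(output, StandardCharsets.UTF_8))) {
            reader.lines()
                  .sorted(String::compareToIgnoreCase)
                  .forEachOrdered(writer::println);
        }
    }
}
```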
Answer (score 2)
Use the Java NIO library for faster file reads/writes:
import java.nio.*;
NIO is the newer I/O library from Oracle, which also supports non-blocking I/O for reading and writing.
Let's take a look at two useful methods of the java.nio.file.Files class:
readAllBytes(Path path): reads all the bytes from the file at the given path and returns a byte array containing them.
readAllLines(Path path, Charset cs): reads all lines from the file at the given path and returns a List containing the lines.
For your example, you can use it as follows (I have used Java 8 streams as well for readability):
List<String> lines = Files.readAllLines(Paths.get(args[0]))
        .stream()
        .sorted(String::compareToIgnoreCase)
        .collect(Collectors.toList());
Accepted answer: answered Apr 27 at 19:54 by rolfl
Second answer: edited Apr 28 at 3:17 by Jamal; answered Apr 27 at 18:38 by Joydeep Bhattacharya
Thanks @Jamal for editing my post. I am new here and was unaware about it.
â Joydeep Bhattacharya
Apr 29 at 4:36
add a comment |Â
Thanks @Jamal for editing my post. I am new here and was unaware about it.
â Joydeep Bhattacharya
Apr 29 at 4:36
Thanks @Jamal for editing my post. I am new here and was unaware about it.
â Joydeep Bhattacharya
Apr 29 at 4:36
Thanks @Jamal for editing my post. I am new here and was unaware about it.
â Joydeep Bhattacharya
Apr 29 at 4:36
add a comment |Â
up vote
3
down vote
I can't think of a way with better time complexity: you have to know all values before you can sort. Though with the Java 8 stream capability added to the BufferedReader class, you could implement it quite cleanly as follows:

import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.io.IOException;
import java.io.PrintWriter;

public class Arranger {
    public static void main(String[] args) throws IOException {
        File inputFile = new File(args[0]);
        File outputFile = new File(args[1]);
        try (BufferedReader reader = new BufferedReader(new FileReader(inputFile));
             PrintWriter writer = new PrintWriter(outputFile)) {
            reader.lines()
                  .sorted(String::compareToIgnoreCase)
                  .forEachOrdered(writer::println);
        }
    }
}

This is a simple example. Purely performance-wise I do not know the implications. Measure!
I believe it falls back to Arrays.sort, which means complexity-wise it'll be the same. You'll have a little extra overhead from the streams, but I'd say the readability is well worth it. Might be hard to remember using forEachOrdered if you're not used to streams yet, though. – Imus Apr 27 at 17:54
Oh no, a FileReader with the platform's default encoding. What about java.nio.file.Files.readAllLines? That one always uses UTF-8 and buffers automatically. – Roland Illig Apr 27 at 18:23
Hmm, it was only supposed to be a simple example... :P – Koekje Apr 27 at 23:42
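For readers who want to act on the encoding concern raised in the comments: Files.newBufferedReader defaults to UTF-8, unlike FileReader, which uses the platform charset. A sketch of that variant (the Utf8Arranger name and the arrange helper are illustrative, not from the answer):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.PrintWriter;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class Utf8Arranger {
    // Same stream pipeline as the answer, but with explicit UTF-8 handling.
    static void arrange(Path in, Path out) throws IOException {
        // Files.newBufferedReader defaults to UTF-8; FileReader uses the
        // platform charset, which varies between machines.
        try (BufferedReader reader = Files.newBufferedReader(in);
             PrintWriter writer = new PrintWriter(out.toFile(), "UTF-8")) {
            reader.lines()
                  .sorted(String::compareToIgnoreCase)
                  .forEachOrdered(writer::println);
        }
    }

    public static void main(String[] args) throws IOException {
        arrange(Paths.get(args[0]), Paths.get(args[1]));
    }
}
```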
answered Apr 27 at 17:36 by Koekje
up vote
2
down vote
Use the Java NIO library for faster file reads/writes:

java.nio.*;

NIO is the newer I/O library designed and developed by Oracle; its channel API supports non-blocking I/O, and its Files utility class provides convenient bulk file operations.

Let's take a look at two useful methods of the java.nio.file.Files class:

readAllBytes(Path path): reads all the bytes from the file at the given path and returns a byte array containing them.

readAllLines(Path path, Charset cs): reads all lines from the file at the given path and returns a List containing them.

For your example, you can use it as follows (I have used Java 8 streams as well for readability):

List<String> lines = Files.readAllLines(Paths.get(args[0])).stream()
        .sorted(String::compareToIgnoreCase)
        .collect(Collectors.toList());
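The one-liner above stops at the sorted list; a self-contained sketch completing the round trip with Files.write (the NioArranger name and the args[1] output path are assumptions mirroring the question's usage):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;
import java.util.stream.Collectors;

public class NioArranger {
    // Read every line, sort case-insensitively, and write the result out.
    static void arrange(Path in, Path out) throws IOException {
        List<String> sorted = Files.readAllLines(in).stream()
                .sorted(String::compareToIgnoreCase)
                .collect(Collectors.toList());
        Files.write(out, sorted);
    }

    public static void main(String[] args) throws IOException {
        arrange(Paths.get(args[0]), Paths.get(args[1]));
    }
}
```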
answered May 1 at 17:32 by srth12