Java.io.StreamTokenizer.eolIsSignificant() Method
Description
The java.io.StreamTokenizer.eolIsSignificant(boolean flag) method determines whether ends of line are treated as tokens. If the flag argument is true, this tokenizer treats ends of line as tokens; the nextToken method returns TT_EOL and also sets the ttype field to this value when an end of line is read.
A line is a sequence of characters terminated by either a carriage-return character ('\r') or a newline character ('\n'). In addition, a carriage-return character followed immediately by a newline character is treated as a single end-of-line token.
If the flag is false, end-of-line characters are treated as white space and serve only to separate tokens.
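As an illustration of the behaviour described above, here is a minimal sketch (the class name EolTokenDemo and the sample input are made up for this illustration) that tokenizes a string mixing '\r', '\n' and "\r\n" line endings; with eolIsSignificant(true), each of the three line endings should be reported as a single TT_EOL token.
import java.io.IOException;
import java.io.StreamTokenizer;
import java.io.StringReader;

public class EolTokenDemo {
   public static void main(String[] args) throws IOException {
      // the "\r\n" after "two" is reported as one TT_EOL token, not two
      StreamTokenizer st = new StreamTokenizer(new StringReader("one\rtwo\r\nthree\n"));
      st.eolIsSignificant(true);

      int token;
      while ((token = st.nextToken()) != StreamTokenizer.TT_EOF) {
         if (token == StreamTokenizer.TT_EOL) {
            // ttype has been set to TT_EOL as well
            System.out.println("<EOL>");
         } else if (token == StreamTokenizer.TT_WORD) {
            System.out.println("Word: " + st.sval);
         }
      }
   }
}
Running this sketch is expected to print "Word: one", "<EOL>", "Word: two", "<EOL>", "Word: three", "<EOL>".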
Declaration
Following is the declaration for the java.io.StreamTokenizer.eolIsSignificant() method.
public void eolIsSignificant(boolean flag)
Parameters
flag − true indicates that end-of-line characters are separate tokens; false indicates that end-of-line characters are white space.
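The effect of the flag can be seen by tokenizing the same two-line input twice, once with each value. The following sketch (the class name EolFlagContrast and the helper countEolTokens are hypothetical) is expected to count two TT_EOL tokens when the flag is true and none when it is false, since the line terminators are then consumed as white space.
import java.io.IOException;
import java.io.StreamTokenizer;
import java.io.StringReader;

public class EolFlagContrast {
   public static void main(String[] args) throws IOException {
      System.out.println("flag = true:  " + countEolTokens(true));   // expected: 2
      System.out.println("flag = false: " + countEolTokens(false));  // expected: 0
   }

   // Counts how many TT_EOL tokens the tokenizer reports for a two-line input.
   private static int countEolTokens(boolean flag) throws IOException {
      StreamTokenizer st = new StreamTokenizer(new StringReader("first line\nsecond line\n"));
      st.eolIsSignificant(flag);
      int eolCount = 0;
      int token;
      while ((token = st.nextToken()) != StreamTokenizer.TT_EOF) {
         if (token == StreamTokenizer.TT_EOL) {
            eolCount++;
         }
      }
      return eolCount;
   }
}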
Return Value
This method does not return a value.
Exception
NA
Example
The following example shows the usage of the java.io.StreamTokenizer.eolIsSignificant() method.
package com.tutorialspoint;

import java.io.*;

public class StreamTokenizerDemo {
   public static void main(String[] args) {
      String text = "Hello. This is a text \n that will be split "
         + "into tokens. 1 + 1 = 2";

      try {
         // create a new file with an ObjectOutputStream
         FileOutputStream out = new FileOutputStream("test.txt");
         ObjectOutputStream oout = new ObjectOutputStream(out);

         // write something in the file
         oout.writeUTF(text);
         oout.flush();

         // create an ObjectInputStream for the file we created before
         ObjectInputStream ois = new ObjectInputStream(new FileInputStream("test.txt"));

         // create a new tokenizer
         Reader r = new BufferedReader(new InputStreamReader(ois));
         StreamTokenizer st = new StreamTokenizer(r);

         // set that end of line is significant
         st.eolIsSignificant(true);

         // print the stream tokens
         boolean eof = false;
         do {
            int token = st.nextToken();
            switch (token) {
               case StreamTokenizer.TT_EOF:
                  System.out.println("End of File encountered.");
                  eof = true;
                  break;
               case StreamTokenizer.TT_EOL:
                  System.out.println("End of Line encountered.");
                  break;
               case StreamTokenizer.TT_WORD:
                  System.out.println("Word: " + st.sval);
                  break;
               case StreamTokenizer.TT_NUMBER:
                  System.out.println("Number: " + st.nval);
                  break;
               default:
                  System.out.println((char) token + " encountered.");
                  if (token == '!') {
                     eof = true;
                  }
            }
         } while (!eof);
      } catch (Exception ex) {
         ex.printStackTrace();
      }
   }
}
Let us compile and run the above program; this will produce the following result −
Word: Hello.
Word: This
Word: is
Word: a
Word: text
End of Line encountered.
Word: that
Word: will
Word: be
Word: split
Word: into
Word: tokens.
Number: 1.0
+ encountered.
Number: 1.0
= encountered.
Number: 2.0
End of File encountered.