I have this code:
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class test {
    public static void main(String[] args) throws IOException {
        BufferedReader in = new BufferedReader(new InputStreamReader(System.in));
        char x = (char) in.read();
        char y = (char) in.read();
        char z = (char) in.read();
        System.out.print(x + y + z);
    }
}
and this input:
1
2
and the output is:
109
Why do I get this output?
I can't understand how the read function works.
I tried using the skip function and didn't get the right answer either.
You are reading your input as characters. Your input is three characters (1, 2, and a line feed):
1 with an ASCII value of 49.
2 with an ASCII value of 50.
line feed with an ASCII value of 10.
When you add those three chars, each one is promoted to an int (its character code), so the sum is 49 + 50 + 10 = 109.
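A minimal sketch (no input needed) showing the same promotion: each char is widened to an int before + is applied, so the character codes are summed.

char x = '1';  // code 49
char y = '2';  // code 50
char z = '\n'; // code 10
System.out.print(x + y + z); // prints 109, not "12\n"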
The problem is that you've misunderstood how the character is being returned when read() is called.
The character read, as an integer in the range 0 to 65535 (0x00-0xffff), or -1 if the end of the stream has been reached
The read method returns an int so it can return the Unicode code for the character. For simple letters and numbers, Unicode overlaps ASCII, where 1 is 49, 2 is 50, and a newline character is 10. The sum of those codes is 109.
Options:
Use a Scanner and its nextInt method.
Use BufferedReader's readLine method and parse the strings to integers with Integer.parseInt.
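A minimal sketch of both options (the class names and the trim() calls are mine), assuming the same two lines of input:

import java.util.Scanner;

public class WithScanner {
    public static void main(String[] args) {
        Scanner sc = new Scanner(System.in);
        int a = sc.nextInt(); // nextInt skips the whitespace (including newlines) between tokens
        int b = sc.nextInt();
        System.out.println(a + b); // prints 3 for the input 1 and 2
    }
}

Or with BufferedReader:

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class WithBufferedReader {
    public static void main(String[] args) throws IOException {
        BufferedReader in = new BufferedReader(new InputStreamReader(System.in));
        int a = Integer.parseInt(in.readLine().trim()); // readLine consumes the line terminator
        int b = Integer.parseInt(in.readLine().trim());
        System.out.println(a + b); // prints 3 for the input 1 and 2
    }
}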
I am watching a tutorial on Udemy and the instructor says that we can store an integer value in a char variable. But when I try to print the value, nothing shows up.
I tried assigning the value of char one to an int variable and then printing the int variable. That works, but why can't I use the char to output the value?
public static void main(String[] args) {
    char one = 10;
    System.out.println(one);
}
If you look at the ASCII table, you will see that the value 10 represents the newline (line feed) character.
This can be demonstrated with the code below:
public static void main(String[] args) {
    char one = 10;
    // no newline is added by print, but println adds a newline implicitly
    System.out.print("Test");
    System.out.print(one);
    System.out.print("Test");
}
The output is:
Test
Test
Although I used System.out.print, a newline was still added to the output after the first Test, so something was actually printed.
Furthermore, when you pass a char to System.out.println(), the char is converted to its String representation by invoking String.valueOf(char), since char is a primitive.
For objects, when you pass a reference to System.out.println(), the object's toString() method is called to get its String representation.
If you change the value to char one = 65, you will see the letter A printed.
In Java, char is an integral type, so values can be converted back and forth between char and int.
When you print an int, you get a number; when you print a char, you get the corresponding character. char ch = 10 is not a printable character (it is a line feed).
char ch = 'A';
System.out.println(ch); // print 'A'
int code = ch;
System.out.println(code); // print 65 - ASCII code of 'A'
Adding to the above answers, if you want to output the int value from the variable "one", a cast would work:
char one = 10;
System.out.println((int) one);
If you take a look at the ASCII table, you can see that the value 10 is LF, which is a new line. If you print it on its own, it will appear to do nothing, because it just prints a new line.
However, if you modify the code a bit to print some actual characters on both sides of the LF char:
char c1 = 70;
System.out.print(c1);
char one = 10;
System.out.print(one);
char c2 = 71;
System.out.print(c2);
This will output:
F
G
F and G appear on separate lines because of the newline in between; without it they would have been printed on the same line.
Additionally, you can see on that table that 70 corresponds to F and 71 to G.
Note: Java does not technically use ASCII; a char is a UTF-16 code unit, and the output encoding depends on your environment (commonly UTF-8 or ISO-8859-1). However, those encodings are supersets of ASCII, so the characters are equivalent for the range of values the ASCII table contains. For example, char c1 = 202 will print Ê for me, which is not an ASCII value.
You are misinterpreting your output and drawing the wrong conclusion.
A char is a UTF-16 code unit. UTF-16 is a character encoding for the Unicode character set. UTF-16 encodes a Unicode codepoint with one or two UTF-16 code units. Typically, if it might be two code units, you'd use String or char[] instead of char. But if your codepoint is known to take only one UTF-16 code unit, you could use char.
The codepoint you are using is U+000A 'LINE FEED (LF)'. It takes one UTF-16 code unit, \u000a, which is convertible from the integer value 0xa or 10. If you inspect your output carefully, you'll "see" it. Perhaps adding output before and after would help.
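For example (a small sketch; the bracket markers are mine):

char lf = (char) 10; // U+000A, a single UTF-16 code unit
System.out.print("before[" + lf + "]after");
// output:
// before[
// ]after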
In each line of output there should be two columns:
The first column contains the String and is left justified using exactly 15 characters.
The second column contains the integer, expressed in exactly 3 digits; if the original input has fewer than three digits, you must pad your output's leading digits with zeroes.
Can someone explain System.out.printf("%-15s%03d%n", s1, x);?
import java.util.Scanner;

public class Solution {
    public static void main(String[] args) {
        Scanner sc = new Scanner(System.in);
        System.out.println("================================");
        for (int i = 0; i < 3; i++) {
            String s1 = sc.next();
            int x = sc.nextInt();
            System.out.printf("%-15s%03d%n", s1, x);
        }
        System.out.println("================================");
    }
}
Basically, every %... is going to be replaced by one of the arguments of printf. What comes after the % sign is a format specifier.
In %-15s:
- means: left-justified
15 means: if the result is less than 15 characters long, add spaces until it is 15 characters long
s means: convert the parameter into a string with toString and use the result
In %03d:
0 means: pad with 0s instead of spaces
3 means: make it at least 3 characters long
d means: the argument will be an integer number, format it as a base-10 number.
%n is the same as \n on *NIX or \r\n on Windows.
You will get more info here: https://docs.oracle.com/javase/7/docs/api/java/util/Formatter.html#syntax
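For a concrete example of what that format string produces (the sample values are mine):

System.out.printf("%-15s%03d%n", "java", 7);
// prints: java           007
// "java" is padded on the right with spaces to a width of 15,
// 7 is padded with leading zeros to 3 digits, and %n ends the line.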
EDIT based on remarks by AxelH and Andy Turner
It's Java formatter syntax.
First half, %-15s:
% says that what follows describes how one argument will be formatted.
s says you are formatting a string.
15 is the minimum width: the string is padded with spaces to 15 characters.
And finally, - means the string is justified to the left.
Second half, %03d:
d means the argument is an integer, formatted in base 10.
0 means it is padded with 0's where necessary.
3 means it is padded to at least 3 digits.
%n is the platform line separator (System.lineSeparator()); it outputs a new line. It does the same as \n, but %n is portable across platforms (credit @AxelH).
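A small sketch of that difference:

System.out.printf("a%nb%n"); // %n becomes System.lineSeparator(): "\r\n" on Windows, "\n" on *NIX
System.out.printf("a\nb\n"); // \n is always a bare line feed, regardless of platform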
Whenever you use charAt() on a string containing digits, it returns 48 + the digit stored at that index. Why exactly?
Ex:
import java.util.*;

public class otherApples {
    public static void main(String[] args) {
        Scanner scan = new Scanner(System.in);
        String neuwt = scan.nextLine();
        int i = neuwt.charAt(2);
        System.out.println(i);
    }
}
input: 523
output: 51
Because the character '3' has the ASCII character code 51.
If i were a char you would get the 3 you are expecting.
Character '0' has ASCII code 48. '1' is 49 and so on.
In other words: '0' == 48. What you are seeing is correct; you are just looking at the ASCII codes and not the actual characters those codes represent.
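If you want the digit itself rather than its character code, subtract '0' or use Character.getNumericValue (a small sketch reusing the variable name from the question):

String neuwt = "523";
int code = neuwt.charAt(2);                             // 51, the character code of '3'
int digit = neuwt.charAt(2) - '0';                      // 3
int also = Character.getNumericValue(neuwt.charAt(2));  // 3
System.out.println(code + " " + digit + " " + also);    // prints 51 3 3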
I am trying to get a char from an int value > 0xFFFF, but I always get back the same char value, which, when cast back to an int, prints 65535 (0xFFFF).
I can't understand how symbols for Unicode code points above 0xFFFF are produced.
int hex = 0x10FFFF;
char c = (char)hex;
System.out.println((int)c);
I expected the output to be 0x10FFFF. Instead, the output comes back as 65535.
This is because, while an int is 4 bytes, a char is only 2 bytes. Thus, you can't represent all values in a char that you can in an int. Using a standard unsigned integer representation, you can only represent the range of values from 0 to 2^16 - 1 == 65535 in a 2-byte value, so if you convert any number outside that range to a 2-byte value and back, you'll lose data.
int is 4 bytes; char is 2 bytes.
Your number is well within the range an int can hold, but not within the range a char can.
So when you cast that number to a char, the upper bits were discarded; only the low 16 bits (0xFFFF, i.e. 65535) were kept, which is what was printed.
Your number is too big for a char, which is 2 bytes, but small enough to fit in an int, which is 4 bytes. 65535 is the largest value that fits in a char, and the cast keeps only the low 16 bits of 0x10FFFF, which happen to be 0xFFFF, so that is the value you got. If a char were big enough to hold your number, converting it back to an int would have given the decimal value of 0x10FFFF, which is 1114111.
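To see that the cast keeps only the low 16 bits rather than clamping to the maximum, try a value whose low 16 bits are not all ones (a small sketch):

System.out.println((int) (char) 0x10FFFF); // 65535 (0xFFFF, the low 16 bits)
System.out.println((int) (char) 0x12345);  // 9029 (0x2345), not 65535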
Unfortunately, I think you were expecting a Java char to be the same thing as a Unicode code point. They are not the same thing.
The Java char, as already expressed by other answers, can only support code points that can be represented in 16 bits, whereas Unicode needs 21 bits to support all code points.
In other words, a Java char on its own, only supports Basic Multilingual Plane characters (code points <= 0xFFFF). In Java, if you want to represent a Unicode code point that is in one of the extended planes (code points > 0xFFFF), then you need surrogate characters, or a pair of characters to do that. This is how UTF-16 works. And, internally, this is how Java strings work as well. Just for fun, run the following snippet to see how a single Unicode code point is actually represented by 2 characters if the code point is > 0xFFFF:
// Printing string length for a string with
// a single unicode code point: 0x22BED.
System.out.println("𢯭".length()); // prints 2, because it uses a surrogate pair.
If you want to safely convert an int value that represents a Unicode code point to a char (or chars to be more exact), and then convert it back to an int code point, you will have to use code like this:
public static void main(String[] args) {
    int hex = 0x10FFFF;
    System.out.println(Character.isSupplementaryCodePoint(hex)); // prints true because hex > 0xFFFF
    char[] surrogateChars = Character.toChars(hex);
    int codePointConvertedBack = Character.codePointAt(surrogateChars, 0);
    System.out.println(codePointConvertedBack); // prints 1114111
}
Alternatively, instead of manipulating char arrays, you can use a String, like this:
public static void main(String[] args) {
    int hex = 0x10FFFF;
    System.out.println(Character.isSupplementaryCodePoint(hex)); // prints true because hex > 0xFFFF
    String s = new String(new int[] {hex}, 0, 1);
    int codePointConvertedBack = s.codePointAt(0);
    System.out.println(codePointConvertedBack); // prints 1114111
}
For further reading: Java Character Class
This block of code throws a NumberFormatException when n is 600000:
import java.util.*;
class SpoTwo {
    public static void main(String[] args) {
        Scanner sc = new Scanner(System.in);
        int testcase, n, answer;
        long bin;
        String s;
        testcase = sc.nextInt();
        for (int i = 0; i < testcase; i++) {
            n = sc.nextInt();
            s = Integer.toBinaryString(n);
            bin = Integer.parseInt(s);
            answer = (int) Math.pow(2, bin * 2);
            System.out.println(answer % 1000000007);
        }
    }
}
Exception:
Exception in thread "main" java.lang.NumberFormatException: For input string: "10010010011111000000"
at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
at java.lang.Integer.parseInt(Integer.java:495)
at java.lang.Integer.parseInt(Integer.java:527)
at SpoTwo.main(SpoTwo.java:12)
The binary representation of 600000 is 10010010011111000000. Parsed as a base-10 number, that string is far too large for an int (it exceeds Integer.MAX_VALUE), so parseInt fails. It is a valid base-2 integer. Use
bin = Integer.parseInt(s, 2);
Here's the method's javadoc.
The one-argument parseInt overload you were using is documented as follows:
Parses the string argument as a signed decimal integer. The characters in the string must all be decimal digits, except that the first character may be an ASCII minus sign '-' ('\u002D') to indicate a negative value or an ASCII plus sign '+' ('\u002B') to indicate a positive value. The resulting integer value is returned, exactly as if the argument and the radix 10 were given as arguments to the parseInt(java.lang.String, int) method.
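A small sketch of the round trip with the radix argument:

int n = 600000;
String s = Integer.toBinaryString(n); // "10010010011111000000"
int back = Integer.parseInt(s, 2);    // parse as base 2
System.out.println(back);             // prints 600000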
You take the number, convert it to a binary string, and then convert that string back to an integer, which you then use as the exponent. An int only supports a certain number of digits, so it blows up. You can use parseInt and pass the radix (number base), or convert the value to a double instead.