Is java.util.Date using TimeZone? - java

I have two different computers, each with a different time zone.
On one computer I print System.currentTimeMillis(), and then run the following command on both computers:
System.out.println(new Date(123456)); --> 123456 stands for the number that came back from currentTimeMillis() on computer #1.
The second print (even though the value is hardcoded) produces different output on the two computers.
Why is that?

How about some pedantic detail.
java.util.Date is timezone-independent. Says so right in the javadoc.
You want something with respect to a particular timezone? That's java.util.Calendar.
The tricky part? When you print this stuff (with java.text.DateFormat or a subclass), that involves a Calendar (which involves a timezone). See DateFormat.setTimeZone().
It sure looks (haven't checked the implementation) like java.util.Date.toString() goes through a DateFormat. So even our (mostly) timezone-independent class gets messed up w/ timezones.
Want to get that timezone stuff out of our pure zoneless Date objects? There's Date.toGMTString() (deprecated). Or you can create your own SimpleDateFormat and use setTimeZone() to control which zone is used yourself.
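For example, a minimal sketch of that last suggestion (the pattern and zone here are just illustrative):

import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class DateZoneDemo {
    public static void main(String[] args) {
        Date d = new Date(123456L);        // the same millisecond value on every machine
        System.out.println(d);             // toString() renders it in the JVM's default zone

        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss z");
        fmt.setTimeZone(TimeZone.getTimeZone("GMT"));
        System.out.println(fmt.format(d)); // 1970-01-01 00:02:03 GMT, identical everywhere
    }
}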

why is that?
Because something like "Oct 4th 2009, 14:20" is meaningless without knowing the timezone it refers to - which you can most likely see right now, because that's my time as I write this, and it probably differs by several hours from your time even though it's the same moment in time.
Computer timestamps are usually measured in UTC (basically the timezone of Greenwich, England), and the time zone has to be taken into account when formatting them into something human readable.

Because that milliseconds number is the number of milliseconds since 1/1/1970 UTC. If you then render it in a different timezone, the displayed time will be different.
e.g. a given millisecond value may correspond to midday at Greenwich (UTC), but that same instant is a different local time in New York.
To confirm this, use SimpleDateFormat with a time zone in the output pattern, and/or change the timezone on the second computer to match the first.
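If you only want the two machines to agree without touching OS settings, forcing the JVM's default zone does the same thing; a minimal sketch (UTC chosen arbitrarily):

import java.util.Date;
import java.util.TimeZone;

public class SameOutputDemo {
    public static void main(String[] args) {
        // Force the default zone so Date.toString() renders identically on both computers.
        TimeZone.setDefault(TimeZone.getTimeZone("UTC"));
        System.out.println(new Date(123456L)); // Thu Jan 01 00:02:03 UTC 1970 on either machine
    }
}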

The javadoc explains this well:
System.currentTimeMillis()
Note that while the unit of time of the return value is a millisecond, the granularity of the value depends on the underlying operating system and may be larger. For example, many operating systems measure time in units of tens of milliseconds.

See https://docs.oracle.com/javase/7/docs/api/java/util/Date.html#toString().
Yes, Date.toString() uses timezones (the JVM's default one). It also prints the zone out (the three characters before the year).

Related

java.sql.Date and Time Timezone differs

private GregorianCalendar formatDate(Date dateStatus, Time timeStatus) {
    GregorianCalendar calendar = (GregorianCalendar) GregorianCalendar.getInstance(TimeZone.getTimeZone("UTC"));
    calendar.setTime(new Date(dateStatus.getTime() + timeStatus.getTime()));
    return calendar;
}
The above code returns the Calendar value in milliseconds, but I get a different value locally than when the test runs on Jenkins, which causes the test case to fail.
The local machine and the Jenkins server run in different time zones.
Jenkins Error:
Expected: 1554866100000
got: 1554903900000
How can I handle this?
java.sql.Date is an extremely unfortunate API design mess-up. That class extends java.util.Date, and that class is a lie. It does not represent a date at all (check the source code if you are understandably skeptical). It represents a moment in time, devoid of any timezone information, based on milliseconds since UTC new year's 1970. Anything you care to coerce from a Date object that isn't a very large number or a direct swap to a more appropriate type that doesn't lie (such as java.time.Instant) is therefore suspect: it is picking a timezone implicitly and mixing that in, in order to provide you your answer. This is why most of Date's methods, such as .getYear(), are marked deprecated: in the java core libs, a deprecation marker usually doesn't actually mean "this will be removed soon"; it actually means: "this is fundamentally broken and you should never call this method". (See: Thread.stop.)
Nevertheless, the JDBC API (what you use to talk to DBs) was built on top of this; java.sql.Date as well as java.sql.Timestamp extend java.util.Date and thereby inherit the mess-up. This means date handling in this fashion converts timestamps in the DB that have full timezone info into timezoneless constructs, after which the java side mixes a timezone back in - which is a bad way of doing things.
Unfortunately date/time is extremely complicated (see below) and databases have wildly different ways of storing it; usually multiple slightly different date/time types, such as 'TIMESTAMP', 'TIME WITH TIME ZONE', etcetera.
Because of this there is no unique advice that works regardless of context: The right answer depends on your JDBC driver version, DB engine, DB engine version, and needs. This means the best approach is generally to first understand the fundamentals so that you can adapt and figure out what would work best for your specific situation.
java.util.Calendar is even worse. Again a lie (it represents time. Not a calendar!), the API is extremely badly designed (it is very non-java-like). There is a reason this second try at date/time API resulted in yet another date/time library (java.time): It's bad. Don't use it.
So, let me try to explain the basics:
You like to wake up at 8 in the morning. It's noon and you check your phone. It says 'the next alarm will ring in 20 hours'. You now hop onto a Concorde at Schiphol Airport in Amsterdam, and fly west, to New York. The flight takes 3 hours. When you land, you check your phone. Should it say 'the next alarm will ring in 17 hours' (3 hours of flight time have passed), or should it say 'the next alarm will ring in 23 hours' (you actually land at 9 in the morning New York time, because it's 6 hours earlier there relative to Amsterdam, so it's 23 hours until 8 o'clock in the morning local time)? The right answer is presumably: 23 hours. This requires the concept of 'local time': a point in time stated in terms of years, months, days, hours, minutes, and seconds - with no timezone, and not even 'please assume a timezone for me', but the concept '... wherever you are now'.
Before you jumped on the plane, you made an appointment at a barber's in Amsterdam for when you return. You made it for March 8th, 2022, at 12:00. The moment you made the appointment, your phone read: "365 days, 0 hours, 0 minutes, and 0 seconds". If you fly to New York, that should now read "364 days, 21 hours, 0 minutes, and 0 seconds": the fact that you're currently in New York has no bearing on the situation whatsoever. You'd think that time-as-millis-since-UTC should thus suffice, but no it does not: imagine that the Netherlands abolishes daylight saving time (crazy idea? No, quite likely actually). The very instant that the gavel hits the desk and the law is adopted that the Netherlands will switch to summer time in 2021 and then stay there forever more, that very moment? Your phone's indicator for 'time until barber appointment' should instantly increment by 1 hour (or is it decrement?). Because that barber appointment isn't going to reschedule itself to be at 11:00 or 13:00.
During your flight, you snapped a pic of the tulip fields before the plane crossed the Atlantic. The timestamp of this picture is yet another concept of time: if, somehow, the Netherlands decides to apply the timezone change retroactively, then the timestamp in terms of 'this picture was taken on March 8th, 2021, 12:05, Amsterdam time' should instantly pop and now read 13:05 or 11:05 instead. In this way it is unlike the barber appointment.
Before this question can be answered, you need to figure out which of the 3 different concepts of time your situation boils down to.
Only the java.time package fully covers this all. Respectively:
Alarm clock times are represented by LocalDate, LocalTime, and LocalDateTime.
Appointment times are represented by ZonedDateTime.
When did I make the picture times are represented by Instant.
The java.sql.Date type is most equivalent to Instant, which is bizarre, because you'd think that this is more java.sql.Timestamp's bailiwick.
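In code, the three concepts map roughly like this (the zones and dates below are made-up examples):

import java.time.Instant;
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.ZonedDateTime;

public class ThreeKindsOfTime {
    public static void main(String[] args) {
        // Alarm clock: "8 in the morning, wherever you happen to be" - no zone at all.
        LocalDateTime alarm = LocalDateTime.of(2022, 3, 8, 8, 0);

        // Barber appointment: wall-clock time pinned to a zone, so it follows rule changes.
        ZonedDateTime appointment =
                ZonedDateTime.of(2022, 3, 8, 12, 0, 0, 0, ZoneId.of("Europe/Amsterdam"));

        // Photo timestamp: one exact moment on the universal timeline.
        Instant photoTaken = Instant.now();

        System.out.println(alarm + " / " + appointment + " / " + photoTaken);
    }
}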
Pragmatics, how to get the job done
Your first stop is to fully understand your DB engine's data types. For example, in postgres:
concept | java.time type | postgres type
alarm clocks | java.time.LocalTime | time without time zone
- | java.time.LocalDate | date
- | java.time.LocalDateTime | timestamp without time zone
appointments | java.time.ZonedDateTime | timestamp with time zone
when I took the picture | java.time.Instant | no appropriate type available
Kind of silly that the implementation of java.sql.Date and java.sql.Timestamp best matches the very concept postgres cannot express, huh.
Once you have ensured that your DB is using the right type for your concept, the next thing to check is whether your JDBC driver and other infrastructure are up to date, so you can use the right types java-side (from the java.time package): use these types (LocalDate, Instant, ZonedDateTime, etc) throughout your infrastructure and, if making raw JDBC calls, call .setObject(x, instanceOfZDT) instead of .setDate when setting, and .getObject(col, LocalDateTime.class) to fetch; this works on many JDBC drivers.
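A rough sketch of what that looks like with raw JDBC, assuming a JDBC 4.2 driver with java.time support (the table and column names are invented, and many drivers want OffsetDateTime rather than ZonedDateTime for TIMESTAMP WITH TIME ZONE):

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.time.OffsetDateTime;

class AppointmentDao {
    // Hypothetical table: appointments(id BIGINT, starts_at TIMESTAMP WITH TIME ZONE)
    void insertAndRead(Connection connection) throws SQLException {
        try (PreparedStatement ps = connection.prepareStatement(
                "INSERT INTO appointments (id, starts_at) VALUES (?, ?)")) {
            ps.setLong(1, 42L);
            ps.setObject(2, OffsetDateTime.now()); // a java.time value goes straight in
            ps.executeUpdate();
        }

        try (PreparedStatement ps = connection.prepareStatement(
                "SELECT starts_at FROM appointments WHERE id = ?")) {
            ps.setLong(1, 42L);
            try (ResultSet rs = ps.executeQuery()) {
                if (rs.next()) {
                    // ...and comes back out as a java.time value, no java.sql.Timestamp in sight.
                    OffsetDateTime startsAt = rs.getObject("starts_at", OffsetDateTime.class);
                    System.out.println(startsAt);
                }
            }
        }
    }
}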
If that doesn't work, you need to work around the issues, because the process is now: the DB stores e.g. the year, month, day, hour, minute, second, and a complete timezone description (not 'EST' - there are way too many zones to cover with 3 letters - but something like 'Europe/Amsterdam'), converts this into millis-since-epoch to ship it over JDBC, and then your java code injects a timezone again, and thus you're going to run into issues. The best bet is to convert IMMEDIATELY, so that you hold the old unwieldy type (java.sql.Date or java.sql.Timestamp) for as short a time as possible, and can test right at the source that you're 'undoing the damage' done when your ZDT/LDT value arrives as the Instant-esque java.sql type.
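If you are stuck receiving the legacy types, doing that conversion at the edge might look like this (a sketch; the column names are invented):

import java.sql.ResultSet;
import java.sql.SQLException;
import java.time.Instant;
import java.time.LocalDate;

class LegacyConversion {
    // Undo the damage immediately: turn the legacy java.sql types into java.time types
    // right where they arrive, so the rest of the code never sees them.
    static Instant readCreatedAt(ResultSet rs) throws SQLException {
        return rs.getTimestamp("created_at").toInstant();  // java.sql.Timestamp -> Instant
    }

    static LocalDate readBirthDate(ResultSet rs) throws SQLException {
        return rs.getDate("birth_date").toLocalDate();     // java.sql.Date -> LocalDate
    }
}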

Why is UTC (which is not a time zone) considered a time zone in Java (and not only there)?

Given that UTC is not a time zone, but a time standard (as stated, for example, here), why can I use UTC in my Java application as if it were a time zone (see the code snippet below)?
SimpleDateFormat format = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
format.setTimeZone(TimeZone.getTimeZone("UTC"));
If UTC is not a time zone, why is TimeZone.getTimeZone("UTC") able to return the time zone object?
By the way, on my Windows machine UTC is in the list of time zones also (see the screenshot).
Is the statement "UTC is not a time zone" in reality wrong?
Because it makes life much, much simpler to regard UTC as a time zone than to treat it as something else, basically.
It's one of those "Yeah, strictly speaking it's not" scenarios. For everything except "Which region of the world is this observed?" you can think of UTC as a time zone and it works fine. So it's simpler to bend it slightly out of shape than to have a whole separate concept.
If you view a time zone as a mapping from "instant in time" to "UTC offset" (or equivalent, from "instant in time" to "locally observed time") then UTC is fine to think of as a time zone - and that's most of what we do within software.
If you view a time zone as a geographical region along with that mapping, then no, it doesn't work as well - but that's more rarely useful in software. (And you can always fake it by saying it's an empty region :)
Is the statement "UTC is not a time zone" in reality wrong?
Technically and strictly speaking, the statement is not wrong. UTC is a standard, not a timezone (as you already linked).
A timezone corresponds to some region in the world and has lots of different rules regarding that region:
What's the UTC offset (the difference from UTC) when it's in Daylight Saving time and when it's not
When DST starts and ends
All the changes in offsets and DST this region had during its history
Example: in 1985, the Brazilian state of Acre had standard offset UTC-05:00 (and UTC-04:00 during DST); then in 1988 it was on UTC-05:00 without DST; then in 2008 the standard changed to UTC-04:00 (and no DST); and since 2013 it's back to UTC-05:00 and no DST.
While the timezone keeps track of all of these changes, UTC has no such rules. You can think of UTC in many different ways:
a "base" date/time, from where everybody else is relative to - this difference from UTC is called "offset". Today, São Paulo is in UTC-03:00 (the offset is minus 3 hours, or 3 hours behind UTC) while Tokyo is in UTC+09:00 (offset of plus 9 hours, or 9 hours ahead UTC).
a "special" timezone that never varies. It's always in the same offset (zero), it never changes, and never has DST shifts.
As the "offset of UTC" (not sure how technically accurate is this term) is always zero, it's common to write is as UTC+00:00 or just Z.
Another difference between UTC and timezone is that a timezone is defined by governments and laws and can change anytime/anywhere. All the changes in Acre described above were defined by politicians, for whatever reasons they thought at that time. (So, even if a region today follows UTC in their timezone, there's no guarantee that it'll stay the same in the future, and that's why even those regions have their own timezones, even if they look redundant).
But no matter how many times politicians change their regions' offsets, those offsets must be expressed relative to UTC (until a new standard comes up, of course).
Now, when you see implementations like TimeZone.getTimeZone("UTC"), you can think of it in 2 different ways:
a design flaw, because it's mixing 2 different concepts and leading people to think they're the same thing
a shortcut/simplification/nice-trick/workaround that makes things easier (as @Jon Skeet explained in his answer).
For me, it's a mix of both (fifty/fifty).
The new java.time API, though, separates the concepts into 2 classes: ZoneRegion and ZoneOffset (actually both are subclasses of ZoneId, but ZoneRegion is not public, so in practice we use ZoneId and ZoneOffset):
if you use a ZoneId with IANA timezone names (always in the format Continent/City, like America/Sao_Paulo or Europe/Berlin), it will create a ZoneRegion object - a "real" timezone with all the DST rules and offsets throughout its history. So you can get different offsets depending on the dates you're working with in this ZoneId.
if you use a ZoneOffset (with Z, UTC, +03:00 and so on), it will return just an object that represents an offset: the difference from UTC, but without any DST rules. No matter what dates you use this object with, it'll always have the same difference from UTC.
So, the ZoneId (actually ZoneRegion) is consistent with the idea that offsets in some region (in some timezone) change over time (due to DST rules, politicians changing things because whatever, etc). And ZoneOffset represents the idea of difference from UTC, that has no DST rules and never changes.
And there's the special constant ZoneOffset.UTC, which represents a zero difference from UTC (which is UTC itself). Note that the new API took a different approach: instead of saying that everything is a timezone and UTC is a special kind of it, it says that UTC is a ZoneOffset whose offset value is zero.
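A small demonstration of the difference (the zone and dates are chosen arbitrarily):

import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.ZoneOffset;

public class ZoneVsOffset {
    public static void main(String[] args) {
        // A real region: its offset depends on the date, because of DST rules.
        ZoneId amsterdam = ZoneId.of("Europe/Amsterdam");
        System.out.println(amsterdam.getRules().getOffset(LocalDateTime.of(2021, 1, 15, 12, 0))); // +01:00
        System.out.println(amsterdam.getRules().getOffset(LocalDateTime.of(2021, 7, 15, 12, 0))); // +02:00

        // An offset is just a fixed difference from UTC - no rules, never changes.
        System.out.println(ZoneOffset.of("+03:00")); // +03:00
        System.out.println(ZoneOffset.UTC);          // Z (zero offset, i.e. UTC itself)
    }
}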
You can still think it's a "wrong" design decision or a simplification that makes things easier (or a mix of both). IMO, this decision was a great improvement comparing to the old java.util.TimeZone, because it makes clear that UTC is not a timezone (in the sense that it has no DST rules and never changes), it's just a zero difference from the UTC standard (a very technical way of saying "it's UTC").
And it also separates the concepts of timezone and offset (which are not the same thing, although very related to each other). I see the fact that it defines UTC as a special offset as an "implementation detail". Creating another class just to handle UTC would be redundant and confusing, and keeping it as a ZoneOffset was a good decision that simplified things and didn't clutter the API (for me, a fair tradeoff).
I believe that many other systems decided to take similar approaches (which explains why Windows has UTC in its time zone list).

Printing milliseconds of Java epoch returns 18000000?

In Joda, if I print
System.out.println(new DateTime(GregorianChronology.getInstance())
        .withYear(1970)
        .withMonthOfYear(1)
        .withDayOfMonth(1)
        .withHourOfDay(0)
        .withMinuteOfHour(0)
        .withSecondOfMinute(0)
        .withMillisOfSecond(0)
        .getMillis());
I see 18000000 (which happens to be 5 hours' worth of milliseconds, FWIW).
What I don't understand is that if the milliseconds represents the offset from the epoch which is defined as Jan-1970-01-01, then shouldn't the milliseconds be 0?
The epoch is Jan-1970-01-01 GMT. The instance you have obviously has a different DateTimeZone. In fact it looks like you're at GMT-5: 18000000 millis = 5 hours, so your local midnight fell at 05:00 GMT.
I believe the issue is related to the way Java dates include the time zone as part of their calculations.
For me, this means I'm +10 hours ahead of the epoch.
Try creating a Date/Time value that is set to 0 GMT.
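For example, with Joda-Time (a minimal sketch; the second zone is just an example of a UTC-5 region):

import org.joda.time.DateTime;
import org.joda.time.DateTimeZone;

public class EpochDemo {
    public static void main(String[] args) {
        // Midnight 1970-01-01 in UTC really is the epoch...
        System.out.println(new DateTime(1970, 1, 1, 0, 0, 0, 0, DateTimeZone.UTC).getMillis()); // 0

        // ...while the same wall-clock time in a UTC-5 zone is a different instant.
        System.out.println(new DateTime(1970, 1, 1, 0, 0, 0, 0,
                DateTimeZone.forID("America/New_York")).getMillis()); // 18000000
    }
}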
The "epoch" is a specific and universal instant, a point in the universe time (like, say the moment in which the Apollo XI landed on the moon). This reference point can be represented differently in different countries (and a martian could also represent it with his own calendar). For example, for the people in England (GMT), that's the moment in which the hands of their clocks marked "00:00:00" and their (Gregorian) calendars marked "1/1/1970"; but that's just an example.
The line
new DateTime(GregorianChronology.getInstance()).withYear(1970).withMonthOfYear(1)
    .withDayOfMonth(1).withHourOfDay(0).withMinuteOfHour(0)
    .withSecondOfMinute(0).withMillisOfSecond(0)
gives you the instant in which the clocks and the calendars in your country marked "00:00:00 1970-01-01". That's, in general, a different instant.

What is the Objective-C equivalent of the Java TimeZone?

What is the Objective-C equivalent of the Java TimeZone class?
The NSTimeZone class is the equivalent of the Java TimeZone class.
NSTimeZone, I believe. Can't say I've ever done any Objective-C myself, but it looks right...
Apple also has a (pretty short) article on using it.
It's quite likely that they won't be direct equivalents in every respect, of course... but if there's something you would use with Java's TimeZone which you can't figure out in NSTimeZone, ask about that specific call... and someone else can help you, I'm sure :)
EDIT: The purpose of a time zone class is to convert between local times in different time zones. For example, right now, it's 7.50pm for me - but it's 12.50pm for the person who I'm about to have a Skype call with. One option for representing dates and times is to always store them in UTC (which is sort of the "zero" of time zones) and then convert the UTC value into the "local" time for the user, e.g. for display purposes. That's not always the right option, but it's usually a good starting point.
At other times, you may have a local time and know person X's time zone - and want to convert it to person Y's time zone. It's usually easiest to do that by converting the local time to UTC (using X's time zone) and then to convert it back to local time using Y's time zone.
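In Java terms (since that's where the question starts from), that conversion is just re-expressing the same instant in another zone; the zones and times below are only examples:

import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.ZonedDateTime;

public class CrossZoneCall {
    public static void main(String[] args) {
        // Person X's local time, in X's zone...
        ZonedDateTime london = LocalDateTime.of(2023, 6, 1, 19, 50)
                .atZone(ZoneId.of("Europe/London"));

        // ...re-expressed as the same instant in person Y's zone.
        ZonedDateTime losAngeles = london.withZoneSameInstant(ZoneId.of("America/Los_Angeles"));

        System.out.println(london);     // 2023-06-01T19:50+01:00[Europe/London]
        System.out.println(losAngeles); // 2023-06-01T11:50-07:00[America/Los_Angeles]
    }
}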
Time zones aren't nearly as straightforward as you might expect - mostly due to daylight savings. Oddities:
Local times which either don't exist, or occur twice, due to DST transitions
Time zones which change to DST at midnight, so that midnight doesn't always exist
Governments deciding to scrap (or introduce) DST at almost no notice
DST which isn't the normal "move an hour forwards". IIRC, Tibet was considering introducing DST of 1:15.
Historical changes to time zones
The list goes on.
NSTimeZone : http://developer.apple.com/library/mac/#documentation/Cocoa/Reference/Foundation/Classes/NSTimeZone_Class/Reference/Reference.html
Regards.

What system default date format to use?

I'm setting the standards for our application.
I've been wondering, what default date format should I choose to use ?
It should be:
Internationalization & timezone aware, the format should be able to represent user local time
Can be efficiently parsed by SimpleDateFormat (or alike, JDK classes only)
Programming Language agnostic (can parse in java, python, god forbid C++ :) and co.)
Preferably ISO based or other accepted standard
Easy to communicate over HTTP (Should such need arises, JSON or YAML or something in this nature)
Can represent time down to seconds resolution (the more precise the better, microseconds if possible).
Human readable is a plus but not required
Compact is a plus but not required
Thank you,
Maxim.
yyyy-MM-dd'T'HH:mmZ (see ISO 8601). You can add seconds, etc.
You can read it easily, and it will not be a problem for SimpleDateFormat.
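With SimpleDateFormat that could look like this (note the quoted 'T' and the 24-hour 'HH'; the UTC zone is just one sensible choice):

import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class IsoFormatDemo {
    public static void main(String[] args) throws ParseException {
        SimpleDateFormat iso = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ssZ");
        iso.setTimeZone(TimeZone.getTimeZone("UTC"));

        String text = iso.format(new Date()); // e.g. 2010-09-08T14:30:00+0000
        Date parsedBack = iso.parse(text);    // round-trips without losing the instant

        System.out.println(text);
        System.out.println(parsedBack.getTime());
    }
}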
The most canonical and standard form is probably "Unix Time": The number of seconds elapsed since midnight Coordinated Universal Time (UTC) of January 1, 1970.
If you set that as the default time-format you can easily parse it, store it in memory, write it to disk, easily communicate it over HTTP and so on. It is also definitely an accepted standard, and in a sense it is "time-zone aware", since it is well-defined regardless of time-zones.
(This is the format in which I always store all my time stamps; in databases, in memory, on disk, ...)
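A minimal sketch of producing and consuming such a timestamp with plain JDK classes:

import java.util.Date;

public class UnixTimeDemo {
    public static void main(String[] args) {
        // Produce: seconds since 1970-01-01T00:00:00 UTC, independent of the local zone.
        long unixTime = System.currentTimeMillis() / 1000L;
        System.out.println(unixTime);

        // Consume: turn it back into a Date (and format it for whatever zone the user is in).
        System.out.println(new Date(unixTime * 1000L));
    }
}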
The "right" default format really depends on what you're doing with it. The formats for parsing, storing, and displaying can all be different.
For storing the date you're (almost) always going to want to use UTC, as aioobe says, even when you want to display it in the user's local time. I say "(almost)", but I really can't think of a case where I would not want UTC for a saved date. You may also want to store the TZ information for where the date originated, so you can report it in that local time, but more often you want to display the local time for whoever is currently looking at the date. That means having a way to determine the current user's local time regardless of what the original local time was.
For displaying it, the "default format" should usually be determined by the viewer's locale. 08/09/10 usually means 2010-Aug-9 in the U.S. ("middle-endian") but normally means 2010-Sep-8 in most of the rest of the world ("little-endian"). The ISO-8601 format "2010-09-10" is safe and unambiguous, but often not what people expect to see. You can also look over RFC-3339 for date and time on the internet, and RFC-2822 for message format (transmitting the date).
For parsing a date, you'll want to parse it and convert it to UTC, but you should be fairly flexible about what you accept. Again, the end user's locale and timezone, if discoverable, can help you determine what format(s) of string to accept as input. This is assuming user-typed strings. If you're generating a date/time stamp yourself, you can control the form and parsing will be no problem.
I also second BalusC's link, which I hadn't seen before and have now favorited.
