I get this field from Kafka; it uses an Avro schema:
long dateTime = 1499070300000L; // value is just an example, but it has to be in micros
It has logicalType: "logical-timestamp-micros"
And I have this kind of DTO:
@JsonFormat(pattern = "dd.MM.yy HH:mm:ss")
private LocalDateTime time;
And I am trying to convert the dateTime field to match the format of the time field:
long dateTime = 1499070300000L;
DateTimeFormatter dateTimeFormatter = DateTimeFormatter.ofPattern("dd.MM.yy HH:mm:ss");
LocalDateTime localDateTime = LocalDateTime.parse(Instant.ofEpochMilli(dateTime)
.atZone(ZoneId.systemDefault())
.format(dateTimeFormatter));
But I am getting this error:
Exception in thread "main" java.time.format.DateTimeParseException: Text '03.07.17 11:25:00' could not be parsed at index 0
What am I doing wrong?
Don't convert your ZonedDateTime to a String and back to a LocalDateTime.
Instead, use what the API provides. In your case, LocalDateTime.ofInstant seems to be just what you need:
LocalDateTime localDateTime = LocalDateTime.ofInstant(Instant.ofEpochMilli(dateTime), ZoneId.systemDefault());
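For reference, a minimal self-contained sketch of that approach (the class name is just for illustration; the value is the one from the question, treated as milliseconds, and the printed result depends on your system default zone):

import java.time.Instant;
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.format.DateTimeFormatter;

public class TimestampDemo {
    public static void main(String[] args) {
        long dateTime = 1499070300000L; // epoch milliseconds, as in the question

        // Build the LocalDateTime directly from the Instant -- no String round trip
        LocalDateTime localDateTime =
                LocalDateTime.ofInstant(Instant.ofEpochMilli(dateTime), ZoneId.systemDefault());

        // Format only when rendering as text, e.g. for the DTO's @JsonFormat pattern
        DateTimeFormatter formatter = DateTimeFormatter.ofPattern("dd.MM.yy HH:mm:ss");
        System.out.println(localDateTime.format(formatter)); // e.g. 03.07.17 11:25:00
    }
}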
Aside from that, you mentioned that the timestamp has the logical type logical-timestamp-micros. Instant.ofEpochMilli requires milliseconds. If your long is actually the time in microseconds, you might want to divide it by 1000 before passing it to Instant.ofEpochMilli. Alternatively, the TimeUnit class can do that conversion for you, as pointed out in the comments by @Arvind Kumar Avinash:
TimeUnit.MILLISECONDS.convert(microTime, TimeUnit.MICROSECONDS)
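Here is a hedged sketch of that micros-to-millis path (variable names and the microsecond value are illustrative assumptions, not from the question):

import java.time.Instant;
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.util.concurrent.TimeUnit;

public class MicrosToMillisDemo {
    public static void main(String[] args) {
        long timestampMicros = 1_499_070_300_000_000L; // assumed: microseconds since the epoch

        // TimeUnit converts from the source unit (micros) to the target unit (millis);
        // sub-millisecond precision is dropped here
        long millis = TimeUnit.MILLISECONDS.convert(timestampMicros, TimeUnit.MICROSECONDS);
        // equivalent to: long millis = timestampMicros / 1000;

        LocalDateTime localDateTime =
                LocalDateTime.ofInstant(Instant.ofEpochMilli(millis), ZoneId.systemDefault());
        System.out.println(localDateTime);
    }
}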
If you care about preserving microseconds, you can use Instant.EPOCH.plus(timeMicros, ChronoUnit.MICROS) to obtain the Instant instead of Instant.ofEpochMilli, as suggested in the comments by Ole V. V. (the comment has been deleted). See this answer for details on that.
Both methods assume the timestamp is in microseconds since January 1st, 1970 (the UNIX epoch).
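And a minimal sketch of the microsecond-preserving variant, under the same epoch assumption (the value and class name are illustrative):

import java.time.Instant;
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.temporal.ChronoUnit;

public class MicrosPreservingDemo {
    public static void main(String[] args) {
        long timeMicros = 1_499_070_300_123_456L; // assumed: microseconds since the epoch

        // Instant supports ChronoUnit.MICROS, so the sub-millisecond part survives
        Instant instant = Instant.EPOCH.plus(timeMicros, ChronoUnit.MICROS);

        LocalDateTime localDateTime = LocalDateTime.ofInstant(instant, ZoneId.systemDefault());
        System.out.println(localDateTime); // keeps microsecond precision, e.g. ...:00.123456
    }
}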