python, apache-flink, parquet

Apache Flink Python Datastream API sink to Parquet


I have a Kafka topic that contains JSON messages. Using the Flink Python API, I am trying to process these messages and store them as Parquet files in GCS.

Here is a cleaned-up code snippet:

import json
from datetime import datetime
from pyflink.common import Row
from pyflink.datastream import MapFunction

class Extract(MapFunction):
    def map(self, value):
        record = json.loads(value)
        # Convert the ISO-8601 timestamp string into a datetime object
        dt_object = datetime.strptime(record['ts'], "%Y-%m-%dT%H:%M:%SZ")
        return Row(dt_object, record['event_id'])

<...>

events_schema = DataTypes.ROW([
    DataTypes.FIELD("ts", DataTypes.TIMESTAMP()),
    DataTypes.FIELD("event_id", DataTypes.STRING())
])
<...>

# Main job part
kafka_source = KafkaSource.builder() \
        <...>
        .build()

ds: DataStream = env.from_source(kafka_source, WatermarkStrategy.no_watermarks(), "Kafka Source")

mapped_data = ds.map(Extract(), Types.ROW([Types.SQL_TIMESTAMP(), Types.STRING()]))

sink = (FileSink
        .for_bulk_format("gs://<my_events_path>",
                         ParquetBulkWriters.for_row_type(row_type=events_schema))
        .with_output_file_config(
            OutputFileConfig.builder()
            .with_part_prefix("bids")
            .with_part_suffix(".parquet")
            .build())
        .build())

mapped_data.sink_to(sink)

The problem is that when I try to run this job, I get an error:

java.lang.ClassCastException: class java.sql.Timestamp cannot be cast to class java.time.LocalDateTime (java.sql.Timestamp is in module java.sql of loader 'platform'; java.time.LocalDateTime is in module java.base of loader 'bootstrap')

So the problem is that Types.SQL_TIMESTAMP() and DataTypes.TIMESTAMP() are not compatible when translated into their corresponding Java classes. But I don't see any other option for typing my mapping transformation.
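
For illustration, here are the two type declarations side by side; as far as I understand from the error, they are backed by different Java classes (the variable names here are mine):

from pyflink.common.typeinfo import Types
from pyflink.table import DataTypes

# DataStream side: Types.SQL_TIMESTAMP() is backed by java.sql.Timestamp
stream_type = Types.ROW([Types.SQL_TIMESTAMP(), Types.STRING()])

# Parquet writer side: DataTypes.TIMESTAMP() is backed by java.time.LocalDateTime
writer_schema = DataTypes.ROW([
    DataTypes.FIELD("ts", DataTypes.TIMESTAMP()),
    DataTypes.FIELD("event_id", DataTypes.STRING())
])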

If instead of

mapped_data = ds.map(Extract(), Types.ROW([Types.SQL_TIMESTAMP(), Types.STRING()]))

I use this option

mapped_data = ds.map(Extract())

then I get another error:

java.lang.ClassCastException: class [B cannot be cast to class org.apache.flink.types.Row ([B is in module java.base of loader 'bootstrap'; org.apache.flink.types.Row is in unnamed module of loader 'app')

My question is: can I save data containing timestamps in Parquet format using the Flink Python API?


Solution

  • So I managed to run this. There is a LocalTimeTypeInfo class that, for some reason, is not listed among the Types static methods.

    So if I change

    mapped_data = ds.map(Extract(), Types.ROW([Types.SQL_TIMESTAMP(), Types.STRING()]))
    

    to

    from pyflink.common.typeinfo import LocalTimeTypeInfo

    mapped_data = ds.map(Extract(), Types.ROW([LocalTimeTypeInfo(LocalTimeTypeInfo.TimeType.LOCAL_DATE_TIME), Types.STRING()]))
    

    then it will work and create Parquet files with timestamp columns. There is still an issue in that it uses the deprecated INT96 type for the physical representation, but it works.
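
    For reference, here is a minimal end-to-end sketch of the fix, reusing ds and Extract from the question (the GCS path stays a placeholder, and the module paths assume a recent PyFlink release):

    from pyflink.common.typeinfo import LocalTimeTypeInfo, Types
    from pyflink.datastream.connectors.file_system import FileSink, OutputFileConfig
    from pyflink.datastream.formats.parquet import ParquetBulkWriters
    from pyflink.table import DataTypes

    # Writer schema: DataTypes.TIMESTAMP() expects java.time.LocalDateTime values
    events_schema = DataTypes.ROW([
        DataTypes.FIELD("ts", DataTypes.TIMESTAMP()),
        DataTypes.FIELD("event_id", DataTypes.STRING())
    ])

    # LOCAL_DATE_TIME type info produces java.time.LocalDateTime on the JVM side,
    # which matches what the Parquet bulk writer expects
    mapped_data = ds.map(
        Extract(),
        Types.ROW([LocalTimeTypeInfo(LocalTimeTypeInfo.TimeType.LOCAL_DATE_TIME),
                   Types.STRING()]))

    sink = (FileSink
            .for_bulk_format("gs://<my_events_path>",
                             ParquetBulkWriters.for_row_type(row_type=events_schema))
            .with_output_file_config(
                OutputFileConfig.builder()
                .with_part_prefix("bids")
                .with_part_suffix(".parquet")
                .build())
            .build())

    mapped_data.sink_to(sink)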