Tags: python, binaryfiles, dlt-daemon, dlt

How to correctly split a DLT log file into multiple files


I'm trying to develop a dlt-analyzer that would check newly generated logs "on the fly".

For this purpose I run dlt-receive, which writes all incoming logs into a main.dlt file. Then, using Python, I split the log into 16 kB chunks with the readlines method and write each chunk in turn to temp.dlt:

def read_in_chunks(file_object):
    while True:
        # Read lines until roughly 16 kB are buffered;
        # readlines() returns an empty list at EOF.
        data = file_object.readlines(16384)
        if not data:
            break
        yield data

with open('main.dlt', 'rb') as f:
    for chunks in read_in_chunks(f):
        with open('temp.dlt', 'wb') as temp_dlt:
            for chunk in chunks:
                temp_dlt.write(chunk)

Then I run dlt-viewer -s -csv -f <FILTER NAME> -c temp.dlt results.csv to get the filtered results. In most cases this doesn't work (results.csv comes out empty): the chunks copied from main.dlt to temp.dlt apparently ignore the DLT message headers, so dlt-viewer cannot parse the log correctly. Is there a way to split a DLT file while preserving the message headers? Or can I somehow add the missing headers automatically?


Solution

  • The dlt-tools package includes a tool called dlt-convert, which is useful for splitting and combining DLT files. To extract part of a DLT log from a bigger one, the syntax is:

    dlt-convert -b <index_from> -e <index_to> -o part.dlt full.dlt
    

    In your case you could track the index of the latest message in the incoming log and then split based on that:

    loop:
        # remember the current last message index -> latest_index = ...
        dlt-convert -b <track_index> -o latest_part.dlt input.dlt
        # do something with latest_part.dlt ...
        # reassign: track_index = latest_index
    
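    The loop above can be sketched in Python with the subprocess module. This is a sketch under assumptions: dlt-convert is assumed to be on PATH, and the function and file names (build_extract_cmd, process_new_messages, part.dlt) are illustrative, not part of any DLT API:

    ```python
    import subprocess

    def build_extract_cmd(input_dlt, output_dlt, begin_index, end_index=None):
        """Build a dlt-convert command that extracts messages starting at
        begin_index (optionally up to end_index) into output_dlt."""
        cmd = ["dlt-convert", "-b", str(begin_index)]
        if end_index is not None:
            cmd += ["-e", str(end_index)]
        cmd += ["-o", output_dlt, input_dlt]
        return cmd

    def process_new_messages(input_dlt, part_dlt, track_index, step=1000):
        """Extract the next `step` messages, hand the resulting part file
        to the analyzer, and return the new track index."""
        cmd = build_extract_cmd(input_dlt, part_dlt,
                                track_index, track_index + step - 1)
        subprocess.run(cmd, check=True)
        # ... run the analyzer on part_dlt here ...
        return track_index + step
    ```

    Calling process_new_messages in a loop then reproduces the tracking scheme from the pseudocode.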

    In short, splitting the input log at arbitrary byte offsets is not a good idea; instead, split on message boundaries with dlt-convert (or adapt its implementation) using message indices, e.g. in steps of 1000 messages.
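    Alternatively, if you want to keep the pure-Python chunking approach from the question, the split has to happen at DLT storage-header boundaries rather than at newlines. A minimal sketch, assuming the file was written by dlt-receive so that each message is preceded by a storage header beginning with the marker bytes b"DLT\x01"; note this scan is a heuristic, since that byte pattern could in principle also occur inside a message payload (index-based extraction with dlt-convert is the safer route):

    ```python
    def split_on_storage_headers(data, chunk_size=16384):
        """Split raw DLT bytes into chunks of roughly chunk_size bytes,
        cutting only at storage-header boundaries so that every chunk
        starts with a complete message."""
        MARKER = b"DLT\x01"  # storage-header pattern written by dlt-receive
        chunks = []
        start = 0
        while start < len(data):
            cut = start + chunk_size
            if cut >= len(data):
                chunks.append(data[start:])
                break
            # Move the tentative cut point forward to the next marker.
            boundary = data.find(MARKER, cut)
            if boundary == -1:  # no further marker: take the rest
                chunks.append(data[start:])
                break
            chunks.append(data[start:boundary])
            start = boundary
        return chunks
    ```

    Each chunk can then be written to temp.dlt in turn, as in the question, and dlt-viewer should be able to parse it because no message is cut in half.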

    Note, however, that dlt-viewer can also read and display logs in real time (press Connect, then select the ECU and a protocol, e.g. TCP plus a port).