python sockets udp link-local

Sending an .ihex (Intel Hex) file over UDP using Python sockets


I am working with IPv6 and UDP sockets using Python 2.7. I am particularly focusing on IPv6 multicast to ff02::1, where every device with a link-local address (fe80::) responds to queries from a central server entity.

I have microcontrollers attached to these devices that need to be programmed with an .ihex (Intel Hex) file. A snippet of the file is below:

:103100005542200135D0085A8245381131400031EE
:103110003F4002000F9308249242381120012F8370
:103120009F4F1E390011F8233F4036000F930724AC

I think the way to go about it is to use struct with functions like pack and unpack, but I am not sure whether sending such an .ihex file, which is a few KB in size, would serve the purpose.

Can I do something like:

#!/usr/bin/env python

import socket
import struct

# Create a UDP socket and bind it...

myHexCode = open("Filename.ihex", "rb").read()
dataToSend = struct.pack('Parameters for packing', myHexCode)

# Send dataToSend to the socket...

What should the packing parameters be? (Should I use ! for network byte order, or big-endian > or little-endian < for the hex file?)

Solution

  • Using UDP with multicast to send a file to multiple consumers seems fraught with issues -- unless the entire file fits within a single packet. UDP packets may be dropped/discarded for a number of reasons and you've already said the network is lossy. You'll need to have a way for each consumer to track and notify the sender about dropped packets.

    That being said, it's certainly doable. One idea: construct an initial packet that you multicast several times, perhaps with random delays in between (in an attempt to ensure that all stations receive it). This initial packet says, in effect, "About to send a new program - expect N subsequent record packets."

    Then send the N packets, perhaps twice each. Put an identifying serial number in each data packet, and have the consumers track which ones they receive. After some delay (or when all have been received), have each consumer respond with a status packet saying either "I got all N packets" or "I didn't get records 5, 98, 144, and 3019" (or whatever scheme is appropriate based on lossiness).

    Then the sender can collect these lost record IDs and resend those until all consumers are satisfied they have received the entire file.
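    The retransmission bookkeeping described above can be sketched like this (the function and variable names are illustrative, not part of any library):

```python
# Hypothetical consumer-side bookkeeping: track which record IDs arrived
# and compute which ones to ask the sender to resend.

def missing_records(expected_n, received_ids):
    """Return the sorted record IDs in 0..expected_n-1 that were never received."""
    return sorted(set(range(expected_n)) - set(received_ids))

# The announcement said N=10 records; a few datagrams were dropped.
received = [0, 1, 2, 4, 5, 7, 8]
print(missing_records(10, received))  # [3, 6, 9]
```

    Each consumer would send its missing-ID list back to the sender, which then resends only those records.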

    For packing them into the datagrams, I don't think it matters whether you send "intel hex" or binary. In either case, you will want to send them as a stream of bytes. Binary will be smaller and hence take fewer packets, but there's no other difference in the process of sending them. For the same reason, the byte ordering you select won't make a difference. For simple byte streams, you don't need to use struct to pack them at all. You can just send them.
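    For example (a sketch; the 1400-byte chunk size is just a guess at what fits a typical MTU), you can split the raw file bytes into datagram-sized pieces with plain slicing, no struct involved:

```python
def chunk_bytes(data, size):
    """Split a byte string into consecutive chunks of at most `size` bytes."""
    return [data[i:i + size] for i in range(0, len(data), size)]

# Raw bytes of the .ihex file (a stand-in literal here); each chunk
# could be passed straight to sock.sendto(chunk, (group, port)).
blob = b":103100005542200135D0085A8245381131400031EE\n" * 100
chunks = chunk_bytes(blob, 1400)
```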

    Note: Python 3 distinguishes between bytes and str types, so you would need to open the file in binary mode ("rb") for this to remain true with Python 3.

    However, if, as suggested above, you end up sending a serial number in each packet, that serial number will need to be formatted somehow. You can format the number as an ASCII string, in which case no struct.pack is needed. Or, if you format it in binary, you'll need to select a packing format. Conventionally, network packets use "network byte order" (which is the same as big-endian), but that is only a convention.
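    Both options look like this in code (using record number 3019 from the example above; the ! prefix in struct.pack means network byte order):

```python
import struct

serial = 3019

# Option 1: ASCII text -- human-readable, variable length.
ascii_form = str(serial).encode("ascii")   # b'3019', 4 bytes here

# Option 2: binary in network byte order -- fixed width (2 bytes for "H").
binary_form = struct.pack("!H", serial)    # b'\x0b\xcb'
```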

    If I were doing it, I might structure each record with:

      • a record type (1 byte, "B")
      • a record ID / serial number (2 bytes, "H")
      • the length of the record data (2 bytes, "H")
      • the record data itself (a byte string)

    You could then create the payload with something like this:

    payload = (struct.pack("!BHH", record_type, record_id, len(record_data))
               + record_data)
    

    Here we're creating a 5-byte string with struct.pack containing the "header" fields (packed in network byte order), then simply appending record_data, which is already a byte string.
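    On the receiving side (a sketch assuming the same hypothetical 1-byte type / 2-byte ID / 2-byte length header), the consumer reverses this with struct.unpack:

```python
import struct

def parse_record(packet):
    """Split a received datagram back into its header fields and data."""
    record_type, record_id, length = struct.unpack("!BHH", packet[:5])
    return record_type, record_id, packet[5:5 + length]

# Round trip: build a record as above, then parse it back.
record_data = b":103100005542200135D0085A8245381131400031EE"
payload = struct.pack("!BHH", 1, 42, len(record_data)) + record_data
rtype, rid, rdata = parse_record(payload)
```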