Whenever a Python object needs to be stored or sent over a network, it is first serialized. I guess the reason is that storage and network transfer are both based on bits. I have a stupid question, which is more of a computer-science foundations question than a Python question. What format do Python objects take when they are in memory? Shouldn't they already be represented as bits? If that's the case, why not just use those bits to store or send the object, and why bother with serialization?
Bit Representation
The same object can have different bit representations on different machines.
So an object's bit representation on the sender machine could mean nothing (or, worse, could mean something else) when it is received on the receiver machine.
Take a simple integer, 1025, as an illustration of the problem:
Big-endian byte order (most significant byte first):

00000000 00000000 00000100 00000001
0x00000401

Little-endian byte order (least significant byte first):

00000001 00000100 00000000 00000000
0x01040000
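As a rough illustration, you can reproduce both layouts from Python with the standard struct module (here packing 1025 as a 4-byte unsigned integer; variable names are just for the example):

```python
import struct

n = 1025

big = struct.pack('>I', n)      # big-endian layout:    b'\x00\x00\x04\x01'
little = struct.pack('<I', n)   # little-endian layout: b'\x01\x04\x00\x00'

print(big.hex())     # 00000401
print(little.hex())  # 01040000

# Reading the little-endian bytes as if they were big-endian gives a different number.
print(struct.unpack('>I', little)[0])   # 17039360, not 1025
```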
That's why, to understand each other, two machines have to agree on a convention, a protocol. The IP protocol, for example, uses network byte order (big-endian).
More on endianness in this question
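The network-byte-order convention is exposed directly by the standard library; a minimal sketch:

```python
import socket
import struct

n = 1025

# '!' in a struct format string means network byte order, which is defined as big-endian.
assert struct.pack('!I', n) == struct.pack('>I', n)

# socket.htonl converts a 32-bit integer from host to network byte order:
# a no-op on a big-endian host, a byte swap on a little-endian one.
print(hex(socket.htonl(n)))
```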
Serialization (and Deserialization)
We can't directly send an object's underlying bit representation over the network, for the reasons described above, but that's not the only problem.
An object can refer to another object internally through a pointer (the in-memory address of that second object). This address is, again, platform-dependent: it is meaningless outside the address space of the process that created it.
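To make that concrete, here is a small sketch; note that id() returning the object's memory address is a CPython implementation detail, not a language guarantee:

```python
inner = [1, 2, 3]
outer = {'data': inner}          # outer holds a reference (effectively a pointer) to inner

print(outer['data'] is inner)    # True: no copy was made, just a reference
print(hex(id(inner)))            # In CPython, the object's memory address; it differs on
                                 # every run and every machine, so it is useless to a receiver.
```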
Python solves this with a serialization mechanism called pickling, which transforms an object hierarchy into a byte stream. That byte stream follows a convention of its own, the pickle protocol, which is what lets both ends, possibly running on different platforms, understand each other.
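A minimal pickling round trip looks like this (the object names are just for illustration):

```python
import pickle

inner = [1, 2, 3]
outer = {'data': inner, 'alias': inner}          # two references to the same list

blob = pickle.dumps(outer)                       # a platform-independent byte stream
restored = pickle.loads(blob)                    # rebuilds an equivalent object hierarchy

print(blob[:4])                                  # raw bytes, ready to store or send
print(restored == outer)                         # True
print(restored['data'] is restored['alias'])     # True: internal references are preserved
```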