So far I thought they were the same, since bytes are made of bits, and both sides need to know the byte size and endianness of the other side and transform the stream accordingly. However, Wikipedia says that a byte stream != a bit stream (https://en.wikipedia.org/wiki/Byte_stream) and that bit streams are specifically used in video coding (https://en.wikipedia.org/wiki/Bitstream_format). In this RFC (https://tools.ietf.org/html/rfc107) they discuss these two things and note that "Two separate kinds of inefficiency arose from bit streams." My questions are:
- what's the real difference between a byte stream and a bit stream?
- how does a bit stream work if it's different from a byte stream? How does the receiving side know how many bits to process at a given time?
- why is a bit stream better than a byte stream in some cases?
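To make the question concrete, here's how I imagine a bitstream reader might work: the data still arrives as bytes, but the consumer pulls fields of arbitrary bit widths out of it, with the widths dictated by the format's spec rather than by byte boundaries. This is just a sketch in Python; `BitReader` is my own name, not any library's API:

```python
class BitReader:
    """Reads an arbitrary number of bits at a time from a bytes object,
    most-significant bit first (a common convention in video codecs)."""

    def __init__(self, data: bytes):
        self.data = data
        self.pos = 0  # current bit offset from the start of the buffer

    def read(self, nbits: int) -> int:
        value = 0
        for _ in range(nbits):
            byte = self.data[self.pos // 8]
            bit = (byte >> (7 - self.pos % 8)) & 1  # pick the next bit
            value = (value << 1) | bit
            self.pos += 1
        return value


# A byte-oriented reader would only ever hand out whole bytes;
# a bit reader can split one byte into several fields:
r = BitReader(bytes([0b10110100]))
print(r.read(3))  # first three bits:  0b101   -> 5
print(r.read(5))  # remaining five bits: 0b10100 -> 20
```

Is this roughly the distinction, i.e. that a bit stream's framing is defined at bit granularity by the format itself, while a byte stream's smallest unit is always the byte?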