I was working all day yesterday (and will probably be for most of today) with bytecode where Java wrote the bytecode and MATLAB read it... I had never worked with bytecode in great detail before, so this was quite a challenge. All in all, the experience was interesting, and because it was harder than it should have been, more educational.
Still, I've been told almost every other program reads "low byte first", and thinking about it, people writing out bytes by hand usually write "low byte first" too. I don't know if that's true, but either way, is there a specific reason Sun decided to read/write byte-code the other way ("high byte first")? Perhaps for encryption?
For those unfamiliar with byte-code, let's look at an example: if your number is, say, 3 and it's two bytes long (we'll say signed, since we're talking Java here), then it would be written (what I think is normal, "low byte first") simply like this:
Code:
00000011 00000000
[byte 1] [byte 2]
But Java would write it:
Code:
00000000 00000011
[byte 1] [byte 2]
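If you want to see both layouts from inside Java itself, here's a minimal sketch (the class name ByteOrderDemo and the bits() helper are just mine for illustration): java.io.DataOutputStream writes a two-byte short high byte first, and java.nio.ByteBuffer switched to ByteOrder.LITTLE_ENDIAN produces the "low byte first" layout for comparison.
Code:
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class ByteOrderDemo {

    // Format one byte as eight binary digits, e.g. 3 -> "00000011".
    static String bits(byte b) {
        return String.format("%8s", Integer.toBinaryString(b & 0xFF)).replace(' ', '0');
    }

    public static void main(String[] args) throws IOException {
        // DataOutputStream writes a two-byte short with the high byte first.
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        new DataOutputStream(out).writeShort(3);
        byte[] highFirst = out.toByteArray();
        System.out.println("Java's layout:  " + bits(highFirst[0]) + " " + bits(highFirst[1]));
        // prints: Java's layout:  00000000 00000011

        // The same value laid out "low byte first", using ByteBuffer with an explicit byte order.
        byte[] lowFirst = ByteBuffer.allocate(2)
                .order(ByteOrder.LITTLE_ENDIAN)
                .putShort((short) 3)
                .array();
        System.out.println("low byte first: " + bits(lowFirst[0]) + " " + bits(lowFirst[1]));
        // prints: low byte first: 00000011 00000000
    }
}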
Just curious. If the answer is simply, "because it does, why are apples red?" then so be it, I just want to know.
Thank you,
-blazed