Hello all,
I'm fairly new to the world of C#, so please bear with me. I'm trying to write a small application that sends an int to a server and then gets a reply back as a double.
The following is my code to write to the server:
[code=c#]
TcpClient bob = new TcpClient("localhost", 61734);
NetworkStream bill = bob.GetStream();
BinaryWriter binWriter = new BinaryWriter(bill);
Int32 num = 987;
binWriter.Write(num);
binWriter.Close();
[/code]
However, when I read the value in at the server, it comes out as -620560384, not 987.
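Doing the maths: 987 is 0x000003DB, and -620560384 is 0xDB030000, which is the same four bytes in reverse order. So I'm guessing this is a byte-order (endianness) problem rather than the data getting corrupted.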
So my questions are:
Is this the correct way to write an int over C# TCP/IP sockets?
Also, the server code is written in Java; does C# do anything weird to the bytes before it sends them?
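In case it is an endianness mismatch (BinaryWriter writes little-endian on x86, and I believe Java's DataInputStream.readInt() expects big-endian), here is a minimal sketch of the fix I'm considering; the hostname and port are just the ones from my snippet above:
[code=c#]
using System;
using System.IO;
using System.Net;
using System.Net.Sockets;

class Client
{
    static void Main()
    {
        using (TcpClient client = new TcpClient("localhost", 61734))
        using (NetworkStream stream = client.GetStream())
        using (BinaryWriter writer = new BinaryWriter(stream))
        {
            int num = 987;
            // BinaryWriter.Write(int) emits little-endian bytes on x86.
            // IPAddress.HostToNetworkOrder swaps them into big-endian
            // (network byte order), which is what Java's
            // DataInputStream.readInt() reads.
            writer.Write(IPAddress.HostToNetworkOrder(num));
        }
    }
}
[/code]
Would that be the idiomatic way to do it, or is there a better approach for talking to a Java server?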
Cheers,
Tony.