Hi,
I am trying to send/receive data from a socket.
The call to Receive does not block.
If I put a Sleep between the send and the receive, things are OK.
Here is the code -
Socket m_socClient;
//create a new client socket ...
m_socClient = new Socket(AddressFamily.InterNetwork, SocketType.Stream,
ProtocolType.Tcp);
String szIPSelected = m_strServerUri;
String szPort = "800";
int alPort = System.Convert.ToInt16(szPort, 10);
System.Net.IPAddress remoteIPAddress = System.Net.IPAddress.Parse(szIPSelected);
System.Net.IPEndPoint remoteEndPoint = new
System.Net.IPEndPoint(remoteIPAddress, alPort);
m_socClient.Connect(remoteEndPoint);
string requestHeader = "POST / HTTP/1.1\r\nContent-Type: " +
"application/vnd.\r\n" +
"Content-Length: 1000" +
"\r\nConnection: Close\r\n\r\n";
Byte[] bytesHeader = Encoding.ASCII.GetBytes(requestHeader);
// Blocks until send returns.
int nBytes = 0;
nBytes = m_socClient.Send(bytesHeader, 0, bytesHeader.Length,
SocketFlags.None);
Console.WriteLine("No. of bytes sent" + nBytes);
byte[] response = new byte[4000];
// Doesn't block until read returns.
m_socClient.Blocking = true;
nBytes = 0;
//*************** PROBLEM HERE ***************//
while ( nBytes > 0 )
{
nBytes = m_socClient.Receive(response, response.Length, 0);
}
//*********************************************
Console.WriteLine("No. of bytes recvd." + nBytes);
nBytes always comes back as 0. If I insert a Sleep between the send and the
receive, things work, but that's not the proper way.
What is going wrong?
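For reference, here is the pattern I was trying to follow: a loop that calls Receive at least once and keeps reading until the server closes the connection. This is only a sketch against the same m_socClient and response buffer as above, not code I have verified against my server:

// Receive blocks until data arrives, and returns 0 only once the
// peer has closed the connection (we sent "Connection: Close").
int total = 0;
int n;
do
{
    n = m_socClient.Receive(response, 0, response.Length, SocketFlags.None);
    total += n;
} while (n > 0);
Console.WriteLine("No. of bytes recvd: " + total);

Is a do/while like this the right shape, or am I misunderstanding how Receive blocks?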
Best regards,
SK