I found an interesting anomaly yesterday while debugging a .NET System.Net.Sockets.Socket problem. I'm using the asynchronous socket methods, which is probably a mistake. In hindsight, I would have been better off just spawning a couple of threads and using the synchronous methods.
After making the connection between my listening and connecting socket objects, I call BeginReceive on both ends and pass in my callback, named OnReceivedData. (This callback is supposed to be invoked when there's incoming data on the socket.)
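To make the setup concrete, here's a minimal sketch of the pattern I'm describing. The callback name OnReceivedData matches my code; the buffer size, class name, and re-arming logic are assumptions for illustration, not my actual implementation.

```csharp
using System;
using System.Net.Sockets;

class ReceiveDemo
{
    private readonly byte[] buffer = new byte[4096]; // assumed size

    public void StartReceiving(Socket socket)
    {
        // Ask the runtime to invoke OnReceivedData when data arrives.
        socket.BeginReceive(buffer, 0, buffer.Length,
                            SocketFlags.None, OnReceivedData, socket);
    }

    private void OnReceivedData(IAsyncResult ar)
    {
        Socket socket = (Socket)ar.AsyncState;
        int bytesRead = socket.EndReceive(ar); // completes the async read
        // ... process buffer[0..bytesRead), then re-arm the receive:
        socket.BeginReceive(buffer, 0, buffer.Length,
                            SocketFlags.None, OnReceivedData, socket);
    }
}
```

Both ends of the connection run this same pattern after the connect/accept completes.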
The anomaly is that the connecting socket's OnReceivedData callback is invoked even when no data has been sent by the listening socket. OnReceivedData is (improperly) invoked exactly once, one second after the connection is made. When EndReceive is called during this invocation on the connecting socket, an exception is thrown, probably because the socket isn't in the right state for EndReceive to work properly. (If EndReceive is called and there is no data, one would expect the method to simply return 0 bytes, not throw an exception.) If this exception is ignored, OnReceivedData is never improperly invoked again and everything proceeds as expected.
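The "ignore the exception" workaround looks roughly like this in the callback. Again a sketch, not my production code; I'm assuming SocketException is the exception type thrown, and the buffer/re-arm details are hypothetical.

```csharp
private void OnReceivedData(IAsyncResult ar)
{
    Socket socket = (Socket)ar.AsyncState;
    int bytesRead;
    try
    {
        bytesRead = socket.EndReceive(ar);
    }
    catch (SocketException)
    {
        // The spurious first invocation described above: swallow it,
        // re-arm the receive, and subsequent callbacks behave normally.
        socket.BeginReceive(buffer, 0, buffer.Length,
                            SocketFlags.None, OnReceivedData, socket);
        return;
    }
    // ... handle bytesRead == 0 (remote close) and real data here.
}
```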
If the listening socket sends a single byte within this first one-second window, OnReceivedData is invoked on the connecting socket (properly this time), and EndReceive completes without an exception, as expected. The connection then functions normally.
It doesn't seem like the listening socket should need to send data to the connecting socket; the listening socket doesn't improperly invoke its callback when it receives no data. Once the connection is established, there's no reason the listening socket and the connecting socket should behave differently. Since this behavior is undocumented, I think it's a bug. I haven't yet attempted to reproduce it in Mono on any platform.