What does the Socket method setSoTimeout() do and when should I use it?
Normally, when a read() method tries to read data from a socket, the program blocks until data arrives. However, if you set the SO_TIMEOUT option, read() will wait only the specified number of milliseconds; if no data has been received by then, it throws an InterruptedIOException (since Java 1.4, the more specific SocketTimeoutException subclass). The Socket itself remains connected, so you can retry the read or reuse the socket. The timeout period is passed in milliseconds as the argument to setSoTimeout(); 0 is the default and means "no timeout" (block indefinitely).
It is often a good idea to set the timeout to a non-zero value to prevent your program from blocking forever if the remote host crashes or stops responding.
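A minimal sketch of this behavior: a local ServerSocket stands in for a remote peer that accepts a connection but never sends data, and the client's read() gives up after the configured timeout instead of blocking forever. (The class name, port choice, and 2-second timeout are illustrative.)

```java
import java.io.InputStream;
import java.net.ServerSocket;
import java.net.Socket;
import java.net.SocketTimeoutException;

public class SoTimeoutDemo {
    public static void main(String[] args) throws Exception {
        // Local server bound to an ephemeral port; it accepts the
        // connection but never writes, simulating a stalled remote host.
        try (ServerSocket server = new ServerSocket(0);
             Socket client = new Socket("localhost", server.getLocalPort());
             Socket accepted = server.accept()) {

            client.setSoTimeout(2000); // each read() waits at most 2 seconds

            InputStream in = client.getInputStream();
            try {
                in.read(); // blocks, but only up to the timeout
            } catch (SocketTimeoutException e) {
                // The connection is still open: we could retry the read,
                // send a keep-alive, or close the socket ourselves.
                System.out.println("read() timed out, socket still connected: "
                        + client.isConnected());
            }
        }
    }
}
```

Note that the timeout applies to each individual read() call, not to the total lifetime of the connection.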