Hi all, I have to create an application that needs to stay connected to a websocket server as much as possible to listen for events. Since it's a security app, resources such as battery and radio usage are not a concern.
I am using the standard org.java_websocket library with the Draft_17 spec, and the application is able to communicate over websockets. The issue is that when the phone loses its connection to the server, it takes a few minutes for the websocket to realise it is disconnected and call the onClose() callback with code 1006. I need to reduce this time as much as possible. With HTTP clients on Android this is not an issue: we simply set the socket timeout. With org.java_websocket, however, I cannot figure out how to set this timeout.
I am constantly sending and receiving data from the server; sending does not speed up detection of the dead socket in any way.
There is a connection timeout parameter on the WebSocketClient constructor:
return new WebSocketClient(hostURI, new Draft_17(), null, 5000)
But it has no effect, as I think it only applies to the initial connection to the server.
Any help or guidance will be appreciated.
- Did you find any fix for this? – patric_cena, Nov 30, 2016 at 10:31
- @patric Not really. I ended up sending a byte of data to the client every 3 seconds, and having a timer on the Android app that also runs every 3 seconds. When I receive the byte I set a flag to true; when my timer fires I set it to false. The moment it is set to false twice in a row, I abandon the old connection and start a new one. It works well; it's already in production. – Mitch Dart, Nov 30, 2016 at 13:00
- Thanks for the reply, I am also working on a similar kind of workaround. Thanks. – patric_cena, Dec 2, 2016 at 7:42
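The heartbeat workaround described in the comments above can be sketched as a small watchdog class. This is a hypothetical illustration, not the commenter's actual code: it assumes the server sends a heartbeat byte on a fixed interval, `onHeartbeat()` is called from the websocket's `onMessage()` when that byte arrives, and a `reconnect` callback (supplied by the caller) tears down the old connection and opens a new one.

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

// Watchdog for a server heartbeat: each received beat resets a missed-beat
// counter; two consecutive timer ticks without a beat trigger a reconnect.
public class HeartbeatWatchdog {
    private final AtomicInteger missedBeats = new AtomicInteger(0);
    private final Runnable reconnect;     // invoked when the link is declared dead
    private final long periodMillis;      // heartbeat interval (3000 ms in the comment)
    private final ScheduledExecutorService timer =
            Executors.newSingleThreadScheduledExecutor();

    public HeartbeatWatchdog(Runnable reconnect, long periodMillis) {
        this.reconnect = reconnect;
        this.periodMillis = periodMillis;
    }

    // Call this from onMessage() whenever the heartbeat byte arrives.
    public void onHeartbeat() {
        missedBeats.set(0);
    }

    public void start() {
        timer.scheduleAtFixedRate(() -> {
            // Two ticks in a row without a heartbeat => connection is gone.
            if (missedBeats.incrementAndGet() >= 2) {
                missedBeats.set(0);
                reconnect.run();
            }
        }, periodMillis, periodMillis, TimeUnit.MILLISECONDS);
    }

    public void stop() {
        timer.shutdownNow();
    }
}
```

With a 3-second period this declares the connection dead within roughly 6–9 seconds of the last heartbeat, compared to the several minutes the default TCP timeout takes.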
1 Answer
If you check the WebSocketClient source code, you will notice that the timeout parameter is never used.
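Given that, one direction worth noting: the "socket timeout" an HTTP client sets is `SO_TIMEOUT` on the underlying `java.net.Socket`, which makes a blocking read fail with `SocketTimeoutException` instead of hanging for minutes. If you can get hold of (or supply) the socket your websocket client uses, `setSoTimeout()` gives you that behaviour; newer releases of org.java_websocket also offer a built-in `setConnectionLostTimeout(int seconds)` that detects dead connections via periodic ping/pong. The stdlib-only sketch below just demonstrates the `SO_TIMEOUT` semantics in isolation (the server deliberately never writes, so the read times out):

```java
import java.io.IOException;
import java.net.ServerSocket;
import java.net.Socket;
import java.net.SocketTimeoutException;

// Demonstrates SO_TIMEOUT: once set, a blocking read() gives up after the
// configured interval instead of waiting for the OS-level TCP timeout.
public class SoTimeoutDemo {
    public static boolean readTimesOut(int timeoutMillis) throws IOException {
        try (ServerSocket server = new ServerSocket(0);               // ephemeral port
             Socket client = new Socket("localhost", server.getLocalPort());
             Socket accepted = server.accept()) {
            client.setSoTimeout(timeoutMillis);                       // the "socket timeout"
            try {
                client.getInputStream().read();                       // server never writes
                return false;                                         // not reached
            } catch (SocketTimeoutException expected) {
                return true;                                          // read gave up in time
            }
        }
    }
}
```

Note this only bounds how long a single read blocks; the websocket library's read loop would still need to treat the timeout as a disconnect and close the connection itself.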