
EVE Technology Lab


Re-using HTTP connection to API server (HTTP persistent conn.)

Author
BigSako
Aliastra
Gallente Federation
#1 - 2016-02-20 13:22:07 UTC  |  Edited by: BigSako
Hi there,

For years I've been using this procedure to read data from the eve api:

(trying to make some php pseudo code here)
function read_from_eve_api($some_eve_api_url)
{
    // One connection per request: open, fetch, close
    $handle = curl_init($some_eve_api_url);
    curl_setopt($handle, CURLOPT_RETURNTRANSFER, true);
    $data = curl_exec($handle);
    curl_close($handle);
    return $data;
}


edit: just to clarify, I'm talking about the "old" xml api


I recently learned that with HTTP/1.1 you can issue multiple GET requests over the same connection handle/socket (this is called HTTP persistent connection or connection re-use, see https://en.wikipedia.org/wiki/HTTP_persistent_connection ), so I've rewritten the above method like this:


$global_handle = null;

function read_from_eve_api($some_eve_api_url)
{
    global $global_handle;

    if ($global_handle === null)
    {
        // Initialise the handle once; curl keeps the connection open
        $global_handle = curl_init();
        curl_setopt($global_handle, CURLOPT_RETURNTRANSFER, true);
    }
    // Re-use the same handle, and thus the same TCP/TLS connection
    curl_setopt($global_handle, CURLOPT_URL, $some_eve_api_url);
    $data = curl_exec($global_handle);
    return $data;
}


There are multiple benefits from this: the DNS lookup, TCP handshake and SSL handshake happen only once for multiple GET requests, and you avoid TCP congestion control ramping up (slow start) every time a new connection is opened.
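For anyone who wants to sanity-check that the connection really is being re-used, here is a small sketch (assuming PHP's curl extension; the URL is only a placeholder, not the real API endpoint). curl_getinfo() exposes CURLINFO_NUM_CONNECTS, the number of new connections the last transfer had to open, so a value of 0 on a follow-up request means the existing connection was re-used:

```php
<?php
// Sketch: verify connection re-use via CURLINFO_NUM_CONNECTS.
// 0 new connects on a repeat request => the connection was re-used.
function fetch_and_count($handle, $url)
{
    curl_setopt($handle, CURLOPT_URL, $url);
    curl_setopt($handle, CURLOPT_RETURNTRANSFER, true);
    $body = curl_exec($handle);
    // Number of new TCP connections this transfer had to open
    $new_connects = curl_getinfo($handle, CURLINFO_NUM_CONNECTS);
    return [$body, $new_connects];
}

$handle = curl_init();
// First call opens a connection; the second should report 0 new connects.
// (Placeholder URL -- calls left commented out so the sketch runs offline.)
// list($body, $n1) = fetch_and_count($handle, 'https://example.com/');
// list($body, $n2) = fetch_and_count($handle, 'https://example.com/');
curl_close($handle);
```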


I've already implemented this with my API scripts, so yes, this is working and brings a performance increase for my app.

Though I would like to get an official statement on whether this is allowed / envisaged, or whether it could cause very bad side effects.

If you need any more info, please let me know.

If you are also using HTTP keep-alive/connection re-use, please comment and tell me about your experience.
Aineko Macx
#2 - 2016-02-20 13:57:47 UTC  |  Edited by: Aineko Macx
It is not only allowed but, AFAIK, encouraged. And not only that: you can have multiple parallel requests in flight, which is required to reach 100+ requests per second.

PHP curl sends the keep-alive header by default, so as long as you are re-using the curl handle, you're already using persistent connections.
I have implemented async parallel requests in my lib. It works great, but the code is not pretty, thanks to the ugly curl async API.
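For reference, a minimal hand-rolled sketch of what parallel requests look like with PHP's curl multi API (this is not Aineko's lib, just an illustration; URLs passed in are up to the caller):

```php
<?php
// Sketch: fetch several URLs in parallel with curl_multi.
// Returns an array of response bodies keyed by URL.
function fetch_parallel(array $urls)
{
    $multi = curl_multi_init();
    $handles = [];
    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($multi, $ch);
        $handles[$url] = $ch;
    }

    // Drive all transfers until every one has finished
    do {
        $status = curl_multi_exec($multi, $running);
        if ($running) {
            curl_multi_select($multi); // wait for activity instead of busy-spinning
        }
    } while ($running && $status == CURLM_OK);

    $results = [];
    foreach ($handles as $url => $ch) {
        $results[$url] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($multi, $ch);
        curl_close($ch);
    }
    curl_multi_close($multi);
    return $results;
}
```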