I have written a "live" search feature for my website. Whenever a keyUp event is triggered in the search field, the JavaScript performs an AJAX request to a backend PHP page, which gets the appropriate records from a database and returns them as JSON, which the JavaScript then decodes and displays in the appropriate #searchOutput <div>.
At the moment, each time the PHP page is invoked (i.e. every time the keyUp event fires on the search field), a new, non-persistent PDO connection is created to query the database. This seems horribly inefficient and slow, and it is probably putting a massive load on the server.
Here's an example JSON response (very simple structure):
[{"title": "How much does it cost?", "url": "pricing.php"}, {"title": "Special Discounts for Students/Teachers", "url": "specialpricing.php"}]
Using the website jsonviewer.stack.hu, it can be visualised:
[Screenshot: visualisation of the sample JSON data]
Here's some of the JavaScript which performs the AJAX request (I say "some" because a lot of it is polish: fancy error handlers, etc.).
xmlhttp.onreadystatechange = function() {
    if (this.readyState === 4) {
        if (this.status === 200) {
            var json = null;
            try {
                json = JSON.parse(this.responseText);
            } catch (e) {
                // Error parsing JSON (invalid response)
                errHandler("The response was not valid JSON!");
                return;
            }
            /* Error checking
             * If the err attribute is set, an error has happened */
            //noinspection JSUnresolvedVariable
            if (json.err !== undefined) {
                //noinspection JSUnresolvedVariable
                errHandler(json.err);
                return;
            }
            if (json.length === 0) {
                target.innerHTML = "No results were found for " + query + ".<br />Try widening your search.";
                return;
            }
            outputSearchResults(json);
        } else {
            errHandler("invalid HTTP response code of " + this.status);
        }
    }
};
xmlhttp.open("GET", "apiResponder.php?mode=0&query=" + query, true);
xmlhttp.timeout = 2000; // 2s timeout
xmlhttp.send();
That is a rather standard AJAX-to-PHP function. The errHandler function just causes an error with whatever message is passed in (and disconnects the error handler from the search field, so a user can't hammer the server with invalid requests). I also have onerror and ontimeout functions for the XMLHttpRequest (xmlhttp) object, but they're not important here.
Now for the part that I'm almost sure there's a better way of implementing: the back-end PHP that actually processes the search terms, queries the database, and returns the JSON. It should be mentioned that this is NOT apiResponder.php - that file is merely an interface that calls this file when the mode GET parameter is set to zero.
This file runs the following code every time the keyUp event is triggered on the search field:
require_once($_SERVER["DOCUMENT_ROOT"] . "class.Database.php");
new Database("mysql:dbname=sp_comm;host=localhost", "root", true);
Of course, after that it actually DOES the querying and returning, but this is the performance killer. When we look inside the Database class's constructor, it creates a standard, NON-persistent connection:
function __construct($dsn, $username, $persistent) {
    if (Database::$db === NULL) {
        try {
            Database::$db = new PDO($dsn, $username, dbpw);
            Database::$db->setAttribute(PDO::ATTR_EMULATE_PREPARES, false);
            Database::$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
        } catch (PDOException $e) {
            //todo
            die("could not connect to database: " . $e);
        }
    }
}
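(For reference, the persistent variant only differs by passing PDO::ATTR_PERSISTENT in the constructor's driver-options array. This is a sketch rather than my exact code; dbpw is the same password constant used above.)

// Sketch of the persistent variant: same constructor call, plus the
// PDO::ATTR_PERSISTENT driver option.
Database::$db = new PDO($dsn, $username, dbpw, array(
    PDO::ATTR_PERSISTENT => true
));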
Creating a fresh connection on every request seems horribly inefficient, but when using persistent connections, the server would very quickly start denying other requests as the connection limit is reached. Here is the situation after only five requests to this search page (ID 269 is the terminal monitor):
[Screenshot: MySQL connection monitor when persistent connections are enabled]
As PHP does not have connection pooling, what's the most efficient way of doing this?
As far as I can see, persistent connections are not an option, as the slight speed increase is not worth filling up my connection limit that fast and denying all other users access! Is there some sort of "secret option" in the PDO constructor that I'm missing, or am I going about this completely wrong? PHP.net (among many other websites) has this "live" search feature, and not only is it instant, I'm quite sure they don't use persistent connections!
1 Answer
There are a number of options.
Take out the database
Instead of storing in a database, store in a cache. This could be something local like APC or memcache. Or it could be something remote like memcache or DynamoDB.
If space is at a premium, you can cache just the results for the short query combinations. At some point, you fail over to the database.
Caching generally either doesn't require authorization or it offers low overhead authorization.
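Here is a minimal sketch of the caching idea using APCu (the userland successor to APC). The fetchResultsFromDatabase helper and the 30-minute TTL are assumptions for illustration, standing in for your existing PDO query code:

function searchWithCache($query) {
    // Normalise the query so "Price" and "price" share a cache entry.
    $key = 'search:' . strtolower(trim($query));

    // Hit the in-memory cache first.
    $results = apcu_fetch($key, $hit);
    if ($hit) {
        return $results;
    }

    // Cache miss: fall back to the database (hypothetical helper wrapping
    // the existing PDO query), then store the result for 30 minutes.
    $results = fetchResultsFromDatabase($query);
    apcu_store($key, $results, 1800);
    return $results;
}

header('Content-Type: application/json');
echo json_encode(searchWithCache($_GET['query']));

Whether you populate the cache lazily on a miss (as above) or warm it on a schedule, as asked in the comments below, depends on how stale the results are allowed to be.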
Use a service
Your problem is that you are calling the database directly from your web response. So the connection lasts as long as it takes to generate the page. If you move it to a separate service, you can control how to authenticate the service call. The request only needs to last as long as the database activity in the server. So it can use persistent connections without the overhead of serving up an entire web page.
You could throw out PHP altogether and switch to a language with connection pooling for the service. You could continue using PHP for the actual web page, but the data would be managed in the service.
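As a rough sketch of that split (the port, the endpoint, and the service itself are assumptions for illustration), the PHP page stops opening a PDO connection entirely and forwards the query to a long-running search service that owns the database connections:

$query = isset($_GET['query']) ? $_GET['query'] : '';

// Short timeout so the page fails fast if the service is down.
$context = stream_context_create(array(
    'http' => array('timeout' => 2),
));

// Hypothetical local service that holds a small pool of database
// connections and returns the same JSON array the page currently builds.
$json = @file_get_contents(
    'http://127.0.0.1:8081/search?q=' . urlencode($query),
    false,
    $context
);

header('Content-Type: application/json');
if ($json === false) {
    http_response_code(502);
    echo json_encode(array('err' => 'search service unavailable'));
} else {
    echo $json;
}

Note that the error payload keeps the same err attribute your JavaScript already checks for.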
Use a modified version of PHP
Use PHP with a custom database extension that provides pooling.
You can see more discussion on Stack Overflow.
carefulnow1 (Jul 11, 2017 at 7:52): Going with the APC cache method, would it be sensible to run a cron job, say every thirty minutes, to update the APC cache with data from the database?
careful now, you're sending a search query for c, ca, car, care, caref, ..., regardless of how fast it is. You should wait until the user has made up their mind (which is some sort of arbitrary timeout) before you send the search.