I'm using the Dark Sky API to get weather data for a display-board website. The free tier is limited to 1,000 API calls per month. I have a lot of devices (iPads) in the building displaying this information and refreshing every minute or so to update the weather and the message of the hour.
I think caching the weather in a file, and reading from it until the refresh interval has elapsed, solves my problem.
Is it the best solution?
And is this the best implementation of this solution?
function getWeather(){
    // compares the age of the saved weather at each call of the method
    $last_modif = filemtime("weather.json");
    // if the weather is older than the refresh interval, renew it by calling the API
    if(time() - $last_modif > WEATHER_REFRESH_TIME){
        //echo "call api";
        $file = fopen("weather.json", "w");
        $request = 'https://api.darksky.net/forecast/MY_KEY/LAT,LNG?lang=fr&units=si&exclude=minutely,flags';
        $data = file_get_contents($request);
        fwrite($file, $data);
        fclose($file);
    }
    $file = fopen("weather.json", "r");
    $weather = fread($file, filesize("weather.json"));
    fclose($file);
    return json_decode($weather);
}
Should I write the data to a database instead, or is there another way to store the weather?
What are the best practices in this case?
Whether you stick with a flat file or a database depends on how you're using the data. It looks like you're only caching the latest API response for one specific query, not historical data or data for other queries. If you have no need for that other data, caching to a file as you're doing is the simplest solution.
If you want to use other data at some point (maybe for multiple offices or historical graphs), a database would be far more scalable and manageable.
Spinning up a basic MySQL DB on your server would be fairly painless.
1. Install MySQL, if needed: https://www.digitalocean.com/community/tutorials/how-to-install-mysql-on-ubuntu-14-04
2. (Optional) Install phpMyAdmin, if you're not comfortable running MySQL commands in the terminal: https://www.digitalocean.com/community/tutorials/how-to-install-and-secure-phpmyadmin-on-ubuntu-16-04
3. Create the weather database and user.
4. Create whatever tables you need for caching the data. This should include either a column next to the data saying when it was cached, or a table that records the time of the API queries.
5. Use some basic PDO to connect to the database and query it: http://coursesweb.net/php-mysql/pdo-introduction-connection-database
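As a rough sketch of the table-plus-PDO approach (all table, column, and credential names here are assumptions for illustration, not anything from your code):

```php
<?php
// Hypothetical schema, created once via phpMyAdmin or the mysql CLI:
//   CREATE TABLE weather_cache (
//       id        INT AUTO_INCREMENT PRIMARY KEY,
//       query     VARCHAR(255) NOT NULL,   -- which forecast this row caches
//       payload   TEXT NOT NULL,           -- raw API response
//       cached_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP
//   );

// Connection details are placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=weather;charset=utf8mb4',
               'weather_user', 'secret',
               [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]);

// Fetch the most recent cached response for this query.
$stmt = $pdo->prepare(
    'SELECT payload, cached_at FROM weather_cache
     WHERE query = :query ORDER BY cached_at DESC LIMIT 1'
);
$stmt->execute([':query' => 'LAT,LNG']);
$row = $stmt->fetch(PDO::FETCH_ASSOC);

if ($row === false || time() - strtotime($row['cached_at']) > WEATHER_REFRESH_TIME) {
    // Cache miss or stale: call the API here, then INSERT a fresh row.
}
```

Prepared statements keep the query parameterized, and keeping every row (rather than overwriting one) gives you the historical data for free.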
With either storage choice, if users are hitting your service frequently enough, you can keep your current approach of re-querying the API as you serve the page, once the data is older than the invalidation interval.
If you want to prevent any slow-downs for users, or if they're not hitting the page regularly enough, you could create a cron job to run your update script, regardless of where you're storing the data.
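For example, a crontab entry that refreshes the cache every ten minutes could look like this (the PHP binary and script paths are placeholders for your own):

```
*/10 * * * * /usr/bin/php /var/www/update_weather.php
```

With this in place, page requests only ever read the cache, so no visitor waits on the Dark Sky round trip.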
As for your current script, I wrestled a bear once is correct: avoid opening and closing the same file repeatedly when file_get_contents and file_put_contents do the job in one call each. You might try refactoring to this:
function getWeather()
{
    $cacheFile = 'weather.json';
    // Refresh if the cache file is missing (first run) or older than the refresh interval.
    if (!file_exists($cacheFile) || time() - filemtime($cacheFile) > WEATHER_REFRESH_TIME) {
        $response = file_get_contents('https://api.darksky.net/forecast/MY_KEY/LAT,LNG?lang=fr&units=si&exclude=minutely,flags');
        // If the API call fails, keep serving the stale cache instead of overwriting it.
        if ($response !== false) {
            file_put_contents($cacheFile, $response);
        }
    }
    return json_decode(file_get_contents($cacheFile));
}
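One further hardening step, given how many iPads hit this at once (not something your original code needs to be correct, just a common precaution): write the cache atomically so a reader never sees a half-written file. A sketch of the usual PHP idiom, with the temp-file name as an assumption:

```php
<?php
// Write to a temporary file, then swap it into place.
$response = file_get_contents('https://api.darksky.net/forecast/MY_KEY/LAT,LNG?lang=fr&units=si&exclude=minutely,flags');
if ($response !== false) {
    $tmp = 'weather.json.tmp';
    file_put_contents($tmp, $response, LOCK_EX); // exclusive lock while writing
    rename($tmp, 'weather.json');                // atomic on the same filesystem
}
```

Readers either get the old file or the new one, never a partial write.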