[PHP] Limiting php execution for webservices

I'm working on a project that interfaces with a number of REST webservices that typically have a limit on the number of requests an IP can make in a block of time. Amazon ECS, for example, asks for no more than 1 request/s.

My query is about the best way to prevent PHP from making more requests than desired. My concern isn't about one instance of a script executing too fast, but rather about many instances running concurrently. I'm using cURL for all connections and PHP5, if that helps, and I can't make use of a database.

I was thinking of generating a lock file for each webservice (requiring each instance to obtain an exclusive lock on it before making a query), but that immediately strikes me as something that will be slow and problematic.
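
For clarity, roughly what I had in mind - just a sketch, where the filename and the sleep() used to pace requests are placeholders:

PHP:
//Sketch only: serialise requests to one service with an exclusive lock.
//The filename and the 1-second sleep are placeholders for illustration.
$fp = fopen('/tmp/amazon-ecs.lock', 'a'); //created if it doesn't exist
if ($fp !== false && flock($fp, LOCK_EX)) { //blocks until the lock is free
    //...make the cURL request here...
    sleep(1);            //crude pacing to respect the 1 request/s limit
    flock($fp, LOCK_UN); //release so the next instance can run
}
if ($fp !== false) {
    fclose($fp);
}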

Stuck for ideas, any suggestions? :)

/edit - Oh, and on a completely different note, does anyone here use Textpattern? Just curious really.
 
A table is how I did it (InnoDB rather than MyISAM)

PHP:
$connection = @mysql_connect($server, $dbusername, $dbpassword)
    or die(mysql_error());
$db = @mysql_select_db($db_name, $connection)
    or die(mysql_error());

//Start a response object
print "<ajax-response><response type=\"object\" id=\"QueueUpdater\">";

$gotlock = false;    //Start with no lock
$qnow = time();      //Remember the current time as we'll use it throughout
$qupdate = 0;        //Sensible defaults in case the qlock table is empty
$lastupdate = $qnow;

//First lock the table so the read-check-write below is atomic
$result = @mysql_query("LOCK TABLES `qlock` WRITE", $connection) or die(mysql_error());

//Check to see if the queue lock is held
$sql = "SELECT `qupdate`,`lastupdate` FROM `qlock`";
$result = @mysql_query($sql, $connection) or die(mysql_error());

//Set the lock variables if there is a record
if (mysql_num_rows($result) != 0) {
    if ($row = mysql_fetch_object($result)) {
        $qupdate    = $row->qupdate;
        $lastupdate = $row->lastupdate;
    }
}

//The following stores how many action points to use for non queue related
//actions such as fatigue regeneration
$globalactionpoints = $qnow - $lastupdate;

$trylock = false;     //Assume we can't get the lock
if ($qupdate == 0) {  //If there is no lock, try for it
    $trylock = true;
}
//Or if there is a stale lock, take it by force anyway ($queuelock is the
//stale-lock timeout in seconds, defined elsewhere)
if ($qupdate == 1 && ($qnow - $lastupdate > $queuelock)) {
    $trylock = true;
}
if ($trylock) {
    $sql = "UPDATE `qlock` SET `qupdate` = 1, `lastupdate` = '" . $qnow . "'";
    $result = @mysql_query($sql, $connection) or die(mysql_error());
    if ($result) {             //The UPDATE succeeded, so we hold the lock
        $gotlock = true;       //Remember that we got the lock
        $gotlocktime = $qnow;  //Remember when we got it too
    }
}
//Always unlock the table afterwards
@mysql_query("UNLOCK TABLES", $connection);

//Different debug message depending on whether we got the lock
if ($gotlock) {
    print "<message text=\"Got the lock, working through queue\" />";
} else {
    print "<message text=\"Didn't get the lock, stopping\" />";
}


//////////////////////////
//Core game updates follow
//////////////////////////
if ($gotlock) {........

Sorry, that's just a quick cut'n'paste so it has some extra stuff in there.

Basically everything after that point only proceeds if $gotlock is true.
 
Thanks for the reply :), but as I mentioned in my post I'm not able to use any database functionality. I am considering a lock file, but I'm hoping there might be some other solution to consider.
 
Sorry, missed that bit.

This could equally be a file; I happen to have a database, so it made sense to use it.

The important steps are:

1) Lock the file for updating
2) Check the file contents to see if anyone else is processing, or if someone was processing but has timed out (hence the lastupdate timestamp in my code)
3) If this process can get the lock, update the file
4) Unlock the file

That should be thread safe, or at least I'm hoping so. If you just lock the file for writes and rely on that alone to keep other processes out, then if script execution bombs out for the process holding the lock, no one else will ever be able to use the service.

That's why I only lock the file/table briefly: it ensures the lock status I read from inside the file/table remains valid while I decide whether to take the lock. Then I unlock straight away.
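
A file-based sketch of the same four steps might look something like this (a sketch only: $lockfile and $queuelock are assumed to come from the caller, and the file starts out containing "0 0"):

PHP:
//Assumptions: $lockfile points at a file pre-created with the contents
//"0 0", and $queuelock is the stale-lock timeout in seconds
$gotlock = false;
$qnow = time();

$fp = fopen($lockfile, 'r+');
if ($fp !== false && flock($fp, LOCK_EX)) {        //1) lock the file
    list($qupdate, $lastupdate) = explode(' ', trim(fgets($fp)));
    //2) is the lock free, or held but timed out?
    if ($qupdate == 0 || ($qnow - $lastupdate) > $queuelock) {
        ftruncate($fp, 0);                         //3) take the lock
        rewind($fp);
        fwrite($fp, "1 " . $qnow);
        $gotlock = true;
    }
    flock($fp, LOCK_UN);                           //4) unlock straight away
    fclose($fp);
}
//...do the work, then write "0 <time>" back the same way to release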
 
I think a lock file (with a filemtime() check to catch orphaned lock files) would be a good choice. Instead of opening a file and storing lock state inside it, which can be cumbersome, just creating the lock file and checking its modification time would be quicker.

Something like:
Code:
// $service and $timeout come from the surrounding code; ServiceLockedException
// is a custom exception class
$filename = $service->getServiceName() . '.lock';

clearstatcache(); // filemtime() results are cached per request

if ( file_exists($filename) && ((time() - filemtime($filename)) < $timeout) )
{
    // someone has queried this service within the timeout window
    throw new ServiceLockedException('blah');
}
else
{
    // free (or the lock file is orphaned/stale) - take it by touching the file
    touch($filename);
}
EDIT: cleaned it up a bit.
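
Hypothetical usage, with the check above wrapped up in a function (acquireServiceLock() is a made-up name for illustration; for Amazon ECS's 1 request/s limit, $timeout would be 1):

Code:
// illustrative only - acquireServiceLock() would wrap the
// file_exists()/filemtime()/touch() check shown above
$timeout = 1; // seconds; Amazon ECS asks for at most 1 request/s

try
{
    acquireServiceLock($service, $timeout);
    // ...make the cURL request to the webservice...
}
catch (ServiceLockedException $e)
{
    // another instance hit this service within the last second;
    // wait and retry, or skip this request for now
}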
 
Will give the lock file a go and see how it gets on - it doesn't seem that cumbersome a solution when I see it laid out like that. What I'm doing is writing a plugin for Textpattern, so I want to keep the code as minimal and efficient as possible.
Thanks for the replies :).
 
The cumbersome bit about file access is usually reading and writing the content. When you're just creating/removing files it should be alright. (In fact, with the timeout you shouldn't need to remove the files at all.)
 