I'm working on a project that interfaces with a number of REST webservices that typically have a limit on the number of requests an IP can make in a block of time. Amazon ECS, for example, asks for no more than 1 request/s.
My question is about the best way to prevent PHP from making more requests than allowed. My concern isn't one instance of a script executing too fast, but rather many instances running concurrently. I'm using cURL for all connections and PHP5, if that helps, and I can't make use of a database.
I was thinking of generating a lock file for each webservice (requiring each instance to obtain an exclusive lock on it before making a query), but that immediately strikes me as something that will be slow and problematic. Something like the sketch below is what I have in mind.
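Here's a minimal sketch of the idea, assuming a one-second interval and a placeholder lock file path (`/tmp/ecs.lock`); each process blocks on an exclusive lock, waits out whatever remains of the interval since the last request, records its own timestamp, and only then makes the request:

```php
<?php
// Sketch only: the lock file path and one-second interval are placeholders.
function throttled_request($url, $lockFile = '/tmp/ecs.lock', $minInterval = 1)
{
    // 'c+' opens for read/write without truncating, creating the file if
    // needed (requires PHP 5.2.6+)
    $fp = fopen($lockFile, 'c+');
    if ($fp === false) {
        return false;
    }
    flock($fp, LOCK_EX); // block until this process holds the exclusive lock

    // Wait out whatever remains of the interval since the last request
    $last = (int) stream_get_contents($fp);
    $wait = $last + $minInterval - time();
    if ($wait > 0) {
        sleep($wait);
    }

    // Record this request's timestamp
    ftruncate($fp, 0);
    rewind($fp);
    fwrite($fp, (string) time());
    fflush($fp);

    // Make the request while still holding the lock
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $result = curl_exec($ch);
    curl_close($ch);

    flock($fp, LOCK_UN);
    fclose($fp);
    return $result;
}
```

This is exactly what worries me, though: every concurrent instance serializes on the lock, so they all queue up behind the slowest request.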
I'm stuck for ideas; any suggestions?
/edit - Oh, and on a completely different note, does anyone here use Textpattern? Just curious really.