limit Python CGI's frequency of calls to a database?

  • mab43
    New Member
    • Dec 2009
    • 7


    I've got a Python CGI script that pulls data from a GPS service. I'd like this information to be updated on the webpage about once every 10 s (the fastest update rate the GPS service's TOS allows). But there could be, say, 100 users viewing the webpage at once, all calling the script.

    I think the users' requests need to grab data from a buffer page that itself only updates once every ten seconds. How can I make this buffer page auto-update when no one is directly viewing it (and so no one is hitting the CGI)? Are there better ways to accomplish this? A database? A server-side cron job? I'm very new to these topics, but these are ideas I've heard about.
  • gits
    Recognized Expert Moderator Expert
    • May 2007
    • 5390

    #2
    the server-side cron is a good start. let the cron job write a file every 10 s, and then write a script that reads the file. the webpage just calls the 'reader script' -> so you would have your 'buffer page' ...
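A minimal sketch of that setup (the file path and the `fetch_gps` function are placeholders, not a specific API):

```python
#!/usr/bin/env python
# Sketch of the cron side plus the reader the CGI page would call.
import json
import os
import tempfile

CACHE_PATH = "/tmp/gps_buffer.json"  # hypothetical location


def fetch_gps():
    """Placeholder for the real GPS-service request."""
    return {"lat": 0.0, "lon": 0.0}


def write_buffer(data, path=CACHE_PATH):
    # Cron job side: write to a temp file, then rename. rename() is
    # atomic on POSIX, so a reader never sees a half-written file.
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path))
    with os.fdopen(fd, "w") as f:
        json.dump(data, f)
    os.rename(tmp, path)


def read_buffer(path=CACHE_PATH):
    # Reader script: this is all a page request has to do.
    with open(path) as f:
        return json.load(f)
```

The cron entry would run `write_buffer(fetch_gps())` every 10 s; every page request just calls `read_buffer()`, so no user traffic ever touches the GPS service directly.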

    kind regards

    Comment

    • mab43
      New Member
      • Dec 2009
      • 7

      #3
      gits -- thanks. I read my host's TOS, and they don't want cron jobs running more often than once every 15 minutes. What if I had a local script pull GPS data every 10 s and upload a new file to the server whenever the data has changed? That just seems inefficient, but I'm struggling to think of another option.

      Comment

      • gits
        Recognized Expert Moderator Expert
        • May 2007
        • 5390

        #4
        you could even have a server-side script that reads the GPS data ... and writes a file. when that is done, just write a lockfile and let the next requests read from the file. after 10 s, remove the lockfile, read from the GPS service again, and write a new file - just to give you another idea ...
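Roughly, that lockfile logic might look like this (the paths, timing constant, and fetch call are all placeholders):

```python
import json
import os
import time

CACHE_PATH = "/tmp/gps_cache.json"  # hypothetical paths
LOCK_PATH = "/tmp/gps_cache.lock"
MAX_AGE = 10.0                      # seconds between real fetches


def fetch_gps():
    """Placeholder for the real GPS-service request."""
    return {"lat": 0.0, "lon": 0.0, "fetched_at": time.time()}


def get_data():
    # While the lockfile is younger than MAX_AGE, serve the cached file.
    if os.path.exists(LOCK_PATH):
        if time.time() - os.path.getmtime(LOCK_PATH) < MAX_AGE:
            with open(CACHE_PATH) as f:
                return json.load(f)
        os.remove(LOCK_PATH)  # lock expired: allow a fresh fetch
    # Fetch new data, write the cache, then (re)create the lockfile.
    data = fetch_gps()
    with open(CACHE_PATH, "w") as f:
        json.dump(data, f)
    open(LOCK_PATH, "w").close()
    return data
```

Note that two requests arriving at the same instant could still both fetch; under heavy concurrency a real file lock (e.g. `fcntl.flock`) would be safer.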

        kind regards

        Comment

        • acoder
          Recognized Expert MVP
          • Nov 2006
          • 16032

          #5
          Another possibility: use a database as a cache, storing the results together with the time they were retrieved. If that time has elapsed (current time > time retrieved + 10 s), make a new request. If no users view the page for a length of time, there's no need to make a request every 10 seconds: you simply make the next request when the next user views the page, by which time 10 seconds will have passed.
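A sketch of this lazy-cache idea with SQLite (the table name, schema, and fetch call are made up for illustration):

```python
import sqlite3
import time

MAX_AGE = 10.0  # seconds a cached result stays fresh


def fetch_gps():
    """Placeholder for the real GPS-service request."""
    return "lat=0.0,lon=0.0"


def get_data(conn):
    # Single-row cache table: the payload plus the time it was fetched.
    conn.execute("CREATE TABLE IF NOT EXISTS gps_cache ("
                 "id INTEGER PRIMARY KEY CHECK (id = 1), "
                 "payload TEXT, fetched_at REAL)")
    row = conn.execute(
        "SELECT payload, fetched_at FROM gps_cache WHERE id = 1"
    ).fetchone()
    now = time.time()
    if row is not None and now - row[1] < MAX_AGE:
        return row[0]  # fresh enough: no request to the GPS service
    # Stale or empty: fetch on this user's request and upsert the row.
    payload = fetch_gps()
    conn.execute("INSERT OR REPLACE INTO gps_cache VALUES (1, ?, ?)",
                 (payload, now))
    conn.commit()
    return payload
```

Because the fetch happens inside the request handler, no page views means no requests to the GPS service at all.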

          Comment

          • mab43
            New Member
            • Dec 2009
            • 7

            #6
            acoder, thanks -- this is exactly what I did. The script checks the timestamp of the data first thing and only fetches new data if ten seconds have elapsed; the data file is rewritten after each new fetch. Thanks again; here's the code I used.
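In outline, the timestamp check reads like this (a sketch only: the path, field name, and fetch call are placeholders rather than the code actually posted):

```python
import json
import time

DATA_PATH = "/tmp/gps_data.json"  # placeholder path
MAX_AGE = 10.0                    # seconds


def fetch_gps():
    """Placeholder for the real GPS-service request."""
    return {"lat": 0.0, "lon": 0.0}


def get_data():
    # First thing: check the stored timestamp, and only hit the GPS
    # service if at least ten seconds have passed since the last fetch.
    try:
        with open(DATA_PATH) as f:
            cached = json.load(f)
        if time.time() - cached["fetched_at"] < MAX_AGE:
            return cached
    except (OSError, ValueError, KeyError):
        pass  # missing or corrupt data file: fall through and refetch
    data = fetch_gps()
    data["fetched_at"] = time.time()
    with open(DATA_PATH, "w") as f:
        json.dump(data, f)  # data file updated after each new fetch
    return data
```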

            I guess my last worry is about read/write interference if the system takes on a lot of users at once. Maybe I'll try gits' write-lock idea...

            Comment
