So, you have a big web-based system dealing with some pretty large databases. Do you:
- Do all the big database operations directly in your web app as the user clicks the button, making them wait for a good few seconds
or
- Do just the bare minimum (like displaying “Thanks!”) and farm the job out to another process somewhere
Yep, I thought so too.
So I set about building a generic system that can run these background tasks, the idea being I can simply pass jobs from any of my apps into a ‘queue’ to be processed by this background system.
Design criteria:
- Fault tolerant (if we’re relying on this for doing things asynchronously to the UI it needs to be stable)
- Scalable (we don’t want the queues backing up!)
- Easy to create new types of job (the scheduler itself shouldn’t need to know anything about the jobs themselves)
After pondering the problem for a while, I decided the best way to do this was to treat each job as a web page. There are quite a few reasons for this:
- They’re language independent
- They’re essentially multi-threaded by nature
- They’re discrete, so if one web call fails it doesn’t bring down the whole process
- They’re server independent (I can call jobs on other servers)
So I could simply call a bunch of web pages with the appropriate query string, like so:
dobillingstuff.aspx?rowid=123
sendmessage.aspx?messageid=5678
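To make the idea concrete, here’s a minimal sketch of what the scheduler side could look like. It’s in Python rather than ASP.NET, the `http://jobserver` host and function names are illustrative, and a real version would use a persistent queue rather than an in-memory one, but it shows the key point: the queue holds nothing but URLs, so the scheduler knows nothing about the jobs themselves.

```python
import queue
from urllib.parse import urlencode

def job_url(base, page, params):
    """Build the URL the worker will request for a queued job."""
    return f"{base}/{page}?{urlencode(params)}"

# The queue holds plain URLs -- the scheduler stays completely
# ignorant of what each job actually does.
jobs = queue.Queue()
jobs.put(job_url("http://jobserver", "dobillingstuff.aspx", {"rowid": 123}))
jobs.put(job_url("http://jobserver", "sendmessage.aspx", {"messageid": 5678}))

def run_jobs(q, fetch):
    """Drain the queue, requesting each job page. Because each job is
    a separate web call, one failure doesn't bring down the rest."""
    results = []
    while not q.empty():
        url = q.get()
        try:
            results.append((url, fetch(url)))
        except Exception as exc:
            results.append((url, exc))  # record the failure, carry on
    return results
```

In practice `fetch` would be a plain HTTP GET (e.g. `urllib.request.urlopen`), which is also what makes the jobs server independent.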
Thinking along those lines, what if I split up big jobs into little jobs? I could have a ‘master’ job that then churned out a load of smaller ‘child’ jobs…
compilestatsforusers.aspx
…which then generates these queue requests that do the donkey work…
compileuserstats.aspx?userid=1
compileuserstats.aspx?userid=2
compileuserstats.aspx?userid=3
compileuserstats.aspx?userid=4
compileuserstats.aspx?userid=5
…etc…
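That fan-out step can be sketched in a few lines (again in Python, with illustrative names; in reality the master job would pull the user IDs from the database and push each child URL onto the queue):

```python
from urllib.parse import urlencode

def expand_master_job(user_ids):
    """Turn one 'master' job into a list of per-user 'child' job URLs,
    each of which does a small slice of the donkey work."""
    return [f"compileuserstats.aspx?{urlencode({'userid': uid})}"
            for uid in user_ids]
```

This is what makes the design scalable: the child jobs are independent web calls, so they can be spread across as many workers (or servers) as the queue needs.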
Now, all I need is a way of executing these jobs…