We are trying to design/build a system that effectively treats any
number of commodity machines as a large "grid", deploying and running
binaries as resources. These resources are housekept, monitored and
basically kept running. The backend talks to all the servers via UDP
sockets, makes sure the processes are performing optimally, etc.
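To make the UDP part concrete, here is a minimal sketch of the kind of heartbeat exchange the backend could use: each node periodically reports its services over a UDP socket, and the master decodes the datagram. The port, message fields and node/service names are all hypothetical, not anything from an existing tool.

```python
import json
import socket
import time

def send_heartbeat(sock, master_addr, node_id, services):
    """Node side: report which services are alive on this node."""
    payload = json.dumps({
        "node": node_id,
        "ts": time.time(),
        "services": services,   # e.g. {"svc-42": "running"}
    }).encode()
    sock.sendto(payload, master_addr)

def receive_heartbeat(sock):
    """Master side: read one heartbeat datagram and decode it."""
    data, addr = sock.recvfrom(4096)
    return json.loads(data.decode()), addr

# Loopback demonstration: master socket bound locally, one node reports in.
master = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
master.bind(("127.0.0.1", 0))   # ephemeral port, demo only
node = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_heartbeat(node, master.getsockname(), "node-1", {"svc-42": "running"})
msg, _ = receive_heartbeat(master)
print(msg["node"], msg["services"]["svc-42"])
```

In a real deployment the master would track the last-seen timestamp per node and treat a missed heartbeat as a signal to restart or re-deploy the service.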
We want a system that lets users purchase a service from our website.
When a service is paid for, the website writes an entry to a database
running on a master server. Every 5 minutes or so this server checks
whether there are any new DB entries; if there are, it reads them, finds
out what service was purchased, then automatically downloads and
configures the necessary packages (which we have built) onto a remote
server, runs the process and continues to monitor it. This means the
user's service is online as soon as it has been paid for and is watched
to ensure it remains online.
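The polling step above could be sketched roughly like this. The `orders` table schema and the `deploy()` stub are assumptions purely for illustration; the real deploy step would copy and configure the package on a slave.

```python
import sqlite3

def fetch_new_orders(conn):
    """Return unprocessed purchase rows (hypothetical schema)."""
    return conn.execute(
        "SELECT id, service FROM orders WHERE processed = 0"
    ).fetchall()

def deploy(service):
    """Stub for 'download, configure and run the package on a slave'."""
    print(f"deploying {service}")

def poll_once(conn):
    for order_id, service in fetch_new_orders(conn):
        deploy(service)
        conn.execute("UPDATE orders SET processed = 1 WHERE id = ?",
                     (order_id,))
    conn.commit()

# In production this would run every ~5 minutes, e.g.
#   while True: poll_once(conn); time.sleep(300)
# Demo with an in-memory database:
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY,"
             " service TEXT, processed INTEGER DEFAULT 0)")
conn.execute("INSERT INTO orders (service) VALUES ('game-server')")
poll_once(conn)
remaining = fetch_new_orders(conn)
print(len(remaining))   # 0: the order was picked up and marked processed
```

A cron job or a simple sleep loop on the master both work here; marking rows processed inside the same transaction avoids deploying the same purchase twice.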
The website also has to be able to access the database so users can
edit settings, which are then stored in text-based files read by their
individual services.
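For the text-based settings files, a plain INI-style format keeps both sides simple: the website writes the user's edits, and the service re-reads its own file at startup. The section and key names here are made up for the sketch.

```python
import configparser
import io

def write_settings(stream, settings):
    """Website side: persist a user's edited settings as plain text."""
    cp = configparser.ConfigParser()
    cp["service"] = settings
    cp.write(stream)

def read_settings(stream):
    """Service side: read its own settings file at startup."""
    cp = configparser.ConfigParser()
    cp.read_file(stream)
    return dict(cp["service"])

# Round-trip demo using an in-memory stream instead of a real file.
buf = io.StringIO()
write_settings(buf, {"max_players": "16", "motd": "welcome"})
buf.seek(0)
loaded = read_settings(buf)
print(loaded["max_players"])
```

With real files you would write to a temporary file and rename it into place, so a service never reads a half-written config.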
All service updates are built on a remote machine and then uploaded to
the master which automatically upgrades any services running on the slaves.
Control System (packages and updates compiled and uploaded)
-->Remote Master (via secure communication)
-----> Automatically managed and updated remote nodes
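For the master-to-slave upgrade step, one simple approach is content hashing: the master pushes a package only when the slave's copy differs from the current build. A minimal sketch of that check (the version strings are placeholders):

```python
import hashlib

def digest(data: bytes) -> str:
    """Content fingerprint of a package blob."""
    return hashlib.sha256(data).hexdigest()

def needs_update(master_pkg: bytes, node_pkg: bytes) -> bool:
    """Master pushes only when the slave's copy differs from the build."""
    return digest(master_pkg) != digest(node_pkg)

old_build = b"service v1"
new_build = b"service v2"
print(needs_update(new_build, old_build))   # stale slave -> push
print(needs_update(new_build, new_build))   # up to date -> skip
```

In practice tools like rsync over SSH do exactly this kind of delta check for you, which fits the "via secure communication" requirement.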
Any recommendations for cluster tools (or any tools, in fact) that could
be used to achieve this at any level would be most welcome. We were also
looking at using a tool such as UML (User-Mode Linux) so that we could
totally restrict what the users could do, leaving them no way to alter
or affect other users' services.