We are looking to store a large amount of user data that will be
changed and accessed daily by a large number of people. We expect
around 6-8 million subscribers to our service with each record being
approximately 2000-2500 bytes. The system needs to be running 24/7
and therefore cannot be shut down. What is the best way to implement
this? We were thinking of setting up one cluster of servers to hold the
information and another cluster to back up the information. Is this
practical?
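For reference, a back-of-envelope estimate of the raw data size implied by those numbers (taking the upper bounds of 8 million subscribers and 2,500 bytes per record; these figures are the ranges quoted above, not measured values):

```python
# Rough storage estimate from the figures quoted in the question.
# Upper bounds assumed: 8 million subscribers, 2,500 bytes per record.
SUBSCRIBERS = 8_000_000
BYTES_PER_RECORD = 2_500

total_bytes = SUBSCRIBERS * BYTES_PER_RECORD
total_gb = total_bytes / 1e9  # decimal gigabytes

print(f"Raw data: {total_gb:.0f} GB")  # ~20 GB before indexes, overhead, and replicas
```

Note this counts only raw record data; indexes, transaction logs, and backup copies will multiply the real footprint.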
Also, what software is available that can distribute queries across
different servers and manage a large volume of query requests?
Thank you in advance.
Ben