Running Massive Databases
Hi - we've developed a solution for a client. The website imports data feeds from merchants and then lets visitors search for those items by category, with the results shown in order of each merchant's bid per click (i.e. a bid-per-click search engine, but driven by datafeeds). The problem is that the site is growing fast and it's starting to get a tad slow. The machine is a dual Xeon (which shows up as 4 CPUs, presumably hyper-threading) with 1 GB of RAM (expandable to 4 GB).
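For reference, a stripped-down version of the search query looks something like this (the table and column names here are simplified for the post; I suspect the leading-wildcard LIKE is part of the slowdown, since MySQL can't use an index for it):

    SELECT id, title, bid_per_click
    FROM listings
    WHERE category_id = 42
      AND title LIKE '%camera%'
    ORDER BY bid_per_click DESC
    LIMIT 20;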
The database runs on MySQL 4, but we expect it to hit millions of listings in short order, and we're looking into how to let it grow without the slowdowns getting worse.
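The one obvious step we know of is indexing, e.g. a composite index on MySQL 4 / MyISAM so the category filter and the bid ordering can both come from the index (names follow the simplified schema above) - but beyond that we're guessing:

    ALTER TABLE listings
      ADD INDEX idx_category_bid (category_id, bid_per_click);

    # EXPLAIN should then show the index being used rather than a full scan:
    EXPLAIN
    SELECT id, title, bid_per_click
    FROM listings
    WHERE category_id = 42
    ORDER BY bid_per_click DESC
    LIMIT 20;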
How do huge databases (such as this site, which I imagine has millions upon millions of database entries) run efficiently?
We've thought of solutions involving load balancing / clustering, but we need more detail. I know replication is the main way to cluster MySQL, and it may have to do.
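From what we've read, a minimal MySQL 4 master/slave setup would look roughly like this (untested on our end; host names and passwords are placeholders), with the feed imports going to the master and the read-heavy search queries spread across slaves:

    # master my.cnf
    [mysqld]
    server-id = 1
    log-bin

    # slave my.cnf
    [mysqld]
    server-id = 2

    # on the master: create an account the slave can replicate with
    GRANT REPLICATION SLAVE ON *.* TO 'repl'@'%' IDENTIFIED BY 'secret';

    # on the slave: point it at the master and start replicating
    CHANGE MASTER TO
      MASTER_HOST = 'master.example.com',
      MASTER_USER = 'repl',
      MASTER_PASSWORD = 'secret';
    START SLAVE;

Is that the right direction, or is real clustering something else entirely?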
What can we do to speed up searching through millions of entries? Load the whole DB into RAM? Split the database so that different products/merchants sit on different servers?
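On the "load it into RAM" idea, the only built-in option we know of in MySQL 4 is a HEAP table - purely a sketch, since HEAP tables vanish on restart and can't hold TEXT/BLOB columns, so it could only be a search copy rebuilt from the real table:

    # in-memory search copy (MySQL 4 HEAP table, capped by max_heap_table_size)
    CREATE TABLE listings_mem TYPE=HEAP
      SELECT id, merchant_id, category_id, title, bid_per_click
      FROM listings;

Splitting by product/merchant would presumably mean the application deciding which server to query for a given category or merchant, which is a bigger code change.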
Any real suggestions (maybe some from the WHT staff) as to how this can be done would be greatly appreciated.
Thanks.