resource use: wget vs. ftp
I just got a "new" server tonight - a Red Hat 7.3 Pentium 233 MMX with 64 MB of RAM. (It's for secondary DNS and to hold a couple of tar.gz backup files.) I was testing it tonight to see how stable it was by running seti on it (as I'm in the habit of doing every time I get a new computer/server).
I was also moving a 1.7 GB backup .tar.gz file onto the server, using wget to pull it from another server. I had previously moved 1.5 GB of files onto the new 233 server via ftp, and that went fine. However, at 5:30 am, 80% of the way through the wget, the server just stopped responding to ssh, http, webmin, and ftp, so it must have locked up.
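Not a fix for the lockup itself, but once the box is back up the partial download doesn't have to be started over. A rough sketch, assuming the file is served over HTTP; the URL below is a placeholder, not the actual server:

```shell
# Placeholder URL - substitute the real source server and path.
URL="http://example.com/backup.tar.gz"

# -c resumes from wherever the partial file left off instead of
# re-pulling all 1.7 GB; --limit-rate eases the load on a slow box.
wget -c --limit-rate=500k "$URL" || echo "transfer interrupted; rerun to resume"
```

Throttling the transfer also leaves some headroom for seti and sshd, which may help the machine stay responsive.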
I had a total of 3.5 GB of free space on the hard drive, with 2.5 GB free in the partition into which I was placing the 1.7 GB backup file.
Do you think it locked up due to using wget, or due to running seti on it? (I do have seti running on a dual PPro 200 here at home and it's been running fine for two years now, so I didn't think the low CPU speed would cause it to crash, but maybe the 64 MB of memory is the limiting factor with the new 233.)
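One way to test the 64 MB theory is to watch memory while seti and a transfer are both running. A quick sketch, assuming a stock Linux box with the usual procps tools:

```shell
# RAM and swap usage in MB; a tiny 'free' column plus heavy swap use
# suggests memory, not CPU, is the limiting factor.
free -m

# vmstat samples once a second; nonzero 'si'/'so' (swap-in/swap-out)
# columns mean the box is paging, which on a Pentium 233 with slow
# disk can make it look completely hung.
command -v vmstat >/dev/null && vmstat 1 2 || true
```

If the machine is swapping hard during the transfer, it may not have truly crashed - it may just have been too bogged down to answer ssh.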
Once it comes back online later today, will there be any way to see what caused the crash?