How do I handle an extremely large tar file?

I use my server to serve very large video files. A few weeks ago the server was hacked, and I had to do a reinstall. I specifically asked my provider to keep the files, and the tech who did the reinstall tar'd and zipped the /home directory.

The server is now back online. However, the tar file (unzipped) is about 6.9GB. I tried untarring it, but after waiting for 20 minutes my /home partition filled up, and I think some files were truncated because of that.
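From skimming the man page, I think tar's -C flag would let me extract onto a different partition instead of /home, but I'm not sure I have the syntax right. Here is a tiny self-contained demo under /tmp of what I mean (the filenames are just placeholders, not my real data):

```shell
# Tiny demo: pack a directory, then extract it somewhere else with -C.
# On my real box the target would be a roomier partition such as /usr.
cd /tmp
mkdir -p src restore
echo video-data > src/clip.mpg
tar -cf archive.tar src
# -C tells tar to change into the given directory before extracting,
# so nothing lands in the current (nearly full) partition.
tar -xf archive.tar -C restore
ls restore/src
```

Is that the right approach for a 6.9GB archive, or will it be just as slow?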

I was hoping to extract individual files to another partition, maybe /usr, temporarily. But I don't know the command for extracting only some files. I tried a tar --list, but it just hung for 5 minutes and I had to kill it. I also know there are a few large files in the archive that I don't need, but I don't know how to delete them from the tarball (nor do I know their names, since tar --list doesn't work for me).
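From reading around, I *think* the operations I need are -t to list, naming a member on the extract command line, and GNU tar's --delete, but I haven't been able to confirm any of this on the real archive. A small made-up demo of what I'm imagining (all names are placeholders):

```shell
# Demo archive with one file I want and one "huge video" I don't.
cd /tmp
rm -rf big out && mkdir -p big out
echo keep > big/keep.txt
echo junk > big/huge-video.avi
tar -cf big.tar big

tar -tf big.tar                              # list member names
tar -xf big.tar -C out big/keep.txt          # extract just one member, elsewhere
tar --delete -f big.tar big/huge-video.avi   # GNU tar: remove a member in place
```

Would --delete even be practical on a 6.9GB file with my hardware, given that it has to rewrite the archive?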

My machine is fairly low-end: it's a Celeron 800 with 128MB of RAM. What would you recommend for this problem?

Here is the df output without the files extracted (but with the large tar file present):

------------
Filesystem   1k-blocks      Used  Available  Use%  Mounted on
/dev/hda8       256667    196019      47396   81%  /
/dev/hda1        54416     14585      37022   29%  /boot
/dev/hda6     12871596   7272012    4945744   60%  /home
/dev/hda5     12871596    821464   11396292    7%  /usr
/dev/hda7       256667     33005     210410   14%  /var
------------

and here is the tar file itself, from ls -lhs:

------------
7.0G -rw-r--r-- 1 root root 6.9G Feb 27 14:37 home.tar
------------

Thanks,

Peter
