Extracting Tar files in PHP in safe mode with no shell access

I have just about had it with web servers that give no shell access and operate in safe mode. I wanted to install Dataface on my webspace here at sjhannah.com. The full installation is about 2.6 megs as a compressed tar file (much, much bigger uncompressed), so uploading the uncompressed files to the server is pretty much infeasible. Given the tendency of FTP to hang before, during, and after each file is uploaded, it would likely take 5 to 6 hours to upload everything the conventional way with an FTP client.

What I really needed was a way to upload the compressed tar archive to the server and then extract its contents on the server. Sounds easy, right? Well, not when your server doesn’t provide shell access and PHP is executed in safe mode. Okay, okay, normally you could create a PHP script like the following:

<?php
// Shell out to tar to extract the gzipped archive.
// Note that the 'f' flag must come last, since it takes
// the archive filename as its argument.
system("tar -xzvf filename.tar.gz");
?>

Except that my server doesn’t even provide ‘tar’ in the path, so even this command won’t work on my server.

PEAR to the rescue!!!

PEAR is a repository of reusable PHP classes. When I need to get something done in PHP, PEAR is usually the first place I look. Luckily, I found a great class called Archive_Tar that provides a pure PHP solution for tar file manipulation.
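Since there is no shell access (and therefore no running the pear installer), “installing” the class really just means uploading its files over FTP. Here is a minimal sketch of the setup, assuming Tar.php is uploaded as Archive/Tar.php inside a pear/ directory next to the script; the pear/ directory name is my own convention, not anything Archive_Tar requires:

<?php
// Add the directory holding the uploaded PEAR classes to the include
// path so that require_once 'Archive/Tar.php' can find them.
// The 'pear' directory name here is just an example.
set_include_path(get_include_path() . PATH_SEPARATOR . dirname(__FILE__) . '/pear');
require_once 'Archive/Tar.php';
?>

If Archive/Tar.php sits directly beside the script, the include path tweak is usually unnecessary, since the current directory is typically already on the include path.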

After downloading the class and installing it in my web space, I wrote a new script to take care of the extraction:

<?php
require_once 'Archive/Tar.php';

// The archive to extract, named by the 'filename' query parameter.
// Passing $_GET input straight to the filesystem is risky; consider
// restricting it with basename() or a whitelist of known archives.
$archive = new Archive_Tar($_GET['filename']);

// Extract into the current directory.
$res = $archive->extract();
if ( PEAR::isError($res) ){
	echo $res->toString();
}
?>

This script takes a GET parameter ‘filename’ to specify the name of the file to be extracted. It then extracts that file into the same directory!! And voila!! We can run this script and extract our file.
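For example, with the script saved as untar.php next to the uploaded archive (both file names here are just for illustration), the extraction can be kicked off right from the browser:

http://www.example.com/untar.php?filename=dataface.tar.gz

Archive_Tar’s extract() also accepts an optional target path (e.g. $archive->extract('dataface')) if you would rather unpack into a subdirectory than into the script’s own directory.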

2 thoughts on “Extracting Tar files in PHP in safe mode with no shell access”

  1. What about large files over 2 gigs… does this solution conserve memory by not loading the whole file into memory at once?

  2. I’m not 100% sure about the memory thing, but I looked through the source and it doesn’t appear to load the whole file into memory. Rather, it seems to work with the file as a stream, so I would think it should handle large files over 2 gigs. Your only limitation may be your file system (i.e. as long as your file system supports files over 2 gigs, Archive_Tar will probably be able to extract them).
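     One quick way to check this on your own server is to log PHP’s peak memory usage after extracting a large archive (a minimal sketch, assuming PHP 5.2+ for memory_get_peak_usage(); the archive name is hypothetical):

     <?php
     require_once 'Archive/Tar.php';

     $archive = new Archive_Tar('big-archive.tar.gz');
     $archive->extract();

     // If Archive_Tar streams the file, peak usage should stay far
     // below the size of the archive itself.
     echo 'Peak memory: ' . memory_get_peak_usage() . " bytes\n";
     ?>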
