With Amazon S3 being as good and cheap as it is, it’s almost essential for what I need it for…storing images and large static files. The problem is there’s no interface besides SOAP, and if you don’t know how I feel about SOAP, let me tell you: it makes my head want to fucking explode. It’s insanely complicated for what it does. It tries to standardize so many things that it’s completely bloated…sending the message “hello” from one computer to another in SOAP takes, oh, about 10 years…5 years for a team of 50 supercomputers working in tandem to build the header and message body (which will total about 400 MB when finally complete), .02ms to send, and 5 years of decoding. Use JSON, you crackheads. Sure, you’ll actually have to document it, but nothing is worse than SOAP, not even documenting.

That’s beside the point though. S3 chose to use SOAP, so I refuse to write my own client for it. This means that, as of late, the world is without a good free S3 uploading client. S3Fox, the Firefox extension, is OK…it can’t handle SSL connections though, so expect to lose your private key to a sniffer about 10 seconds after your first request. JungleDisk is now a completely paid service (I already pay for S3, I’m not paying those guys to fucking USE it). Linux has some great command-line tools for S3 (yay…), but that leaves Windows with either S3Fox, JD ($$$), or a handful of shitty S3 clients.

Right now, I have to use a PHP script I built around the S3 PHP Class to do any uploading that doesn’t make me vomit. The S3 class works REALLY well…it lets you assign an ACL while uploading, change headers for images (for browser-side caching, mmm), and best of all, it doesn’t completely suck. Let’s all thank Donovan Schönknecht for writing something that actually works well and communicates with S3.

Here’s a piece of code I wrote that wraps around the S3 uploader. It uploads images, sets Cache-Control headers, and removes the images. What you want to do is copy your images into a folder, run this file one directory up (set $start_folder to the name of the folder your images are in), and sit back. It will upload all your images, directory structure preserved, publicly viewable and with cache-control headers.

<?php
	// quick config
	$bucket			=	'your.bucket.com';
	$start_folder	=	'images';

	// settings
	error_reporting(E_ALL);
	ini_set('display_errors', 1);
	ini_set('max_execution_time', 3600);

	// include S3 class
	include 'S3.php';
	$s3	=	new S3('[your key]', '[your secret]', false);	// third argument toggles SSL

	// get list of files. if you don't want a subdirectory, just change this line to not need one. hopefully you know PHP...
	$files	=	recurse(array(), $start_folder);

	// loop over files and upload
	for($i = 0, $n = count($files); $i < $n; $i++)
	{
		$ext	=	strtolower(preg_replace('/.*\./', '', $files[$i]));
		$type	=	'image/jpeg';
		if($ext == 'jpg' || $ext == 'jpeg')
		{
			$type	=	'image/jpeg';
		}
		else if($ext == 'gif')
		{
			$type	=	'image/gif';
		}
		else if($ext == 'png')
		{
			$type	=	'image/png';
		}

		if(
			!$s3->putObject(
				$s3->inputFile($files[$i]),
				$bucket,
				$files[$i],
				S3::ACL_PUBLIC_READ,
				array(),
				array('Cache-Control' => 'max-age=31536000', 'Content-Type' => $type)
			)
		)
		{
			echo '<span style="color:red;">Failed: upload of '. $files[$i] .'</span><br />';
		}
		else
		{
			echo '<span style="color:green;">Succeeded: upload of '. $files[$i] .'</span><br />';
			unlink($files[$i]);
		}
	}

	function recurse($files, $dir)
	{
		$d	=	scandir($dir);

		for($i = 0, $n = count($d); $i < $n; $i++)
		{
			if(!preg_match('/^\./', $d[$i]))
			{
				if(is_dir($dir . '/' . $d[$i]))
				{
					$files	=	recurse($files, $dir . '/' . $d[$i]);
				}
				else
				{
					$files[]	=	$dir . '/' . $d[$i];
				}
			}
		}

		return $files;
	}
?>
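
If you want something a little sturdier than the if/else chain in the script above, here’s a sketch of the same extension-to-MIME-type lookup as a standalone function. It uses pathinfo() and strtolower() so files named IMG.JPG or photo.JPEG get the right type too, and falls back to application/octet-stream for anything it doesn’t recognize (the fallback choice is my assumption…pick whatever default suits you).

```php
<?php
// A more forgiving MIME-type lookup than the if/else chain:
// case-insensitive, handles "jpeg", and has an explicit fallback.
function guess_image_type($filename)
{
	$map	=	array(
		'jpg'	=>	'image/jpeg',
		'jpeg'	=>	'image/jpeg',
		'gif'	=>	'image/gif',
		'png'	=>	'image/png',
	);

	// pathinfo() grabs everything after the last dot; lowercase it
	// so extension matching isn't case-sensitive.
	$ext	=	strtolower(pathinfo($filename, PATHINFO_EXTENSION));

	return isset($map[$ext]) ? $map[$ext] : 'application/octet-stream';
}
```

Drop it in above the loop and replace the whole chain with one call: `$type = guess_image_type($files[$i]);`.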

Feel free to modify, copy, blah blah…but give credit where it’s due. Let it be a light to you when all other lights go out. Hopefully it helps someone, because it sure helps me out.
