
// Seek to the beginning of the stream
$result['Body']->rewind();

// Read the body off of the underlying stream in chunks
while ($data = $result['Body']->read(1024)) {
    echo $data;
}

// Cast the body to a primitive string
// Warning: This loads the entire contents into memory!
$bodyAsString = (string) $result['Body'];
Saving objects to a file
You can save the contents of an object to a file by setting the SaveAs parameter.
$result = $client->getObject(array(
    'Bucket' => $bucket,
    'Key'    => 'data.txt',
    'SaveAs' => '/tmp/data.txt'
));

// Contains an EntityBody that wraps a file resource of /tmp/data.txt
echo $result['Body']->getUri() . "\n";
// > /tmp/data.txt
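If you have already retrieved an object without the SaveAs parameter, you can still write the body to a file without loading it all into memory by streaming it in chunks, using the same rewind() and read() calls shown earlier. A minimal sketch (the destination path is arbitrary):
// Stream the already-downloaded body to a file in 1 KB chunks
// (illustrative sketch; the destination path is arbitrary)
$handle = fopen('/tmp/data-copy.txt', 'w');
$result['Body']->rewind();
while ($data = $result['Body']->read(1024)) {
    fwrite($handle, $data);
}
fclose($handle);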
Uploading large files using multipart uploads
Amazon S3 allows you to upload large files in pieces. The AWS SDK for PHP provides an abstraction layer that
makes it easier to upload large files using multipart upload.
use Aws\Common\Enum\Size;
use Aws\Common\Exception\MultipartUploadException;
use Aws\S3\Model\MultipartUpload\UploadBuilder;

$uploader = UploadBuilder::newInstance()
    ->setClient($client)
    ->setSource('/path/to/large/file.mov')
    ->setBucket('mybucket')
    ->setKey('my-object-key')
    ->setOption('Metadata', array('Foo' => 'Bar'))
    ->setOption('CacheControl', 'max-age=3600')
    ->build();

// Perform the upload. Abort the upload if something goes wrong
try {
    $uploader->upload();
    echo "Upload complete.\n";
} catch (MultipartUploadException $e) {
    $uploader->abort();
    echo "Upload failed.\n";
}
You can attempt to upload parts in parallel by specifying the concurrency option on the UploadBuilder object. The
following example creates a transfer object that attempts to upload three parts in parallel until the entire object
has been uploaded.
$uploader = UploadBuilder::newInstance()
    ->setClient($client)
    ->setSource('/path/to/large/file.mov')
    ->setBucket('mybucket')
    ->setKey('my-object-key')
    ->setConcurrency(3)
    ->build();
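The Size enum imported above provides convenient byte constants for working with part sizes. As a sketch, assuming the builder's setMinPartSize() method (present on the SDK 2.x UploadBuilder), you can also control how large each uploaded part is:
// Sketch: raise the minimum part size to 25 MB using the Size enum.
// Assumes setMinPartSize() is available on the UploadBuilder (SDK 2.x).
$uploader = UploadBuilder::newInstance()
    ->setClient($client)
    ->setSource('/path/to/large/file.mov')
    ->setBucket('mybucket')
    ->setKey('my-object-key')
    ->setMinPartSize(25 * Size::MB)
    ->build();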