I'm creating presigned URLs with aws-sdk-php for uploading files to an S3 bucket. The URLs for GET are working.
Here is the code:
$client = S3Client::factory(array(
    'region' => 'eu-west-1',
    'key'    => 'xxx',
    'secret' => 'xxx',
));

// option 1
$command = $client->getCommand('PutObject', array(
    'Bucket'      => 'myBucket',
    'Key'         => 'testing/signedPHP1_' . time(),
    'ContentType' => 'image/jpeg',
    'Body'        => 'dump' // it's mandatory
));
$signedUrl = $command->createPresignedUrl('+5 minutes');
$signedUrl .= '&Content-Type=image%2Fjpeg';
echo("\nThe URL is: " . $signedUrl . "\n");
echo("Now run from console for upload:\ncurl -v -H \"Content-Type: image/jpeg\" -T /tmp/temp.jpg '" . $signedUrl . "'");

// option 2
$request = $client->put('myBucket/testing/signedPHP2_' . time());
$signedUrl = $client->createPresignedUrl($request, '+5 minutes');
$signedUrl .= '&Content-Type=image%2Fjpeg';
echo("\nThe URL is: " . $signedUrl . "\n");
echo("Now run from console for upload:\ncurl -v -H \"Content-Type: image/jpeg\" -T /tmp/temp.jpg '" . $signedUrl . "'");

// GET which works
$request = $client->get('myBucket/testing/existingFile.txt');
$signedUrl = $client->createPresignedUrl($request, '+5 minutes');
echo("\nThe URL is: " . $signedUrl . "\n");
echo("Now run:\ncurl '" . $signedUrl . "'");

// GET which works
$command = $client->getCommand('GetObject', array(
    'Bucket' => 'myBucket',
    'Key'    => 'upload/data.txt',
));
$signedUrl = $command->createPresignedUrl('+5 minutes');
echo("\nThe URL is: " . $signedUrl . "\n");
echo("Now run:\ncurl '" . $signedUrl . "'");
When trying to use the curl command I'm getting an error SignatureDoesNotMatch with the message: The request signature we calculated does not match the signature you provided. Check your key and signing method.
Similar code using the aws-sdk for JavaScript works:
var AWS = require('aws-sdk');
AWS.config.update({ accessKeyId: 'xxx', secretAccessKey: 'xxx', region: 'eu-west-1' });

var s3 = new AWS.S3();
var params = {
    Bucket: 'myBucket',
    Key: 'testing/preSignedURLnodeJS_' + (+new Date),
    ContentType: 'image/jpeg',
    Expires: 60 * 5
};

s3.getSignedUrl('putObject', params, function(err, url) {
    console.log('The URL is: ', url);
    console.log('Now run from console for upload:\ncurl -v -H "Content-Type: image/jpeg" -T /tmp/temp.jpg \'' + url + '\'');
});
I've already done a lot of research but found no results. What am I doing wrong?
On GitHub I got an answer from an SDK developer; it says:
$command = $client->getCommand('PutObject', array(
    'Bucket'      => 'myBucket',
    'Key'         => 'testing/signedPHP1_' . time(),
    'ContentType' => 'image/jpeg',
    'Body'        => '',
    'ContentMD5'  => false
));
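Put together with the snippet from the question, the rest stays the same; a minimal sketch that reuses createPresignedUrl and the curl test exactly as in the question (bucket, key and expiry are the question's own placeholders, and the manual '&Content-Type=...' query suffix from the question is left out here):

// Continuing from the $command above: sign it as-is and hand the URL to curl.
// The Content-Type header that curl sends should match the 'ContentType'
// that was signed ('image/jpeg').
$signedUrl = $command->createPresignedUrl('+5 minutes');

echo("\nThe URL is: " . $signedUrl . "\n");
echo("Now run from console for upload:\ncurl -v -H \"Content-Type: image/jpeg\" -T /tmp/temp.jpg '" . $signedUrl . "'");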
If you are using the SDK, why not use the putObject command rather than creating a PUT request yourself? It will take care of the signed URL for you.
$result = $client->putObject(array(
    'Bucket'     => $bucket,
    'Key'        => 'data_from_file.txt',
    'SourceFile' => $pathToFile
));
http://docs.aws.amazon.com/aws-sdk-php/latest/class-Aws.S3.S3Client.html#_putObject
If you don't want that, you need to look at the body of your PUT command: it should be the contents of the file you are uploading, not just a random string like 'dump'.
Run Wireshark on the machine running the command and you will see that the body of the request curl makes is the contents of the file.
Here you go:
First, create the presigned URLs:
<?php
/**
 * getDocumentUploadUrls
 *
 * Creates a list of URLs so you can upload multiple files per
 * document. After uploading the files you are required to
 * confirm the uploads.
 *
 * @param int $documentId the document id
 * @param int $count      the number of files you want to upload for this document
 *
 * @return array List of URLs to use with a PUT request to upload files to S3.
 */
public function getDocumentUploadUrls(int $documentId, int $count = 1): array
{
    $awsService = $this->getService('aws');
    $s3Client = $awsService->getSdk()->createS3();
    $bucket = 'yourbucket';
    $result = [];
    $expire = '+20 minutes';

    for ($i = 0; $i < $count; $i++) {
        $fileCount = $i + 1;
        $key = "{$documentId}/{$fileCount}";

        $cmd = $s3Client->getCommand('PutObject', [
            'Bucket' => $bucket,
            'Key'    => $key
        ]);
        $request = $s3Client->createPresignedRequest($cmd, $expire);

        $result[] = [
            'url'       => (string) $request->getUri(),
            'reference' => "{$bucket}/{$key}"
        ];
    }

    return $result;
}
The result may look similar to this:
$result = [
    0 => [
        'url'       => 'AwsS3://put/uri',
        'reference' => 'docId/count'
    ]
];
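If you just want to sanity-check the generated URLs from a console first (the same way the question tests its URLs), here is a small sketch that prints one curl command per entry, assuming $result holds the array returned by getDocumentUploadUrls():

// Sketch: print a curl upload command for each presigned URL so it can be
// tested from a console. /tmp/temp.jpg is just a placeholder file.
foreach ($result as $entry) {
    echo "Upload for " . $entry['reference'] . ":\n";
    echo "curl -v -T /tmp/temp.jpg '" . $entry['url'] . "'\n";
}

Because this PutObject command does not pin a ContentType, the curl call should not need a matching Content-Type header in this case.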
Now, to upload with PHP cURL:
if ($request->isPost()) {
    $files = $request->getFiles()->toArray();
    $files = reset($files);
    $data = $request->getPost();
    $docId = $data['documentId']; // get this from your frontend POST params
    $docCount = count($files);

    try {
        $endpoints = $this->getDocumentUploadUrls($docId, $docCount);
        $uploadInfo = $this->uploadDocuments($endpoints, $files);
    } catch (\Exception $e) {
        // handle exception
    }
}
public function uploadDocuments(array $endpoints, array $files)
{
    $info = [];

    foreach ($files as $index => $file) {
        // The number of files must match the number of endpoints for this to work.
        $url = $endpoints[$index]['url'];
        $type = isset($file['type']) ? $file['type'] : 'application/octet-stream';

        $headers = [
            "Content-Type: $type",
            'Access-Control-Allow-Origin: *',
        ];

        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_CUSTOMREQUEST, 'PUT');
        // Send the raw file contents as the PUT body (passing the $file array
        // itself would turn the request into a multipart form post).
        curl_setopt($ch, CURLOPT_POSTFIELDS, file_get_contents($file['tmp_name']));
        curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);

        $response = curl_exec($ch);
        if (curl_errno($ch)) {
            throw new \Exception(curl_error($ch));
        }
        curl_close($ch);

        $info[] = [
            'headers'  => $headers,
            'response' => $response
        ];
    }

    return $info;
}
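A note on large files: the loop above reads each whole file into memory with file_get_contents. As an alternative (again a sketch, not part of the original answer), cURL can stream the body from disk with CURLOPT_UPLOAD and CURLOPT_INFILE, assuming the same $url, $headers and $file array as in the loop above:

// Stream the file from disk instead of buffering it in memory.
// CURLOPT_UPLOAD makes cURL issue a PUT and read the body from CURLOPT_INFILE.
$fh = fopen($file['tmp_name'], 'rb');

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_UPLOAD, true);
curl_setopt($ch, CURLOPT_INFILE, $fh);
curl_setopt($ch, CURLOPT_INFILESIZE, filesize($file['tmp_name']));
curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);

$response = curl_exec($ch);
curl_close($ch);
fclose($fh);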