How to Upload Files to S3 Server using Apex
Uploading and reading files from the AWS S3 server is a frequently used requirement in Salesforce projects. This post explains in detail how to write the current record's files to the S3 server using Apex.
Important information about writing files to the S3 server:
- We require write permission on the bucket.
- If multiple files with the same name are saved to the same destination, the last uploaded file overwrites the earlier one.
- Use Content-MD5 as an integrity check on the uploaded content.
- We can use AWS's built-in server-side encryption; custom encryption logic also works while writing files to the S3 server.
- We should set access-level permissions on files using request headers. By default, files are saved as private.
- Files can be versioned. If required, enable versioning on the S3 bucket.
Required request parameters (see the sample request after this list):
- Content-Type: Required, as it describes the content type, such as PDF, JPEG, or JPG. See the ContentType method in the code below.
- Content-Length: Specifies the size of the body in bytes. Ideally it is determined automatically, but if automatic size determination does not work, set it explicitly.
- Host: Contains your bucket name and region on the S3 server. The value pattern is {bucketName}.s3.{regionName}.amazonaws.com.
- Content-Encoding: Records which content encoding was used to store the file, so the same encoding can be applied when reading it back.
- ACL: The Access-Control-List header sets the access level on the object after it is uploaded to the S3 server. Examples: public-read, public-read-write.
- Endpoint: The file's URL in the S3 bucket.
- Authorization: Authorization details for writing the file to the S3 server. See the CreateAuthHeader method in the code.
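Here is a minimal sketch of such a PUT request in Apex. The bucket, region, file name, and credentials are placeholder assumptions, and the signing shown is AWS Signature Version 2 (HMAC-SHA1 over a date-stamped string-to-sign), which appears to be what CreateAuthHeader implements, since a later comment notes this code predates AWS Signature Version 4. Note that canned ACLs travel in the x-amz-acl header and must be included in the string-to-sign.

// A sketch only; bucket, region, file name and credentials are assumptions.
String bucket = 'my-bucket';
String region = 'us-east-1';
String fileName = 'demo.pdf';
String contentType = 'application/pdf';
String accessKey = 'YOUR_ACCESS_KEY';
String secretKey = 'YOUR_SECRET_KEY';
Blob body = Blob.valueOf('file content goes here');

String host = bucket + '.s3.' + region + '.amazonaws.com';
String formattedDate = Datetime.now().formatGMT('EEE, dd MMM yyyy HH:mm:ss z');

HttpRequest req = new HttpRequest();
req.setMethod('PUT');
req.setEndpoint('https://' + host + '/' + fileName);
req.setHeader('Host', host);
req.setHeader('Content-Type', contentType);
// Usually computed automatically; set it only if auto-detection fails.
req.setHeader('Content-Length', String.valueOf(body.size()));
req.setHeader('Date', formattedDate);
// Canned ACL header understood by the S3 REST API.
req.setHeader('x-amz-acl', 'public-read');
req.setBodyAsBlob(body);

// AWS Signature Version 2: HMAC-SHA1 over the canonical string-to-sign.
String stringToSign = 'PUT\n\n' + contentType + '\n' + formattedDate +
    '\nx-amz-acl:public-read\n/' + bucket + '/' + fileName;
String signature = EncodingUtil.base64Encode(
    Crypto.generateMac('hmacSHA1', Blob.valueOf(stringToSign),
                       Blob.valueOf(secretKey)));
req.setHeader('Authorization', 'AWS ' + accessKey + ':' + signature);

HttpResponse res = new Http().send(req);
System.debug(res.getStatusCode() + ' ' + res.getStatus());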
Reading files in Apex:
To upload files to the S3 server, we first have to read them from the ContentVersion object. ContentVersion holds the file content, and ContentDocumentLink holds the file-sharing details. To get a file's content, we first get the ContentDocument IDs from the ContentDocumentLink object, and then we can read the file content from the ContentVersion object.
List<ContentDocumentLink> links = [SELECT ContentDocumentId, LinkedEntityId
    FROM ContentDocumentLink WHERE LinkedEntityId = :recordId];
Set<Id> ids = new Set<Id>();
for (ContentDocumentLink link : links) {
    ids.add(link.ContentDocumentId);
}
List<ContentVersion> versions = [SELECT VersionData, Title, ContentDocumentId, FileExtension
    FROM ContentVersion WHERE ContentDocumentId IN :ids AND IsLatest = true];
for (ContentVersion attach : versions) {
    // attach.VersionData holds the file body as a Blob
}
Complete Code:
An exception class for raising custom exceptions in the code.
BaseException class code:
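The class body is not reproduced above; a minimal sketch of what such a base exception class looks like, assuming it simply extends Apex's built-in Exception:

public class BaseException extends Exception {
    // Custom exception type used to surface callout and validation
    // failures from the S3 upload code with a meaningful message.
}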
Now our code is ready. Let us use the above AWSService from a controller by passing the S3 credentials.
S3Controller class code:
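The controller body is likewise not reproduced above; a minimal sketch of its shape as described in the post and comments, assuming an AWSService.UploadFile helper that wraps the signed PUT callout (the helper's name and signature are assumptions; the UploadDocToS3Server method name is confirmed by a comment below):

public with sharing class S3Controller {

    // Uploads the latest files attached to a record to S3
    // and returns their S3 URLs.
    @AuraEnabled
    public static List<String> UploadDocToS3Server(Id recordId) {
        Set<Id> docIds = new Set<Id>();
        for (ContentDocumentLink link : [SELECT ContentDocumentId
                FROM ContentDocumentLink
                WHERE LinkedEntityId = :recordId]) {
            docIds.add(link.ContentDocumentId);
        }
        List<String> urls = new List<String>();
        for (ContentVersion version : [SELECT VersionData, Title, FileExtension
                FROM ContentVersion
                WHERE ContentDocumentId IN :docIds AND IsLatest = true]) {
            // Assumed helper wrapping the signed PUT callout sketched earlier.
            urls.add(AWSService.UploadFile(version));
        }
        return urls;
    }
}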
This S3Controller class can be used by any caller, such as a Lightning component, a Lightning web component, or another Apex class.
Reference:
AWS Documentation: https://docs.aws.amazon.com/AmazonS3/latest/API/API_PutObject.html
Comments:
Thank You for appreciating.
Hi Dhanik, is this implementation something that can be called through a ContentVersion trigger and work through Salesforce’s inherent Drag&Drop functionality for files?
Hello Chris,
We can use this in a ContentVersion trigger as well. The scenario could be: on upload of any file on a Case, we upload that file to the S3 server and save the S3 URL on the Case or a related object.
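A minimal sketch of that wiring, assuming the trigger and Queueable class names and the AWSService.UploadFile helper (triggers cannot make callouts directly, so the upload runs asynchronously):

// ContentVersionTrigger.trigger
trigger ContentVersionTrigger on ContentVersion (after insert) {
    Set<Id> versionIds = new Set<Id>();
    for (ContentVersion cv : Trigger.new) {
        versionIds.add(cv.Id);
    }
    System.enqueueJob(new S3UploadJob(versionIds));
}

// S3UploadJob.cls
public class S3UploadJob implements Queueable, Database.AllowsCallouts {
    private Set<Id> versionIds;
    public S3UploadJob(Set<Id> versionIds) { this.versionIds = versionIds; }
    public void execute(QueueableContext ctx) {
        for (ContentVersion cv : [SELECT VersionData, Title, FileExtension
                FROM ContentVersion WHERE Id IN :versionIds]) {
            AWSService.UploadFile(cv); // assumed callout wrapper
        }
    }
}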
Thank You,
Dhanik
Hi Dhanik, thank you for the clarification. And this functionality stores the S3 bucket route within the FileStore custom object, which means the files are visible through the FileStore related list, correct? You wouldn’t be able to access them via Salesforce’s native Files list?
Yes, the FileStore object stores the file's URL in the S3 bucket. We cannot directly view files from the FileStore object, as viewing them requires authentication. So we need to implement a Lightning button that passes the authentication details; after successful authentication the file is shown. This functionality is covered in another blog: http://salesforcecodex.com/2020/01/download-files-from-s3-server-using-apex/
Hi Dhanik, is the RecordID being passed into the UploadDocToS3Server the ContentVersion ID?
It is the record ID of any entity, e.g. the Account or Case ID for which the attachment was uploaded.
Hi Dhanik, thank you for posting your code. It is very nicely written and well-organized.
I wondered if you might have a thought about the following. I am able to upload a file to an S3 bucket using your code. The bucket itself is public. When I access the uploaded file through the Amazon UI, I get an access denied. If I run a similar program written in Java, the file is accessible. The line of code
req.setHeader('ACL', 'public-read');
looks entirely correct based on all of my googling. I would suspect that the issue was the configuration of the S3 bucket, except that the line of code in the Java program
s3client.putObject(new PutObjectRequest(s3Bucket, fileKey,
new File(filePath))
.withCannedAcl(CannedAccessControlList.PublicRead));
attempts to do the same thing and works fine. If you don’t have any thoughts as to what may be up, and barring getting the ACL correct on the upload, do you otherwise have any thoughts on how to make a separate call to update the ACL?
Meanwhile, I am going to explore your other posts! Thanks again.
Hello Gordon,
You can see the ACL information at https://docs.aws.amazon.com/AmazonS3/latest/dev/acl-overview.html#canned-acl. Per 'public-read', only the owner has full access and others have read access. If you use the same user in the UI, it should open. You can check with the AWS team as well.
Thank You,
Dhanik
Hi Gordon,
Did you find a solution to this? I face the same issue: when I upload an image to S3, it is not accessible publicly (by everyone), and for security reasons we don't want to change the config of the S3 bucket. Should we go for canonical request building in Apex?
Hi Dhanik
Is there any limitation on file size when uploading to S3? Can we also upload large files using this?
Thanks
Hello Sushma,
There is a limit of 5 GB per PUT request. You can also refer to the documentation: https://docs.aws.amazon.com/AmazonS3/latest/dev/UploadingObjects.html
Thank You,
Dhanik
Hi Dhanik,
Thanks for the awesome code, but when I use the same code for our AWS integration, I get a 505 HTTP Version Not Supported error.
Can you please help me find where it went wrong and how to resolve it?
Hello Prasanth,
This error is thrown when the remote URL is not correct. Please check the URL once again.
Thank You,
Dhanik
Hi,
Trying to use the above upload document code and getting a Bad Request error:
System.HttpRequest[Endpoint=https://salesforces.s3.ap-south-1.amazonaws.com/tablueau.txt, Method=PUT]
23:42:40.127 (1127788525)|CALLOUT_RESPONSE|[132]|System.HttpResponse[Status=Bad Request, StatusCode=400]
What may be the reason? Please help me resolve it.
Thanks.
Hello Prabakar,
Please try adding the AWS URL to Remote Site Settings.
Thank You,
Dhanik
Trying to use the above upload document code and getting the below error.
System.CalloutException: Unable to tunnel through proxy. Proxy returns "HTTP/1.1 503 Service Unavailable",
System.HttpRequest[Endpoint=https://sfdcs3new.s3.us-east-1.amazonaws.com/test.png, Method=PUT]
I have put endpoint in remote site settings also.
Thanks.
Hello Shiv,
It could be a firewall or internet proxy issue. Please check that once.
Thank You,
Dhanik
Hello DHANIK LAL SAHNI,
I am using AWS.S3.ListObjectsRequest and AWS.S3.ListObjects to get files from AWS in Salesforce. It works fine, but I want to write a unit test class for this. How can I mock all these requests?
Hello Kiran,
Please refer to https://salesforce.stackexchange.com/questions/35016/implementation-of-aws-toolkit-test-classes-test-setmock-have-any-done-it for this.
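For plain HTTP callouts like the PUT in this post, the standard HttpCalloutMock pattern applies; a minimal sketch (the mock class name and the canned 200 response are assumptions):

@isTest
global class S3ServiceMock implements HttpCalloutMock {
    global HttpResponse respond(HttpRequest req) {
        // Return a canned success response instead of calling S3.
        HttpResponse res = new HttpResponse();
        res.setStatusCode(200);
        res.setBody('');
        return res;
    }
}

// Inside the test method:
// Test.setMock(HttpCalloutMock.class, new S3ServiceMock());
// Test.startTest();
// S3Controller.UploadDocToS3Server(recordId);
// Test.stopTest();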
Thank You,
Dhanik
I have a requirement where I want to store the uploaded file to S3 directly, without storing it in documents. Can you provide Lightning code showing how to send files to the server?
Hello Krishna,
You can do it. When you upload the document, get the Blob of that attachment and call the Apex code with that Blob.
The UploadDocuments method, which uses a record ID as a parameter, can take the file content as well. You can then skip all the code that fetches the attachment content, i.e. line# 81 to 95 of that Apex method.
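A sketch of such an overload, assuming the rest of AWSService stays the same (the names here are illustrative, not the post's exact code; note that @AuraEnabled methods usually receive file content from Lightning as a base64 string rather than a Blob):

// Overload that accepts the file content directly instead of querying it.
@AuraEnabled
public static String UploadDocuments(String base64Data, String fileName,
                                     String contentType) {
    // Lightning passes file content base64-encoded; decode it to a Blob.
    Blob fileBody = EncodingUtil.base64Decode(base64Data);
    // No ContentDocumentLink/ContentVersion queries needed here.
    return AWSService.UploadFile(fileBody, fileName, contentType); // assumed helper
}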
Thank You,
Dhanik
Hello Dhanik ;
I need to add x-amz-meta- metadata to the S3 file through the API. How do I add it through a header?
Thank you
Amit
Hello Amit,
Check this stackexchange post for your problem.
Thank You,
Dhanik
Thank you, it's working.
Please help me with a test class for the above REST API Apex class. Any sample code?
Hi DHANIK LAL SAHNI,
Great post!
I am also trying to access Amazon Connect using a REST API Apex class.
Could you please share a new post about that? It should help others as well.
ref: https://www.any-api.com/amazonaws_com/connect/docs/_users_InstanceId_UserId_/DescribeUser
Regards,
LinThaw
Sure, LinThaw. I will try to publish that.
Regards
Dhanik
Hello Dhanik
Can I upload a large file using this API? The file size is greater than 5 MB.
Hey Harshit,
You can upload a max of 6 MB, as that is the Apex heap size limit. You can do a max of 12 MB the asynchronous way.
Thank You,
Dhanik
Hello Dhanik
this method is not working when I upload a large file, with a file size greater than 5 MB.
Hello Harshit,
We can easily upload files of 5 MB. You just have to avoid the timeout error while uploading.
Thank You,
Dhanik
Hello Dhanik
this method is not working when I upload a large file.
Hello Harshit,
There is a limit of 5 GB for one file in one transaction. If your file is bigger, use multipart upload; the limits are listed at https://docs.aws.amazon.com/AmazonS3/latest/dev/qfacts.html
Thank You,
Dhanik
Hello
It's very helpful, but this is not working for large files. Any suggestions, please?
Thank you, very useful solution.
Why did you decide to store the URL of the uploaded file in a custom object? Maybe there is a way to store it in ContentVersion.ContentUrl?
Yes, it can be done there also. Here are two scenarios for going with a custom object
Dhanik,
I have tried several apps on the AppExchange marketplace to move files from Salesforce to AWS GovCloud and have not been successful with any of the vendors. As an interim solution, I would like to be able to move the PDFs created on specific records to a specified folder in AWS S3 GovCloud. I may have 1,000 Accounts that each require a different PDF, all of which need to be moved to S3 without overwriting any PDFs or files that already exist in that folder. Do you know if this approach would work for AWS GovCloud?
Hello Christian, I have not tried with AWS GovCloud; the concept is a little different there. But if you require it, we can sit together and try to upload it.
Thank You,
Dhanik
Hello Dhanik.
I am getting below error.
Line: 207, Column: 1
System.CalloutException: Exceeded max size limit of 6000000 with request size 6004736
Hello Mohit,
This is the default behavior for the maximum upload size. Check out this post for more detail: https://salesforce.stackexchange.com/questions/197583/system-calloutexception-exceeded-max-size-limit-of-6000000
Thank You,
Dhanik
I am getting [Status=Bad Request, StatusCode=400]
for the request System.HttpRequest[Endpoint=https://mybucketxyz.s3-us-east-2.amazonaws.com/demoImage.png, Method=PUT]
Is there any other configuration I have to do on the bucket other than the following?
Bucket policy
{
  "Version": "2012-10-17",
  "Id": "Policy1625808393729",
  "Statement": [
    {
      "Sid": "Stmt1625808382243",
      "Effect": "Allow",
      "Principal": "*",
      "Action": [
        "s3:GetObject",
        "s3:PutObject"
      ],
      "Resource": "arn:aws:s3:::mybucketxyz/*"
    }
  ]
}
Cross-origin resource sharing (CORS)
[
  {
    "AllowedHeaders": [
      "*"
    ],
    "AllowedMethods": [
      "PUT",
      "POST",
      "DELETE",
      "GET",
      "HEAD"
    ],
    "AllowedOrigins": [
      "*"
    ],
    "ExposeHeaders": []
  }
]
Please suggest.
Hello Seemu,
Please check the request parameters, as this error comes when some request parameters are not valid, like the bucket name, token, or key.
If you still face the issue, please ping me on LinkedIn. We will join a call to resolve your issue.
Thank You,
Dhanik
Hi,
Trying to use the above upload document code and getting a Bad Request error even though the endpoint is added in Remote Site Settings:
System.HttpRequest[Endpoint=https://salesforces.s3.ap-south-1.amazonaws.com/tablueau.txt, Method=PUT]
23:42:40.127 (1127788525)|CALLOUT_RESPONSE|[132]|System.HttpResponse[Status=Bad Request, StatusCode=400]
Can you please help me with this?
Hello Vinod,
Have you given the proper bucket name and S3 configs? Try the same request in Postman; if it works there, it will work here also.
If you still face the issue, ping me on LinkedIn to connect and resolve it.
Thank You,
Dhanik
Hi Dhanik,
Great post! I’ve been trying to get this going, but I’m getting the following message:
InvalidRequest
The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-SHA256. I understand that there is a way to do that with an SDK, but how can we achieve it here?
Thanks,
N
Hello N,
Please see our other blog, Use Named Credential to Upload File in S3. It has updated code with AWS Signature Version 4.
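With a named credential handling the AWS Signature Version 4 signing, the callout itself shrinks considerably; a minimal sketch, assuming a named credential called AWS_S3 configured for the bucket's host:

HttpRequest req = new HttpRequest();
req.setMethod('PUT');
// 'AWS_S3' is an assumed named credential set up with the AWS Signature
// Version 4 protocol; Salesforce signs the request automatically.
req.setEndpoint('callout:AWS_S3/' + fileName);
req.setHeader('Content-Type', contentType);
req.setBodyAsBlob(fileBody);
HttpResponse res = new Http().send(req);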
Thank You,
Dhanik
Hi Dhanik,
I am not able to move files larger than 5 MB to S3.
Is there any limit on the Salesforce side?
Thanks
V
Hello Vaibhav,
We use Apex for uploading files, so there is a 6 MB heap size limit; you can upload files lighter than 6 MB. If you use a batch (asynchronous) process, then you can upload 12 MB files.
Thank You,
Dhanik
How can we upload big files from Salesforce to Amazon S3? Is it possible?
Hello Gaurav,
You can check https://stackoverflow.com/questions/59059268/how-to-upload-larger-file-greater-than-12-mb-to-aws-s3-bucket-in-using-salesfo for your issue.
Thank You,
Dhanik