
Uploading Files to S3 Server using Apex

by Dhanik Lal Sahni

How to Upload Files to S3 Server using Apex

Uploading files to and reading files from an AWS S3 server is a frequently used requirement in Salesforce projects. This post explains in detail how to write the files attached to the current record to an S3 server using Apex.

Important information about writing files to an S3 server:

  1. We require Write permission on the bucket.
  2. If a file with the same name already exists at the destination, the new upload overwrites it.
  3. Use the Content-MD5 header so S3 can verify the integrity of the uploaded body (see the sketch after this list).
  4. We can use AWS's built-in encryption; custom encryption logic also works while writing files to the S3 server.
  5. We should set access-level permissions on files using request headers. By default, files are saved as private.
  6. We can version files. If required, enable versioning on the S3 bucket.
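
As a minimal sketch of point 3, assuming only the standard Apex Crypto and EncodingUtil classes (the variable names are illustrative): S3 expects the Content-MD5 header to carry the base64-encoded MD5 digest of the raw request body.

    // Compute the Content-MD5 value for the PUT body so S3 can verify it.
    Blob body = Blob.valueOf('file content');  // in practice, ContentVersion.VersionData
    String contentMD5 = EncodingUtil.base64Encode(Crypto.generateDigest('MD5', body));
    // Attach it to the PUT request:
    // req.setHeader('Content-MD5', contentMD5);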

Required request parameters

    1. Content-Type: Required; it describes the content type, such as PDF, JPEG, or JPG. See the ContentType method in the code below.
    2. Content-Length: Specifies the size of the body in bytes. Ideally it is determined automatically, but if automatic size determination does not work, set it explicitly.
    3. Host: Contains your bucket name and region on the S3 server. The value pattern is: {bucketName}.s3.{regionName}.amazonaws.com
    4. Content-Encoding: Records which content encoding was used to store the file, so the same encoding can be applied when reading it back.
    5. ACL: The access-control list that grants access to the object after it is uploaded to the S3 server. Examples: public-read, public-read-write.
    6. Endpoint: The file URL in the S3 bucket.
    7. Authorization: The authorization detail required to write the file to the S3 server. See the CreateAuthHeader method in the code below, and the request sketch after this list.
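
To make the list concrete, here is a minimal sketch of the PUT callout in Apex. This is not the author's full AWSService: bucketName, region, and fileName are illustrative placeholders, the ACL header mirrors the usage discussed in the comments below, and the Authorization value is produced by the CreateAuthHeader method in the complete code.

    // Minimal sketch of the S3 PUT request described above.
    String bucketName = 'my-bucket';            // placeholder
    String region = 'us-east-1';                // placeholder
    String fileName = 'example.txt';            // placeholder object key
    Blob body = Blob.valueOf('file content');   // in practice, ContentVersion.VersionData
    String host = bucketName + '.s3.' + region + '.amazonaws.com';

    HttpRequest req = new HttpRequest();
    req.setMethod('PUT');
    req.setEndpoint('https://' + host + '/' + fileName);   // Endpoint: file URL in the bucket
    req.setHeader('Host', host);
    req.setHeader('Content-Type', 'text/plain');           // match the file being sent
    req.setHeader('Content-Length', String.valueOf(body.size()));
    req.setHeader('ACL', 'public-read');                   // access level after upload
    // The Authorization value is built by the CreateAuthHeader method
    // in the complete code below:
    // req.setHeader('Authorization', authHeaderValue);
    req.setBodyAsBlob(body);
    HttpResponse res = new Http().send(req);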

Reading files in Apex:

To upload files to the S3 server, we first have to read them from the ContentVersion object. ContentVersion holds the file content, and ContentDocumentLink holds the file-sharing detail. To get a file's content, we first get the content document IDs from ContentDocumentLink, and then we read the content from the ContentVersion object.

    List<ContentDocumentLink> links = [SELECT ContentDocumentId, LinkedEntityId
                                       FROM ContentDocumentLink
                                       WHERE LinkedEntityId = :recordId];
    Set<Id> ids = new Set<Id>();
    for (ContentDocumentLink link : links) {
        ids.add(link.ContentDocumentId);
    }
    // Note: binding a Set in SOQL requires IN, not =
    List<ContentVersion> versions = [SELECT VersionData, Title, ContentDocumentId, FileExtension
                                     FROM ContentVersion
                                     WHERE ContentDocumentId IN :ids AND IsLatest = true];

    for (ContentVersion attach : versions) {
        // ...
    }

Complete Code:

An exception class for custom exceptions in your code:

BaseException class code
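
The embedded class code is not reproduced here; a minimal sketch, assuming the class simply extends the standard Apex Exception type:

    // Minimal sketch; the original embed may add constructors or fields.
    public class BaseException extends Exception {}

The service code can then raise it with throw new BaseException('message') when a callout fails.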

Now our code is ready. Let us use the AWSService class above from a controller, passing the S3 credentials.

S3Controller Class Code
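
That embed is likewise not reproduced here. Below is a hedged sketch of the controller's likely shape: the uploadDocToS3Server method name and its record-ID parameter are inferred from the comment thread below, and the credential parameters are illustrative assumptions, not confirmed signatures.

    // Hedged sketch only; the real S3Controller embed holds the author's
    // exact method signatures, credential handling, and error handling.
    public with sharing class S3Controller {
        @AuraEnabled
        public static void uploadDocToS3Server(Id recordId, String bucketName,
                                               String awsKey, String awsSecret) {
            // Read the record's files (ContentDocumentLink -> ContentVersion,
            // as shown above) and PUT each one to S3 via the AWSService class.
        }
    }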


This S3Controller class can be used by any caller, such as a Lightning component, a Lightning web component, or another Apex class.
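
For example, a caller could invoke it from anonymous Apex like this (the signature follows the hypothetical sketch above, and the argument values are dummies):

    // Hypothetical invocation; record ID, bucket, and credentials are placeholders.
    S3Controller.uploadDocToS3Server('001xx000003DGbQAAW', 'my-bucket', 'AKIAEXAMPLE', 'secretKey');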

Reference:

AWS Documentation: https://docs.aws.amazon.com/AmazonS3/latest/API/API_PutObject.html


55 comments

Dhanik Lal Sahni January 19, 2020 - 10:56 pm

Thank You for appreciating.

Chris Dong January 31, 2020 - 11:19 pm

Hi Dhanik, is this implementation something that can be called through a ContentVersion trigger and work through Salesforce’s inherent Drag&Drop functionality for files?

Dhanik Lal Sahni February 1, 2020 - 12:14 am

Hello Chris,
We can use this in a ContentVersion trigger as well. A scenario could be: on upload of any file on a Case, we upload that file to the S3 server and save the S3 URL on the Case or a related object.

Thank You,
Dhanik

Chris Dong February 1, 2020 - 12:34 am

Hi Dhanik, thank you for the clarification. And this functionality stores the S3 bucket route within the FileStore custom object, which means the files are visible through the FileStore related list, correct? You wouldn’t be able to access them via Salesforce’s native Files list?

Dhanik Lal Sahni February 1, 2020 - 1:22 am

Yes, the FileStore object stores the file URL of the S3 bucket. We cannot directly view files from the FileStore object, as that requires authentication. So we need to implement a Lightning button where we pass the authentication detail; after successful authentication the file is shown. This functionality is covered in another blog: http://salesforcecodex.com/2020/01/download-files-from-s3-server-using-apex/

Chris Dong February 5, 2020 - 4:32 am

Hi Dhanik, is the RecordID being passed into the UploadDocToS3Server the ContentVersion ID?

Dhanik Lal Sahni February 5, 2020 - 10:02 am

It is the record ID of any entity: basically the Account or Case ID for which the attachment was uploaded.

Gordon April 15, 2020 - 12:20 am

Hi Dhanik, thank you for posting your code. It is very nicely written and well-organized.

I wondered if you might have a thought about the following. I am able to upload a file to an S3 bucket using your code. The bucket itself is public. When I access the uploaded file through the Amazon UI, I get an access denied. If I run a similar program written in Java, the file is accessible. The line of code

req.setHeader('ACL', 'public-read');

looks entirely correct based on all of my googling. I would suspect that the issue was a configuration of the S3 bucket, except that the line of code in the Java program

s3client.putObject(new PutObjectRequest(s3Bucket, fileKey,
new File(filePath))
.withCannedAcl(CannedAccessControlList.PublicRead));

attempts to do the same thing and works fine. If you don’t have any thoughts as to what may be up, and barring getting the ACL correct on the upload, do you otherwise have any thoughts on how to make a separate call to update the ACL?

Meanwhile, I am going to explore your other posts! Thanks again.

Dhanik Lal Sahni April 16, 2020 - 1:03 am

Hello Gordon,

You can see the ACL information at https://docs.aws.amazon.com/AmazonS3/latest/dev/acl-overview.html#canned-acl. As per ‘public-read’, only the owner has full access and others have read access. If you are using the same user to view it in the UI, then it should open. You can check with the AWS team as well.

Thank You,
Dhanik

CHETAN BHATLA April 8, 2021 - 11:19 am

Hi GORDON,

Did you get any solution to this? I face the same issue: when I upload an image to S3, it is not accessible publicly (everyone), and for security reasons we don't want to change the config of the S3 bucket. Should we go for canonical request building in Apex?

sushma April 27, 2020 - 10:01 pm

Hi Dhanik
Is there any limitation on file size when uploading to S3? Can we upload large files using this as well?

Thanks

Dhanik Lal Sahni April 28, 2020 - 12:56 am

Hello Sushma,

There is a limitation of 5 GB with the PUT method. You can also refer to the documentation: https://docs.aws.amazon.com/AmazonS3/latest/dev/UploadingObjects.html

Thank You,
Dhanik

Prasanth May 28, 2020 - 4:03 pm

Hi Dhanik,
Thanks for the awesome code, but when I use the same code for our AWS integration I am getting a 505 HTTP Version Not Supported error.
Can you please help me find where it is wrong and how to resolve it?

Dhanik Lal Sahni May 31, 2020 - 12:51 am

Hello Prasanth,

This error is thrown when the remote URL is not correct. Please check the URL once again.

Thank You,
Dhanik

prabakar May 29, 2020 - 12:30 pm

Hi,

Trying to use the above upload document code and getting a Bad Request error:

System.HttpRequest[Endpoint=https://salesforces.s3.ap-south-1.amazonaws.com/tablueau.txt, Method=PUT]

23:42:40.127 (1127788525)|CALLOUT_RESPONSE|[132]|System.HttpResponse[Status=Bad Request, StatusCode=400]

What may be the reason? Please help me resolve it.

Thanks.

Dhanik Lal Sahni May 31, 2020 - 1:02 am

Hello Prabakar,

Please try adding the AWS URL in Remote Site Settings.

Thank You,
Dhanik

Shiv June 21, 2020 - 8:52 pm

Trying to use the above upload document code and getting the below error.
System.CalloutException: Unable to tunnel through proxy. Proxy returns “HTTP/1.1 503 Service Unavailable”,

System.HttpRequest[Endpoint=https://sfdcs3new.s3.us-east-1.amazonaws.com/test.png, Method=PUT]

System.CalloutException: Unable to tunnel through proxy. Proxy returns “HTTP/1.1 503 Service Unavailable”,

I have put the endpoint in Remote Site Settings also.

Thanks.

Dhanik Lal Sahni June 23, 2020 - 12:14 am

Hello Shiv,
It could be firewall or internet proxy issue. Please check that once.

Thank You,
Dhanik

Kiran July 23, 2020 - 10:29 pm

Hello DHANIK LAL SAHNI,

I am using AWS.S3.ListObjectsRequest and AWS.S3.ListObjects to get files from AWS in Salesforce. It is working fine, but I want to write a unit test class for this. How can I mock all these requests?

Krishna September 24, 2020 - 6:19 pm

I have a requirement where I want to store an uploaded file to S3 directly, without storing it in documents. Can you provide Lightning code for how to send files to the server?

Dhanik Lal Sahni September 29, 2020 - 8:47 am

Hello Krishna,

You can do it. When you upload a document, get the blob of that attachment and call the Apex code with that blob.

In the UploadDocuments method, which takes recordId as a parameter, you can pass the blob as well. You can then skip all the code that gets the attachment content, i.e., lines 81 to 95 of that Apex method.

Thank You,
Dhanik

Amit October 15, 2020 - 3:46 pm

Hello Dhanik ;
I need to add x-amz-meta- metadata to the S3 file through the API. How do I add that through a header?
Thank you
Amit

Dhanik Lal Sahni October 15, 2020 - 5:48 pm

Hello Amit,

Check this stackexchange post for your problem.

Thank You,
Dhanik

Amit October 19, 2020 - 9:35 am

Thank you, it’s working.
Please help me with a test class for the above REST API Apex class. Any sample code?

LinThaw November 5, 2020 - 11:29 am

Hi DHANIK LAL SAHNI,
Great post!
I am also trying to access Amazon Connect using a REST API Apex class.
Could you please share a new post about that? It would help others as well.
ref: https://www.any-api.com/amazonaws_com/connect/docs/_users_InstanceId_UserId_/DescribeUser

Regards,
LinThaw

Dhanik Lal Sahni November 11, 2020 - 9:35 pm

Sure LinThaw. I will try to publish that.

Regards
Dhanik

Harshit November 11, 2020 - 4:59 pm

Hello Dhanik
Can I upload a large file using this API? The file size is greater than 5 MB.

Dhanik Lal Sahni February 14, 2021 - 9:48 pm

Hey Harshit,

You can upload a max of 6 MB, as this is the Apex heap size limit. You can do a max of 12 MB the asynchronous way.

Thank You,
Dhanik

Harshit November 11, 2020 - 5:03 pm

Hello Dhanik
this method is not working when I upload a large file; the file size is greater than 5 MB.

Dhanik Lal Sahni November 17, 2020 - 9:46 am

Hello Harshit,

We can easily upload files of 5 MB. Just avoid the timeout error while uploading.

Thank You,
Dhanik

Harshit November 11, 2020 - 5:04 pm

Hello Dhanik
this method is not working when I upload a large file.

Dhanik Lal Sahni November 17, 2020 - 9:44 am

Hello Harshit,

There is a limitation of 5 GB for one file in one transaction. For bigger sizes, use multipart upload; the limits for that are here: https://docs.aws.amazon.com/AmazonS3/latest/dev/qfacts.html

Thank You,
Dhanik

Harshit November 18, 2020 - 4:34 pm

Hello
It’s very helpful, but this is not working for large files. Any suggestions, please?

Raphael March 14, 2021 - 6:48 pm

Thank you, very useful solution.
Why did you decide to store the URL of the uploaded file in a custom object? Maybe there is a way to store it in ContentVersion.ContentUrl?

Dhanik Lal Sahni March 18, 2021 - 9:21 pm

Yes, it can be done there also. Here are two scenarios for going with a custom object:

  • If we want to upload existing attachment data to the S3 server and remove the file after uploading.
  • If we don't want to store the file in Salesforce and will store it in S3 directly, we can use a custom object to store the S3 file URL.
Christian Patterson March 30, 2021 - 10:09 pm

Dhanik,

I have tried several apps on the AppExchange marketplace to move files from Salesforce to AWS GovCloud and have not been successful with any of the vendors. As an interim solution, I would like to be able to move the PDFs created on specific records to a specified folder in AWS S3 GovCloud. I may have 1,000 Accounts that each require a different PDF, which all need to be moved to S3 without overwriting any PDFs or files that exist in that folder. Do you know if this approach would work for AWS GovCloud?

Dhanik Lal Sahni April 2, 2021 - 8:37 pm

Hello Christian, I have not tried AWS GovCloud; the concept is a little bit different there. But if you require it, we can sit together and try to upload it.

Thank You,
Dhanik

Dhanik Lal Sahni April 8, 2021 - 1:48 am

Hey Christian, I have not tried the Govt cloud, but if you require, we can discuss this.

Thank You,
Dhanik

Mohit April 6, 2021 - 7:20 pm

Hello Dhanik,
I am getting the below error:
Line: 207, Column: 1
System.CalloutException: Exceeded max size limit of 6000000 with request size 6004736

Dhanik Lal Sahni April 8, 2021 - 1:46 am

Hello Mohit,
This is the default behavior for the maximum file upload size. Check out this post for more detail: https://salesforce.stackexchange.com/questions/197583/system-calloutexception-exceeded-max-size-limit-of-6000000

Thank You,
Dhanik

Seemu Saikia July 9, 2021 - 3:47 pm

I am getting [Status=Bad Request, StatusCode=400]
for request System.HttpRequest[Endpoint=https://mybucketxyz.s3-us-east-2.amazonaws.com/demoImage.png, Method=PUT]

Is there any other configuration I have to do on the bucket other than the following?

Bucket policy:

    {
      "Version": "2012-10-17",
      "Id": "Policy1625808393729",
      "Statement": [
        {
          "Sid": "Stmt1625808382243",
          "Effect": "Allow",
          "Principal": "*",
          "Action": [
            "s3:GetObject",
            "s3:PutObject"
          ],
          "Resource": "arn:aws:s3:::mybucketxyz/*"
        }
      ]
    }

Cross-origin resource sharing (CORS):

    [
      {
        "AllowedHeaders": ["*"],
        "AllowedMethods": ["PUT", "POST", "DELETE", "GET", "HEAD"],
        "AllowedOrigins": ["*"],
        "ExposeHeaders": []
      }
    ]

Please suggest.

Dhanik Lal Sahni July 13, 2021 - 1:15 pm

Hello Seemu,

Please check the request parameters, as this error comes when some request parameters are not valid, like the bucket name, token, or key.
If you still face the issue, please ping me on LinkedIn. We can get on a call to resolve your issue.

Thank You,
Dhanik

Vinod Kambire July 12, 2021 - 2:53 pm

Hi,

Trying to use the above upload document code and getting a Bad Request error even though this endpoint is added in Remote Site Settings:

System.HttpRequest[Endpoint=https://salesforces.s3.ap-south-1.amazonaws.com/tablueau.txt, Method=PUT]

23:42:40.127 (1127788525)|CALLOUT_RESPONSE|[132]|System.HttpResponse[Status=Bad Request, StatusCode=400]

Can you please help me with this?

Dhanik Lal Sahni July 13, 2021 - 8:41 am

Hello Vinod,
Have you given the proper bucket name and S3 configs? Try the same thing in Postman; if it works there, it will work here also.
If you still face the issue, ping me on LinkedIn to connect and resolve it.

Thank You,
Dhanik

N August 2, 2021 - 9:39 pm

Hi Dhanik,

Great post! I’ve been trying to get this going, but I’m getting the following message:

InvalidRequest: The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-SHA256.

I understand that there is a way to do that with an SDK, but how can we achieve this here?

Thanks,
N

Dhanik Lal Sahni August 3, 2021 - 10:23 am

Hello N,

Please use our other blog, Use Named Credential to Upload File in S3. It has updated code with AWS Signature Version 4.

Thank You,
Dhanik

V November 23, 2021 - 3:15 pm

Hi Dhanik,

I am not able to move files larger than 5 MB to S3.
Is there any limit from Salesforce?
Thanks,
V

Dhanik Lal Sahni November 25, 2021 - 1:02 pm

Hello Vaibhav,
We use Apex for uploading files, so there is a 6 MB heap size limit; you can upload files lighter than 6 MB. If you use a batch process, you can upload 12 MB files.

Thank You,
Dhanik

Gaurav August 3, 2023 - 2:32 pm

How can we upload big files from Salesforce to Amazon S3? Is it possible?

