Storage Connect for Frame.io
Written by Jared

Storage Connect is available for Frame.io Enterprise customers on V3. Storage Connect allows Frame.io's Enterprise customers to use their own cloud storage endpoint as the backing storage of Frame.io. Today, when a user uploads an asset to Frame.io, the asset flows through the application stack and is stored in Frame.io's Amazon S3 bucket. Similarly, playback and delivery of an asset is serviced from a Frame.io-managed Amazon S3 bucket.

With Storage Connect, an asset uploaded to Frame.io gets redirected to the customer's connected storage, rather than Frame.io's. This offering is available to both net-new and existing customers. To enable existing customers, Frame.io offers a one-time migration of existing customer data historically stored in Frame.io's managed Amazon S3 bucket to the customer-managed Amazon S3 bucket.

The information below provides net-new and existing customers with a step-by-step guide to configuring their S3 bucket for compatibility with Storage Connect.


Customers are to provide Frame.io with an EMPTY S3 bucket within the us-east-1 region. To ensure the bucket is properly configured to work securely with Frame.io, follow the steps below within the AWS Console.

Create AWS IAM OIDC Identity Provider

Visit the Identity and Access Management (AWS IAM) dashboard in the AWS Console. From here, you will need to add Frame.io as a new trusted Identity Provider. Follow the steps below to do so.

Access Management > Identity providers

Once created, navigate to and select the newly created provider and copy the Amazon Resource Name (ARN).

  • The ARN can be found within the Summary section of the selected provider; this information will be required during the next step, creating the IAM role.

  • The ARN value will be formatted similar to the following:

    • arn:aws:iam::1234567891234:oidc-provider/

For more information on setting up an AWS IAM OIDC identity provider, please reference AWS's official guides.
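If you want to sanity-check the copied ARN before moving on, its format can be validated programmatically. The sketch below uses Python; the example account ID and provider path are illustrative only, not values tied to your account:

```python
import re

def parse_oidc_provider_arn(arn: str) -> dict:
    """Split an IAM OIDC provider ARN into its AWS account ID and provider path."""
    match = re.match(r"^arn:aws:iam::(\d+):oidc-provider/(.+)$", arn)
    if match is None:
        raise ValueError(f"not an IAM OIDC provider ARN: {arn!r}")
    return {"account_id": match.group(1), "provider": match.group(2)}

# Example with a made-up account ID and provider path:
parsed = parse_oidc_provider_arn(
    "arn:aws:iam::1234567891234:oidc-provider/example-provider"
)
```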

Create IAM Role

Now that Frame.io has been created as a trusted Identity Provider, a new “Role” can be created to securely give Frame.io access to your bucket. Move through the three steps with the information provided below to complete the configuration successfully.

Access Management > Roles

Step 1 | Select Trust Entity

  • Select “Create Role”

    • Trusted Entity Type: Custom trusted policy

    • Custom trust policy:

"Version": "2012-10-17",
"Statement": [
"Effect": "Allow",
"Principal": {
"Action": "sts:AssumeRoleWithWebIdentity",
"Condition": {
"StringEquals": {

Copy and paste the above into the trust policy field — Please be mindful of any formatting changes or fat-fingering when doing so.

With the above JSON block in place, be sure to swap out the temporary variables accordingly with your specific customer values.

  • Replace IAM_OIDC_PROVIDER_ARN with the AWS IAM OIDC identity provider ARN copied from the previous steps, and replace FRAMEIO_ACCOUNT_ID with your Frame.io account ID provided by our support team.

Double-check your work, then choose Next.
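A stray character in the pasted JSON will cause role creation to fail, so it can help to substitute your values locally and confirm the result still parses before pasting. A minimal sketch in Python; the helper and sample values are illustrative, not part of the AWS console flow:

```python
import json

def fill_placeholders(policy_template: str, values: dict) -> dict:
    """Swap placeholder tokens for real values and confirm the result is valid JSON."""
    for placeholder, value in values.items():
        policy_template = policy_template.replace(placeholder, value)
    # json.loads raises an error on malformed JSON (e.g. missing quotes or commas).
    return json.loads(policy_template)

# Tiny illustrative fragment, not the full trust policy:
fragment = '{"Principal": {"Federated": "IAM_OIDC_PROVIDER_ARN"}}'
filled = fill_placeholders(
    fragment,
    {"IAM_OIDC_PROVIDER_ARN": "arn:aws:iam::1234567891234:oidc-provider/example-provider"},
)
```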

Step 2 | Add Permission

To give Frame.io permission to access a customer's S3 bucket or object key, a new policy will need to be created.

Do not select or search through the provided AWS policies; rather, choose “Create policy”. A new browser tab will be opened.

Select the JSON option and copy and paste the following into the field.

"Version": "2012-10-17",
"Statement": [
"Action": [
"Effect": "Allow",
"Resource": "arn:aws:s3:::BUCKET_NAME/*"

Again, be mindful of any formatting changes or fat-fingering when doing so.

Just as before, be sure to swap out the temporary variable accordingly with your customer-specific value.

  • Replace BUCKET_NAME with your bucket name:

Double-check your work, then choose Next.
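Before substituting BUCKET_NAME, it may be worth confirming the name follows S3's general-purpose bucket naming rules: 3 to 63 characters; lowercase letters, digits, hyphens, and dots; must begin and end with a letter or digit; and must not be formatted like an IP address. A quick illustrative check in Python:

```python
import re

def is_valid_bucket_name(name: str) -> bool:
    """Check general-purpose S3 bucket naming rules: 3-63 characters of
    lowercase letters, digits, hyphens, and dots; must start and end with
    a letter or digit; must not look like an IP address."""
    if not 3 <= len(name) <= 63:
        return False
    if not re.fullmatch(r"[a-z0-9][a-z0-9.\-]*[a-z0-9]", name):
        return False
    if re.fullmatch(r"(\d{1,3}\.){3}\d{1,3}", name):
        return False
    return True
```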

Provide a Name for your policy, plus an optional Description and Tags, then choose Create Policy. After you create the policy, close that tab and return to your original tab to finish creating the “Role”.

Now that the custom policy has been created, refresh the browser tab to find and select the newly created policy, then choose Next.

Step 3 | Name, Review, and Create

Provide a Name and an optional Description for the role. Review the configuration and choose Create Role.

Copy the IAM Role ARN to provide to your Support team; we will need that, and a few other pieces of information, to update your backend account configuration accordingly.


To ensure our existing customers' success with Storage Connect, Frame.io offers a migration service for existing customer data. This service will copy objects from Frame.io's managed storage to the customer-provided target bucket within AWS us-east-1. The migration service is what differentiates the setup and guidelines for existing customers from those for net-new customers.

Step 1 | Success Plan
Working with your account team, develop a plan and timeline for migration to a Storage Connect enabled account. This will include defining the account(s) to migrate, cleaning up and unarchiving projects as needed, and other general maintenance of the account(s).

Step 2 | Migration Service - Bucket Policy
With a defined migration plan in place, customers can grant Frame.io's migration service access to their S3 bucket.

Complete the steps defined for net-new customers above, then move forward with the addition of the S3 Bucket Policy listed below. 

To add this migration bucket policy, navigate to your target S3 bucket’s permissions. Once within the “Permissions” tab of your bucket, simply add the below JSON block to the “Bucket Policy” section and save the changes. 

"Version": "2012-10-17",
"Statement": [
"Sid": "FrameioMigrationAccess",
"Effect": "Allow",
"Action": "s3:PutObject",
"Resource": "arn:aws:s3:::BUCKET_NAME/*",
"Principal": {
"AWS": "arn:aws:iam::745689021772:role/frameio-storage-connect-migration-access"

As always, be sure to swap out the temporary variable, BUCKET_NAME, with your customer-specific value.
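If you want to double-check that the pasted policy grants the migration role nothing beyond write access, it can be parsed and inspected locally. The sketch below uses a placeholder bucket name:

```python
import json

# The migration bucket policy from this guide, with an illustrative bucket name.
bucket_policy = json.loads("""
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "FrameioMigrationAccess",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::745689021772:role/frameio-storage-connect-migration-access"
      },
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::my-bucket/*"
    }
  ]
}
""")

statement = bucket_policy["Statement"][0]
# The migration role should only be able to write objects, nothing else.
assert statement["Action"] == "s3:PutObject"
```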

Step 3 | Policy Clean-Up
Upon completion of the migration service, customers should remove the migration bucket policy from the bucket.

ACCOUNT CONFIGURATION & REQUIRED PARAMETERS

The last step in setting up your connected storage is to provide your success team with a few key parameters.

Coordinate and pass along the below information to your dedicated Customer Success Manager. From there, your account’s backend will be updated to establish the proper object routing to your S3 bucket or object key.

  • Region: us-east-1

  • Bucket Name

  • Object Prefix

  • IAM Role ARN
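The handoff to your Customer Success Manager can be captured as a small structured record. The field names and every value below are illustrative placeholders, not real account values:

```python
import json

# Illustrative values only; use your real bucket, prefix, and role ARN.
storage_connect_params = {
    "region": "us-east-1",
    "bucket_name": "my-storage-connect-bucket",
    "object_prefix": "frameio/",
    "iam_role_arn": "arn:aws:iam::1234567891234:role/example-frameio-role",
}

# Serialize for sharing with your success team.
payload = json.dumps(storage_connect_params, indent=2)
```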

Once complete, users can expect to upload content via any Frame.io client and have the original media written to your provided customer storage.

Remember, Frame.io will continue to store your generated proxies and thumbnails to ensure users have the best experience possible within the app. We take this measure as a Business Continuity precaution in case Frame.io were to ever have a lapse in connectivity to the customer's storage.


Q: Can I rename an object in the S3 bucket and still access it through Frame.io?
A: Not at this time. In the future, we may add a capability to “relink” an object to an asset in Frame.io.

Q: Can I use AWS S3 Lifecycle rules on my bucket to transition objects to cheaper storage types?
A: You may use your own AWS S3 Lifecycle rules to move objects to other storage types like IA or GIR; however, this may incur additional AWS S3 costs if the objects are still actively reviewed within Frame.io.
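As an example of what such a rule might look like, the fragment below uses the lifecycle configuration shape accepted by the S3 API, expressed as a Python dict as it would be passed to an SDK call. The 90-day window, empty prefix, and Standard-IA target are illustrative choices, not Frame.io guidance:

```python
# Illustrative lifecycle rule: transition objects to Standard-IA after 90 days.
# Days, prefix, and storage class are example choices, not Frame.io guidance.
lifecycle_configuration = {
    "Rules": [
        {
            "ID": "transition-originals-to-ia",
            "Status": "Enabled",
            "Filter": {"Prefix": ""},  # empty prefix applies to the whole bucket
            "Transitions": [
                {"Days": 90, "StorageClass": "STANDARD_IA"}
            ],
        }
    ]
}
```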

Q: Can I move objects to Glacier?
A: We would advise not to move objects to Glacier. Frame.io will not detect this change and, as a result, users will have a degraded experience when attempting to access the original media for download.

Q: What S3 storage classes are supported?
A: All classes with instant retrieval (i.e., Standard, Intelligent-Tiering, Standard-IA, Glacier Instant Retrieval). Not supported: Glacier Flexible Retrieval, Glacier Deep Archive.

Q: Can I still “archive” projects within the web-app UI?

A: With Storage Connect, the action of archiving a project will reorganize that project under the “Archived Projects” twirl-down menu of its respective team.

However, there is no associated call to S3 with this action. It is up to the customer to implement their own lifecycle rules within AWS. The UI interaction by the end-user will have no effect on configured policies or triggers within AWS.

End-users can, of course, instantly “unarchive” a project as needed.

Q: Can I audit access to our S3 bucket?
A: Yes, you can use AWS CloudTrail to audit when Frame.io assumes the IAM role, and AWS S3 Access Logs to audit all requests from Frame.io to the Amazon S3 bucket.

Q: What is Frame.io's log retention policy?
A: Logs generated from Frame.io's AWS environment will fall in line with our existing log retention policies.
