Jon Toor, CMO, Cloudian
How is Amazon S3 Priced?
Amazon Simple Storage Service (Amazon S3) is an elastically scalable object storage service. The service provides a free tier to get you started, with limited capacity for 12 months.
Beyond the limits of the free tier, AWS S3 storage pricing has five components:
- Storage tiers – AWS S3 prices per GB-month of storage, and pricing varies according to the storage tier you select. The AWS S3 Standard tier provides instant access with low data retrieval costs, but a relatively high cost per GB. There are lower cost tiers that offer lower cost per GB, but higher cost of data retrieval or delayed data retrieval.
- Cost per requests – AWS S3 has a cost per 1,000 requests, depending on the request type.
- Data transfer – there are extra charges for data transfer from AWS S3 to the Internet, or to certain AWS regions.
- Management and analytics – AWS S3 charges extra for automating the data lifecycle and moving data automatically to the most optimal storage tiers.
- Replication – if you set up replication in AWS S3, data transfer and operations performed during replication are charged like regular AWS S3 operations.
This is part of an extensive series of articles about S3 Storage.
In this article, you will learn:
- Amazon S3 Free Tier
- The 5 Components of Amazon S3 Pricing: In Detail
- Cost Optimization for Amazon S3 Storage
- Cut S3 Storage Costs with Cloudian
Amazon S3 Free Tier
The AWS Free Tier, offered to new AWS customers, gives you 5 GB of storage in the AWS S3 Standard storage tier. This includes up to 2,000 PUT, POST, COPY, or LIST requests, 20,000 GET requests, and 15 GB of outgoing data transfer per month for a year.
Free tier usage is calculated every month for all AWS Regions and the eligible cost savings are deducted from your bill. Unused free capacity does not roll over into the next month.
The 5 Components of Amazon S3 Pricing: In Detail
AWS S3 uses a pay-per-use pricing model, with no upfront charges and no minimum fee. The more storage you use, the lower your cost per GB will typically be.
There are several Amazon S3 cost factors to consider when storing and managing your data: storage charges, request and data retrieval charges, data transfer and transfer acceleration charges, management and analytics charges, and replication charges.
All prices shown below are for the US East region. Prices are subject to change; for up-to-date pricing, see the official AWS pricing page.
Storage Tiers
Storage in AWS S3 buckets is priced per GB-month. Rates vary depending on how much data you store, how long you store it, and the storage tier you select.
AWS S3 Standard
S3 Standard is the default tier, designed for frequently accessed data. Its low latency and high throughput make it a common backbone for many applications.
Data Size | Price per GB |
First 50 TB / Month | $0.023 |
Next 450 TB / Month | $0.022 |
Over 500 TB / Month | $0.021 |
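To show how the graduated rates combine, here is a minimal sketch in plain Python that estimates a monthly S3 Standard storage charge using the US East prices from the table above (rates change, so treat the numbers as illustrative only; the 1 TB = 1,024 GB assumption mirrors how S3 reports usage).

```python
# Rough estimate of a monthly S3 Standard storage bill (US East rates from the table above).
# Prices are subject to change; always confirm against the official AWS pricing page.

TIERS = [
    (50 * 1024, 0.023),    # first 50 TB per month, expressed in GB
    (450 * 1024, 0.022),   # next 450 TB per month
    (float("inf"), 0.021), # everything over 500 TB per month
]

def monthly_standard_storage_cost(total_gb: float) -> float:
    """Apply each pricing tier in order until the stored capacity is exhausted."""
    cost, remaining = 0.0, total_gb
    for tier_size_gb, price_per_gb in TIERS:
        billed = min(remaining, tier_size_gb)
        cost += billed * price_per_gb
        remaining -= billed
        if remaining <= 0:
            break
    return cost

# Example: 600 TB stored for a full month -> roughly $13,465.60 at the rates above.
print(f"${monthly_standard_storage_cost(600 * 1024):,.2f}")
```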
AWS S3 Standard – Infrequent Access (IA)
AWS S3 Standard-IA is suitable for data that is accessed infrequently, but still requires fast access when needed. It is commonly used for long-term storage, backup, and business continuity. It has lower storage costs compared to AWS S3 Standard, but there are fees for data retrieval.
Fixed Cost Per GB | $0.0125 |
AWS S3 One Zone – Infrequent Access
This Amazon S3 class stores your data in a single Availability Zone (AZ). Unlike other AWS S3 storage classes, it cannot withstand the physical loss of that AZ in a catastrophe such as an earthquake or flood. However, if you don't need the extra protection of storing data redundantly across multiple AZs, it costs 20% less than S3 Standard-IA.
Fixed Cost Per GB | $0.01 |
AWS S3 Glacier
Glacier is suited to storing data that is rarely accessed, but which must be archived for compliance or regulatory requirements. Glacier has a low cost per GB, but it takes more time to retrieve the data.
Tier | Time to Access the Data | Cost Per GB |
S3 Glacier | Between a few minutes and several hours (depending on access method) | $0.004 |
S3 Glacier Deep Archive | 12-48 hours | $0.00099 |
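Retrieving archived data is a two-step process: you first request a restore, then download the temporary copy once it is available. As a hedged sketch, assuming an object already stored in a Glacier storage class (the bucket name and key below are placeholders), a restore can be initiated with boto3 roughly like this:

```python
import boto3

s3 = boto3.client("s3")

# Ask S3 to stage a temporary copy of an archived object for 7 days.
# Tier can be "Expedited", "Standard", or "Bulk"; faster tiers cost more per GB.
s3.restore_object(
    Bucket="example-archive-bucket",   # placeholder bucket name
    Key="reports/2020-audit.zip",      # placeholder object key
    RestoreRequest={
        "Days": 7,
        "GlacierJobParameters": {"Tier": "Standard"},
    },
)

# head_object reports restore progress; ongoing-request="false" means the copy is ready.
print(s3.head_object(Bucket="example-archive-bucket",
                     Key="reports/2020-audit.zip").get("Restore"))
```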
AWS S3 Intelligent Tiering
Intelligent Tiering leverages monitoring and automation capabilities to optimize costs by moving data between frequent-access (FA) and infrequent-access (IA) tiers. Intelligent Tiering eliminates the need to pay higher FA fees for data that is rarely accessed.
There is a monthly monitoring and auto-tiering fee, but no fees for data retrieval, so you don’t have to worry about unexpectedly increasing costs as your data access patterns change.
Monitoring and Automation | $0.0025 per 1,000 objects per month |
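If you want objects to land in Intelligent-Tiering (or any other class) from the start, you set the storage class at upload time. Below is a minimal boto3 sketch; the bucket name, key, and payload are placeholders.

```python
import boto3

s3 = boto3.client("s3")

# Upload an object straight into the Intelligent-Tiering storage class.
# Other accepted values include "STANDARD_IA", "ONEZONE_IA", "GLACIER", and "DEEP_ARCHIVE".
s3.put_object(
    Bucket="example-bucket",          # placeholder bucket name
    Key="logs/app-2024-01.json",      # placeholder object key
    Body=b'{"event": "example"}',     # placeholder payload
    StorageClass="INTELLIGENT_TIERING",
)
```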
Requests and Data Retrievals
The free tier allows a certain number of free PUT and GET requests. If you exceed this number, you'll pay for additional PUT/GET requests, as well as for other request types not included in the free tier. Data retrieval cost depends on storage tier—it is higher for the infrequent access storage classes, compensating for their lower data storage price.
Tier | Cost per 1,000 PUT, COPY, or POST Requests |
AWS S3 Standard, Glacier, Glacier Deep Archive | $0.005 |
AWS S3 Infrequent Access, Infrequent Access One Zone | $0.01 |
Data Transfer
Standard data transfers from the Internet into AWS S3 buckets are free, but data transfers out of AWS S3 incur costs. Amazon uses a tiered data transfer pricing structure: the more data you transfer out of the S3 service each month, the lower the per-GB rate.
Volume of Data Transfer to the Internet | Cost per GB |
Up to 1 GB / Month | Free |
Next 9.999 TB / Month | $0.09 |
Next 40 TB / month | $0.085 |
Next 100 TB / month | $0.07 |
Over 150 TB / month | $0.05 |
If you need faster data transfer, you can pay extra for S3 Transfer Acceleration. Accelerated transfers between an S3 bucket and the Internet typically cost an additional $0.04 per GB in either direction ($0.08 per GB for transfers accelerated through edge locations outside the US, Europe, and Japan).
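Transfer Acceleration is enabled per bucket. A minimal boto3 sketch follows (the bucket name and file names are placeholders); once enabled, transfers should target the bucket's accelerate endpoint to benefit from it.

```python
import boto3
from botocore.config import Config

s3 = boto3.client("s3")

# Turn on Transfer Acceleration for a bucket (placeholder name).
s3.put_bucket_accelerate_configuration(
    Bucket="example-bucket",
    AccelerateConfiguration={"Status": "Enabled"},
)

# Use the accelerate endpoint for subsequent transfers.
s3_accel = boto3.client("s3", config=Config(s3={"use_accelerate_endpoint": True}))
s3_accel.upload_file("large-dataset.tar.gz", "example-bucket", "uploads/large-dataset.tar.gz")
```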
Management and Analytics
There are charges for storage management features and analytics, including Amazon S3 Inventory, S3 Storage Class Analytics, S3 Storage Lens, and S3 Object Tagging, if you have these services enabled on the buckets in your account.
Feature | Cost |
S3 Inventory | $0.0025 per million objects listed |
S3 Analytics Storage Class Analysis | $0.10 per million objects monitored per month |
S3 Storage Lens (advanced metrics; basic metrics are free) | $0.20 per million objects monitored per month |
Batch Operations | $0.20 per million object operations, plus $1.00 per job |
S3 Object Tagging | $0.01 per 10,000 tags per month |
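These features are enabled per bucket. As an illustration, a daily S3 Inventory report could be configured with boto3 roughly as below; the bucket names, configuration ID, and prefix are placeholders, and the destination bucket also needs a bucket policy that allows S3 to deliver reports to it.

```python
import boto3

s3 = boto3.client("s3")

# Configure a daily CSV inventory report for a bucket (all names are placeholders).
s3.put_bucket_inventory_configuration(
    Bucket="example-bucket",
    Id="daily-inventory",
    InventoryConfiguration={
        "Id": "daily-inventory",
        "IsEnabled": True,
        "IncludedObjectVersions": "Current",
        "Schedule": {"Frequency": "Daily"},
        "OptionalFields": ["Size", "StorageClass", "LastModifiedDate"],
        "Destination": {
            "S3BucketDestination": {
                "Bucket": "arn:aws:s3:::example-inventory-reports",  # placeholder ARN
                "Format": "CSV",
                "Prefix": "inventory/example-bucket",
            }
        },
    },
)
```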
Replication
AWS S3 replication has several charges that are the same as regular S3 usage:
- Storage fees for the S3 tier selected as replication target
- Charges for primary copy storage
- PUT request charges for replicating objects to the destination bucket
- Infrequent-Access storage retrieval charges (if the IA tier is used)
For cross region replication (CRR), you also have to pay for inter-region data transfer from S3 to each destination region. If you use S3 Replication Time Control (RTC), there is an additional charge for RTC data transfers. You also pay for S3 replication metrics.
S3 RTC data transfer | $0.015 per GB |
S3 Replication Metrics (same charge as CloudWatch custom metrics) | $0.30 per metric per month for the first 10,000 metrics, falling to $0.02 per metric per month above 1 million metrics |
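Replication, including RTC and replication metrics, is configured on the source bucket. The sketch below shows one plausible boto3 configuration; the bucket names, IAM role ARN, and rule ID are placeholders, and versioning must already be enabled on both buckets.

```python
import boto3

s3 = boto3.client("s3")

# Replicate new objects to a destination bucket, with Replication Time Control (RTC)
# and replication metrics enabled. All names and ARNs below are placeholders.
s3.put_bucket_replication(
    Bucket="example-source-bucket",
    ReplicationConfiguration={
        "Role": "arn:aws:iam::123456789012:role/example-replication-role",
        "Rules": [
            {
                "ID": "replicate-all",
                "Priority": 1,
                "Status": "Enabled",
                "Filter": {},  # empty filter = replicate the whole bucket
                "DeleteMarkerReplication": {"Status": "Disabled"},
                "Destination": {
                    "Bucket": "arn:aws:s3:::example-destination-bucket",
                    "StorageClass": "STANDARD_IA",
                    "ReplicationTime": {"Status": "Enabled", "Time": {"Minutes": 15}},
                    "Metrics": {"Status": "Enabled", "EventThreshold": {"Minutes": 15}},
                },
            }
        ],
    },
)
```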
Cost Optimization for Amazon S3 Storage
Here are a few ways you can optimize your use of Amazon S3 and reduce costs.
Defining Application Requirements
When moving your workloads to AWS, you need to understand their performance and data access requirements. For example, the requirements of a backup and archive application will be completely different from those of a streaming service or an e-commerce application.
In order to manage your storage costs, you need to know when and how your data is retrieved, accessed, archived or deleted by a user.
Here are some examples of the different requirements for various applications:
- Static websites—requirements include random data access, high availability and durability
- Data analytics applications—requirements include frequent data access, large capacity and lower availability
- Financial reporting—requirements include infrequent data access, data retention for several years (depending on regulations), and high durability
Organizing Your Data
Amazon S3 offers tools that let you organize data at the object or bucket level. This is important for optimizing costs. You can use object tags, name prefixes, and S3 buckets to organize your data:
- Up to 10 tags can be associated with an object
- Tags can be added to existing objects or to new objects when they are uploaded—for example, AWS Identity and Access Management (IAM) permissions can grant read-only access to objects with specific tags
- Object tags allow you to manage the object lifecycle—for example, you can specify name prefixes and tag-based filters in lifecycle rules
Amazon S3 Storage Class Analysis allows you to configure filters that categorize objects for analysis using object tags and key name prefixes. Amazon CloudWatch metrics can be customized to display information using specific tag filters.
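For instance, tags can be attached to an existing object with boto3 roughly as follows (the bucket name, key, and tag values are placeholders); lifecycle rules, IAM policies, and cost reports can then filter on those tags.

```python
import boto3

s3 = boto3.client("s3")

# Attach cost-allocation tags to an existing object (names and values are placeholders).
s3.put_object_tagging(
    Bucket="example-bucket",
    Key="reports/q1-financials.csv",
    Tagging={
        "TagSet": [
            {"Key": "project", "Value": "finance"},
            {"Key": "retention", "Value": "7-years"},
        ]
    },
)

# Read the tags back to confirm they were applied.
tags = s3.get_object_tagging(Bucket="example-bucket", Key="reports/q1-financials.csv")
print(tags["TagSet"])
```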
Selecting Optimal Amazon S3 Storage Class
Amazon S3 provides several storage classes suitable for various use cases, with each class supporting a different level of data access with corresponding pricing. Choosing the right storage class for each use case is essential to planning your S3 cost optimization strategy.
There are three key elements to selecting the best storage class for your data in S3: monitoring, analysis, and optimization.
Monitoring
It is important to monitor your S3 usage so you can reduce storage costs and adjust for growth. You can use AWS Budgets to set a budget and get alerts when your usage or costs exceed, or are expected to exceed, the specified budget. Amazon CloudWatch metrics allow you to monitor storage and requests in real time and alert you when you reach a usage threshold.
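As one example of the monitoring side, the daily BucketSizeBytes metric that S3 publishes to CloudWatch can be read with boto3 roughly as follows; the bucket name is a placeholder, and the metric is reported once per day per storage type.

```python
import boto3
from datetime import datetime, timedelta, timezone

cloudwatch = boto3.client("cloudwatch")

# Fetch the daily bucket-size metric S3 publishes for the Standard storage class.
response = cloudwatch.get_metric_statistics(
    Namespace="AWS/S3",
    MetricName="BucketSizeBytes",
    Dimensions=[
        {"Name": "BucketName", "Value": "example-bucket"},   # placeholder bucket name
        {"Name": "StorageType", "Value": "StandardStorage"},
    ],
    StartTime=datetime.now(timezone.utc) - timedelta(days=7),
    EndTime=datetime.now(timezone.utc),
    Period=86400,               # one data point per day
    Statistics=["Average"],
)

for point in sorted(response["Datapoints"], key=lambda p: p["Timestamp"]):
    print(point["Timestamp"].date(), f'{point["Average"] / 1024**3:.1f} GB')
```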
Analysis
Amazon S3 Storage Class Analysis provides insights about data usage patterns, which can help you choose the most appropriate storage tier for different parts of your data.
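Storage Class Analysis is configured per bucket and can export daily results to another bucket for further analysis. A hedged boto3 sketch, with placeholder bucket names, configuration ID, and prefixes:

```python
import boto3

s3 = boto3.client("s3")

# Analyze access patterns for objects under a prefix and export daily CSV results.
# Bucket names, configuration ID, and prefixes below are placeholders.
s3.put_bucket_analytics_configuration(
    Bucket="example-bucket",
    Id="access-pattern-analysis",
    AnalyticsConfiguration={
        "Id": "access-pattern-analysis",
        "Filter": {"Prefix": "data/"},
        "StorageClassAnalysis": {
            "DataExport": {
                "OutputSchemaVersion": "V_1",
                "Destination": {
                    "S3BucketDestination": {
                        "Format": "CSV",
                        "Bucket": "arn:aws:s3:::example-analytics-reports",  # placeholder ARN
                        "Prefix": "storage-class-analysis/",
                    }
                },
            }
        },
    },
)
```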
Storage Tier Optimization
Configuring S3 Lifecycle Policies allows you to ensure that objects are stored in the most cost-efficient class according to where they are in their lifecycle. You set rules that define how Amazon S3 handles specific groups of objects—for example, you can specify that objects should be automatically deleted when no longer needed, or automatically transitioned into a cold storage class like Amazon S3 Glacier.
Another option is to allow S3 Intelligent-Tiering to optimize costs by automatically moving objects to the most appropriate storage tier, according to prior access patterns.
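A lifecycle rule that transitions objects to colder tiers and eventually expires them can be expressed with boto3 roughly like this; the bucket name, prefix, and day counts are placeholder choices, not recommendations.

```python
import boto3

s3 = boto3.client("s3")

# Move objects under "logs/" to Standard-IA after 30 days, Glacier after 90 days,
# and delete them after a year. Names and day counts are placeholders.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-bucket",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "tier-and-expire-logs",
                "Status": "Enabled",
                "Filter": {"Prefix": "logs/"},
                "Transitions": [
                    {"Days": 30, "StorageClass": "STANDARD_IA"},
                    {"Days": 90, "StorageClass": "GLACIER"},
                ],
                "Expiration": {"Days": 365},
            }
        ]
    },
)
```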
Cut S3 Storage Costs with Cloudian: S3-Compatible, Massively Scalable On-Premise Object Storage
Cloudian® HyperStore® is a massive-capacity object storage device that is fully compatible with Amazon S3. It can store up to 1.5 petabytes in a 4U chassis, allowing you to store up to 18 petabytes in a single data center rack. HyperStore comes with fully redundant power and cooling, and performance features including 1.92TB SSD drives for metadata and 10Gb Ethernet ports for fast data transfer.
HyperStore is an object storage solution you can plug in and start using with no complex deployment. It also offers advanced data protection features, supporting use cases like compliance, healthcare data storage, disaster recovery, ransomware protection and data lifecycle management.
Learn more about Cloudian® HyperStore®.