Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
We don’t allow questions seeking recommendations for books, tools, software libraries, and more. You can edit the question so it can be answered with facts and citations.
Closed 2 years ago.
Does Azure have a low-cost cloud storage service like Amazon Glacier?
No, Microsoft Azure does not offer a service equivalent to Amazon Glacier. Glacier integrates with Amazon S3, and the Azure equivalent of Amazon S3 is Microsoft Azure Blob Storage.
UPDATE - 06-November-2017
Recently Microsoft Azure introduced a new access tier called the Archive tier, which is similar to Amazon Glacier (and other cloud providers' long-term storage solutions for archival purposes). You can read more about it here: https://azure.microsoft.com/en-in/blog/announcing-the-public-preview-of-azure-archive-blob-storage-and-blob-level-tiering/. I also wrote a blog post about the same that you can read here: https://gauravmantri.com/2017/10/15/understanding-azure-storage-blob-access-tiers/.
As an update to the other answer, Azure does have a semi-equivalent service in Blob Storage: you can set your storage account to the "Cool" access tier. You pay less per GB of storage, but more for access operations against that data. In contrast to Amazon Glacier, there is no delay in accessing the data, but you pay the same price or more for access (depending on which Glacier retrieval tier you compare against).
On the flip side, you can use the "Hot" access tier and pay roughly 80% more per GB stored, but about half the price for access operations.
You can find the current pricing for Azure Blob Storage at https://azure.microsoft.com/en-us/pricing/details/storage/blobs/ and the current pricing for the various Glacier retrieval tiers at https://aws.amazon.com/glacier/pricing/.
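To make the tier trade-off concrete, here is a back-of-envelope comparison. The prices, the 1,000 GB dataset, and the 50,000 monthly reads are all made-up placeholders; substitute the real figures from the pricing pages linked above.

```python
# Hot vs Cool tier monthly cost -- ALL PRICES ARE PLACEHOLDERS,
# not current Azure rates; check the pricing pages linked above.
HOT_PER_GB = 0.018         # assumed $/GB-month, Hot tier
COOL_PER_GB = 0.010        # assumed $/GB-month, Cool tier
HOT_READ_PER_10K = 0.004   # assumed $ per 10,000 read operations, Hot
COOL_READ_PER_10K = 0.010  # assumed $ per 10,000 read operations, Cool

def monthly_cost(gb, reads, per_gb, read_per_10k):
    """Storage cost plus read-operation cost for one month."""
    return gb * per_gb + (reads / 10_000) * read_per_10k

gb, reads = 1_000, 50_000  # example workload: 1 TB stored, 50k reads/month
hot = monthly_cost(gb, reads, HOT_PER_GB, HOT_READ_PER_10K)
cool = monthly_cost(gb, reads, COOL_PER_GB, COOL_READ_PER_10K)
print(f"hot: ${hot:.2f}  cool: ${cool:.2f}")
```

For a storage-heavy, read-light workload like this, Cool wins; as the read count grows, its higher per-operation price eats into the savings.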
Closed 10 months ago.
Please consider a system (composed of many microservices and BFFs) in which:
Each platform (many microservices) and customer journey (BFF) has its own AWS account (as part of an organization, managed with Control Tower). We might have 20-30 AWS accounts.
AWS Services used are: Lambda, SNS, SQS, Step Functions, EventBridge, Cognito, S3, CloudFront, CloudWatch, DynamoDB, Aurora Serverless (V2) + RDS Proxy, API GW (REST)
External services: Lumigo for monitoring, GitLab CI/CD (SaaS), Salesforce, Stripe, Twilio, some banks (API-based)
Multi-region deployment (for DR only), so DynamoDB and Aurora Serverless (v2) are synced to another region, and the application is always deployed in both regions (queues and other temporary state/data are not synced).
Knowing that it is now 2022 (Lambda will turn ten in a couple of years), would we need a VPC (or VPCs) for this solution for maximum security (regarding infrastructure alone)? It has always seemed to me that good governance, automatic rotation of IAM credentials, a strong CI/CD pipeline, and continuous external security checks would be enough for a serverless architecture, so that developers and DevOps engineers wouldn't need to invest a lot of energy setting up and maintaining networking and VPCs.
Any help would be appreciated.
Cheers
It is not a must; you can keep your services secure without a VPC. However, a VPC can sometimes be more cost-effective. For example, if your Lambda functions sit in a VPC and pull data from S3 through a NAT gateway, you pay a per-GB data-processing fee; with an S3 gateway endpoint in the same VPC, that traffic incurs no such fee.
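As a rough illustration of that trade-off, here is a sketch comparing the NAT-gateway path with an S3 gateway endpoint. All prices below are assumed placeholders, not current AWS rates:

```python
# Monthly cost of moving data from S3 into a Lambda that sits in a VPC,
# through a NAT gateway vs. through an S3 *gateway* VPC endpoint (which
# has no hourly rate and no data-processing charge).
# PRICES ARE ASSUMED PLACEHOLDERS -- check current AWS pricing.
NAT_HOURLY = 0.045      # assumed NAT gateway $/hour
NAT_PER_GB = 0.045      # assumed NAT gateway data-processing $/GB
HOURS_PER_MONTH = 730

def nat_monthly_cost(gb_moved):
    """NAT gateway: pay for the hours it runs plus every GB it processes."""
    return NAT_HOURLY * HOURS_PER_MONTH + gb_moved * NAT_PER_GB

def endpoint_monthly_cost(gb_moved):
    """S3 gateway endpoint: no hourly rate and no per-GB processing fee."""
    return 0.0

for gb in (100, 500, 2000):
    print(f"{gb:>5} GB/month  NAT: ${nat_monthly_cost(gb):8.2f}"
          f"  gateway endpoint: ${endpoint_monthly_cost(gb):.2f}")
```

The fixed hourly charge means the NAT gateway costs money even at zero traffic, which is why the endpoint comes out ahead regardless of volume in this sketch.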
Furthermore, the two-accounts-per-microservice approach seems a bit complex. I would rather have one CDK construct / Terraform module / CloudFormation template per microservice and then two deployed instances of it, one for test and one for prod. The default quota for AWS Organizations is 10 accounts, so that approach would limit you to 5 microservices unless you request a quota increase.
Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 2 years ago.
I have a lot of App Services on a lot of different storage accounts and would like to consolidate some of them. Can you move Function Apps and App Services to new storage accounts? I have not found anything in the admin UI.
An App Service doesn't run on a storage account; it connects to one, which means you can simply switch connection strings. You should, however, think about migrating the data as well.
Azure Functions do have a storage account associated with them (although it, too, is connected to via a connection string), because the runtime uses it for managing triggers and for dashboarding functionality.
More information here: Storage considerations for Azure Functions.
When creating a function app, you must create or link to a general-purpose Azure Storage account that supports Blob, Queue, and Table storage. This is because Functions relies on Azure Storage for operations such as managing triggers and logging function executions.
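The data-migration step can be sketched as a simple copy loop. The `MemoryContainer` class below is a hypothetical in-memory stand-in so the example is self-contained; with the real azure-storage-blob SDK you would list, download, and upload blobs through container clients instead.

```python
# Sketch of migrating blobs from an old storage account to a new one.
# MemoryContainer is a made-up stand-in for a blob container client;
# it is NOT part of any Azure SDK.
class MemoryContainer:
    """Minimal in-memory stand-in for a blob container client."""
    def __init__(self, blobs=None):
        self.blobs = dict(blobs or {})

    def list_blob_names(self):
        return sorted(self.blobs)

    def download(self, name):
        return self.blobs[name]

    def upload(self, name, data, overwrite=False):
        if not overwrite and name in self.blobs:
            raise ValueError(f"blob {name!r} already exists")
        self.blobs[name] = data

def migrate(src, dst):
    """Copy every blob from src to dst; returns the names copied."""
    copied = []
    for name in src.list_blob_names():
        dst.upload(name, src.download(name), overwrite=True)
        copied.append(name)
    return copied

old = MemoryContainer({"logs/a.txt": b"aaa", "logs/b.txt": b"bbb"})
new = MemoryContainer()
print(migrate(old, new))  # ['logs/a.txt', 'logs/b.txt']
```

After the copy completes and is verified, you would switch the app's connection string to the new account and retire the old one.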
Closed 3 years ago.
I would like to know the best approach to learning Azure SQL Database at low cost. If I create a free account, it only works for 30 days, and I need more time than that.
So, basically I need an account which can allow me to use:
Azure SQL Database (can be basic tier)
Azure Blob Storage
Azure Data Factory
Azure VM
With this in mind, is there a way to get these services at low cost in order to learn Azure?
One solution would be to upgrade your free account to a Pay-As-You-Go account.
https://azure.microsoft.com/en-us/offers/ms-azr-0003p/
With this you can try a lot of Azure services. Many services are charged based on usage, and for training purposes the cost would be next to nothing because your usage would be very low. You can delete the resources as soon as you finish learning in order to keep your account clean.
This question does not appear to be about programming within the scope defined in the help center.
Closed 4 years ago.
Do the VMs in Azure Container Service cost anything? Microsoft says that AKS (Azure Container Service) is free for students, but when I go to create a container service it shows a price list with VMs. Will this cost me anything as a student subscriber?
Azure Container Service is free for a student subscriber, including the cluster's virtual machines.
But you should make sure that you haven't exhausted your available credit or reached the end of the 12 months. You can see your remaining credit on the Microsoft Azure Sponsorships portal.
For more details, please refer to this article.
What happens after I use my $100 credit or I’m at the end of 12 months?
After you exhaust your available credit or reach the end of 12 months, your Azure subscription will be disabled. To continue using Azure, you may upgrade to a Pay-As-You-Go subscription by contacting Azure Support.
Closed. This question is off-topic. It is not currently accepting answers.
Closed 10 years ago.
I have a question regarding optimizing Azure blob download speeds. I am looking at having a private container in Azure Blob Storage with 10,000 files of roughly 5 MB each. Whenever a user wants to download a file, I will generate a SAS URL for them. As of now, I expect around 1,000 concurrent users downloading various files at any point in time.
I would like to know whether any of the below steps will help me to maintain optimal download speeds for this kind of usage.
1. Will storing the files across different containers help improve download speeds?
2. I read on the Windows Azure storage team's blog that each storage account has a fixed bandwidth. To offset this, do I need to store the files across different storage accounts?
3. Is a single container in a storage account sufficient to get the best download speeds for ~1,000 concurrent users?
It would also be great if you could let me know the best practices to achieve this.
1. No. Blobs are the partitions, not the containers.
2. Yes.
3. It depends. The target throughput of a single blob is up to 60 MB/s, but since you're talking about 10,000 files this shouldn't be a problem (assuming your 1,000 concurrent users download different files). What you'll need to look at is the scalability target of the storage account, where throughput is up to 3 gigabits per second. This could become an issue if your application grows, but there are a few solutions you can look at:
Use multiple storage accounts (maybe one per country, per application, ...). The limit on storage accounts is pretty low (it used to be 5 per subscription; I don't know if this has changed), so you'll need to contact Microsoft if you want to use more storage accounts.
Think about using the CDN together with blob storage to expose your files. This will improve performance (more throughput), and your users will also download the files much faster since they download from a 'nearby' location.
You can also do some caching in your Web Roles (in LocalResources, for example, or with the Caching Preview) to cache your popular files, but I wouldn't advise doing this.
This article is a good place to start: Windows Azure Storage Abstractions and their Scalability Targets
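To put the scalability targets quoted above into perspective, here is a quick back-of-envelope calculation. It is pure arithmetic; the 3 Gbit/s figure is the one cited in the answer, and real-world throughput will vary:

```python
# How far does a ~3 Gbit/s per-account target stretch across
# 1,000 concurrent downloads of ~5 MB files? Pure arithmetic,
# no Azure calls; the target figure is the one quoted above.
ACCOUNT_TARGET_MBPS = 3000   # 3 Gbit/s expressed in Mbit/s
CONCURRENT_USERS = 1000
FILE_MB = 5

per_user_mbps = ACCOUNT_TARGET_MBPS / CONCURRENT_USERS  # Mbit/s each
file_megabits = FILE_MB * 8                             # 5 MB = 40 Mbit
seconds_per_file = file_megabits / per_user_mbps

print(f"{per_user_mbps:.1f} Mbit/s per user -> "
      f"~{seconds_per_file:.1f} s per {FILE_MB} MB file")
```

At full saturation each user gets only about 3 Mbit/s, i.e. roughly 13 seconds per 5 MB file, which is exactly why the answer suggests spreading load across storage accounts or fronting the blobs with a CDN.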