Amazon S3 GET Failures and Retry [closed]

We have been using S3 to store images and PDFs for our web application for some time. When we first built the application, the error rate on S3 GETs was fairly high (~1% on the first attempt), so we built retry semantics into our client code: the client attempts an S3 image download and, on failure, retries several more times.
My question:
Is the S3 GET error rate still high enough to require retries (let's say > 0.1%)? Note: I am not concerned with whole S3 data-center outages; that is a separate problem. Any analytics on this topic would help a lot (e.g., error rate by resource size).

We are seeing somewhat higher failure rates than that using Amazon's SDK libraries; I estimate ours at about 5%. I find it hard to believe that a service that unreliable is the de facto standard for cloud storage. That is a sad state of affairs.
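For anyone implementing the retry approach described in the question, here is a minimal sketch in Python with boto3 (the bucket and key names are hypothetical). Note that boto3 can also retry transient failures for you through its retry configuration, so an explicit loop is only one option:

    import time

    import boto3
    from botocore.config import Config
    from botocore.exceptions import ClientError

    # Option 1: let the SDK retry transient errors for you.
    # "standard" mode retries throttling and 5xx errors with backoff.
    s3 = boto3.client(
        "s3",
        config=Config(retries={"max_attempts": 5, "mode": "standard"}),
    )

    # Option 2: an explicit retry loop with exponential backoff,
    # mirroring the client-side retry semantics described above.
    def get_with_retries(bucket, key, attempts=4):
        for attempt in range(attempts):
            try:
                return s3.get_object(Bucket=bucket, Key=key)["Body"].read()
            except ClientError:
                if attempt == attempts - 1:
                    raise
                time.sleep(2 ** attempt)  # back off 1 s, 2 s, 4 s, ...

    # data = get_with_retries("my-bucket", "images/photo.jpg")  # hypothetical names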

Related

Upload video to YouTube from Firebase Storage using a Cloud Function [closed]

Can I upload a video to my YouTube channel from Firebase Storage using a Cloud Function? There is no restriction on how the function is triggered; it can be invoked by any method, such as an HTTP trigger, a cron job, or a Firebase callable function. I have not found any code related to this yet. Thanks.
I'm not very familiar with the YouTube API, but I took a quick look at it, and I think it will be hard, or even impossible, to do this in the general case with no limits. Looking at the Firebase quotas page, there are quotas that would constrain such an idea.
Cloud Functions has a hard time limit of 540 seconds. There may be some videos you can transfer within those 9 minutes, but most YouTube-length content certainly cannot be uploaded in that time.
Another problem is that I could not find any way to upload without a local file system. Cloud Functions has no persistent local file system; it only offers an in-memory tmp directory, which counts against total memory usage, capped by a 4 GB quota. Not much for video purposes either.
So in my opinion it may be possible to upload a small video from a Cloud Function, but I don't think you will be able to upload larger content.
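To make those constraints concrete, here is a rough Python sketch of what such a function would have to do: pull the video out of the storage bucket into /tmp, then push it to the YouTube Data API. The bucket name, blob path, and credentials are placeholders, and for anything but a small file this will run into the time and memory limits discussed above:

    from google.cloud import storage
    from googleapiclient.discovery import build
    from googleapiclient.http import MediaFileUpload

    def upload_to_youtube(bucket_name, blob_path, creds):
        # 1. Download from the storage bucket into /tmp, which on Cloud
        #    Functions is in-memory and counts against the memory quota.
        local_path = "/tmp/video.mp4"
        storage.Client().bucket(bucket_name).blob(blob_path).download_to_filename(local_path)

        # 2. Upload to YouTube via the Data API; creds must be OAuth
        #    credentials authorized for the target channel.
        youtube = build("youtube", "v3", credentials=creds)
        request = youtube.videos().insert(
            part="snippet,status",
            body={
                "snippet": {"title": "Uploaded from Firebase Storage"},
                "status": {"privacyStatus": "private"},
            },
            media_body=MediaFileUpload(local_path, resumable=True),
        )
        return request.execute()  # everything must finish within 540 s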

Which is better for storing web application logs, and why: Application Insights or Azure Table Storage? [closed]

I want to store application exception logs. Which option is better: Application Insights or Table Storage?
Based on the minimal amount of information you're providing:
Application Insights is the more mature solution. With Table Storage you would have to build much of what it gives you for exception data yourself: reporting, trends, correlation, alerting, and so on.
Have a look over here: https://learn.microsoft.com/en-us/azure/application-insights/app-insights-asp-net-exceptions.
The document is about a web application, but you can store information from all types of applications in Azure. For this, also see TelemetryClient.TrackException.
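The TrackException call above is from the .NET SDK; as a sketch of the same pattern elsewhere, the applicationinsights package for Python exposes an equivalent track_exception (the instrumentation key below is a placeholder, and risky_operation stands in for real application code):

    from applicationinsights import TelemetryClient

    tc = TelemetryClient("<instrumentation-key>")  # placeholder key

    def risky_operation():
        raise ValueError("boom")  # stand-in for real application code

    try:
        risky_operation()
    except Exception:
        # Sends the active exception and stack trace to Application
        # Insights, feeding the reporting/alerting mentioned above.
        tc.track_exception()
        tc.flush()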

Photo Sharing Vs Storage [closed]

There are a lot of photo-sharing applications out there; some make money and some don't. Photo sharing takes a lot of space, so I wonder where they host it all. Well-funded services are probably using Amazon or their own servers, but what about the rest? Do they have access to some kind of free service, or have they purchased terabytes from their web host?
AWS S3 is generally what you are referring to. Its cost is mainly due to the durability it guarantees for the data it stores; for photo sharing, that much durability is generally not required (compared with, say, a financial statement).
They also have cheaper options such as S3 RRS (Reduced Redundancy Storage) and Glacier. Photos that have not been accessed for a long time can be kept in Glacier (retrieval takes time, but it is cheap), while RRS suits transformed images that can be reconstructed if lost, such as thumbnails. Good photo-sharing services make many such storage decisions to manage cost, as shown in the sketch below.
You can read more on these types here : http://aws.amazon.com/s3/faqs/
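As a rough illustration of those decisions in code, here is a boto3 sketch (the bucket, keys, and local file name are hypothetical): a thumbnail is uploaded under Reduced Redundancy, and a lifecycle rule moves old originals to Glacier:

    import boto3

    s3 = boto3.client("s3")

    # Reconstructible derivatives such as thumbnails can use the cheaper
    # Reduced Redundancy storage class at upload time.
    with open("photo-123-thumb.jpg", "rb") as thumb:  # hypothetical local file
        s3.put_object(
            Bucket="my-photo-bucket",        # hypothetical bucket
            Key="thumbs/photo-123.jpg",      # hypothetical key
            Body=thumb,
            StorageClass="REDUCED_REDUNDANCY",
        )

    # Originals untouched for a year are moved to Glacier automatically.
    s3.put_bucket_lifecycle_configuration(
        Bucket="my-photo-bucket",
        LifecycleConfiguration={
            "Rules": [{
                "ID": "archive-old-originals",
                "Filter": {"Prefix": "originals/"},
                "Status": "Enabled",
                "Transitions": [{"Days": 365, "StorageClass": "GLACIER"}],
            }]
        },
    )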
There is also a case study of SmugMug on AWS. I once heard SmugMug's CEO speak about initially storing photos on his own hard disks, then moving to AWS as S3 costs came down. Read the details here:
AWS Case Study: SmugMug's Cloud Migration: http://aws.amazon.com/solutions/case-studies/smugmug/

Windows Azure web site in free mode outbound data [closed]

I have deployed a website on Windows Azure that lets users upload and download files, which are stored in Windows Azure Blob Storage. I noticed that in the free mode the maximum outbound data per day is 165 MB, while inbound data is unlimited, as the pricing details mention. What happens to the website if the amount of outbound data exceeds 165 MB? The files uploaded and downloaded through my website can be larger than 200 MB, and I did not see any pricing detail covering this situation. I am also worried about whether the website will keep working properly in that case. Thanks.
The free site will stop working if the limit is exceeded. If you need that much, you may want to consider the Shared option at ~$10 a month during the preview. The Shared option will not throttle you when you exceed 165 MB; you will just be charged for what you consume.

How much does Azure scaling change the bill? [closed]

How can we find out how much money we are saving by scaling our Windows Azure application (by increasing and decreasing instance counts)?
Also, is there a way to find out how much database storage is used and its cost, and how much bandwidth is used and its cost?
I don't believe there is currently any way to measure in an immediate way how your Windows Azure application changes affect your billing and usage. There is, however, a feature request for a billing/usage API you could vote on.
SQL Azure includes two system views that can detail your storage and bandwidth usage.
The sys.database_usage view lists the number, type, and duration of databases on the server, and the sys.bandwidth_usage view describes the bandwidth used with each database.
The above was excerpted from this article.
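Those views can be queried like any other table. As a sketch, here is a hypothetical example using Python and pyodbc against the server's master database (the connection string is a placeholder):

    import pyodbc

    # Placeholder connection string; point it at your SQL Azure server.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=yourserver.database.windows.net;DATABASE=master;"
        "UID=youruser;PWD=yourpassword"
    )
    cursor = conn.cursor()

    # Number and type of databases over time.
    cursor.execute("SELECT time, sku, usage FROM sys.database_usage")
    for row in cursor.fetchall():
        print(row.time, row.sku, row.usage)

    # Ingress/egress per database, in KB.
    cursor.execute(
        "SELECT time, database_name, direction, quantity FROM sys.bandwidth_usage"
    )
    for row in cursor.fetchall():
        print(row.time, row.database_name, row.direction, row.quantity)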
Additional Links
In particular - How to find accrued billing charges for Windows Azure
In general - Search Stack Overflow for "Azure Billing"
