Upload video to youtube from Firebase Storage using cloud function [closed] - node.js

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 1 year ago.
Can I upload a video to my YouTube channel from Firebase Storage using a Cloud Function? There is no restriction on how the function is triggered; it can be an HTTP trigger, a cron job, or a Firebase callable function. I have not found any code related to this yet. Thanks.

I'm not very familiar with the YouTube API, but I looked at it briefly and I think it will be hard, or even impossible, to do this in general with no limits. Looking at the Firebase Quotas page, there are quotas that may block the idea.
A Cloud Function has a hard time limit of 540 seconds. Some videos could be downloaded and uploaded within those 9 minutes, but most video content certainly cannot be uploaded in that time.
Another problem is that I could not find any way to upload without a local file system. A Cloud Function has no persistent local file system; it only has an in-memory /tmp directory, so anything written there counts against the function's total memory usage, which is capped by quota at 4 GB. That is not much for video purposes either.
So in my opinion it may be possible to upload a small video using a Cloud Function, but I don't think you will be able to upload larger content.

Related

403 Increasing the number of Parents is not allowed Google Drive API React Native [closed]

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 2 years ago.
I am using react-native-google-drive-api-wrapper.
My code had been working completely fine for almost 4 months. But now I am suddenly unable to upload files to any folder in Google Drive from my React Native app.
I don't think this is an issue of exceeding limits, because I can still upload files to the root folder, just not to any folder or subfolder.
Error is:
403 Increasing the number of Parents is not allowed Google Drive API React Native
Log:
{"error":{"errors":[{"domain":"global","reason":"cannotAddParent",
"message":"Increasing the number of parents is not allowed"}],
"code":403,"message":"Increasing the number of parents is not allowed"}}
Kindly help!
Have a look at the updated behavior:
Beginning Sept. 30, 2020, you will no longer be able to place a file in multiple parent folders; every file must have exactly one parent folder location. Following is a summary of behavior changes related to the Drive API's new single-parent model.
There is a guide on how to migrate your app to the single-parent model; I recommend following it.
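Concretely, under the single-parent model you must move a file (removing the old parent in the same request) rather than add a second parent. A hedged sketch against the raw Drive v3 REST API, using React Native's built-in fetch (the file ID, parent IDs, and access token are placeholders; your wrapper library may expose the same `addParents`/`removeParents` parameters under its own method names):

```javascript
// Move a file to a new folder under the Drive single-parent model.
// Passing addParents together with removeParents keeps the parent count
// at exactly one, which avoids the 403 "cannotAddParent" error.
async function moveDriveFile(fileId, oldParentId, newParentId, accessToken) {
  const url = `https://www.googleapis.com/drive/v3/files/${fileId}` +
              `?addParents=${newParentId}&removeParents=${oldParentId}`;
  const res = await fetch(url, {
    method: 'PATCH',
    headers: { Authorization: `Bearer ${accessToken}` },
  });
  if (!res.ok) throw new Error(`Drive API error: ${res.status}`);
  return res.json();
}
```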

Photo Sharing Vs Storage [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 5 years ago.
There are a lot of photo-sharing applications out there; some make money and some don't. Photo sharing takes a lot of space, so I wonder where they host the images. Rich services are probably using Amazon or their own servers, but what about the rest? Do they have access to some kind of free service, or have they purchased terabytes from their web host?
AWS S3 is what you are generally referring to. The cost is mainly due to the durability guarantees on the data stored. For photo sharing, that much durability is generally not required (compared with, say, a financial statement).
AWS also has cheaper options such as S3 Reduced Redundancy Storage (RRS) and Glacier. Photos that have not been accessed for a long time can be moved to Glacier (retrieval takes time, but it is cheap), and RRS can be used for derived images that can be re-generated if lost, such as thumbnails. Good photo-sharing services make a lot of such storage decisions to manage cost.
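As a sketch of the thumbnail case, this is roughly what choosing the cheaper storage class looks like with the aws-sdk v2 `putObject` call (the bucket and key are placeholders; note that AWS has since steered users toward Standard-IA and lifecycle rules rather than RRS, so treat the class name as illustrative):

```javascript
// Upload a re-generatable derived image (e.g. a thumbnail) using a
// cheaper, less durable storage class. If the object is lost, it can
// simply be re-rendered from the original.
function uploadThumbnail(bucket, key, body) {
  const AWS = require('aws-sdk'); // lazy require; assumes aws-sdk v2 is installed
  const s3 = new AWS.S3();
  return s3.putObject({
    Bucket: bucket,
    Key: key,
    Body: body,
    StorageClass: 'REDUCED_REDUNDANCY',
  }).promise();
}
```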
You can read more on these types here : http://aws.amazon.com/s3/faqs/
There is also a casestudy of SmugMug on AWS. I also listened to him once, where he was telling about using his own hard-disks initially to store, but later S3 costs came down and he moved on to AWS. Read the details here:
AWS Case Study: SmugMug's Cloud Migration : http://aws.amazon.com/solutions/case-studies/smugmug/

Amazon S3 upload using rest API and node.js [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 6 years ago.
I want to upload an image to S3 storage, but I can't get it to work.
My app is hosted on parse.com, so I can't use npm to install aws-sdk.
Please help; I'm a newbie with AWS and node.js.
I posted a link as a comment, but I will give it a bit of explanation here.
I am not sure whether it is possible to upload to S3 through Parse (mainly because that would be a lot of unnecessary traffic for Parse), but it is possible to upload to S3 directly from your client by using a signed policy. This signed policy effectively tells S3 that you are authorizing the device to upload to your bucket, as long as the requirements included in the policy are met.
This question on Parse's site gives more information about this, as well as Cloud Code that should generate the signature for you. As always, I recommend you understand what that code is doing before using it in any production app or service.
You can probably find more information about client-side uploads with a quick Google search for something like 'client side upload to S3'.
This seems like a perfect place to use https://www.inkfilepicker.com. Just plug in your own S3 creds and off you go.
If you insist on doing it the painful way, use the S3 REST API documented at http://docs.aws.amazon.com/AmazonS3/latest/dev/S3_Authentication2.html and build out your Cloud Code functions with the networking capability available in Parse.
There is a reason inkfilepicker exists, though...

put all images in a database or just in a folder [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 8 years ago.
I am developing a website which uses a lot of images.
The images get manipulated very often (every few seconds, by the clients). All images are on a linux server. It is also possible that two clients try to change an image at the same time.
So my question is: should I put the images into a database, or just leave them in a folder? And how does the OS handle write-write collisions?
I use node.js and mongoDB on the server.
You usually store a reference to the file's location in the database rather than the file itself. As for write-write collisions: in most cases whoever opens the file first wins, but the exact behavior depends on the OS you are working with. You will want to look into file locking; this Wikipedia article gives a good overview:
http://en.wikipedia.org/wiki/File_locking
It is also considered good practice for your code to check whether the file is in use and notify the user, if write collisions are likely to occur.
I suggest you store your images in MongoDB using the GridFS file system. This lets you keep images and their metadata together, it offers atomic per-chunk updates, and it has two more advantages if you are using replica sets for your database:
Your images have the same high availability as the rest of your data and get backed up together.
You can, eventually, direct queries to secondary members of the set.
For more information, see
http://docs.mongodb.org/manual/applications/gridfs
http://docs.mongodb.org/manual/replication/
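A minimal GridFS upload sketch, assuming the official `mongodb` Node driver (the `images` bucket name and metadata fields are illustrative):

```javascript
// Stream a local image file into GridFS, keeping app-level metadata
// alongside the file bytes in the same database.
function storeImage(db, filename, localPath) {
  const { GridFSBucket } = require('mongodb'); // lazy require
  const fs = require('fs');
  const bucket = new GridFSBucket(db, { bucketName: 'images' });
  return new Promise((resolve, reject) => {
    fs.createReadStream(localPath)
      .pipe(bucket.openUploadStream(filename, {
        metadata: { uploadedAt: new Date() },
      }))
      .on('error', reject)
      .on('finish', resolve);
  });
}
```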
Does this help?
Cheers
Ronald

Amazon S3 GET Failures and Retry [closed]

Closed. This question is off-topic. It is not currently accepting answers.
Closed 10 years ago.
We have been using S3 for storing images and PDFs for our web application for some time. When we first coded our web application, the error rate on S3 GETs was fairly high (~1% on first attempt) and we built in retry semantics in our client code. That is, the client would attempt an S3 image download and on failure it would retry several more times.
My question:
Is the S3 GET error rate still high enough to require retries (let's say > 0.1%)? Note: I am not concerned with whole-S3 data-center downtime; that is a separate problem. Any analytics on this topic would help a lot (e.g., error rate per resource size).
We are seeing slightly higher failure rates than that using Amazon's SDK libraries; I estimate ours at about 5%. I find it hard to believe that a service that unreliable is the de facto standard for cloud storage. That is a sad state of affairs.
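Whatever the current error rate is, the usual mitigation is the retry-with-backoff wrapper the question describes. A generic sketch (the attempt count and delays are illustrative, not AWS recommendations):

```javascript
// Retry an async operation with exponential backoff and jitter.
// Transient failures (like occasional S3 GET errors) are absorbed;
// the last error is rethrown once attempts are exhausted.
async function withRetries(fn, { attempts = 4, baseDelayMs = 100 } = {}) {
  let lastErr;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastErr = err;
      // Backoff grows 100ms, 200ms, 400ms, ... with random jitter so
      // concurrent clients don't retry in lockstep.
      const delay = baseDelayMs * 2 ** i * (0.5 + Math.random() / 2);
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
  throw lastErr;
}
```

Usage is just `withRetries(() => downloadImage(key))`; modern AWS SDKs also have configurable built-in retry behavior, which may make a hand-rolled wrapper unnecessary.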
