IPFS data persistence over time - security

I am investigating IPFS to build a PoC.
I have read several blogs and familiarized myself with some of the concepts, but I can't find a good explanation of how long data lasts on IPFS.
I want to persist data on IPFS for a long time for the site I am developing.
I know that Infura and other paid services offer IPFS projects, and I also know that I can run my own infrastructure and deploy an IPFS full node.
Is the data pushed to IPFS secure and persistent over time, or could it be deleted without my knowledge? Is it secure?
Thanks in advance

Related

Social network app architecture with React+Nodejs and Kafka

I have an idea for a social network website, and as I'm currently learning web development I thought it would be a great idea to practice on. I have already worked out the business logic and the front-end design with React. Now it's time for the backend, and I'm struggling.
I want to create a React+Nodejs event-driven app. It seems logical to use Kafka right away. I looked through various Kafka architecture examples and have several questions:
Is it possible to create an app that passes data through Kafka via API calls between Node.js and React, and uses a relational database only for long-term storage?
Is it better to use Kafka to handle all events but communicate with some NoSQL database like Cassandra or HBase? Then it seems that Node.js would have to make API calls to them and send the data to React.
Am I completely missing the point and talking nonsense?
Any answer to this question is going to be quite opinion based, so I'm just going to try to stick to the facts.
It is totally possible to create such an application. Moreover, Kafka is basically a distributed log, so you can use it as an event store and build your state from that.
That mainly depends on your architecture, and there are too many gaps here to answer this with any certainty - what kind of data are you saving? What kind of questions will you need answered? What does your domain model look like? You could use Kafka as a store, or as a persistent messaging service.
I don't think you're off the mark, but perhaps you're going for the big guns when in reality you don't really need them. Kafka is great for a very large volume of events going through. If you're building something new, you don't have that volume. Perhaps start with something simpler that doesn't require so much operational complexity.
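The "build your state from the log" idea mentioned above can be sketched without any Kafka client at all: replaying an ordered event log through a fold always reproduces the same state, which is what makes a distributed log usable as an event store. The event names and shapes below are hypothetical, not part of any real schema.

```javascript
// Hypothetical event log, standing in for a Kafka topic replayed from offset 0.
const events = [
  { type: 'UserRegistered', userId: 'u1', name: 'Alice' },
  { type: 'PostCreated', userId: 'u1', postId: 'p1', text: 'hello' },
  { type: 'UserFollowed', follower: 'u1', followee: 'u2' },
];

// Fold the log into current state; replaying the same log always
// yields the same state, so the log itself is the source of truth.
function buildState(log) {
  const state = { users: {}, posts: {}, follows: [] };
  for (const e of log) {
    switch (e.type) {
      case 'UserRegistered':
        state.users[e.userId] = { name: e.name };
        break;
      case 'PostCreated':
        state.posts[e.postId] = { by: e.userId, text: e.text };
        break;
      case 'UserFollowed':
        state.follows.push([e.follower, e.followee]);
        break;
    }
  }
  return state;
}

console.log(buildState(events).posts.p1.text); // 'hello'
```

A relational or NoSQL database then becomes a disposable read model: you can rebuild it at any time by replaying the log.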

Storing data persistently on IPFS

Recently I developed an alternative to Google Drive using IPFS (the decentralized storage technology). The app served its purpose but suffered from two major problems:
The app worked great for small files, but on large files the download was very slow and eventually stopped.
Data was not persistent, meaning I lost a few files a few hours after uploading them.
My questions:
Is IPFS a persistent storage system? If no what measures can be used to make it persistent?
I understand your question, so let me go through the points.
Is IPFS a persistent storage system ?
IPFS is a distributed system that can (among other things) resolve a content hash to the content it represents. This content can never truly be guaranteed to be available (maybe you're offline, maybe all of the peers with it are offline, maybe you're behind a powerful NAT, maybe the network split and the peers with the content are on the other partition).
In a decentralized system like IPFS, an object stays online only while the nodes holding it spend the resources to keep serving it.
As for your second part: IPFS aims at permanence, but permanence != persistence. IPFS itself currently handles this by means of "pinning", which excludes an object and its children from garbage collection within one IPFS node.
Work is ongoing to make data more persistent. One effort is Filecoin (see the paper), and there are a couple of concrete ideas for an ipfs-cluster tool.
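To illustrate the pinning semantics described above, here is a conceptual sketch in plain Node.js (no real IPFS involved; the CIDs are fake placeholders): the node's garbage collector reclaims every block that is not pinned, so unpinned uploads can silently disappear.

```javascript
// A node's local block store, keyed by content ID. CIDs are fake placeholders.
const blocks = new Map([
  ['QmFakeCidA', 'my site index'],
  ['QmFakeCidB', 'a temporary file'],
]);

// Explicitly pinned content is exempt from garbage collection.
const pins = new Set(['QmFakeCidA']);

// Simplified GC: delete every block whose CID is not in the pin set.
function runGC(store, pinned) {
  for (const cid of [...store.keys()]) {
    if (!pinned.has(cid)) store.delete(cid); // unpinned blocks are reclaimed
  }
}

runGC(blocks, pins);
console.log([...blocks.keys()]); // only the pinned CID survives
```

On a real node the equivalent is `ipfs pin add <cid>`; content that matters to you should be pinned on at least one node you control (or on a paid pinning service).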

How to serve node.js service for worldwide customers and fast?

I have a local VPS in my country that hosts and serves my Node.js REST API.
However, soon I will need to open it up to different countries.
That means remote clients will be calling my services.
Since they are far away, the connection will probably be slow.
How can I avoid this? Maybe I need more servers located in their countries too, but then how can the data be shared through one DB?
I am not looking for a full tutorial on how to do this (though it would be nice to have); I am looking for information about the methodology.
What do you recommend: keep buying servers in remote countries and share their data between them somehow, or choose a cloud service like Firebase? How do cloud services work in the first place?
Without going into too much detail on each item, here are some key points I think you should focus your learning on to solve your problem.
For data storage, look into Firestore (not the JSON database), as Firestore is globally scalable.
For your REST endpoints I would use Google Cloud Functions, but without knowing the nature of your application it's hard to say if it's suitable. The key to reaching global scale is having cacheable endpoints. Then you are leveraging Google's global CDN, which is much faster than hitting the origin server. Note: the Firebase Cloud Functions infrastructure WILL face cold-start issues, which may or may not be a problem for you.
Cache invalidation is a little lacking, so you can leverage longer max-age cache settings, but use cache busting and/or the stale-while-revalidate header to help with this.
There is some great info here https://www.youtube.com/watch?v=dbV-293m1dQ that covers some of what I have mentioned in more detail.
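The caching advice above can be sketched as a small helper. This is illustrative only: the route name and TTL values are assumptions, and the Express usage is shown in comments since it needs a running server.

```javascript
// Build a Cache-Control header combining a normal TTL with
// stale-while-revalidate, so the CDN can serve a slightly stale
// response while it refreshes from the origin in the background.
function buildCacheControl({ maxAge, staleWhileRevalidate }) {
  return `public, max-age=${maxAge}, stale-while-revalidate=${staleWhileRevalidate}`;
}

// In a real Express app this would look something like:
//   app.get('/api/posts', (req, res) => {
//     res.set('Cache-Control',
//       buildCacheControl({ maxAge: 300, staleWhileRevalidate: 60 }));
//     res.json(posts);
//   });
console.log(buildCacheControl({ maxAge: 300, staleWhileRevalidate: 60 }));
```

With headers like this, most reads are answered at the CDN edge near the user, and only cache misses travel back to your single-region origin.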

Do DynamoDB and Cloudant store data at edge locations?

Trying to decide between DynamoDB and CouchDB for my website. It's a static site (built with a static site generator) and I'm planning on using a JavaScript module to build a comment system.
I'm toying with using PouchDB and CouchDB so that synchronizing is easy. I'm also considering DynamoDB.
I have a performance question. Do any of these databases push data out to edge locations so that latency is reduced? Or is my database essentially sitting on one virtual server somewhere?
From what I know, neither of these solutions utilises edge locations out of the box.
Since you're mentioning PouchDB, I assume you want to use a client-side database in your app?
If that's the case, you should keep in mind that, in order to sync, a client-side DB needs access to your cloud DB. So it's not really suitable for a comment system, since any client could just delete other users' comments, edit them, etc.
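If you do stay with CouchDB, the usual mitigation is a server-side `validate_doc_update` function, which CouchDB runs on every write (including replicated writes from PouchDB clients). A minimal sketch, where the `author` field and the rule itself are assumptions about your comment schema:

```javascript
// CouchDB design-document validation function: runs on the server for
// every incoming write, so a syncing client cannot bypass it.
function validate_doc_update(newDoc, oldDoc, userCtx) {
  // Updates/deletes: only the original author may touch an existing comment.
  if (oldDoc && oldDoc.author !== userCtx.name) {
    throw { forbidden: 'You may only modify your own comments.' };
  }
  // Creates: a new comment must be attributed to the authenticated user.
  if (!oldDoc && newDoc.author !== userCtx.name) {
    throw { forbidden: 'Comments must be created under your own name.' };
  }
}
```

Rejected writes fail server-side with a `forbidden` error, so even though every client can attempt any change locally, only legitimate edits survive replication.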

How to design chat based application with nodejs, redis and mongodb?

We have an iOS application with a chat feature. Currently it works with long polling, and now we are trying to modify it to work with sockets. We did some research, and it seems one of the best options is Node.js with socket.io. We then used Redis pub/sub to manage message delivery and storage.
After researching Redis a bit more, the recommended usage suggests the stored data should fit in memory, but our database is fairly large and we would like to store the whole chat history. So we plan to use Redis as a cache database that stores an online user's chat history (maybe not all of it), and to flush the conversation from Redis to MongoDB/SimpleDB once the user goes offline (or write to both instantly).
So, as a summary, we are about to decide on Node.js and Redis pub/sub to deliver messages, Redis as a cache database, and MongoDB to store the whole conversation.
What do you think about this design? Is it acceptable? Or, if there is a better way, can you please explain it in a little more detail?
Thanks in advance.
For a chat system, you're thinking big. If you think you're going to reach a million users, then go for it. Consider also availability: how will your system deal with the failure of a machine?
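The cache-then-flush design from the question can be sketched without any real Redis or MongoDB, using plain in-memory stand-ins (all names here are illustrative, not a real API):

```javascript
// A Map plays the Redis role (fast cache of an online user's messages);
// an array plays the MongoDB role (durable long-term store).
const cache = new Map(); // userId -> recent messages
const durable = [];      // whole conversation history

// On each chat message, append to the user's cached history.
function onMessage(userId, text) {
  if (!cache.has(userId)) cache.set(userId, []);
  cache.get(userId).push({ userId, text, at: Date.now() });
}

// When the user disconnects, flush their cached history to durable
// storage and free the cache memory.
function onDisconnect(userId) {
  durable.push(...(cache.get(userId) || []));
  cache.delete(userId);
}

onMessage('u1', 'hi');
onMessage('u1', 'are you there?');
onDisconnect('u1');
console.log(durable.length); // 2
```

One caveat worth noting with flush-on-disconnect: if the Node.js process crashes before the flush, the cached messages are lost, which is an argument for the "write to both instantly" variant mentioned in the question.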
