Prefect Multiple Database Connections

How does Prefect handle multiple database connections for flows, and how does it delegate drivers and runtimes for connections to, say, Oracle, SQL Server, and Snowflake?
If I have a single Prefect server on Kubernetes, does Docker handle all of this, or does the Prefect server host the many flows that connect to many sources?

Prefect does not connect to sources the way tools like Stitch or Fivetran do. Flows are Python scripts that Prefect orchestrates on the infrastructure of your choice; from those scripts you can open whatever connections you need and generally do whatever you want.
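A minimal sketch of what that looks like in practice, assuming Prefect 2.x; the hosts, credentials, and table names below are placeholders. Each task opens its own connection with the ordinary Python driver (python-oracledb, pyodbc, snowflake-connector-python), so it is the flow's runtime environment, not the Prefect server, that needs the drivers installed:

```python
import os

import oracledb                     # python-oracledb driver
import pyodbc                       # ODBC bridge for SQL Server
import snowflake.connector          # Snowflake Python connector
from prefect import flow, task


@task
def read_from_oracle() -> list:
    # Placeholder DSN/credentials; the flow process owns the driver,
    # Prefect only schedules and observes the run.
    conn = oracledb.connect(
        user=os.environ["ORACLE_USER"],
        password=os.environ["ORACLE_PASSWORD"],
        dsn="oracle-host/ORCLPDB1",
    )
    with conn:
        cur = conn.cursor()
        cur.execute("SELECT id, amount FROM invoices")
        return cur.fetchall()


@task
def read_from_sqlserver() -> list:
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 18 for SQL Server};"
        "SERVER=mssql-host;DATABASE=sales;"
        f"UID={os.environ['MSSQL_USER']};PWD={os.environ['MSSQL_PASSWORD']}"
    )
    with conn:
        return conn.cursor().execute("SELECT id, amount FROM orders").fetchall()


@task
def write_to_snowflake(rows: list) -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        database="ANALYTICS",
        schema="STAGING",
    )
    with conn.cursor() as cur:
        cur.executemany("INSERT INTO combined (id, amount) VALUES (%s, %s)", rows)


@flow
def multi_source_sync():
    # Two reads against different engines, one write to a third;
    # nothing here is Prefect-specific except the orchestration.
    rows = read_from_oracle() + read_from_sqlserver()
    write_to_snowflake(rows)


if __name__ == "__main__":
    multi_source_sync()
```

On Kubernetes, this means the container image your flow runs in (a Prefect worker or job container) must include the client drivers; the Prefect server itself never touches the databases.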

Related

Is it possible to connect to 3rd party database using Azure Logic Apps?

I am new to the cloud and looking to cut down costs on Azure. I already have a database on the Hostinger platform and would like to use it for the Python script that I want to run on the Azure Logic Apps platform. Is it possible to do this, or does Azure prevent such connections? Do I need to create a connector on Azure for this purpose? I have no idea how to run a Python script on Azure. If this is possible, it could be a great cost-cutting measure for me.
One workaround is to use Remote MySQL under Databases on the Hostinger platform.
Type the IP address of your remote server in the IP (IPv4 or IPv6) field on the Remote MySQL page, or check the Any Host box to allow connections from any IP.
Then choose the database you wish to access remotely and click Create.
Note that a MySQL user must use their MySQL server hostname for remote connections; the hostname is shown at the top of the same page.
You can now use this connection to build your own Logic App connector and reuse that connector for additional database operations.
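Once remote access is enabled, a quick way to verify it from Python before wiring up the Logic App connector (assuming the PyMySQL driver; the hostname, user, and database below are placeholders):

```python
import pymysql

# Use the MySQL hostname shown at the top of the Remote MySQL page,
# plus your own database name and credentials (all placeholders here).
connection = pymysql.connect(
    host="mysql.hostinger.com",
    user="u123456_app",
    password="your-password",
    database="u123456_mydb",
)
try:
    with connection.cursor() as cursor:
        cursor.execute("SELECT VERSION()")  # trivial query proving connectivity
        print(cursor.fetchone())
finally:
    connection.close()
```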
"would like to use it for the python script that I want to run on the Azure Logic Apps platform."
Depending on your requirements, you can use a variety of connectors for this. For example, in addition to the custom connector you use to reach the database on Hostinger, you can use Azure Functions to run your Python code.
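As a hedged sketch of that combination: an HTTP-triggered Azure Function using the Python v2 programming model, which the Logic App can call and which in turn reads the remote MySQL database (the route, credentials, and table are placeholders):

```python
import json

import azure.functions as func
import pymysql

app = func.FunctionApp()


@app.route(route="employees", auth_level=func.AuthLevel.FUNCTION)
def employees(req: func.HttpRequest) -> func.HttpResponse:
    # Placeholder connection details: the same remote-MySQL host as above.
    conn = pymysql.connect(
        host="mysql.hostinger.com",
        user="u123456_app",
        password="your-password",
        database="u123456_mydb",
    )
    try:
        with conn.cursor() as cur:
            cur.execute("SELECT id, name FROM employees")
            rows = cur.fetchall()
    finally:
        conn.close()
    return func.HttpResponse(json.dumps(rows), mimetype="application/json")
```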
For more information, you can refer to this example.
REFERENCES:
How to Allow Remote Connections to MySQL Database (hostinger.in)

How to host multiple databases for a micro-service application?

I have an application in mind that would be built using Node, MongoDB plus another DB, Kubernetes, RabbitMQ, Docker, and React as a front end. The application will be built in a microservice architecture. We all know that for a monolithic app all you need is one DB (MongoDB, MySQL, etc.), but a microservices app can have multiple databases. My question is: do I need to buy multiple, separate databases and connect each service to its own, or how does it work in a microservices design?
At the moment I have a sample microservices app running on my local machine using Docker, connected to multiple databases, one per service. I am just trying to get an idea of how this works with companies like DigitalOcean or AWS.
Any input on this would be great.
I am just trying to figure out how this is going to work in production later so that I am aware of costs and deployments. I have done some research on DigitalOcean, AWS, etc., but I still can't figure out how they work.
Thanks in advance.
You don't need multiple instances of a DBMS running. You can easily use one VM with a single MongoDB instance on it.
When you scale, you may want separate machines running DB instances for your services, but at the start you can just separate them logically, making sure services never communicate with each other through the database.
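For example, a single MongoDB instance can host one logical database per service; a sketch with pymongo, where the URI and names are placeholders:

```python
from pymongo import MongoClient

# One shared MongoDB instance (placeholder URI)...
client = MongoClient("mongodb://db-host:27017")

# ...but each service only ever touches its own logical database.
orders_db = client["orders_service"]
users_db = client["users_service"]

orders_db.orders.insert_one({"user_id": 1, "total": 42.0})
users_db.users.insert_one({"_id": 1, "name": "Ada"})

# The isolation is a convention enforced in code (and via per-service
# credentials), not by running separate database servers.
```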
Chris Richardson on his microservices.io website says:
There are a few different ways to keep a service's persistent data private. You do not need to provision a database server for each service. For example, if you are using a relational database then the options are:
- Private-tables-per-service – each service owns a set of tables that must only be accessed by that service
- Schema-per-service – each service has a database schema that's private to that service
- Database-server-per-service – each service has its own database server.
Source: https://microservices.io/patterns/data/database-per-service.html
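A sketch of the middle option, schema-per-service, on a single Postgres server, assuming SQLAlchemy and placeholder credentials; in production each service would connect with a role granted access only to its own schema:

```python
from sqlalchemy import create_engine, text

# One Postgres server (placeholder URL), one schema per service.
engine = create_engine("postgresql+psycopg2://app:secret@db-host/appdb")

with engine.begin() as conn:
    conn.execute(text("CREATE SCHEMA IF NOT EXISTS orders"))
    conn.execute(text("CREATE SCHEMA IF NOT EXISTS users"))
    # Each service's tables live in its own schema; access control
    # comes from granting USAGE on that schema to that service's role.
    conn.execute(text(
        "CREATE TABLE IF NOT EXISTS orders.orders "
        "(id serial PRIMARY KEY, total numeric)"
    ))
    conn.execute(text(
        "CREATE TABLE IF NOT EXISTS users.users "
        "(id serial PRIMARY KEY, name text)"
    ))
```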

Deploy and secure microservices communication - best practices

I have two microservices, each connected to a DB, plus two APIs, and I was wondering about best practices for deploying them in production.
My first thought was to put all the code on the same server. Each microservice would run in a Docker container, and the same goes for the APIs. I would not use Docker for the databases (Mongo and MariaDB); instead they would be installed directly on the server. Only the APIs' ports would be opened, and they would communicate with the microservices via TCP.
Secondly, I was wondering how I can secure the microservices, or whether that is even needed. The validation logic will live only in the APIs.
Thank you!
Some food for thought:
- Deploy several replicas of the same container to increase reliability and resilience. To manage load balancing and replicas you could use, for instance, Kubernetes.
- If your containers communicate with each other, consider enabling mTLS (a sketch follows this list).
- Expose only the ports you need.
- If you deploy on your own server, pay attention to the maintenance of the underlying operating system.
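A minimal sketch of what mTLS amounts to at the socket level, using only Python's standard library and assuming you have already issued an internal CA plus per-service certificates (all file names below are placeholders). In practice a service mesh or reverse proxy usually handles this handshake for you, but it performs the same steps:

```python
import ssl
import http.server

# Server side: present our own certificate and require a client
# certificate signed by the internal CA.
context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.load_cert_chain(certfile="service-a.crt", keyfile="service-a.key")
context.load_verify_locations(cafile="internal-ca.crt")
context.verify_mode = ssl.CERT_REQUIRED   # this is what makes the TLS *mutual*

server = http.server.HTTPServer(("0.0.0.0", 8443),
                                http.server.SimpleHTTPRequestHandler)
server.socket = context.wrap_socket(server.socket, server_side=True)
# server.serve_forever()  # sketch only; not started here

# Client side: verify the server against the same CA and present our
# own certificate so the server can authenticate us in return.
client_ctx = ssl.create_default_context(cafile="internal-ca.crt")
client_ctx.load_cert_chain(certfile="service-b.crt", keyfile="service-b.key")
```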

Microservices Architecture in NodeJS

I was working on a side project and decided to redesign my skeleton project as microservices. So far I haven't found any open-source project that follows this pattern. After a lot of reading and searching I arrived at this design, but I still have some questions and thoughts.
Here are my questions and thoughts:
- How do I make the API gateway smart enough to load balance requests if I have two nodes of the same microservice?
- If one of the microservices is down, how should the discovery service know?
- Is there any similar implementation? Is my design right?
- Should I use Eureka or something similar?
Your design seems OK. We are also building our microservices project using the API gateway approach. All the services, including the gateway service (GW), are containerized (we use Docker) Java applications (Spring Boot or Dropwizard). A similar architecture could be built with Node.js as well. Some topics to mention related to your question:
Authentication/Authorization: The GW service is the single entry point for clients. All authentication/authorization operations are handled in the GW using JSON Web Tokens (JWT), which has a Node.js library as well. We keep authorization information such as the user's roles in the JWT token. Once the token is generated in the GW and returned to the client, the client sends the token in an HTTP header with each request; we then check whether the client has the required role to call the specific service and whether the token has expired. In this approach, you don't need to track the user's session on the server side; in fact, there is no session. The required information is in the JWT token.
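The same flow, sketched in Python with PyJWT for brevity (the Node.js jsonwebtoken library works the same way); the secret and role names are placeholders:

```python
import datetime

import jwt  # PyJWT

SECRET = "gateway-signing-secret"  # placeholder: known only to the GW


def issue_token(user_id: str, roles: list[str]) -> str:
    """Issued by the gateway after login; roles travel inside the token."""
    payload = {
        "sub": user_id,
        "roles": roles,
        "exp": datetime.datetime.now(datetime.timezone.utc)
        + datetime.timedelta(hours=1),
    }
    return jwt.encode(payload, SECRET, algorithm="HS256")


def authorize(token: str, required_role: str) -> bool:
    """Checked by the gateway on every request; decode() enforces expiry."""
    try:
        claims = jwt.decode(token, SECRET, algorithms=["HS256"])
    except jwt.ExpiredSignatureError:
        return False
    return required_role in claims.get("roles", [])


token = issue_token("user-42", ["billing:read"])
assert authorize(token, "billing:read")
assert not authorize(token, "admin")
```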
Service Discovery / Load balancing: We use Docker and Docker Swarm, the clustering tool bundled in the Docker engine (since Docker 1.12). Our services are Docker containers. The containerized approach makes it easy to deploy, maintain, and scale the services. At the beginning of the project, we used HAProxy, Registrator, and Consul together to implement service discovery and load balancing, similar to your drawing. Then we realized we don't need them for service discovery and load balancing as long as we create a Docker network and deploy our services with Docker Swarm. With this approach you can easily create isolated environments for your services, such as dev, beta, and prod, on one or more machines by creating a different network for each environment. Once you create the network and deploy services, service discovery and load balancing are not your concern. Within the same Docker network, each container has the DNS records of the other containers and can communicate with them. With Docker Swarm you can scale services with one command, and on each request to a service, Docker distributes (load balances) the request to an instance of that service.
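Inside such a network, calling a peer service is just an HTTP request against the service name; a short sketch with the Python requests library (the service name, port, and path are placeholders):

```python
import requests

# "users-service" resolves via Docker's embedded DNS to a virtual IP
# that load-balances across that service's replicas in the swarm.
resp = requests.get("http://users-service:8080/users/42", timeout=2)
resp.raise_for_status()
print(resp.json())
```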
Your design is OK.
If your API gateway needs to implement (and that's probably the case) CAS or some other kind of auth (via one of the services, i.e. some kind of user service), and should also track all requests and modify the headers to carry the requester metadata (for internal ACL/scoping usage), then your API gateway should be written in Node, but should sit behind HAProxy, which will take care of load balancing and HTTPS.
Discovery is in the correct position; if you are looking for one that fits your design, look no further than Consul.
You can use consul-template or your own micro-discovery framework for the services and the API gateway, so they share endpoint data on boot.
ACL/authorization should be implemented per service, and the first request from the API gateway should pass through all authorization middleware.
It's smart to have the API gateway attach a request ID to each request so its lifecycle can be tracked within the "inner" system (see the sketch after this answer).
I would add Redis for messaging/workers/queues and fast in-memory needs like caching and cache invalidation (you can't run a full microservices architecture without one), or take RabbitMQ if you have many distributed transactions and a lot of messaging.
Spin all of this up in containers (Docker) so it will be easier to maintain and assemble.
As for BI, why would you need a service for that? You could have an external ELK stack (Elasticsearch, Logstash, Kibana) and get dashboards, log aggregation, and a huge big-data warehouse at once.
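The request-ID idea mentioned above, sketched as a small WSGI middleware in Python; the X-Request-ID header name is the common convention, everything else here is a placeholder:

```python
import uuid


class RequestIdMiddleware:
    """Attach a unique X-Request-ID to every request that lacks one,
    and echo it back on the response so callers can correlate logs."""

    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        request_id = environ.get("HTTP_X_REQUEST_ID") or str(uuid.uuid4())
        environ["HTTP_X_REQUEST_ID"] = request_id  # visible to the app

        def start_with_id(status, headers, exc_info=None):
            headers = list(headers) + [("X-Request-ID", request_id)]
            return start_response(status, headers, exc_info)

        return self.app(environ, start_with_id)
```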

Api App Sqlconnector not allowing Hybrid setup

I am trying to configure my SQL Connector API App, named CENTRAL.SQLCONNECTOR.EMPLOYEESYNC.DEV, to use the Hybrid Connection Manager. I have had to delete and recreate it multiple times while assessing the query options, and I am confused because allowing the Hybrid Connection setup seems very inconsistent. It currently reads as below:
Hybrid Connection
Not configured as a hybrid ApiApp
The previous time it allowed it fine, and I was able to successfully set up the Hybrid Connection Manager on my web server. Could you please look at this API app and perhaps tell me what I may be doing wrong?
Thanks
