[Question posted by a user on YugabyteDB Community Slack]
This is regarding migration from a different source DB to YugabyteDB.
I can see that YugabyteDB Voyager supports only PostgreSQL, Aurora, MySQL, and Oracle as source databases for migration to YugabyteDB.
How can I migrate SQL Server and DB2 databases to YugabyteDB?
I'd appreciate it if someone could share more insight on this topic, as I have a use case where we are evaluating YugabyteDB as the target database, with SQL Server and DB2 as the sources.
Both SQL Server and DB2 as sources are on the roadmap for YugabyteDB Voyager, but they have not yet been scheduled or prioritized for any release.
You can use pgloader for offline migration. For live migration, you can try blitzz.io.
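For the SQL Server side, a minimal offline pgloader run could look something like the sketch below. This is only a rough illustration: the host names, users, passwords, and database names are placeholders, and it assumes YugabyteDB's PostgreSQL-compatible YSQL API on its default port 5433. As far as I know, pgloader covers SQL Server but not DB2 as a live source, so DB2 would need a different path (for example an export/import or a replication tool).
# Hedged sketch: one-shot offline copy of a SQL Server database into YugabyteDB with pgloader.
# All host names, credentials, and database names below are placeholders.
pgloader \
  mssql://app_user:app_pass@sqlserver-host/source_db \
  postgresql://yugabyte:yugabyte@yb-tserver-host:5433/target_db
In practice you would still want to validate row counts and recreate any indexes or constraints that the tool does not carry over.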
I have a question regarding the license for jOOQ when it is used as a SQL builder with AWS RDS as the underlying database.
Can the ASL 2.0 licensed edition be used, or is a commercial jOOQ license required?
Thanks,
Kanika
Per jOOQ 3.13, there is no separate licensing option for AWS RDS. If you’re planning to use jOOQ with MariaDB, MySQL, or PostgreSQL on AWS RDS, the ASL 2.0 licensed jOOQ Open Source Edition will suffice. If you’re planning to use jOOQ with Aurora, Oracle, or SQL Server on AWS RDS, the jOOQ Professional Edition will be needed.
See the jOOQ database support website for an up-to-date list of supported database products per jOOQ edition and distribution.
I have an Oracle DB with data that I need to load and transform into an Azure SQL Database. I have no control over either the DB or the application that updates its data.
I'm looking at Azure Data Factory, but I really need data changes in Oracle to be reflected as near to real-time as possible.
I would appreciate any suggestions / insights.
Is ADF the correct tool for the job? If so, what is a good approach to use? If not suitable, what should I consider using instead?
For real-time you don't really want an ELT/ETL tool like ADF. Consider a replication agent like Attunity or (gulp at the licensing costs) GoldenGate.
I don't think Data Factory is a good fit for you. Yes, you can copy data from Oracle to an Azure SQL database with it, but as #Thiago Custodio said, you would need to do it for each table you have. That's too complicated.
For reference: Copy data from and to Oracle by using Azure Data Factory.
As you said, you really need data changes in Oracle to be reflected as near to real time as possible.
That means the copy must complete very quickly, so that the data in Oracle and the Azure SQL database stays consistent until the Oracle data changes again. I searched a lot and didn't find any true real-time copy tools. What you are describing is really closer to 'data sync' than a one-off copy.
I found this link, Sync Oracle Database with SQL Azure; I hope it gives you some useful ideas.
For the data migration or copy, you can use the following approaches:
SQL Server Migration Assistant for Oracle (OracleToSQL)
Azure Database Migration Service (DMS)
Reference tutorials:
Migrating Oracle Databases to SQL Server (OracleToSQL): SQL Server Migration Assistant (SSMA) for Oracle is a comprehensive environment that helps you quickly migrate Oracle databases to Azure SQL database.
How to migrate Oracle to Azure SQL Database with minimum downtime:
Hope this helps.
For the record, we went with a product named Qlik Replicate (formerly Attunity), and it is working very well!
Hi, is it possible to connect an Azure Database for PostgreSQL to Power BI using DirectQuery? I can't seem to find any information about this.
Currently these are the only data sources supported by DirectQuery:
Amazon Redshift
Azure HDInsight Spark (Beta)
Azure SQL Database
Azure SQL Data Warehouse
Google BigQuery (Beta)
IBM DB2 database
IBM Netezza (Beta)
Impala (version 2.x)
Oracle Database (version 12 and above)
SAP Business Warehouse Application Server
SAP Business Warehouse Message Server (Beta)
SAP HANA
Snowflake
Spark (Beta) (version 0.9 and above)
SQL Server
Teradata Database
Vertica (Beta)
PostgreSQL is supported, but only in import mode. So no, you can't use DirectQuery with PostgreSQL (unless you write your own custom connector). You can vote for this idea though.
I'm working on a custom connector that enables DirectQuery from PostgreSQL through an ODBC driver. I'm working on a full write-up (this month, when I get time), but until then I can share the repo here:
DirectQuery for Postgres via ODBC
This is working for us: we DirectQuery our Postgres data source via an Azure-hosted Windows instance that runs the custom connector on an on-premises data gateway 24/7.
Does anyone know when Azure's Migration Service is going to support migrating Cassandra data over to Cosmos DB? I heard a while ago that the team might be working on it, and I'm wondering if there have been any updates on when it will be available or whether it's still happening.
Based on the official documentation, there are two options for copying data from existing Cassandra workloads to Azure Cosmos DB:
1. Using the cqlsh COPY command (see the sketch below)
2. Using Spark
However, the data migration tool still does not support the Cassandra API. You can submit feedback here to push for the feature you want.
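To illustrate option 1, here is a rough, untested cqlsh COPY sketch. The keyspace, table, host, account name, and key are all placeholders; it assumes the Cosmos DB Cassandra API endpoint on port 10350 with SSL, with the account name as the username and the account key as the password.
# Hedged sketch: export from the existing Cassandra cluster, then import into Cosmos DB.
# 1. Export the table from the source Cassandra cluster to a CSV file.
cqlsh cassandra-host -e "COPY mykeyspace.mytable TO 'mytable.csv' WITH HEADER=TRUE;"
# 2. Create the same keyspace and table on the Cosmos DB Cassandra API account, then import.
export SSL_VERSION=TLSv1_2   # commonly needed when connecting cqlsh over SSL to the Cosmos DB endpoint
cqlsh myaccount.cassandra.cosmos.azure.com 10350 \
  -u myaccount -p "<account-key>" --ssl \
  -e "COPY mykeyspace.mytable FROM 'mytable.csv' WITH HEADER=TRUE;"
For anything beyond small tables, the Spark route (option 2) is usually the more robust choice, since cqlsh COPY can be slow and sensitive to throttling.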
I've been trying to figure out if I can run my DBs using PaaS for a specific application.
The bit I can't quite find an answer to is whether Azure SQL databases on PaaS support running as a transactional replication publisher.
I've seen an article that says a database in Azure is capable of being a subscriber, but I can't find anything about the other way around.
Today, Azure SQL Database doesn't support transactional replication when the Azure SQL database is the publisher.