Transfer Success, but no data in the database (Data Migration tool)

I am trying to transfer a JSON file to MongoDB in Microsoft Azure using Microsoft's Data Migration tool. No errors are generated and the tool reports the transfer as successful, but there is no data in the database.
I have looked everywhere but couldn't find a solution. Any help would be greatly appreciated.

Related

Azure Data Factory Excel read via HTTP fails

I am looking to import data from a publicly available Excel sheet into ADF. I have set up the dataset using an HTTP linked service (see first screenshot), with AutoResolveIntegrationRuntime. However, when I attempt to preview the data, I get an error suggesting that the source file is not in the correct format (second screenshot).
I'm wondering if I may have something set incorrectly in my configuration?
The .xls format is not supported when using the HTTP connector.
Since the API downloads a file, you can't preview the data. You can load the file to Blob storage or Azure Data Lake Storage using a Copy activity, and then define a dataset on top of that file to preview it.
The workaround is to save your .xlsx file as a .csv file, because Azure Data Factory does not support reading .xlsx files over the HTTP connector.
Furthermore, there is no need to convert the .xlsx file to .csv if you only want to copy it; simply select the Binary Copy option.
Here is a similar discussion where an MS FTE confirmed with the product team that this is not yet supported for the HTTP connector.
Please submit a proposal in the QnA thread to allow this functionality in future versions; it will be actively monitored by the Data Factory product team and evaluated for adoption.
Please check the issue at the QnA thread linked here.
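To make the staging idea concrete (land the file in storage first, then put a dataset on top of it), here is a minimal Python sketch rather than the ADF Copy activity itself. The URL, connection string, container, and blob names are placeholders, not values from the question:

```python
# Hedged sketch: fetch the .xlsx over HTTP as raw bytes (a "binary copy"),
# land it in Blob storage, and only then point an Excel-format dataset at the blob.
# SOURCE_URL, CONN_STR, and the container/blob names are placeholders.
import requests
from azure.storage.blob import BlobClient

SOURCE_URL = "https://example.com/public-report.xlsx"   # assumed public Excel URL
CONN_STR = "<storage-account-connection-string>"        # placeholder

payload = requests.get(SOURCE_URL, timeout=60).content  # no parsing, just bytes

blob = BlobClient.from_connection_string(
    CONN_STR,
    container_name="staging",            # placeholder container
    blob_name="public-report.xlsx",
)
blob.upload_blob(payload, overwrite=True)
# Once the file sits in Blob storage (or ADLS), a dataset defined over it
# can be previewed in ADF, which the HTTP source cannot do.
```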

Office365 Excel as source for GCP BigQuery

We are using Office365 Excel and manually creating some data that we need in BigQuery. What solution would you build to automatically load the data from this Excel file into a BigQuery table? We are not allowed to use Google Sheets (which would solve all our problems).
We use Matillion and GCP products.
I have no idea how to solve this, and I can't find any information about it, so any suggestion or idea is appreciated.
Cheers,
Cris
You can save your data as CSV and then load it into BigQuery.
Then, you can load data by using one of the following:
The Google Cloud Console
The bq command-line tool's bq load command
The API
The client libraries
You can find more details here: Loading data from local files.
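To illustrate the client-library option from the list above, here is a minimal sketch assuming the Excel data has already been saved as a local CSV; the project, dataset, and table names are placeholders:

```python
# Hedged sketch: load a local CSV into a BigQuery table with the Python client.
# "my-project.my_dataset.excel_data" and data.csv are placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,     # skip the header row
    autodetect=True,         # infer the schema from the CSV
)

with open("data.csv", "rb") as source_file:
    load_job = client.load_table_from_file(
        source_file,
        "my-project.my_dataset.excel_data",
        job_config=job_config,
    )

load_job.result()  # wait for the load job to finish
print(client.get_table("my-project.my_dataset.excel_data").num_rows, "rows loaded")
```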
As a different approach, you can also try this other option:
BigQuery: loading excel file
For this you will need to use Google Drive and federated tables.
Basically, you will upload your Excel files to Google Drive with the option "Convert uploaded files to Google Docs editor format" checked in your settings, and then load them into BigQuery from Google Drive.
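A hedged sketch of that federated-table setup with the Python client, assuming the uploaded file was converted to Sheets format and is shared with the credentials BigQuery uses; the table ID and Drive URI are placeholders:

```python
# Hedged sketch: create an external (federated) BigQuery table backed by a
# Google Drive file that was converted to the Sheets editor format on upload.
# The table ID and Drive URI are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

external_config = bigquery.ExternalConfig("GOOGLE_SHEETS")
external_config.source_uris = ["https://drive.google.com/open?id=<file-id>"]
external_config.autodetect = True
external_config.options.skip_leading_rows = 1   # skip the header row

table = bigquery.Table("my-project.my_dataset.excel_federated")
table.external_data_configuration = external_config
client.create_table(table)   # queries against this table read the Drive file directly
```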

Azure Wrangling data flow doesn't appear

I want to use the Wrangling data flow in Azure Data Factory v2, but this data flow doesn't appear for me.
I followed this tutorial: Prepare data with wrangling data flow.
The tutorial shows this image for creating the wrangler:
But in my subscription these options don't appear for me.
I searched many websites and tutorials and didn't find anything about this.
Data wrangling in Azure Data Factory has been moved and now appears as "Power Query".
For more details, see the official docs:
https://learn.microsoft.com/en-us/azure/data-factory/wrangling-overview#use-cases
If you have already created the Wrangling data flow, I think you can select it.
In your screenshot, it seems you didn't select 'Use existing data flow'.

Is there a way to use excel power query to query a SAS dataset on a local drive?

I've searched everywhere and can't seem to find an answer, so hopefully someone here can assist. We have a SAS program set to run weekly that outputs a dataset to a local drive. Is there a way to get Excel Power Query to see it? I can connect fine to datasets housed within the database, but locally stored ones are an issue. Outputting this to the database isn't an option for us. Any ideas?
If you have the Stored Process server you can create a web query to access it, as described here: https://www.rawsas.com/sas-as-a-service-an-easy-way-to-get-sas-into-excel-power-bi-and-000s-of-other-tools-languages/
This functionality also comes bundled with https://datacontroller.io (free for up to 10 users)
Disclosure - I wrote the blog and created the product.
Alternatives:
Update your job to export your data as CSV or some other format that Excel can read natively.
Use the IOM interface and VBA
SAS Add-in for Excel
All these options require server SAS. In short, there is no way that Excel Power Query can connect directly to a SAS dataset on a local drive, as the .sas7bdat format is a proprietary SAS format optimised for use by SAS.
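If Python happens to be available on that machine, one further workaround (an assumption, not one of the options above) is to convert the local .sas7bdat to CSV first and point Power Query at the CSV; the file paths in the sketch are placeholders:

```python
# Hedged sketch: convert a local .sas7bdat file to CSV so Excel / Power Query
# can read it natively. File paths are placeholders.
import pandas as pd

df = pd.read_sas(
    "C:/data/weekly_output.sas7bdat",
    format="sas7bdat",
    encoding="latin-1",   # adjust to the dataset's actual encoding
)
df.to_csv("C:/data/weekly_output.csv", index=False)
print(f"wrote {len(df)} rows")
```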

Extracting data from cassandra - OpenShift

I've set up the OpenShift web console.
(Screenshot: OpenShift web console.)
The hawkular-cassandra database holds all the cluster metrics that are reflected in the web console. However, I need to extract this data from the database and store it in an Excel sheet for analysis.
I tried to fetch this data, but I am not able to access Cassandra, and there is no clear guidance on extracting the data or exporting it to an Excel sheet.
Is there any way this can be done in OpenShift?
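As a rough, hedged sketch of what such an export could look like once Cassandra is reachable (for example after an `oc port-forward` to the hawkular-cassandra pod): the keyspace, table, and connection settings below are assumptions, not values from this cluster.

```python
# Hedged sketch: pull rows out of the hawkular-cassandra metrics keyspace and
# write them to a CSV that Excel can open. Host, keyspace, table, and any
# auth/TLS settings are assumptions; adjust them to the actual cluster.
import csv
from cassandra.cluster import Cluster

# e.g. after `oc port-forward <hawkular-cassandra-pod> 9042:9042`
cluster = Cluster(["127.0.0.1"], port=9042)
session = cluster.connect("hawkular_metrics")        # keyspace name is an assumption

rows = session.execute("SELECT * FROM metrics_idx LIMIT 1000")  # table name is an assumption

with open("metrics_export.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(rows.column_names)               # header row from result metadata
    for row in rows:
        writer.writerow(list(row))

cluster.shutdown()
```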
