I created a Databricks feature table, but saw that by default it went under hive_metastore. I was expecting to see it under the Unity Catalog that I created. Is the Feature Store not integrated with Unity Catalog yet?
Right now (November 2022), the Feature Store doesn't support Unity Catalog, so feature tables are registered under hive_metastore. The integration is expected early next year, but it's hard to give firm dates right now. You can reach out to your Databricks solution architect or customer success engineer for more information about expected timelines.
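For context, here is a minimal sketch of the behaviour, assuming a Databricks ML runtime and a pre-existing Spark DataFrame named features_df with a customer_id column (both hypothetical):

```python
# Minimal sketch, assuming a Databricks ML runtime and an existing Spark
# DataFrame named features_df with a customer_id column (both hypothetical).
from databricks.feature_store import FeatureStoreClient

fs = FeatureStoreClient()

# Even with a two-level name, the table is registered under hive_metastore
# as of this writing, not under a Unity Catalog you created.
fs.create_table(
    name="feature_db.customer_features",
    primary_keys=["customer_id"],
    df=features_df,
    description="Example feature table",
)
```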
I want to create a simple pipeline with a few activities. The solution won't be very complex to start off: we'll pull in a spreadsheet from Sharepoint, parse it, transform it, and stage it in a SQL Azure database. Over time, the solution will grow to include many other similar pipelines of varying complexity.
How should I structure the solution? Should I build everything directly in the Azure Portal (there is a nice GUI, after all)?
Or should I follow my instinct and build out the solution in Visual Studio? If so, are there any extensions for VS 2022 that replicate the WYSIWYG experience of the portal? Any documentation/links on best practices for enterprise-grade solutions win extra points!
Elaborating on the title: the current ADF I am working on has a lot of legacy code, i.e. many datasets and linked services. Unfortunately, no naming convention or system for creating new items was ever defined.
I have tried listing all the pipelines and their associated datasets (and linked services), but this is a lengthy approach and we have around 100-odd pipelines.
I also exported the complete data factory as an ARM template and tried to write a parser that would generate the list above automatically, but ARM templates turned out to be more interlinked than I had thought, so I dropped that plan.
Is there a better approach to solving this problem?
You can pull up the pipelines/dataflows that use a particular dataset. These details are available in the dataset's Properties tab -> Related tab.
You can also get the list of datasets that use a particular linked service by going to the Manage tab -> Linked services -> Related.
Since you haven't mentioned the data factory version (v1 or v2) and you mentioned it has a lot of legacy code, I am assuming it could be a v1 factory. In that case, you can check the ADF v1 to ADF v2 migration tool, which helps with basic migration tasks (listing resources, performing simple migration steps, etc.).
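If it turns out to be a v2 factory, you can also build the inventory programmatically instead of clicking through the UI resource by resource. Below is a rough sketch using the azure-identity and azure-mgmt-datafactory Python packages; the subscription, resource group, and factory names are placeholders:

```python
# Rough sketch: list pipelines, datasets, and the linked service each dataset
# uses, for a v2 factory. All resource names below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")
rg, factory = "<resource-group>", "<factory-name>"

for p in client.pipelines.list_by_factory(rg, factory):
    print("pipeline:", p.name)

for ds in client.datasets.list_by_factory(rg, factory):
    # each dataset definition references the linked service it depends on
    print("dataset:", ds.name,
          "-> linked service:", ds.properties.linked_service_name.reference_name)
```

Dumping this output to a CSV gives you the dataset-to-linked-service mapping in one pass, which scales better than checking 100-odd pipelines by hand.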
Partly to help explain "what components build on which others" in Azure, and partly to show the timing and evolution of thinking as products were released (console vs. portal, for example): is there a list of when Azure products were released to the public?
I checked Wikipedia for this and found no historical data (I was hoping it would be in table form by date/status/release/LTS).
I think the closest would be the RSS feed from https://azure.microsoft.com/en-us/updates/
It includes updates about features going to Preview and General Availability.
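If you want to mine the feed programmatically, here is a quick sketch using the third-party feedparser package. The feed URL is an assumption based on the site's published RSS link, so verify it still resolves:

```python
# Sketch: scan the Azure updates RSS feed for availability announcements.
# The feed URL is assumed from the site's RSS link; verify before relying on it.
import feedparser

feed = feedparser.parse("https://azure.microsoft.com/en-us/updates/feed/")
for entry in feed.entries:
    title = entry.title.lower()
    if "generally available" in title or "general availability" in title:
        print(entry.get("published", "?"), "-", entry.title)
```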
You could also try with the Wayback Machine on the Products available by region page.
Here is a version from 2018. The page has since changed into a search-style page, so you'll need to check whether archive.org has versions of the region-specific pages.
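To automate that search, here is a small sketch against the Wayback Machine's availability API. The page URL below is a hypothetical guess at the old products-by-region address, so adjust it as needed:

```python
# Sketch: ask the Wayback Machine for the archived snapshot closest to a given
# date (2018 here). The target page URL is a guess; substitute the real one.
import requests

resp = requests.get(
    "https://archive.org/wayback/available",
    params={
        "url": "azure.microsoft.com/en-us/global-infrastructure/services/",
        "timestamp": "20180101",
    },
)
snapshot = resp.json().get("archived_snapshots", {}).get("closest")
if snapshot:
    print(snapshot["timestamp"], snapshot["url"])
```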
Products available by region gives you a NOW-only view with the following key:
Generally Available
In Preview
In Preview (hover to view expected timeframe)
Future availability (hover to view expected timeframe)
I'm attempting to crawl an MS SharePoint environment using a local Oracle Endeca instance (11.1.0). Per Oracle's documentation, there should be a 'Sharepoint WebServices Data Source' option within the 'CAS Console', but it is not present.
The documentation goes into detail on configuring the data source, but provides no information on installing the plugin in the first place, or even on whether a separate plugin exists. It seems like this should be OOTB, but it is nowhere to be found.
Appreciate any information on the subject from anyone who has successfully used this connector with Endeca before.
The 'Sharepoint WebServices Data Source' requires a separate license. Stay classy, Oracle.
I have data living in the cloud in table storage, and I would like to move it down to a development server.
I have been using ClumsyLeaf's table explorer, but I often have problems where not all the data is moved: it gives an error halfway through when I try to import data that I exported from the cloud.
Are there other options for me to move data between one location and another?
By the way, I notice this question was asked before, but that was in 2011 and the suggestion made there does not work. Please don't vote to close this, as I hope things have changed since 2011.
If you're looking for other tools, may I suggest you look at Cerebrata's tools (http://www.cerebrata.com). You could use either Cloud Storage Studio or the Azure Management Cmdlets to download data from the cloud and upload it to development storage.
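If the GUI tools keep failing partway through, another option is to script the copy yourself. Here is a minimal sketch using the current azure-data-tables package to copy a cloud table into local development storage; the connection string and table name are placeholders:

```python
# Minimal sketch: copy every entity from a cloud table into local development
# storage (the storage emulator). Connection string and table name are placeholders.
from azure.data.tables import TableServiceClient

source = TableServiceClient.from_connection_string("<cloud-connection-string>")
target = TableServiceClient.from_connection_string("UseDevelopmentStorage=true")

src = source.get_table_client("mytable")
dst = target.create_table_if_not_exists("mytable")

for entity in src.list_entities():
    # upsert is idempotent, so the copy can be re-run safely after a failure
    dst.upsert_entity(entity)
```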
Hope this helps
A while ago, I wrote a blog post about the different Azure storage and service management tools. Most of the tools mentioned are still valid, and they have probably improved since I wrote the post. Check it out at http://gshahine.com/blog/archives/2010/11/04/azure-service-management-tools/