OIM - How to import provisioned resources from another OIM version?

I need to import users and their provisioned resources from version 11.1.1.3 to 11.1.2.2.
For the users I have used the Bulk Load utility, but it only imports the user data into the USR table, and I also need to import the full information about the provisioned resources such as AD, SAP, CRM, etc.
Could you suggest a way to do this? Is there a script or similar utility for the import?

It's pretty simple if your targets remain the same: just install all your connectors on the 11.1.2.2 machine and point each connector to its target through the IT Resource configuration. With the help of target reconciliation you can then import the provisioned resources onto the users as they are.

There are two ways to get user account data into OIM:
1. Configure target reconciliation and reconcile the user accounts.
2. The Bulk Load utility also provides an option to bulk load account data.
Refer to the link below.
https://docs.oracle.com/cd/E27559_01/dev.1112/e27150/bulkload.htm#OMDEV1769

Related

How do I limit users to just the DBSQL lens of Databricks? I don't want them to have access to SQL endpoints or DE & ML lenses

Here are the current options. I want to give the BI team access to only the SQL lens.
Completely removing the other options is currently not available. However, you can lock them down as shown in the picture.
Remove the databricks-sql-access entitlement from the default users group.
Create a new group called 'sql-users-only' and give it only this entitlement, so its members won't see the workspace and can't spin up endpoints (a scripted version of this setup is sketched after the resource links below).
Resources:
Databricks provides a lot of control using ACLs: Access control overview
You can restrict the dashboard viewing options as well. Dashboard permissions
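If you prefer to script this setup, here is a minimal sketch using the workspace SCIM Groups API; the endpoint path and the databricks-sql-access entitlement come from the Databricks docs, while the host, token and group name are placeholders you would replace.

```python
import requests

# Placeholders: your workspace URL and an admin personal access token.
DATABRICKS_HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"

headers = {"Authorization": f"Bearer {TOKEN}"}

# Create a group whose only entitlement is databricks-sql-access, so its
# members get the SQL lens but no workspace (DE/ML) access and no ability
# to spin up their own endpoints or clusters.
resp = requests.post(
    f"{DATABRICKS_HOST}/api/2.0/preview/scim/v2/Groups",
    headers=headers,
    json={
        "schemas": ["urn:ietf:params:scim:schemas:core:2.0:Group"],
        "displayName": "sql-users-only",
        "entitlements": [{"value": "databricks-sql-access"}],
    },
)
resp.raise_for_status()
print("Created group:", resp.json()["id"])
```

You would then add the BI team's users to this group and remove the databricks-sql-access entitlement from the default users group, as described above.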

Azure Data Catalog Backup

Since ADC is provided by MS as SaaS to customers, is MS taking backups of the dataset and business glossary? If yes, how often and how can a customer get access to the backups for recovery purposes?
Unfortunately, there is no explicit backup/restore feature available for catalogs.
I would suggest voting up an idea submitted by another Azure customer:
https://feedback.azure.com/forums/906052-data-catalog/suggestions/33125845-azure-data-catalog-backup-feature
All of the feedback you share in these forums will be monitored and reviewed by the Microsoft engineering teams responsible for building Azure.
The closest way to achieve this with current functionality is to use the Azure Data Catalog REST API to extract all assets and persist them locally (and re-import them manually later).
There is a sample application available that demonstrates this technique: Data Catalog Import/Export sample tool.
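If you want to roll your own export instead of the sample tool, a minimal sketch of the idea looks like this; the search endpoint, api-version and DefaultCatalog name follow the Data Catalog REST API documentation but should be double-checked, and the Azure AD access token is assumed to be obtained separately (e.g. via MSAL for the https://api.azuredatacatalog.com resource).

```python
import json
import requests

# Assumptions: catalog name and a valid Azure AD access token for the
# Data Catalog resource, obtained outside this sketch.
CATALOG = "DefaultCatalog"
TOKEN = "<azure-ad-access-token>"

url = f"https://api.azuredatacatalog.com/catalogs/{CATALOG}/search/search"
headers = {"Authorization": f"Bearer {TOKEN}"}

assets = []
start_page = 1
while True:
    params = {
        "searchTerms": "*",
        "count": 100,
        "startPage": start_page,
        "api-version": "2016-03-30",
    }
    resp = requests.get(url, params=params, headers=headers)
    resp.raise_for_status()
    results = resp.json().get("results", [])
    if not results:
        break
    assets.extend(results)
    start_page += 1

# Persist the raw asset JSON locally so it can be re-imported later.
with open("adc_backup.json", "w") as f:
    json.dump(assets, f, indent=2)
print(f"Saved {len(assets)} assets")
```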

Parameterised datasets in Azure Data Factory

I'm wondering if anyone has any experience in calling datasets dynamically in Azure Data Factory. The situation we have is that we dynamically sweep all tables in from IaaS (on-premise SQL Server installations on an Azure VM) application systems to a data lake. We want to have one pipeline that can pass server name, database name, user name and password to the pipeline's activities. The pipelines will then sweep whatever source they've been told to read from the parameters. The source systems are currently within a separate subscription and domain within our Enterprise Agreement.
We have looked into using the AutoResolveIntegrationRuntime on a generic SQL Server dataset but, as it is Azure and the runtimes on the VMs are self-hosted, it can't resolve and we get 'cannot connect' errors. So,
i) I don't know if this problem goes away if they are in the same subscription and domain?
That leaves whether anyone can assist with:
ii) A way of getting a dynamic runtime to resolve which SQL Server runtime it should use (we have one per VM for resilience purposes, but they can all see each other's instances). We don't want to parameterise a linked service on a particular VM as it places reliance for other VMs on that single VM.
iii) Ability to parameterise a dataset to call a runtime (doesn't look possible in the UI).
iv) Ability to parameterise the source and sink connections with pipeline activities to call a dataset parameter.
Server, database and table names can be made dynamic by using parameters. The key problem here is that the references in ADF can't be parameterized, such as the linked service reference in a dataset or the integration runtime reference in a linked service. If you don't have too many self-hosted integration runtimes, maybe you can try setting up separate pipelines for each network? A sketch of what can be parameterized is below.
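To make the boundary concrete, here is a sketch of a parameterized dataset body (shown as the Python dict you would send to the ADF REST API or author as JSON in the UI); the table and schema names flow in as parameters, but the linkedServiceName reference is still a fixed object, which in turn pins the integration runtime. All names here are illustrative.

```python
# Parameterized SQL Server dataset: table/schema are dynamic per run, but the
# linked service reference (and therefore the self-hosted IR) is hard-wired.
dataset_body = {
    "properties": {
        "type": "SqlServerTable",
        "linkedServiceName": {
            # Fixed reference: this cannot be swapped out at runtime, so the
            # dataset is tied to whichever IR this linked service uses.
            "referenceName": "OnPremSqlLinkedService",
            "type": "LinkedServiceReference",
        },
        "parameters": {
            "SchemaName": {"type": "String"},
            "TableName": {"type": "String"},
        },
        "typeProperties": {
            "schema": {"value": "@dataset().SchemaName", "type": "Expression"},
            "table": {"value": "@dataset().TableName", "type": "Expression"},
        },
    }
}
```

A copy activity can then pass SchemaName/TableName from pipeline parameters, which covers the per-table sweep; the per-VM runtime selection still has to be handled with separate linked services or pipelines.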

Azure Batch - Create Nodes from VM Image

I have some Selenium test code that I need to run in parallel. In order for Selenium to run effectively, certain configurations have to be done on the machine (i.e. zone settings, Chrome and Firefox installs, etc.), and these settings are hard (if not impossible) to apply via an automated approach. I've manually created a VM, done all the setup and created an image following the directions in Microsoft's documentation.
Now I need to setup my code so that I can specify a VM image to use when creating the nodes. I've searched as much as I can and not found any documentation that explains how I can go about doing this. The example in the DotNetTutorial sample doesn't seem to have any way to specify an image.
There is a feedback item on this same topic that shows the request as started on Jun 1st, 2015. I'm hoping this means that it's done now and that it just hasn't been documented well.
Q: How can I specify a custom VM image as the source for my Azure Batch nodes?
https://github.com/Azure/azure-sdk-for-net/blob/AutoRest/src/Batch/Client/changelog.md
• Added support for deploying nodes using custom VHDs, via the OSDisk property of VirtualMachineConfiguration. Note that the Batch account being used must have been created with PoolAllocationMode = UserSubscription to allow this.
Updated Answer on 2017-12-05:
Custom images are now supported through normal Batch accounts (i.e., Batch service pool allocation mode accounts). You will need to specify a valid ARM image Id and use Azure Active Directory authentication to create pools from custom images (shared-key auth does not support custom images). A sketch with the Python SDK appears at the end of this answer.
Updated Answer on 2017-03-17:
Custom images are now supported through "User Subscription" Batch accounts. You can create these types of accounts in Azure Portal or through the newest management SDKs for supported languages.
Previous Answer:
Currently, custom VM images are not supported. As you noted, this is a feature that is being worked on. In addition to uservoice, you can periodically check for product updates at this site.
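Following the 2017-12-05 update above, a minimal sketch with the Python Batch SDK might look like this; the ARM image id, AAD application details, pool settings and node agent SKU are placeholders, and the exact constructor parameter names can vary slightly between azure-batch versions.

```python
from azure.batch import BatchServiceClient
from azure.batch import models as batch_models
from azure.common.credentials import ServicePrincipalCredentials

# AAD auth is required for custom-image pools; shared-key auth will not work.
credentials = ServicePrincipalCredentials(
    client_id="<aad-app-client-id>",
    secret="<aad-app-secret>",
    tenant="<aad-tenant-id>",
    resource="https://batch.core.windows.net/",
)

batch_client = BatchServiceClient(
    credentials, batch_url="https://<account>.<region>.batch.azure.com"
)

# Point the pool at the ARM id of the managed image built from the
# manually configured Selenium VM.
image_ref = batch_models.ImageReference(
    virtual_machine_image_id=(
        "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/"
        "Microsoft.Compute/images/<selenium-image>"
    )
)

pool = batch_models.PoolAddParameter(
    id="selenium-pool",
    vm_size="STANDARD_D2_V3",
    virtual_machine_configuration=batch_models.VirtualMachineConfiguration(
        image_reference=image_ref,
        node_agent_sku_id="batch.node.windows amd64",  # must match the image OS
    ),
    target_dedicated_nodes=2,
)

batch_client.pool.add(pool)
```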

Azure addon - accessing WADPerformanceCountersTable?

If I write an Azure addon, can it access the WADPerformanceCountersTable table (of the business application that provisioned this addon)? Especially in terms of security/permissions.
E.g. say I wanted my addon to monitor some performance counters, and send an email alert if they pass some thresholds (regardless of whether there are already such commercial products, I'm just interested in the technical capability). What will I have to do? I'm guessing WADPerformanceCountersTable isn't publicly exposed to the entire world - so how can I make it accessible to my addon?
Thanks very much.
WADPerformanceCountersTable is no different from other Azure tables; it's stored in the storage account defined by Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString in the configuration file. You will need that storage account's name/key pair to read from this table.
FYI, here is an article about how to fetch performance counter data from this table effectively: http://gauravmantri.com/2012/02/17/effective-way-of-fetching-diagnostics-data-from-windows-azure-diagnostics-table-hint-use-partitionkey/
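To make that concrete: once the add-on has been given the diagnostics storage account credentials (the value behind that connection string), reading the table is ordinary Azure Table Storage access. Below is a minimal sketch with the azure-data-tables package that uses the PartitionKey trick from the linked article (partition keys are "0" followed by .NET ticks, so you can range-filter by time); the connection string, counter name and threshold are placeholders.

```python
from datetime import datetime, timedelta, timezone

from azure.data.tables import TableServiceClient

# Placeholder: the diagnostics storage connection string handed to the add-on.
CONNECTION_STRING = "<diagnostics-storage-connection-string>"


def dotnet_ticks(dt: datetime) -> int:
    """Convert a UTC datetime to .NET ticks (100 ns units since 0001-01-01)."""
    epoch = datetime(1, 1, 1, tzinfo=timezone.utc)
    return int((dt - epoch).total_seconds() * 10**7)


service = TableServiceClient.from_connection_string(CONNECTION_STRING)
table = service.get_table_client("WADPerformanceCountersTable")

# Only scan the last 15 minutes by filtering on PartitionKey ("0" + ticks),
# which is much cheaper than filtering on the Timestamp column.
since = datetime.now(timezone.utc) - timedelta(minutes=15)
query = f"PartitionKey ge '0{dotnet_ticks(since)}'"

for entity in table.query_entities(query):
    counter = entity.get("CounterName", "")
    value = entity.get("CounterValue", 0.0)
    if counter == r"\Processor(_Total)\% Processor Time" and value > 90:
        print(f"ALERT: {counter} = {value} on {entity.get('RoleInstance')}")
```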
