User A promoted a Databricks pipeline to PROD using the REST API.
User B configured an external orchestration tool to trigger the job using the REST API.
User C used the Databricks Jobs UI to take "Owner" privileges.
When the job is triggered, which user will be displayed as the creator: user A, B, or C?
Please explain which user it is and why.
The creator will always be user A (unless that user is deleted later). You can find the description in the Jobs REST API documentation.
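If you want to verify this yourself, the job's metadata can be fetched over the same REST API. A minimal C# sketch (the workspace URL, token and job ID are placeholders) that reads the creator_user_name field returned by the Jobs API:

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class JobCreatorCheck
{
    static async Task Main()
    {
        // Placeholders for your workspace URL, personal access token and job ID.
        var host = "https://<your-workspace>.cloud.databricks.com";
        var token = "<personal-access-token>";
        var jobId = 123;

        using var client = new HttpClient();
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", token);

        // GET /api/2.1/jobs/get returns job metadata, including creator_user_name,
        // which stays set to the user who created the job (user A here),
        // no matter which user or tool triggers the runs afterwards.
        var json = await client.GetStringAsync($"{host}/api/2.1/jobs/get?job_id={jobId}");
        Console.WriteLine(json);
    }
}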
My requirement is to find all users who have not logged in via Azure AD in the last 45 days or the last 90 days, and take action on them. That is:
A nightly job runs against Azure AD; if it finds users who have not logged in in the last 45 days, it should automatically disable those users.
A nightly job runs against Azure AD; if it finds users who have not logged in in the last 90 days, or users who were previously marked inactive, it should delete those users.
This link looks similar, but it goes through a review process. My requirements are a bit simpler.
Thanks.
There are several options for identifying and removing stale/inactive users:
The access review feature you linked for identifying and removing inactive users is the most seamless, built-in way to achieve this at the moment. You can specify the number of "days inactive" and then remove the accounts either after the review period passes or after no reviewer has responded. Note that creating access reviews and identifying inactive users requires an Azure AD Premium P2 license.
Alternatively, you could use an Azure Automation account or an Azure Logic App to achieve the same thing. For instance, you could create an Azure Automation PowerShell runbook with a daily schedule that checks the Azure AD sign-in logs and disables or deletes accounts based on whether they have signed in recently (i.e. where max_TimeGenerated <= ago(45d)). There is an example blog post here that implements this logic. Note that to update the accountEnabled property of admin users, you need delegated permissions, which must run in the context of a user.
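If you would rather run that lookup from code than from a PowerShell runbook, here is a rough C# equivalent using the Azure.Monitor.Query client against the Log Analytics workspace that receives the AAD sign-in logs (the workspace ID is a placeholder, and the query is just the max_TimeGenerated condition described above):

using System;
using Azure;
using Azure.Identity;
using Azure.Monitor.Query;
using Azure.Monitor.Query.Models;

class StaleSignInReport
{
    static void Main()
    {
        // Placeholder workspace ID; the identity running this needs read access to the workspace.
        var workspaceId = "<log-analytics-workspace-id>";
        var client = new LogsQueryClient(new DefaultAzureCredential());

        // Users whose most recent sign-in is at least 45 days old.
        var kql = @"SigninLogs
| summarize max_TimeGenerated = max(TimeGenerated) by UserPrincipalName
| where max_TimeGenerated <= ago(45d)";

        Response<LogsQueryResult> result =
            client.QueryWorkspace(workspaceId, kql, new QueryTimeRange(TimeSpan.FromDays(90)));

        foreach (LogsTableRow row in result.Value.Table.Rows)
        {
            // Feed these accounts into the disable (45d) or delete (90d) step,
            // e.g. via Microsoft Graph as shown in the next option.
            Console.WriteLine(row["UserPrincipalName"]);
        }
    }
}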
Another option is to query based on the lastSignInDateTime property.
The documentation for How To Manage Inactive Users has an example of how to query users who haven't signed in after a certain date using Microsoft Graph API.
Example:
https://graph.microsoft.com/beta/users?filter=signInActivity/lastSignInDateTime le 2019-06-01T00:00:00Z
To test the call, you can Sign in to Graph Explorer using the Global Administrator account of your tenant and execute the GET call.
Permissions Required:
Directory.AccessAsUser.All
Directory.Read.All
The SignInActivity property/endpoint is documented in detail here: https://docs.microsoft.com/en-us/graph/api/user-list?view=graph-rest-beta&tabs=http#example-3--list-users-including-their-last-sign-in-time
If you don't want the full list of users, you can also search for a specific user by name and evaluate the lastSignInDateTime:
https://graph.microsoft.com/beta/users?$filter=startswith(displayName,'marileet')&$select=displayName,signInActivity
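As a rough sketch, the same query can be issued from code with a plain HTTP client; the token below is a placeholder for an access token that carries the permissions listed above (plus User.ReadWrite.All if you also want to disable the accounts you find):

using System;
using System.Globalization;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class InactiveUserQuery
{
    static async Task Main()
    {
        // Placeholder access token with the permissions listed above.
        var accessToken = "<access-token>";
        var cutoff = DateTime.UtcNow.AddDays(-45)
            .ToString("yyyy-MM-ddTHH:mm:ssZ", CultureInfo.InvariantCulture);

        using var client = new HttpClient();
        client.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", accessToken);

        // Same beta query as above, with the 45-day cutoff calculated at runtime.
        var url = "https://graph.microsoft.com/beta/users" +
                  $"?$filter=signInActivity/lastSignInDateTime le {cutoff}" +
                  "&$select=id,displayName,signInActivity";
        Console.WriteLine(await client.GetStringAsync(url));

        // To disable one of the returned accounts, PATCH /users/{id} with
        // { "accountEnabled": false }; to delete it, send DELETE /users/{id}.
    }
}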
I would like to send a welcome email to a user when the Azure AD admin assigns the user to an application (enterprise or custom). It would be nice to use a custom template to define an access URL and maybe some additional info (how to use it, some rules, etc.). I haven't found anything like this in the Azure portal.
Does Azure AD provide such functionality? Or should I build custom implementation (Graph API, EventGrid + Azure Functions, SCIM protocol, etc.)?
From what I understand, at the very base you want an email to be sent whenever a user is assigned to an enterprise application (or a custom one, assuming that also falls under the same category). I assume you do not need to know much about who actually assigned the permissions. Either way, the query below should be enough to get you going.
As far as I know, there's no native support for this in Azure AD (I couldn't find anything that's available, but I could be wrong). A workaround for us was to do it with Log Analytics + an Azure Logic App. While our use case is slightly different (we use it to monitor and be alerted about logins to a specific account), the same logic should apply to you. I put together a few lines to query Log Analytics, but I couldn't get the alert part working; Azure could just be sleepy right now.
EDIT: The alert worked the following day. It just needed some time to warm up, I guess.
Give it a try:
Make sure Azure AD has diagnostic settings configured to send logs to a Log Analytics workspace.
Query the workspace using the following:
AuditLogs
| where TimeGenerated > ago(5m) //Change as required
| where ActivityDisplayName has "Add app role assignment grant to user"
| project Time = TimeGenerated, Activity = ActivityDisplayName, Application=parse_json(TargetResources)[0].displayName, User=parse_json(TargetResources)[1].userPrincipalName
| where Application contains "myapp"
Create an alert from Log Analytics (hopefully you get it working right away; each alert costs USD 1.50/month).
(a) You will need to create an Action Group.
(b) Under the Action Group, configure your email in the Notifications section.
(c) You will need to come back later and point an Action entry to the Logic App you will create below.
Create a Logic App and start with an HTTP connector as the trigger to receive the content in JSON format.
Set up a Send an Email (V2) action with all the variables and such, or another connector if applicable in your case. Customise the email in HTML.
One alternative to using Log Analytics and alerts could be to use PowerShell to query the AAD logs and then pass the information to the Logic App through the HTTP POST URL that shows up when you save the HTTP connector (see the sketch at the end of this answer).
Another alternative to the HTTP connector could be to use the O365 connector with the trigger When a new email arrives (V3).
Things to consider:
There's at least a 5-minute delay between the time the event is logged and the time the alert is triggered. This is just the way the alert query works.
You will need to sign in to the Logic App email connector using the mailbox from which you want to send the email. That's something you might want to manage separately (expiring credentials, etc.).
Hope this gives you some ideas.
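For the PowerShell alternative mentioned above, the hand-off to the Logic App is just an HTTP POST to the trigger's callback URL. A small C# sketch with a placeholder URL and payload (the payload shape only has to match the JSON schema you define on the HTTP trigger):

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class NotifyLogicApp
{
    static async Task Main()
    {
        // Placeholder callback URL, copied from the Logic App's HTTP trigger after saving it.
        var logicAppUrl = "https://<region>.logic.azure.com/workflows/<id>/triggers/manual/paths/invoke?<signature>";

        // Example payload; adjust the fields to whatever your email template needs.
        var payload = "{ \"user\": \"user@contoso.com\", \"application\": \"myapp\" }";

        using var client = new HttpClient();
        var response = await client.PostAsync(
            logicAppUrl, new StringContent(payload, Encoding.UTF8, "application/json"));
        Console.WriteLine(response.StatusCode);
    }
}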
I need to schedule two Cloud Functions to run at a predefined time using Cloud Scheduler. However, when I click on the Cloud Scheduler tab, it shows the error message below.
You don't have permission to enable Cloud Scheduler (appengine.applications.create, serviceusage.services.enable)
So I asked the project owner to grant me the roles below:
Cloud Scheduler admin
App Engine Admin
Service Usage Admin
However, even after this I'm still getting the same message as before.
Below are the current roles that I have access to:
App Engine Admin
BigQuery Data Viewer
BigQuery User
Cloud Scheduler Admin
Cloud SQL Admin
Editor
Service Usage Admin
Storage Admin
Kindly let me know if I'm missing something here.
You don't need to be the project Owner.
You need these permissions:
appengine.applications.create
serviceusage.services.enable
Predefined roles for the first permission:
roles/owner
roles/appengine.appCreator
Predefined roles for the second permission:
roles/owner
roles/editor
roles/serviceusage.serviceUsageAdmin
Since you already have the Editor role, you only need to request the App Engine Creator role to cover the first permission.
To be able to configure Cloud Scheduler, you need to be the Project Owner.
Could you please try asking your administrator to make you the Project Owner?
Understanding roles
This should fix your issue and solve your case. If it doesn't, let me know if you are still facing the same error.
Please let me know if it worked!
If you are using an HTTP target in your Cloud Scheduler job, you can add an Auth header (Add OAuth token) with a specific service account.
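If you end up creating the job from code instead of the console, here is a sketch using the Google.Cloud.Scheduler.V1 .NET client (project, region, function URL and service account are placeholders). The console's "Add OAuth token" option corresponds to the OauthToken field; this sketch uses the sibling OidcToken field, which is what an HTTP-triggered Cloud Function normally expects:

using Google.Api.Gax.ResourceNames;
using Google.Cloud.Scheduler.V1;

class CreateSchedulerJob
{
    static void Main()
    {
        var client = CloudSchedulerClient.Create();

        var job = new Job
        {
            // Full resource name of the new job (placeholders).
            Name = JobName.FromProjectLocationJob("my-project", "us-central1", "nightly-trigger").ToString(),
            Schedule = "0 2 * * *",   // every day at 02:00
            TimeZone = "Etc/UTC",
            HttpTarget = new HttpTarget
            {
                Uri = "https://us-central1-my-project.cloudfunctions.net/my-function",
                HttpMethod = HttpMethod.Post,
                // The scheduler attaches an identity token for this service account
                // to each request, so the function can require authentication.
                OidcToken = new OidcToken
                {
                    ServiceAccountEmail = "scheduler-invoker@my-project.iam.gserviceaccount.com"
                }
            }
        };

        client.CreateJob(LocationName.FromProjectLocation("my-project", "us-central1"), job);
    }
}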
Is it possible within Azure DevOps to send a bespoke notification to a user when a work item is completed?
Example scenario:
A work item is logged in Azure DevOps under a project by user 'Y' on behalf of user 'X'.
When this work item is completed, is it possible to automate an email to user 'X' saying something like "your request has been completed"?
User 'Y' = Member of development team
User 'X' = End user of system, who has requested feature
Is this possible to achieve, or is there a better way to go about this process?
I think this is possible, but Azure DevOps needs some property by which to identify your user X. I see two ways:
User X may Follow the work item.
User Y may add a specific tag, and you can create a custom notification for that tag.
Or create a custom application that scans your work items and sends the notifications (see the sketch below).
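For the custom-application route, here is a rough C# sketch of the scanning half, using a WIQL query against the Azure DevOps REST API; the organisation, project, PAT and the field/tag used to identify user X are all placeholders you would adapt:

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class CompletedWorkItemScanner
{
    static async Task Main()
    {
        // Placeholders for your organisation, project and a personal access token (PAT).
        var org = "my-org";
        var project = "my-project";
        var pat = "<personal-access-token>";

        using var client = new HttpClient();
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(
            "Basic", Convert.ToBase64String(Encoding.ASCII.GetBytes($":{pat}")));

        // WIQL query for completed work items; the state name ('Done' vs 'Closed')
        // depends on your process template, and you would also filter on the tag
        // or custom field that records user X.
        var wiql = "{ \"query\": \"SELECT [System.Id], [System.Title] FROM WorkItems " +
                   "WHERE [System.State] = 'Done' AND [System.TeamProject] = @project\" }";

        var response = await client.PostAsync(
            $"https://dev.azure.com/{org}/{project}/_apis/wit/wiql?api-version=6.0",
            new StringContent(wiql, Encoding.UTF8, "application/json"));
        response.EnsureSuccessStatusCode();

        // The response lists the matching work item IDs; fetch details with
        // GET _apis/wit/workitems/{id} and send the email to user X from here.
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}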
I need to be able to make one user temporarily mirror another on demand. The mirroring user should get the same business unit, teams, and roles as the target user. Right now it is done manually, but it's a pain. I wrote a custom workflow activity to do it, and it works if I run it as a system administrator and pick a mirroring user and a target user.
But the end goal is to allow certain users to run the dialog themselves. If I try to run it with myself as the mirroring user, I get an error saying I don't have the privilege to assign roles, which makes sense, since the workflow takes away my roles and then tries to assign me the target user's roles.
I'd like the workflow activity to run as a privileged user, but I haven't had any luck so far. I've tried creating the IOrganizationService like this:
var context = executionContext.GetExtension<IWorkflowContext>();
var serviceFactory = executionContext.GetExtension<IOrganizationServiceFactory>();
var service = serviceFactory.CreateOrganizationService(null);
According to the documentation, calling CreateOrganizationService with null as the parameter should force the use of the SYSTEM user, but it appears to still be running as the calling user.
I also tried calling CreateOrganizationService and passing the Guid of a different user with the System Administrator role, but got the same results.
Workflows are a special case and are designed to ignore the GUID you pass to CreateOrganizationService.
The following paragraph is taken from this article:
For the automatic workflow case, the owner of the workflow is also the person who activates it and who selects the trigger mechanism and the workflow steps so it is OK if the workflow executes under that user's context. For the on-demand case, a user is specifically requesting some actions to be performed on his behalf by a workflow so the user is fully aware of the workflow definition and that it will execute; therefore it is safe to execute the workflow under that user's context instead of the workflow owner (who might not be aware that a user requests an on-demand execution).
The custom workflow activity could be converted to a plug-in registered to run in the context of the CRM Service account or an Administrator (see the sketch after this list).
The workflow could be triggered automatically, rather than manually.
If the end users explicitly start the workflow, it will run in their user context.
Dialogs always run in the initiating user's context.
A workflow triggered by an event, rather than being explicitly started by the user, will run in the context of the user who started, and owns, the workflow; in this case an Administrator.
A dialog or custom ribbon button could change something (a custom field) on the record, and your custom workflow activity could be registered to execute on change of that field.
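As a rough illustration of the first option above (converting the custom workflow activity to a plug-in), here is a minimal sketch. It relies on the behaviour the question already quotes from the documentation, that passing null to CreateOrganizationService inside a plug-in gives you the SYSTEM user's context; the mirroring logic itself is left as a comment:

using System;
using Microsoft.Xrm.Sdk;

public class MirrorUserPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        var factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));

        // In a plug-in (unlike a custom workflow activity), a null userId runs the
        // calls as the SYSTEM user, so removing and assigning roles is not limited
        // by the privileges of the user who initiated the operation.
        IOrganizationService service = factory.CreateOrganizationService(null);

        // ... read the mirroring user and target user from the plug-in context's
        // input parameters, then copy business unit, teams and security roles here ...
    }
}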