How to change BigQuery scopes?

I am trying to write a query that pulls from federated tables in BigQuery. In the BQ console I can run the query and get results. However, when I run the same query in Domo, I get the error: "Domo is ready, but received a Access Denied: BigQuery BigQuery: No OAuth token with Google Drive scope was found. Please contact the data provider for support."
I have read all over the place that I need to change the scope to do this. I am not a developer though, so I am not sure exactly how to go about this in BQ.
Does anyone have step by step instructions for how to do this?
Thanks!

When you create a federated table, access to the underlying file (e.g. in Google Drive) is granted to the account that runs the query.
So when you run the query in the BQ console, it uses your credentials.
When you run it from Domo, it may use a different account (probably some service account), so to make everything work you should grant that account proper access to your Drive file (essentially, share the document with it).
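If you would rather do that share programmatically than through the Drive "Share" dialog, here is a minimal sketch. The file ID, credentials path and service-account email are placeholders, the credentials used must already be allowed to manage the file's sharing, and the "scope" part of the question only means that whatever client runs the query must request the Drive scope in addition to the BigQuery scope.

// Hedged sketch: share the Drive file behind the federated table with the
// service account that runs the query. File ID, credentials path and the
// service-account email are placeholders.
import { google } from "googleapis";

async function shareDriveFile() {
  const auth = new google.auth.GoogleAuth({
    keyFile: "./my-credentials.json", // placeholder; must be able to edit the file's sharing
    scopes: ["https://www.googleapis.com/auth/drive"],
  });

  const drive = google.drive({ version: "v3", auth });

  await drive.permissions.create({
    fileId: "<drive-file-id>", // placeholder
    sendNotificationEmail: false,
    requestBody: {
      role: "reader",
      type: "user",
      emailAddress: "query-runner@my-project.iam.gserviceaccount.com", // placeholder
    },
  });
}

shareDriveFile().catch(console.error);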

Related

Problems accessing Azure Data Explorer free cluster

I am trying to use https://dataexplorer.azure.com/freecluster. I have successfully logged in with my personal email ID and created a table. I am able to query data, run basic commands, etc. Then I tried to create a Java app to query my database, so I navigated to this link, https://dataexplorer.azure.com/oneclick, created a Java application, and downloaded it. When I run this sample program it prompts for login, but when I give the email ID it shows this error. May I know what is going wrong here?
NB: I have used the same personal account which I used to log in and query ADX via the web UI.

AWS nodejs SDK check if can access DynamoDB table

Using the AWS SDK I can make a get request and fetch a document, and I will then know whether I have IAM access to the table.
Is there a way, with the Node.js AWS SDK, to check whether I am allowed the action dynamodb:GetItem? Of course I can just write a query, but is there a way without having to spend time writing a meaningless query?
The easiest way I can think of right this moment is to try a simple GetItem, like you mentioned, with a primary key, but do it with the low-level API. Then you are not writing a "meaningless query." If I find another way, I'll add it here.
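A minimal sketch of that probe, assuming the AWS SDK for JavaScript v3 and a placeholder table name and key: a successful call (even one that finds no item) shows dynamodb:GetItem is allowed, while an AccessDeniedException shows it is not.

// Hedged sketch: probe dynamodb:GetItem with a low-level call.
// Table name, key shape and region below are placeholders.
import { DynamoDBClient, GetItemCommand } from "@aws-sdk/client-dynamodb";

const client = new DynamoDBClient({ region: "us-east-1" }); // placeholder region

async function canGetItem(): Promise<boolean> {
  try {
    await client.send(
      new GetItemCommand({
        TableName: "MyTable",                   // placeholder table
        Key: { pk: { S: "permission-probe" } }, // any key matching the table's key schema
      })
    );
    return true; // the call went through, so dynamodb:GetItem is allowed
  } catch (err: any) {
    if (err.name === "AccessDeniedException") return false;
    throw err; // a different problem (missing table, wrong key schema, network, ...)
  }
}

canGetItem().then((allowed) => console.log("dynamodb:GetItem allowed:", allowed));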
You can also check the user's permissions through the console, or through the CLI with the get-user-policy command.
CLI Approach:
aws iam get-user-policy --user-name Bob --policy-name ExamplePolicy
This command lets you check the rights granted to that user. For details, look into the documentation.
Console Approach:
Log in to the AWS Console and search for the IAM service.
Under the Users section, search for the user whose permissions need to be checked.
Then, in the Permissions section, you can view all the permissions.
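The same lookup can also be done from Node.js; a minimal sketch, assuming the SDK v3 IAM client and the placeholder user and policy names from the CLI example (note that GetUserPolicy only covers inline policies attached directly to the user):

// Hedged sketch: read an inline user policy with the IAM API instead of the CLI.
// User name, policy name and region are placeholders.
import { IAMClient, GetUserPolicyCommand } from "@aws-sdk/client-iam";

const iam = new IAMClient({ region: "us-east-1" }); // placeholder region

async function showUserPolicy() {
  const out = await iam.send(
    new GetUserPolicyCommand({
      UserName: "Bob",             // placeholder, as in the CLI example
      PolicyName: "ExamplePolicy", // placeholder, as in the CLI example
    })
  );
  // The policy document comes back as URL-encoded JSON.
  console.log(decodeURIComponent(out.PolicyDocument ?? ""));
}

showUserPolicy().catch(console.error);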

Azure data factory pipeline showing RequestingConsent forever

I am unable to fix the "Requesting Consent" status for an Azure Data Factory pipeline querying some simple Office 365 (Graph) data (i.e. the SMTP addresses and UPNs of my colleagues).
Can you suggest something to check?
I am adding 2 pictures showing where "Graph Data Connect" is easily enabled, and the always empty PAM (Privileged Access Management) portal.
New image: Graph Data Connect configurator
New image: Empty PAM portal
As per the error, this is a permission issue: you need to be granted permission before querying Graph, even for simple data (i.e. SMTP addresses and UPNs of your colleagues).
Here are the steps for adding the permissions:
You have to configure API permissions for your app registration: grant permission for the reporting API, and allow your app the appropriate permissions based on the API you wish to access.
Next, navigate to API permissions in the left column under Manage.
Then click + Add a permission, as shown by the bubbles in the screenshot.
Please grant the permissions Directory.ReadWrite.All and Users.ReadWrite.All.
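Not the Data Connect consent flow itself, but a quick way to check that the app registration's Graph permissions actually work is to call Graph directly with a client-credentials token. A minimal sketch, where the tenant ID, client ID and client secret are placeholders (built-in fetch needs Node 18+):

// Hedged sketch: verify the app registration's Graph permissions by requesting
// a client-credentials token and reading users' UPNs and mail addresses.
// Tenant ID, client ID and client secret are placeholders; this is a plain
// Graph API call, not the Graph Data Connect / ADF path itself.
const tenantId = "<tenant-id>";         // placeholder
const clientId = "<app-client-id>";     // placeholder
const clientSecret = "<client-secret>"; // placeholder

async function getGraphToken(): Promise<string> {
  const body = new URLSearchParams({
    grant_type: "client_credentials",
    client_id: clientId,
    client_secret: clientSecret,
    scope: "https://graph.microsoft.com/.default",
  });
  const res = await fetch(
    `https://login.microsoftonline.com/${tenantId}/oauth2/v2.0/token`,
    { method: "POST", body }
  );
  return (await res.json()).access_token;
}

async function listUsers() {
  const token = await getGraphToken();
  const res = await fetch(
    "https://graph.microsoft.com/v1.0/users?$select=userPrincipalName,mail",
    { headers: { Authorization: `Bearer ${token}` } }
  );
  console.log(await res.json());
}

listUsers().catch(console.error);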
At last I found what was missing: it was a licensing requirement, but nothing warned me about this on the PAM page. Simply nothing was listed in it.
If you like, here are the requirements nowadays.
Have a nice day, everyone!
Julian

How to use Power BI REST API's in Azure Data Factory

I am trying to create a list of all the workspaces and the reports contained in each one of them for a documentation project.
I found online that we can use this endpoint to get the workspaces; I want to use it with a Web activity:
https://api.powerbi.com/v1.0/myorg/groups
Then I want to use the IDs from the output with a ForEach and another Web activity inside it, calling this endpoint to get the reports in each workspace, and then copy the result somewhere (a data lake or a DB):
https://api.powerbi.com/v1.0/myorg/groups/{groupId}/reports
But I don't know how to configure the activity and the authentication.
If there is a better way, like connecting directly to Power BI, I'm all ears. I tried a "get data from web" source, but I don't have any API "key", and organizational authentication doesn't work.
When I run the code here: https://learn.microsoft.com/en-us/rest/api/power-bi/groups/getgroups it works perfectly.
Thanks in advance
'But I don't know how to configure the activity and the authentication.' For this question, I'm not sure whether the issue is that you don't know how to use an access token to call an API.
Register an Azure AD application with the API permission 'https://analysis.windows.net/powerbi/api/Workspace.ReadWrite.All' so that you can generate an access token with this scope for that application. Add the API permission.
Use the ROPC (resource owner password credentials) flow to generate an access token. This flow contains user information, so the API knows who you are and returns the correct groups.
Add an 'Authorization' header to the request with the value 'Bearer <access token>' to call the API.
ROPC flow (v1.0 endpoint):
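A minimal sketch of those steps, together with the groups-to-reports chaining from the question, written as Node.js rather than ADF Web/ForEach activities. The tenant ID, client ID, username and password are placeholders; the ROPC flow only works for accounts without MFA, and the built-in fetch needs Node 18+. In ADF itself you would put the equivalent POST and GET requests into Web activities and the loop into a ForEach.

// Hedged sketch of the answer's steps plus the question's groups -> reports chaining.
const tenantId = "<tenant-id>";     // placeholder
const clientId = "<app-client-id>"; // placeholder

async function getPowerBiToken(): Promise<string> {
  // ROPC against the v1.0 endpoint, as described in the answer above.
  const body = new URLSearchParams({
    grant_type: "password",
    client_id: clientId,
    username: "user@contoso.com", // placeholder
    password: "<password>",       // placeholder
    resource: "https://analysis.windows.net/powerbi/api",
  });
  const res = await fetch(
    `https://login.microsoftonline.com/${tenantId}/oauth2/token`,
    { method: "POST", body }
  );
  return (await res.json()).access_token;
}

async function listReportsPerWorkspace() {
  const token = await getPowerBiToken();
  const headers = { Authorization: `Bearer ${token}` };

  // 1. Get all workspaces (groups).
  const groupsRes = await fetch("https://api.powerbi.com/v1.0/myorg/groups", { headers });
  const groups = (await groupsRes.json()).value ?? [];

  // 2. For each workspace, get its reports (the "ForEach" of the question).
  for (const group of groups) {
    const reportsRes = await fetch(
      `https://api.powerbi.com/v1.0/myorg/groups/${group.id}/reports`,
      { headers }
    );
    const reports = (await reportsRes.json()).value ?? [];
    console.log(group.name, reports.map((r: any) => r.name));
  }
}

listReportsPerWorkspace().catch(console.error);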

Create Google Contact with Node.js and Service Account

I would like to create a script, to be scheduled in a .bat file, that automatically connects to Google Contacts and creates contacts read from a MySQL database.
I would like a system that does not require any user action.
I know that service accounts exist, but I have no idea how to create the program. Do you know how to do it?
I hope you can give me a hand.
For the moment, I wish you a good day.
This can be done in three steps if the user is not part of G Suite.
Authenticate the user using OAuth with access_type = offline.
Save the generated Refresh Token.
Use the Refresh Token to generate a new Access Token and then update the account's contacts. The Access Token will be valid (default) for 3,600 seconds.
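A minimal sketch of step 3, assuming the one-time OAuth consent has been done and the refresh token saved; the client ID, client secret, refresh token and contact fields are placeholders, and reading the data from MySQL is left out.

// Hedged sketch: use a saved refresh token to create a contact via the People API.
import { google } from "googleapis";

const oauth2Client = new google.auth.OAuth2(
  "<client-id>",    // placeholder
  "<client-secret>" // placeholder
);
oauth2Client.setCredentials({ refresh_token: "<saved-refresh-token>" }); // placeholder
// The client library exchanges the refresh token for a fresh access token
// on each request (access tokens last about 3,600 seconds).

const people = google.people({ version: "v1", auth: oauth2Client });

async function createContact() {
  const res = await people.people.createContact({
    requestBody: {
      names: [{ givenName: "Jane", familyName: "Doe" }],   // placeholder
      emailAddresses: [{ value: "jane.doe@example.com" }], // placeholder
    },
  });
  console.log("Created contact:", res.data.resourceName);
}

createContact().catch(console.error);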
If the user is part of G Suite, then enable Domain Wide Delegation on a service account and impersonate the user.
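And a minimal sketch of the Google Workspace (G Suite) path, assuming domain-wide delegation has already been enabled for the service account; the key file path and impersonated user are placeholders, and the People API call is the same as in the sketch above.

// Hedged sketch: a service account with domain-wide delegation impersonates
// the user via a JWT client. Key file path and impersonated user are placeholders.
import { google } from "googleapis";
import { readFileSync } from "fs";

const key = JSON.parse(readFileSync("./service-account-key.json", "utf8")); // placeholder path

const auth = new google.auth.JWT({
  email: key.client_email,
  key: key.private_key,
  scopes: ["https://www.googleapis.com/auth/contacts"],
  subject: "user@yourdomain.com", // the Workspace user being impersonated (placeholder)
});

const people = google.people({ version: "v1", auth });
// people.people.createContact({ requestBody: { ... } }) as in the sketch above.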
