error on command "gsutil notification watchbucket... " - python-3.x

I am trying to set up Object Change Notification. I am running the command below:
gsutil notification watchbucket https://<project_id>.appspot.com/ gs://bucket_name
and getting this error:
ServiceException: 401 Unauthorized WebHook callback channel: https://<project_id>.appspot.com
I have created a service account and granted it full control of the bucket:
gsutil acl ch -u <project_id>@appspot.gserviceaccount.com:FC gs://bucket_name
I successfully verified the domain as an owner of the URL in Search Console, but I am still getting the same error.
I have seen similar questions on Stack Overflow but could not find an answer. Can anyone please help?

It sounds like you haven't correctly verified the domain. There are a couple of easy mistakes you may have made:
It is possible you verified http://project.appspot.com instead of https://project.appspot.com. You can check and fix this in the Search Console.
It is possible you verified the domain with your own Google account but then ran gsutil as a different account, such as a service account. This is particularly easy to do on a GCE instance, where the default credentials are the instance's service account. I recommend running gsutil in Cloud Shell, as that uses your real credentials.
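One quick way to check which credentials gsutil will use, and to retry the watch request as the account that actually verified the domain. This is a sketch using standard gcloud/gsutil commands; the account, app URL, and bucket name are placeholders:

```shell
# Show which account gcloud/gsutil will authenticate as; it must be the
# same account that verified the domain in Search Console.
gcloud auth list

# If the wrong account is active, switch to the one that did the verification:
gcloud config set account you@example.com

# Then retry the watch request (replace the URL and bucket with yours):
gsutil notification watchbucket https://your-project.appspot.com/ gs://your-bucket
```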

Related

Issue when trying to create a sendgrid account on azure server

I am trying to use sendgrid on azure, but when I am creating the account, it gives me an error saying:
The portal is having issues getting an authentication token. The experience rendered may be degraded.
Additional information from the call to get a token:
Extension: SendGrid_EmailService
Details: code: 500, statusText: error, message: There was an error processing your request. Please try again in a few moments., stack:
It has been giving me this error since this morning, which is pretty annoying. It also disables two fields and marks them as loading:
Screenshot of the two fields marked as loading (For a very long time)
Since SendGrid wasn't working I thought I'd try SparkPost. The signup was successful, but it has been taking hours to deploy.
Then I thought of manually configuring the SMTP settings so the host and user would point to SendGrid, but I wasn't able to find a way to do so.
Could someone help me out please! Thanks in advance!!
EDIT: This problem has been solved by the Microsoft Team.
It looks like SendGrid is having technical problems. You should first check SendGrid's official support/status site to see whether that is the issue. I used SendGrid for a while, but I had to move to another solution. When you register a SendGrid account via Azure you get the standard SendGrid plan, which means you send your mail through shared SendGrid IPs. That is probably fine for marketing emails, but if you intend to send any transactional emails (password resets, bills, etc.) you will eventually end up tearing your hair out, because shared SendGrid IPs appear in most of the existing spam blacklists.
SendGrid app status
I was able to get into SendGrid using the following steps from Aaryaman Maheshwari in this comment:
Steps from Aaryaman's answer:
Step 1: To find your username for SendGrid, go to the SendGrid resource, click Properties, and copy the resource ID.
Step 2: In the Azure online shell, open Bash and run the following command: az resource show --ids [THE COPIED RESOURCE ID] — make sure to replace [THE COPIED RESOURCE ID] with the resource ID you copied in Step 1.
Step 3: In the JSON the command outputs, look for the username property and note it down.
Step 4: Go to sendgrid.com and sign in with the username you just retrieved and the password you used when signing up.
Thanks Aaryaman Maheshwari
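Steps 2 and 3 can be condensed into a single extraction. Below is a sketch: the real command in Azure Cloud Shell would be `az resource show --ids "$RESOURCE_ID" --query properties.username -o tsv`, but since that needs a live subscription, a stand-in JSON sample (with a made-up username value) is parsed the same way here:

```shell
# Stand-in for the JSON that `az resource show` returns for a SendGrid resource:
json='{"properties": {"username": "azure_abc123@azure.com"}}'

# Pull out the `username` property (what Step 3 asks you to note down):
username=$(printf '%s' "$json" | python3 -c 'import sys, json; print(json.load(sys.stdin)["properties"]["username"])')
echo "$username"
```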
To increase security, SendGrid recently started requiring two-factor authentication to connect to your account (this began one or two weeks ago).
Since then, the "automatic" connection from Azure to SendGrid stops succeeding, and we get the same 500 error.
Basic authentication (username/password) in the API will also stop working (starting December 10, I believe).
I'm not sure this is the reason, but the timing matches ;)
Just to update:
A bug was identified in the Azure Portal, and our product engineering team has fixed the issue.
Provisioning and managing a SendGrid account via https://portal.azure.com/ now works as expected.
The alternate https://rc.portal.azure.com/ URL that was shared during the impact is no longer required.
We had a discussion on the Q&A thread. Once again, apologies for all the inconvenience, and much appreciation for the follow-up and great collaboration.

When creating a publisher via `vsce` I get a 401

I am running this command to create a VSCode publisher:
vsce create-publisher cprev
but I am getting this 401 error:
my shell showing the 401
I have a personal token created like so:
the azure devops console
Does anyone know why I am getting a 401? Is there some way to debug it and get a more specific message?
One easy mistake to make when creating the PAT (Personal Access Token) is to not select all accessible accounts in the Accounts drop-down (instead selecting a specific account). You should also set the Authorized Scopes to All scopes for the publish to work.
Set Organization in the drop-down list to All accessible organizations.
For more details, take a look at this similar issue: vsce create-publisher returns 401

Cannot get Azure Active Directory Group from my logic app

I am trying to access Azure Active Directory group information but get the error below.
I have tried many things but cannot find the exact cause.
Config information:
I provided my Azure AD group ID and connected with my email, myemail@outlook.com
It seems you have configured your Azure Active Directory Logic App connector with your personal Microsoft account. As far as I understand, you cannot do this with myemail@outlook.com; you have to log in with an organizational account such as YourOrgEmail@YourTenant.onmicrosoft.com. See the screenshot below:
Your Case:
I reproduced your problem, then configured the connector with my organizational email and it succeeded. See the screenshot below:
Permission:
I have also noticed that permissions could be the issue. In that case you might encounter an "insufficient privileges" 401 error, so you need at least the following permission:
Permission Type: Application
Permission Name: Group.Read.All
See the screen shot below:
For more details, take a look at the official docs.

Google Cloud Storage - insufficient permission

This issue seems similar to another post, but my case is different: I checked testIamPermissions, and the results show that I have all the permissions I need, yet I still receive "insufficient permission".
This is what I received:
{'storage.buckets.get' : true}
{'storage.buckets.getIamPolicy' : true}
{'storage.objects.create' : true}
{'storage.objects.delete' : true}
{'storage.objects.get' : true}
{'storage.objects.getIamPolicy' : true}
{'storage.objects.list' : true}
{'storage.objects.setIamPolicy' : true}
{'storage.objects.update' : true}
The code I used to test:
// `testPermissions` is the array of permission strings listed above;
// pass the array itself rather than wrapping it in another array.
googleBucket.iam.testPermissions(testPermissions, function(err, permissions) {
  if (err) return console.error(err);
  console.log(permissions);
});
Permissions I am missing:
'storage.buckets.create',
'storage.buckets.delete',
'storage.buckets.list',
'storage.buckets.setIamPolicy',
'storage.buckets.update',
It's really confusing that I have all the object-creation permissions but the API still throws "insufficient permission". All I use the API for is uploading a file to the bucket. Is there any permission I am missing? (The server runs on Google Compute Engine, in the same project as the Cloud Storage bucket.)
It would be rather interesting to know which user runs the script,
because it seems that the user/service account running the script has only the Viewer role, not the Editor role. Check in IAM whether the proper roles are assigned to the proper service account. You might also need to log in to that GCE instance with Cloud Shell and add the service account credentials there; in Cloud Shell there is a tiny "upload" button at the top right, which can be used to upload the credentials JSON file into the VM. The documentation also explains this, step by step.
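If you do upload a credentials JSON file into the VM as described, activating it so that gsutil and gcloud use it looks roughly like this (the key file name is a placeholder):

```shell
# Activate the uploaded service-account key for gcloud/gsutil:
gcloud auth activate-service-account --key-file=service-account.json

# Confirm which account is now active:
gcloud auth list
```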
I have found the answer. There is an option called Identity and API access on the Create a new instance page. Just switch it from the default to whichever access option you need (configure it properly, though), and the problem is solved!
Regarding the answer provided by Martin Zeitler: that is not how GCE runs the script. GCE automatically connects its service account to the other APIs with Editor permission, and there is no need to hook any JSON key file to an instance created on GCE. As I mentioned, the server is located on Google Compute Engine, in the same project as Google Cloud Storage.
However, the documentation link is fairly helpful. Thanks, Martin Zeitler; here is an upvote for the quick answer :)
The service account of your Google Compute Engine instance should match the one used to access the Google Cloud Storage bucket.
If it doesn't match, stop the instance and change the service account by selecting the correct one from the drop-down (only service accounts belonging to the current project are listed).
Also make sure the selected service account has the correct Google Cloud Storage access.
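The stop/change/restart sequence above can also be done from the CLI. A sketch with gcloud; the instance name, zone, and service-account email are placeholders, and `storage-rw` is the access scope granting read/write on Cloud Storage:

```shell
# The instance must be stopped before its service account can change:
gcloud compute instances stop my-instance --zone=us-central1-a

# Attach the correct service account and grant Cloud Storage read/write:
gcloud compute instances set-service-account my-instance \
    --zone=us-central1-a \
    --service-account=my-sa@my-project.iam.gserviceaccount.com \
    --scopes=storage-rw

gcloud compute instances start my-instance --zone=us-central1-a
```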

Error 80045C17 when trying to login at Azure to manage account

Every time I try to log in to my Azure account (http://manage.windowsazure.com/), I keep receiving the same error 80045C17, and the error message says nothing more than to try again later.
I have tried every possible browser, with the same result every time!
I use the same username to log in to every Microsoft service.
I can't even stop my sites!
How to fix this?
I was able to fix this by first signing in to http://msdn.microsoft.com and then going to https://manage.windowsazure.com. The portal uses your Microsoft Live login session anyway, so you just have to sign in to any Microsoft property, for example login.live.com.
I think it happened to me because the Azure portal was open on another machine with the same account. Make sure you always sign out when you finish using the Azure portal.
Thanks,
Alex Vezenkov