GCP Cloud Data Loss Prevention Request Violates Constraint - security

I'm new to the Security Command Center (SCC) and Data Loss Prevention (DLP). I'm trying to create a job in DLP to check if there is any PII in a BigQuery table, but I'm getting the following error upon creation:
Request violates constraint constraints/gcp.resourceLocations on the project resource.
Learn more https://cloud.google.com/resource-manager/docs/organization-policy/defining-locations.
On the organization level, there's an organization policy (inherited in my project) allowing resources to be created in Europe only due to GDPR. I suspect that maybe DLP runs in some other region (US maybe) and that's why I'm getting this error.
I can't seem to choose where the job runs in the options while creating it, and I can't find anything about this in the documentation. Any idea why I'm getting this error and how to fix it?

The answer is copied from here.
Your organization policy is not allowing DLP to run the job: at the moment the DLP API is blocked by the constraints/gcp.resourceLocations constraint, and there is no workaround. However, there is a feature request to allow setting a specific location rather than using "global", which is what is in fact causing this issue.
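For reference, location-aware Google APIs encode the region in the request's parent resource path, and that is where a regional location would go once the feature request above is delivered. A minimal sketch (the helper name is mine, and whether DLP accepts a regional location for your project depends on that feature shipping):

```python
def dlp_parent(project_id: str, location: str = "global") -> str:
    """Build the parent resource path for a DLP job request.

    To keep a job inside a region allowed by a gcp.resourceLocations
    constraint, you would pass a regional location such as
    "europe-west1" instead of the default "global".
    """
    return f"projects/{project_id}/locations/{location}"
```

The resulting string (e.g. `projects/my-project/locations/europe-west1`) is what a `create_dlp_job` request would take as its `parent`.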

Related

Error issuing part using Maximo integration framework MXINVISSUE

We are upgrading from Maximo 7.5 to 7.6.1. Our web service that uses MXINVISSUEInterface is throwing an exception when we try to issue a part that is marked as a spare part and the work order has an asset. The exception says "BMXAA4195 - A value is required for the Organization field on the SPAREOBJECT object." The part is not in the SPAREPART table for the asset so it is trying to add it, but for some reason the ORGID is not populated from the MXINVISSUE_MATUSETRANSType object.
I re-generated the WSDL on the new server and rebuilt the solution, but after populating the new required field, I still get the same error.
Is there a system property that must be set? It works in 7.5, writing the record to MATUSETRANS and SPAREPART.
This sounds like a bug, so you might raise a Support Case with IBM about it. For a workaround until IBM releases a fix and you install said fix, consider the following options.
Can you set the Default Insert Site for the user using the web service?
Is it practical to put a Default Value on SPAREPART.ORGID?
Create an automation script called SPAREPART.NEW that will figure out an ORGID to use. To "figure out", my first idea would be to check whether the mbo has an owner that has an ORGID and, assuming it does, use that.
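A rough sketch of the logic such a SPAREPART.NEW script could implement. Maximo automation scripts run in Jython, where the record being inserted is exposed as `mbo`; the fallback chain below is my assumption and hasn't been tested against 7.6.1:

```python
def default_orgid(mbo):
    """Derive an ORGID for a new SPAREPART record when none was supplied.

    Falls back from the record itself to its owner (e.g. the asset or
    work order driving the insert), returning None if neither carries
    an ORGID.
    """
    orgid = mbo.getString("ORGID")
    if orgid:
        return orgid
    owner = mbo.getOwner()
    if owner is not None:
        owner_orgid = owner.getString("ORGID")
        if owner_orgid:
            return owner_orgid
    return None

# In the actual object launch point script you would then do:
# if not mbo.getString("ORGID"):
#     mbo.setValue("ORGID", default_orgid(mbo))
```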

Get Data from Web - 404 error

I am trying to get data from a website into a table in Excel. I am just using the regular button (get data - from web) in Excel (No code) Works fine for two websites but for a different website I am getting the following error:
Details: "The remote server returned an HTTP status code '404' when trying to access 'https://smarkets.com/listing/sport/football/premier-league-2017-2018'."
The webpage certainly exists - I am guessing this is a deliberate strategy by the website to prevent data harvesting.
Anyone have any idea how I can get round it either through the get data route or a VBA approach?
Thanks
JL
I inspected traffic with Fiddler and Postman to no avail and in the end contacted the team direct for an answer.
The short answer, from their API team, is no.
Eventually our API, which may be suitable for your needs, will be available to everyone.
The API is in a closed alpha stage, as I mentioned in the comments. More information here: API feed.
API/Odds Feed: We're currently working on a new streaming API that is faster and more scalable. The API is currently in a closed alpha stage. Unfortunately there is no timeframe on when we'll be able to release it to the public.
We will prioritise market makers when issuing streaming API accounts. If you would like to gain alpha access to this service, you can apply by outlining your proposal here.
You can gain access to their XML feed at odds.smarkets.com/oddsfeed.xml. The feed is updated every few seconds, but the information is delayed by 30 seconds.
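If you're open to going beyond Excel's Get Data button, the XML feed can also be pulled and parsed from a script. A minimal Python sketch; note that the feed's actual schema isn't documented here, so the parsing side just hands back the XML tree for you to explore, and the browser-like User-Agent header is a common (not guaranteed) way around servers that answer non-browser clients with 404:

```python
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "http://odds.smarkets.com/oddsfeed.xml"

def fetch_feed(url=FEED_URL):
    """Download the raw feed bytes, sending a browser-like
    User-Agent in case the server filters plain-client requests."""
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    with urllib.request.urlopen(req, timeout=30) as resp:
        return resp.read()

def parse_feed(xml_bytes):
    """Parse feed bytes and return the root element; iterate its
    children to discover the structure, since the schema is not
    documented in this thread."""
    return ET.fromstring(xml_bytes)
```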

Event Hub - Invalid Token error Azure Stream Analytics Input

I am trying to to follow the tutorial below
Azure Tutorial
As noted at the bottom, there appear to have been changes since this was created.
When I get to the part where I create an input for my Stream Analytics job, I cannot select an event hub, even though there is one in my subscription.
So I went to provide the information manually, and I get an error stating "invalid token".
Has anyone got any ideas how to resolve this, or can anyone point me to a better/more recent tutorial?
I am looking to stream data in real time.
Paul
Thanks for the help here; I ended up using the secondary key and that worked fine!
Change to use Secondary connection string or use a different shared policy altogether.
You can use the primary of the new shared access policy.
PS: It is a weird error; sometimes removing the last ";" worked.
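The trailing-semicolon observation is easy to guard against in code that assembles the input's connection string. A small sketch (pure string handling, not an Azure API; the assumption, per the PS above, is that a dangling separator can make the service treat the embedded SharedAccessKey as an invalid token):

```python
def normalize_connection_string(conn: str) -> str:
    """Trim surrounding whitespace and any trailing ';' from an
    Event Hub connection string before handing it to the portal
    or an SDK client."""
    return conn.strip().rstrip(";")
```

For example, `"Endpoint=sb://ns.servicebus.windows.net/;SharedAccessKeyName=policy;SharedAccessKey=abc123;"` comes back without the final semicolon.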

Azure Table Storage Bad Request - RequestInformation.ExtendedErrorInformation null

We've been getting a lot of Bad Request (400) exceptions when trying to save objects in our table storage; furthermore, the storage exception is returning a null ExtendedErrorInformation, so I just have the Bad Request (400).
I believe that the value of the properties shouldn't be the problem because our application is deployed on several build configurations and this only happens on one build configuration.
My main doubt is that we have a build configuration with a table name prefix [buildconfigname] for the dev environment and a [buildconfigname]Prod prefix for the production environment. The one that is returning all the Bad Request values is dev.
So my question is, can the cause of the bad requests be the fact that one table name prefix contains or is contained in another table name prefix?
getting a lot of Bad Request (400) exceptions when trying to save objects in our table storage
Many factors can cause a "Bad Request (400)" error when making operations on the Table service. Please refer to this article: Table Service Error Codes, and check your code (entity class, entity properties & values, etc.) to find the cause of the issue.
Besides, if possible, please share the code you are using to define your entity class and insert the entity.
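A few of the 400 causes the Table service documents can be caught client-side before the insert ever goes out: table names must be 3-63 alphanumeric characters starting with a letter, PartitionKey/RowKey must not contain `/ \ # ?`, and DateTime properties must not predate 1601-01-01 (an uninitialized .NET `DateTime.MinValue` is a classic culprit). A sketch of a pre-flight check (the helper name is mine; naive UTC datetimes are assumed):

```python
import re
from datetime import datetime

TABLE_NAME_RE = re.compile(r"^[A-Za-z][A-Za-z0-9]{2,62}$")
FORBIDDEN_KEY_CHARS = set("/\\#?")
MIN_TABLE_DATETIME = datetime(1601, 1, 1)  # Table service minimum DateTime

def find_bad_request_causes(table_name, partition_key, row_key, properties):
    """Return a list of entity problems the Table service tends to
    reject with a bare 400 (no extended error information)."""
    problems = []
    if not TABLE_NAME_RE.match(table_name):
        problems.append("table name must be 3-63 alphanumeric characters and start with a letter")
    for label, key in (("PartitionKey", partition_key), ("RowKey", row_key)):
        if any(c in FORBIDDEN_KEY_CHARS for c in key):
            problems.append(label + " contains a forbidden character (/ \\ # ?)")
    for name, value in properties.items():
        if isinstance(value, datetime) and value < MIN_TABLE_DATETIME:
            problems.append("property %r is a DateTime before 1601-01-01 "
                            "(e.g. an uninitialized .NET DateTime.MinValue)" % name)
    return problems
```

Running this against both build configurations' table name prefixes would also quickly confirm or rule out the prefix theory: one prefix containing another is not itself invalid, but a prefix that pushes the combined name past 63 characters (or starts it with a digit) is.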

Azure Machine Learning Endpoint SQL Access fails, works in experiment

I've created a classification endpoint using Azure ML, the input for which is a database query to retrieve the database row to classify.
When I run my experiment in the Machine Learning Studio, it works and connects properly to my database. When I submit the same query as a web service parameter on the import data module, I get the following error:
Ignoring the dangers of an SQL query as input, why am I getting this? Shouldn't it work the same?
Sidenote: I've used an SQL query on my training endpoint in the exact same way on the same database, and this didn't cause any problems.
UPDATE: It seems as if this is only a problem when I create a new endpoint for a service. If I use the default endpoint it does indeed work, but any new endpoints do not.
UPDATE 2: It also works when I submit my request as a batch run. If I use Request-Response, it fails.
According to Microsoft, this is a known bug and they are working on it. Another workaround, although NOT recommended, is to pass the password in as a web service parameter (though that may be OK for test/proof-of-concept apps).
Here is a link to the thread on the MS forums that states this is a bug.