I have to create a new data provider, or use one of Acumatica's data providers, to process a fixed-field-length .txt file every day. I need to build an Import scenario, but I can't get the Data Provider for this .txt file to work. Is there any way that I can either create one or use one of the predefined data providers in Acumatica?
I have tried to modify the ACH data provider, but I could not make it work. Attached is an example of that file.
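For reference, this is what fixed-field-length parsing amounts to outside of any data provider; a minimal Python sketch, where the field names and column offsets are hypothetical placeholders for whatever layout the actual file uses:

# Minimal sketch of fixed-width parsing. The field names and
# (start, end) character offsets are hypothetical placeholders;
# substitute the actual layout of the file.
FIELDS = {
    "record_type": (0, 2),
    "account": (2, 12),
    "amount": (12, 22),
}

def parse_line(line):
    """Slice one fixed-width record into a dict of stripped fields."""
    return {name: line[start:end].strip() for name, (start, end) in FIELDS.items()}

with open("daily_file.txt") as f:
    records = [parse_line(row.rstrip("\n")) for row in f if row.strip()]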
Is there any way to purge/mask data in a Log Analytics workspace with regular expressions or similar, to be able to remove sensitive data that has already been sent to the workspace?
For example, social security numbers that are part of a URL?
As per this Microsoft document, Log Analytics is a flexible store which, while prescribing a schema to your data, allows you to override every field with custom values. We can mask the data in the Log Analytics workspace, and here are a few strategies for handling personal data:
Where possible, stop the collection of, obfuscate, anonymize, or otherwise adjust the data being collected to exclude it from being considered "private". This is by far the preferred approach, saving you the need to create a very costly and impactful data handling strategy.
Where that is not possible, attempt to normalize the data to reduce the impact on the data platform and performance. For example, instead of logging an explicit user ID, create a lookup table that correlates the username and their details to an internal ID that can then be logged elsewhere (see the sketch after this list). That way, should one of your users ask you to delete their personal information, deleting the single row in the lookup table corresponding to that user may be sufficient.
Finally, if private data must be collected, build a process around the purge API path and the existing query API path to meet any obligations you may have around exporting and deleting any private data associated with a user.
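To make the second strategy concrete, here is a minimal Python sketch of such a lookup table, assuming a local SQLite store; the table and column names are illustrative only:

import sqlite3
import uuid

# Hypothetical lookup table correlating a username to an opaque
# internal ID; only the internal ID is ever written to the logs.
conn = sqlite3.connect("pii_lookup.db")
conn.execute("CREATE TABLE IF NOT EXISTS user_lookup (internal_id TEXT PRIMARY KEY, username TEXT UNIQUE)")

def internal_id_for(username):
    """Return the internal ID for a user, creating one on first sight."""
    row = conn.execute("SELECT internal_id FROM user_lookup WHERE username = ?", (username,)).fetchone()
    if row:
        return row[0]
    new_id = uuid.uuid4().hex
    conn.execute("INSERT INTO user_lookup VALUES (?, ?)", (new_id, username))
    conn.commit()
    return new_id

def forget_user(username):
    """Deleting this one row severs the link between the logs and the person."""
    conn.execute("DELETE FROM user_lookup WHERE username = ?", (username,))
    conn.commit()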
Here is a KQL query for finding private data (in this case, IPv4 addresses) in a Log Analytics workspace:
search *
| where * matches regex @'\b((25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)(\.|$)){4}\b' //RegEx originally provided on https://stackoverflow.com/questions/5284147/validating-ipv4-addresses-with-regexp
| summarize count() by $table
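Once the affected tables are known, the purge API mentioned above can be called programmatically. A minimal Python sketch, assuming a hypothetical AppRequests table with a UserId column; the subscription, resource group, workspace, and token values are placeholders:

import requests

# Placeholders; substitute your own subscription, resource group,
# workspace, and an Azure AD bearer token with purge permissions.
SUBSCRIPTION = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
WORKSPACE = "<workspace-name>"
TOKEN = "<azure-ad-access-token>"

url = (
    f"https://management.azure.com/subscriptions/{SUBSCRIPTION}"
    f"/resourceGroups/{RESOURCE_GROUP}/providers/Microsoft.OperationalInsights"
    f"/workspaces/{WORKSPACE}/purge?api-version=2020-08-01"
)

# Purge rows from a hypothetical AppRequests table where a
# hypothetical UserId column matches the user being erased.
body = {
    "table": "AppRequests",
    "filters": [
        {"column": "UserId", "operator": "==", "value": "user-to-erase"}
    ],
}

resp = requests.post(url, json=body, headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()
# The call is asynchronous; the response carries an operationId
# that can be polled to track completion.
print(resp.json())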
I've scripted something to create dashboards in DevOps and it seems to do the trick, but I've noticed an issue that I could do with resolving.
The issue is duplicates.
Although you can't have two dashboards with the same name if you put them in as team dashboards, putting them in as project dashboards allows it. I learn something new every day.
I've got a number of projects to go through to locate these duplicates, and wondered what the easiest and safest way would be to identify the duplicates (and if possible delete them) based on the dashboard name.
Is this even possible? From what I've read, I can delete dashboards based on an ID (but not a name), which is fine so long as I can still pick out the projects that have more than one dashboard with the same name.
If you want to find and delete duplicate dashboards, you can first call REST API 1 (below) with Postman to get the list of dashboards under a project. From the attached screenshot, we can see that the parameter "dashboardScope" represents the project, "name" represents the dashboard name, and "id" represents the dashboardId. We can see that there are two duplicate dashboard names, Hello World, in the UI.
The duplicated dashboard name in the UI
The duplicated dashboard name in the REST API response
Then use the dashboardId obtained from REST API 1 and call REST API 2 to delete the duplicate dashboard. Note that this also deletes the widgets associated with that dashboard.
Deleting the duplicate dashboard with the REST API
The only dashboard left in the UI
REST API1:
GET https://dev.azure.com/{organization}/{project}/_apis/dashboard/dashboards?api-version=6.0-preview.3
REST API2:
DELETE https://dev.azure.com/{organization}/{project}/_apis/dashboard/dashboards/{dashboardId}?api-version=6.0-preview.3
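Putting the two calls together, here is a minimal Python sketch that lists each project's dashboards, groups them by name, and deletes all but the first of each duplicate set. It assumes a personal access token and that the list response wraps entries in a "value" array, as most Azure DevOps list endpoints do; run it with the delete loop commented out first to review what would be removed:

import requests
from collections import defaultdict

ORG = "<organization>"               # placeholder
PROJECTS = ["ProjectA", "ProjectB"]  # placeholder project names
PAT = "<personal-access-token>"      # placeholder
API = "api-version=6.0-preview.3"
AUTH = ("", PAT)  # PATs go in the password slot of basic auth

for project in PROJECTS:
    url = f"https://dev.azure.com/{ORG}/{project}/_apis/dashboard/dashboards?{API}"
    resp = requests.get(url, auth=AUTH)
    resp.raise_for_status()
    dashboards = resp.json().get("value", [])

    # Group dashboard IDs by name to spot duplicates.
    by_name = defaultdict(list)
    for d in dashboards:
        by_name[d["name"]].append(d["id"])

    for name, ids in by_name.items():
        if len(ids) > 1:
            print(f"{project}: '{name}' appears {len(ids)} times: {ids}")
            # Keep the first, delete the rest. Remember this also
            # removes the widgets on the deleted dashboards.
            for dashboard_id in ids[1:]:
                del_url = (f"https://dev.azure.com/{ORG}/{project}"
                           f"/_apis/dashboard/dashboards/{dashboard_id}?{API}")
                requests.delete(del_url, auth=AUTH).raise_for_status()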
I need to get XML data from an API, store the data in Azure Data Lake Store, and finally have a table created for it in Azure SQL DB/DWH. I know ADF can't handle XML data. How do I pull XML data into Azure? I have been checking some links on using Logic Apps. Any suggestions or ways to handle it?
I'm not so clear about the details of your requirement, but you asked "how to pull XML data into Azure", so I'll post some suggestions for your reference.
If you want to get the XML data from your API, you can use the "HTTP" action in your logic app.
Then you can use the output of the API in the next steps of your logic app.
If you want to parse the XML data, you need to transform it to JSON first; please refer to the screenshot below (enter "json(xml(body('HTTP')))" in the content box of a "Parse JSON" action and provide a schema).
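The json(xml(...)) expression is just an XML-to-JSON conversion; if you ever need the same step outside Logic Apps (for example, before landing the file in the data lake), here is a minimal Python sketch using the third-party requests and xmltodict packages, with a placeholder API URL:

import json
import requests
import xmltodict  # third-party: pip install xmltodict

# Placeholder endpoint returning XML.
resp = requests.get("https://example.com/api/data.xml")
resp.raise_for_status()

# Same idea as json(xml(body('HTTP'))) in the logic app:
# parse the XML payload and re-serialize it as JSON.
as_dict = xmltodict.parse(resp.text)
as_json = json.dumps(as_dict, indent=2)

# From here the JSON could be written to a file and uploaded
# to Azure Data Lake Store, then loaded into SQL DB/DWH.
with open("data.json", "w") as f:
    f.write(as_json)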
We allow our creators to create their own sites, so how do we get one feed per site?
How do we create multiple React activity feeds using one single GetStream app?
One simple way to do this is to adopt a naming convention for your feeds and include some namespacing.
For instance, you could make feed IDs include the name or the ID of the application:
e.g. timeline:${app.id}-${user.id} would be the timeline feed ID for that app's user.
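As a minimal sketch of that convention server-side, using the stream-python client; the key, secret, app_id, and user_id values are placeholders, and the "timeline" feed group is assumed to exist in the app:

import stream  # third-party: pip install stream-python

client = stream.connect("<api-key>", "<api-secret>")  # placeholders

def site_timeline(app_id, user_id):
    """One timeline feed per (site, user) pair via a namespaced feed ID."""
    return client.feed("timeline", f"{app_id}-{user_id}")

# Each site's users get their own feed inside the single Stream app.
feed = site_timeline("site42", "alice")
feed.add_activity({"actor": "alice", "verb": "post", "object": "hello"})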
I was using the following SO post to create a folder in a SharePoint list: Here
It kinda-sorta works, but not as expected. Here is the way the logic works:
A Policy is either new or has been updated in our Document Library, which acts as my trigger for Logic Apps
The condition then looks for the word "Policy" in the File Identifier
It then goes and creates a folder in another SharePoint list (Policy Acknowledgement), based on the filename/Policy Name
Then it sends an email to staff advising them that a new/updated policy is available to read, with a link that says "I agree", etc.
This then writes to the Policy Acknowledgement list that the user has read the document
So all steps seem to work, except that when I go to view the Policy Acknowledgement list, the new folder created for that Policy doesn't display in the list. However, if I type in the address to the list/folder, I can view it. Here are my steps, below:
Create the folder
Send the email
Acknowledge the Policy
The list where the new folder should appear, but doesn't (the other 3 folders were manually created for testing purposes)
Here is the actual folder, once the direct URL is typed in
Although the folder isn't listed in SP, it still appears in Logic Apps as an actual folder
This isn't an issue for me, as I can always type in the address manually to get to the Policy Folder. But this is meant to be for our HR team, who just want to be able to open the "Policy Acknowledgement" list, and view all the Policy Folders in there.
I am starting to look down the path of creating a PowerShell runbook that creates the folder for me, if that is the last resort (or should it have been my first resort?).
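If it does come to creating the folder programmatically, a minimal sketch of that idea in Python (rather than PowerShell), using the third-party office365-rest-python-client package; the site URL, credentials, and list path are placeholders, and the approach is untested against this particular list:

from office365.runtime.auth.user_credential import UserCredential
from office365.sharepoint.client_context import ClientContext

# All values below are placeholders for illustration.
SITE_URL = "https://contoso.sharepoint.com/sites/hr"
ctx = ClientContext(SITE_URL).with_credentials(
    UserCredential("user@contoso.com", "<password>")
)

def create_policy_folder(policy_name):
    """Create a folder named after the policy in the Policy Acknowledgement list."""
    folder = ctx.web.folders.add(f"Lists/Policy Acknowledgement/{policy_name}")
    ctx.execute_query()
    return folder

create_policy_folder("Travel Policy")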