I have set up a scheduled export of my Azure resource group billing invoice on a monthly basis. The generated invoice contains the billing details for the last month, and the .CSV file is stored in my storage account as a blob every month.
I'm using an Azure Logic App to retrieve the invoice file and send it via email to a group of recipients.
The invoice is a .CSV file with a number of columns, such as "InstanceID, MeterID, UsageQuantity, ResourceLocation". But I need to get the TOTAL COST for the billing period.
Any idea how I can achieve this? Is there a specific column that I need to include in my CSV file, or do I need to do some sort of data processing of the CSV file to get the total amount for the resources consumed?
Any advice on this?
Thanks!
1. I created a CSV file (named billing.csv) as below and uploaded it to blob storage.
InstanceID, MeterID, UsageQuantity, ResourceLocation, Pre tax cost
1,1,2,aa,10
2,2,3,bb,20
3,3,5,cc,30
2. In the logic app, use the "Get blob content" action to get the CSV file.
3. Search for the "Parse CSV" action in your logic app.
4. The "Parse CSV" action will ask you to input "API Key", you need to go to this page first --> click "Start free trial", register an account and create a new API Key.
Copy the secret and paste it to your logic app as "API key", it will allow you to connect Plumsail.
5. Then pass the blob content into the "Parse CSV" action and enter the headers InstanceID, MeterID, UsageQuantity, ResourceLocation, Pre tax cost. Add the "Skip first line" parameter and set its value to Yes.
6. Initialize an integer variable sum with the value 0. Initialize another variable, tempItem, also set to 0.
7. Use a "For each" loop.
The Body comes from the "Parse CSV" action, and the expression for "value" is: add(variables('tempItem'), int(items('For_each')?['Pre tax cost'])). (A standalone sketch of the same aggregation in code appears after the note below.)
8. After running the logic app, we can see the sum in the last loop iteration.
9. Here is the whole logic app for your reference:
Important:
This solution uses the third-party connector "Plumsail Documents", and I'm not sure if it is free. I registered an account in the past and it worked without any cost, but today that API key no longer works, so I had to register another account and create another API key. I therefore think this third-party connector costs extra if you want to use it long term.
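If you would rather not depend on the third-party connector, the aggregation itself is small enough to do in code, for example in a small Azure Function called from the logic app. Below is a minimal TypeScript sketch, assuming the billing.csv layout from step 1; the literal string stands in for the blob content you would actually pass in.

// Sum the "Pre tax cost" column of the billing CSV.
// The literal below mirrors the billing.csv example; in practice you
// would receive the blob content from the "Get blob content" action.
const csv = `InstanceID, MeterID, UsageQuantity, ResourceLocation, Pre tax cost
1,1,2,aa,10
2,2,3,bb,20
3,3,5,cc,30`;

const [headerLine, ...rows] = csv.trim().split("\n");
const headers = headerLine.split(",").map((h) => h.trim());
const costIndex = headers.indexOf("Pre tax cost");

const total = rows
  .map((row) => Number(row.split(",")[costIndex]))
  .reduce((sum, cost) => sum + cost, 0);

console.log(`Total cost for the billing period: ${total}`); // prints 60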
I have the below API:
http://example.com/?mdule=API&method=Live.getLastVisitsDetails&filter_limit=2000&filter_offset=0
I have to extract more data by increasing the offset by 2000:
http://example.com/?mdule=API&method=Live.getLastVisitsDetails&filter_limit=2000&filter_offset=2000
http://example.com/?mdule=API&method=Live.getLastVisitsDetails&filter_limit=2000&filter_offset=4000
http://example.com/?mdule=API&method=Live.getLastVisitsDetails&filter_limit=2000&filter_offset=6000
I have a maximum of 6000 records. I don't know how to pass the offset value in steps of 2000 in Data Factory.
I am able to extract the data individually with the above links, but I want to do it automatically for all 6000 records.
Can anyone show me some documentation or advise me on how to execute this in Data Factory?
I looked at the pagination documentation, but had no success.
Extracting data from a REST API using the copy activity
Step 1: Create a new pipeline and add a Copy Data activity.
Step 2: Configure the source of the copy activity, adding a pagination rule configured as below.
Follow this sample URL and also make sure ={offset} is at the end of the URL.
For the pagination rule, select either option 1 or option 2. In my case I selected Range:0:8:2.
In your scenario you can use the range as below:
Option 1: QueryParameter.{offset}: Range:0:6000:2000
Option 2: AbsoluteUrl.{offset}: Range:0:6000:2000
The Range option uses 0 as the start value, 6000 as the max value, and increases by 2000 each time.
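To make the Range rule concrete: Data Factory substitutes each value in the range into {offset} and issues one request per value. Here is a rough TypeScript sketch of the equivalent loop; the URL is taken verbatim from the question, and the response is assumed to be a JSON array.

// Equivalent of pagination rule Range:0:6000:2000: request offsets
// 0, 2000, 4000 and 6000, collecting the records from each page.
const baseUrl =
  "http://example.com/?mdule=API&method=Live.getLastVisitsDetails&filter_limit=2000";

async function fetchAllPages(): Promise<unknown[]> {
  const records: unknown[] = [];
  for (let offset = 0; offset <= 6000; offset += 2000) {
    const response = await fetch(`${baseUrl}&filter_offset=${offset}`);
    records.push(...((await response.json()) as unknown[]));
  }
  // With 6000 records in total, the offset 6000 page comes back empty.
  return records;
}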
For more detailed information, refer to the official article.
Here's the scenario:
If the transaction type is part of the Order to Cash process (for example, a sales order or customer invoice) and the account type is Revenue or COGS, a product code and region code should be required when adding an item line in the sublist.
The account type is based on the setup of the item.
The rule should apply both in the UI and in CSV uploads, across the Order to Cash process.
At a minimum, you will need a user event script that runs on the sales order, invoice, and any other record you need validation on. On the script deployment record, select CSV Import and User Interface in the Context Filtering tab.
For a better user experience, create a client script that calls the same validation when the user saves the record.
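Here is a rough sketch of what that user event script could look like in SuiteScript 2.1, written in TypeScript (assuming community typings such as @hitc/netsuite-types and an AMD-targeted build). The field IDs custcol_product_code, custcol_region_code, and custitem_account_type are hypothetical placeholders for your own fields; the item field is assumed to be free-form text holding the account type.

/**
 * @NApiVersion 2.1
 * @NScriptType UserEventScript
 */
// Deploy on Sales Order, Invoice, and the other Order to Cash record types,
// with CSV Import and User Interface selected under Context Filtering.
import { EntryPoints } from "N/types";
import * as search from "N/search";
import * as error from "N/error";

export function beforeSubmit(ctx: EntryPoints.UserEvent.beforeSubmitContext): void {
  const rec = ctx.newRecord;
  const lineCount = rec.getLineCount({ sublistId: "item" });

  for (let line = 0; line < lineCount; line++) {
    const itemId = rec.getSublistValue({ sublistId: "item", fieldId: "item", line }) as string;

    // Hypothetical item field that mirrors the account type configured on the item.
    const lookup = search.lookupFields({
      type: "item",
      id: itemId,
      columns: ["custitem_account_type"],
    });
    const accountType = String(lookup.custitem_account_type);

    if (accountType === "Revenue" || accountType === "COGS") {
      const product = rec.getSublistValue({ sublistId: "item", fieldId: "custcol_product_code", line });
      const region = rec.getSublistValue({ sublistId: "item", fieldId: "custcol_region_code", line });
      if (!product || !region) {
        throw error.create({
          name: "MISSING_PRODUCT_OR_REGION_CODE",
          message: `Line ${line + 1}: Product code and Region code are required when the account type is ${accountType}.`,
        });
      }
    }
  }
}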
I want to automatically get the billing invoice details of my Azure resource group on a monthly basis using the Scheduled Export feature in Azure Cost Management.
I'm able to create a monthly export. However, I'm not able to get the entire month's invoice details, e.g. Oct 2020, Nov 2020, Dec 2020.
Instead, my invoice is getting exported in a different format.
This isn't useful to me, because I need to get the billing and invoice details for the entire month, and the invoice should automatically be exported monthly.
E.g. Oct 2020, then Nov 2020, then Dec 2020, and so on.
Any advice on how to achieve this?
Thanks in advance!
As you mentioned, scheduled exports don't support invoice-wise data export. However, here is what you can do:
Use Azure Cost Management Query API to generate the usage for the invoice period using the custom timeframe value.
Convert the JSON response to CSV.
Export the file to Storage Account.
Currently, the Query API supports grouping by up to 10 dimensions. If you need the complete dataset, you can use the Usage List API instead of the Query API. In the Usage List API, you can specify the start and end date (as per your invoice) using the $filter parameter. The rest of the process is the same.
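A condensed TypeScript sketch of those three steps, to make the flow concrete. The scope, dates, and grouping dimension are placeholders, and the request shape follows the Cost Management Query API (api-version 2021-10-01), so verify the property names against the current docs; the blob upload (step 3) is only indicated in a comment.

// Sketch: query one invoice period with a custom timeframe (step 1)
// and flatten the JSON response to CSV (step 2).
const scope = "subscriptions/<subscription-id>/resourceGroups/<rg-name>";
const url = `https://management.azure.com/${scope}/providers/Microsoft.CostManagement/query?api-version=2021-10-01`;

async function exportInvoicePeriod(token: string, from: string, to: string): Promise<string> {
  const response = await fetch(url, {
    method: "POST",
    headers: { Authorization: `Bearer ${token}`, "Content-Type": "application/json" },
    body: JSON.stringify({
      type: "ActualCost",
      timeframe: "Custom",
      timePeriod: { from, to }, // e.g. "2020-10-01" and "2020-10-31"
      dataset: {
        granularity: "None",
        aggregation: { totalCost: { name: "PreTaxCost", function: "Sum" } },
        grouping: [{ type: "Dimension", name: "ResourceGroupName" }],
      },
    }),
  });
  const { properties } = await response.json();

  // The response holds parallel columns/rows arrays; join them into CSV.
  const header = properties.columns.map((c: { name: string }) => c.name).join(",");
  const lines = properties.rows.map((row: unknown[]) => row.join(","));
  return [header, ...lines].join("\n");
  // Step 3: upload the returned CSV to the storage account, e.g. with
  // @azure/storage-blob's BlockBlobClient.upload(csv, csv.length).
}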
I created a report with a StartDate and EndDate parameter. If I want to see the information for a single day, I use the same date in both parameters. I now want to create a subscription for this report so that it runs every day. How can I use the current date and pass it to these parameters when the report runs? Thanks!
Step 1: You will have your defaultDates dataset, as below. It can be a query, or you can wrap it as a stored proc.
Select TodaysDate = cast(getdate() as date)
Step 2: Then, under the default values for both parameters, choose to get the value from a dataset and point to this dataset, defaultDates.
Step 3: Test it locally. Make sure to delete the .DATA file from your working directory to force fresh data.
Step 4: build and deploy to whatever test location.
EDIT: This will only work with Enterprise edition.
First, write a query that gets the current date and formats it to match the VALUE in your parameter (for example, is it DD-MM-YYYY or YYYY-MM-DD?). Make sure to name your column something meaningful, like "CurrentDate".
select cast(current_timestamp as date) as CurrentDate
Then create a new subscription for your report. Instead of a standard subscription, choose a data-driven subscription. Now select your SQL data source and paste in your query. Press Validate to make sure it runs fine. Hit OK.
Now go down to the subscription parameters at the very bottom of the page. Set "Source of Value" to "Get value from dataset", then pick "CurrentDate" from the drop-down.
That's it: a data-driven subscription with the current date.
Here is what we are trying to do:
We want to track our vendors' drop-ship inventory, which we do not own, in NetSuite, and be able to use the inventory feed from NetSuite to feed our eCommerce channels for both our owned and unowned inventory.
Here is what we've tried:
1. Enabled multi-location inventory
2. Created a "Drop Ship" location
3. Did an inventory adjustment to allocate inventory to that location
Here is the main issue we are facing:
The inventory should not show up on our balance sheet as we do not actually own it. In the scenario above, our inventory feed works as it should. However, the inventory shows up on our balance sheet.
Any assistance on how to get this to work without having the inventory shown on our balance sheet would be very much appreciated.
You could set the cost to zero on the item record to reduce the balance sheet value to zero. To correctly show COGS on your P&L, you would then need to add a cost on POs for each drop-ship item.
To partially automate the PO process, create a custom column field that stores the cost. When a PO is created, or on a schedule, use a saved search email notification that includes an import-ready CSV with the internal ID, line ID, and the custom cost field, plus a link to a saved CSV import in the body of the email. When you get the notification, click the link and import the attached file with the correct item cost. (A scripted variant is sketched below.)
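As a rough illustration of a more scripted route (instead of the saved-search notification), here is a scheduled SuiteScript 2.1 sketch in TypeScript that builds the import-ready CSV and emails it. The custcol_dropship_cost field, the author ID, and the recipient address are hypothetical placeholders, and the search filters would need to match your drop-ship PO criteria.

/**
 * @NApiVersion 2.1
 * @NScriptType ScheduledScript
 */
import * as search from "N/search";
import * as email from "N/email";
import * as file from "N/file";

export function execute(): void {
  const rows: string[] = ["Internal ID,Line ID,Item Cost"];

  // Collect PO lines along with the hypothetical custom cost column.
  search
    .create({
      type: search.Type.PURCHASE_ORDER,
      filters: [["mainline", "is", "F"]],
      columns: ["internalid", "line", "custcol_dropship_cost"],
    })
    .run()
    .each((result) => {
      rows.push(
        [
          result.getValue("internalid"),
          result.getValue("line"),
          result.getValue("custcol_dropship_cost"),
        ].join(",")
      );
      return true; // keep iterating
    });

  const csv = file.create({
    name: "dropship_po_costs.csv",
    fileType: file.Type.CSV,
    contents: rows.join("\n"),
  });

  email.send({
    author: 1234, // internal ID of the sending employee (placeholder)
    recipients: ["purchasing@example.com"],
    subject: "Import-ready drop-ship PO cost CSV",
    body: "Attached CSV is ready for the saved CSV import.",
    attachments: [csv],
  });
}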