gbXML file generation time - revit-api

I am trying to generate a gbXML file using the Solar-Computer plugin GBIS. However, it has only reached 15% after more than an hour. Is this a normal time for a gbXML file to generate or export? For context, the Revit model is a 6-storey building with around 270-280 MEP rooms.

Related

Azure cost analysis for a particular subscription using Python SDK

So I'm trying to automate fetching the current cost and cost forecast (as shown under Cost Analysis for a particular subscription) using the Python SDK, but I haven't been able to find a single API that does this yet.
I've tried using UsageAggregate and RateCard, but I haven't figured out a way to find the cost for the current month to date. If there is an API that I'm missing, or if I need to calculate monthly costs myself, I'd appreciate any code snippets or help.
If you already have the usage and the ratecard data, then you must combine them.
Take the meterId of the usage data and get the related ratecard data.
The ratecard data contains the MeterRates and the IncludedQuantity, which you need.
There may be multiple meter rates and an included quantity, because costs typically vary with usage (e.g. the first 10 calls free, the first 3 GB free, ...).
The consumption is reset on the 14th of the month. That is why you have to read the data for the whole billing period (which begins on the 14th of each month): it is the only way to get the correct consumption.
So, if you are using e.g. Azure Functions with a usage of 100,000 units per day and you want the costs from the 20th to the 30th, the calculation works as follows:
Read the data from the 14th to the 30th. That is 17 days, so 1,700,000 units were used. The first 400,000 units are free - the IncludedQuantity (so in this sample, the first 4 days).
From unit 400,001 on, you apply the meter rate (€0.0000134928 per unit) and calculate the cost: 1,300,000 × €0.0000134928 ≈ €17.54.
Fortunately, Azure Functions have only one rate. If the rate changed, e.g. after 5,000,000 units, you would have to take that into account too. Once you have the total cost, you can filter on your date range (20th-30th) to get the result.
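A minimal sketch of that arithmetic in Python; the function and parameter names are made up for illustration, and the tier thresholds in meter_rates are assumed to apply to the billable (post-IncludedQuantity) units:

def billable_cost(total_units, included_quantity, meter_rates):
    # meter_rates maps a tier-start threshold to a per-unit rate,
    # e.g. {0: 0.0000134928} for a single-rate meter.
    billable = max(0.0, total_units - included_quantity)  # free units first
    cost = 0.0
    tiers = sorted(meter_rates)
    for i, start in enumerate(tiers):
        end = tiers[i + 1] if i + 1 < len(tiers) else float("inf")
        units_in_tier = max(0.0, min(billable, end) - start)
        cost += units_in_tier * meter_rates[start]
    return cost

# The worked example above: 17 days x 100,000 units/day, 400,000 free units,
# a single rate of 0.0000134928 EUR per unit -> ~17.54.
print(billable_cost(17 * 100_000, 400_000, {0: 0.0000134928}))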
I implemented this calculation in C# and published it as a NuGet package here. It also contains a sample console application which you could use to export the data.
I know I am a bit late to the party, but after struggling with the same problem, I managed to write code that gets the cost of a resource group using
azure.mgmt.costmanagement
Link to cost management API
Code sample is in my answer here
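For reference, here is a rough sketch of a month-to-date query with that package (the model and field names follow the track-2 SDK as I understand it; verify them against the current azure-mgmt-costmanagement documentation):

from azure.identity import DefaultAzureCredential
from azure.mgmt.costmanagement import CostManagementClient
from azure.mgmt.costmanagement.models import (
    QueryAggregation, QueryDataset, QueryDefinition,
)

# The scope can be a subscription or a resource group within it.
scope = "/subscriptions/<subscription-id>/resourceGroups/<group-name>"

client = CostManagementClient(DefaultAzureCredential())
result = client.query.usage(
    scope=scope,
    parameters=QueryDefinition(
        type="ActualCost",
        timeframe="MonthToDate",
        dataset=QueryDataset(
            granularity="Daily",
            aggregation={"totalCost": QueryAggregation(name="Cost", function="Sum")},
        ),
    ),
)
for row in result.rows:  # each row holds [cost, usage date, currency]
    print(row)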

Proc Groovy to parse large XML into SAS

We tried reading a 3-4 GB XML file using SAS XML Mapper, but when we PROC COPY the data from the XML engine to a SAS dataset it takes almost 5 to 6 minutes, which is too long for us since we have to process 3000 files a day. We run almost 10 files in parallel, and one table has almost 230 columns.
Is there any faster way to process the XML?
Can we use PROC GROOVY? Would it be efficient? If so, can anyone provide a sample code?
I tried searching online but was not able to find one.
The XML contains PII data and is huge, about 3 GB.
The code being run is very simple and straightforward:
filename NHL "/path/ODM.xml";      /* source XML file */
filename map "/path/odm_map.map";  /* XML map built with SAS XML Mapper */
libname NHL xmlv2 xmlmap=map;      /* read the XML through the XMLV2 engine */
proc copy in=nhl out=work;         /* copy every mapped table into WORK */
run;
Total tables created: 54, of which more than 14 have ~18,000 records and the rest have ~1,000 records.
The log window shows:
NOTE: PROCEDURE COPY used (Total process time):
real time 4:03.72
user cpu time 4:00.68
system cpu time 1.17 seconds
memory 32842.37k
OS Memory 52888.00k
Timestamp 19/05/2020 03:14:43 PM
Step Count 4 Switch Count 802
Page Faults 3
Page Reclaims 17172
Page Swaps 0
Voluntary Context Switches 3662
Involuntary Context Switches 27536
Block Input Operations 504
Block Output Operations 56512
SAS version: 9.4 M2
Total memory on our server is MEMSIZE=3221225472 (3 GB).
There are 3000 files in total, of which 1000 are 3 to 4 GB, some are 1 GB, and 1000 are in the KB range. The smaller files are processed quickly; the problem is only with the big files. Processing uses almost the entire CPU.
The copy time from the XML engine varies when we reduce the number of files, but for that to happen we would have to change the map file or the input XML.
We have already opened SAS tracks and asked the same question in the SAS Communities, still no luck; it looks like a limitation of the parser itself.
Any idea about the shredder in Teradata? Would it be efficient?
I would do this in two pieces: first convert the XML to a plain-text format, then read that into SAS. SAS isn't going to be very fast at converting XML into datasets; it's just not something SAS is optimized for. You're using nearly entirely CPU time, so you're not disk-limited - you're limited by SAS's ability to parse the XML file.
Write a program in a more optimized language that can parse the XML much faster, and then read its output into SAS. Python might be one option - it's not super optimized either, but I suspect it's more optimized for this sort of thing than SAS - or an even lower-level language (like C/C++) might be your best bet.
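As a minimal sketch of that two-step approach in Python (the element and attribute names here are hypothetical; adjust them to the real ODM structure), a streaming parse keeps memory low even on 3-4 GB files and emits a CSV that SAS can then read with a simple DATA step or PROC IMPORT:

import csv
import xml.etree.ElementTree as ET

with open("odm_records.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["subject", "item", "value"])      # hypothetical columns
    # iterparse streams the document instead of loading it all at once
    for event, elem in ET.iterparse("ODM.xml", events=("end",)):
        if elem.tag.endswith("ItemData"):              # hypothetical element
            writer.writerow([elem.get("SubjectKey"),   # hypothetical attributes
                             elem.get("ItemOID"),
                             elem.get("Value")])
        elem.clear()  # drop each subtree once processed to keep memory flat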

OSGB - number of tiles per file

I have OSGB models created with several packages: Agisoft, Bentley, SkylineSoft, Pix4D. Currently each tile in the output folder is split into at least two files: one .osgb file and one or several texture files (.jpg).
I have a problem at deployment with the number of output files: large models can reach millions of files, and copying them to the target computer takes a long time. Is it possible, with the software above, to export to an OSGB format in which one file can contain several tiles/textures?
Thank you!

writing genetic algorithm with vba in microsoft project

Would it be possible to write VBA code for a genetic algorithm in MSP, as one can in Excel?
I want to optimize the durations of the project's tasks so as to minimize total cost. For this I need a genetic algorithm that changes the duration of each task randomly; if I can write this algorithm in VBA in Microsoft Project, MSP will level resources based on the new duration of each task, and from the new total duration of the project I can calculate the new cost.
I know there are VBA genetic-algorithm codes available for optimization in Excel; is it possible to write the same code in MSP's VBA?
If it is not, what about exporting the data from MSP to Excel, running the algorithm there, and then importing the results back into MSP?
Thanks a lot.
I want to optimize the durations of the project's tasks so as to minimize total cost. For this I need a genetic algorithm that changes the duration of each task randomly.
Yes, you can do this in MS Project using VBA (via Task object methods and properties). And you need to do it in MS Project, since you rely on Project's CPM engine and leveling capabilities.
Algorithm
1. Loop through all tasks and set each duration to a random number (see the sketch after this list). Skip summary tasks, as their Duration is read-only, and skip tasks that are complete; optionally skip tasks that have already started.
2. Level the resources.
3. Calculate the total cost.
4. Store the durations of all tasks in the Duration1 field and the total cost in the Cost1 field of the first task.
5. Repeat steps 1-4 nine times, storing the durations in Duration2-Duration10 and the total cost in Cost2-Cost10.
6. Determine the iteration that had the lowest cost and move those durations to the Duration1 field.
7. Repeat steps 5-6 until the cost is sufficiently optimized.
8. Move the durations from the Duration1 field back to the Duration field, skipping tasks per step 1.
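A rough sketch of steps 1-3 for a single iteration, driving MS Project over COM from Python (requires pywin32, an installed MS Project, and an open project; the VBA version uses the same object model, and the duration range here is an arbitrary placeholder):

import random
import win32com.client

app = win32com.client.Dispatch("MSProject.Application")
proj = app.ActiveProject

# Step 1: randomize durations (Duration is stored in minutes).
for task in proj.Tasks:
    if task is None:                   # blank task rows come back as Nothing/None
        continue
    if task.Summary or task.PercentComplete == 100:
        continue                       # read-only duration, or already complete
    task.Duration = random.randint(1, 10) * 480   # 1-10 eight-hour days

app.LevelNow()                         # step 2: level the resources

# Step 3: the project summary task rolls up the total cost.
print(proj.ProjectSummaryTask.Cost)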

Finding the time of testing data in Sphinx train

I am training data via pocketsphinx and sphinxtrain. We can see the training-data time in the log file; for example, my current training data is shown as:
Phase 5: Determine amount of training data, see if n_tied_states seems reasonable.
Estimated Total Hours Training: 1.00766111111111
After training, testing is done. For testing I have added 20 files, but I don't know the total length of these files, and finding it manually is a hard task since I am going to increase the testing data.
So is there a log file, or any other non-manual way, to check my testing-data time?
I just found it; I am posting my own answer so it may help others.
You can find it under logdir/decode/dbname-1-1.log, where dbname is the name of your main folder; in my case logdir/decode/tester-1-1.log.
Open this file and there will be a line:
INFO: batch.c(778): TOTAL 81.24 seconds speech, 30.43 seconds CPU, 37.54 seconds wall
Here TOTAL 81.24 seconds speech is the duration of my testing audio data.
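If you need this value programmatically, here is a small sketch that pulls it out of the decode log (the regex simply matches the log line shown above; adjust the path to your own dbname):

import re

with open("logdir/decode/tester-1-1.log") as log:
    for line in log:
        match = re.search(r"TOTAL\s+([\d.]+) seconds speech", line)
        if match:
            print(match.group(1), "seconds of testing speech")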
