NDepend: Get average LOC per method - cql

Let's say I have a specific method set in my solution.
How can I get the average number of lines of code per method in that set?
Those numbers are usually shown in the statistics section of each NDepend report (Sum, Average, Minimum, etc.), but I want to be able to write queries for such numbers separately.

A CQLinq query could look like the following:
let totalLinesSum = JustMyCode.Methods.Where(t => t.IsPublic).Sum(t => t.NbLinesOfCode)
let methodsCount = JustMyCode.Methods.Where(t => t.IsPublic).Count()
let result = (double)totalLinesSum / methodsCount
select (double?)result
...or, a bit more refined, the query can be refactored like this:
// Define your methods set the way you need
// It is worth removing abstract methods, which have no LoC
let methodsSet = JustMyCode.Methods.Where(m => m.IsPublic && !m.IsAbstract)
let totalLoc = methodsSet.Sum(t => t.NbLinesOfCode)
let methodsCount = methodsSet.Count()
let avgLoc = (double)totalLoc / methodsCount
select (double?)avgLoc

Related

Copy Excel cell value and add rows to another table

In a table (in Excel), in a column, I have some number (A).
I want the flow to take that number (A) and create a number of rows equal to number (A).
For example, if number (A) is 4, then 4 rows should be added to another table.
I've made an assumption on the source and destination tables. This concept can be adjusted and applied to suit your own scenario.
I'd be using Office Scripts to do this. If you've never used it then feel free to consult the Microsoft documentation to get you going ...
https://learn.microsoft.com/en-us/office/dev/scripts/tutorials/excel-tutorial
This is the script you need to create (change the names of your tables accordingly) ...
function main(workbook: ExcelScript.Workbook)
{
    var addRowsTable = workbook.getTable('TableRowsToAdd');
    var addRowsToTable = workbook.getTable('TableAddRowsToTable');
    var addRowsTableDataRange = addRowsTable.getRangeBetweenHeaderAndTotal();
    var addRowsTableDataRangeValues = addRowsTableDataRange.getValues();

    // Sum the values so we can determine how many more rows need to be added
    // to the destination table.
    var sumOfAllRowsToBeInExistence = 0;
    for (var i = 0; i < addRowsTableDataRangeValues.length; i++) {
        if (!isNaN(addRowsTableDataRangeValues[i][0])) {
            sumOfAllRowsToBeInExistence += Number(addRowsTableDataRangeValues[i][0]);
        }
    }

    var currentRowCount = addRowsToTable.getRangeBetweenHeaderAndTotal().getRowCount();
    var rowsToAdd = sumOfAllRowsToBeInExistence - currentRowCount;

    console.log(`Current row count = ${currentRowCount}`);
    console.log(`Rows to add = ${rowsToAdd}`);

    if (rowsToAdd > 0) {
        /*
            The approach below is contentious given the performance impact but this approach ...

            for (var i = 1; i <= rowsToAdd; i++) {

            ... didn't always yield the correct result. May be a bug but needs investigation.
            Ultimately, there are a few ways to achieve the same result, like using the resize method.
            This was the easiest option for a StackOverflow answer.
        */
        while (addRowsToTable.getRangeBetweenHeaderAndTotal().getRowCount() <
               sumOfAllRowsToBeInExistence) {
            addRowsToTable.addRows();
        }
    }
}
You can then call that from Power Automate using the Run script action under Excel Online (Business) ...
You can use that approach, or all of the actions that are available in Power Automate, which will achieve the same sort of thing.
IMO, using Office Scripts is much easier. Creating a large flow can be a real pain in the backside to deal with, given there'll be a whole heap of actions that you'll need to throw in to reach the same outcome.
I would pass the number of rows to add into an Office Scripts script as a parameter. Once you have the value, create a JSON string of a 2D array. You want to create a loop using the number of rows to add; in the loop, you keep concatenating the 2D array. Once you've exited the loop, parse the JSON string and add the 2D array to the table. You can see how your code might look below:
function main(workbook: ExcelScript.Workbook, rowsToAdd: number)
{
    // set table name
    let tbl = workbook.getTable("table2")
    // initialize json string with open bracket
    let jsonArrString = "["
    // set the temp json string with a 2d array
    let tempJsonArr = '["",""],'
    // concatenate json string equal to the number of rows to add
    for (let i = 0; i < rowsToAdd; i++) {
        jsonArrString += tempJsonArr
    }
    // remove extra comma from JSON string
    jsonArrString = jsonArrString.slice(0, jsonArrString.length - 1)
    // add closing bracket to JSON string
    jsonArrString += "]"
    // parse json string into array
    let jsonArr: string[][] = JSON.parse(jsonArrString)
    // add array to table to add the number of rows
    tbl.addRows(null, jsonArr)
}
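As an aside (not part of the original answer), if you'd rather avoid the JSON string round-trip, you could build the same blank rows directly as a 2D array. A minimal sketch, assuming the same hypothetical table name "table2" and a two-column table:
function main(workbook: ExcelScript.Workbook, rowsToAdd: number)
{
    // The table name and the two-column row shape are assumptions; adjust for your workbook.
    let tbl = workbook.getTable("table2")
    // Build rowsToAdd blank rows directly, one inner array per new table row.
    let blankRows: string[][] = Array.from({ length: rowsToAdd }, () => ["", ""])
    // Append all the rows in a single call, as in the script above.
    tbl.addRows(null, blankRows)
}
Either version should produce the same result; skipping the JSON.parse step just removes one intermediate string.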

M (PowerQuery), set the value of a non-primitive variable in a let statement

I'm writing a custom M Language (PowerQuery in Excel) function to query a RESTful interface. This interface has a large number of optional parameters.
Starting with a simple case- I handle an optional limit passed as a simple (primitive) value as follows-
/*
* RESTful API Get from the named endpoint
*/
(endpoint as text, optional limit) =>
let
// query limit
// If limit is supplied as a number, it will be converted to text
// If limit is not supplied it will be set to the value "1000"
limit = if limit <> null then Text.From(limit) else "1000",
As the full API has many parameters, I wanted to use a record to pass them to the function, but then I realised I don't know how to persuade M to write the default values into the parameter record.
I tried a couple of options.
Direct access-
(endpoint as text, optional params as record) =>
let
params[limit] = if (params[limit] = null) then "1000",
The result is a syntax error: 'Token equal expected'.
Merging the new value of limit as a Record with "&"
(endpoint as text, optional params as record) =>
let
params = params & if params[limit] = null then [limit = "1000"] else [],
The result is a syntax error: 'Token Literal expected'.
I'm clearly missing something about the syntax rules for let statements. I know I need a variable = value assignment, and it looks as if putting anything other than a plain variable name on the LHS (to write elements inside a structured value) is not allowed, but I'm not sure how to achieve this otherwise.
Not sure exactly what you want here, but to build a parameter record where some fields fall back to defaults and others are supplied by the caller, you could try something like:
(newParams as record) =>
let
    default = [limit=1000, param2=2, param3=3],
    final = Record.Combine({default, newParams})
in
    final
With regard to Record.Combine, the beauty is that fields in the right-hand record override those in the left-hand record when both contain the field; fields that exist in only one record are simply added.
So something like:
let
    Source = [limit=400, param3="x", param7=246],
    conv = fnParams(Source)
in
    conv
=> [limit = 400, param2 = 2, param3 = "x", param7 = 246]
Depending on the required format of your output string, you can build it using List.Accumulate, e.g.:
let
    Source = [limit=400, param3="x", param7=246],
    conv = fnParams(Source),
    list = List.Accumulate(List.Zip({Record.FieldNames(conv), Record.ToList(conv)}), "",
        (state, current) => state & "&" & current{0} & "=" & Text.From(current{1}))
in
    list
=> &limit=400&param2=2&param3=x&param7=246

firebase Starting point was already set

I use the Firebase Admin SDK and the Realtime Database on Node.js.
My data looks like this (each child under qr/ has a batch field):
When I want to get the data where batch = "batch-7", I was doing this:
let batch = "batch-7";
let ref = admin.database().ref('qr/');
ref.orderByChild("batch").equalTo(batch).on('value', (snapshot) =>
{
res.json(Object.assign({}, snapshot.val()));
ref.off();
});
All was OK!
But now I need to implement pagination, i.e. I should receive the data 10 elements at a time, depending on the page.
I use this code:
let page = req.query.page;    // page number
let batch = req.params.batch; // batch name
let ref = admin.database().ref('qr/');
ref.orderByChild("batch").startAt(+page*10).limitToFirst(10).equalTo(batch)
    .on('value', (snapshot) => {
        res.json(Object.assign({}, snapshot.val()));
        ref.off();
    });
But I get this error:
Query.equalTo: Starting point was already set (by another call to startAt or equalTo)
How do I get N items, starting at position M, where batch equals my batch?
You can only call one startAt (and/or endAt) OR equalTo. Calling both is not possible, nor does it make a lot of sense.
You seem to have a general misunderstanding of how startAt works, though, as you're passing in an offset. Firebase queries are not offset-based; they work purely on values, starting from what is often referred to as an anchor node.
So when you want to get the data for a second page, and you order by batch, you need to pass in the value of batch for the anchor node: the first item that you want to be returned. This anchor node is typically the last item of the previous page, since you don't know the first item of the next page yet. And for this anchor node you need to know the value of the item you order on (batch) and usually also its key (if/when there may be multiple nodes with the same value for batch).
It also means that you usually request one item more than you need, which is the anchor node.
So when you request the first page, you should track the key/batch of the last node:
var lastKey, lastValue;
ref.orderByChild("batch").equalTo(batch).limitToFirst(10).on('value', (snapshot) => {
    snapshot.forEach((child) => {
        lastKey = child.key;
        lastValue = child.child('batch').val();
    });
});
Then when you need the second page, you do a query like this:
ref.orderByChild("batch").start(lastValue, lastKey).endAt(lastValue+"\uf8ff").limitToFirst(11).on('value', (snapshot) => {
snapshot.forEach((child) => {
lastKey = child.key;
lastValue = child.child('batch').value();
})
})
There's one more trick in the queries above: I use startAt instead of equalTo so that pagination works, and then use endAt to ensure we still end at the correct item, by using the last known Unicode character as the last batch value to return.
I'd also highly recommend checking out some of the previous questions on pagination with the Firebase Realtime Database.
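Not part of the original answer, but to make the flow concrete, here is a hedged sketch of how the two queries above could be combined into one helper that takes a cursor (the lastValue/lastKey pair from the previous page) instead of a page number. The function name and parameters are illustrative only:
import * as admin from 'firebase-admin';

// Sketch only: assumes the qr/ structure from the question and that the caller keeps
// the lastValue/lastKey cursor from the previous page (e.g. passes it back as query params).
async function getBatchPage(batch: string, pageSize: number, lastValue?: string, lastKey?: string) {
    const ref = admin.database().ref('qr/');
    let query = ref.orderByChild('batch');
    if (lastValue && lastKey) {
        // Anchor on the last item of the previous page and request one extra item,
        // because the anchor itself comes back as the first result.
        query = query.startAt(lastValue, lastKey).endAt(lastValue + '\uf8ff').limitToFirst(pageSize + 1);
    } else {
        // First page: a plain equality filter is enough.
        query = query.equalTo(batch).limitToFirst(pageSize);
    }
    const snapshot = await query.once('value');
    const items: { key: string, batch: string }[] = [];
    snapshot.forEach((child) => {
        items.push({ key: child.key as string, batch: child.child('batch').val() });
    });
    // Drop the anchor node on subsequent pages so it isn't returned twice.
    return (lastValue && lastKey) ? items.slice(1) : items;
}
This keeps the anchor-node logic in one place and swaps on()/off() for a one-time once() read, which is usually what you want inside an HTTP handler.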

How to maintain counters with LinqToObjects?

I have the following C# code:
private XElement BuildXmlBlob(string id, Part part, out int counter)
{
    // return some unique xml particular to the parameters passed
    // remember to increment the counter also before returning.
}
Which is called by:
var counter = 0;
result.AddRange(from rec in listOfRecordings
                from par in rec.Parts
                let id = GetId("mods", rec.CKey + par.UniqueId)
                select BuildXmlBlob(id, par, counter));
Above code samples are symbolic of what I am trying to achieve.
According to Eric Lippert, the out keyword and LINQ do not mix. OK, fair enough, but can someone help me refactor the above so it does work? A colleague at work mentioned accumulator and aggregate functions, but I am a novice to LINQ and my Google searches weren't bearing any real fruit, so I thought I would ask here :).
To Clarify:
I am counting the number of parts I might have, which could be any number each time the code is called. So every time the BuildXmlBlob() method is called, the resulting XML will have a unique element in it denoting the 'partNumber'.
So if the counter is currently on 7, that means we are processing the 7th part so far! That means the XML returned from BuildXmlBlob() will have the counter value embedded in there somewhere. That's why I need it to somehow be passed in and incremented every time BuildXmlBlob() is called per run through.
If you want to keep this purely in LINQ and you need to maintain a running count for use within your queries, the cleanest way to do so would be to make use of the Select() overload that includes the element's index, so you can get the current index in the query.
In this case, it would be cleaner to do a query which collects the inputs first, then use the overload to do the projection.
var inputs =
    from recording in listOfRecordings
    from part in recording.Parts
    select new
    {
        Id = GetId("mods", recording.CKey + part.UniqueId),
        Part = part,
    };
result.AddRange(inputs.Select((x, i) => BuildXmlBlob(x.Id, x.Part, i)));
Then you wouldn't need to use the out/ref parameter.
XElement BuildXmlBlob(string id, Part part, int counter)
{
    // implementation
}
Below is what I managed to figure out on my own:
result.AddRange(listOfRecordings.SelectMany(rec => rec.Parts, (rec, par) => new { rec, par })
    .Select(@t => new
    {
        @t,
        Id = GetStructMapItemId("mods", @t.rec.CKey + @t.par.UniqueId)
    })
    .Select((@t, i) => BuildPartsDmdSec(@t.Id, @t.@t.par, i)));
I used ReSharper to convert it into a method chain, which constructed the basics of what I needed, and then I simply tacked the Select statement on right at the end.

What is wrong in this LINQ Query, getting compile error

I have a list AllIDs:
List<IAddress> AllIDs = new List<IAddress>();
I want to do a substring operation on the member field AddressId based on the character "_".
I am using the LINQ query below but getting a compilation error:
AllIDs= AllIDs.Where(s => s.AddressId.Length >= s.AddressId.IndexOf("_"))
.Select(s => s.AddressId.Substring(s.AddressId.IndexOf("_")))
.ToList();
Error:
Cannot implicitly convert type 'System.Collections.Generic.List<string>' to 'System.Collections.Generic.List<MyCompany.Common.Users.IAddress>'
AllIDs is a list of IAddress but you are selecting a string. The compiler is complaining it cannot convert a List<string> to a List<IAddress>. Did you mean the following instead?
var substrings = AllIDs.Where(...).Select(...).ToList();
If you want to put them back into Address objects (assuming you have an Address class in addition to your IAddress interface), you can do something like this (assuming the constructor for Address is in place):
AllIDs = AllIDs.Where(...).Select(s => (IAddress)new Address(s.AddressId.Substring(s.AddressId.IndexOf("_")))).ToList();
You should also look at using query syntax for LINQ instead of method syntax; it can clean up and improve the readability of a lot of queries like this. Your original (unmodified) query is roughly equivalent to this:
var substrings = from a in AllIDs
                 let id = a.AddressId
                 let idx = id.IndexOf("_")
                 where id.Length >= idx
                 select id.Substring(idx);
Though this is really just a style thing, and it compiles to the same thing as the original. One slight difference is that you only have to call String.IndexOf() once per entry, instead of twice per entry. let is your friend.
Maybe this?
var boundable =
    from s in AllIDs
    where s.AddressId.Length >= s.AddressId.IndexOf("_")
    select new { AddressId = s.AddressId.Substring(s.AddressId.IndexOf("_")) };
boundable = boundable.ToList();
