Using the SonarQube search API with analyzedBefore set to today's date yields less data than the same API call without this field.
Without any date constraints:
GET /api/projects/search
"paging": {
    "pageIndex": 1,
    "pageSize": 100,
    "total": 49383
}
With today's date for analyzedBefore:
GET /api/projects/search?analyzedBefore=2019-08-31
"paging": {
    "pageIndex": 1,
    "pageSize": 100,
    "total": 30425
}
Why is there a count difference between the two queries? What am I missing?
I am currently doing some population analyses with the package "FSA" in R.
By using the mrOpen command, I want to get the survival rate.
My raw data is a simple table with one row per individual, one column per sample date, and values of 0 and 1 (for not captured or captured during that respective sampling).
id   total.captures   date1   date2   date3   etc
1    3                1       1       1       ...
2    1                1       0       0       ...
The first two columns contain the individual id and the aggregated number of captures, which is why I excluded them from the analysis.
This is the exact code:
library(FSA)

# Summarize the capture histories, using only the date columns (3 to 13)
hold.data <- capHistSum(data, cols2use = c(3:13))
# Fit the Jolly-Seber open-population model
est.data <- mrOpen(hold.data)
summary(est.data)
confint(est.data)
It seems to work out, as I get the tables and summaries with all the parameters. See here as an example:
(screenshot of the results)
However, there's a problem with the survival estimate phi.
The phi value is not between 0 and 1, but in some cases, exceeds 1.
Any idea what went wrong here?
Thanks,
Pia
I have a Power Query in Excel linked to another file. This file has a time column. I understand that the M language will not sum above 24 hours automatically without some work, as it uses a datetime reference; hence if I import a time of 25 hours, it wraps back around to 1 hour...
In the third column of my image below, using the second row as a reference, the value is actually supposed to read 47:47:38. How can I get the instances where the value is above 24 hours to show the true hours?
I have tried using duration.hours(#hours()), but this also does not work for some reason.
The same data from the source Excel file is also below.
Power Query doesn't have custom formats for how it displays data. If you have it read your data as a Duration instead of a DateTime, it will display in d.hh:mm:ss format, but still not with the total hours. Ultimately, though, this doesn't really matter, because even when your data is formatted to display total hours in Excel, it's really being stored internally as days + hours + minutes + seconds. So how it displays in Power Query doesn't matter, as you can just use the hour formatting wherever you output the data.
Now if you need to use the hours for a calculation between something that isn't another Duration, you can extract the hours by doing
Duration.Days([Your Hours]) * 24 + Duration.Hours([Your Hours])
Or, now that I look at it, there is also a TotalHours function that gives you the hours, with the minutes and seconds as the fractional part:
Duration.TotalHours([Your Hours])
Power BI doesn't handle this case very gracefully. A solution could be to convert the duration to a number to make it additive (so you can perform calculations and aggregations), and when you need to visualize it, to convert it to the desired format (HH:MM:SS).
Duration and Time are often confused. When such Excel files are read, the type of the column is usually DateTime, and the date 1899-12-31 is added to the "time" part. You can change the data type of the column to Decimal Number, but the "zero point" in Excel unfortunately is one day off (1899-12-30), so you need to subtract 1 from the result to get the actual "number of days" of the duration (i.e. 0.25 means 06:00:00).
So you must perform some conversion of the data. I would make a new column in the model to get the duration at the lowest granularity that I need (seconds in your example). In the Power Query Editor, add a custom column to calculate the duration in seconds (where Column1 is the name of the original duration column):
Duration in seconds = Duration.TotalSeconds([Column1] - #datetime(1899, 12, 31, 0, 0, 0))
Make sure the data type of this column is Whole Number (change it if necessary). Here 9144 seconds are calculated as 2 * 3600 + 32 * 60 + 24, or 02:32:24. Now you can calculate a sum on this column to get the total duration in seconds, for example. But when you visualize this column, don't do it directly; instead, make a measure to convert the data to the desired format. It could be made like this:
Measure Duration =
VAR duration_in_seconds = SUM(Sheet1[Duration in seconds])
VAR hours = ROUNDDOWN ( duration_in_seconds / 3600; 0 )
VAR minutes = ROUNDDOWN ( MOD ( duration_in_seconds; 3600 ) / 60; 0 )
VAR seconds = INT ( MOD ( duration_in_seconds; 60 ) )
RETURN hours & ":" & FORMAT(minutes; "00") & ":" & FORMAT(seconds; "00")
The duration_in_seconds variable holds the total duration in seconds of the data in the current context. From it we calculate hours, minutes, and seconds, and construct a string to represent the duration in the desired format. FORMAT is used to make sure there is a leading zero in case minutes or seconds are less than 10.
Here is how all three columns look when visualized:
Hope this helps!
Hi, I am wondering how to set the time to live in DynamoDB.
I understand that I have to create a field, which could be called ttl, and set the value at which the item is to be deleted, but in which format in Node.js do I have to save the value for the ttl field for 20 or 24 hours?
Thanks for the help.
From the official DynamoDB documentation:
TTL compares the current time in epoch time format to the time stored in the Time To Live attribute of an item. [...]
Note: The epoch time format is the number of seconds elapsed since 12:00:00 AM January 1st, 1970 UTC.
In JavaScript, you can get the number of milliseconds since the epoch with Date.now(). Once you have that, you can divide by 1000 to get seconds (rounding to the nearest integer value) and finally add the number of seconds in the TTL that you want.
This means that if you want to set the expiration time to 24 hours from now, you can easily set the TTL field with the value expirationTime calculated this way:
// Number of seconds in one hour
const SECONDS_IN_AN_HOUR = 60 * 60;
// Current time as whole seconds since the epoch
const secondsSinceEpoch = Math.round(Date.now() / 1000);
// Expire the item 24 hours from now
const expirationTime = secondsSinceEpoch + 24 * SECONDS_IN_AN_HOUR;
All data come from one table. The date is based on the creation_date field of the table. I would like to create a report similar to the example below:
     No. Prior to 10/01/2015   No. On 10/1/2015   Total No.
a    5                         1                  6
b    10                        3                  13
c    1                         0                  1
I could not figure out how to display the combined results on the same report page. I had to create two report pages: one for the "prior to" date and the other for the "on" date.
OK. If I understand your data correctly, do this.
In your report, create a query with these query items:
Type (a,b,c etc)
[No. Prior to 10/01/2015] with Aggregate Function = 'Total' and with expression
case when creation_date < to_date('10/01/2015', 'DD/MM/YYYY')
then 1 else 0 end
[No. On 10/1/2015] with Aggregate Function = 'Total' and expression
case when creation_date >= to_date('10/01/2015', 'DD/MM/YYYY')
then 1 else 0 end
[Total No.] with expression
[No. Prior to 10/01/2015] + [No. On 10/1/2015]
And create a simple table with these values.
Or you can try to do it with a crosstab.
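If it helps to see the logic outside of Cognos, here is a rough sketch of the same conditional aggregation in plain SQL (the table name mytable and column name type are assumptions, not from the original report):
-- One row per type, counting rows before and on/after the cutoff date.
select type,
       sum(case when creation_date < to_date('10/01/2015', 'DD/MM/YYYY')
                then 1 else 0 end) as no_prior,
       sum(case when creation_date >= to_date('10/01/2015', 'DD/MM/YYYY')
                then 1 else 0 end) as no_on,
       count(*) as total_no
from mytable
group by type
Setting the Aggregate Function of the query items to 'Total' makes Cognos perform the same summing.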
Using the start date and end date of PTO (Personal Time Off), Days Used should only count the days used up to the end of the prior month, excluding weekends and U.S. holidays in that month. An example of a holiday is September 7th, 2015 in the United States.
My goals are:
1. Create a Data Item "Month End Personal Time Off Days Used".
2. Of course, it should only get the number of PTO days used from the prior month.
3. Exclude weekends in that month. So if the resource takes leave on a Friday and the following Monday, Saturday and Sunday should not be included in the computation.
4. Exclude U.S. holidays. If this is possible that's great, but if it's not, I'm okay with numbers 1, 2 and 3.
I have created a Data Item column that captures the PTO days used, but this is only good for year to date.
case when [PTO Info].[PTO Audit].[PTOAuditTypeId] = 31571
          and [PTO Info].[PTO Audit].[TimeOffTypeId] = 31566
     then [PTO Info].[PTO Audit].[PTODays]
     when [PTO Info].[PTO Audit].[PTOAuditTypeId] = 31572
          and [PTO Info].[PTO Audit].[TimeOffTypeId] = 31566
          and [PTO Info].[PTO Audit].[PTODays] < 0
     then abs([PTO Info].[PTO Audit].[PTODays])
     else 0
end
I'm not sure if the query above can help.
A calendar table is really going to help you out here. Assuming it has one record per calendar date, you can use this table to note weekends, holidays, fiscal vs. calendar periods, and beginning/end of month dates: a number of things that can help simplify your date-based queries.
See this question here for an example on creating a calendar table.
The main point is to create a data set with 1 record per date, with information about each date including Month, Day of Week, Holiday status, etc.
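As a rough sketch (a hypothetical schema, with names chosen to mirror the query further down), such a table could look like this:
-- One row per calendar date; is_holiday is typically maintained by hand.
create table myCalendar (
    calendar_date   date primary key,  -- referred to as cal.date below
    calendar_period char(6),           -- e.g. '201508'
    day_of_week     int,               -- 1 = Sunday ... 7 = Saturday
    is_holiday      char(1)            -- 'Y' or 'N'
);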
Without a calendar table, you can use database functions to generate your set of dates on the fly.
Getting the Month number for a date can be done with
extract([Month], <date field goes here> )
Generating a list of dates from nothing (if you don't have a calendar table with 1 record per date to use) will be required, and how to do it varies depending on your source database. In Oracle I use a 'select ... from all_objects' type query to achieve this.
An example from Ask Tom:
select to_date(:start_date,'dd-mon-yyyy') + rownum -1
from all_objects
where rownum <=
to_date(:end_date,'dd-mon-yyyy')-to_date(:start_date,'dd-mon-yyyy')+1
For SQL Server, refer to this Stack Overflow question.
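For example, here is a minimal sketch of the recursive-CTE approach commonly used in SQL Server (the variable names and date range are placeholders):
-- Generate one row per date between @start_date and @end_date (T-SQL).
declare @start_date date = '2015-06-01';
declare @end_date   date = '2015-07-31';
with dates as (
    select @start_date as d
    union all
    select dateadd(day, 1, d) from dates where d < @end_date
)
select d from dates
option (maxrecursion 0);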
Once you have a data set with your calendar type information, you can join it to your query above:
join mycalendar cal on cal.date >= c.PTOStartDate
and cal.date <= c.PTOEndDate
Also note, _add_days is a Cognos function. When building your source queries, try to use native functions; in Oracle, for example, you can write 'c.PTOStartDate + a.PTODays'. Mixing Cognos functions with native functions will sometimes force parts of your queries to be processed locally on the Cognos server. Generally speaking, the more work that happens on the database, the faster your reports will run.
Once you have joined to the calendar data, you are going to have your records multiplied out so that you have 1 record per date. (You would not want to be doing any summary math on PTODays here, as it will be inflated.)
Now you can add clauses to track your rules.
where cal.Day_Of_Week between 2 and 6
and cal.Is_Holiday = 'N'
Now if you are pulling a specific month, you can add that to the criteria:
and cal.CalendarPeriod = '201508'
Or if you are covering a longer period, but wanting to report a summary per month, you can group by month.
Final query could look something like this:
select c.UserID, cal.CalendarPeriod, count(*) PTO_Days
from dbo.PTOCalendar c
join myCalendar cal on cal.date >= c.PTOStartDate
and cal.date <= c.PTOEndDate
where cal.day_of_week between 2 and 6
and cal.Is_Holiday = 'N'
group by c.UserID, cal.CalendarPeriod
So if the employee with UserID 1234 took a 7-day vacation from Thursday, June 25th to Friday, July 3rd (covering 9 calendar days), the result you get here will be:
1234 201506 4
1234 201507 3
You can join these results to your final query above to track days off per month.