How much time before the repository is refreshed in the GitHub API? - github-api

I pushed a new repository, let's say named test, to the servoz organisation.
As a first use of the API, I want a global view of this repository and I type in a shell:
% curl https://api.github.com/repos/servoz/test # linux OS
In the response I see:
"size": 0,
So I deduce that the repository is not yet updated via the API, right?
Is it updated whenever possible (:-), or is there a rule (after 1 min, 5 mins, etc.)? Is there another way to do this?

I tracked the delay between a push to GitHub and the update of the data via the GitHub API.
I observed that the update takes place exactly one hour after the push.
I didn't make any other pushes in the interval between the push and push+1h. What would the result be if other pushes were made in that interval?
I will run more tests later, but the answer to my first question seems to be: 1 hour.
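If you want to measure the delay yourself, a small shell loop along these lines can poll the endpoint and print the reported size until it changes; jq is assumed to be available for parsing the JSON, and servoz/test is just the example repository from the question:

# Poll the repository metadata until the reported size becomes non-zero.
# Assumes jq is installed; replace servoz/test with your own owner/repo.
# Polling every 2 minutes stays well under the unauthenticated rate limit (60 requests/hour).
while true; do
  size=$(curl -s https://api.github.com/repos/servoz/test | jq '.size')
  echo "$(date -u +%H:%M:%SZ) size=$size"
  [ "$size" != "0" ] && break
  sleep 120
done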

Related

How to get all commits in a repository along with the corresponding branch name?

I am faced with the problem of getting all commits in a repository along with the branch name as well as the commit ID. While there is an endpoint that lists all commits (https://developer.atlassian.com/bitbucket/api/2/reference/resource/repositories/%7Bworkspace%7D/%7Brepo_slug%7D/commits), it does not return the branch name along with the commit ID. If I call the branches endpoint /2.0/repositories/{workspace}/{repo_slug}/refs/branches/{name}, I get only the latest commit, not all commits in the branch. To do any kind of mapping I would need to call each branch and then loop again to call each commit within that branch, which causes the code to fail because I exceed the allowed number of requests. I need some solutions to tackle this problem.
I am writing a Python script that calls these two API endpoints in two loops and generates a list of lists from the results.
You can use the file history endpoint provided by Bitbucket.
And the branch name is master by default, unless you change it in your properties file.
https://developer.atlassian.com/bitbucket/api/2/reference/resource/repositories/%7Bworkspace%7D/%7Brepo_slug%7D/filehistory/%7Bcommit%7D/%7Bpath%7D
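For reference, a call to that endpoint could look like the sketch below; the credentials, workspace, repository, commit hash and file path are all placeholders, and pagelen simply asks for larger pages so fewer requests count against the rate limit:

# All identifiers below are placeholders; adjust them to your own workspace, repo and file.
curl -s -u myuser:app_password \
  "https://api.bitbucket.org/2.0/repositories/myworkspace/myrepo/filehistory/abc1234def/src/main.py?pagelen=100"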

Filter recent files in Logic Apps' SFTP when files are added/modified trigger

I have this Logic App that connects to an SFTP server and it's triggered by the "files are added or modified" trigger. It's set to run every 10 minutes, looking for new/modified files and copying them to an Azure storage account.
The problem is that this SFTP server path is set to overwrite a set of files every X minutes (I have no control over this) and so, pretty often the Logic App overlaps with the update process of these files and downloads files that are still being written. The result is corrupted files.
Is there a way to add a filter to the When files are added or modified (properties only) trigger so that it only takes into consideration files with a modified date at least 1 minute old?
That way, files that are currently being written won't be added to the list of files to download. The next run of the Logic App would then fetch these ignored files, and so on.
UPDATE
I've found a Trigger Conditions option in the trigger's settings, but I can't find any documentation about it.
From testing the "When files are added or modified" trigger, it seems we cannot add a filter in the trigger itself to skip files modified less than 1 minute ago. We can only get the list of files with their LastModified datetimes, loop over them, and use an "If" condition to decide whether to download each one.
Update:
The expression used in the condition is:
sub(ticks(utcNow()), ticks(triggerBody()?['LastModified']))
Update workaround
Is it possible to add a "Delay" action when the last modified time is less than 1 minute old? For example, if the file was modified less than 60 seconds ago, use "Delay" to wait 5 minutes until the overwrite operation completes, then do the download.
I checked the sample @equals(triggers().code, 'InternalServerError'); it uses the condition functions from the logical comparison functions, so the key point is to make sure the property you want to filter on is in the trigger or triggerBody, or you will get the error below.
So I changed the expression to something like @greater(triggerBody().LastModified, '2020-04-20T11:23:00Z'); this keeps the flow from triggering for files modified before 2020-04-20T11:23:00Z.
You could also use other functions such as less, greaterOrEquals, etc. from the logical comparison functions.
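Combining the two ideas above into one trigger condition, something along these lines should skip files modified less than a minute ago; this is untested and assumes that ticks() returns 100-nanosecond intervals, so one minute corresponds to 600000000 ticks:
@greater(sub(ticks(utcNow()), ticks(triggerBody()?['LastModified'])), 600000000)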

Sync two VOBs' files (with clearfsimport) without checking in the updated files

I am using the following command to sync the B VOB's files from the A VOB:
clearfsimport -master -follow -nsetevent -comment $2 /vobs/A/xxx/*.h /vobs/B/xxx/
It works fine, but it checks in all the changes automatically. Is there a way to do the same task but leave the updated files in a checked-out state?
I want to update the files in B from A, build my programme, and then restore the branch. So if the updated files were left checked out, I could do an unco (uncheckout) later. With my command above, everything is checked in, so I can't restore my branch.
Thanks.
As VonC said, it's impossible to prevent clearfsimport from doing the check-in, and he suggested using a label to recover afterwards.
In my case, the branch where I did the clearfsimport was created from a label; let's call it LABEL_01. So I guess I can use that label for recovery. Is there an easy way (one command) to restore the files under /vobs/B/xxx/ to the label LABEL_01? I want to do it in my bash script, so the shorter and simpler the command, the better.
Thanks.
After having a look at the man page for clearfsimport: no, it isn't possible to prevent the checkins.
I would set a label before the clearfsimport, and modify the config spec so that the new versions are created in a separate branch (similar to this config spec).
That way, restoring the initial branch would be easy: none of the new versions would have been created in it.
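As a rough sketch of that approach (the label and branch names below are made up; check the exact cleartool syntax against your site's conventions):

# Label the target tree before the import (names are examples only).
cleartool mklbtype -nc BEFORE_IMPORT
cleartool mklabel -recurse BEFORE_IMPORT /vobs/B/xxx
# Config spec excerpt so that the imported versions land on a dedicated branch:
element * CHECKEDOUT
element * .../import_branch/LATEST
element * BEFORE_IMPORT -mkbranch import_branch
element * /main/LATEST -mkbranch import_branch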

How to let git check for updates on the master server?

I have very poor knowledge about git and would like to ask for help.
I have a Linux(-only) application which shall only be "downloaded" (i.e. cloned) with git. On startup, the app shall ask the git "master server" (GitHub) whether there are updates.
Does git offer a command to check whether there is an update (without actually updating, only checking)? Furthermore, can my app read the return value of that command?
If you do not want to merge, you can just git fetch yourremote yourbranch, the remote and branch usually being origin and master. You could then parse the output of the command to see if new commits are actually present. You can refer to the latest fetched commit as either yourremote/yourbranch or possibly by the symref FETCH_HEAD.
Note: I was reminded that FETCH_HEAD refers to the last branch that was fetched. Hence, in general, you cannot rely on git fetch yourremote together with FETCH_HEAD, since the former fetches all tracked branches and the latter may therefore not refer to yourbranch. Additionally, you end up fetching more than strictly necessary. Also refer to Jefromi's answer on how to view changes without actually downloading them. The formats below are not necessarily the most compact, just readable examples.
That being said, here are some options for checking for updates of a remote branch, which we will denote with yourremote/yourbranch:
0. Handling errors in the following operations:
0.1 If you attempt to git fetch yourremote, and git gives you an error like
conq: repository does not exist.
that probably means you don't have that remote-string defined. Check your defined remote-strings with git remote --verbose, then git remote add yourremote yourremoteURI as needed.
0.2 If git gives you an error like
fatal: ambiguous argument 'yourremote/yourbranch': unknown revision or path not in the working tree.
that probably means you don't have yourremote/yourbranch locally. I'll leave it to someone more knowledgeable to explain what it means to have something remote locally :-) but will say here only that you should be able to fix that error with
git fetch yourremote
after which you should be able to repeat your desired command successfully. (Provided you have defined git remote yourremote correctly: see previous item.)
1. If you need detailed information, run git show yourremote/yourbranch and compare it to the current git show yourbranch
2. If you only want to see the differences, git diff yourbranch yourremote/yourbranch
3. If you prefer to make comparisons on the hash only, compare git rev-parse yourremote/yourbranch to git rev-parse yourbranch (see the sketch after this list)
4. If you want to use the log to backtrack what happened, you can do something like git log --pretty=oneline yourremote/yourbranch...yourbranch (note use of three dots).
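As a sketch of option 3 that a startup script could call (origin and master are the usual assumptions; exit status 1 means an update is available, 2 means the fetch itself failed):

#!/bin/sh
# Check whether origin/master has commits that the local master lacks.
git fetch origin master || exit 2
local_hash=$(git rev-parse master)
remote_hash=$(git rev-parse origin/master)
if [ "$local_hash" = "$remote_hash" ]; then
  echo "up to date"
  exit 0
else
  echo "update available"
  exit 1
fi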
If you really don't want to actually use bandwidth and fetch new commits, but just check whether there is anything to fetch, you can use:
git fetch --dry-run [remote]
where [remote] defaults to origin. You'll have to parse the output, though, which looks something like this:
From git://git.kernel.org/pub/scm/git/git
2e49dab..7f41b6b master -> origin/master
so it's really much easier to just fetch everything (git fetch [remote]), and then look at the diff/log e.g. between master and [remote]/master.
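If you do go the --dry-run route, one crude way to turn its output into a yes/no answer is to check whether it printed anything at all; this assumes the summary lands on stderr (which git fetch normally uses) and is empty when there is nothing new:

# Non-empty dry-run output means there is something to fetch.
if [ -n "$(git fetch --dry-run origin 2>&1)" ]; then
  echo "updates available"
fi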
I'd say git fetch is a potential solution. It only updates the remote-tracking branches, not your working code. In cases of large commit sets, this would involve a download of compressed files/info, so it may be more than you want, but it is the most useful download you can do.

Can TortoiseSVN provide a cross-repository view of user activity?

Is there a way I can see my commit history for a given time period across multiple repositories using TortoiseSVN? It would be nice to be able to see this, and it's a little cumbersome to get my complete commit history if I'm working in multiple repositories.
If you're not going to rule out the svn.exe client, you could do:
svn log <path_to_repo> -r1:head -q | find "william_leara" >> c:\my_commits.txt
Do this for every repository, and "my_commits.txt" will contain your commits from every repository. If you don't have an obscene number of repositories, it's not a big deal. Further example:
:: dump my commits
svn log http://<server>/<path1> -r1:head -q | find "william_leara" >> c:\my_commits.txt
svn log http://<server>/<path2> -r1:head -q | find "william_leara" >> c:\my_commits.txt
svn log file:///c:/src/myrepo -r1:head -q | find "william_leara" >> c:\my_commits.txt
. . . I think you get the idea. Of course you can edit the range as necessary, or write a batch file that accepts arguments to specify repository/range/user, whatever.
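For example, a tiny batch file that takes the repository URL as an argument (the output path and user name are just the ones from the commands above) would let you loop over as many repositories as you like:

:: my_commits.bat -- usage: my_commits.bat <repository-url>
svn log %1 -r1:head -q | find "william_leara" >> c:\my_commits.txt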
The only way to get something like a cross-repository view is to use the Settings menu, then Log Caching -> Cached Repositories. This lets you see SVN repository statistics (actually related to local usage of the particular repository) via Details, and export the repository data as a set of files: [filename].changes.csv, [filename].merges.csv, [filename].paths.csv, [filename].revisions.csv, etc. The last one is probably the one you are interested in. It could easily be processed, for example with Perl, to produce a commit history for a given period in the form you need.

Resources