Is there a way to retrieve information about the number of stars of a GitHub repository over time? I know I can get a list of all users who starred a repository using the stargazers API endpoint, but this doesn't include information about when each user starred the repo. Any hints on how I can retrieve this data?
You can get the starred_at property by using this custom media type:
Accept: application/vnd.github.v3.star+json
Your headers might look something like this (I'm using JavaScript here):
headers: {
  ...
  Accept: 'application/vnd.github.v3.star+json',
  ...
},
See the documentation here: https://developer.github.com/v3/activity/starring/#alternative-response-with-star-creation-timestamps
The repository https://github.com/timqian/star-history uses this technique to retrieve stars over time and chart them.
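As a minimal sketch of the full request (using fetch, with octocat/Hello-World as a placeholder repository):

fetch('https://api.github.com/repos/octocat/Hello-World/stargazers?per_page=100', {
  headers: {
    Accept: 'application/vnd.github.v3.star+json'
  }
})
  .then(res => res.json())
  .then(stargazers => {
    // With the custom media type, each entry contains starred_at and user.
    stargazers.forEach(({ starred_at, user }) => {
      console.log(`${user.login} starred at ${starred_at}`);
    });
  });

Note that the endpoint is paginated (at most 100 entries per page), so to build a full history you would follow the Link response header page by page.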
Good morning,
I'm writing my own library, to improve my skills, for getting media from an Instagram profile and showing it on my website.
I have read a lot of articles and documentation, but much of it covers old API versions and endpoints. In the official API documentation I'm unable to find how to get the like count or comment count for each individual media item.
So:
I can OAuth - OK
Get short-live token - OK
Get long-live token - OK
Refresh long-live token - OK
Get media list - OK
Get one media based on its ID - OK
But none of these endpoints gives me the like count or comment count. How can I achieve this?
https://graph.instagram.com/me/media?fields=id,caption,comments_count,like_count,media_type,media_url,thumbnail_url&access_token=
returns me:
{
  "data": [
    {
      "id": "123456798",
      "caption": "My title",
      "media_type": "IMAGE",
      "media_url": "https://..."
    },
    { ... }
  ]
}
I found in the documentation (https://developers.facebook.com/docs/instagram-api/reference/ig-media/) that the like_count field is only available when requesting a single media item by its ID. So I tried this endpoint:
https://graph.instagram.com/v15.0/123456789?fields=id,caption,media_type,media_url,thumbnail_url,like_count&access_token=
and the response is: Tried accessing nonexisting field (like_count) on node type (Media).
On Instagram itself I can see likes and comments, and the media does have them. What am I doing wrong? Thank you for your time.
I'm using Apps - listRepos to get a list of all the repositories my Probot GitHub App is installed on.
I want the response data to include the GitHub topics for each repository. This is currently only available as a preview feature:
The topics property for repositories on GitHub is currently available for developers to preview. To view the topics property in calls that return repository results, you must provide a custom media type in the Accept header:
application/vnd.github.mercy-preview+json
So I think I want to "provide a custom media type in the Accept header".
Is there a way to enable GitHub preview features in Probot? Perhaps by somehow setting RequestOptions?
Success: I added a headers object to my listRepos() call.
const repositories = await octokit.paginate(
  octokit.apps.listRepos({
    per_page: 100,
    headers: {
      accept: 'application/vnd.github.machine-man-preview+json,application/vnd.github.mercy-preview+json'
    }
  }),
  res => res.data.repositories // Pull out only the list of repositories from each response.
);
According to the documentation, I can get the contents of a particular file in a GitHub repo like this:
GET /repos/:owner/:repo/contents/:path
which indeed works for my public repos. But what about my private ones? How can my applications access their contents?
You need to add authorization to your request.
One way to do this is through the headers. Add both of the following headers:
User-Agent: 'YOUR_USERNAME'
Authorization: 'token YOUR_TOKEN'
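For example, a minimal sketch using fetch in Node (YOUR_USERNAME, YOUR_TOKEN, and the repo path are placeholders):

fetch('https://api.github.com/repos/YOUR_USERNAME/YOUR_REPO/contents/path/to/file.txt', {
  headers: {
    'User-Agent': 'YOUR_USERNAME',
    Authorization: 'token YOUR_TOKEN'
  }
})
  .then(res => res.json())
  .then(file => {
    // The contents API returns the file body base64 encoded.
    console.log(Buffer.from(file.content, 'base64').toString('utf8'));
  });

The same two headers work for any other endpoint that touches your private repositories.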
My requirement is to clear all activities on a notification feed.
Based on this Stack Overflow question, I understand that there is an undocumented REST API to delete a feed, and that the dashboard's truncate-feed functionality uses it.
I tried to replicate the call with the same parameters as the dashboard:
DELETE /api/v1.0/feed/notification/f8fa1d12-594a-4b2b-ac58-23c912d1335a/?api_key=...&location=unspecified
Host: api.getstream.io
Authorization: notificationf8fa1d12-xxxx-xxxx-xxxx-23c912d1335a writetoken
stream-auth-type: simple
X-Stream-Client: stream-javascript-client-browser-unknown
Cache-Control: no-cache
I tried to use the same, but I'm getting this error message:
{
  "code": null,
  "detail": "url signature missing or invalid",
  "duration": "6ms",
  "exception": "AuthenticationFailed",
  "status_code": 403
}
Is this the right way to use this API?
I am using this from Java code and believe that the Java client doesn't have this functionality built in.
There are two ways to do this. You can do it manually from the explorer on the dashboard: search for the feed, select an activity, and press the truncate feed button. This is the easiest way if doing it manually is sufficient.
It's also possible, as you found, to use the delete API endpoint to do it programmatically. This endpoint is not built into most clients, including the Java client. The URL and HTTP verb that you used should indeed work.
From what I can tell from your headers and the response, the problem is the signature. The easiest way to get it right is to use the built-in methods of the library you're using to generate it. I'm not an expert in the Java library, but these methods appear to live in the StreamRepoUtils class.
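As a rough, untested sketch of the idea (shown with the JavaScript client since the endpoint is the same from any language; it assumes the getstream client exposes a signed token on the feed object, and all credentials are placeholders):

const stream = require('getstream');

// Server-side credentials; the secret is what signs the feed token.
const client = stream.connect('API_KEY', 'API_SECRET');
const feed = client.feed('notification', 'f8fa1d12-594a-4b2b-ac58-23c912d1335a');

// The signature in the Authorization header must be generated for this feed;
// a token copied from a dashboard session fails with
// "url signature missing or invalid".
fetch('https://api.getstream.io/api/v1.0/feed/notification/f8fa1d12-594a-4b2b-ac58-23c912d1335a/?api_key=API_KEY&location=unspecified', {
  method: 'DELETE',
  headers: {
    Authorization: `notificationf8fa1d12-594a-4b2b-ac58-23c912d1335a ${feed.token}`,
    'stream-auth-type': 'simple'
  }
});

In Java the equivalent would be to generate the token with the helpers in StreamRepoUtils and issue the DELETE with any HTTP client.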
How do you update a SharePoint 2013 wiki page using the REST API?
Three permutations:
Reading an existing page (content only)
Updating an existing page
Creating a new page
For reading an existing page, of course I can just do a "GET" of the correct URL, but this also brings down all the various decorations around the actual data on the wiki page. Rather than fish that out myself, it would be better if there were a way to get just the content, if that is possible.
Are there special endpoints in the REST API that allow for any of these three operations on wiki pages?
As stated in GMasucci's post, there does not appear to be a clean or obvious way of instantiating pages through the REST API.
You can call the AddWikiPage method from the SOAP service at http://[site]/_vti_bin/Lists.asmx. This is an out-of-the-box service that will be accessible unless it has been specifically locked down for some reason.
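A rough sketch of that SOAP call (untested; the element names follow my reading of the Lists.asmx documentation, and the list name, path, title, and content are placeholders):

POST http://[site]/_vti_bin/Lists.asmx
Content-Type: text/xml; charset=utf-8
SOAPAction: "http://schemas.microsoft.com/sharepoint/soap/AddWikiPage"

<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <AddWikiPage xmlns="http://schemas.microsoft.com/sharepoint/soap/">
      <strListName>Site Pages</strListName>
      <listRelPath>SitePages</listRelPath>
      <wikiPageTitle>MyNewPage</wikiPageTitle>
      <wikiContent>&lt;p&gt;Initial content&lt;/p&gt;</wikiContent>
    </AddWikiPage>
  </soap:Body>
</soap:Envelope>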
To read the content of a wiki page through the REST API, you can use the following endpoint:
https://[siteurl]/_vti_bin/client.svc/Web/GetFileByServerRelativeUrl('/page/to/wikipage.aspx')/ListItemAllFields
The content is contained within the WikiField field. You may want to add a $select to that URL and request JSON to reduce the amount of data passed over the wire, if that is a concern.
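For example, a sketch of that trimmed-down read (the site URL and page path are placeholders):

GET https://[siteurl]/_vti_bin/client.svc/Web/GetFileByServerRelativeUrl('/page/to/wikipage.aspx')/ListItemAllFields?$select=WikiField
Accept: application/json;odata=verbose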
As for updating the content of an existing wiki page, it is not something I have tried, but I would imagine it is just like populating any other field through the REST API. This is how I would expect to do it:
Do an HTTP POST to the same endpoint as above
Use the following HTTP headers:
Cookie = "yourauthcookie"
Content-Type = "application/json;odata=verbose"
X-RequestDigest = "yourformdigest"
X-HTTP-Method = "MERGE"
If-Match = "etag value from entry node, returned from a GET to the above endpoint"
Post the following JSON body:
{
  "__metadata": { "type": "SP.Data.SitePagesItem" },
  "WikiField": "HTML entity coded wiki content goes here"
}
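Putting those pieces together as one hedged sketch (untested; the URL, digest, and etag values are placeholders):

const formDigest = '...'; // form digest, e.g. from a POST to /_api/contextinfo
const etag = '"1"';       // etag from a prior GET of the same endpoint

fetch("https://[siteurl]/_vti_bin/client.svc/Web/GetFileByServerRelativeUrl('/page/to/wikipage.aspx')/ListItemAllFields", {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json;odata=verbose',
    'X-RequestDigest': formDigest,
    'X-HTTP-Method': 'MERGE',
    'If-Match': etag
  },
  credentials: 'include', // send the auth cookie along
  body: JSON.stringify({
    __metadata: { type: 'SP.Data.SitePagesItem' },
    WikiField: 'HTML entity coded wiki content goes here'
  })
});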
The interim answer I have found is not to utilise REST, as it appears not to be:
fully documented
fully featured
supported across SharePoint 2013 and SharePoint Online in the same way
So my current recommendation would be to utilise the SOAP services to achieve the same, as these are better documented and easily accessible.