I have been trying to create a pull request using the createPullRequest plugin from Octokit. For simple text files it works well, but when I try to commit changes to a JSON file in the repository, I get something like this:

(screenshot of the resulting pull request)
I want this data to be presented as a JSON structure. I am using the following code:
const updatedContent = JSON.stringify(json_for_package);
const pr = await octokit_pull.createPullRequest({
  owner: ownername,
  repo: repoName,
  title: "pull request " + dependency,
  body: "change " + dependency + " v" + curr_version + " to v" + version,
  base: "main",
  head: "pulls-requests-branches-names-main-" + dependency + "-v" + version,
  forceFork: true,
  changes: [
    {
      files: {
        "package.json": updatedContent,
      },
      commit: "updated the contents of package.json",
    },
  ],
});
Here json_for_package is the JSON content that I want to write into my package.json file. I just need to know how to pass updatedContent in files so that it is recognized as a JSON file and not plain text.
Anyone facing the same problem can do the following: first write the JSON data into a .json file and then read it back to use it. That seems to have worked for me!
Besides writing the JSON to a file first and using the file as a parameter, you can also use gr2m/octokit-plugin-create-pull-request:
Example
const { Octokit } = require("@octokit/core");
const { createPullRequest } = require("octokit-plugin-create-pull-request");

const MyOctokit = Octokit.plugin(createPullRequest);
const TOKEN = "secret123"; // create token at https://github.com/settings/tokens/new?scopes=repo
const octokit = new MyOctokit({
  auth: TOKEN,
});
And then:
// Returns a normal Octokit PR response
// See https://octokit.github.io/rest.js/#octokit-routes-pulls-create
octokit
  .createPullRequest({
    owner: "user-or-org-login",
    repo: "repo-name",
    title: "pull request title",
    body: "pull request description",
    base: "main" /* optional: defaults to default branch */,
    head: "pull-request-branch-name",
    forceFork: false /* optional: force creating fork even when user has write rights */,
    changes: [
      {
        /* optional: if `files` is not passed, an empty commit is created instead */
        files: {
          "path/to/file1.txt": "Content for file1",
          "path/to/file2.png": {
            content: "_base64_encoded_content_",
            encoding: "base64",
          },
          // deletes file if it exists
          "path/to/file3.txt": null,
          // updates file based on current content
          "path/to/file4.txt": ({ exists, encoding, content }) => {
            // do not create the file if it does not exist
            if (!exists) return null;
            return Buffer.from(content, encoding)
              .toString("utf-8")
              .toUpperCase();
          },
          "path/to/file5.sh": {
            content: "echo Hello World",
            encoding: "utf-8",
            // one of the modes supported by the git tree object
            // https://developer.github.com/v3/git/trees/#tree-object
            mode: "100755",
          },
        },
        commit:
          "creating file1.txt, file2.png, deleting file3.txt, updating file4.txt (if it exists), file5.sh",
      },
    ],
  })
  .then((pr) => console.log(pr.data.number));
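For the original package.json case, the likely culprit is that JSON.stringify with no extra arguments emits a single line. A small sketch of pretty-printing before passing the string as the file content (the object here is a placeholder for the question's json_for_package):

```javascript
// Placeholder for the question's object.
const json_for_package = { name: "demo", version: "1.0.1" };

// Two-space indentation plus a trailing newline matches how npm itself
// writes package.json, so the diff in the pull request stays readable.
const updatedContent = JSON.stringify(json_for_package, null, 2) + "\n";
// Pass `updatedContent` as the string value for "package.json" in `changes[].files`.
```

The plugin treats file values as plain strings either way; only the formatting of the string changes.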
I have the following structure
User {
  image: Asset
  ...
}

Comment {
  author: User
  ...
}

BlogArticle {
  slug: Text
  author: User
  comments: Comment[]
}
When I pull entries with the following method
const articles = await client.getEntries({ content_type: "BlogArticle" })
console.log(articles.entries.fields.comments)
I only get the sys property for the author
[
  {
    author: {
      sys: {
        ...
      }
      fields ??????
    }
  }
]
PS: This is the case for all types that come at the second level of nesting. I checked the docs and the APIs but with no luck.
Any help?
I created a similar content model and was able to get the fields of the author successfully. One thing you can do is use the include parameter. With the include parameter, your code should look as follows:
const articles = await client.getEntries({ content_type: "BlogArticle", include: 2 })
console.log(articles.items[0].fields.comments)
You can learn more about it here
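As a sketch of what the resolved response then allows (the mock object below only illustrates the shape; field names mirror the question's content model): with include: 2, the SDK resolves linked entries two levels deep, so each comment's author carries its own fields.

```javascript
// Mock of the shape getEntries returns once links are resolved (include: 2).
const articles = {
  items: [
    {
      fields: {
        slug: "my-article",
        comments: [
          { fields: { author: { sys: { id: "u1" }, fields: { name: "Jane" } } } },
        ],
      },
    },
  ],
};

// Entries live under `items`; nested links now expose `fields`, not just `sys`.
const firstComment = articles.items[0].fields.comments[0];
const authorName = firstComment.fields.author.fields.name;
```

Without the include parameter (or with too small a depth), the author object would only contain the sys link stub, as in the question.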
How do I get the date of the last time a folder on Google Drive was updated?
For example, when the last file was copied into it.
Currently I'm using the following code to get back the date when the folder was created:
const drive = google.drive({ version: 'v3', auth });
console.log("Calling listing")
await drive.files.list({
  pageSize: 1000,
  fields: 'nextPageToken, files(id, name, modifiedTime)',
  q: "mimeType='application/vnd.google-apps.folder'",
}, (err, dataResponse) => {
  if (err) {
    res.json("Error " + err);
    return;
  }
  const folders = dataResponse.data.files;
  console.log(folders);
  res.json(folders);
})
In agreement with the UI behavior, the last-modified time the API gives you for the folder itself will differ from the last-modified time of the latest modified file within that folder.
This is due to Google Drive's hierarchy, where a change to a file's last-modified time does not necessarily change the last-modified time of the parent folder.
In any case, if you would like to know the Last Modified time of the folder itself, you need to perform a Files: get request specifying the id of the folder as fileId and setting fields to modifiedTime.
Sample:
...
await drive.files.get({
  fileId: "XXX",
  fields: "modifiedTime"
},
...
If you would like to access the modifiedTime property of the files in the folder, you can do it with Files: list, setting q to 'XXX' in parents, where XXX is the id of the parent folder, and setting fields to files/modifiedTime.
Sample:
...
await drive.files.list({
  fields: "files(modifiedTime)",
  q: "'XXX' in parents"
})
...
This will return you a response like
{
  "files": [
    {
      "modifiedTime": "2021-10-09T14:22:58.306Z"
    },
    {
      "modifiedTime": "2021-10-07T17:38:56.515Z"
    },
    {
      "modifiedTime": "2021-09-28T16:28:12.476Z"
    },
    {
      "modifiedTime": "2021-09-27T16:13:58.201Z"
    }
  ]
}
You can then programmatically determine the newest timestamp from the result list.
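A sketch of that last step, using the sample response above (ISO-8601 timestamps in UTC compare chronologically as plain strings, so no date parsing is needed):

```javascript
// Sample response from above.
const response = {
  files: [
    { modifiedTime: "2021-10-09T14:22:58.306Z" },
    { modifiedTime: "2021-10-07T17:38:56.515Z" },
    { modifiedTime: "2021-09-28T16:28:12.476Z" },
    { modifiedTime: "2021-09-27T16:13:58.201Z" },
  ],
};

// ISO-8601 strings sort lexicographically in chronological order,
// so the maximum string is the newest timestamp.
const newest = response.files
  .map((f) => f.modifiedTime)
  .reduce((a, b) => (a > b ? a : b));
```

In practice you would feed `res.data.files` from the drive.files.list call into the same reduction.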
I'm creating a blog where each page will show a map using Leaflet with a GPX route on it. Below the map will be some statistics, some text, and images.
I have the text-and-images part defined in Markdown, so I figured the way to handle this would be to define my GPX filename in the frontmatter, like so:
---
title: Awesome Blog Post Title
author: Cancrizan
date: 2021-01-04
gpxFile: activity4835702422
---
BLOG POST here
where the field gpxFile refers to a file in my project, src/gpx/activity4835702422.gpx.
I've written a transformer plugin that reads in the GPX file so that it can be queried like this:
query MyQuery {
  allActivity4835702422Gpx {
    edges {
      node {
        geometry {
          coordinates
        }
      }
    }
  }
}
and outputs something like this:
{
  "data": {
    "allActivity4835702422Gpx": {
      "edges": [
        {
          "node": {
            "geometry": {
              "coordinates": [
                [
                  -1.2134187016636133,
                  52.92038678191602,
                  29.399999618530273
                ],
                [
                  -1.2134256586432457,
                  52.92039977386594,
                  29.399999618530273
                ],
                ...
              ]
            }
          }
        }
      ]
    }
  },
  "extensions": {}
}
I want to access that node based on the frontmatter of the markdown file, and I'm not sure how.
Can anyone suggest a solution, or am I going about this the wrong way?
The structure and the mindset you've followed are perfectly valid; the only part you're missing is to pass gpxFile to your template in order to create another query based on that parameter.
Your gatsby-node.js should look something like this:
const path = require("path")

exports.createPages = async ({ graphql, actions, reporter }) => {
  const { createPage } = actions
  const result = await graphql(
    `
      {
        allMarkdownRemark(limit: 1000) {
          edges {
            node {
              frontmatter {
                title
                author
                date
                gpxFile
                slug
              }
            }
          }
        }
      }
    `
  )
  if (result.errors) {
    reporter.panicOnBuild(`Error while running GraphQL query.`)
    return
  }
  const postTemplate = path.resolve(`src/templates/post.js`)
  result.data.allMarkdownRemark.edges.forEach(({ node }) => {
    createPage({
      path: `/posts/${node.frontmatter.slug}`,
      component: postTemplate,
      context: {
        gpxFile: node.frontmatter.gpxFile,
        path: node.frontmatter.slug,
      },
    })
  })
}
The idea is to use the context API to pass data (gpxFile and path) from your gatsby-node.js to your template (postTemplate), where it can be used as a filter for your markdown files.
In your postTemplate, your query should look like:
export const postData = graphql`
  query getArticleData($path: String!, $gpxFile: String) {
    post: markdownRemark(frontmatter: { slug: { eq: $path } }) {
      html
      excerpt(pruneLength: 350)
      frontmatter {
        title
        author
        date
        gpxFile
        slug
      }
    }
    gpxFileData: allFile(filter: { relativePath: { eq: $gpxFile } }) {
      # your gpx data here
    }
  }
`;
It's quite self-explanatory: you are passing the necessary data via context from your gatsby-node.js to make a query in your template (gpxFileData). There, you can create a new query, allFile, filtering by relativePath (you may need to access the file directly or use absolutePath; test it in localhost:8000/___graphql) and retrieve the data using props.data.post and props.data.gpxFileData.
Disclaimer: I'm assuming that you've set up your filesystem (gatsby-source-filesystem) properly to pick up your .gpx files.
I have created a set using the DynamoDB Document Client. I am able to remove items from this set; however, when I remove the last element in the set, nothing is returned until I make a new POST. Then all the other data is displayed.
const params = {
  TableName: 'beta-user-' + process.env.NODE_ENV,
  Key: {
    username: request.username
  },
  UpdateExpression: "DELETE #features :feature",
  ExpressionAttributeNames: { "#features": "features" },
  ExpressionAttributeValues: { ":feature": dynamodb.createSet([request.feature]) },
  ReturnValues: "NONE"
};
and I'm calling it like this:
const dynamoPromise = dynamodb.update(params).promise();
return await dynamoPromise.then(result => { /* stuff */ });
I do not think the UpdateExpression is wrong:
https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Expressions.UpdateExpressions.html#Expressions.UpdateExpressions.DELETE
I believe the problem is with ExpressionAttributeValues; if I remove dynamodb.createSet, I get many validation errors.
When I make a GET request to the app, I get:
{
  "message": [
    {
      "username": "x",
      "feature": [
        "blah",
        "test"
      ]
    },
    {
      "username": "z",
      "feature": [
        "blah"
      ]
    }
  ]
}
I make a DELETE request and remove the feature test from username x. This works and returns the same response minus the test feature. I make another DELETE request to remove blah. Blah is removed; however, when I make a GET request I receive:
{
  "message": {}
}
The other data is returned when I make a new POST to that specific user.
EDIT:
I think the issue might be that DynamoDB does not allow an empty set.
The issue was with the return statement in my GET request. I assumed that once the features were deleted, the record would be deleted. I was trying to return features on an object that had no features; therefore, it was erroring out and not returning anything.
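A sketch of such a guard (the record shape is an assumption based on the question's attributes; sets returned by the Document Client expose their members via a .values array):

```javascript
// `item` stands in for the object returned by the GET;
// after the last DELETE, the "features" attribute is gone entirely,
// because DynamoDB removes a set once its final element is deleted.
function featuresOf(item) {
  // Fall back to an empty list when the attribute was removed,
  // instead of throwing on `undefined.values`.
  return item && item.features ? item.features.values : [];
}

const withFeatures = { username: "x", features: { values: ["blah", "test"] } };
const withoutFeatures = { username: "x" };
```

With this guard, the GET handler can always respond with a (possibly empty) feature list rather than erroring out.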
I am using OnlyOffice Document Server version 5.0.3, and it works fine. I want to find and replace text in the document editor.
OnlyOffice configuration
file: editor.jsp
config = {
  "document": {
    "fileType": "docx",
    "key": "Khirz6zTPdfd7",
    "title": "sample.docx",
    "url": "http://192.168.0.1:8080/onlyofficeexample/files/192.168.0.1/sample.docx"
  },
  "documentType": "text",
  "editorConfig": {
    "callbackUrl": "http://192.168.0.1:8080/onlyofficeexample/IndexServlet?type=track&fileName=sample.docx&userAddress=192.168.0.1"
  }
  ...
};
var docEditor = new DocsAPI.DocEditor("placeholder", config);

setTimeout(function () {
  var text_replace = {
    textsearch: "~($#effective_date#$)~",
    textreplace: "23/05/1991",
    matchcase: false,
    matchword: false,
    highlight: true
  };
  docEditor.onReplaceText(text_replace);
}, 30000);
I am trying to replace text by calling the API, and have created the following trigger function in the OnlyOffice API code.
// trigger function
onReplaceText: function (data) {
  $me.trigger("onreplacetext", data)
},
Here I want to find and replace text based on the data passed to this function:

onReplaceText: function (data) {
}

Common.Gateway.on('onreplacetext', _.bind(me.onReplaceText, me));

Thank you.
It is incorrect to use the asc_replaceText method to modify the content of the document. Your request can be implemented by means of the Document Builder (please see this section of the API) or by means of a plugin. We are also glad to announce that a find-and-replace method will be added to the feature list of the Document Builder in the following update, and it will also be available for plugins.
Finally, I replaced the text in the OnlyOffice API by using the code below.
// trigger function
onReplaceText: function (data) {
  $me.trigger("onreplacetext", data)
},

Here the text is found and replaced based on the data passed to this function:

onReplaceText: function (data) {
  data = data.data;
  this.api.asc_replaceText(data.textsearch, data.textreplace, true, data.matchcase, data.matchword);
}

Common.Gateway.on('onreplacetext', _.bind(me.onReplaceText, me));