Patch K8s Custom Resource with @kubernetes/client-node - node.js

I'm building an Istio/K8s-based platform for controlling traffic routing with Node.js. I need to be able to programmatically modify Custom Resources, and I'd like to use @kubernetes/client-node for that. I wasn't able to find the right API for accessing Custom Resources in the docs or the repo. Am I missing something? Thanks in advance.
EDIT: When using the CustomObjectsApi.patchNamespacedCustomObject function, I'm getting the following error back from the K8s API:
message: 'the body of the request was in an unknown format - accepted media types include: application/json-patch+json, application/merge-patch+json, application/apply-patch+yaml', reason: 'UnsupportedMediaType', code: 415
My Code:
const k8sYamls = `${path.resolve(path.dirname(__filename), '..')}/k8sYamls`
const vServiceSpec = read(`${k8sYamls}/${service}/virtual-service.yaml`)

const kc = new k8s.KubeConfig()
kc.loadFromDefault()
const client = kc.makeApiClient(k8s.CustomObjectsApi)

const [group, version] = vServiceSpec.apiVersion.split('/')
const result = await client.patchNamespacedCustomObject(group, version, namespace, 'virtualservices', vServiceSpec.metadata.name, vServiceSpec)
virtual-service.yaml:
apiVersion: networking.istio.io/v1alpha3
kind: VirtualService
metadata:
  name: message-service
spec:
  hosts:
  - message-service
  http:
  - name: 'production'
    route:
    - destination:
        host: message-service
      weight: 100
    retries:
      attempts: 3
      perTryTimeout: 2s
      retryOn: 5xx

I was using the wrong type for the body object in that method. I got it to work following this example.
const patch = [
  {
    "op": "replace",
    "path": "/metadata/labels",
    "value": {
      "foo": "bar"
    }
  }
];

const options = { "headers": { "Content-type": k8s.PatchUtils.PATCH_FORMAT_JSON_PATCH } };

k8sApi.patchNamespacedPod(res.body.items[0].metadata.name, 'default', patch, undefined, undefined, undefined, undefined, options)
  .then(() => { console.log("Patched.") })
  .catch((err) => { console.log("Error: "); console.log(err) });

Depending on whether the object is namespaced or cluster-scoped, you can use the patchNamespacedCustomObject or patchClusterCustomObject method of the CustomObjectsApi.
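Note that the 415 error above is about the patch content type, not the body itself: the server accepts application/json-patch+json, application/merge-patch+json, or application/apply-patch+yaml, and the Content-type header has to say which one you're sending. Here is a minimal sketch for the original use case, pushing the whole parsed YAML spec as a JSON merge patch instead of building a JSON Patch array; the function name is mine, and it assumes the same positional patchNamespacedCustomObject signature used elsewhere in this thread:

import * as k8s from "@kubernetes/client-node";

// Sketch: apply the object parsed from virtual-service.yaml as a merge patch.
async function mergePatchVirtualService(vServiceSpec: any, namespace: string) {
  const kc = new k8s.KubeConfig();
  kc.loadFromDefault();
  const client = kc.makeApiClient(k8s.CustomObjectsApi);

  const [group, version] = vServiceSpec.apiVersion.split("/"); // e.g. networking.istio.io, v1alpha3

  // Declare the body as a JSON merge patch, one of the accepted media types.
  const options = { headers: { "Content-type": k8s.PatchUtils.PATCH_FORMAT_JSON_MERGE_PATCH } };

  return client.patchNamespacedCustomObject(
    group,
    version,
    namespace,
    "virtualservices",          // plural name from the CRD
    vServiceSpec.metadata.name,
    vServiceSpec,               // the merge-patch body
    undefined,                  // dryRun
    undefined,                  // fieldManager
    undefined,                  // force
    options
  );
}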

Since the edit queue is full for the accepted answer, I'm going to provide a working example and a more thorough answer here using patchNamespacedCustomObject:
Here are some helpful links:
patchNamespacedCustomObject Docs
Your patch object should be an array of this object
The op field should be one of these
import * as k8s from "@kubernetes/client-node";
import { CustomObjectsApi, KubeConfig } from "@kubernetes/client-node";

export async function patchCustomResource(newValue: any) {
  const kc: KubeConfig = new k8s.KubeConfig();
  kc.loadFromDefault();
  const custObjApi: CustomObjectsApi = kc.makeApiClient(k8s.CustomObjectsApi);

  const objectName = 'your-object';
  const options = { "headers": { "Content-type": k8s.PatchUtils.PATCH_FORMAT_JSON_PATCH } };
  const group = 'your-group'; // found in the CRD
  const version = 'v1'; // found in the CRD
  const plural = 'object-plural'; // found in the CRD
  const namespace = `your-namespace`;

  // this will replace the value at "/spec/path/to/modify" with the contents of `newValue`
  const patch = [{
    "op": "replace",
    "path": "/spec/path/to/modify",
    "value": newValue
  }];

  try {
    await custObjApi.patchNamespacedCustomObject(group, version, namespace, plural, objectName, patch, undefined, undefined, undefined, options);
    console.log(`Successfully updated custom resource`);
  } catch (error) {
    const { body } = error;
    console.log(body.message);
  }
}
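For completeness, a hypothetical call site for the function above (the value's shape depends entirely on your CRD's schema):

// Replace whatever lives at /spec/path/to/modify with a new object.
await patchCustomResource({ replicas: 3 });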
I urge anyone to spend some time on the CustomObjectsApi docs to get more familiar with the available methods. The docs are dense, but it's at least worth glancing over the method names.

After fighting through endless body-parsing errors using @kubernetes/client-node, I opted to install kubectl on my Node.js workers and invoke it through shell.js.
My conclusion is that @kubernetes/client-node is buggy when used with Istio Custom Resources, but I didn't want to spend time debugging exactly what's wrong. I will post a GitHub issue about it in their repo soon.
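For reference, a minimal sketch of that workaround, assuming kubectl is installed on the worker and already configured against the cluster (the manifest path is illustrative):

const shell = require('shelljs');

// Let kubectl do the patching: 'kubectl apply' merges the manifest
// server-side, which covers the same use case as the patch calls above.
const result = shell.exec('kubectl apply -f ./k8sYamls/message-service/virtual-service.yaml', { silent: true });
if (result.code !== 0) {
  console.error(result.stderr);
}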

Related

Cannot insert a comment on a YouTube video using the YouTube Data API

I am using the YouTube Data API to insert a video.
Then I use the API to insert a "commentThread" on the video, as described here in the docs. Here is my code:
import { google } from "googleapis";

const googleApiClient = new google.auth.OAuth2(
  GOOGLE_CLIENT_SECRET.web.client_id,
  GOOGLE_CLIENT_SECRET.web.client_secret,
  GOOGLE_CLIENT_SECRET.web.redirect_uris[0]
);
googleApiClient.credentials = tokens;
google.options({ auth: googleApiClient });

const apiYoutube = google.youtube({ version: "v3" });

await apiYoutube.commentThreads.insert({
  part: ["snippet"],
  requestBody: {
    snippet: {
      videoId,
      topLevelComment: {
        snippet: {
          textOriginal: "Hello world!",
        },
      },
    },
  },
});
Depending on the video visibility, the response differs:
if visibility is "private", I get a 403 error:
{
  message: 'The comment thread could not be created due to insufficient permissions. The request might not be properly authorized.',
  domain: 'youtube.commentThread',
  reason: 'forbidden',
  location: 'body',
  locationType: 'other'
}
I am positive that my access token (acquired through an OAuth2 process) has the correct scope: "https://www.googleapis.com/auth/youtube.force-ssl"
if visibility is "public" or "unlisted", I get a 400 error:
{
  message: 'This action is not available for the item.',
  domain: 'youtube.mfk',
  reason: 'mfkWrite'
}
In the YouTube console, under Settings > Upload defaults > Advanced settings > Comments, it is set to "Hold potentially inappropriate comments for review". I have also tried "Allow all comments".
How can I fix this issue and publish comments to my video?
I am aware of this similar question and have tried the various suggestions, but it still does not work, and the API may have evolved.

Mailchimp and Node.js with TypeScript noob question: Import vs Require

I am creating an app that sends certain transactional emails using Mailchimp.
They have great docs here: https://mailchimp.com/developer/api/transactional/messages/send-using-message-template/
But I'm using TypeScript, so the line:
var mailchimp = require("mailchimp_transactional")("API_KEY");
doesn't work. I get the following error:
Error: Cannot find module 'mailchimp_transactional'
I know this is something small, but I am not sure how to get around it at all. I found an article that describes creating your own types file here: @mailchimp/mailchimp_marketing/types.d.ts' is not a module in nodeJs
But there has to be a quicker, simpler solution. It also doesn't make it clear how to set the API key in that case.
I have tried to import the module, which is @mailchimp/mailchimp_transactional, which did not work.
I have of course also run npm install @mailchimp/mailchimp_transactional
Any help would be appreciated, here is a full sample just incase it helps.
var mailchimp = require("mailchimp_transactional")("API_KEY");

export const testSendEmailFromTemplate = async () => {
  let mcbody = {
    template_name: "my-template",
    template_content: [{
      name: "firstname",
      content: "INJECTED.BY.TEMPLATE.CONT.firstname"
    },
    {
      name: "surname",
      content: "INJECTED.BY.TEMPLATE.CONT.surname"
    }],
    message: {
      to: {
        email: "email@gmail.com",
        name: "Test",
        type: "to"
      }
    },
    async: true
  };
  return await mailchimp.messages.sendTemplate(mcbody);
}
If anyone is unfortunate enough to face this issue because Mailchimp's docs don't cater to the TypeScript setup, and you aren't sure how to make it 'just work', here is the answer:
const mailchimpFactory = require("@mailchimp/mailchimp_transactional/src/index.js");
const mailchimp = mailchimpFactory("PUTKEYHERE");
This pulls in the JavaScript file directly, and the second line initialises the object.
Good luck all!
As of March 2022 the types have been added to DefinitelyTyped and can be installed by running:
npm install --save-dev @types/mailchimp__mailchimp_transactional
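With those types installed, here is a minimal TypeScript sketch; I'm assuming the default-export factory shape that the type package describes, so double-check against your installed version:

import mailchimpFactory from "@mailchimp/mailchimp_transactional";

// The default export is a factory that takes the API key.
const mailchimp = mailchimpFactory("API_KEY");

export async function sendTestTemplate() {
  // Same call as the plain-JavaScript sample above, now typed.
  return mailchimp.messages.sendTemplate({
    template_name: "my-template",
    template_content: [{ name: "firstname", content: "value" }],
    message: { to: [{ email: "email@example.com", type: "to" }] },
  });
}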
I had the same problem in a Node/TypeScript project, but this is working for me:
const mailchimp = require('@mailchimp/mailchimp_marketing')

export class MailchimpServices {
  constructor() {
    mailchimp.setConfig({
      apiKey: '...',
      server: 'us5',
    });
  }

  async ping() {
    console.log('Start mailchimp ping!')
    const response = await mailchimp.ping.get();
    console.log(response);
  }
}

Google Cloud Tasks Node.js API: Get queue stats?

I want to obtain the stats field for a queue in Google Cloud Tasks using the Node.js client library @google-cloud/tasks. The stats field only exists in the v2beta3 version; however, to get it we need to pass the query param readMask=*, and I don't know how to pass it using the client library.
I tried using the otherArgs param, but it's not working.
const tasks = require('@google-cloud/tasks');
const client = new tasks.v2beta3.CloudTasksClient()

// Get queue containing stats
const queue = await client.getQueue({name: '..'}, {otherArgs: {readMask: '*'}})
The readMask specifies which paths of the response object to fetch. The response will include every possible path, with placeholders like null, UNSPECIFIED, etc. for the fields you didn't ask for, and the actual values for the paths you listed.
const request = {
  ...
  readMask: { paths: ['name', 'stats', 'state', ...] }
};
getQueue
const { v2beta3 } = require('@google-cloud/tasks');
const tasksClient = new v2beta3.CloudTasksClient();

async function main() {
  const request = {
    name: 'projects/PROJECT/locations/LOCATION/queues/QUEUE',
    readMask: { paths: ['name', 'stats'] }
  };
  const [response] = await tasksClient.getQueue(request);
  console.log(response);
}

main();

/*
{
  name: 'projects/PROJECT/locations/LOCATION/queues/QUEUE',
  ...
  stats: {
    tasksCount: '113',
    oldestEstimatedArrivalTime: null,
    executedLastMinuteCount: '0',
    concurrentDispatchesCount: '0',
    effectiveExecutionRate: 500
  }
}
*/
listQueues
const { v2beta3 } = require('@google-cloud/tasks');
const tasksClient = new v2beta3.CloudTasksClient();

async function main() {
  const request = {
    parent: 'projects/PROJECT/locations/LOCATION',
    readMask: { paths: ['name', 'stats'] }
  };
  const [response] = await tasksClient.listQueues(request);
  console.log(response);
}

main();

/*
[
  {
    name: 'projects/PROJECT/locations/LOCATION/queues/QUEUE',
    ...
    stats: {
      tasksCount: '1771',
      oldestEstimatedArrivalTime: [Object],
      executedLastMinuteCount: '0',
      concurrentDispatchesCount: '0',
      effectiveExecutionRate: 500
    }
  },
  ...
]
*/
By taking a look at the source code of the client library, I see no reference to the readMask parameter specified in the v2beta3 version of the REST API's projects.locations.queues.get method.
The relevant method in the Node.js client library, getQueue(), expects a request of type IGetQueueRequest, which doesn't have a readMask parameter and only expects the name property.
Nonetheless, this implementation might change in the future to include a relevant way to get the stats.
Regarding the REST API itself, there is an error in the public docs in the readMask section, as * is not a valid character. If you want to get the Queue.stats field, you should simply enter stats in the readMask parameter. If you want all the relevant fields, you should enter all of them (e.g. name,rateLimits,retryConfig,state,taskTtl,tombstoneTtl,type,stats gets all the fields you'd get from calling the method, plus the Queue.stats field).
As a workaround, you can expand the Try this API section of the docs for the relevant method and open the JAVASCRIPT tab to get sample code showing how to build the request.
EDIT JANUARY 23rd 2020
The documentation was corrected to state that:
[Queue.stats] will be returned only if it was explicitly specified in the mask.
This means that simply writing stats in the readMask field will return the stats.
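If your installed client library version still lacks readMask support, one workaround is to call the v2beta3 REST endpoint directly. A minimal sketch, assuming google-auth-library is available and application default credentials are configured:

const { GoogleAuth } = require('google-auth-library');

async function getQueueStats(queuePath) {
  // queuePath is e.g. 'projects/PROJECT/locations/LOCATION/queues/QUEUE'
  const auth = new GoogleAuth({ scopes: 'https://www.googleapis.com/auth/cloud-platform' });
  const client = await auth.getClient();

  // Explicitly ask for the stats field, per the corrected docs above.
  const url = `https://cloudtasks.googleapis.com/v2beta3/${queuePath}?readMask=name,stats`;
  const res = await client.request({ url });
  return res.data;
}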

How can one upload an image to a KeystoneJS GraphQL endpoint?

I'm using TinyMCE in a custom field for the KeystoneJS AdminUI, which is a React app. I'd like to upload images from the React front end to the KeystoneJS GraphQL back end. I can upload the images using a REST endpoint I added to the Keystone server (passing TinyMCE an images_upload_handler callback), but I'd like to take advantage of Keystone's already-built GraphQL endpoint for an Image list/type I've created.
I first tried the approach detailed in this article, using axios to upload the image:
const getGQL = (theFile) => {
  const query = gql`
    mutation upload($file: Upload!) {
      createImage(file: $file) {
        id
        file {
          path
          filename
        }
      }
    }
  `;
  // The operation contains the mutation itself as "query"
  // and the variables that are associated with the arguments.
  // The file variable is null because we can only pass text
  // in operation variables.
  const operation = {
    query,
    variables: {
      file: null
    }
  };
  // This map is used to associate the file saved in the body
  // of the request under "0" with the operation variable "variables.file".
  const map = {
    '0': ['variables.file']
  };
  // This is the body of the request.
  // The FormData constructor builds a multipart/form-data request body.
  // Here we add the operation, map, and file to upload.
  const body = new FormData();
  body.append('operations', JSON.stringify(operation));
  body.append('map', JSON.stringify(map));
  body.append('0', theFile);
  // Create the options of our POST request
  const opts = {
    method: 'post',
    url: 'http://localhost:4545/admin/api',
    body
  };
  // @ts-ignore
  return axios(opts);
};
but I'm not sure what to pass as theFile. TinyMCE's images_upload_handler, from which I need to call the image upload, receives a blobInfo object that exposes accessors for the file's name, blob, and base64 data.
The file name doesn't work, and neither does the blob: both give me a 500 server error, and the error message isn't more specific.
I would prefer to use a GraphQL client to upload the image -- another SO article suggests using apollo-upload-client. However, I'm operating within the KeystoneJS environment, and Apollo-upload-client says
Apollo Client can only have 1 “terminating” Apollo Link that sends the
GraphQL requests; if one such as apollo-link-http is already setup,
remove it.
I believe Keystone has already set up Apollo-link-http (it comes up multiple times on search), so I don't think I can use Apollo-upload-client.
The UploadLink is just a drop-in replacement for HttpLink. There's no reason you shouldn't be able to use it. There's a demo KeystoneJS app here that shows the Apollo Client configuration, including using createUploadLink.
Actual usage of the mutation with the Upload scalar is shown here.
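A minimal sketch of that client configuration, assuming apollo-upload-client and @apollo/client are installed and that the Keystone GraphQL endpoint is at /admin/api:

const { ApolloClient, InMemoryCache } = require('@apollo/client');
const { createUploadLink } = require('apollo-upload-client');

// createUploadLink is a drop-in replacement for the default HTTP link.
// It switches to a multipart/form-data request whenever the variables
// contain File/Blob values, which is what the Upload scalar needs.
const apolloClient = new ApolloClient({
  link: createUploadLink({ uri: 'http://localhost:4545/admin/api' }),
  cache: new InMemoryCache(),
});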
Looking at the source code, you should be able to use a custom image handler and call blob on the provided blobInfo object. Something like this:
tinymce.init({
  images_upload_handler: async function (blobInfo, success, failure) {
    const image = blobInfo.blob()
    try {
      await apolloClient.mutate({
        mutation: gql` mutation($image: Upload!) { ... } `,
        variables: { image }
      })
      success()
    } catch (e) {
      failure(e)
    }
  }
})
I used to have the same problem and solved it with Apollo upload link. Then, when the app got into the production phase, I realized that Apollo Client took a third of the gzipped build, so I created a minimal GraphQL client just for Keystone use, with automatic image upload. The package is available on npm: https://www.npmjs.com/package/@sylchi/keystone-graphql-client
Usage example that uploads the GitHub logo to a user profile, assuming there is a user with an avatar field set as a file:
import { mutate } from '@sylchi/keystone-graphql-client'

const getFile = () => fetch('https://github.githubassets.com/images/modules/logos_page/GitHub-Mark.png',
  {
    mode: "cors",
    cache: "no-cache"
  })
  .then(response => response.blob())
  .then(blob => {
    return new File([blob], "file.png", { type: "image/png" })
  });

getFile().then(file => {
  const options = {
    mutation: `
      mutation($id: ID!, $data: UserUpdateInput!){
        updateUser(id: $id, data: $data){
          id
        }
      }
    `,
    variables: {
      id: "5f5a7f712a64d9db72b30602", // replace with user id
      data: {
        avatar: file
      }
    }
  };
  mutate(options).then(result => console.log(result));
});
The whole package is just 50 LOC with 1 dependency :)
The easiest way for me was to use graphql-request. The advantage is that you don't need to manually set any header props, and it works with the variables you get from the images_upload_handler, as the docs describe.
I did it this way:
const { request, gql } = require('graphql-request')

const query = gql`
  mutation IMAGE ($file: Upload!) {
    createImage (data: {
      file: $file,
    }) {
      id
      file {
        publicUrl
      }
    }
  }
`

images_upload_handler = (blobInfo, success) => {
  // blobInfo and success are the arguments you get from tinymce
  const variables = {
    file: blobInfo.blob()
  }
  request(GRAPHQL_API_URL, query, variables)
    .then(data => {
      console.log(data)
      success(data.createImage.file.publicUrl)
    })
}
For Keystone 5, editorConfig would strip out functions, so I clone the field and set the function in the views/Field.js file.
Good luck ( ^_^)/*

Node.js gcloud - Upload to Google Storage with public-read property/custom cache expiry

I am trying to upload to Google Storage using the gcloud library (Node.js).
I need to enable the public-read property and also set the cache expiration to 5 minutes.
I am using this (simplified) code:
storage = gcloud.storage({options});
bucket = storage.bucket('name');
fs.createReadStream(srcPath).pipe(bucket.file(targetFile).createWriteStream()).on('error', function(err) { ... });
How do I go about setting the appropriate ACL/cache expiry?
(I found this but am not sure what to make of it:
https://googlecloudplatform.github.io/gcloud-node/#/docs/v0.11.0/storage?method=acl)
Thanks for the help
You can set the predefined ACL following the instructions here:
yourBucket.acl.default.add({
  entity: "allUsers",
  role: gcloud.storage.acl.READER_ROLE
}, function (err) {})
Regarding cache control, I don't believe you can set a default, but you can set it at the time of uploading your file:
var opts = { metadata: { cacheControl: "public, max-age=300" } }
bucket.file(targetFile).createWriteStream(opts)
Reference: https://cloud.google.com/storage/docs/reference-headers#cachecontrol
The API changed; use:
var gcloud = require('gcloud')({
  projectId: 'your_id',
  keyFilename: 'your_path'
});

var storage = gcloud.storage();
var bucket = storage.bucket('bucket_name');

bucket.acl.default.add({
  entity: 'allUsers',
  role: storage.acl.READER_ROLE
}, function(err) {});
To make the entire bucket public you can also use:
bucket.makePublic
Source: https://github.com/GoogleCloudPlatform/gcloud-node/blob/v0.16.0/lib/storage/bucket.js#L607
Or for just the file:
var bucketFile = bucket.file(filename);
// If you upload a new file, make sure to do this
// in the callback of upload success, otherwise it will throw a 404 error
bucketFile.makePublic(function(err) {});
Source: https://github.com/GoogleCloudPlatform/gcloud-node/blob/v0.16.0/lib/storage/file.js#L1241 (link might change, look for makePublic in the source code.)
Or:
bucketFile.acl.add({
  scope: 'allUsers',
  role: storage.acl.READER_ROLE
}, function(err, aclObject) {});
which is the verbose version.
Source: https://github.com/GoogleCloudPlatform/gcloud-node/blob/v0.16.0/lib/storage/file.js#L116
Stephen's answer is accurate, but it did not work for me at first because the value didn't get set. After some trial and error, it turns out the key has to be cacheControl (no dash) for it to work. At the time of writing, this requirement is not documented anywhere. I assume other fields have the same issue.
var opts = { metadata: { "cacheControl": "public, max-age=300" } }
bucket.file(targetFile).createWriteStream(opts)
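For anyone on the current @google-cloud/storage package (the successor to gcloud), here is a sketch combining both requirements in one upload; the bucket and file names are illustrative:

const { Storage } = require('@google-cloud/storage');

const storage = new Storage();
const bucket = storage.bucket('bucket_name');

async function uploadPublic(srcPath, targetFile) {
  // Upload with a 5-minute cache lifetime...
  await bucket.upload(srcPath, {
    destination: targetFile,
    metadata: { cacheControl: 'public, max-age=300' },
  });
  // ...then make the object publicly readable.
  await bucket.file(targetFile).makePublic();
}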
