Trouble importing an image - node.js

I keep getting this error when importing an image and using canvas with discord.js:
C:\Users\Travi\OneDrive\Documents\GitHub\re\src\img\licenseTemp.png:1
�PNG
SyntaxError: Invalid or unexpected token
Here's my code:
I'm also using module-alias, which is why the import is const licenseTemp = require('#img').
const BaseCommand = require('../../utils/structures/BaseCommand');
const Canvas = require('canvas');
const { MessageAttachment } = require('discord.js');
const licenseTemp = require('#img');

module.exports = class RankCommand extends BaseCommand {
  constructor() {
    super('rank', 'Information', []);
  }

  async run(client, message, args) {
    const canvas = Canvas.createCanvas(449, 292);
    const ctx = canvas.getContext('2d');
    const background = await Canvas.loadImage(licenseTemp);
    ctx.drawImage(background, 0, 0, canvas.width, canvas.height);
    const attachment = new MessageAttachment(canvas.toBuffer(), 'license.png');
    message.channel.send(attachment);
  }
}

Do not require('#img'): that makes Node try to parse the image file's binary content as JavaScript source, which is why the error shows the PNG header followed by SyntaxError: Invalid or unexpected token.
Instead, let node-canvas's loadImage handle fetching that resource: make licenseTemp the path (or URL) to the image itself.
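For example, a minimal sketch of that fix (the relative path below is an assumption; adjust it to your project layout):

const path = require('path');
const Canvas = require('canvas');

// (assumption) resolve the PNG on disk instead of require()-ing it
const licenseTemp = path.join(__dirname, '../../img/licenseTemp.png');

async function drawLicense() {
  const canvas = Canvas.createCanvas(449, 292);
  const ctx = canvas.getContext('2d');
  // loadImage accepts a file path, URL, or Buffer
  const background = await Canvas.loadImage(licenseTemp);
  ctx.drawImage(background, 0, 0, canvas.width, canvas.height);
  return canvas.toBuffer();
}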

Related

instagram-private-api throwing an error while trying to post an image on Instagram

const IgApiClient = require('instagram-private-api').IgApiClient;
const fs = require('fs');
const config = require('./config.json');

const ig = new IgApiClient();
ig.state.generateDevice(config.instagramUsername);
ig.state.proxyUrl = config.igProxy;

async function uploadImage(imageBuffer) {
  const imageUrl = await ig.upload.photo({
    file: imageBuffer
  });
  console.log(`URL: ${imageUrl}`);
}

(async () => {
  await ig.simulate.preLoginFlow();
  const loggedInUser = await ig.account.login(config.instagramUsername, config.instagramPassword);
  process.nextTick(async () => await ig.simulate.postLoginFlow());
  const imageBuffer = fs.readFileSync('output.png');
  uploadImage(imageBuffer);
})();
Error:
C:\Users\PC\Desktop\spotted\node_modules\instagram-private-api\dist\core\request.js:126
return new errors_1.IgResponseError(response);
^
IgResponseError: POST /rupload_igphoto/1672943340753_0_8268117741 - 400 Bad Request;
at Request.handleResponseError (C:\Users\PC\Desktop\spotted\node_modules\instagram-private-api\dist\core\request.js:126:16)
at Request.send (C:\Users\PC\Desktop\spotted\node_modules\instagram-private-api\dist\core\request.js:54:28)
at async UploadRepository.photo (C:\Users\PC\Desktop\spotted\node_modules\instagram-private-api\dist\repositories\upload.repository.js:18:26)
at async uploadImage (C:\Users\PC\Desktop\spotted\bomba.js:10:20)
Node.js v19.2.0
Hello, I've been trying to create an app that automatically posts generated images to Instagram, but it doesn't work even though I'm (probably) doing it the way the documentation intends. Does anyone have any ideas?
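A frequently reported cause of a 400 on /rupload_igphoto/ is that Instagram's photo upload only accepts JPEG data, while this code reads a PNG (output.png). A minimal sketch of that workaround, assuming the sharp package is installed (this is a guess at the cause, not a confirmed fix for this exact error):

const sharp = require('sharp');

// (assumption) re-encode the generated PNG as JPEG before handing it to upload.photo
async function uploadImageAsJpeg(pngPath) {
  const jpegBuffer = await sharp(pngPath).jpeg({ quality: 90 }).toBuffer();
  return ig.upload.photo({ file: jpegBuffer });
}

// usage: await uploadImageAsJpeg('output.png');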

When I use the plaiceholder plugin to blur images in nextjs, why does it crash when there are more images?

I use the following code to blur remote images. When the number of images is small it runs fine, but now that I have 50 images the browser gets stuck after the project runs.
What is the best way to optimize this? Is it caused by the plaiceholder plugin or by Next.js itself?
import fs from 'fs';
import path from 'path';
import { getPlaiceholder } from 'plaiceholder';
import { PhotoData } from '../types';

const directory = path.join(process.cwd(), 'data');

export async function getPhotos() {
  const filePath = path.join(directory, 'photos.json');
  const jsonData = fs.readFileSync(filePath, 'utf8');
  const photosData = JSON.parse(jsonData).sort((a: PhotoData, b: PhotoData) =>
    new Date(b.date) > new Date(a.date) ? 1 : -1
  );
  const photosProps = await Promise.all(
    photosData.map(async (photoData: PhotoData) => {
      const { thumbnail } = photoData;
      const { base64, img } = await getPlaiceholder(thumbnail);
      return {
        ...img,
        blurDataURL: base64,
      };
    })
  ).then((values) => values);
  return {
    photosProps,
    photosData,
  };
}
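One possible mitigation (an assumption on my part, not from the original post): Promise.all fires all 50 getPlaiceholder calls at once, and each one downloads and decodes a remote image, so limiting the concurrency by processing thumbnails in small batches may help. A sketch (type annotations omitted):

async function mapInBatches(items, batchSize, fn) {
  // run fn over items, at most batchSize at a time
  const results = [];
  for (let i = 0; i < items.length; i += batchSize) {
    const batch = items.slice(i, i + batchSize);
    results.push(...(await Promise.all(batch.map(fn))));
  }
  return results;
}

// usage inside getPhotos(), replacing the Promise.all(...) call:
// const photosProps = await mapInBatches(photosData, 5, async (photoData) => {
//   const { base64, img } = await getPlaiceholder(photoData.thumbnail);
//   return { ...img, blurDataURL: base64 };
// });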

Convert SVG to PNG and download, vue 2

I'm looking for a clean way to convert an SVG to PNG and download it, in Vue 2. What is the best approach?
I finally managed to come up with this solution:
save() {
  const a = document.querySelector("a");
  const triggerDownload = (imgURI: string) => {
    const evt = new MouseEvent("click", {
      view: window,
      bubbles: false,
      cancelable: true,
    });
    a?.setAttribute("download", "FileName.png");
    a?.setAttribute("href", imgURI);
    a?.setAttribute("target", "_blank");
    a?.dispatchEvent(evt);
  };
  const canvas = document.getElementById("canvas") as HTMLCanvasElement;
  const ctx = canvas.getContext("2d");
  const data = new XMLSerializer().serializeToString(
    document.getElementById("svgId") as Node
  );
  const DOMURL = window.URL || window.webkitURL || window;
  const img = new Image();
  const svgBlob = new Blob([data], { type: "image/svg+xml;charset=utf-8" });
  const url = DOMURL.createObjectURL(svgBlob);
  img.onload = function () {
    ctx?.drawImage(img, 0, 0);
    DOMURL.revokeObjectURL(url);
    const imgURI = canvas
      .toDataURL("image/png")
      .replace("image/png", "image/octet-stream");
    triggerDownload(imgURI);
    ctx?.clearRect(0, 0, canvas.width, canvas.height);
  };
  img.src = url;
},
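If the exported PNG comes out cropped or at the wrong size, one adjustment worth trying (an assumption, not part of the original solution) is to size the canvas to the loaded SVG image inside the onload handler, before drawing:

img.onload = function () {
  // (assumption) match the canvas to the rendered SVG so nothing is cropped
  canvas.width = img.width;
  canvas.height = img.height;
  ctx?.drawImage(img, 0, 0);
  // ...rest of the handler unchanged
};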

How to process webcam images on nodejs using Posenet?

I'm trying to process images from the webcam using Posenet on the server side, but I'm not sure how to pass the image data to estimateSinglePose.
Below is a simplified version of the code:
CLIENT

const imageData = context.getImageData(0, 0, 320, 180);
const buffer = imageData.data.buffer;
socket.emit("signal", buffer); // pass it to the server through the websocket

BACKEND

socket.on("signal", (data) => {
  const buffer = new Uint8Array(data);
  const image = ts.tensor(data).reshape([180, 320, -1]);
  // this is where I'm stuck; I don't know how to pass the image to estimateSinglePose
});
EDIT 1
Passing it to estimateSinglePose resulted in an error:
Error: Invalid TF_Status: 3
Message: Incompatible shapes: [193,257,4] vs. [3]
at NodeJSKernelBackend.executeSingleOutput (/Users/xxx/app/server/node_modules/@tensorflow/tfjs-node/dist/nodejs_kernel_backend.js:209:43)
at Object.kernelFunc (/Users/xxx/app/server/node_modules/@tensorflow/tfjs-node/dist/kernels/Add.js:28:24)
at kernelFunc (/Users/xxx/app/server/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:3139:32)
at /Users/xxx/app/server/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:3200:27
at Engine.scopedRun (/Users/xxx/app/server/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:3012:23)
at Engine.runKernelFunc (/Users/xxx/app/server/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:3196:14)
at Engine.runKernel (/Users/xxx/app/server/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:3068:21)
at add_ (/Users/xxx/app/server/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:8969:19)
at Object.add__op [as add] (/Users/xxx/app/server/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:3986:29)
at ResNet.preprocessInput (/Users/xxx/app/server/node_modules/@tensorflow-models/posenet/dist/resnet.js:41:19)
estimateSinglePose takes an HTMLImageElement, HTMLCanvasElement, or HTMLVideoElement as its input. Server-side with Node.js, you can use the canvas package to get the same behavior as the canvas in the browser:
const posenet = require('@tensorflow-models/posenet');
const { Image, createCanvas } = require('canvas');

const canvas = createCanvas(320, 180); // frame size used by the client's getImageData
const ctx = canvas.getContext('2d');
const net = await posenet.load();

socket.on("signal", async (data) => {
  ctx.putImageData(data, 0, 0);
  const pose = await net.estimateSinglePose(canvas, {
    flipHorizontal: false
  });
  // you can now use pose
});
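Note that what arrives over the socket is a raw ArrayBuffer, not an ImageData, so it needs to be wrapped before putImageData will accept it. A sketch of that step, assuming the 320x180 frame size from the question (node-canvas exports an ImageData class for this):

const { ImageData } = require('canvas');

socket.on("signal", async (data) => {
  // wrap the raw RGBA bytes from the client's getImageData() back into an ImageData
  const pixels = new Uint8ClampedArray(data);
  const frame = new ImageData(pixels, 320, 180);
  ctx.putImageData(frame, 0, 0);
  const pose = await net.estimateSinglePose(canvas, { flipHorizontal: false });
  // you can now use pose
});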

My object shows all black in STL Reader of VTK.js

I am using vtk.js in my Angular app to display 3D STL objects. I know STL files don't carry color information, but my 3D object should at least be white, with its details visible. Instead, it renders completely black, with no detail at all.
import vtkFullScreenRenderWindow from 'vtk.js/Sources/Rendering/Misc/FullScreenRenderWindow';
import vtkActor from 'vtk.js/Sources/Rendering/Core/Actor';
import vtkMapper from 'vtk.js/Sources/Rendering/Core/Mapper';
import vtkSTLReader from 'vtk.js/Sources/IO/Geometry/STLReader';

ngOnInit(): void {
  const reader = vtkSTLReader.newInstance();
  const mapper = vtkMapper.newInstance({ scalarVisibility: false });
  const actor = vtkActor.newInstance();
  actor.setMapper(mapper);
  mapper.setInputConnection(reader.getOutputPort());

  function update() {
    const fullScreenRenderer = vtkFullScreenRenderWindow.newInstance();
    const renderer = fullScreenRenderer.getRenderer();
    const renderWindow = fullScreenRenderer.getRenderWindow();
    const resetCamera = renderer.resetCamera;
    const render = renderWindow.render;
    renderer.addActor(actor);
    resetCamera();
    render();
  }

  const myContainer = document.querySelector('body');
  const fileContainer = document.createElement('div');
  fileContainer.innerHTML = '<input type="file" class="file"/>';
  myContainer.appendChild(fileContainer);
  const fileInput = fileContainer.querySelector('input');

  function handleFile(event) {
    event.preventDefault();
    const dataTransfer = event.dataTransfer;
    const files = event.target.files || dataTransfer.files;
    if (files.length === 1) {
      myContainer.removeChild(fileContainer);
      const fileReader = new FileReader();
      fileReader.onload = function onLoad(e) {
        reader.parseAsArrayBuffer(fileReader.result);
        update();
      };
      fileReader.readAsArrayBuffer(files[0]);
    }
  }

  fileInput.addEventListener('change', handleFile);
  reader.setUrl("./assets/test2.stl", { binary: true }).then(update);
}
How can I add color to my object? I couldn't find any example about it, either.
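One thing worth trying (an assumption on my part; there is no accepted answer here): give the actor an explicit surface color and lighting coefficients through its property, e.g. right after actor.setMapper(mapper):

// (assumption) explicit near-white surface so the mesh is not rendered black
actor.getProperty().setColor(0.9, 0.9, 0.9);
actor.getProperty().setAmbient(0.2);
actor.getProperty().setDiffuse(0.8);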