How to load an atlas using base64 in Phaser 3? - phaser-framework

I am making a Phaser 3 playable ad and need to put images and other assets into a single HTML file. I can load the images using textures.addBase64, but how can I load an atlas using base64?
Also, if you could tell me how to put the JSON in the HTML file so that I can refer to it while loading the atlas.
Thank you 🙂

Untested personally, but take a look at this solution:
private loadBase64Atlas(key: string, data: string, json: object): void {
  const imageElement = new Image();
  imageElement.onload = () => {
    this.scene.textures.addAtlas(key, imageElement, json);
    this.onAssetLoaded();
  };

  const spriteBlob = this.base64ToBlob(data.split(',')[1], 'image/png');
  imageElement.src = URL.createObjectURL(spriteBlob);
}
Adapt as needed if you're not using TypeScript. Here's the base64ToBlob implementation.
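The original answer pointed to an external base64ToBlob implementation that isn't reproduced here; a minimal sketch of what such a helper typically looks like (an assumption, not necessarily the code that was linked):
private base64ToBlob(base64: string, contentType: string): Blob {
  // Decode the base64 payload into raw bytes.
  const byteCharacters = atob(base64);
  const bytes = new Uint8Array(byteCharacters.length);
  for (let i = 0; i < byteCharacters.length; i++) {
    bytes[i] = byteCharacters.charCodeAt(i);
  }
  // Wrap the bytes in a Blob with the given MIME type (e.g. 'image/png').
  return new Blob([bytes], { type: contentType });
}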
The JSON object can just be bounced back and forth to and from base64 if you need to embed it as base64 as well:
// get the base64 string to embed in the ad and physically store it
const atlasJSON64 = btoa(JSON.stringify(atlasJSON));
[...]
// at runtime, decode it back into a plain object
const objectatlasJSON64 = JSON.parse(atob(atlasJSON64));
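Putting the pieces together, a hedged sketch of how the embedded strings could be fed into the loader above (the constant names are illustrative, not from the original answer):
// Assumed inline constants embedded in the single HTML file.
const ATLAS_IMAGE_B64 = 'data:image/png;base64,...'; // the atlas sheet as a data URI
const ATLAS_JSON_B64 = '...';                        // result of btoa(JSON.stringify(atlasJSON))

// Decode the atlas JSON and hand both pieces to the loader defined above.
const atlasJson = JSON.parse(atob(ATLAS_JSON_B64));
this.loadBase64Atlas('my-atlas', ATLAS_IMAGE_B64, atlasJson);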

Related

Convert an image from a URL to base64 without downloading it

I want to generate a PDF with images and text; for this I'm using the jsPDF package, which includes this method to add images.
addImage(imageData, format, x, y, width, height, alias, compression, rotation)
The imageData argument accepts base64-format data, and this is the part I'm struggling with.
My images are not local but hosted on Cloudinary, and I would prefer not to download them. Searching the web, I've found that if I don't want to download the image I will need to:
Fetch the image data; for this I'm using the node-fetch package (fetch in the code).
Convert the image data into a Buffer.
Finally encode that buffer into a base64.
This is my attempt so far:
const fetchImage = async () => {
  const imageUrl = <url-of-my-image>;
  const response = await fetch(imageUrl, {
    compress: false
  });

  const dataUrlPrefix = `data:${response.headers.get('content-type')};base64,`;
  const body = await response.text();
  const buffer = Buffer.from(body);
  const imageBase64 = buffer.toString('base64');
  const imageDataUrl = dataUrlPrefix + imageBase64;

  doc.text("Hello world!", 10, 10);
  doc.addImage(imageDataUrl, 'webp', 15, 40, 120, 120);
  doc.save("a4.pdf");
}
The code runs without errors, but the image is not being inserted into the PDF; it just displays the "Hello world!" text and nothing else.
My guess is that I'm doing something wrong in the converting/encoding process (before I add it to jsPDF), because if I convert my image with an online converter like this one, the resulting base64 string is successfully decoded into an actual image by an online base64 decoder like this one, whereas the base64 output from my Node.js code (i.e. the imageBase64 or imageDataUrl variables) produces no image when run through the same decoder.
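No answer is quoted with this question, but the likely culprit is response.text(), which decodes the binary body as UTF-8 and corrupts it before the base64 step. A hedged sketch of the usual fix, reading the body as an ArrayBuffer instead (same placeholders as in the question):
const fetchImage = async () => {
  const imageUrl = '<url-of-my-image>'; // placeholder from the question

  const response = await fetch(imageUrl);
  const dataUrlPrefix = `data:${response.headers.get('content-type')};base64,`;

  // arrayBuffer() preserves the raw bytes, unlike text()
  const arrayBuffer = await response.arrayBuffer();
  const imageBase64 = Buffer.from(arrayBuffer).toString('base64');
  const imageDataUrl = dataUrlPrefix + imageBase64;

  doc.text("Hello world!", 10, 10);
  doc.addImage(imageDataUrl, 'webp', 15, 40, 120, 120);
  doc.save("a4.pdf");
}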

How do I create an expressjs endpoint that uses azure tts to send audio to a web app?

I am trying to figure out how to expose an Express route (i.e. GET /api/word/:some_word) which uses the Azure TTS SDK (microsoft-cognitiveservices-speech-sdk) to generate an audio version of some_word (in any format playable by a browser) and res.send()'s the resulting audio, so that a front-end JavaScript web app could consume the API in order to play the audio pronunciation of the word.
I have the azure sdk 'working' - it is creating an 'ArrayBuffer' inside my expressjs code. However, I do not know how to send the data in this ArrayBuffer to the front end. I have been following the instructions here: https://learn.microsoft.com/en-us/azure/cognitive-services/speech-service/get-started-text-to-speech?tabs=import%2Cwindowsinstall&pivots=programming-language-javascript#get-result-as-an-in-memory-stream
Another way to phrase my question would be: 'In Express, I have an ArrayBuffer whose contents are an .mp3/.ogg/.wav file. How do I send that file via Express? Do I need to convert it into some other data type (like a Base64-encoded string? A buffer?) Do I need to set some particular response headers?'
I finally figured it out seconds after asking this question 😂
I am pretty new to this area, so any pointers on how this could be improved would be appreciated.
app.get('/api/tts/word/:word', async (req, res) => {
  const word = req.params.word;
  const subscriptionKey = azureKey;
  const serviceRegion = 'australiaeast';

  const speechConfig = sdk.SpeechConfig.fromSubscription(
    subscriptionKey as string,
    serviceRegion
  );
  speechConfig.speechSynthesisOutputFormat =
    SpeechSynthesisOutputFormat.Ogg24Khz16BitMonoOpus;

  const synthesizer = new sdk.SpeechSynthesizer(speechConfig);
  synthesizer.speakSsmlAsync(
    `
    <speak version="1.0" xmlns="http://www.w3.org/2001/10/synthesis"
        xmlns:mstts="https://www.w3.org/2001/mstts" xml:lang="zh-CN">
      <voice name="zh-CN-XiaoxiaoNeural">
        ${word}
      </voice>
    </speak>
    `,
    (resp) => {
      const audio = resp.audioData;
      synthesizer.close();
      const buffer = Buffer.from(audio);
      res.set('Content-Type', 'audio/ogg; codecs=opus; rate=24000');
      res.send(buffer);
    }
  );
});
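For the front-end side of the original question, a hedged sketch of how a browser client could consume this endpoint and play the audio (plain browser APIs; the route path is the one defined above):
async function playWord(word: string): Promise<void> {
  // Fetch the synthesized audio from the Express route above.
  const response = await fetch(`/api/tts/word/${encodeURIComponent(word)}`);
  const blob = await response.blob(); // audio/ogg body
  const audio = new Audio(URL.createObjectURL(blob));
  await audio.play();
}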

How can you resize an image in NodeJS using Sharp having only a URL, using async/await, and without a local copy being created?

I'm working in an environment where the available image-processing library is Node.js's Sharp for scaling images. It has been stable, as it is pipe-based, but I'm tasked with converting it to TypeScript and setting it up with async/await where possible. I have most of the pieces ready, but the issue I'm facing is that all I have is a URL of an image, and Sharp expects either a string URI (local file only) or a Buffer.
Currently, I am using the Axios package to fetch the image as a string, retrievable via the data property on the response. I've been feeding a buffer created from that string by Buffer.from(response.data) into Sharp, and it doesn't have any issues until I try to "work" with the image by attempting to gather the metadata. At this point it throws an error: [Error: Input buffer contains unsupported image format]. But I know that the image is valid, as it worked in the old system and I didn't change any dependencies.
I use Quokka.js to test, and the following PoC fails; I need to get it into working order.
import axios from 'axios';
import Sharp from 'sharp';

const url = 'https://dqktdb1dhykn6.cloudfront.net/357882-TLRKytH3h.jpg';

const imageResponse = await axios({url: url, responseType: 'stream'});
const buffer = Buffer.from(imageResponse.data);

let src = new Sharp(buffer);
const src2 = src.clone(); // this is simply because it will end up being a loop, if this is the issue let me know

try {
  await src2.jpeg();
  await src2.resize(null, 1920);
  await src2.resize(1080, null);
  const metadata = await src2.clone().metadata(); // this is where it fails
  console.log(metadata);
} catch (e) {
  console.log(e); // logs the mentioned error
}
If anybody has any idea what I am doing incorrectly, or there is anything specific you would like me to add, please let me know! If I need to pipe the image data, let me know. I've tried to pipe it directly and got a 'pipe is not a function' error on the string (which makes sense).
Update #1:
A big thank you to #Thee_Sritabtim for the comment, which solved the issue. Basically, I had been trying to convert a stream-based string into a Buffer. I instead needed to declare that the request was for an arraybuffer, and then feed it into Sharp while declaring its type as binary. The working version of the PoC is below!
import axios from 'axios';
import Sharp from 'sharp';

const url = 'https://dqktdb1dhykn6.cloudfront.net/357882-TLRKytH3h.jpg';

const imageResponse = await axios({url: url, responseType: 'arraybuffer'});
const buffer = Buffer.from(imageResponse.data, 'binary');

let src = new Sharp(buffer);
try {
  await src.jpeg();
  await src.resize(null, 1920);
  await src.resize(1080, null);
  const metadata = await src.metadata(); // this was where it failed, but now it prints an object of metadata
  console.log(metadata);
} catch (e) {
  console.log(e); // doesn't catch anything any more!
}
To get a buffer from an axios response, you'll have to set responseType to 'arraybuffer'.
const imageResponse = await axios({url: url, responseType: 'arraybuffer'})
const buffer = Buffer.from(imageResponse.data, 'binary')
Alternatively, you could also use a stream as input for sharp(), so you could keep responseType as 'stream':
const imageResponse = await axios({url: url, responseType: 'stream'})
const src = imageResponse.data.pipe(sharp())
//...
const metadata = await src.metadata()

How to convert base64 string of an image to an uploadable file without writing it to a filesystem

I have a variable in my Node.js application which contains a base64 string of an image. I need to send a form to some server with a POST request containing this image. The problem is I can't convert the base64 string to an image without writing it to the filesystem. Here's my code:
const imagePath = path.resolve(__dirname, '../../../images/anomalies/' + Date.now() + '.png')

fs.writeFileSync(imagePath, img, { encoding: 'base64' })
setTimeout(() => {
  fs.unlinkSync(imagePath)
}, 30_000)

const form = new FormData()
form.append('photo', fs.createReadStream(imagePath))
As you can see, I need to write the base64 string to a file and then grab it with fs.createReadStream, otherwise the file won't upload. I tried converting it to a ReadStream via stream-buffers (but the server still didn't accept that data), and I also tried to make a Blob from it, but Node doesn't have Blobs, and the npm modules for this are either too old or lack typings, which is not great at all. I also tried Buffer.from(base64, 'base64'), which doesn't work either.
Is there any way to create an uploadable image file from a base64-encoded string without accessing the filesystem in Node.js?
You should be able to convert the Buffer object returned by fs.readFileSync to a base64 string:
const base64Image = fs.readFileSync(imagePath).toString('base64')
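For the direction the question actually asks about (a base64 string to an uploadable form part without touching the filesystem), the form-data package's append also accepts a plain Buffer when you pass a filename; a hedged sketch (field and file names are illustrative):
import FormData from 'form-data';

// img is the variable holding the base64 string from the question
const form = new FormData();
form.append('photo', Buffer.from(img, 'base64'), {
  filename: 'image.png',     // illustrative filename
  contentType: 'image/png',
});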

How to decode base64 PDF string in Flutter?

I know there is a library called dart:convert which lets me decode a base64 image. But apparently it doesn't work with PDF files. How can I decode a base64 PDF file in Flutter?
I want to store it in Firebase Storage (I know how to do that), but I need the File variable to do it.
I have a web service written in Node.js where I have a POST route. There, I create a PDF file and encode it to base64. The response is a base64 string; look at the code.
router.post('/pdf', (req, res, next) => {
  //res.send('PDF');
  const fname = req.body.fname;
  const lname = req.body.lname;

  var documentDefinition = {
    content: [ /* write your pdf with pdfMake.org */ ],
    styles: { /* write your style */ }
  };

  const pdfDoc = pdfMake.createPdf(documentDefinition);
  pdfDoc.getBase64((data) => {
    res.send({ "base64": data });
  });
});
As you can see, it returns the pdf as a base64 string.
Now, in Flutter, I have written this:
http.post("https://mypostaddreess.com",body: json.encode({"data1":"data"}))
.then((response) {
print("Response status: ${response.statusCode}");
print("Response body: ${response.body}");
var data = json.decode(response.body);
var pdf = base64.decode(data["base64"]);
});
}
I have the PDF in the variable 'pdf' as you see. But I don't know how to decode it to download the pdf or show it in my Flutter app.
@SwiftingDuster
A little addition: besides decoding, it may also be necessary to create the PDF file and open it.
createPdf() async {
  var bytes = base64Decode(widget.base64String.replaceAll('\n', ''));
  final output = await getTemporaryDirectory();
  final file = File("${output.path}/example.pdf");
  await file.writeAsBytes(bytes.buffer.asUint8List());
  print("${output.path}/example.pdf");
  await OpenFile.open("${output.path}/example.pdf");
  setState(() {});
}
Libraries needed:
1. open_file
2. path_provider
3. pdf
I think it's better to get the byte array and convert it into a PDF file.
Check out my answer from here : Get pdf from blob data
This should convert base64-encoded PDF data into a byte array.
import 'dart:convert';

// base64.decode already returns the raw bytes
List<int> pdfDataBytes = base64.decode(pdfBase64);
The pdf and image plugins seem to suit your needs for displaying the PDF.
The code should be roughly like so:
import 'package:pdf/pdf.dart';
import 'package:image/image.dart';
...
Image img = decodeImage(pdfDataBytes);
PdfImage image = PdfImage(
  pdf,
  image: img.data.buffer.asUint8List(),
  width: img.width,
  height: img.height);
// Display it somehow
...
