How to attach a file from Firebase Storage and send it by email using Mailgun in Node.js

Is it possible to attach a file from Firebase Storage?
I tried the following code, but it doesn't work:
var mailgun = require("mailgun-js");
var api_key = 'key-acf9f881e32c85b3c0dad34358507a95';
var DOMAIN = 'sandbox76c6f74ddab14862816390c16f37a272.mailgun.org';
var mailgun = require('mailgun-js')({apiKey: api_key, domain: DOMAIN});
var path = require("path");
var filepath = path.join(`gs://i-m-here-c01f6.appspot.com/Groups/${leaderId}`, 'group_image.jpg');
var data = {
from: 'Excited User <postmaster@sandbox76c6f74ddab14862816390c16f37a272.mailgun.org>',
to: 'rayteamstudio@gmail.com',
subject: 'Complex',
text: 'Group Creation Request',
html: `<p>A user named: ${fromName} wants to create a group.<br />
User ID: ${leaderId}<br />
Group Name: ${groupName}<br />
Group Description: ${groupDescription}<br /><br />
To Accept the request click here:<br />
https://us-central1-i-m-here-c01f6.cloudfunctions.net/acceptOrDenyGroupCreation?leaderID=${leaderId}&requestStatus=approved <br /><br />
To Deny the request click here:<br />
https://us-central1-i-m-here-c01f6.cloudfunctions.net/acceptOrDenyGroupCreation?leaderID=${leaderId}&requestStatus=denied /></p>`,
attachment: filepath
};
mailgun.messages().send(data, function (error, body) {
if(error)
console.log('email err: ',error);
});
please help

You can't use a gs://bucket-name/path-to-file URL to download a file from Cloud Storage as if it were an HTTP URL. Instead, you'll have to do one of these:
Use the Cloud Storage SDK to download the file locally, then attach it to your email
Or, use the Cloud Storage SDK to generate a "Signed URL", which will give you an HTTPS URL to the file, which can be used to download it.
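Either way, the Cloud Storage SDK wants a bucket name and an object path rather than the full gs:// URL. A minimal sketch of splitting one into the other (the helper name is hypothetical):

```javascript
// Split a gs://bucket/object URL into the bucket name and object path
// that the Cloud Storage SDK expects. parseGsUrl is a hypothetical helper.
function parseGsUrl(gsUrl) {
  const match = /^gs:\/\/([^/]+)\/(.+)$/.exec(gsUrl);
  if (!match) throw new Error(`Not a gs:// URL: ${gsUrl}`);
  return { bucket: match[1], path: match[2] };
}

// e.g. parseGsUrl('gs://my-app.appspot.com/Groups/abc/group_image.jpg')
// yields bucket 'my-app.appspot.com' and path 'Groups/abc/group_image.jpg'
```

With those two pieces you can then call `bucket.file(path).download()` or `bucket.file(path).getSignedUrl(...)` from the `@google-cloud/storage` SDK.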

You can get it as a buffer and send it like this:
var request = require('request');
var file = request("https://www.google.ca/images/branding/googlelogo/2x/googlelogo_color_272x92dp.png");
var data = {
from: 'Excited User <me@samples.mailgun.org>',
to: 'serobnic@mail.ru',
subject: 'Hello',
text: 'Testing some Mailgun awesomeness!',
attachment: file
};
mailgun.messages().send(data, function (error, body) {
console.log(body);
});
from here

In the example you've given, you are using the gs://bucket-name/path to download the file from Cloud Storage and send it as an attachment. In order to do so, you would need to use the request dependency as per the docs:
https://github.com/bojand/mailgun-js#attachments
In this case, all you would have to do is:
// var path = require("path");
const request = require('request');
var filepath = request(`gs://i-m-here-c01f6.appspot.com/Groups/${leaderId}`);
If you wanted to get more specific about assigning properties to the file, you can use new mailgun.Attachment(options) to pass in an options parameter that looks something like this:
const options = {
data: filepath,
filename: 'name.jpg',
contentType: 'image/jpeg',
knownLength: 2019121,
};
Where
data - can be one of:
a string representing file path to the attachment
a buffer of file data
an instance of Stream which means it is a readable stream.
filename - the file name to be used for the attachment. Default is 'file'
contentType - the content type. Required for case of Stream data. Ex. image/jpeg.
knownLength - the content length in bytes. Required for case of Stream data.
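A sketch of building that options object from an in-memory Buffer, so knownLength stays consistent with the data (the helper name is hypothetical):

```javascript
// Build a mailgun-js attachment options object from an in-memory Buffer.
// buildAttachmentOptions is a hypothetical helper for illustration.
function buildAttachmentOptions(buf, filename, contentType) {
  return {
    data: buf,               // Buffer of file data
    filename,                // file name reported to the recipient
    contentType,             // e.g. 'image/jpeg'
    knownLength: buf.length, // content length in bytes
  };
}
```

Per the mailgun-js docs linked above, the result can then be passed to new mailgun.Attachment(options) and set as the message's attachment.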

That's an example for Nodemailer. It took me a long time to attach a file dynamically from Cloud Storage for Firebase, so hopefully it helps someone out; you will find many similarities.
All the examples are in Angular / TypeScript.
It starts somewhere in the component.html file
<form
#formDirective="ngForm"
[formGroup]="contactForm"
(ngSubmit)="onSubmit(contactForm.value, formDirective)"
>
<mat-form-field>
<ngx-mat-file-input
(change)="uploadFile($event)"
formControlName="fileUploader"
multiple
type="file"
>
</ngx-mat-file-input>
</mat-form-field>
</form>
In component.ts
import { AngularFirestore } from '@angular/fire/firestore';
import {
AngularFireStorage,
AngularFireStorageReference,
AngularFireUploadTask,
} from '@angular/fire/storage';
import {
FormBuilder,
FormGroup,
FormGroupDirective,
Validators,
} from '@angular/forms';
import { throwError } from 'rxjs';
import { catchError, finalize } from 'rxjs/operators';
...
constructor(
private angularFirestore: AngularFirestore,
private angularFireStorage: AngularFireStorage,
private formBuilder: FormBuilder
) {}
public contentType: string[] = [];
public downloadURL: string[] = [];
public fileName: string = '';
public maxFileSize = 20971520;
public contactForm: FormGroup = this.formBuilder.group({
fileUploader: [
'',
Validators.compose([
// Your validators rules...
]),
],
...
});
...
public onSubmit(form: any, formDirective: FormGroupDirective): void {
form.contentType = this.contentType;
form.fileUploader = this.downloadURL;
form.fileName = this.fileName;
this.angularFirestore
.collection(String(process.env.FIRESTORE_COLLECTION_MESSAGES)) // Make sure the environmental variable is a string.
.add(form)
.then(() => {
// Your logic, such as alert...
})
.catch(() => {
// Your error handling logic...
});
}
public uploadFile(event: any): void {
// Iterate through all uploaded files.
for (let i = 0; i < event.target.files.length; i++) {
const file = event.target.files[i]; // Get each uploaded file.
const fileName = file.name + '_' + Date.now(); // Ensures files with the same name can be uploaded more than once; each upload gets a unique name including the upload time (in milliseconds).
this.contentType = file.type;
this.fileName = fileName;
// Get file reference.
const fileRef: AngularFireStorageReference = this.angularFireStorage.ref(
fileName
);
// Create upload task.
const task: AngularFireUploadTask = this.angularFireStorage.upload(
fileName,
file,
file.type
);
// Upload file to Cloud Firestore.
task
.snapshotChanges()
.pipe(
finalize(() => {
fileRef.getDownloadURL().subscribe((downloadURL: string) => {
this.angularFirestore
.collection(String(process.env.FIRESTORE_COLLECTION_FILES)) // Make sure the environmental variable is a string.
.add({ downloadURL: downloadURL });
this.downloadURL.push(downloadURL);
});
}),
catchError((error: any) => {
return throwError(error);
})
)
.subscribe();
}
}
That's all frontend, now it's finally time for our backend.
import { DocumentSnapshot } from 'firebase-functions/lib/providers/firestore';
import { EventContext } from 'firebase-functions';
async function onCreateSendEmail(
snap: DocumentSnapshot,
_context: EventContext
) {
try {
const contactFormData = snap.data();
// You can use those to debug your code...
console.log('Submitted contact form: ', contactFormData);
console.log('context: ', _context); // This log will be shown in Firebase Functions logs.
const mailTransport: Mail = nodemailer.createTransport({
// Make sure the environmental variables have proper typings.
host: String(process.env.MAIL_HOST),
port: Number(process.env.MAIL_PORT),
auth: {
user: String(process.env.MAIL_ACCOUNT),
pass: String(process.env.MAIL_PASSWORD),
},
tls: {
rejectUnauthorized: false, //! Fix ERROR "Hostname/IP doesn't match certificate's altnames".
},
});
const mailOptions = {
attachments: [
{
contentType: `${contactFormData!.contentType}`,
filename: `${contactFormData!.fileName}`,
path: `${contactFormData!.fileUploader}`,
},
],
... // Your other mails options such as bcc, from, to, subject, html...
};
await mailTransport.sendMail(mailOptions);
} catch (err) {
console.error(err);
}
}
That's more or less all the headache, from frontend to backend, a developer has to go through to dynamically attach a file to an email with the correct contentType, downloadURL, and fileName. I couldn't find a complete solution for this case anywhere on the web, hence the front-to-back answer.
Note:
The frontend handles multiple file uploads on the UI / Cloud Storage for Firebase side, and all the files are uploaded to Firebase. However, only one dynamically added attachment works; I still haven't figured out how to handle multiple dynamic file attachments.

Related

Email PDF from Firebase Storage as attachment using SendGrid

I've created a Firebase Function that, upon click, will get a PDF from Firebase Storage, attach it to an email and send it via SendGrid.
I have no problem getting the PDF Storage URLs and sending the email, but I'm not able to attach the PDFs. I've tried referencing the SendGrid documentation and it seems like the fs.readFileSync function doesn't work on URLs.
How can I get the data from Firebase to be able to stringify it to base64? I've tried reading the Firebase documentation as well but can't seem to be able to figure out a solution. Any help would be greatly appreciated!
Function that successfully grabs all Document URLs from Firebase Storage
const getDocumentURLs = () => {
firebase
.storage()
.ref("Tenant Resumes/" + tenantID)
.listAll()
.then((res) => {
res.items.forEach((result) => {
result.getDownloadURL().then((docURL) => {
setDocumentData((newURLs) => [...newURLs, docURL]);
console.log(docURL);
});
});
}); };
And the Firebase function (the console.log gives an error that says "Not an object"; I assume this is because I'm passing a string URL, not an actual file):
exports.sendTenantMail = functions.https.onCall((data, res) => {
const name = data.name;
const email = data.email;
const documents = data.documents;
const documentOne = fs.readFileSync(data.documentOne).toString("base64");
const documentTwo = fs.readFileSync(data.documentTwo).toString("base64");
const documentThree = fs.readFileSync(data.documentThree).toString("base64");
const documentFour = fs.readFileSync(data.documentFour).toString("base64");
const documentFive = fs.readFileSync(data.documentFive).toString("base64");
const documentSix = fs.readFileSync(data.documentSix).toString("base64");
const realtorEmail = data.realtorEmail;
console.log(fs.readFileSync(data.documentOne).toString("base64"));
const msg = {
to: "concierge@chexy.co",
from: "concierge@chexy.co",
templateId: "d-1aec2e98c1af44fdb0e1f97d540c6973",
dynamicTemplateData: {
subject:
"A new tenant resume has been submitted by " +
email +
" for " +
realtorEmail,
name: name,
},
attachments: [
{
content: documentOne,
filename: "Doc1.pdf",
type: "application/pdf",
disposition: "attachment",
},
],
};
sgMail.send(msg).catch((err) => {
console.log(err);
});
});
Any help would be huge!

How to send file by email from Firebase Storage ObjectMetadata

I am developing a trigger function that listens for a new object in the bucket specified. What I want is to send the object returned by email using nodemailer.
const transport = nodemailer.createTransport({
host: "smtp.gmail.com",
port: 465,
secure: true,
auth: {
user: "xxxxxxxxxx@gmail.com",
pass: "xxxxxxxxxxxxxxx"
}
});
exports.sendConfirmationEmail = functions.storage.bucket('bucket-name').object().onFinalize(async (object) => {
const orderID = object.name.slice(0, -4);
admin.database().ref('/pedidos/' + orderID).once('value', (snapshot) => {
return sendEmail(snapshot.val().customer, snapshot.val().email, snapshot.val().number, /*FILE*/);
});
});
function sendEmail(user, email, order, file){
console.log("Sending Email...");
return transport.sendMail({
from: "XXXXXX <xxxxxxxxxx@gmail.com>",
to: email,
subject: "Confirmación pedido " + order,
html: `
<h1> Estimado ${user}, </h1>
<p> Hemos recibido su pedido correctamente. Le mantendremos informado de su estado. </p>
<p> Gracias por confiar en nosotros </p>
`,
attachment: file
})
.then(r => r)
.catch(e => {
console.log("An error has ocurred " + e);
});
}
Can someone please help?
First let's fix up the main part of your Cloud Function. When writing code for your functions, as a general rule, don't use the callback API of the Admin SDK.
admin.database().ref('/pedidos/' + orderID).once('value', (snapshot) => { /* ... */ });
should be
admin.database().ref('/pedidos/' + orderID).once('value')
.then((snapshot) => { /* ... */ });
or
const snapshot = await admin.database().ref('/pedidos/' + orderID).once('value');
It's also very important to make sure that you return or await any promises in your event handler, otherwise your code could be terminated at any time resulting in unexpected errors.
As you are only sending PDF documents, we'll ignore any files that are not PDFs.
exports.sendConfirmationEmail = functions.storage.bucket('bucket-name').object().onFinalize(async (object) => {
if (object.contentType !== "application/pdf")
return; // ignore non-pdfs
const orderID = object.name.slice(0, -4);
// ↓↓ this return is needed
return admin.database().ref('/pedidos/' + orderID).once('value')
.then((snapshot) => {
return sendEmail(snapshot.val().customer, snapshot.val().email, snapshot.val().number, /*FILE*/);
});
});
Next, we move on to your sendEmail function. In your current sendEmail function, you incorrectly use attachment instead of attachments. You can also remove these lines that will just introduce problems:
.then(r => r) // doesn't do anything
.catch(e => { // logs an error, but incorrectly prevents it being handled by .catch() elsewhere
console.log("An error has ocurred " + e);
});
This allows us to redefine sendEmail as:
function sendEmail (user, email, order, attachments = undefined) {
return transport.sendMail({
from: "XXXXXX <xxxxxxxxxx@gmail.com>",
to: email,
subject: "Confirmación pedido " + order,
html: `
<h1> Estimado ${user}, </h1>
<p> Hemos recibido su pedido correctamente. Le mantendremos informado de su estado. </p>
<p> Gracias por confiar en nosotros </p>
`,
attachments
});
}
Next, let's review the documentation for the attachments property:
attachments option in the message object that contains an array of attachment objects.
Attachment object consists of the following properties:
filename - filename to be reported as the name of the attached file. Use of unicode is allowed.
content - String, Buffer or a Stream contents for the attachment
path - path to the file if you want to stream the file instead of including it (better for larger attachments)
href – an URL to the file (data uris are allowed as well)
httpHeaders - optional HTTP headers to pass on with the href request, eg. {authorization: "bearer ..."}
contentType - optional content type for the attachment, if not set will be derived from the filename property
contentDisposition - optional content disposition type for the attachment, defaults to ‘attachment’
cid - optional content id for using inline images in HTML message source
encoding - If set and content is string, then encodes the content to a Buffer using the specified encoding. Example values: ‘base64’, ‘hex’, ‘binary’ etc. Useful if you want to use binary attachments in a JSON formatted email object.
headers - custom headers for the attachment node. Same usage as with message headers
raw - is an optional special value that overrides entire contents of current mime node including mime headers. Useful if you want to prepare node contents yourself
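To make those properties concrete, here are three interchangeable ways to describe the same attachment; all file names, paths, and URLs below are placeholders:

```javascript
// Three ways to attach the same PDF with nodemailer (placeholder values).
const fromDisk = { filename: 'order.pdf', path: '/tmp/order.pdf' }; // streamed from a file path
const fromMemory = { filename: 'order.pdf', content: Buffer.from('%PDF-1.4') }; // Buffer already in memory
const fromUrl = {
  filename: 'order.pdf',
  href: 'https://example.com/order.pdf', // fetched by nodemailer at send time
  contentType: 'application/pdf',        // set explicitly since there is no local file name to derive it from
};
```

Any of these objects can be placed in the message's attachments array.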
Now we know what we can use for each attachment object, we need to compare that list against what we can extract from the object parameter passed into the Storage Event Cloud Function event handler. The main properties include:
const fileBucket = object.bucket; // The Storage bucket that contains the file.
const filePath = object.name; // File path in the bucket.
const contentType = object.contentType; // File content type.
const metageneration = object.metageneration; // Number of times metadata has been generated. New objects have a value of 1.
So, for our attachment, we want to provide filename, content, contentDisposition and contentType. Because the Cloud Storage object we are sending isn't located on disk or in memory, we are going to stream it from Cloud Storage through to nodemailer by passing a Stream as the content property of our attachment object. This results in:
const bucket = admin.storage().bucket(object.bucket);
const remoteFile = bucket.file(object.name);
const attachment = {
filename: `order-${orderID}.pdf`, // the attachment will be called `order-<ID>.pdf`
content: remoteFile.createReadStream(), // stream data from Cloud Storage
contentType: object.contentType, // use appropriate content type
contentDisposition: "attachment", // this file is a downloadable attachment
};
We can now roll it all together and clean it up with async/await syntax:
const transport = nodemailer.createTransport({ /* ... */ });
exports.sendConfirmationEmail = functions.storage.bucket('bucket-name').object().onFinalize(async (object) => {
if (object.contentType !== "application/pdf") {
console.log("Content-Type was not application/pdf. Ignoring.");
return; // ignore non-pdfs
}
// if (object.metageneration > 1) {
// console.log("Metageneration was greater than 1. Ignoring.");
// return; // ignore rewritten files
// }
try {
const orderID = object.name.slice(0, -4);
const orderSnapshot = await admin.database()
.ref(`/pedidos/${orderID}`)
.once('value');
if (!orderSnapshot.exists()) {
console.error(`Order #${orderID} document not found`);
return;
}
const { customer, email, number } = orderSnapshot.val();
// prepare attachment
const bucket = admin.storage().bucket(object.bucket);
const remoteFile = bucket.file(object.name);
const attachment = {
filename: `order-${orderID}.pdf`, // override name of the PDF
content: remoteFile.createReadStream(), // stream data from Cloud Storage
contentType: object.contentType, // use appropriate content type
contentDisposition: "attachment", // this file is a downloadable attachment
};
console.log("Sending confirmation email...");
await sendEmail(customer, email, number, [ attachment ]);
console.log(`Email confirmation was sent successfully for Order #${orderID}`);
} catch (error) {
console.error("Unexpected error: ", error);
}
});
function sendEmail (user, email, order, attachments = undefined) {
return transport.sendMail({
from: "XXXXXX <xxxxxxxxxx@gmail.com>",
to: email,
subject: "Confirmación pedido " + order,
html: `
<h1> Estimado ${user}, </h1>
<p> Hemos recibido su pedido correctamente. Le mantendremos informado de su estado. </p>
<p> Gracias por confiar en nosotros </p>
`,
attachments
});
}
Note: You should decide if the email should be sent or not when metageneration is greater than 1.
Addendum: I highly recommend using functions.config() to store things like username/password combos for nodemailer rather than write them into your code.

s3 file upload - not working for video - React/Meteor - aws-sdk

I am having an issue uploading mp4 files to my S3 bucket in my React/Meteor project. It works for other types of files (mp3, images) but not for video. I don't get any error, but when I try to read the file that was uploaded, it doesn't work.
here is my client code:
import React from "react";
import { Meteor } from "meteor/meteor";
import PropTypes from "prop-types";
import { types } from "../../../utils/constants/types";
const FileUpload = ({ fileType, type, typeId, subtype, setFileName }) => {
const handleUpload = event => {
event.preventDefault();
const file = event.target.files[0];
const fileExtension = file.type;
var reader = new FileReader();
reader.onload = function () {
Meteor.call(
"uploadFile",
fileExtension,
reader.result,
type,
typeId,
(err, result) => {
if (err) {
console.log(err);
} else {
setFileName(result);
}
}
);
};
reader.readAsDataURL(file);
};
return (
<div>
<input name="Uploader" onChange={handleUpload} type="file" />
</div>
);
};
And here is my Meteor method on the server side:
Meteor.methods({
uploadFile: async function (fileType, data, type, typeId) {
let extension;
let contentType = fileType;
if (fileType.includes("jpeg") || fileType.includes("jpg")) {
extension = "jpg";
} else if (fileType.includes("png")) {
extension = "png";
} else if (fileType.includes("mp4")) {
extension = "mp4";
} else if (fileType.includes("audio/mpeg")) {
contentType = "video/mp4";
extension = "mp3";
} else if (fileType.includes("pdf")) {
extension = "pdf";
} else {
throw new Meteor.Error("format-error", "Only authorized format");
}
const random = Random.id();
const key = `${random}.${extension}`;
const buf =
extension !== "mp4"
? Buffer.from(data.replace(/^data:image\/\w+;base64,/, ""), "base64")
: data;
const config = {
Bucket: bucketName,
Key: key,
Body: buf,
ContentType: contentType,
ACL: "public-read",
};
if (extension !== "mp4") {
config.ContentEncoding = "base64";
}
const uploadResult = await s3.upload(config).promise();
return uploadResult.Location;
},
});
I think it may come from the reader not handling video files properly, but I'm a bit lost there. Any input would be appreciated. Thanks.
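One thing worth checking: the base64-stripping regex in the server method only matches data:image/... prefixes, so mp3 and pdf payloads keep their data-URL prefix. A scheme-agnostic decoder can be sketched like this (the helper name is hypothetical):

```javascript
// Decode any data: URL (not just data:image/...) into a Buffer plus its MIME type.
// dataUrlToBuffer is a hypothetical helper for illustration.
function dataUrlToBuffer(dataUrl) {
  const match = /^data:([^;,]*)?(;base64)?,([\s\S]*)$/.exec(dataUrl);
  if (!match) throw new Error('Not a data: URL');
  const [, mime, isBase64, payload] = match;
  const buf = isBase64
    ? Buffer.from(payload, 'base64')                  // base64-encoded payload
    : Buffer.from(decodeURIComponent(payload), 'utf8'); // percent-encoded text payload
  return { mime: mime || 'text/plain', buf };
}
```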
I wanted to ask, since I see your code is pretty complicated: why do you upload files via your server?
It is more efficient to push files from the client straight to S3. Please have a look at this package that I maintain if you are interested in the concept (https://github.com/activitree/s3up-meta). It is a rewrite of an older, well-tested package from years back. Uploads are signed by the Meteor server but performed by the client. You don't want to keep sockets/fibers busy with file uploads while a lot of data (status updates about each upload) is also coming in; you want all of that on the user/client side.
Another thing to note is the importance of setting AbortIncompleteMultipartUpload in your bucket's Management/Lifecycle settings; otherwise your S3 usage may grow without control.
Finally, when I upload video, I just upload the file (received from the input type file). It looks like this:
video: File
lastModified: 1573722339000
lastModifiedDate: Thu Nov 14 2019 13:05:39 GMT+0400 (Gulf Standard Time) {}
name: "IMG_7847.MOV"
size: 20719894
type: "video/quicktime"
webkitRelativePath: ""
I do use other helpers to determine the file and its type (and to limit the permitted length of video uploads).
You could monitor the errors on your server side as you upload or/and activate CloudWatch on S3 side to see where things are failing.

Firebase Cloud Functions - create pdf, store to bucket and send via mail

I'm developing a Firebase Function which is triggered when a new order is added to the Realtime Database. The first thing it does is create a PDF and pipe it to a Google Cloud Storage bucket.
On the .on("finish") event of the bucket stream, the next function gets started, which should send the piped pdf via email to the customer.
Everything seems to work, at least a bit.
First I had the problem that the attached PDF was always empty. (Not just blank: I also opened it in Notepad++ and it really was completely empty.) When I checked the doc and bucketFileStream vars inside the bucketFileStream.on("finish") function, both had a length of 0. A check of the doc var directly after doc.end() showed a length of about 612.
I then changed the flow so that in the sendOrderEmail function I also open a new read stream from the newly created file in the bucket.
Now I get at least some of the PDF in the attachment, but never the whole content.
When I check the PDF uploaded to the bucket, it looks like it should.
I googled a lot and found some answers targeting this topic, but as seen in the comments on those questions, they were not completely helpful.
PDF Attachment NodeMailer
Where to generate a PDF of Firebase Database data - mobile app, or Firebase Hosting web app
How to attach file to an email with nodemailer
I also checked the nodemailer documentation for how to pass the attachment correctly and implemented it as documented, but with no success.
I think that the mail gets sent before the Read Stream has finished.
Here the Package Versions I use:
"@google-cloud/storage": "1.5.2"
"@types/pdfkit": "^0.7.35",
"firebase-admin": "5.8.0",
"firebase-functions": "^0.7.3"
"nodemailer": "4.4.1",
Can anyone tell me what I'm doing wrong, or provide a working example for this use case that uses current package versions?
Here is the code which drives me crazy...
const functions = require("firebase-functions");
const admin = require("firebase-admin");
const nodemailer = require("nodemailer");
const pdfkit = require("pdfkit");
const storage = require("@google-cloud/storage")({projectId: `${PROJECT_ID}`})
const mailTransport = nodemailer.createTransport({
host: "smtp.office365.com",
port: 587,
secureConnection: false,
auth: {
user: "userName",
pass: "userPassword"
},
tls: {
ciphers: "SSLv3",
}
});
exports.added = function(event) {
const order = event.data.val();
const userId = event.params.userId;
// Load User Data by userId
return admin
.database()
.ref("/users/" +userId)
.once("value")
.then(function (snapshot) {
return generateOrderPDF(snapshot.val(), userId);
});
};
function generateOrderPDF(user, userId) {
const doc = new pdfkit();
const bucket = storage.bucket(functions.config().bucket);
const filename = `/${userId}/test-` + Date.now() + ".pdf";
const file = bucket.file(filename);
const bucketFileStream = file.createWriteStream();
// Pipe its output to the bucket
doc.pipe(bucketFileStream);
// Do creation Stuff....
doc.end();
bucketFileStream.on("finish", function () {
return sendOrderEmail(user, filename);
});
bucketFileStream.on("error", function(err) {
console.error(err);
});
}
function sendOrderEmail(user, filename) {
const email = user.email;
const firstname = user.firstName;
const mailOptions = {
from: "test@test.test",
to: email,
subject: "Order"
};
const bucket = storage.bucket(functions.config().bucket);
const file = bucket.file(filename);
mailOptions.html = mailTemplate;
mailOptions.attachments = [{
filename: "test.pdf",
content: file.createReadStream()
}];
return mailTransport.sendMail(mailOptions).then(() => {
console.log("New order email sent to:", email);
}).catch(error => {
console.error(error);
});
}
The problem in my approach was inside the pdfkit library, not inside nodemailer or firebase. The lines below seem to trigger the end event, so the PDF got sent after these lines. After commenting them out, everything worked as it should. It was not that finish was never reached, as Hari mentioned.
/* doc.lineCap("underline")
.moveTo(72, 321)
.lineTo(570, 321)
.stroke();*/
After finishing the MVP I will do a root-cause analysis and post the final answer as a comment below this answer.
This is a working sample of Source-Code for this UseCase. It also ensures, that the firebase function won't finish before all work is done. That is handled by wrapping the event driven doc.on() function into a promise, that is resolved when doc.on("end") is called.
exports.added = function(event) {
const order = event.data.val();
const userId = event.params.userId;
// Load User Data by userId
return admin.database().ref("/users/" + userId).once("value").then(function (snapshot) {
return generatePDF(snapshot.val(), userId);
});
};
function generatePDF(user, userId) {
const doc = new pdfkit();
const bucket = admin.storage().bucket(functions.config().moost.orderbucket);
const filename = `/${userId}/attachment.pdf`;
const file = bucket.file(filename);
const bucketFileStream = file.createWriteStream();
var buffers = [];
let p = new Promise((resolve, reject) => {
doc.on("end", function() {
resolve(buffers);
});
doc.on("error", function () {
reject();
});
});
doc.pipe(bucketFileStream);
doc.on('data', buffers.push.bind(buffers));
//Add Document Text and stuff
doc.end();
return p.then(function(buffers) {
return sendMail(buffers);
});
}
function sendMail(buffers) {
const pdfData = Buffer.concat(buffers);
const mailOptions = {
from: "FromName <from@example.com>",
to: "to@example.com",
subject: "Subject",
html: mailTemplate,
attachments: [{
filename: 'attachment.pdf',
content: pdfData
}]
};
return mailTransport.sendMail(mailOptions).then(() => {
console.log("New email sent to:", "to@example.com");
}).catch(error => {
console.error(error);
});
}
The main problem in your code is that the stream.on('finish') never completes. I've also encountered the same issue.
Instead of streaming, convert the PDF into a buffer and send that as the attachment.
The following works fine for me,
const doc = new pdfkit()
const filename = `/${userId}/test-` + Date.now() + ".pdf"
const file = bucket.file(filename);
const bucketFileStream = file.createWriteStream();
doc.pipe(bucketFileStream);
doc.end();
var buffers = []
doc.on('data', buffers.push.bind(buffers));
doc.on('end',function(){
let pdfData = Buffer.concat(buffers);
// nodemailer stuff goes here
// attach the doc as content
});

Uploading a file using a SharePoint-hosted app with the REST API

I am getting an error like "the site cannot be accessed" in a SharePoint-hosted app.
It occurs when moving from one page to another page in the same app. Please help.
This is the Default.aspx code:
<script>
'use strict';
var appWebUrl, hostWebUrl;
jQuery(document).ready(function () {
// Check for FileReader API (HTML5) support.
if (!window.FileReader) {
alert('This browser does not support the FileReader API.');
}
// Get the add-in web and host web URLs.
appWebUrl = decodeURIComponent(getQueryStringParameter("SPAppWebUrl"));
hostWebUrl = decodeURIComponent(getQueryStringParameter("SPHostUrl"));
});
function getQueryStringParameter(paramToRetrieve) {
var params = document.URL.split("?")[1].split("&");
for (var i = 0; i < params.length; i = i + 1) {
var singleParam = params[i].split("=");
if (singleParam[0] == paramToRetrieve) return singleParam[1];
}
}
function F1()
{
window.location.href=sphosturl+'pages/uploadform.aspx';
}
</script>
<div>
<input type='button' value='clickheretoUploadfile' onclick='F1()'/>
</div>
When the user clicks the clickheretoUploadfile button, they are redirected to uploadform.aspx.
This is the uploadform.aspx code:
<script>
'use strict';
jQuery(document).ready(function () {
// Check for FileReader API (HTML5) support.
if (!window.FileReader) {
alert('This browser does not support the FileReader API.');
}
});
// Upload the file.
// You can upload files up to 2 GB with the REST API.
function uploadFile() {
// Define the folder path for this example.
var serverRelativeUrlToFolder = '/shared documents';
    // Get test values from the file input and text input page controls.
    var fileInput = jQuery('#getFile');
    var newName = jQuery('#displayName').val();

    // Get the server URL.
    var serverUrl = _spPageContextInfo.webAbsoluteUrl;

    // Initiate method calls using jQuery promises.
    // Get the local file as an array buffer.
    var getFile = getFileBuffer();
    getFile.done(function (arrayBuffer) {
        // Add the file to the SharePoint folder.
        var addFile = addFileToFolder(arrayBuffer);
        addFile.done(function (file, status, xhr) {
            // Get the list item that corresponds to the uploaded file.
            var getItem = getListItem(file.d.ListItemAllFields.__deferred.uri);
            getItem.done(function (listItem, status, xhr) {
                // Change the display name and title of the list item.
                var changeItem = updateListItem(listItem.d.__metadata);
                changeItem.done(function (data, status, xhr) {
                    alert('file uploaded and updated');
                });
                changeItem.fail(onError);
            });
            getItem.fail(onError);
        });
        addFile.fail(onError);
    });
    getFile.fail(onError);

    // Get the local file as an array buffer.
    function getFileBuffer() {
        var deferred = jQuery.Deferred();
        var reader = new FileReader();
        reader.onloadend = function (e) {
            deferred.resolve(e.target.result);
        };
        reader.onerror = function (e) {
            deferred.reject(e.target.error);
        };
        reader.readAsArrayBuffer(fileInput[0].files[0]);
        return deferred.promise();
    }

    // Add the file to the file collection in the Shared Documents folder.
    function addFileToFolder(arrayBuffer) {
        // Get the file name from the file input control on the page.
        var parts = fileInput[0].value.split('\\');
        var fileName = parts[parts.length - 1];
        // Construct the endpoint.
        var fileCollectionEndpoint = String.format(
            "{0}/_api/web/getfolderbyserverrelativeurl('{1}')/files" +
            "/add(overwrite=true, url='{2}')",
            serverUrl, serverRelativeUrlToFolder, fileName);
        // Send the request and return the response.
        // This call returns the SharePoint file.
        return jQuery.ajax({
            url: fileCollectionEndpoint,
            type: "POST",
            data: arrayBuffer,
            processData: false,
            headers: {
                "accept": "application/json;odata=verbose",
                "X-RequestDigest": jQuery("#__REQUESTDIGEST").val(),
                "content-length": arrayBuffer.byteLength
            }
        });
    }

    // Get the list item that corresponds to the file by calling the file's ListItemAllFields property.
    function getListItem(fileListItemUri) {
        // Send the request and return the response.
        return jQuery.ajax({
            url: fileListItemUri,
            type: "GET",
            headers: { "accept": "application/json;odata=verbose" }
        });
    }

    // Change the display name and title of the list item.
    function updateListItem(itemMetadata) {
        // Define the list item changes. Use the FileLeafRef property to change the display name.
        // For simplicity, also use the name as the title.
        // The example gets the list item type from the item's metadata, but you can also get it from the
        // ListItemEntityTypeFullName property of the list.
        var body = String.format("{{'__metadata':{{'type':'{0}'}},'FileLeafRef':'{1}','Title':'{2}'}}",
            itemMetadata.type, newName, newName);
        // Send the request and return the promise.
        // This call does not return response content from the server.
        return jQuery.ajax({
            url: itemMetadata.uri,
            type: "POST",
            data: body,
            headers: {
                "X-RequestDigest": jQuery("#__REQUESTDIGEST").val(),
                "content-type": "application/json;odata=verbose",
                "content-length": body.length,
                "IF-MATCH": itemMetadata.etag,
                "X-HTTP-Method": "MERGE"
            }
        });
    }
}

// Display error messages.
function onError(error) {
    alert(error.responseText);
}
</script>
<input id="getFile" type="file"/><br />
<input id="displayName" type="text" value="Enter a unique name" /><br />
<input id="addFileButton" type="button" value="Upload" onclick="uploadFile()">
The problem is that when I run the upload from the Default.aspx page it works fine, but when I redirect from that page to the upload page and run the same upload there, I get this error.
The first question is: where is your "sphosturl" parameter in the Default.aspx code? I guess it is the "appWebUrl".
From your code, it seems you want your SharePoint-hosted add-in to upload files to the add-in web. You must confirm that a document library (or list) folder exists in your app and set the correct location in the "serverRelativeUrlToFolder" parameter; otherwise the call will throw an Access Denied error. The tested code below is for your reference (I have added the document library in my app):
'use strict';

jQuery(document).ready(function () {
    // Check for FileReader API (HTML5) support.
    if (!window.FileReader) {
        alert('This browser does not support the FileReader API.');
    }
});

// Upload the file.
// You can upload files up to 2 GB with the REST API.
function uploadFile() {
    // Define the folder path for this example.
    var serverRelativeUrlToFolder = 'Lists/DL';

    // Get test values from the file input and text input page controls.
    var fileInput = jQuery('#getFile');
    var newName = jQuery('#displayName').val();

    // Get the server URL.
    var serverUrl = _spPageContextInfo.webAbsoluteUrl;

    // Initiate method calls using jQuery promises.
    // Get the local file as an array buffer.
    var getFile = getFileBuffer();
    getFile.done(function (arrayBuffer) {
        // Add the file to the SharePoint folder.
        var addFile = addFileToFolder(arrayBuffer);
        addFile.done(function (file, status, xhr) {
            // Get the list item that corresponds to the uploaded file.
            var getItem = getListItem(file.d.ListItemAllFields.__deferred.uri);
            getItem.done(function (listItem, status, xhr) {
                // Change the display name and title of the list item.
                var changeItem = updateListItem(listItem.d.__metadata);
                changeItem.done(function (data, status, xhr) {
                    alert('file uploaded and updated');
                });
                changeItem.fail(onError);
            });
            getItem.fail(onError);
        });
        addFile.fail(onError);
    });
    getFile.fail(onError);

    // Get the local file as an array buffer.
    function getFileBuffer() {
        var deferred = jQuery.Deferred();
        var reader = new FileReader();
        reader.onloadend = function (e) {
            deferred.resolve(e.target.result);
        };
        reader.onerror = function (e) {
            deferred.reject(e.target.error);
        };
        reader.readAsArrayBuffer(fileInput[0].files[0]);
        return deferred.promise();
    }

    // Add the file to the file collection in the Shared Documents folder.
    function addFileToFolder(arrayBuffer) {
        // Get the file name from the file input control on the page.
        var parts = fileInput[0].value.split('\\');
        var fileName = parts[parts.length - 1];
        // Construct the endpoint.
        var fileCollectionEndpoint = String.format(
            "{0}/_api/web/getfolderbyserverrelativeurl('{1}')/files" +
            "/add(overwrite=true, url='{2}')",
            serverUrl, serverRelativeUrlToFolder, fileName);
        // Send the request and return the response.
        // This call returns the SharePoint file.
        return jQuery.ajax({
            url: fileCollectionEndpoint,
            type: "POST",
            data: arrayBuffer,
            processData: false,
            headers: {
                "accept": "application/json;odata=verbose",
                "X-RequestDigest": jQuery("#__REQUESTDIGEST").val(),
                "content-length": arrayBuffer.byteLength
            }
        });
    }

    // Get the list item that corresponds to the file by calling the file's ListItemAllFields property.
    function getListItem(fileListItemUri) {
        // Send the request and return the response.
        return jQuery.ajax({
            url: fileListItemUri,
            type: "GET",
            headers: { "accept": "application/json;odata=verbose" }
        });
    }

    // Change the display name and title of the list item.
    function updateListItem(itemMetadata) {
        // Define the list item changes. Use the FileLeafRef property to change the display name.
        // For simplicity, also use the name as the title.
        // The example gets the list item type from the item's metadata, but you can also get it from the
        // ListItemEntityTypeFullName property of the list.
        var body = String.format("{{'__metadata':{{'type':'{0}'}},'FileLeafRef':'{1}','Title':'{2}'}}",
            itemMetadata.type, newName, newName);
        // Send the request and return the promise.
        // This call does not return response content from the server.
        return jQuery.ajax({
            url: itemMetadata.uri,
            type: "POST",
            data: body,
            headers: {
                "X-RequestDigest": jQuery("#__REQUESTDIGEST").val(),
                "content-type": "application/json;odata=verbose",
                "content-length": body.length,
                "IF-MATCH": itemMetadata.etag,
                "X-HTTP-Method": "MERGE"
            }
        });
    }
}

// Display error messages.
function onError(error) {
    alert(error.responseText);
}
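One thing to keep in mind: the `String.format` helper used above comes from MicrosoftAjax.js, which SharePoint pages load automatically; it is not standard JavaScript. If you want to sanity-check the file-name and endpoint logic outside a SharePoint page, the two steps can be sketched in plain JavaScript (the function names here are hypothetical, not part of any SharePoint API):

```javascript
// Browsers report a file input's value as a fake Windows-style path
// such as "C:\fakepath\photo.jpg"; the sample splits on backslashes
// and keeps the last segment to recover the bare file name.
function fileNameFromInputValue(value) {
    var parts = value.split('\\');
    return parts[parts.length - 1];
}

// Build the same REST endpoint that addFileToFolder() constructs,
// using plain concatenation instead of MicrosoftAjax's String.format.
function buildAddFileEndpoint(serverUrl, folderUrl, fileName) {
    return serverUrl +
        "/_api/web/getfolderbyserverrelativeurl('" + folderUrl + "')" +
        "/files/add(overwrite=true, url='" + fileName + "')";
}

console.log(fileNameFromInputValue('C:\\fakepath\\photo.jpg'));
// "photo.jpg"
console.log(buildAddFileEndpoint('https://app.example.com', 'Lists/DL', 'photo.jpg'));
// "https://app.example.com/_api/web/getfolderbyserverrelativeurl('Lists/DL')/files/add(overwrite=true, url='photo.jpg')"
```

Printing the endpoint this way before the AJAX call is a quick way to verify that `serverRelativeUrlToFolder` points at a library that actually exists in your add-in web.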
