Graphcool subscriptions: new file uploads

In Graphcool, is a file upload a mutation that one can subscribe to?
If not: how would I get realtime updates on newly uploaded files?
I adapted the code from the subscriptions-with-apollo-instagram example, but the following does not seem to work:
subscription {
  File(filter: { mutation_in: [CREATED] }) {
    node {
      id
      name
      url
      contentType
    }
  }
}

From the docs:
Currently, no Server-Side Subscriptions are triggered for the File type.
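Since server-side subscriptions are not triggered for the File type, one common workaround is to poll the file list and diff it against the IDs already seen. Below is a minimal sketch; the endpoint and the `allFiles` query shape are assumptions about your Graphcool Simple API schema, so adjust them to match yours.

```javascript
// Polling fallback sketch: the File type emits no subscription events,
// so we periodically fetch the file list and diff against known ids.

// Pure helper: return the files whose ids we have not seen yet.
function findNewFiles(seenIds, files) {
  const seen = new Set(seenIds);
  return files.filter((file) => !seen.has(file.id));
}

// Polling loop (not started automatically; call it with your endpoint).
// Uses the global fetch available in Node 18+.
function pollForNewFiles(endpoint, onNewFile, intervalMs = 5000) {
  const seenIds = [];
  return setInterval(async () => {
    const res = await fetch(endpoint, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        query: '{ allFiles { id name url contentType } }',
      }),
    });
    const { data } = await res.json();
    for (const file of findNewFiles(seenIds, data.allFiles)) {
      seenIds.push(file.id);
      onNewFile(file); // near-realtime notification for each new upload
    }
  }, intervalMs);
}
```

This trades true realtime delivery for simplicity; the interval bounds how stale the client view can be.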


Node.JS PowerBI App Owns Data for Customers w/ Service Principal (set "config.json" from a table in my database)

I'm attempting to refactor the "Node.JS PowerBI App Owns Data for Customers w/ Service Principal" code example (found HERE).
My objective is to import the data for the "config.json" from a table in my database and insert the "workspaceId" and "reportId" values from my database into the "getEmbedInfo()" function (inside the "embedConfigServices.js" file). Reason being, I want to use different configurations based on user attributes. I am using Auth0 to login users on the frontend, and I am sending the user metadata to the backend so that I can filter the database query by the user's company name.
I am able to console.log the config data, but I am having difficulty figuring out how to insert those results into the "getEmbedInfo()" function.
It feels like I'm making a simple syntax error somewhere, but I am stuck. Here's a sample of my code:
//---- Code snippet from "embedConfigServices.js" file ----//
async function getEmbedInfo() {
  try {
    const url = ;
    const set_config = async function () {
      let response = await axios.get(url);
      const config = response.data;
      console.log(config);
    };
    set_config();
    const embedParams = await getEmbedParamsForSingleReport(
      config.workspaceId,
      config.reportId
    );
    return {
      accessToken: embedParams.embedToken.token,
      embedUrl: embedParams.reportsDetail,
      expiry: embedParams.embedToken.expiration,
      status: 200,
    };
  } catch (err) {
    return {
      status: err.status,
      error: err.statusText,
    };
  }
}
This is the error I am receiving on the frontend:
"Cannot read property 'get' of undefined"
Any help would be much appreciated. Thanks in advance.
Carlos
The error occurs because the wrong URL is being fetched. The problem is with the config for the service principal: you need to provide the reportId and workspaceId for the SPA, make sure the service principal has been added to the workspace, and follow all the steps in the documentation below for service principal authentication.
References:
https://learn.microsoft.com/power-bi/developer/embedded/embed-service-principal
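There is also a scoping issue visible in the question's snippet: `config` only exists inside `set_config`, and the call is never awaited, so `config` is undefined by the time `getEmbedParamsForSingleReport` runs. Below is a hedged sketch of one way to restructure it. `getEmbedParamsForSingleReport` is assumed to exist as in the original sample, the config URL is a placeholder parameter, and Node's built-in `fetch` is used to keep the sketch dependency-free (axios works the same way).

```javascript
// Pure helper: validate and extract the two ids the embed call needs.
function pickEmbedIds(config) {
  const { workspaceId, reportId } = config;
  if (!workspaceId || !reportId) {
    throw new Error('config is missing workspaceId or reportId');
  }
  return { workspaceId, reportId };
}

// Fetch the per-user config from your backend (Node 18+ global fetch).
async function fetchEmbedConfig(url) {
  const response = await fetch(url);
  return response.json();
}

// Sketch: return the config from the helper and await it, so the values
// are in scope when the embed params are built.
async function getEmbedInfo(configUrl) {
  try {
    const { workspaceId, reportId } = pickEmbedIds(
      await fetchEmbedConfig(configUrl)
    );
    const embedParams = await getEmbedParamsForSingleReport(workspaceId, reportId);
    return {
      accessToken: embedParams.embedToken.token,
      embedUrl: embedParams.reportsDetail,
      expiry: embedParams.embedToken.expiration,
      status: 200,
    };
  } catch (err) {
    return { status: err.status, error: err.statusText };
  }
}
```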

How to add subscriber role and publisher role to deadletter for google cloud pubsub using nodeJS?

We've been following https://cloud.google.com/pubsub/docs/dead-letter-topics and the Node.js client to create and update our Pub/Sub topics and subscriptions, but after running:
async function createSubscriptionWithDeadLetterPolicy() {
  // Creates a new subscription
  await pubSubClient.topic(topicName).createSubscription(subscriptionName, {
    deadLetterPolicy: {
      deadLetterTopic: pubSubClient.topic(deadLetterTopicName).name,
      maxDeliveryAttempts: 10,
    },
  });
  console.log(
    `Created subscription ${subscriptionName} with dead letter topic ${deadLetterTopicName}.`
  );
  console.log(
    'To process dead letter messages, remember to add a subscription to your dead letter topic.'
  );
}
we get a warning in the console for the dead-letter topic.
It suggests running a gcloud command for each dead letter, but we don't want to do that manually for each subscription. Is there a way to do this in the Node.js client itself?
Or to do it once and for all for every subscription, including new subscriptions created in the project later on?
According to this part of the documentation, you need to grant two roles to the Pub/Sub service agent service account. And of course, you can do it by API calls. And it's not so easy!
In fact, it's not difficult, just tedious! Why? Because you can't simply "add" a policy; you set the whole set of policies. To achieve this:
Get all the existing policies
Add your policy to the existing list
Submit the new list of policies.
You need to do this:
Either globally, by setting the policies at the project level. Easier, but less secure (it breaks the least-privilege principle)
Or on each dead-letter topic and on each subscription with a dead-letter policy set up.
There is a code example in the client library docs.
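The get / merge / submit cycle described above can be sketched as follows with the @google-cloud/pubsub client. The merge step is kept as a pure function so the "don't drop existing bindings" logic is easy to verify on its own; the client instance and project number are passed in as assumptions.

```javascript
// Pure helper: add a member to a role, preserving all existing bindings
// (and leaving the input policy untouched).
function addBinding(policy, role, member) {
  const bindings = (policy.bindings || []).map((b) => ({
    role: b.role,
    members: [...b.members],
  }));
  let binding = bindings.find((b) => b.role === role);
  if (!binding) {
    binding = { role, members: [] };
    bindings.push(binding);
  }
  if (!binding.members.includes(member)) binding.members.push(member);
  return { ...policy, bindings };
}

// Grant the Pub/Sub service agent publisher rights on a dead-letter topic.
async function grantDeadLetterPublisher(pubSubClient, topicName, projectNumber) {
  const member = `serviceAccount:service-${projectNumber}@gcp-sa-pubsub.iam.gserviceaccount.com`;
  const topic = pubSubClient.topic(topicName);
  const [policy] = await topic.iam.getPolicy();                 // 1. get existing
  const updated = addBinding(policy, 'roles/pubsub.publisher', member); // 2. merge
  await topic.iam.setPolicy(updated);                           // 3. submit
}
```

The same pattern applies to the subscription side with `roles/pubsub.subscriber` on `topic.subscription(name).iam`.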
EDIT 1
If you script the grant-access mechanism, it doesn't matter whether you can find the service account: it exists, that's all! Maybe you don't see it in the console, but it exists. Only the pattern is important:
service-<project-number>@gcp-sa-pubsub.iam.gserviceaccount.com
If you are looking for it in the console, it's tricky: you have to go to Access -> IAM, and then click the checkbox in the top right corner to display the Google technical accounts.
In case anyone needs it, here are the functions that I put together from @guillaume blaquiere's answer:
private async bindPolicyToSubscriber(
  subscriptionTopicName: string,
  subscriptionName: string,
) {
  if (process.env.PROJECT_NUMBER) {
    try {
      const pubSubTopic = this.getClient().topic(subscriptionTopicName);
      const myPolicy = {
        bindings: [
          {
            role: 'roles/pubsub.subscriber',
            members: [
              `serviceAccount:service-${process.env.PROJECT_NUMBER}@gcp-sa-pubsub.iam.gserviceaccount.com`,
            ],
          },
        ],
      };
      await pubSubTopic
        .subscription(subscriptionName)
        .iam.setPolicy(myPolicy);
    } catch (e) {
      console.error('Error while binding policy.', e);
    }
  }
}

private async bindPolicyToDeadLetterTopic(deadLetterTopicName: string) {
  if (process.env.PROJECT_NUMBER) {
    try {
      const pubSubTopic = this.getClient().topic(deadLetterTopicName);
      const myPolicy = {
        bindings: [
          {
            role: 'roles/pubsub.publisher',
            members: [
              `serviceAccount:service-${process.env.PROJECT_NUMBER}@gcp-sa-pubsub.iam.gserviceaccount.com`,
            ],
          },
        ],
      };
      await pubSubTopic.iam.setPolicy(myPolicy);
    } catch (e) {
      console.error('Error while binding policy.', e);
    }
  }
}

Why do I not have access to my firebase storage images?

I have two image files uploaded to firebase storage:
capsule house.jpg was uploaded through the UI (clicking the Upload file button).
upload_64e8fd... was uploaded from my backend server (Node.js) using this:
const bucket = fbAdmin.storage().bucket('gs://assertivesolutions2.appspot.com');
const result = await bucket.upload(files.image.path);
capsule house.jpg is recognized as a JPEG and a link to it is supplied in the right-hand margin. If I click on it, I see my image in a new tab. You can see for yourself:
https://firebasestorage.googleapis.com/v0/b/assertivesolutions2.appspot.com/o/capsule%20house.jpg?alt=media&token=f5e0ccc4-7916-4245-b813-dbdf1838556f
upload_64e8fd... is not recognized as any kind of image file and no link is provided.
The result returned on the backend is a huge json object with the following fields:
"selfLink": "https://www.googleapis.com/storage/v1/b/assertivesolutions2.appspot.com/o/upload_64e8fd09f787acfe2728ae73158e20ab"
"mediaLink": "https://storage.googleapis.com/download/storage/v1/b/assertivesolutions2.appspot.com/o/upload_64e8fd09f787acfe2728ae73158e20ab?generation=1590547279565389&alt=media"
The first one sends me to a page that says this:
{
  "error": {
    "code": 401,
    "message": "Anonymous caller does not have storage.objects.get access to the Google Cloud Storage object.",
    "errors": [
      {
        "message": "Anonymous caller does not have storage.objects.get access to the Google Cloud Storage object.",
        "domain": "global",
        "reason": "required",
        "locationType": "header",
        "location": "Authorization"
      }
    ]
  }
}
The second one gives me something similar:
Anonymous caller does not have storage.objects.get access to the Google Cloud Storage object.
The rules for my storage bucket are as follows:
rules_version = '2';
service firebase.storage {
  match /b/{bucket}/o {
    match /{allPaths=**} {
      allow read, write: if true;
    }
  }
}
I'm allowing all reads and writes.
So why does it say I don't have access to see my image when it's uploaded through my backend server?
I'd also like to know why it doesn't recognize it as a jpeg when it's uploaded through my backend server, but it does when uploaded through the UI, but I'd like to focus on the access issue for this question.
Thanks.
By default, files are uploaded as private unless you change your bucket settings, as mentioned here. The code below is an example of how to change the visibility of your documents.
/**
 * {@inheritdoc}
 */
public function setVisibility($path, $visibility)
{
    $object = $this->getObject($path);
    if ($visibility === AdapterInterface::VISIBILITY_PRIVATE) {
        $object->acl()->delete('allUsers');
    } elseif ($visibility === AdapterInterface::VISIBILITY_PUBLIC) {
        $object->acl()->add('allUsers', Acl::ROLE_READER);
    }
    $normalised = $this->normaliseObject($object);
    $normalised['visibility'] = $visibility;
    return $normalised;
}
You can check how to set that via console, following the tutorial in the official documentation: Making data public
Besides that, as indicated in the comment by @FrankvanPuffelen, you won't have a generated URL for the file to be accessed. You can find more information about it here.
Let me know if the information helped you!
The other answer helped me! I have no idea why the Console had me make those security rules if they won't apply...
Based on the Node.js docs (and probably other languages), there is a simple way to make the file public during upload:
const result = await bucket.upload(files.image.path, {public: true});
This same option works for bucket.file().save() and similar APIs.

Uploading multiple files to stripe.files.create (Node-stripe)

I am trying to upload files to stripe which are submitted by the user in my frontend to verify their identity before they can sell on my platform.
Currently, the files are sent via an API request to the backend where I can upload a single file, and afterwards, I attach it to that user's account.
let file = {
  data: fs.readFileSync(files.IDFront.path),
  name: files.IDFront.name,
  type: files.IDFront.type
}
stripe.files.create({
  purpose: 'identity_document',
  file
}, function(err, file) {
  if (err) res.send({success: false, error: err})
  else {
    // attach to user's account
  }
})
This works just fine, but some identity documents require pictures of the front and back, so my question is can I upload two files at once using stripe.files.create? I can't seem to find anything in Stripe's API docs which mentions this, and I don't want to use stripe.files.create twice in one function because I feel that isn't a very efficient way to write the function.
Any suggestions would be greatly appreciated
It is important to note that your documents still need to be sent to Stripe in their own calls in order to get their respective tokens.
The function below takes an object of document names and the returned stripe tokens
{
  document: <stripeToken1>,
  additional_document: <stripeToken2>
}
You can then iterate through these and append to the update object in one go
// create the document object template
const documentObject = {
  individual: {
    verification: {},
  },
}

Object.entries(imageTokens).map(async ([_, token]) => {
  const [keyToken] = Object.entries(token)
  // separate key and token from object
  const [documentKey, stripeTokenId] = keyToken
  // convert our naming convention to stripe's expected naming
  const checkedKey = documentKey === 'document_front' ? 'document' : documentKey
  // append to document object
  documentObject.individual.verification[checkedKey] = {
    front: stripeTokenId,
  }
  return await stripe.accounts.update(stripeAccountId, documentObject)
})

Google Drive REST API: folder not visible after creation via Node.js

When creating a folder via the Google Drive REST API, I cannot see the newly created folder in the Google Drive console. The code works; it returns the JSON with all the data of the folder just created.
function createFolder(nameProduct, folderId) {
  console.log("folder name : " + nameProduct);
  console.log("id parent : " + folderId);
  var fileMetadata = {
    'name': nameProduct,
    'mimeType': 'application/vnd.google-apps.folder',
    parent: [folderId]
  };
  let objFolder = {
    auth: jwToken,
    resource: fileMetadata,
    fields: 'id'
  };
  return new Promise((resolve, reject) => {
    drive.files.create(objFolder).then(function(response) {
      // Handle the results here (response.result has the parsed body).
      console.log("Response", response);
      resolve(response);
    },
    function(err) {
      // handle error here.
      reject(err);
      console.error("Execute error", err);
    });
  });
}

var jwToken = new google.auth.JWT(
  key.client_email,
  null,
  key.private_key, ["https://www.googleapis.com/auth/drive"],
  null
);

jwToken.authorize((authErr) => {
  if (authErr) {
    console.log("error : " + authErr);
    return;
  } else {
    console.log("Authorization accorded");
  }
});
The authentication works fine and I'm able to create files; I have this problem only with folders.
Why is it not visible on Google Drive? The method correctly returns the id.
You appear to be using a service account to authenticate. Service accounts are dummy users: they have their own Google Drive account, Google Calendar account, and probably a few more.
When you uploaded that file, it was uploaded to the service account's Google Drive account. There is no web view for a service account; the only access you have to it is programmatic.
Option: upload to your account
Create a directory on your Google Drive account, share that directory with the service account, and upload to that directory.
Option: share the service account's directory with you
Have the service account create a directory and then grant yourself permission on it. You will then be able to see that directory in your account.
Note
When the service account uploads files, it will be the owner of those files. Make sure you grant yourself permissions on these files, or you won't be able to access them. Yes, you can have a file on your Drive account that you don't have permission to access. 😉
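The second option above can be sketched with the googleapis client: the service account creates the folder, then shares it with your own account via permissions.create so it appears under "Shared with me". Here `drive` is assumed to be an authorized Drive v3 client and the email address is a placeholder.

```javascript
// Pure helper: request body for sharing a file with a user.
function buildUserPermission(emailAddress, role = 'writer') {
  return { type: 'user', role, emailAddress };
}

// Create a folder owned by the service account, then grant your own
// Google account access so the folder is visible in your Drive.
async function createSharedFolder(drive, name, userEmail) {
  const folder = await drive.files.create({
    requestBody: {
      name,
      mimeType: 'application/vnd.google-apps.folder',
    },
    fields: 'id',
  });
  await drive.permissions.create({
    fileId: folder.data.id,
    requestBody: buildUserPermission(userEmail),
  });
  return folder.data.id;
}
```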
