Getting the wrong Google Cloud Storage path in Strapi v4 - node.js

Currently I'm uploading a file. The upload to Google Cloud Storage succeeds, but when I fetch the file I get an error like this.
If I check the details, the source URL is wrong. For example, the correct link should be like this: https://storage.cloud.google.com/cms-strapi-storage/thumbnail_cloudsql_ae61374abd/thumbnail_cloudsql_ae61374abd.png
Can anyone help me? Thank you.
My reference and the package I used come from this source: https://www.npmjs.com/package/strapi-provider-upload-google-cloud-storage#setup-auth

The issue is already solved!
Here is how I solved it.
Because the environment is in production mode, in Strapi v4 you should create all of these files under config/env/production.
Create a file plugins.js and fill it like this:
const fs = require('fs');
require('dotenv').config();

module.exports = ({ env }) => ({
  upload: {
    config: {
      provider: 'strapi-provider-upload-google-cloud-storage',
      providerOptions: {
        // GCS_SERVICE_ACCOUNT is the path to the service account key JSON file
        serviceAccount: JSON.parse(fs.readFileSync(process.env.GCS_SERVICE_ACCOUNT)),
        bucketName: env('GCS_BUCKET_NAME'),
        basePath: env('GCS_BASE_PATH'),
        baseUrl: env('GCS_BASE_URL'),
        publicFiles: true,
        uniform: false,
        gzip: true,
      },
    },
  },
});
The key is publicFiles: if its value is false, the provider doesn't create a public URL in Google Cloud Storage, so we cannot fetch and see the image.
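For reference, the provider options above read these environment variables. A minimal sketch of the matching .env entries (all values are placeholders; GCS_SERVICE_ACCOUNT is a path to the downloaded service account key file, and the bucket and base URL must match your own project):
GCS_SERVICE_ACCOUNT=./gcs-service-account.json
GCS_BUCKET_NAME=cms-strapi-storage
GCS_BASE_PATH=
GCS_BASE_URL=https://storage.googleapis.com/cms-strapi-storage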
As an additional note, don't forget to add the security middleware configuration in order to get permission from GCS (Google Cloud Storage):
module.exports = [
  'strapi::errors',
  {
    name: 'strapi::security',
    config: {
      contentSecurityPolicy: {
        useDefaults: true,
        directives: {
          'connect-src': ["'self'", 'https:'],
          'img-src': ["'self'", 'data:', 'blob:', 'storage.googleapis.com'],
          'media-src': ["'self'", 'data:', 'blob:', 'storage.googleapis.com'],
          upgradeInsecureRequests: null,
        },
      },
    },
  },
  'strapi::cors',
  'strapi::poweredBy',
  'strapi::logger',
  'strapi::query',
  'strapi::body',
  'strapi::favicon',
  'strapi::public',
];

Related

Webpack aliases work in browser, but not in node

To start off, this is my first post on Stack Overflow, so I hope I'm doing it right...
My Webpack aliases work in the browser, but not in Node.
I have a package (let's call it Services) with a subfolder dev, and another project (let's call it MobX) which uses Services. In MobX, I have a webpack configuration which is supposed, in development mode, to change the paths of Services so that the build resolves to the dev subfolder instead.
Here's a sample of the code:
const devConfig = {
  ...commonConfig,
  mode: "development",
  devtool: "cheap-module-source-map",
  entry: "./src/lib/index.ts",
  output: {
    path: outputDir,
    library: name,
    libraryTarget: "umd",
    globalObject: "this",
  },
  optimization: {
    minimize: false,
  },
  plugins: [...commonConfig.plugins, forkTsCheckerWebpackPlugin],
  resolve: {
    ...commonConfig.resolve,
    alias: {
      "@intuition/services-apis$": "@intuition/services-apis/dev",
    },
  },
};

const devConfigNode = {
  ...devConfig,
  target: ["node"],
  name: "node",
  output: {
    ...devConfig.output,
    filename: splitChunks ? "[name].[contenthash].node.js" : "index.node.js",
  },
};

const devConfigBrowser = {
  ...devConfig,
  target: ["web", "es5"],
  name: "browser",
  output: {
    ...devConfig.output,
    filename: splitChunks ? "[name].[contenthash].browser.js" : "index.browser.js",
  },
};
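(For completeness: the export isn't part of the snippet above, so the following line is an assumption, but both configs would typically be handed to webpack as a multi-compiler array.)
// Assumed export, not shown in the original snippet:
module.exports = [devConfigNode, devConfigBrowser];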
I don't understand why, in Node, it doesn't go into @intuition/services-apis/dev. It works fine in the browser, but no matter what I do, in Node it doesn't work. I have been at this for days now and I can't seem to figure it out.
If my explanation wasn't clear enough and you need more details, please do not hesitate to ask.

EISDIR issue in an Azure-hosted Nuxt Vue website

I have hosted a web app in Azure DevOps; the application is built with Vue and Nuxt (@vue/cli 5.0.1 and "nuxt": "^2.15.8").
After hosting, the web application works fine: I can log in, and it navigates me to the listing page. But when I refresh that page, it shows this error in the browser: Sorry, check with the site admin for error: EISDIR .., and throws a 500 error in the console. In my login response I only get an access token, there is no refresh token; could that be an issue? Or is it some other setting on the Azure side? We tried setting pm2 serve /home/site/wwwroot --no-daemon --spa in Azure, but it still isn't working. Everything works fine in my dev environment.
export default {
  ssr: false,
  head: {
    title: 'BBG Returns Self Service',
    meta: [
      { charset: 'utf-8' },
      { name: 'viewport', content: 'width=device-width, initial-scale=1' },
      { hid: 'description', name: 'description', content: '' },
      { name: 'format-detection', content: 'telephone=no' },
    ],
    link: [{ rel: 'icon', type: 'image/x-icon', href: '/favicon.ico' }],
  },
  plugins: ['~/plugins/clearTokens.js'],
  components: true,
  buildModules: ['@nuxtjs/style-resources'],
  env: {
    BASE_URL: 'https://my-api-url',
  },
  publicRuntimeConfig: {
    baseURL: process.env.BASE_URL,
  },
  router: {
    mode: 'history',
  },
  styleResources: {
    // scss: ["~assets/scss/main.scss"],
  },
  modules: ['@nuxtjs/i18n'],
  build: { transpile: [/^@storefront-ui/] },
  server: {
    port: 4200,
  },
  i18n: {
    locales: [
      {
        code: 'en',
        iso: 'en-GB',
        name: 'English',
        file: 'en.json',
        icon: 'uk.svg',
      },
      {
        code: 'de',
        iso: 'de-DE',
        name: 'Deutsch',
        file: 'de.json',
        icon: 'de.svg',
      },
    ],
    lazy: true,
    langDir: 'i18n/',
    defaultLocale: 'en',
    detectBrowserLanguage: false,
  },
  target: 'static',
}
If you're deploying an SPA app, you need to have both:
target: 'static' (the default being 'server')
ssr: false
This removes quite a few benefits regarding SEO and performance, but at least you still get all the benefits of the Nuxt DX and ecosystem.
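In other words, the relevant pair in nuxt.config.js (already present in the config above) boils down to:
// nuxt.config.js - minimal static SPA setup (sketch; all other options omitted)
export default {
  ssr: false,        // render on the client only
  target: 'static',  // pre-generate files and serve them from a static host
}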
To host it on Azure, you have several approaches. If you're using:
a static app, you can follow the official documentation for Azure Static Web Apps: https://nuxtjs.org/deployments/azure-static-web-apps/
an SSR app, you can follow this one about the Azure Portal: https://nuxtjs.org/deployments/azure-portal
The actual issue was the Azure configuration. The resource should be created as a Static website; then it works fine.
Please follow the official documentation to understand how to deploy on Azure:
https://nuxtjs.org/deployments/azure-static-web-apps/
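If a hard refresh on a client-side route still fails after switching to a static resource, a navigation fallback is usually the missing piece on Azure Static Web Apps. A minimal sketch of staticwebapp.config.json (the exclude patterns are assumptions; adapt them to your build output):
{
  "navigationFallback": {
    "rewrite": "/index.html",
    "exclude": ["/_nuxt/*", "/*.{css,js,ico,png,svg,json}"]
  }
}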

Helmet: How to allow images to load from a different domain (Err: NotSameOriginAfterDefaultedToSameOriginByCoep)

I am using Helmet to set CSP headers, and React on the frontend.
I store my images on a subdomain (assets.mydomain.com). For some reason I get the following error message when loading the images: ERR_BLOCKED_BY_RESPONSE.NotSameOriginAfterDefaultedToSameOriginByCoep.
I also use a script tag for Google Analytics. This one also gives me an error message: Refused to connect to https://www.google-analytics.com/ because it violates... "default-src 'self'".
This is how I have configured my CSP currently:
app.use(
  helmet({
    contentSecurityPolicy: {
      directives: {
        defaultSrc: ["'self'"],
        scriptSrc: [
          "'self'",
          "https://www.googletagmanager.com",
          "'self'",
          "https://www.google-analytics.com",
          "'unsafe-inline'",
          "mydomain.com",
        ],
        imgSrc: ["'self'", "assets.mydomain.com"],
      },
    },
    crossOriginEmbedderPolicy: false,
    crossOriginResourcePolicy: false,
  })
);
What is wrong with my CSP configuration?
So if anyone comes across this question: I figured it out. As it turns out, the Cross-Origin-Embedder-Policy header was giving me trouble and had to be disabled. Helmet has a built-in option to do so: crossOriginEmbedderPolicy: false. More info here.
For most people I guess that'll work. However, it did not work for me: the header was still being set. Disabling it with Express also did not work (app.disable('cross-origin-embedder-policy');).
I have no idea why the header was still being set, but I had to hide it manually in my nginx configuration: proxy_hide_header cross-origin-embedder-policy;
My config:
app.use(
  helmet({
    contentSecurityPolicy: {
      directives: {
        defaultSrc: ["'self'"],
        scriptSrc: [
          "'self'",
          "'unsafe-inline'",
          "https://*.google.com",
          "https://*.google-analytics.com",
          "https://*.googletagmanager.com",
          "https://*.hotjar.com",
          "https://*.mollie.com",
        ],
        connectSrc: [
          "'self'",
          "'unsafe-inline'",
          "https://*.google.com",
          "https://*.google-analytics.com",
          "https://*.googletagmanager.com",
          "https://*.hotjar.com",
          "https://*.mollie.com",
        ],
        imgSrc: [
          "'self'",
          "data:",
          "*.domain.nl",
          "*.amazonaws.com",
        ],
      },
    },
    // Will work for most, but did not work for me:
    // crossOriginEmbedderPolicy: false,
  })
);
// In nginx I manually hid the COEP header: proxy_hide_header cross-origin-embedder-policy;
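A quick way to verify that the proxy change took effect is to request a page and inspect the response headers. A small sketch assuming Node 18+ (global fetch) and a placeholder URL:
(async () => {
  // Replace with your own domain; expect null once nginx stops sending the header.
  const res = await fetch('https://example.com');
  console.log(res.headers.get('cross-origin-embedder-policy'));
})();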
I did it!!!
After literally hundreds of trial-and-error attempts, I found that this little piece of code fixed my problem of having two different services (two domains), and now my little app is working fine again, at 1:11 am!! :)
app.use(
  helmet({
    contentSecurityPolicy: {
      directives: {
        scriptSrc: ["'self'", 'www.google.com www.gstatic.com', "https://*.statcounter.com", "'unsafe-inline'"],
        frameSrc: ["'self'", "www.google.com", "https://*.statcounter.com"],
        connectSrc: ["'self'", 'https://*.statcounter.com'],
      },
    },
    crossOriginResourcePolicy: { policy: "cross-origin" },
    crossOriginEmbedderPolicy: false,
  })
);

Uploading an asset to Strapi with Cloudinary gives error: ENOTEMPTY: directory not empty

I was following this tutorial to create a web app that uses Cloudinary as an image host. I followed all the steps mentioned in it, but when I try to add an image it gives me this error (ErrorImage). I don't know what's causing this; is it some kind of connection error?
Here is my plugins.js:
module.exports = ({ env }) => ({
  upload: {
    config: {
      provider: 'cloudinary',
      providerOptions: {
        cloud_name: env('+++++++'),
        api_key: env('8+++++++++++'),
        api_secret: env('1T++++++++++++U'),
      },
    },
  },
});
My .env file also contains other stuff like JWT keys, host, port, etc.; here is what I added to it:
--hosts, ports, etc.--
CLOUDINARY_NAME = **********
CLOUDINARY_KEY = ***********
CLOUDINARY_SECRET = ***************
My middleware.js file:
module.exports = [
  'strapi::errors',
  {
    name: 'strapi::security',
    config: {
      contentSecurityPolicy: {
        useDefaults: true,
        directives: {
          'connect-src': ["'self'", 'https:'],
          'img-src': ["'self'", 'data:', 'blob:', 'res.cloudinary.com'],
          'media-src': ["'self'", 'data:', 'blob:', 'res.cloudinary.com'],
          upgradeInsecureRequests: null,
        },
      },
    },
  },
  'strapi::cors',
  'strapi::poweredBy',
  'strapi::logger',
  'strapi::query',
  'strapi::body',
  'strapi::session',
  'strapi::favicon',
  'strapi::public',
];

Google Firebase function tutorial: unexpected token =>

I have done some googling and haven't found an answer to my question. I am following the tutorial for Google Firebase functions here and have copied index.js exactly from the GitHub repository linked in the tutorial, as well as copying the code in 'chunks' while following along. I get this error after running firebase deploy --only functions:
error Parsing error: Unexpected token =>
It references this function:
exports.addMessage = (functions.https.onRequest(async (req, res) => { // This line
  // [END addMessageTrigger]
  // Grab the text parameter.
  const original = req.query.text;
  // [START adminSdkAdd]
  // Push the new message into Firestore using the Firebase Admin SDK.
  const writeResult = await admin.firestore().collection('messages').add({ original: original });
  // Send back a message that we've successfully written the message
  res.json({ result: `Message with ID: ${writeResult.id} added.` });
  // [END adminSdkAdd]
}));
Link to index.js file used in tutorial
My eslintrc.js file:
module.exports = {
  root: true,
  env: {
    es6: true,
    node: true,
  },
  extends: [
    "eslint:recommended",
    "google",
  ],
  rules: {
    quotes: ["error", "double"],
  },
};
Async functions and the await keyword were added in ECMAScript 2017, but env: { es6: true } only enables ES2015 parsing. You need to set ecmaVersion to 8 in your ESLint config:
module.exports = {
  root: true,
  env: {
    es6: true,
    node: true,
  },
  extends: [
    "eslint:recommended",
    "google",
  ],
  parserOptions: {
    ecmaVersion: 8,
  },
  rules: {
    quotes: ["error", "double"],
  },
};
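Recent ESLint versions also accept the equivalent year form, if you find that more readable:
parserOptions: {
  ecmaVersion: 2017, // same effect as 8
},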
