I'm trying to connect to my DynamoDB table from inside a React app. I have AWS credentials set up locally. When I run my app, I get the following error in Chrome DevTools: "Error: Credential is missing".
Oddly, if I run the AWS example linked below, which uses pretty much the same code, with node in the terminal, it works fine. https://github.com/awsdocs/aws-doc-sdk-examples/blob/main/javascriptv3/example_code/dynamodb/src/partiQL_examples/src/partiql_getItem.js
To run the AWS example, I created a new .mjs file inside my React src folder, so it should have all the same access as the React app, right? No credentials are explicitly added in the .mjs file or the React app.
Why doesn't the React environment have access to the credentials? I've tried both ~/.aws/credentials and environment variables. The AWS SDK documentation seems to say that it should just work for Node. Any thoughts?
import { DynamoDBClient, ExecuteStatementCommand } from '@aws-sdk/client-dynamodb';

function App() {
  // Client is configured with a region only; no credentials are passed explicitly
  const dynamoDB = new DynamoDBClient({ region: "us-west-2" });

  async function loadFromCloud() {
    // PartiQL statement against the table
    const command = new ExecuteStatementCommand({
      Statement: `select * from TableX`
    });
    try {
      const data = await dynamoDB.send(command);
      console.log(data);
    } catch (error) {
      console.log(error);
    }
  }
  // ...rest of the component
}
Make sure that you have AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY specified as environment variables, and that you also specify AWS_SDK_LOAD_CONFIG=1.
However, your React app runs in the browser. That approach only works in a Node.js app.
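For a browser client you have to hand the SDK credentials explicitly. A minimal sketch, assuming you issue them through a Cognito Identity Pool (the pool ID below is a placeholder, everything else reuses the question's code):

import { DynamoDBClient, ExecuteStatementCommand } from "@aws-sdk/client-dynamodb";
import { fromCognitoIdentityPool } from "@aws-sdk/credential-providers";

// Browser-safe credentials: temporary, scoped credentials from a Cognito Identity Pool
const dynamoDB = new DynamoDBClient({
  region: "us-west-2",
  credentials: fromCognitoIdentityPool({
    clientConfig: { region: "us-west-2" },
    identityPoolId: "us-west-2:00000000-0000-0000-0000-000000000000", // placeholder ID
  }),
});

export async function loadFromCloud() {
  try {
    // Same PartiQL statement as in the question
    const data = await dynamoDB.send(
      new ExecuteStatementCommand({ Statement: "select * from TableX" })
    );
    console.log(data);
  } catch (error) {
    console.log(error);
  }
}

The alternative is to keep the table behind a small server-side endpoint, so no AWS credentials ever reach the browser at all.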
Related
I built a simple React app with create-react-app and I want to use serverless functions with Netlify.
I use the DataStax Astra Cassandra DB for that purpose, and created a netlify.toml config and .env variables (for the database) inside my React project.
I set up a serverless functions folder for Netlify:
const { createClient } = require('@astrajs/collections')

// create an Astra client
exports.handler = async function (event, context) {
  try {
    const astraClient = await createClient({
      astraDatabaseId: process.env.ASTRA_DB_ID,
      astraDatabaseRegion: process.env.ASTRA_DB_REGION,
      applicationToken: process.env.ASTRA_DB_APPLICATION_TOKEN,
    })
    // const basePath = `/api/rest/v2/KEYSPACES/${process.env.ASTRA_DB_KEYSPACE}/collections/messages`
    const messagesCollection = astraClient
      .namespace(process.env.ASTRA_DB_KEYSPACE)
      .collection('messages')
    const message = await messagesCollection.create('msg1', {
      text: 'hello my name is Marc!',
    })
    return {
      statusCode: 200,
      body: JSON.stringify(message),
    }
  } catch (e) {
    console.error(e)
    return {
      statusCode: 500,
      body: JSON.stringify(e),
    }
  }
}
It works when I run netlify dev; my .env variables are then injected into the .js file.
However, I am wondering if I should use the Node.js DataStax collections client here, or the REST API functions from DataStax (https://docs.datastax.com/en/astra/docs/astra-collection-client.html)? Because with React, it's essentially running in the browser, or not? I am also wondering why this still works with Node.js (because it's not a Node.js environment with React, or is it?)
I am getting access to my functions via localhost:8888/.netlify/functions/functionName. Is this served from a Node.js server, or is it browser stuff?
It works when I run netlify dev; my .env variables are then injected into the .js file. However, I am wondering if I should use the Node.js DataStax collections client here, or the REST API functions from DataStax (https://docs.datastax.com/en/astra/docs/astra-collection-client.html)? Because with React, it's essentially running in the browser, or not?
Correct - you would expose your Astra credentials to the world if you connect to your database directly from your React app.
I am wondering why this still works with Node.js (because it's not a Node.js environment with React, or is it?) I am getting access to my functions via localhost:8888/.netlify/functions/functionName. Is this served from a Node.js server, or is it browser stuff?
Netlify functions run server-side, so it is safe to connect to Astra in your function code. Here's an example: https://github.com/DataStax-Examples/todo-astra-jamstack-netlify/blob/master/functions/createTodo.js
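As an illustration (the function path comes from the question; the rest is a sketch), the React side only ever talks HTTP to the function, so the Astra credentials stay on the server:

// React side: call the Netlify function over HTTP; Astra credentials never reach the browser
async function createMessage() {
  const response = await fetch('/.netlify/functions/functionName', { method: 'POST' })
  const message = await response.json()
  console.log(message)
}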
I am a beginner trying to learn web development. I have built a small Node.js app and deployed it on Heroku. I am trying to do the same on GCP and learn the platform.
The code I used is here: https://github.com/unnikrishnan-r/firstgcpdeployment
The issue I am facing is that I am unable to access environment variables in the app.
My index.js:
exports.envVar = (req, res) => {
  // Logs and sends the JAWSDB_URL environment variable as the response
  console.log("bbbccc");
  console.log(process.env.JAWSDB_URL);
  res.status(200).send(process.env.JAWSDB_URL);
  return process.env.JAWSDB_URL;
};
My .env.yaml is:
JAWSDB_URL: testtest456
I then used the command below to deploy the function to gcloud:
gcloud functions deploy envVar --env-vars-file .env.yaml --runtime nodejs8 --trigger-http;
Deployment was successful, and it was tested using the console and
https://us-central1-howtodeployingcp.cloudfunctions.net/envVar
During the deploy, this happened:
Can someone help me here?
It seems this line is problematic:
var dbUrl = envVar.envVar();
You're calling the function without passing in the req and res arguments it expects. Perhaps you forgot to comment out this line.
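If you do want to exercise the handler locally, one option is to comment that line out; another is to pass stub objects, roughly like this (the stubs are purely illustrative, not part of the repo):

const envVar = require('./index'); // wherever exports.envVar lives

// var dbUrl = envVar.envVar();    // problematic: no req/res passed

envVar.envVar(
  {},                                                                  // stub request
  { status: (code) => ({ send: (body) => console.log(code, body) }) }  // stub response
);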
I am trying to move the tiny Node/Express app I made into Firebase Functions.
The app uses dotenv variables. Earlier I thought that if I just deployed it and put dotenv in the dependencies it would work, but that didn't happen.
So I went to the environment configuration article for Firebase to understand how I can set .env values.
It says to set things by doing something like this:
firebase functions:config:set someservice.key="THE API KEY" someservice.id="THE CLIENT ID"
But I have so many environment variables that doing that for each one seems like a cumbersome task.
So let's say this is my environment file:
# App port Address
PORT = 8080
# Google Secret
GOOGLE_CALLBACK_URL = http://localhost:8080/auth/google/callback
GOOGLE_CLIENT_ID = 4048108-bssbfjohpu69vl6jhpgs1ne0.apps.googleusercontent.com
GOOGLE_CLIENT_SECRET = lTQHpjzY57oQpO
# Client Address
CLIENT_ADDRESS = http://localhost:3000/
# Meetup Secret
MEETUP_CALLBACK_URL = http://localhost:8080/auth/meetup/callback
MEETUP_CLIENT_ID = ef6i9f7m6k0jp33m9olgt
MEETUP_CLIENT_SECRET = sk3t5lnss2sdl1kgnt
#EventBrite Secret
EVENTBRITE_CALLBACK_URL = http://localhost:8080/auth/eventbrite/callback
EVENTBRITE_CLIENT_ID = UU2FXKGYHJRNHLN
EVENTBRITE_CLIENT_SECRET = NA55QG52FAOF6GDMLKSJBKYOPIGQU4R46HHEU4
How can I best set things up so that, when I run firebase serve --only functions,hosting, it doesn't throw any errors such as:
OAuth2Strategy requires a clientID option
As of Feb 16, 2022, Firebase now supports .env, .env.prod, .env.dev, and .env.local files natively!
https://firebase.google.com/docs/functions/config-env
Set your variables in the corresponding environment, and then run firebase use dev or firebase use prod before you deploy.
Your variables can then be accessed via process.env.VARIABLE_NAME.
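A minimal sketch of that layout, assuming dev and prod are project aliases you have already configured, and borrowing GOOGLE_CLIENT_ID from the question's .env (the values are placeholders):

# functions/.env.dev
GOOGLE_CLIENT_ID=dev-client-id-placeholder

# functions/.env.prod
GOOGLE_CLIENT_ID=prod-client-id-placeholder

Then in a function:

const functions = require('firebase-functions');

// Reads the value from whichever .env file matched the active project alias at deploy time
exports.whichClient = functions.https.onRequest((req, res) => {
  res.send(process.env.GOOGLE_CLIENT_ID);
});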
UPDATED 2019-06-04
I'm very sorry. This solution is wrong.
I found the correct way.
https://stackoverflow.com/a/45064266/1872674
You should put a .runtimeconfig.json into the functions directory. Your dotenv variables move into .runtimeconfig.json in JSON format.
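If your values are already set via firebase functions:config:set, one way to produce that file (run inside the functions directory with the standard Firebase CLI) is:
firebase functions:config:get > .runtimeconfig.json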
This is my solution.
const functions = require('firebase-functions');

// Load config from a local .env.json when running locally, otherwise from functions.config()
const functionConfig = () => {
  if (process.env.RUN_LOCALLY) {
    const fs = require('fs');
    return JSON.parse(fs.readFileSync('.env.json'));
  } else {
    return functions.config();
  }
};
functionConfig() is then called from your Firebase Function:
exports.helloWorld = functions.https.onRequest((request, response) => {
response.send("someservice id is: " + functionConfig().someservice.id);
});
.env.json is like:
{
"someservice": {
"key":"THE API KEY",
"id":"THE CLIENT ID"
}
}
Finally, run the command with the RUN_LOCALLY variable.
RUN_LOCALLY=1 firebase serve
When you deploy the functions,
don't forget to update the environment configuration in Firebase using the .env.json.
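For example, using the same config:set command format quoted in the question, so that the deployed functions.config() has the same shape as the local .env.json:
firebase functions:config:set someservice.key="THE API KEY" someservice.id="THE CLIENT ID"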
The Firebase CLI currently doesn't allow you to set process environment variables on deployment. This may change in the future. The configuration vars it supports today (that you linked to) are not actually process environment variables; they are stored somewhere else that is not the process environment.
If you absolutely need to be able to set process environment variables, you will have to deploy your function with gcloud, which means that you also won't be able to use the firebase-functions module to define your function. Start with the Google Cloud Functions documentation to learn about deployment from a Cloud perspective.
If you want to use the Firebase tools, I'd recommend that you find a different way to configure your function that doesn't involve process environment variables.
If you want your functions to use process.env variables, you can set them by going to the Google Cloud console, under Cloud Functions. You will find the deployed Firebase functions there; select each function one by one and set the environment variables there.
What I did was create an env.json file inside the functions folder:
//env.json
{
"send_email_config": {
"FOW_ADMIN_EMAIL": "admin.example#gmail.com",
"FOW_ADMIN_EMAIL_PASSWORD": "adminPassExample",
"FOW_ADMIN_RECEIVER_EMAIL": "adminReceiver.example#gmail.com"
}
}
Then I created a file called env.js with a function that returns the value of functions.config(), which acts as a kind of env module:
//env.js
// TO UPDATE functions.config().env RUN INSIDE functions FOLDER:
// firebase functions:config:set env="$(cat env.json)"
const functions = require('firebase-functions');
const getEnvConfig = () => {
/*
  I return functions.config().env because I set the env.json values under the env
  property by running: firebase functions:config:set env="$(cat env.json)"
*/
return functions.config().env
}
exports.getEnvConfig = getEnvConfig;
exports.PROCESS_ENV = getEnvConfig();
Then I just import PROCESS_ENV to access the values I set in the env.json file, for example:
const { PROCESS_ENV } = require('./utils/env');
exports.mailCredentials = {
main: {
email: PROCESS_ENV.send_email_config.FOW_ADMIN_EMAIL,
password: PROCESS_ENV.send_email_config.FOW_ADMIN_EMAIL_PASSWORD
},
receiver: {
email: PROCESS_ENV.send_email_config.FOW_ADMIN_RECEIVER_EMAIL
}
}
IMPORTANT!!!!
For this to work you have to deploy functions.config().env with the values of the env.json file.
To deploy functions.config().env you just have to run the following command inside the functions folder: firebase functions:config:set env="$(cat env.json)"
Also, don't forget to add env.json to your .gitignore.
If your Firebase functions folder is inside your React project, just add */env.json to your React project's .gitignore file.
Your .env file needs to be in the "functions" folder.
Here is the documentation about this:
https://firebase.google.com/docs/functions/config-env
You can access variables like this.
This is your .env:
PLANET=Earth
AUDIENCE=Humans
And a Firebase function:
// Responds with "Hello Earth and Humans"
exports.hello = functions.https.onRequest((request, response) => {
response.send(`Hello ${process.env.PLANET} and ${process.env.AUDIENCE}`);
});
For your local environment you can have a .env.local file, and the contents of .env.local take precedence over .env when you are using the Emulator Suite.
Using the Node.js Admin SDK with Firebase Functions, I get a timeout whenever I try to access the Realtime Database. This occurs only when testing a function locally (firebase serve --only functions,hosting) and when the default app is initialized using functions.config().firebase.
This is a new behavior that started just a couple of days ago. However, if I initialize the default app with the serviceAccount.json file, everything works as expected.
I'm using firebase-admin version 4.2.1 and firebase-functions version 0.5.9.
I wrote a straightforward HTTP-triggered function that fails due to the timeout:
const functions = require('firebase-functions');
const admin = require('firebase-admin');

// Initialize the default app from the functions runtime config
admin.initializeApp(functions.config().firebase);
const db = admin.database();

exports.testDbConnection = functions.https.onRequest((req, res) => {
  // Read the database root once and return it to the caller
  return db.ref().once('value')
    .then(function(snapshot) {
      res.json(snapshot);
    }).catch(function(error) {
      res.json(error);
    });
});
From the documentation:
Always end an HTTP function with send(), redirect(), or end(). Otherwise, your function might continue to run and be forcibly terminated by the system.
See https://firebase.google.com/docs/functions/http-events#terminate_http_functions
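A minimal illustration of that rule (doWork below is a hypothetical async helper, not from the question):

const functions = require('firebase-functions');

exports.example = functions.https.onRequest((req, res) => {
  doWork()                                            // hypothetical async work
    .then(result => res.status(200).send(result))     // send() terminates the function
    .catch(err => res.status(500).send(err.message)); // the error path must terminate too
});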
This might depend on the firebase-tools version that you are using, but it looks similar to this GitHub issue.
The solution for it is to either upgrade to the latest version of the CLI or use the workaround solution:
Go to https://cloud.google.com/console/iam-admin/serviceaccounts
Click “Create service account”, give it a name (e.g. emulator), and give it the Project > Owner role. Check “Furnish a new private key” and pick “JSON”.
Save the file somewhere on your computer
Run export GOOGLE_APPLICATION_CREDENTIALS="absolute/path/to/file.json"
Run firebase serve --only functions
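For reference, the explicit initialization that the question says does work looks roughly like this (the file path and databaseURL are placeholders):

const admin = require('firebase-admin');
const serviceAccount = require('./serviceAccount.json'); // placeholder path

admin.initializeApp({
  credential: admin.credential.cert(serviceAccount),
  databaseURL: 'https://<your-project>.firebaseio.com', // placeholder
});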
I'm trying to read a Heroku environment variable in a Node/Express app.
I set the env variable in Heroku using
heroku config:set GITHUB_TOKEN=<my github api token without quotation marks>
It is set correctly (I checked by running heroku config)
gitUserSearchController.js:
githubUserSearch.controller('GitUserSearchController', ['$resource', function($resource) {
  var self = this;
  var searchResource = $resource('https://api.github.com/search/users/');
  // process.env is a Node.js global; it does not exist in the browser
  var githubToken = process.env.GITHUB_TOKEN;

  self.doSearch = function() {
    self.searchResult = searchResource.get(
      { q: self.searchTerm, access_token: githubToken }
    );
  };
}]);
I get a console error readout of "ReferenceError: process is not defined" from line 5.
You can't see local environment variables from the client side of a web app. This is desired behavior, of course, because otherwise you'd have just shared your GitHub token with the world!
process.env.GITHUB_TOKEN will work within Node.js, but it won't work in the user's browser (that looks like an Angular controller meant to run in the browser, correct?)
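The usual fix is a small server-side proxy. A minimal sketch assuming an Express server (the route name and response shape are illustrative):

// server.js, running on Heroku where process.env.GITHUB_TOKEN exists
const express = require('express');
const app = express();

app.get('/api/github-search', (req, res) => {
  const token = process.env.GITHUB_TOKEN; // stays on the server
  // ...call https://api.github.com/search/users?q=... with the token here,
  // then forward only the search results to the browser:
  res.json({ ok: true }); // placeholder response
});

app.listen(process.env.PORT || 3000);

The Angular controller would then call /api/github-search instead of hitting the GitHub API directly with the token.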