I am working in the SAP Cloud Foundry environment (PaaS), which has a microservices architecture. I want to build two applications using Node.js and deploy them on the SAP Cloud Foundry platform.
The problem, or confusion, I am facing is this: I have some custom helper code in a helper.js file that is very useful in both of my Node applications, and I do not want to duplicate the code of helper.js in both applications. As alternatives I found the following:
1. Create a private package
2. Use exports (but this works only for files inside a module, not between modules)
Please help me if there's any other way to do it, and if not, help me understand how to create a private npm package and how I can have my own org/user private registry.
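If there isn't a simpler way, option 1 is the usual one: helper.js becomes its own small package that both apps install as a dependency, which also makes exports work across applications. A minimal sketch (the scope and package name @myorg/helper are hypothetical):

// index.js of a hypothetical shared package "@myorg/helper";
// publish it to a private registry (or a private git repo) and
// run `npm install @myorg/helper` in both Cloud Foundry apps.
function formatDate(date) {
  // Stand-in for your real helper logic from helper.js.
  return date.toISOString();
}

module.exports = { formatDate };

// --- in either app, after installing the package ---
// const { formatDate } = require('@myorg/helper');
// console.log(formatDate(new Date()));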
I have a small NodeJS app I want to deploy to IBM Cloud as an "action". What I've been doing until now is just zipping the project files and creating/updating actions using the IBM Cloud CLI like this:
ibmcloud fn action create project-name C:\Users\myuser\Desktop\node-js-projects\some-project\test-folder.zip --kind nodejs:12
This was working great; however, I'm now testing a new project which has a much larger modules folder, and as such IBM Cloud won't accept it. I've turned my attention to using Docker, as the article below explains:
https://medium.com/weekly-webtips/adding-extra-npm-modules-to-ibm-cloud-functions-with-docker-fabacd5d52f1
Everything makes sense; however, I have no idea what to do with the credentials the app uses. Since IBM Cloud seems to require you to run "docker push", I'm assuming it's not safe to include a .env file in the Docker image?
I know that in IBM Cloud I can pass "parameters" to an action, but I'm not sure if that helps here. Can those params be accessed from code deployed this way?
Would really appreciate some help on this one. Hoping there's a straightforward, standard way of doing it that I've just missed. I'm brand new to Docker, so still learning.
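On the parameters question: in IBM Cloud Functions, default parameters bound to an action with --param are merged into the object passed to main at invocation time, which should also hold for a custom Docker runtime based on the Node.js action image (as in the linked article). A sketch, with a hypothetical apiKey parameter:

// main receives bound parameters merged with invocation arguments,
// so credentials can stay out of the image and out of any .env file.
// "apiKey" is a hypothetical parameter name; bind it with e.g.
//   ibmcloud fn action update project-name --param apiKey <value>
function main(params) {
  if (!params.apiKey) {
    return { error: 'apiKey parameter was not bound to the action' };
  }
  // ...use params.apiKey to talk to your backing service...
  return { ok: true };
}

module.exports.main = main;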
I'm working on a Node.js application hosted on Google Cloud, using Google App Engine. The app has a few settings like the following:
const TASK_BATCH_SIZE = 50;
Currently, every time we need to change a setting like that one to run some tests, we need to re-deploy the app, and that happens very often. We are looking for alternatives inside the Google Cloud ecosystem that would allow us to configure our running services without needing to re-deploy.
One of the things we found in the docs was Runtime Configurator, which still seems to be a beta product. For Node.js specifically, nodejs-rcloadenv is the only client library we found, but it doesn't seem to support the Watcher/Waiter concepts described in the Runtime Configurator docs, or any other way to subscribe to variable changes in a configuration resource.
Is Runtime Configurator the solution to our problem? Are there any other services inside the Google Cloud ecosystem, or any other library for Node.js that could help us with this?
If you want to update the service or make settings changes in it, you need to re-deploy the service.
You can't use Runtime Configurator as it's for Compute Engine and not for App Engine.
You can create a feature request on Google's Public Issue Tracker for your issue (to update configurations without re-deploying the service).
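If re-deploys are too costly in the meantime, one workaround, purely as a sketch and not a Google-endorsed mechanism, is to keep the settings in a Cloud Storage object and have the running service re-read it periodically (the bucket and object names below are hypothetical):

// Hedged workaround sketch: poll a JSON settings file in Cloud Storage
// instead of hardcoding constants in the deployed code.
const { Storage } = require('@google-cloud/storage');

const storage = new Storage();
let settings = { TASK_BATCH_SIZE: 50 }; // fallback defaults

async function refreshSettings() {
  try {
    const [contents] = await storage
      .bucket('my-app-config')   // hypothetical bucket
      .file('settings.json')     // hypothetical object
      .download();
    settings = JSON.parse(contents.toString());
  } catch (err) {
    console.error('Could not refresh settings, keeping previous:', err.message);
  }
}

// Re-read every 60 seconds; tune the interval to your needs.
setInterval(refreshSettings, 60 * 1000);
refreshSettings();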
I am trying to deploy a Node application which imports a private npm module to Google App Engine. I'm still stuck at npm install failing due to: Unable to authenticate, need: Basic realm="GitHub Package Registry".
One method of npm authentication is via the NODE_AUTH_TOKEN environment variable. GAE does not accept environment variables via the command line, only app.yaml, so I added my token to the app.yaml during my GitHub Actions CI process. It turns out that App Engine uses a separate Cloud Build environment to build, which doesn't have this environment variable; therefore, failure again. I also tried creating a cloudbuild.yaml and subbing in my environment variable, but no luck there. Lastly, I've tried to set my key via .npmrc like so:
//npm.pkg.github.com/gw-cocoon/:_authToken=$NPM_TOKEN
@gw-cocoon:registry=https://npm.pkg.github.com/gw-cocoon
and subbed in the token during CI. This fails for the same reason but I am not sure why. This token is autogenerated on each CI run so I cannot use Google Cloud KMS.
I was disappointed to find that using private npm modules with App Engine Standard is apparently not supported at all. This seems like a pretty glaring limitation given the rising popularity of GitHub packages etc for building modular (private) applications.
Interestingly, Google Cloud Functions apparently supports private npm modules, so perhaps it's just a matter of timing to gain support in App Engine.
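Until App Engine Standard supports registry auth in its build step, one workaround sketch (not an officially documented path) is to avoid the registry at build time altogether: run npm pack on the private module during CI, copy the tarball into the deploy directory, and reference it as a file dependency in package.json (the package name and path below are hypothetical):

"dependencies": {
  "@gw-cocoon/shared": "file:./vendor/gw-cocoon-shared-1.0.0.tgz"
}

Cloud Build then installs from the local tarball and never needs to authenticate against GitHub Package Registry.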
I have imported several libraries in my app. One of them works only when a separate module is created in the project folder. Finally, after connecting to Firebase, it doesn't connect completely. In the Firebase pane (in Android Studio) it shows: 1 of 2 modules are connected.
Because of this I cannot access the Database/Storage rules in Firebase. What is the correct procedure to fully connect my project?
You don't connect library modules to Firebase. You only connect application modules. Firebase integrates into application modules by adding a google-services.json file along with a Gradle plugin in the app's build.gradle.
If you are having a specific problem running your app after integration, post that question instead.
I would like to know the best approach to creating several deploys from a big code base. The idea is to divide the big API into microservices (each one on its own server/VM).
My first idea: I could simply create a folder with only the routes available to that microservice, but still using the "common" codebase...
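To illustrate, that folder-per-service idea might look like this with restify, where each deploy ships its own entry point that registers only its routes (the service and module names are hypothetical):

// server.js for one microservice: register only the routes this
// deploy owns, while the handlers live in the shared codebase.
const restify = require('restify');

const server = restify.createServer({ name: 'billing-service' }); // hypothetical

// Hypothetical shared route module; other services have their own
// entry points that register users/orders/etc. instead.
require('./routes/billing')(server);

server.listen(process.env.PORT || 8080, () => {
  console.log('%s listening', server.name);
});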
I currently end up with this, and it's a running API in production (with a staging environment on Heroku with their pipeline): [screenshot of the current project structure]
and I was thinking that I could have something like: [screenshot of the proposed per-service structure]
Can anyone point me to a good reference on ... where to start? How can I push multiple versions of the same base code to a server?
For more detail, these are the technologies I'm using:
mocha and chai for tests
sequelize for MariaDB modeling and access
restify for the server engine
When you divide the API into microservices, you have a few options:
Make completely separate repos for all of them with some code duplication
Make completely separate repos but sharing common code as Node modules
Make one repo with multiple microservices, each as its own Node module (see the workspaces sketch after this list)
Make one repo with one big codebase and build multiple modules from the parts each service needs
I'm sure you can do it in even more ways
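To make the one-repo, multiple-modules option concrete, a root package.json along these lines does it with npm workspaces (a sketch assuming npm 7+; the folder names are hypothetical):

{
  "name": "api-monorepo",
  "private": true,
  "workspaces": [
    "services/*",
    "packages/*"
  ]
}

Each folder under services/ is then its own module with its own package.json, and shared code lives under packages/ and is depended on by name.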
Having a mismatch between the number of Node modules and code repos will cause some trouble, but it may have benefits in certain cases.
Having a 1-to-1 mapping of repos and modules will be easier to work with in some cases, like the ability to add private GitHub repos directly to dependencies in package.json.
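For example, a private repo can be referenced straight from dependencies in package.json (the URL and tag are hypothetical; access works via SSH keys or tokens):

"dependencies": {
  "my-helper": "git+ssh://git@github.com/myorg/my-helper.git#v1.0.0"
}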
If you want to factor out some common functionality then you can do it in several ways:
npm supports organizations, scoped packages, private modules, and private scoped packages with restricted access.
You can host a private npm registry
You can host a module on GitHub or GitLab or any other git server
For more info see:
Node.js: How to create paid node modules?
There are some nice frameworks that can help you with splitting your code base into microservices, like Seneca:
http://senecajs.org/
Or to a certain extent with Serverless if you're using AWS Lambda, Microsoft Azure, IBM OpenWhisk or Google Cloud Platform:
https://serverless.com/
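For a flavor of how Seneca splits functionality into pattern-matched services, a minimal sketch based on its add/act API:

const seneca = require('seneca')();

// Register a handler for messages matching this pattern.
seneca.add({ role: 'math', cmd: 'sum' }, (msg, reply) => {
  reply(null, { answer: msg.a + msg.b });
});

// Send a matching message; the handler could just as well live in
// another process or on another VM, which is how the code base gets
// split into separately deployed microservices.
seneca.act({ role: 'math', cmd: 'sum', a: 1, b: 2 }, (err, result) => {
  if (err) throw err;
  console.log(result.answer); // 3
});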