Azure Functions and Node.js

Let's say I have a Node.js app I want to build with a TimerTrigger on Azure Functions.
What would be the best way to develop it? I already tried setting WEBSITE_NODE_DEFAULT_VERSION to 8.7.0, but it still gives me problems such as syntax errors on code I know that version of Node can handle.
Does anyone have any experience building Node.js apps on Azure?

In the current version of the Functions runtime you cannot choose the Node version. In the next version, which is currently in preview, you can, and 8.7.0 is one of the versions you will be able to pick. See https://learn.microsoft.com/en-us/azure/azure-functions/functions-versions for details.
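For what it's worth, a minimal JavaScript TimerTrigger looks roughly like the sketch below; the five-minute schedule and the myTimer binding name are only placeholder values, not anything specific to your app.

// function.json (next to index.js) declares the timer binding; the CRON
// expression below fires every five minutes and is only an example value:
// {
//   "bindings": [
//     { "name": "myTimer", "type": "timerTrigger", "direction": "in",
//       "schedule": "0 */5 * * * *" }
//   ]
// }

// index.js -- the handler the runtime invokes on that schedule
module.exports = function (context, myTimer) {
    if (myTimer.isPastDue) {
        context.log('Timer is running late.');
    }
    context.log('Timer trigger fired at', new Date().toISOString());
    context.done(); // signal completion to the Functions runtime
};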

Related

Is Java/JVM supported in the current Firebase Functions execution environment?

I currently have my Firebase Functions implemented in Node.js.
I installed the 'nodejs-java' module and deployed it to Firebase Functions, but the deployment failed.
I think the reason is that Firebase Functions does not support Java.
My reasoning: I tried to print the JAVA_HOME environment variable from the build environment to the console using a module called 'find-java-home', but nothing came out.
In short, I wonder whether Firebase Functions simply does not support the Java language yet.
In addition, are gcc or Python supported in the Firebase Functions build environment?
I am not fluent in English; thank you for reading, and please reply.
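For reference, the kind of probe described above might look roughly like the sketch below; the function name checkJava is made up for illustration, and it assumes the find-java-home and firebase-functions packages are installed.

const functions = require('firebase-functions');
const findJavaHome = require('find-java-home');

// Hypothetical HTTPS function that reports whether a JVM is visible
// in the Functions environment.
exports.checkJava = functions.https.onRequest((req, res) => {
  console.log('JAVA_HOME:', process.env.JAVA_HOME || '(not set)');
  findJavaHome({ allowJre: true }, (err, home) => {
    if (err || !home) {
      res.send('No Java installation found in this environment.');
    } else {
      res.send('Java found at: ' + home);
    }
  });
});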

Is it recommended to upgrade a deployed v2 Azure Function to v3, or is it better to create new resources and deploy to them from scratch?

We have a function deployed to Azure running the v2 runtime. If I go to the function in the portal to try to upgrade it to v3, I see this message: "Cannot upgrade with existing functions: Major version upgrades can introduce breaking changes to languages and bindings. When upgrading major versions of the runtime, consider creating a new function app and migrate your functions to this new app."
However, I was able to change the function in Visual Studio and deploy, using a simple test deploy of right-click publish, and the publishing process upgraded the deployed function to runtime v3.
Are there any gotchas we may run into using this approach? Is it better practice to create new Azure Function resources so we deploy to a clean v3 function app?
Thanks
Apparently no one else has this question, or the answer is obvious. In any case, here's Microsoft's recommendation from their support personnel (lightly edited for grammar):
Based on the official documentation, changing the version can be done by changing the app setting FUNCTIONS_EXTENSION_VERSION from ~2 to ~3. However, this can cause some problems if there are any dependencies unique to version 2 of the runtime; you can read more about this in the following link:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-versions#migrating-from-2x-to-3x
Based on my experience, the safest and easiest way to achieve this is to deploy the code into a function app that was created targeting version 3 of the runtime. This ensures it will only have the dependencies from version 3, with nothing extra from version 2 left over, as can happen after changing the version directly in the portal.
The idea is to avoid conflicts between assets from version 2 and missing assets from version 3.
Assuming you are using a C# function: the major change from v2 to v3 is the .NET Core version, so if your code still works after the update, your dependencies are compatible with .NET Core 3 and it most likely won't crash.
There is also an official doc about migrating from 2.x to 3.x.
As for deploying: if the runtime is different when you deploy, you will be prompted to update the runtime in the portal, so it won't affect the running function.
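For reference, the FUNCTIONS_EXTENSION_VERSION change described in the first answer can also be made from the Azure CLI; the app and resource-group names below are placeholders, and the same caveat about version-2-specific dependencies applies.

az functionapp config appsettings set \
    --name <your-function-app> \
    --resource-group <your-resource-group> \
    --settings FUNCTIONS_EXTENSION_VERSION=~3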

How to use GCP Runtime Configurator from Node.js?

I'm working on a Node.js application hosted on Google Cloud, using Google App Engine. The app has a few settings like the following:
const TASK_BATCH_SIZE = 50;
Currently, every time we need to change a setting like that one to run some tests, we need to re-deploy the app, and that happens very often. We are looking for alternatives within the Google Cloud ecosystem that allow us to configure our running services without re-deploying.
One of the things we found in the docs was Runtime Configurator, which still seems to be a beta product. For Node.js specifically, nodejs-rcloadenv is the only client library we found, but it doesn't seem to support the Watcher/Waiter concepts described in the Runtime Configurator docs, or any other way to subscribe to variable changes in a configuration resource.
Is Runtime Configurator the solution to our problem? Are there any other services inside the Google Cloud ecosystem, or any other library for Node.js that could help us with this?
If you want to update or change some settings in the service, you need to re-deploy the service.
You can't use Runtime Configurator, as it's for Compute Engine and not for App Engine.
You can create a feature request on Google's Public Issue Tracker for your issue (to update configurations without re-deploying the service).
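In the meantime, one way to at least keep values like TASK_BATCH_SIZE out of the application code is an App Engine environment variable; changing it still requires a re-deploy, as noted above, so this is only a sketch of externalizing the setting, not a way around the limitation.

// app.yaml (excerpt):
//   env_variables:
//     TASK_BATCH_SIZE: "50"

// Read the setting from the environment, falling back to the old hard-coded value.
const TASK_BATCH_SIZE = parseInt(process.env.TASK_BATCH_SIZE || '50', 10);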

Which Node.js version should I choose?

I want to create an Azure Web App which will run my React.js app.
Now I have to choose an appropriate Runtime Stack.
There are a few Node versions I can choose from.
I built my React app on Node v10.13, but this version is not in the list of runtime stack options.
Does anyone know which version I have to choose?
It supports up to v10.14; you can see the available versions in the portal when you create the web app.
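As a side note, you can also record the intended Node version in the project's package.json engines field; whether the platform honors it depends on how the app is deployed, so treat it as a hint rather than a guarantee.

{
  "engines": {
    "node": "10.14.x"
  }
}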

How to deploy custom DataCacheStoreProvider for write-through and read-behind on Azure

Is it possible to deploy custom DataCacheStoreProviders on Azure? I'm currently trying to deploy and test one locally, but I'm not sure how to go about this because the documentation doesn't cover my scenario. Any help would be appreciated.
No, it is not possible.
You can find all supported features here : http://msdn.microsoft.com/en-us/library/windowsazure/gg278350.aspx
I think your question is specific to Cloud Services, isn't it?
I don't have direct experience getting all of this working; however, in one similar discussion the outcome was that you can use DataCacheStoreProviders with Azure: after building and registering the DataCacheStoreProviders on your machine, you can access them through the Windows Azure Cache modules, i.e. using DataCacheFactory.GetCache(String cacheName), and process cache items. If you try this and run into a problem at any step, I can try to find some ways to help after you post your issue.

Resources