I have two questions:
I need to read some JSON data from a JSON file that I copy as part of building the project. I published the code from Visual Studio 2015, but when it runs on Azure it cannot access the file in the folder from which the DLL is executed. How can I read data from that file?
When I push the Azure Function from Visual Studio to Azure, the appsettings.json values are not published to Azure.
Please refrain from asking 2 questions in one.
The answers:
You can get the path to the current function directory from the ExecutionContext; see Retrieving information about the currently running function.
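For example, a minimal sketch assuming a newer-runtime C# class-library function and a hypothetical data file named settings.json that is copied to the output folder at build time ("someKey" is likewise just an illustrative property):

    using System.IO;
    using Microsoft.AspNetCore.Http;
    using Microsoft.AspNetCore.Mvc;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Azure.WebJobs.Extensions.Http;
    using Newtonsoft.Json.Linq;

    public static class ReadJsonFunction
    {
        [FunctionName("ReadJson")]
        public static IActionResult Run(
            [HttpTrigger(AuthorizationLevel.Function, "get")] HttpRequest req,
            ExecutionContext context) // injected by the runtime
        {
            // FunctionAppDirectory is the deployed root (wwwroot);
            // FunctionDirectory is the folder of this particular function.
            var path = Path.Combine(context.FunctionAppDirectory, "settings.json");
            var json = JObject.Parse(File.ReadAllText(path));
            return new OkObjectResult(json["someKey"]?.ToString());
        }
    }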
The appsettings.json file is not used in an actual Function App deployment; you should define your settings as application settings, which the runtime exposes as environment variables. See How to manage a function app in the Azure portal.
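For example, once a value is defined under the Function App's Application settings in the portal, it surfaces as an environment variable; "MySetting" below is a hypothetical setting name:

    using System;

    public static class SettingsHelper
    {
        // Application settings configured in the portal are exposed to the
        // function process as environment variables at runtime.
        public static string GetMySetting() =>
            Environment.GetEnvironmentVariable("MySetting", EnvironmentVariableTarget.Process);
    }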
Related
I'm new to Azure Functions, and I found that after the function is published to the portal, it is not visible in the function list. I have attached a screenshot of the sample code and of the empty function list in Azure. Please help!
Update: checking the Kudu UI, I found only host.json in /wwwroot.
Update:
From your description, it seems the deployment of your Azure Function was interrupted or failed. A host.json is present in wwwroot by default. If you deploy from local and only see host.json, it means the Function App itself was created successfully but the files were not uploaded to the physical path wwwroot. (Azure Functions is based on the Azure App Service sandbox, so if your deployment succeeds, all of the related files and folders are uploaded to wwwroot, the physical path, just like an App Service.) You can try other ways to upload these files to the physical path, for example FTP deploy or zip deploy. This is the structure of a C# class-library Azure Function:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-dotnet-class-library#functions-class-library-project
This is a screenshot of a successfully deployed function:
(The bin directory contains the compiled files, including the DLL. The Function1 folder contains a function.json. These files are generated by the build. For more information, please refer to the doc above.)
You can build your function app locally first and then upload the compiled files to Azure.
These are the tutorials on how to use FTP and zip deploy to upload files (choosing either one is fine; by the way, when you publish a function app from VS 2019, it is essentially a zip deployment). A zip-deploy sketch follows the links:
FTP deploy: https://learn.microsoft.com/en-us/azure/app-service/deploy-ftp
Zip deploy: https://learn.microsoft.com/en-us/azure/azure-functions/deployment-zip-push
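As a rough C# illustration of what zip deploy does under the hood (the linked doc covers the supported CLI and PowerShell commands), the built output is zipped and POSTed to the Kudu zipdeploy endpoint. The app name, deployment credentials, and zip path below are placeholders; real credentials come from the app's publish profile:

    using System;
    using System.IO;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Text;
    using System.Threading.Tasks;

    class ZipDeploy
    {
        static async Task Main()
        {
            var app = "yourfunctionappname";           // placeholder app name
            var user = "$yourfunctionappname";         // deployment user from the publish profile
            var password = "deployment-password";      // placeholder
            var zipPath = @"C:\build\functionapp.zip"; // zip of the built output (contents of wwwroot)

            using var client = new HttpClient();
            var token = Convert.ToBase64String(Encoding.ASCII.GetBytes($"{user}:{password}"));
            client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", token);

            using var content = new StreamContent(File.OpenRead(zipPath));
            content.Headers.ContentType = new MediaTypeHeaderValue("application/zip");

            // Kudu unpacks the zip into the wwwroot physical path.
            var response = await client.PostAsync(
                $"https://{app}.scm.azurewebsites.net/api/zipdeploy", content);
            Console.WriteLine(response.StatusCode);
        }
    }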
Original Answer:
This is an error in the portal UI.
It seems the new version of the UI is not finished yet, but your function should have been deployed to Azure.
If you go to Kudu, you will find that the files have been uploaded to wwwroot.
You should follow these steps:
Then copy the host key from this place:
(Either of them can be used.) Append one of them to the end of your request URL.
The request URL of your function should look like this:
yourfunctionappname.azurewebsites.net/api/yourtriggername?code=yourkey
Then you will get a response.
You can try again later; the problem may already be fixed. (Whether or not you can see it in the UI, you can still invoke the trigger, but you need to supply a key to pass verification. The new version of the Functions UI still has a lot of updates coming and even lacks some basic features; it is being unified with the App Service experience and should become stable after a while.)
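For completeness, a minimal sketch of calling the HTTP trigger with such a key, using the same placeholder app name, trigger name, and key as the URL pattern above:

    using System;
    using System.Net.Http;
    using System.Threading.Tasks;

    class CallFunction
    {
        static async Task Main()
        {
            // The ?code= query parameter carries the host or function key.
            var url = "https://yourfunctionappname.azurewebsites.net/api/yourtriggername?code=yourkey";
            using var client = new HttpClient();
            Console.WriteLine(await client.GetStringAsync(url));
        }
    }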
I have a website hosted as an Azure Web App. It's an ASP.NET website in a VS solution. One folder of that website holds my product's documentation, all of it static resources (HTML and images). These static resources live in a folder in another VS solution (the actual product's solution). Both solutions are TFS-based in VSO.
As of now, I have a WebJob running in the context of the website that basically does a "tfs get" on the documentation folder and places its contents into the documentation folder of the website. This works; however, the VMs the website runs on change quite frequently, and the mechanism for creating a workspace is bound to the machine, not to the disk drive. Thus, I cannot get only the changes; I must always get all the content, which right now takes about 20 minutes and creates unnecessary load on the website. (This is why I only run this WebJob once a week.)
Now I'm looking for a better way to do this. I would like to get only the files that have changed, making this a lot faster and less CPU/drive costly.
I did not find a way to create a workspace on the web server that doesn't vanish each time the web server's VM changes. (If it were possible to somehow attach the workspace to the drive instead of to the machine name, that would solve the problem.)
I was also looking at the continuous build definition I have running for the product's solution. As part of that, I think it is possible to create a deployment where the documentation folder is copied to the App Service's documentation folder. This way I could get rid of the "special" WebJob, but I'd still be copying all the docs files each time. (Also, the build agent for that runs on premises, so I'd have to copy those files from on premises up to the cloud when they're actually already there inside VSO.) So basically, I don't think this option is of much use in my case.
Obviously, if I moved the static docs resources from the product's solution to the website's solution, I could simply use the automatic deployment that is available for website projects from VSO to an Azure Web App. Unfortunately, for various other reasons (one being that the static resources are partially generated from the .cs sources in the product's solution), I simply cannot move the docs folder from the product's solution to the website's solution.
So does anybody have a suggestion for a method by which I could update the documentation folder in the web app based on changes in the corresponding VSO folder?
You can upload the updated files to the Azure App Service by using the Kudu API.
Simple steps:
Create a Continuous Integration build
Check the Allow Scripts to Access OAuth Token option on the Options tab
Add a PowerShell step/task to check the changes with the REST API (refer to Calling VSTS APIs with PowerShell)
Add an Azure PowerShell step/task to upload the files to the App Service by using the Kudu API (refer to Remove files and folders on Azure before a new deploy from VSTS); a sketch of the underlying Kudu call follows this list
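The Kudu VFS API used in the last step is a plain REST endpoint, so the same request a PowerShell task would send can be sketched in C# as below. The app name, deployment credentials, and file paths are placeholders; real credentials come from the app's publish profile:

    using System;
    using System.IO;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Text;
    using System.Threading.Tasks;

    class KuduVfsUpload
    {
        static async Task Main()
        {
            var app = "yourwebappname";             // placeholder app name
            var user = "$yourwebappname";           // deployment user
            var password = "deployment-password";   // placeholder
            var localFile = @"C:\docs\index.html";  // changed file to push
            var remotePath = "site/wwwroot/documentation/index.html";

            using var client = new HttpClient();
            var token = Convert.ToBase64String(Encoding.ASCII.GetBytes($"{user}:{password}"));
            client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", token);
            // "If-Match: *" lets the VFS API overwrite an existing file.
            client.DefaultRequestHeaders.Add("If-Match", "*");

            using var content = new StreamContent(File.OpenRead(localFile));
            // PUT uploads a file; DELETE on the same URL removes one.
            var response = await client.PutAsync(
                $"https://{app}.scm.azurewebsites.net/api/vfs/{remotePath}", content);
            Console.WriteLine(response.StatusCode);
        }
    }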
Here is what I ended up doing (a sketch of the queueing function follows these steps):
Created a service hook in VSO that is wired to "Web Hooks". The hook is called upon each check-in and filtered based on the directory I want. (All of this can be done using existing functionality in VSO.)
The hook calls an Azure Function App (which is easy to do, because Function Apps have an HttpTrigger mechanism that fits in nicely here).
The hook passes the ID of the check-in's changeset to the Function App.
The Function App puts that ID into an Azure Storage queue.
This triggers an Azure WebJob that listens on that queue. The WebJob uses the changeset ID to get the changes from VSO and acts on the change type of each change (e.g., downloads or deletes a file).
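A minimal sketch of the middle steps, assuming an HTTP-triggered C# function that receives the changeset ID from the service hook as a query parameter and drops it onto a storage queue (the queue name and parameter name are assumptions):

    using Microsoft.AspNetCore.Http;
    using Microsoft.AspNetCore.Mvc;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Azure.WebJobs.Extensions.Http;

    public static class EnqueueChangeset
    {
        [FunctionName("EnqueueChangeset")]
        public static IActionResult Run(
            [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
            // Output binding: whatever is assigned here is written to the queue.
            [Queue("doc-changesets")] out string queueMessage)
        {
            // Assumes the service hook passes the changeset ID as a query parameter.
            queueMessage = req.Query["changesetId"];
            return new OkResult();
        }
    }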
How does a Function App detect whether a folder contains Functions?
In anticipation of the upcoming Azure Functions Tools for VS, I followed the article here: https://blogs.msdn.microsoft.com/appserviceteam/2017/03/16/publishing-a-net-class-library-as-a-function-app/
I have successfully created the Azure Function as a web app project in Visual Studio, deployed the Function App using an ARM template, and then deployed the Functions using a release step in Visual Studio Team Services.
Here are the contents of my web app project; 'CustomerERPChange' is the Function that I want to show up in Azure.
Web App Project
What I'm expecting to see is the Function appearing in the Function App, but that doesn't seem to be the case.
Looking at Kudu, I can confirm that the content of my project has been successfully deployed to the ../wwwroot directory; renaming 'run.cs' back to 'run.csx' also didn't help.
Any ideas or suggestions would be appreciated, thanks!
There are two ways to create a function within a Function App:
1) Using the UI: create a function with a template of your choice.
2) From the Kudu portal:
go to your_app_name.scm.azurewebsites.net,
create a folder in home > site > wwwroot > (function_name),
add a run.csx or run.ps1 file, plus a function.json file, to the newly created folder.
About these files:
run.csx or run.ps1 is the file that is automatically invoked when the function executes.
function.json is the file that defines your function's trigger and bindings (whether it is a timer, an HttpTrigger, or something else); see the minimal example below.
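For illustration only (the names and bindings below are assumed, not taken from the question), a minimal HTTP-triggered pair could look like this. run.csx:

    // run.csx - invoked by the runtime when the trigger fires.
    using System.Net;

    public static HttpResponseMessage Run(HttpRequestMessage req, TraceWriter log)
    {
        log.Info("C# HTTP trigger function processed a request.");
        return req.CreateResponse(HttpStatusCode.OK, "Hello from run.csx");
    }

and the accompanying function.json, which is what tells the runtime that the folder contains a function and how it is triggered:

    {
      "bindings": [
        { "type": "httpTrigger", "direction": "in", "name": "req", "authLevel": "function", "methods": [ "get", "post" ] },
        { "type": "http", "direction": "out", "name": "res" }
      ]
    }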
I've got a website (basic HTML) and I want to deploy it using Azure Resource Manager. It doesn't have a Visual Studio .sln file and I don't want to create one.
I've found this tutorial for an Angular website that does something along the lines of what I am trying to do: http://www.azurefromthetrenches.com/how-to-publish-an-angularjs-website-with-azure-resource-manager-templates/
The problem I want to solve is that I have the Microsoft Azure SDK for .NET (VS 2015) 2.8.2, which allows me to add resources to my resource group project. The tutorial writes everything itself, rather than using Visual Studio to create the resources.
Does anyone know how to do this?
I've got my application to build the website using a website.publishproj (found in the tutorial), so I have my zip file. What I am now lacking is how to upload the zip file to Azure using the already existing PowerShell that comes with the 2.8.2 SDK.
So far I've added the code below under the Import-Module statement:
C:\"Program Files (x86)"\MSBuild\14.0\bin\msbuild.exe 'C:\Source\website.publishproj' /T:Package /P:PackageLocation=".\dist" /P:_PackageTempDir="packagetmp"
$websitePackage = "C:\Source\dist\website.zip"
If your ultimate goal here is simply the ability to deploy changes to the Azure Web App, one solution is to set up automated deployment from a local Git repository into the Azure Web App. First, you'd create the resource group in the Azure portal and then configure Continuous Deployment. You can then use something like Visual Studio Code to trigger the deployment from any code changes.
Good run through here: https://azure.microsoft.com/en-us/documentation/articles/web-sites-create-web-app-using-vscode/
Assuming your website is under source control, e.g. on GitHub, you can use an ARM template that points at the GitHub repo, so that when the template creates the new website it automatically pulls the content into the newly created site. Great walkthrough here: https://azure.microsoft.com/en-us/documentation/articles/app-service-web-arm-from-github-provision/ or just the code can be found here: https://github.com/Azure/azure-quickstart-templates/tree/master/201-web-app-github-deploy.
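The key piece of that quickstart template is a sourcecontrols child resource on the web app, roughly along the following lines (the parameter names and apiVersion here are illustrative; check the linked template for the exact values):

    {
      "type": "Microsoft.Web/sites/sourcecontrols",
      "apiVersion": "2015-08-01",
      "name": "[concat(parameters('siteName'), '/web')]",
      "dependsOn": [
        "[resourceId('Microsoft.Web/sites', parameters('siteName'))]"
      ],
      "properties": {
        "RepoUrl": "[parameters('repoURL')]",
        "branch": "[parameters('branch')]",
        "IsManualIntegration": true
      }
    }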
You can also use the Azure CLI from the non-Microsoft world to deploy, e.g.:
azure group deployment create...
If this has helped, please mark as answered.
I'm using the AzureContinuousDeployment.11.xaml Visual Studio 2013 Template for CI builds from visualstudio.com to an Azure website and it's working great.
However, I need to keep additional files on the server (the app creates files). If I were using the "Web deploy" method, I'd simply disable the "Remove additional files at destination" property, but I don't see an option for that when using the Azure deployment template.
I should be able to add /p:SkipExtraFilesOnServer=True to the MSBuild arguments in the build definition, but it isn't working. Files are still being deleted from the web server when I deploy.
I've also tried creating a publish profile and adding it to the template. It hasn't worked either.
I don't think there is a way to keep app-created files while using the Visual Studio deployment. I suggest you create an Azure Storage account and store those files in storage (blob storage would be good enough). It seems that you are using ASP.NET; see more detail at Microsoft Azure Storage Client Library for .NET.
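A minimal sketch of that approach with the Azure Storage client library of that era (the WindowsAzure.Storage package); the connection-string setting name, container name, and helper method are placeholders made up for illustration:

    using System.IO;
    using System.Configuration;
    using System.Threading.Tasks;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Blob;

    class AppFileStore
    {
        // Writes an app-created file to blob storage instead of the site's file system,
        // so a deployment that removes extra files on the server cannot delete it.
        public static async Task SaveAsync(string localPath)
        {
            var account = CloudStorageAccount.Parse(
                ConfigurationManager.AppSettings["StorageConnectionString"]); // placeholder setting
            var client = account.CreateCloudBlobClient();
            var container = client.GetContainerReference("app-files");        // placeholder container
            await container.CreateIfNotExistsAsync();

            var blob = container.GetBlockBlobReference(Path.GetFileName(localPath));
            using (var stream = File.OpenRead(localPath))
            {
                await blob.UploadFromStreamAsync(stream);
            }
        }
    }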