I have a JAR that fetches data from a database and generates a CSV file. I have deployed this JAR to an Azure App Service, where it generates the CSV file every minute. To avoid creating a CSV every minute, we keep the App Service stopped; whenever we need the CSV, we start the App Service and stop it again once the file is generated.
I'm looking for an alternative to generating the CSV file every minute. Is there any functionality in Azure App Service that would let us run the deployed JAR at a specific time only?
OR is there any other Azure resource where we can deploy the JAR and execute it on a schedule? Can we use a Logic App or some other service to achieve this scenario?
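One option, assuming the App Service runs on Windows (where WebJobs are available), is to deploy the JAR as a triggered WebJob with a CRON schedule in a `settings.job` file placed next to it. A minimal sketch — the schedule below is an example that fires once a day at 08:00 UTC, using the six-field (seconds-first) CRON format WebJobs expect:

```json
{
  "schedule": "0 0 8 * * *"
}
```

The WebJob itself can be a small `run.cmd` containing `java -jar yourapp.jar`. If WebJobs aren't an option, a timer-triggered Azure Function or a Logic App with a recurrence trigger calling an HTTP endpoint are alternatives.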
Related
I have a situation where I have to process a file in Azure Blob Storage (which will be placed there by some process on a particular day) using Spring Batch. The requirement is that I have to process that file only on that particular business day, and not on any other day. If I create a scheduler/cron to call the batch on that day, it may not work, because in the cron I have to specify a particular trigger time; if the file has not arrived in Blob Storage by that time, the cron will not find the file to process.
Is there any file-watcher utility in Azure that I can use to detect when the file arrives, so that I can then call the batch?
Please suggest. Thanks in advance.
As Tiny indicated, you can create an Azure Function and use a Blob trigger to monitor newly arrived files. Once the function is triggered, you can call your Spring Batch job (via some URL exposed by your app) from your Azure Function code to handle the file.
If you don't want to write code, you can use an Azure Logic App with a Blob trigger to monitor newly arrived files and call your Spring Batch job.
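If the Spring Batch job is exposed over HTTP by your app, the function (or Logic App HTTP action) mostly just needs to turn the blob name into a launch request. A minimal sketch of that mapping — the endpoint and the `file` parameter are hypothetical placeholders, not a real API:

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class BatchLauncher {

    // Hypothetical endpoint exposed by your Spring Boot app; adjust to your controller.
    private static final String JOB_ENDPOINT = "https://myapp.azurewebsites.net/jobs/processFile";

    /**
     * Builds the URL the Azure Function (or Logic App HTTP action) would call
     * once the blob trigger fires for a newly arrived file.
     */
    public static String launchUrlFor(String blobName) {
        String encoded = URLEncoder.encode(blobName, StandardCharsets.UTF_8);
        return JOB_ENDPOINT + "?file=" + encoded;
    }

    public static void main(String[] args) {
        // Prints the launch URL with the blob path percent-encoded.
        System.out.println(launchUrlFor("incoming/2023-01-15/data.zip"));
    }
}
```

In the Azure Function body you would then issue an HTTP GET/POST to this URL; the blob name arrives as a trigger binding parameter.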
I know I can delete old files manually, but I need to automate the process. A cron script would do the job, but as far as I know, my changes will be lost when the App Service is reprovisioned.
The App Service runs Ubuntu.
Yes, by default, logs are not automatically deleted (with the exception of Application Logging (Filesystem)). To delete logs automatically, set the Retention Period (Days) field (that's one way to do it).
You could automate the deletion by leveraging the Kudu Virtual File System (VFS) REST API. For a sample script, check out this discussion thread for a similar approach.
WebJobs are not yet supported for App Service on Linux, but you could use Azure Functions to run scripts, if that fits your requirement.
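Whichever mechanism runs the cleanup (an Azure Function, or a script against the Kudu VFS API), the core logic is just an age check against a retention period. A small sketch, assuming you have already fetched each file's last-modified timestamp:

```java
import java.time.Duration;
import java.time.Instant;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class LogRetention {

    /** Returns the names of log files whose last-modified time is older than the retention period. */
    public static List<String> filesToDelete(Map<String, Instant> lastModifiedByName,
                                             Duration retention,
                                             Instant now) {
        Instant cutoff = now.minus(retention);
        return lastModifiedByName.entrySet().stream()
                .filter(e -> e.getValue().isBefore(cutoff))
                .map(Map.Entry::getKey)
                .sorted()
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        Instant now = Instant.parse("2023-06-30T00:00:00Z");
        Map<String, Instant> files = Map.of(
                "app-2023-01.log", Instant.parse("2023-01-31T00:00:00Z"),
                "app-2023-06.log", Instant.parse("2023-06-29T00:00:00Z"));
        // Keep 30 days of logs; only the January file is past the cutoff.
        System.out.println(filesToDelete(files, Duration.ofDays(30), now));
    }
}
```

The actual deletion would then be one HTTP DELETE per returned path against the Kudu VFS endpoint for your app.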
I have several Spring Batch and Java batch jobs that I want to deploy to Azure.
These jobs process zip files (containing several text files, XML, etc.).
Today they run on a Tomcat server (24/7), just for one run per month.
I want to deploy them to Azure to reduce costs and launch them as soon as a file arrives in Blob Storage.
I saw that there are two options:
Azure Functions -> but I need to archive the input and output files, and processing an input zip can take quite a long time
Azure WebJobs -> I have the impression that we have to keep the machine running
Do you have other solutions?
I would recommend repackaging your jobs as Spring Boot apps and running them on demand when necessary. Having a Tomcat instance running 24/7 in a cloud environment to trigger a batch job once a month is not ideal, in terms of both resource consumption and the dollars on your bill :-)
Here are two quick guides on how to package a Spring Batch job as a Boot app and how to deploy a Boot app to MS Azure:
Creating a batch service
Deploying a Spring Boot app to Azure
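As a rough illustration of the repackaging (property names are from Spring Boot 2.x and the job name is hypothetical — verify both against your Boot version), the repackaged job can be a non-web Boot app that runs its job on startup and then exits:

```properties
# Run as a plain command-line app, no embedded web server
spring.main.web-application-type=none
# Only launch the job(s) you name, rather than every job in the context
spring.batch.job.names=processZipJob
```

With that, `java -jar job.jar` runs the batch once and terminates, which is exactly the shape you want for on-demand execution.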
Hope this helps.
I am using the default logging mechanism that Azure WebJobs provides; the logger type is 'TextWriter'. I have three functions in the same WebJob with extensive logging, so a number of logs are generated every minute. With the default WebJob settings, all the logs go into blobs in the storage account. I do not want my storage account to just keep growing with months and months of old logs.
I need a way of cleaning the logs on a periodic basis. Is there any setting/configuration that can make my logs get cleaned periodically? Or should I write code to monitor the blob container 'azure-webjobs-hosts' and the files inside 'output-logs'? Is that the only place where the WebJob stores my application's logs by default?
I tried searching the web but couldn't find any related posts. Any pointers would be of great help.
Based on my experience, we can achieve this by encoding the period into the Azure Storage container name: use the week/month/day as part of the name, then use a timer-triggered function to delete whole containers. For example, if we need to delete data weekly, we write each week's data to its own container and delete that container the following week via the timer trigger.
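The container-per-period idea can be sketched as follows; the `logs-` prefix and week-based naming format are assumptions for illustration, not an Azure convention:

```java
import java.time.LocalDate;
import java.time.temporal.IsoFields;

public class LogContainerNaming {

    /** e.g. 2023-03-08 -> "logs-2023-w10" (ISO week-based year and week number). */
    public static String weeklyContainerName(LocalDate date) {
        int weekYear = date.get(IsoFields.WEEK_BASED_YEAR);
        int week = date.get(IsoFields.WEEK_OF_WEEK_BASED_YEAR);
        return String.format("logs-%d-w%02d", weekYear, week);
    }

    public static void main(String[] args) {
        LocalDate today = LocalDate.of(2023, 3, 8);
        // Current week's container: logs are written here.
        System.out.println(weeklyContainerName(today));               // logs-2023-w10
        // The timer-triggered function deletes the previous week's container wholesale.
        System.out.println(weeklyContainerName(today.minusWeeks(1))); // logs-2023-w09
    }
}
```

Deleting one container per run is a single storage operation, which is much cheaper than enumerating and deleting thousands of individual blobs.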
I have an app (.exe) that picks up a file and imports it into a database. I have to move this setup into Azure. I am familiar with Azure SQL and Azure File Storage. What I am not familiar with is how I execute an app within Azure.
My app reads rows out of my Azure database to determine where the file is (in Azure File Storage) and then dumps the data into a specified table. I'm unsure whether this scenario is appropriate for Azure Scheduler or whether I need an App Service to set up a WebJob.
Is there any possibility I can put my app directly in Azure File Storage and point a task at that location to execute it? (Then it might be easier to resolve the locations of the files to be imported.)
Thanks.
This is a good scenario for Azure Functions, if you want to just run some code on a schedule in Azure.
Functions are like WebJobs (in fact they share the same SDK), so you can trigger on a schedule or from a storage queue, etc., but you don't need an App Service to run your code in. There are some great intro videos here: Azure Functions Documentation, and here is a link to a comparison of the hosting options between WebJobs, Functions, Flow and Logic Apps.
You can edit the function directly in the portal (paste/type your C# or Node.js code straight in), or use source control to manage it.
If you really want to keep your app as an .exe and run it like that, then you will need to use Azure Scheduler instead, which is a basic job runner.
Decisions, decisions...!
Looking at https://azure.microsoft.com/en-gb/documentation/articles/scheduler-intro/ it seems that the only supported actions are:
HTTP, HTTPS,
a storage queue,
a service bus queue,
a service bus topic
so running a self-contained .exe or script doesn't look to be possible.
Do you agree?