Run executable from local storage using Azure Web Role - azure

I'm trying to run a simple executable using an Azure Web Role.
The executable is stored in the Web Role's local storage.
The executable produces a log.txt file once it has been run.
This is the method I am using to run the executable:
public void RunExecutable(string path)
{
    Process.Start(path);
}
Where path is localStorage.RootPath + "Application.exe"
The problem I am facing is that when I open the local storage folder, the executable is there, but there is no log.txt file.
I have tested the executable: when I run it manually, it produces the log.txt file.
Can anyone see the problem?

Try setting an explicit WorkingDirectory for the process... I wonder if log.txt is being created, just not where you expect. (Or perhaps the app is trying to create log.txt but failing because of the permissions on the directory it's trying to create it in.)
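A minimal sketch of that suggestion, reusing the question's RunExecutable method; it assumes localStorage is the same LocalResource used to build path (the names simply follow the question's code):

public void RunExecutable(string path)
{
    var startInfo = new ProcessStartInfo
    {
        FileName = path,
        // Pin the working directory to local storage so relative output
        // such as log.txt is written there (assumes localStorage is the
        // LocalResource resolved elsewhere in the role).
        WorkingDirectory = localStorage.RootPath,
        UseShellExecute = false
    };

    using (var process = Process.Start(startInfo))
    {
        process?.WaitForExit();
    }
}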

If you remote desktop into the instance, can you find the file created in the E:\approot\ folder? As Steve said, setting a WorkingDirectory for the process will fix the issue.
You can use Environment.GetEnvironmentVariable("RoleRoot") to construct the path to your application root.
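For reference, a hedged sketch of that idea; the approot segment and the Application.exe name are assumptions based on the question, and the deployed layout can differ by role type (web roles may keep binaries under approot\bin):

// RoleRoot is typically a bare drive letter such as "E:", so append the
// separator explicitly rather than relying on Path.Combine.
var roleRoot = Environment.GetEnvironmentVariable("RoleRoot");
var executablePath = roleRoot + @"\approot\Application.exe";
RunExecutable(executablePath);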

Related

Execute shell script for database backup

I have a ReactJS-neo4j application, deployed on a cloud server. Currently, I create backups of my databases manually.
Now I want to automate this process. I want to automatically execute the above query every day.
Can anyone tell me how to automate the above process?
You need to change your Neo4j configuration file, found at <HOME_neo4j>/conf/neo4j.conf, as below. The location of the file may differ if you are not using a Linux server such as Debian.
apoc.export.file.enabled=true
apoc.import.file.use_neo4j_config=false
The second line lets you save the JSON file to any folder you want instead of the default "import" folder.
Then open a terminal (or SSH session) connected to your cloud server, go to the <HOME_neo4j> directory where cypher-shell is installed, and run the one-liner below.
echo "CALL apoc.export.json.all(\"/home/backups/deploymentName/backup_mydeployment.json\", { useTypes: true } )" | bin/cypher-shell -u neo4j -p <awesome_psw> --format plain
This will save the JSON file in /home/backups/deploymentName, just as you would do from the Neo4j Browser.
I will leave it up to you to 1) add the timestamp YYMMDD0000_ to the filename via a Linux command and 2) schedule the job every midnight via crontab. Good luck!

Azure Powershell - load variables from another script file

In Azure DevOps, I have an Azure PowerShell task that creates some resources using a .ps1 script in the repo. This script works fine.
Now I need to split the script and variables into different files.
I created SB-Config.ps1 for the variables and ServiceBus.ps1 for the main script, and moved all the variables into SB-Config.ps1.
Both files are in the same folder and in ServiceBus.ps1 I added:
. .\SB-Config.ps1
But Azure DevOps fails with an error:
What am I doing wrong, and how do I get the variables from the SB-Config.ps1 script when running the ServiceBus.ps1 file?
I am able to reproduce your situation on my side and get the same issue.
You can run this command to output the current working directory:
Get-Location
I notice that the PowerShell script file on your side is in a subfolder of the default working directory.
So do you set the working directory first in the PowerShell script file you are running?
Set-Location $env:System_DefaultWorkingDirectory\subfolders
In your situation, I think the issue is that the current working directory is System_DefaultWorkingDirectory; the error output means the script can't find the file you want. This issue only occurs when you select 'file path' to run the script.

Dotnet Core - Get the application's launch path

Question - Is there a better/right way to get the application's launch path?
Setup -
I have a console application that runs in a Debian Linux Docker image. I am building the application with the --runtime linux-x64 command-line switch and have all the runtime identifiers set appropriately. I was expecting the application to behave the same whether I launch it by calling dotnet MyApplication.dll or ./MyApplication, but it does not.
Culprit Code -
I have deployed files in a folder below the application directory that I reference, so I do the following to get what I consider my launch path. I have read various articles saying this is the correct way to get what I want, but whether it works depends on how I launch the application.
using var processModule = Process.GetCurrentProcess().MainModule;
var basePath = Path.GetDirectoryName(processModule?.FileName);
When launching with the command dotnet MyApplication.dll, the path from the above code is /usr/share/dotnet.
When launching with the command ./MyApplication, the path is /app.
I understand why using dotnet would be different, as it is the process that is running my code, but it was still unexpected.
Any help on what I should use, given the current environment, would be appreciated. Ultimately, I need the path the console application started from, as gathered by the application itself when it starts up.
Thanks for your help.
This code should work:
public static IConfiguration LoadConfiguration()
{
    var assemblyDirectory = Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location);
    .....
}
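If it helps, here is a hedged comparison of a few candidates. AppContext.BaseDirectory is not part of the answer above, but it is a standard .NET API that usually points at the folder the application was published to for both launch styles; treat the behaviour notes in the comments as assumptions to verify in your environment:

using System;
using System.Diagnostics;
using System.IO;
using System.Reflection;

class LaunchPathDemo
{
    static void Main()
    {
        // Directory of the executing assembly (the approach in the answer above).
        // Can be empty for single-file published apps.
        var assemblyDir = Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location);

        // Base directory the runtime uses to resolve assemblies; usually the publish
        // folder whether launched as `dotnet MyApplication.dll` or `./MyApplication`.
        var baseDir = AppContext.BaseDirectory;

        // Directory of the host process executable; this is the dotnet host
        // location (e.g. /usr/share/dotnet) when launched via `dotnet MyApplication.dll`.
        using var mainModule = Process.GetCurrentProcess().MainModule;
        var processDir = Path.GetDirectoryName(mainModule?.FileName);

        Console.WriteLine($"Assembly dir: {assemblyDir}");
        Console.WriteLine($"Base dir:     {baseDir}");
        Console.WriteLine($"Process dir:  {processDir}");
    }
}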

Is that the 'path to war' which I am giving wrong? If yes, how do I do a rollback?

I have been trying to use the command to roll back the last deployment of the website, which was interrupted due to a network failure.
The generic command that I am using while inside the bin directory of the server's SDK (on Linux) is:
./appcfg.sh rollback /path_to_the_war_directory_that_has_appengine-web.xml
Is this the way to do a rollback? If not, please tell me the correct method.
(I was asked to make a directory named war in the project directory and place the WEB-INF folder in it, with appengine-web.xml inside that. It may be wrong.)
I am fully convinced that I am making a mistake when giving the path to my app.
Here is a screenshot of where my .war file is located:
Now the command that I am using (while inside the bin directory of the server's SDK) is:
./appcfg.sh rollback /home/non-admin/NetbeansProjects/'Personal Site'/web/war
The following is a representation of the path to the war directory:
Where am I wrong? How should I run this command so that I can deploy my project once again?
On running the above command I get this message:
Unable to find the webapp directory /home/non-admin/NetbeansProjects/Personal Site/web/war
usage: AppCfg [options] <action> [<app-dir>] [<argument>]
NOTE: I have duplicated the WEB-INF folder. There is still a folder named WEB-INF inside the web directory that contains all the other XML files.
The error tells you that the folder /home/non-admin/NetbeansProjects/Personal Site/web/war does not exist. If you look carefully, the name of the folder is NetBeansProjects (the filesystem on Linux is case-sensitive).
So, you should run instead the command:
./appcfg.sh rollback /home/non-admin/NetBeansProjects/'Personal Site'/web/war
and just to make sure that the directory exists, first run
ls /home/non-admin/NetBeansProjects/'Personal Site'/web/war

Runtime.exec() in Hadoop on Azure environment

This question is related to the Hadoop on Azure environment.
I am trying to use Runtime.exec() to execute a batch script in the reduce function. I could not get this running in the Hadoop on Azure environment, while it runs fine in Hadoop on Linux. I tested the Runtime.exec() code snippet in my desktop (Windows 7) environment and it runs fine there. I have made sure that I consume the output and error streams of the sub-process after Runtime.exec().
The batch script contains the following (a single command):
c:\hdfs\mapred\local\taskTracker\nabeel\jobcache\job_201207121317_0024\attempt_201207121317_0024_r_000001_0\work\tool.exe
-f c:\hdfs\mapred\local\taskTracker\nabeel\jobcache\job_201207121317_0024\work\11_task_201207121317_0024_r_000001.out
-i c:\hdfs\mapred\local\taskTracker\nabeel\jobcache\job_201207121317_0024\attempt_201207121317_0024_r_000001_0\work\input.txt
I distribute the tool.exe and input.txt files using the distributed cache, and it creates symlinks in the working directory; tool.exe and input.txt point to the actual files in the jobcache directory.
2012-07-16 04:31:51,613 INFO org.apache.hadoop.mapred.TaskRunner: Creating symlink: /hdfs/mapred/local/taskTracker/distcache/-978619214658189372_-1497645545_209290723/10.73.50.78tool.exe <- \hdfs\mapred\local\taskTracker\nabeel\jobcache\job_201207121317_0024\attempt_201207121317_0024_r_000001_0\work\tool.exe
2012-07-16 04:31:51,644 INFO org.apache.hadoop.mapred.TaskRunner: Creating symlink: /hdfs/mapred/local/taskTracker/distcache/-4944695173898834237_1545037473_2085004342/10.73.50.78input.txt <- \hdfs\mapred\local\taskTracker\nabeel\jobcache\job_201207121317_0024\attempt_201207121317_0024_r_000001_0\work\input.txt
The reducer gives the below error when it runs.
Command Execution Error: Cannot run program
"cmd /q /c c:\hdfs\mapred\local\taskTracker\nabeel\jobcache\job_201207121317_0024\work\11_task_201207121317_0024_r_0000011513543720767963399.bat":
CreateProcess error=2, The system cannot find the file specified
In another case, I tried running the same command but without using absolute paths. The output stream from the sub-process is shown below:
c:\hdfs\mapred\local\taskTracker\nabeel\jobcache\job_201207121317_0022\attempt_201207121317_0022_r_000000_0\work>tool.exe -f /hdfs/mapred/local/taskTracker/nabeel/jobcache/job_201207121317_0022/work/1_task_201207121317_0022_r_000000.out
-i input.txt
I do not know how the job's working directory paths and the distributed cache work in the Hadoop on Azure environment. Could you please let me know if I am missing something here, or if there is something I need to take care of when using Runtime.exec() in the Hadoop on Azure environment?
Thanks,
I am not familiar with Hadoop, but the error message seems obvious. It would be better if you check whether the file exists:
c:\hdfs\mapred\local\taskTracker\nabeel\jobcache\job_201207121317_0024\work\11_task_201207121317_0024_r_0000011513543720767963399.bat
Best Regards,
Ming Xu
