Output to local file using Cloud Shell in VS Code on a Mac - Azure

I am new to VS Code on a Mac, and I am using the Cloud Shell connected to Azure, where I can run all my commands without issue. The problem is that when I use the Export-Csv cmdlet to export information to a file, I don't know how to point the output file at the Desktop of my Mac.
Is this possible or am I barking up the wrong tree?

When you are using Cloud Shell, everything is executed remotely on an Azure terminal, so output is saved to cloud storage (the Azure file share that backs Cloud Shell), not to your local machine. You can run the Export-Csv command there, then download the resulting file from the storage account.
For more details:
https://learn.microsoft.com/en-us/azure/cloud-shell/persisting-shell-storage
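For example, a minimal sketch, assuming the PowerShell flavor of Cloud Shell and using Get-AzVM purely as a placeholder query:

    # Write the CSV into the clouddrive mount, which is backed by the Azure
    # file share that Cloud Shell creates for persistent storage
    Get-AzVM | Export-Csv -Path ~/clouddrive/vms.csv -NoTypeInformation

The file then lives in the storage account's file share, so you can download it from the Azure portal (or with Cloud Shell's download command for files under your home directory) and save it to your Mac's Desktop like any other browser download.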

Related

Uploading large .bak file to Azure Blob through PowerShell

So I am trying to create a PowerShell script which will upload a large (> 4 GB) .bak file to Azure Blob Storage, but currently it hangs. The script works with the small files I have been using to test.
Originally the issue I was having was the requirement to have a Content-Length specified (I imagine due to its size), so I now calculate the file size of the .bak file (as it varies slightly each week) and pass it through as a request header.
I am a total PowerShell newbie, as well as being very new to Azure Blob. (NOTE: I am trying to do this purely in PowerShell, without relying on other tools such as AzCopy.)
Below is my script
Powershell Script
Any help would be greatly appreciated.
There are a few things to check. Since the file is big, are you sure it isn't uploading? Have you checked network activity in the Performance tab of Task Manager? AzCopy seems like a good option too that you can use from within PowerShell, but if it's not an option in your case, then why not use the native Az module for PowerShell?
I suggest using the Set-AzStorageBlobContent cmdlet to see if it helps. You can find examples in the Microsoft docs.
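For reference, a rough sketch with the Az module; the account, key, container, and file names are placeholders you'd substitute:

    # Build a storage context (assumes the Az.Storage module is installed)
    $ctx = New-AzStorageContext -StorageAccountName "mystorageacct" `
                                -StorageAccountKey "<account-key>"

    # Upload the .bak file; the cmdlet uploads large files in blocks,
    # so you don't need to calculate Content-Length yourself
    Set-AzStorageBlobContent -File "D:\Backups\weekly.bak" `
                             -Container "backups" `
                             -Blob "weekly.bak" `
                             -Context $ctx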

Create an Azure virtual machine with premade files and run them?

Is there a way in Azure to create a new virtual machine with preselected files that will always be there when establishing the new machine, as well as run them?
I have a shell script that I have to run on new Ubuntu machines that I deploy and I was wondering if there's a way to make Azure already install Ubuntu with those files and maybe even run them.
You can store the files in a storage account and quickly get the files in your VM: https://learn.microsoft.com/en-us/azure/storage/files/storage-files-quick-create-use-windows. Alternatively, you can restore a backup of a VM that has all prerequisites installed: https://learn.microsoft.com/en-us/azure/backup/backup-azure-arm-restore-vms.
If this is not what you're looking for, I think you should create an ISO of your VM with all software/files installed that you want. This is however not straightforward, see the discussion here: https://serverfault.com/a/952930
If I haven't misunderstood, you're looking for a way to run the shell script when creating a new VM. In that case I recommend cloud-init: it can run your shell script to provision the VM at creation time. You can follow the notes here to use a shell script.
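As a rough illustration with the Azure CLI (the resource names are made up, and the image alias may vary by CLI version; most Azure Ubuntu images ship with cloud-init, which runs the script passed as custom data on first boot):

    az vm create `
      --resource-group MyResourceGroup `
      --name MyUbuntuVM `
      --image Ubuntu2204 `
      --generate-ssh-keys `
      --custom-data ./setup.sh   # your existing shell script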

How to download a folder to my local PC from Google Cloud console

I have a folder I want to download from Google Cloud Console using the Linux Ubuntu command terminal. I have logged in to my SSH console and so far I can only list the contents of my files as follows.
cd /var/www/html/staging
Now I want to download all the files from that staging folder.
Sorry if I'm missing the point. Anyway, I came here seeking a way to download files from Google Cloud Console. I didn't have the ability to create an additional bucket as the author above suggested, but I accidentally noticed that there is a button for exactly what I needed.
Look for the kebab-style menu button. In the dropdown that appears, you should find a Download button.
If you mean Cloud Shell, then I typically use the Cloud Storage tool suite (gsutil).
In summary, I transfer from Cloud Shell to Cloud Storage, then from storage to my workstation.
First, have the Google Cloud SDK installed on your system.
Make a bucket to transfer into with gsutil mb gs://MySweetBucket
From within Cloud Shell, move the file to the bucket: gsutil cp /path/to/file gs://MySweetBucket/
On your local system, pull the file down: gsutil cp gs://MySweetBucket/filename .
Done!
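Putting it together for a whole folder such as the staging directory above (the bucket name is an example; -r copies directories recursively):

    # In Cloud Shell (or on the server): create a bucket and copy the folder up
    gsutil mb gs://my-transfer-bucket
    gsutil cp -r /var/www/html/staging gs://my-transfer-bucket/

    # On your workstation (requires the Google Cloud SDK): pull it down
    gsutil cp -r gs://my-transfer-bucket/staging ./staging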

Azure - Process Message Files in real time

I am working on the Azure platform and use Python 3.x for data integration (ETL) activities with Azure Data Factory v2. I have a requirement to parse message files in .txt format in real time, as and when they are downloaded from Blob Storage to a Windows virtual machine under the path D:/MessageFiles/.
I wrote a Python script to parse the message files, because they are fixed-width files; it parses all the files in the directory and generates the output. Once the files are successfully parsed, they are moved to an archive directory. This script runs well on the local disk in ad-hoc mode whenever I need it.
Now, I would like to make this script run continuously in Azure so that it watches for incoming message files in the directory D:/MessageFiles/ all the time and performs the processing as and when it sees new files in the path.
Can someone please let me know how to do this? Should I use a stream analytics application to achieve this?
Note: I don't want to use the Timer option in the Python script. Instead, I am looking for an option in Azure, using Python logic only for the file parsing.
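For context, the ad-hoc parsing flow described above might look roughly like this; it is a minimal stdlib-only sketch, and the field widths, paths, and file pattern are all assumptions:

    import shutil
    from pathlib import Path

    SOURCE = Path("D:/MessageFiles")
    ARCHIVE = SOURCE / "Archive"
    # Assumed fixed-width layout: (field name, start offset, width)
    LAYOUT = [("id", 0, 10), ("code", 10, 5), ("message", 15, 60)]

    def parse_line(line: str) -> dict:
        """Slice one fixed-width record into named fields."""
        return {name: line[start:start + width].strip()
                for name, start, width in LAYOUT}

    def process_all() -> None:
        ARCHIVE.mkdir(exist_ok=True)
        for path in SOURCE.glob("*.txt"):
            records = [parse_line(l) for l in path.read_text().splitlines()]
            # ... write `records` to the output target here ...
            shutil.move(str(path), ARCHIVE / path.name)  # archive when done

    if __name__ == "__main__":
        process_all()

Whatever Azure-side trigger you settle on would then just need to invoke process_all() when a new file lands.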

Google Cloud Shell: How to interface via command-line terminal?

I created a very simple Python hello_world script in the Google Cloud Shell. I ran it within the in-browser Google Cloud Shell and it executed and produced the correct output.
My question is, what if I want to run this same hello_world script via a Linux server I am on? I downloaded the Google Cloud SDK, ran 'gcloud init', and set up this hello_world Python project on my Linux server by following the Google Cloud documentation:
https://cloud.google.com/sdk/docs/quickstart-linux
https://cloud.google.com/sdk/downloads
I am just a little confused about how to access these files from my Linux server terminal and run them there. I simply want to run them here and get the same output as I did in the Google Cloud Shell via the in-browser console.
If I'm reading this right, you can simply download the files over to your system. When you have the shell open, in the upper-right corner there is the 'More' option, and from there you will see the ability to download files directly. Sorry if this isn't what you were looking for.
Rick
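If you would rather stay on the command line, recent versions of the Cloud SDK also expose an scp-style copy command for Cloud Shell files; the paths here are assumptions:

    # Copy the script from your Cloud Shell home down to the Linux server,
    # then run it with the local Python interpreter
    gcloud cloud-shell scp cloudshell:~/hello_world.py localhost:~/hello_world.py
    python3 ~/hello_world.py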
