I have an xlsx file in Azure Blob Storage. I want to access this xlsx file, make some edits, and save it back. I have this working locally, but I don't know how to do it when the file is in blob storage. The following code is what I use locally. Note: I don't want to first save the file to my local drive and then edit it; I want to edit it and save it back directly via PowerShell.
$excel = New-Object -ComObject Excel.Application   # create the Excel COM object
$workbook = $excel.Workbooks.Open("C:\Users\jubaiaral\OneDrive - BMW\Documents\Book1.xlsx")
$sheet = $workbook.Worksheets | Where-Object { $_.Name -eq 'Sheet1' }
# ..................my edits come here...................
$workbook.Save()
$excel.Quit()
Thanks in advance!
I want to directly edit it and save it via PowerShell.
I don't think you can directly update the file and save it using PowerShell. You can:
Copy the file
Do the work
Copy it back to the storage (see the sketch below)
If you want to do this in place, you can use Spark (Azure Synapse / Azure Databricks).
This may be helpful: https://community.databricks.com/s/feed/0D53f00001HKHeOCAX
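For the copy / edit / copy-back route, a minimal sketch could look like the following, assuming the Az.Storage module and a storage account key; the account, container, blob, and temp-file names are placeholders:
$ctx  = New-AzStorageContext -StorageAccountName 'mystorageaccount' -StorageAccountKey $key
$temp = Join-Path $env:TEMP 'Book1.xlsx'

# 1. Copy the blob down to a temporary local file
Get-AzStorageBlobContent -Container 'mycontainer' -Blob 'Book1.xlsx' -Destination $temp -Context $ctx -Force | Out-Null

# 2. Do the work locally with the Excel COM object, exactly as before
$excel    = New-Object -ComObject Excel.Application
$workbook = $excel.Workbooks.Open($temp)
$sheet    = $workbook.Worksheets | Where-Object { $_.Name -eq 'Sheet1' }
# ...edits...
$workbook.Save()
$excel.Quit()

# 3. Copy the edited file back, overwriting the original blob
Set-AzStorageBlobContent -File $temp -Container 'mycontainer' -Blob 'Book1.xlsx' -Context $ctx -Force | Out-Null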
I am accessing a website that allows me to download a CSV file. I would like to store the CSV file directly in the blob container. I know that one way is to download the file locally and then upload it, but I would like to skip the step of downloading the file locally. Is there a way I could achieve this?
I tried the following:
block_blob_service.create_blob_from_path('containername','blobname','https://*****.blob.core.windows.net/containername/FlightStats',content_settings=ContentSettings(content_type='application/CSV'))
but I keep getting errors stating that the path is not found.
Any help is appreciated. Thanks!
The file_path argument of create_blob_from_path is the path to a local file, something like "C:\xxx\xxx". The path you passed ('https://*****.blob.core.windows.net/containername/FlightStats') is a blob URL.
You could instead download your file to a byte array or stream, then use the create_blob_from_bytes or create_blob_from_stream method.
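For example, a minimal sketch with the legacy azure-storage-blob (2.x) SDK that create_blob_from_path comes from; the source URL, account credentials, and container/blob names below are placeholders:
import requests
from azure.storage.blob import BlockBlobService, ContentSettings

# Placeholder credentials and names
block_blob_service = BlockBlobService(account_name='myaccount', account_key='mykey')

# Download the CSV into memory instead of saving it to a local file
response = requests.get('https://example.com/FlightStats.csv')
response.raise_for_status()

# Upload the raw bytes straight into the container
block_blob_service.create_blob_from_bytes(
    'containername',
    'FlightStats',
    response.content,
    content_settings=ContentSettings(content_type='text/csv'))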
The other answer uses the so-called legacy Azure SDK for Python.
I recommend that, if it's a fresh implementation, you use a Gen2 storage account (instead of Gen1 or plain blob storage).
For a Gen2 storage account, see the example here:
from azure.storage.filedatalake import DataLakeFileClient

data = b"abc"
file = DataLakeFileClient.from_connection_string("my_connection_string",
                                                 file_system_name="myfilesystem",
                                                 file_path="myfile")
file.create_file()  # the file has to exist before data can be appended to it
file.append_data(data, offset=0, length=len(data))
file.flush_data(len(data))
It's painful: if you're appending multiple times, you have to keep track of the offset on the client side.
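A minimal sketch of that offset bookkeeping, with a placeholder connection string and paths:
from azure.storage.filedatalake import DataLakeFileClient

file = DataLakeFileClient.from_connection_string("my_connection_string",
                                                 file_system_name="myfilesystem",
                                                 file_path="myfile")
file.create_file()                     # start from an empty file

offset = 0
for chunk in (b"first\n", b"second\n", b"third\n"):
    file.append_data(chunk, offset=offset, length=len(chunk))
    offset += len(chunk)               # the next append starts where this one ended

file.flush_data(offset)                # commit everything appended so far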
I'm able to use dask.dataframe.read_sql_table to read the data, e.g. df = dd.read_sql_table(table='TABLE', uri=uri, index_col='field', npartitions=N)
What would be the next (best) steps for saving it as a Parquet file in Azure Blob Storage?
From my small research there are a couple of options:
Save locally and use https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-blobs?toc=/azure/storage/blobs/toc.json (not great for big data)
I believe adlfs is for reading from blob storage
Use dask.dataframe.to_parquet and work out how to point it at the blob container
The intake project (not sure where to start)
$ pip install adlfs

dd.to_parquet(
    df=df,
    path='abfs://{BLOB}/{FILE_NAME}.parquet',
    storage_options={'account_name': 'ACCOUNT_NAME',
                     'account_key': 'ACCOUNT_KEY'},
)
I need to make a backup of my Azure database. I'm trying with this command:
"C:\Program Files\Microsoft SQL Server\120\DAC\bin\SqlPackage.exe" /Action:Export /SourceServerName:"tcp:MyDataServer.database.windows.net,1433" /SourceDatabaseName:MyDatabase /SourceUser:MyUser /SourcePassword:MyPass /TargetFile:C:\backups\backup.bacpac
but it seems the format it downloads is "bacpac", and I need it to be ".bak". I tried changing the extension, but it says: "The TargetFile argument must refer to a file with a '.bacpac' extension."
Any idea how to download the database in ".bak" format?
Any idea how to download the database in ".bak" format?
SQL Azure doesn't provide a native way to generate a '.bak' format backup file. If you do need a file in this format, you can import the BACPAC file into a local SQL Server to create a new user database, and then generate a '.bak' file from that local SQL Server.
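A sketch of that import-then-backup route in PowerShell, assuming a local SQL Server instance and the SqlServer module for Invoke-Sqlcmd; the server, database, and file paths are placeholders:
# 1. Import the BACPAC into a local SQL Server instance
& "C:\Program Files\Microsoft SQL Server\120\DAC\bin\SqlPackage.exe" /Action:Import `
    /SourceFile:"C:\backups\backup.bacpac" `
    /TargetServerName:"localhost" /TargetDatabaseName:"MyDatabaseCopy"

# 2. Take a native '.bak' backup of the imported database (needs the SqlServer module)
Invoke-Sqlcmd -ServerInstance "localhost" -Query `
    "BACKUP DATABASE [MyDatabaseCopy] TO DISK = N'C:\backups\MyDatabase.bak' WITH INIT"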
In addition, you could also try a tool named SqlAzureBakMaker, which can make a '.bak' file for you easily.
I'm trying to build a system where tweets flow into Azure Blob storage through the Twitter streaming API. I was following a tutorial from Microsoft, but it ends with the following scenario:
$memStream = New-Object System.IO.MemoryStream       # created earlier in the tutorial
$writeStream = New-Object System.IO.StreamWriter $memStream
$count = 0
$lineMax = 1000000

# Read the streaming response line by line and buffer it in memory
$sReader = New-Object System.IO.StreamReader($response.GetResponseStream())
$inrec = $sReader.ReadLine()
while (($inrec -ne $null) -and ($count -le $lineMax))
{
    if ($inrec -ne "")
    {
        $writeStream.WriteLine($inrec)
        $count++                                      # count lines so the loop can stop at $lineMax
    }
    $inrec = $sReader.ReadLine()
}

# Upload the whole in-memory buffer to the destination blob in one go
$writeStream.Flush()
$memStream.Seek(0, "Begin")
$destBlob.UploadFromStream($memStream)
$sReader.Close()
Now the problem is: if I want to use this at a larger scale, I suspect the file will become too big to be sent to Azure in one go. What is the correct approach to this problem? Should I roll the files to local disk and then send them to Azure?
You might want to check out the new Append Blob. This lets you create a blob and keep appending to it (from multiple locations if needed). Here's some how-to information that may help.
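As a rough sketch (not the tutorial's code), the buffering loop could instead write to an append blob in batches, using the Azure Storage .NET client objects exposed by the classic Azure PowerShell module; the container name, blob name, and $storageContext below are placeholders:
# Get an append blob reference via the classic Azure PowerShell storage cmdlets
$container  = Get-AzureStorageContainer -Name "tweets" -Context $storageContext
$appendBlob = $container.CloudBlobContainer.GetAppendBlobReference("tweets.log")
$appendBlob.CreateOrReplace()

$batch = New-Object System.Text.StringBuilder
$inrec = $sReader.ReadLine()
while ($inrec -ne $null)
{
    if ($inrec -ne "")
    {
        [void]$batch.AppendLine($inrec)
    }
    if ($batch.Length -ge 1MB)   # each append operation is limited to 4 MB
    {
        $appendBlob.AppendText($batch.ToString())
        [void]$batch.Clear()
    }
    $inrec = $sReader.ReadLine()
}
if ($batch.Length -gt 0) { $appendBlob.AppendText($batch.ToString()) }
$sReader.Close()
This way the tweets never have to fit into a single in-memory buffer or local file before being uploaded.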
I want to learn ASP.NET, and for this I read some basic tutorials on managing the IIS web server. I'm wondering how I could make a full backup of my site (configuration and content). I'm running the IIS server on a Hyper-V Windows Server 2012 R2 Core machine and administer it over PowerShell Remoting.
On the Internet, I found an article about some basic stuff (see here).
The article says I can make a full backup of my IIS configuration and content with
Backup-WebConfiguration -Name "My Backup"
and restore it with
Restore-WebConfiguration -Name "My Backup"
The problem is: it seems to really only back up the configuration and not the content. For example, it restores the websites under IIS:\Sites, but not the physical content, such as an application folder inside a site and a default.htm. If I delete the default.htm and the folders and then use Restore-WebConfiguration, it still does not restore them, only the web configuration itself.
From the article I guessed it would also restore the content.
Did I do something wrong? How can I do what I want "from scratch", without scripts from MS Web Deploy 3.0?
Thanks for the help and best regards,
Backup-WebConfiguration only backs up the configuration items detailed in the applicationHost.config file. It does not deal with the actual content, just how that content is handled by IIS.
Doing both is easy enough. Here's a quick function that creates a zip file of the content (just pass in the path to your inetpub directory) and backs up the configuration (this requires PowerShell v3 or higher). The configuration backup automatically gets a creation date (you can see a list of your backups with Get-WebConfigurationBackup), and the function adds the date to the zip file name as well, so the two can be matched up.
If you're making more than one backup on the same day, you'll need to tweak the file name of the compressed file, as it only includes the date.
function Backup-WebServer
{
    [CmdletBinding()]
    Param
    (
        [Parameter(Mandatory=$true,ValueFromPipelineByPropertyName=$true)]
        [ValidateNotNullOrEmpty()]
        [ValidateScript({Test-Path $_})]
        [string]$Source,

        [Parameter(Mandatory=$true,ValueFromPipelineByPropertyName=$true)]
        [ValidateNotNullOrEmpty()]
        [ValidateScript({Test-Path $_})]
        [string]$Destination,

        [Parameter(Mandatory=$true,ValueFromPipelineByPropertyName=$true)]
        [ValidateNotNullOrEmpty()]
        [string]$ConfigurationName
    )

    Add-Type -Assembly System.IO.Compression.FileSystem
    Import-Module WebAdministration

    # Zip the content directory into "Inetpub-Backup <date>.zip" in the destination
    $compressionLevel = [System.IO.Compression.CompressionLevel]::Optimal
    $date = $(Get-Date -Format d).Replace('/','-')
    $fileName = "Inetpub-Backup $date.zip"
    $inetpubBackup = Join-Path -Path $Destination -ChildPath $fileName
    [System.IO.Compression.ZipFile]::CreateFromDirectory($Source,$inetpubBackup,$compressionLevel,$false)

    # Back up the IIS configuration (applicationHost.config)
    Backup-WebConfiguration -Name $ConfigurationName
}
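A hypothetical invocation (the paths and backup name are placeholders):
Backup-WebServer -Source 'C:\inetpub' -Destination 'D:\Backups' -ConfigurationName 'My Backup'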