We hired a guy to push a bunch of PSTs to Azure (to be imported into mailboxes) and he disappeared. We know he used a SAS URI and we know he did push the data up. We looked in Storage Explorer and don't see the data in any of our storage accounts. He deleted the original PSTs, so we can't just push the data back up.
As far as we know, he was using this guide: https://learn.microsoft.com/en-us/microsoft-365/compliance/use-network-upload-to-import-pst-files?view=o365-worldwide
Can we find the SAS URI he used somewhere in Azure?
Can we explore this data somehow?
Any help is appreciated, thanks so much.
Can we find the SAS URI he used somewhere in Azure?
Sadly, no. Azure does not store the SAS URI anywhere: a SAS is computed from the storage account key on demand and is never persisted by the service, so there is nothing to look up. (One caveat: the guide you linked uses the Microsoft 365 Import service, which issues the SAS URL itself, so it may be worth checking the import job in the compliance portal, where the SAS URL was originally displayed, rather than anywhere in Azure.)
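For context on why there is nothing to look up: a SAS is just a signature computed locally from the account key at the moment it is generated. A minimal sketch with the Python SDK, all names placeholders:

```python
from datetime import datetime, timedelta, timezone
from azure.storage.blob import generate_container_sas, ContainerSasPermissions

# The token is computed locally from the account key; Azure never stores it,
# which is why a lost SAS URL cannot be retrieved later. Names are placeholders.
sas_token = generate_container_sas(
    account_name="mystorageaccount",
    container_name="ingestiondata",
    account_key="<account-key>",
    permission=ContainerSasPermissions(read=True, list=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=2),
)
print(f"https://mystorageaccount.blob.core.windows.net/ingestiondata?{sas_token}")
```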
Can we explore this data somehow?
You would need to know the storage account where the files were uploaded; without that, you will not be able to explore this data. Note that with the network-upload guide you linked, the PSTs go to Microsoft-managed Azure storage provisioned by the Import service, not to a storage account in your own subscription, which would explain why nothing shows up in Storage Explorer.
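If you do manage to recover the SAS URL, a sketch like this would let you enumerate what was uploaded; the URL below is a placeholder in the shape the guide hands out:

```python
from azure.storage.blob import ContainerClient

# Hypothetical SAS URL of the shape the network-upload guide issues;
# you would need the actual value the uploader used.
sas_url = "https://<account>.blob.core.windows.net/ingestiondata?<sas-token>"

container = ContainerClient.from_container_url(sas_url)
for blob in container.list_blobs():
    print(blob.name, blob.size)
```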
Related
I want to create a backup of some Azure SQL databases that are not currently used, in some low-cost storage account. I don't want to pay for those DBs since no operations are run on them, but I might need the data in the future. Extraction time is not an issue. I just want to know the different methods of doing this.
If I understood the question correctly: for databases you might not need immediately, you can take a backup to Azure Blob Storage.
There is a good blog post on it:
https://dallasdbas.com/sql-backups-azure-storage/
Just follow it.
Hope this helps!
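The linked blog covers native SQL Server backups to a URL; if these are Azure SQL Databases (PaaS), native BACKUP TO URL isn't available, and the usual low-cost option is exporting a BACPAC to blob storage instead. A minimal sketch driving the Azure CLI from Python, assuming az is installed and you are logged in; all names are placeholders:

```python
import subprocess

# Export an Azure SQL Database to a BACPAC in a blob container. The storage
# account can use a cheap access tier since extraction time is not an issue.
# Every name and credential below is a placeholder.
subprocess.run(
    [
        "az", "sql", "db", "export",
        "--resource-group", "my-rg",
        "--server", "my-sql-server",
        "--name", "my-database",
        "--admin-user", "sqladmin",
        "--admin-password", "<password>",
        "--storage-key-type", "StorageAccessKey",
        "--storage-key", "<storage-account-key>",
        "--storage-uri", "https://mystorage.blob.core.windows.net/backups/my-database.bacpac",
    ],
    check=True,
)
```

Once the BACPAC is in the container, you can drop the database and re-import it later if you ever need the data again.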
I'm building an iPhone app that collects users' physiology data (e.g., the timing of heartbeats), which comes to about 5-50 MB per session. I'd like to let users upload this data, with their ID and comments, to Azure, but I'm not quite sure which products to use. I was looking at Azure Blob Storage, but it does not allow public write access. Thanks for your help in advance.
I won't comment on the data privacy aspects of this, other than to caution you to look into them regardless of the technical solution.
To your question: it really depends on what kind of data formats you are talking about. If you want to upload things like images, then yes, Blob storage would be a good fit. You would typically have a back-end service in the cloud that the app calls to request a one-time, write-only SAS token for a blob storage account. You return that SAS token to the app, and the app can then upload its data directly into blob storage.
https://learn.microsoft.com/en-us/azure/storage/common/storage-dotnet-shared-access-signature-part-1
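As a sketch of what that back-end call might look like (account, container, and key are all placeholders), the service signs a short-lived, write-only SAS for a single blob and hands the resulting URL to the app:

```python
from datetime import datetime, timedelta, timezone
from azure.storage.blob import generate_blob_sas, BlobSasPermissions

def issue_upload_token(user_id: str, session_id: str) -> str:
    """Hypothetical back-end endpoint: mint a short-lived, write-only SAS
    for exactly one blob, so the app can upload its session file directly."""
    account_name = "myappstorage"    # placeholder
    container_name = "sessions"      # placeholder
    blob_name = f"{user_id}/{session_id}.json"

    sas = generate_blob_sas(
        account_name=account_name,
        container_name=container_name,
        blob_name=blob_name,
        account_key="<account-key>",  # keep this server-side only
        permission=BlobSasPermissions(create=True, write=True),
        expiry=datetime.now(timezone.utc) + timedelta(minutes=15),
    )
    return f"https://{account_name}.blob.core.windows.net/{container_name}/{blob_name}?{sas}"
```

The app then does a plain HTTP PUT against the returned URL; the account key never leaves the back end.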
I have created a blob storage account on MS Azure, and I tried to read data from it into a Jupyter notebook for some analysis. According to the documentation, this requires a "container_name" as well as a "blob_name". But when I created the storage account, as far as I remember, I never came across a step where I had to assign a blob name; apparently I need it anyway. So far I haven't been able to find it, though I believe I can guess the "container_name". A quick search on Google didn't turn up any resource that says exactly where it is. So I would like to know how I can find the "container_name" as well as the "blob_name" in the MS Azure portal.
Thank you in advance.
You can find them in your storage account in the Azure portal: open the storage account and go to its containers.
Choose a container and you will see the blobs inside it, including their names, blob type, size, etc.
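If you'd rather find them programmatically, a minimal sketch with the Python SDK (the connection string is a placeholder) lists every container and the blobs inside it, which gives you the container_name/blob_name values the docs ask for:

```python
from azure.storage.blob import BlobServiceClient

# Connection string comes from the storage account's "Access keys" blade.
service = BlobServiceClient.from_connection_string("<connection-string>")

for container in service.list_containers():
    print("container:", container.name)
    container_client = service.get_container_client(container.name)
    for blob in container_client.list_blobs():
        print("  blob:", blob.name)
```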
After hunting through the net, I can find lots of examples of retrieving data from SFTP, but none for sending from Blob storage to SFTP.
Basically, I attempted to do this using a Logic App, but Azure only supports files under 50 MB there (which is really dumb).
All the Azure docs I have read reference pulling but not pushing.
https://learn.microsoft.com/en-us/azure/data-factory/v1/data-factory-sftp-connector
etc etc..
Maybe someone with better googling skills can help me find the docs to help me out.
I'm using Data Factory v1, not v2. Cheers.
Always check the supported data stores table in the Data Factory documentation to see whether a given store is supported as a source or a sink in a data movement activity.
In this case, SFTP is supported as a source but not as a sink, which means it's possible to extract data from it but not to store data on it.
Hope this helped!
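Since Data Factory v1 can't write to SFTP, one workaround (outside Data Factory entirely) is a small script that streams the blob down and pushes it over SFTP. A sketch using paramiko and the Azure Python SDK; all hostnames, credentials, and paths are placeholders:

```python
import io

import paramiko
from azure.storage.blob import BlobClient

# Pull the blob into memory (stream to a temp file instead for large blobs).
blob = BlobClient.from_connection_string(
    "<connection-string>", container_name="outbound", blob_name="export.csv"
)
data = io.BytesIO(blob.download_blob().readall())

# Push the bytes to the SFTP server.
transport = paramiko.Transport(("sftp.example.com", 22))
transport.connect(username="user", password="<password>")
sftp = paramiko.SFTPClient.from_transport(transport)
sftp.putfo(data, "/upload/export.csv")
sftp.close()
transport.close()
```

Run on a schedule (e.g., an Azure Function or a cron job), this sidesteps both the ADF v1 sink limitation and the Logic App file-size cap.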
I have an account on Azure and built a cluster on it (Chemalivethermotest). It has a storage account, and by mistake I erased all the data on it (I misunderstood something in the online Azure interface). Is there a way to recover the data? I see there are no files anymore, but there are still blobs. Do they allow recovering things?
Thank you very much for your help.
This is an issue you will have to take up directly with Azure support; there is no technical answer here. Good luck!
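One thing that may be worth checking before (or while) you talk to support: if blob soft delete was enabled on the storage account before the deletion, deleted blobs can still be listed and undeleted within the retention period. A hedged sketch with placeholder names:

```python
from azure.storage.blob import BlobServiceClient

# Only works if blob soft delete was enabled on the account before the
# deletion happened. All names below are placeholders.
service = BlobServiceClient.from_connection_string("<connection-string>")
container = service.get_container_client("mycontainer")

# include=["deleted"] surfaces soft-deleted blobs still inside retention.
for blob in container.list_blobs(include=["deleted"]):
    if blob.deleted:
        print("recovering", blob.name)
        container.get_blob_client(blob.name).undelete_blob()
```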