Issue while creating a subfolder in Azure cloud storage

I am using the Azure API provided by Microsoft for cloud storage, and I am facing an unusual bug while creating subfolders.
When I create a subfolder within any container, it is created within seconds. But when I then create another subfolder with a different name, it takes much longer than the previous one did.
If I try again after that, it is created quickly. In other words, subfolders 1, 3, 5, 7 and so on are created quickly, while the even-numbered ones (2, 4, etc.) are created with a delay.
In short: every alternate subfolder creation takes too much time.
Please let me know if there is a solution for this.

Are you creating these in blob storage? Strictly speaking, folders there don't exist. They are simply a '/' character (or whatever delimiter you specify) embedded in the blob names. So saving a file under a path implicitly creates that path; you never create the folder itself.
The documentation even refers to these as "virtual folders" because they don't actually exist.
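To make that concrete, here is a minimal sketch using the azure-storage-blob Python SDK (v12); the connection string, container name, and blob path are all placeholders:

from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<your-connection-string>")
container = service.get_container_client("mycontainer")

# There is no "create folder" call: uploading a blob whose name contains
# '/' makes the virtual folder appear automatically.
container.upload_blob(name="reports/2017/summary.txt", data=b"hello")

# Listing with a delimiter surfaces "reports/" as if it were a directory.
for item in container.walk_blobs(delimiter="/"):
    print(item.name)

Since there is no folder object being created, any alternating delay you see is just variation in the latency of the individual blob operations, not a folder-creation bug.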

Related

How do I set up a logic app in Azure to transfer files from blob storage to SharePoint based on the name?

Good morning all,
I'm trying to build a logic app that will upload files from Azure Blob Storage to SharePoint. It was quite easy to do when all the files were supposed to be uploaded to one folder, but I was asked to separate them by name. So, if a file contains 'dog' in the name, it should go to folder 1, but if a file contains 'cat' it should go to a different folder on SharePoint, from the same blob storage.
I've tried to add a condition to the logic app: if it's 'true' that 'name' contains 'dog', upload it to folder 1; if false, upload it to folder 2 (there is always a file containing either 'dog' or 'cat'). But it still uploaded all of them to the folder for the 'false' result. Basically, when I ran the logic app, all the results were false, so the problem is with the condition itself, but as I'm new to this, I wasn't able to figure out what exactly is failing. Below is a screenshot of the logic app that uploads all the files to one folder; I'm not quite sure where to put the condition (I've tried to place it everywhere, same result) or how to configure it properly.
Working solution to upload everything to one folder
If the left-hand side of the condition is the name of the blob, then based on what you've said you want, the right-hand side should literally be the word Library ... no expressions or anything else.
Your condition, in plain English, says ...
If the name of the blob contains the word "Library", do the true side; else, do the false side.
If you want to check for the word Library ignoring case, wrap the blob name in a toLower() expression and set the right-hand side to all lower case, like thus ... library

Automatically create Subfolders on SharePoint Document Library

I have been trying to accomplish this for weeks now and keep hitting a wall.
I have a document library on SharePoint Online with the following (close enough) structure.
Clients
-> Schools
-->Client Name
--->Communications
--->Documentation
--->Projects
---->Project Name 1
---->Project Name 2
->Retail
-->Client Name
--->Communications
--->Documentation
--->Projects
---->Project Name 1
---->Project Name 2
... and so on.
Inside the "Projects" folder there is a set of folders as well.
Right now we have a project template folder that we used to just copy/paste and rename when we had our file server, but now on SharePoint, the "Copy to" process takes way too many clicks to get it to that location.
What I am trying to accomplish is to be able to create a new project folder and have all the folders under it created automatically.
Appreciate the guidance on this.
I was able to figure this out.
My challenge was that, when creating the folder, the workflow always wanted to create it in the root of the document library rather than in the subfolder.
So I created two content types for folders, one for clients and one for projects.
I used SharePoint Designer to create the workflow, but the trick was to extract the URL from the current item (the folder being created) and remove the first x characters from it, where x is the length of the SharePoint document library location. The remaining part of the string was the exact location where I wanted the subfolders to be created.
After that, I used that variable to create all other subfolders.
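For anyone trying to reproduce that string trick outside SharePoint Designer, here is the same logic sketched in Python; the library URL and folder names are made up for illustration:

# Hypothetical URL of the document library root.
LIBRARY_ROOT = "https://contoso.sharepoint.com/sites/files/Clients"

def relative_folder(item_url: str) -> str:
    # Strip the document-library prefix (the "first x characters"),
    # leaving the path where the subfolders should be created.
    return item_url[len(LIBRARY_ROOT):].lstrip("/")

# A newly created project folder:
url = LIBRARY_ROOT + "/Schools/Client Name/Projects/Project Name 3"
print(relative_folder(url))  # -> "Schools/Client Name/Projects/Project Name 3"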

Linked Excel file opens when Access front end opens - how to stop this behavior

I have an MS Access front end that is distributed to more than 50 users, and it worked very well until today, when I came across a weird behavior.
Today I linked an Excel file in the back end, used this linked table in the front end to create a form based on a query (view only), and updated my front end. Through my testing I found that if I open two instances of my front end and open the newly created form, then in the second instance Access opens the linked Excel file as read-only.
My question is, how can I avoid this behavior? I don't want my users to see this Excel file in any way. Is there any workaround?
That's by design. An Excel workbook hosted on premises cannot be shared for writing.
One workaround is to create a copy of the workbook for each user. For instance, let your application copy the workbook from the shared location to a local subfolder of %LocalAppData%, the user's local data folder, and link to that, as sketched below.
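In the real front end this copy step would be done in Access VBA at startup; the following Python sketch just illustrates the idea, with placeholder paths:

import os
import shutil

# Hypothetical shared workbook and a per-user local destination.
shared = r"\\server\share\data\Prices.xlsx"
local_dir = os.path.join(os.environ["LOCALAPPDATA"], "MyAccessApp")
os.makedirs(local_dir, exist_ok=True)

# Each user gets a private, writable copy; the Access table is then
# relinked to this local path instead of the shared one.
local_copy = os.path.join(local_dir, os.path.basename(shared))
shutil.copy2(shared, local_copy)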

Unable to determine why workbook gets marked as read only when opening another workbook via VBA

Background: I can't seem to find an answer for the blunder I've found myself in. I'm working on a dashboard of sorts for our organization that pulls data from different workbooks in different locations. Different people have different permissions within folders in our company, and we're trying to avoid having to change permissions. I was also trying to make it as easy as possible for employees by moving the necessary supporting files I pull from into the folder where the dashboard is housed, which includes a subfolder with the supporting files. Since this folder has no restricted access, I added passwords to the whole workbook of each supporting file. I then created workbooks in the original locations where these supporting files used to reside, with the hope that anyone with access to that folder could use the new file as a backdoor/shortcut of sorts. The goal was to keep people's files where they wanted them: they click the backdoor file, it takes them to my support file and enters the password for them. Which leads me to the problem...
Problem: When I open the "backdoor" file, everything runs normally and the support file opens with read and write privileges. However, when someone else opens the backdoor file, the support file opens as read-only. While I don't have any code yet to determine if someone is in the file (I'll cross that road if my problem is resolvable), I've ensured nobody was in the support or backdoor file when another user attempted to use it.
Private Sub Workbook_Open()
    ' I didn't have the next statement originally; I added it in hopes it'd
    ' resolve the issue. I tried moving it below the Workbooks.Open call,
    ' too, but to no avail.
    SetAttr "M:\Report Writing\Supporting Files\TMR 2017 - Team ABC's SF.xlsm", vbNormal
    ' Open the password-protected support file for read/write access.
    Workbooks.Open Filename:="M:\Report Writing\Supporting Files\TMR 2017 - Team ABC's SF.xlsm", _
        ReadOnly:=False, Password:="XXX"
    ' Close this "backdoor" workbook once the support file is open.
    Workbooks("TMR 2017 - Team ABC's.xlsm").Close
End Sub
I also didn't originally have the ReadOnly:=False in the Workbooks.Open command, but added it in hopes of resolving my dilemma. Any help is GREATLY appreciated, as my whole dashboard relies on this and we were supposed to deploy today.
It has nothing to do with the VBA code. It is the file itself. On a shared network, depending on how it was set up, when a new file is created everyone can read it, but the creator is marked as the only one allowed to make changes. If someone else saves the file under another name, you won't be able to edit that copy; you can make changes only to the file you created.
There is a way to change this. Right-click on the file, then Properties > Security > Edit. There are some listed users in there; find the one that says Drive\Users or Authenticated Users (or both) and edit it to grant Modify access. This will allow everyone to edit the file.

Delete folder by name from google drive using gdrive

I have read the documentation for gdrive here, but I couldn't find a way to do what I want to do. I want to write a bash script to automatically upload a specific folder from my hard drive. The problem is that when I upload it several times, instead of replacing the old folder with the new one, it generates a new folder with the same name.
I could only find the following partial solutions to my problem:
Use update to replace files. The problem with this partial solution is that new files inside the folder would not get uploaded automatically, and I would have to change the bash script every time a new file is produced in the folder that I want to upload.
Erase the folder by its id from Google Drive and then upload the folder again. The problem here is that whenever I do this, the id of the uploaded folder changes, so I couldn't find a way to write a script to do the work.
I am looking for any method that solves my problem. But the precise questions that could help me are:
Is there a way to delete a folder from google drive (using gdrive) by its name instead of by its id?
Is there a way to get the id of a folder by its name? I guess not, since there can be several folders with the same name (but different ids) uploaded. Or am I missing something?
Is there a way to do a recursive update to renew all files that are already inside the folder uploaded on google drive and in addition upload those that are not yet uploaded?
In case it is relevant, I am using Linux Mint 18.1.
Is there a way to delete a folder from google drive (using gdrive) by its name instead of by its id?
Nope. As your next question observes, there can be multiple such folders.
Is there a way to get the id of a folder by its name? I guess not, since there can be several folders with the same name (but different ids) uploaded. Or am I missing something?
You can get the ids (plural) of all folders with a given name.
gdrive list -q "name = 'My folder name' and mimeType='application/vnd.google-apps.folder' and trashed=false"
Is there a way to do a recursive update to renew all files that are already inside the folder uploaded on google drive and in addition upload those that are not yet uploaded?
Yes, but obviously not with a single command. You'll need to write a short script using gdrive list and parse the output (awk works well); a sketch follows.
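For instance, here is one way sketched in Python rather than awk. It assumes the prasmussen/gdrive CLI is on your PATH, with its list/delete/upload subcommands and whitespace-separated list output (first column is the id); check your version's flags before relying on it:

import subprocess

QUERY = ("name = 'My folder name' "
         "and mimeType='application/vnd.google-apps.folder' "
         "and trashed=false")

# List matching folders without the header row and collect their ids.
out = subprocess.run(
    ["gdrive", "list", "--no-header", "-q", QUERY],
    capture_output=True, text=True, check=True,
).stdout
folder_ids = [line.split()[0] for line in out.splitlines() if line.strip()]

# Delete each matching folder, recursively removing its contents ...
for folder_id in folder_ids:
    subprocess.run(["gdrive", "delete", "--recursive", folder_id], check=True)

# ... then re-upload the local folder from scratch.
subprocess.run(["gdrive", "upload", "--recursive", "/path/to/local/folder"],
               check=True)

Deleting and re-uploading means the folder gets a new id each time, which is fine here because the script always looks the id up by name first.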
