I am working on an InstallScript project using InstallShield 2011. I need to create multiple media, so I initialize MEDIA with another media library (second1.cab). When I use FeatureMoveData to transfer files with the new MEDIA, it shows an error while transferring the files whose destination is given by a script-defined folder value. Is there any way to set the value of a script-defined folder for the new MEDIA? I tried using the FeatureSetTarget function for the new MEDIA, but it showed the same error.
Using TextSubSetValue in place of FeatureSetTarget worked in my case. Now the files are transferred from multiple media using the same script-defined folder.
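In case it helps anyone, here is a rough InstallScript sketch of that workaround. The text-substitution name <MYSCRIPTFOLDER>, the variable names, and the wrapping function are placeholders of my own, and the exact parameter lists should be double-checked against the InstallShield 2011 help:

prototype TransferFromSecondMedia(STRING);

function TransferFromSecondMedia(svTargetDir)
    NUMBER nDisk, nResult;
begin
    // <MYSCRIPTFOLDER> stands for whichever script-defined folder the
    // components' destinations use; svTargetDir holds the resolved path.
    // (Third argument is from memory; verify it in the TextSubSetValue help.)
    TextSubSetValue("<MYSCRIPTFOLDER>", svTargetDir, FALSE);

    // MEDIA has already been re-initialized against second1.cab at this point.
    // (Parameter list from memory; verify it in the FeatureMoveData help.)
    nResult = FeatureMoveData(MEDIA, nDisk, 0);
end;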
We are using a Logic App to move data from a SharePoint folder to Azure Blob Storage.
We were using the SharePoint trigger "When a file is created or modified in a folder". Unfortunately, this trigger has been deprecated and no longer works: when a file is indeed created or modified, no further action runs after the trigger.
No file is moved around anymore; the trigger does not execute the Logic App even though a file is created or modified in the SharePoint origin folder. I have been through the various other SharePoint triggers, but they do not seem to fit our use case. We cannot create a Logic App for each file; we are not using SharePoint lists, only classic folders. We could use several triggers pointing directly at each existing file, but since we have many files to move in the same folder, we would have to create many Logic Apps, and that is not how we want to do it. Moreover, new files may be created in the future.
What could we do to keep the same architecture of moving data from SharePoint to Blob Storage, using non-deprecated Logic App triggers?
Thank you in advance,
Alexis
You can use "When a file is created or modified (properties only)" and get the properties of the file that is being created or updated. Then you can use "Get file content" with the properties from the previous step. Finally, you can create a blob from the outputs of the previous steps. That is the flow of my logic app.
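If it helps to see the dynamic content as raw expressions rather than the designer pickers, the three fields look roughly like this (a sketch assuming the default action names "Get file content" and "Create blob" and the standard property tokens exposed by the properties-only trigger; adjust them to your actual action names):

"File identifier" (Get file content): triggerBody()?['{Identifier}']
"Blob name" (Create blob): triggerBody()?['{FilenameWithExtension}']
"Blob content" (Create blob): body('Get_file_content')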
I created a Logic App that uses the SharePoint trigger "When a file is created or modified in a folder". It works perfectly when I upload a file in SharePoint Online (in a SharePoint browser tab), but it doesn't work when I drop a file into my synced Windows Explorer folder.
I read that someone faced the same problem: https://learn.microsoft.com/en-us/answers/questions/41215/logic-app-why-does-sharepoint-file-properties-trig.html. It says:
Move files and flow runs: When you move one or more files from one document library to another, the original file is moved from the source library to the destination library. Moving the file does not alter any custom metadata, including when the file was created and modified. Hence, this action does not trigger any flows for those file updates associated in the library where it was moved.
Syncing files to your OneDrive for Business and SharePoint document libraries: When users sync one or more files from one document library to another, the original file is moved (synced) from your client to the destination library. Syncing the file will not alter any custom metadata, including when the file was created and modified. Hence, this action will not trigger any flows for those file syncs in that library or in your OneDrive for Business.
The thing is that I NEED this Logic App to run just by dropping a file into a Windows Explorer folder (which is a SharePoint folder shared with a certain person). Do you know how I can achieve this?
It started working for me when I used the OneDrive "When a file is created" connector, since we use OneDrive to sync the Windows Explorer folder. You need to specify the folder where the trigger should be invoked, and you must set "Include subfolders" to true if you want the trigger to fire when a file is added to any subfolder.
Here are the screenshots of the logic app working: one when adding a file in a subfolder, and one when adding a file in the root folder.
I am trying to create a logic app that will transfer files, as they are created, from my FTP server to my Azure file share. The folder my trigger is watching is structured by date (see below). Each day that a file is added, a new folder is created, so I need the trigger to check new subfolders, but I don't want to go into the app every day to change which folder the trigger looks at. Is this possible?
Here's how my folder (called DATA) is structured; each day that a file is added, a new folder is created.
-DATA-
2016-10-01
2016-10-02
2016-10-03
...
The FTP connector uses configurable polling, where you set how often it should look for a file. The trigger currently does not support dynamic folders. However, what you could try is the following:
Trigger your logic app by recurrence (same principle as the FTP trigger in fact)
Action: Create a variable to store the current date, in the format used in your folder naming (see the expression sketch after this answer)
Action: Do a "List files in folder" (here you should be able to set the folder name dynamically using the variable you created)
For-each file in folder
Action: Get File Content
Whatever you need to do with the file (calling a nested logic app is smart if you need to run multiple processing actions on each file and want to be able to resubmit the flow per file)
To avoid picking up every file on each run, you will need a way to exclude files that were processed in an earlier run: either rename each file after it is processed to an extension you can exclude in the next run, or move it to a subfolder such as "Processed\datetime" in the root.
This solution will require more actions and thus will be more expensive. I haven't tried it out, but I think this should work. Or at least it's the approach I would try to set up.
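For the date variable and the folder path in the steps above, Logic Apps expressions along these lines should do it (a sketch assuming the folders follow the yyyy-MM-dd pattern shown in the question, and a variable I am calling FolderDate):

formatDateTime(utcNow(), 'yyyy-MM-dd')
concat('/DATA/', variables('FolderDate'))

The first expression is the value to assign to FolderDate; the second builds the path (for example /DATA/2016-10-03) to pass to the "List files in folder" action. If your FTP server's folder names are based on a time zone other than UTC, you may need to adjust with addHours() before formatting.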
Unfortunately, what you're asking is not possible with the current FTP connector, and there isn't any really great solution right now... :(
As an aside, I've seen this pattern several times and, as you are seeing, it just causes more problems than it solves, and realistically it solves none. :)
If you own the FTP Server, the best thing to do is put the files in one folder.
If you do not own the FTP server, politely mention to the owner that this pattern is causing problems and doesn't help you in any way, so please put the files in one folder ;)
I am following this link to try custom functions.
First I put customfunctions.js and customfunctions.html in my local folder, and then replaced https://<INSERT-URL-HERE> in the manifest file with their path. I tried \\SOFTTIMUR9FDC\Users\SoftTimur\tmp\EXCEL-CUSTOM-FUNCTIONS and \\Mac\Home\tmp\EXCEL-CUSTOM-FUNCTIONS\, but I could not see any application under SHARED FOLDER in Excel.
Then I put these two files on a website and replaced https://<INSERT-URL-HERE> in the manifest file with their HTTPS address. Now it worked: I could see the application under SHARED FOLDER in Excel, and the custom functions worked.
So is this expected? In other words, when we test custom functions, can we not keep these files locally; do we have to host them on a website?
PS: when we develop a normal Excel add-in, there is no problem keeping the source files locally.
It's possible to host your customfunctions.html file locally, yes. From your description, it sounds like there's something wrong with how you're deploying the manifest, or with the manifest itself. Verify that your only change was the URL for those files and that the manifest worked properly otherwise.
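If you want to keep the files on your machine while testing, one option (not the only one) is to serve the folder from a local HTTPS endpoint and point the manifest's URLs at it, e.g. https://localhost:3000/customfunctions.html. A minimal Node.js sketch, assuming you already have a development certificate as key.pem and cert.pem in the same folder:

const fs = require('fs');
const https = require('https');
const path = require('path');

const root = __dirname; // folder containing customfunctions.js and customfunctions.html

https.createServer(
  { key: fs.readFileSync('key.pem'), cert: fs.readFileSync('cert.pem') },
  (req, res) => {
    // Map the request to a file in the folder; default to the HTML page.
    const name = req.url === '/' ? 'customfunctions.html' : req.url.slice(1);
    const file = path.join(root, name);
    fs.readFile(file, (err, data) => {
      if (err) {
        res.writeHead(404);
        res.end('Not found');
        return;
      }
      const type = file.endsWith('.js') ? 'text/javascript' : 'text/html';
      res.writeHead(200, { 'Content-Type': type });
      res.end(data);
    });
  }
).listen(3000, () => console.log('Serving on https://localhost:3000'));

As far as I know, the shared-folder catalog only holds the manifest itself; the JS/HTML resources it points to still need to be reachable over HTTPS (a https://localhost address counts), which may be why the UNC paths did not show up while the hosted version did.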
I have an app that synchronises content with a web server so that the app ends up with an offline, cut-down version of the server-based web pages. All text and HTML is stored in a SQLite database, but what is the best approach for handling file assets? In my case this is a mix of image and audio files.
The synchronisation is all set up in the core project. My Touch project has a Content directory set up for storing the assets, and my intention had been to have a similar setup for Droid. I could pass the list of needed files to the UI projects and download them from there, but that seems wrong.
Thanks.
For that I would create a Service in Mvx which the ViewModels you create use for getting the external assets. Take, for instance, the Daily Dilbert tutorial: you could consider the daily comics as being very similar to your external assets, where the DilbertService is used to get all the comics and present them in a list. However, your list could be a list of files located on the SD card, or wherever you decide to store your files.
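As a rough sketch of what such a service could look like (the interface name, the cache-by-file-name scheme, and the use of HttpClient are my own assumptions, not something from the tutorial): the core project defines the service, each platform project constructs it with the folder it wants to use (the Content directory on Touch, an SD-card or files directory on Droid), and the ViewModels resolve it through Mvx's IoC.

using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

// Core project: the service the ViewModels depend on.
public interface IAssetService
{
    Task<string> GetLocalPathAsync(string remoteUrl);
}

public class AssetService : IAssetService
{
    private readonly HttpClient _http = new HttpClient();
    private readonly string _assetRoot;

    // assetRoot is supplied by the platform project, e.g. the Touch
    // Content directory or a directory on Droid external storage.
    public AssetService(string assetRoot)
    {
        _assetRoot = assetRoot;
    }

    // Downloads the asset the first time it is requested and returns the
    // local file path; later calls reuse the cached copy.
    public async Task<string> GetLocalPathAsync(string remoteUrl)
    {
        var fileName = Path.GetFileName(new Uri(remoteUrl).LocalPath);
        var localPath = Path.Combine(_assetRoot, fileName);

        if (!File.Exists(localPath))
        {
            Directory.CreateDirectory(_assetRoot);
            var bytes = await _http.GetByteArrayAsync(remoteUrl);
            File.WriteAllBytes(localPath, bytes);
        }

        return localPath;
    }
}

Each platform's Setup would then register it with something along the lines of Mvx.RegisterSingleton<IAssetService>(new AssetService(pathForThisPlatform)), so the ViewModels only ever see IAssetService and never care where the files physically live.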