Why do I get empty lists when listing files/folders in a folder using the SharePoint API?

I've just created a folder "import" in SharePoint, where I've created:
1 empty folder, "testdir"
2 empty files, "test1.csv" and "test2.csv".
When I query this "import" folder through the API, I get a valid response, but it contains an empty list ("value":[]):
https://mycomp.sharepoint.com/sites/mysite/_api/Web/GetFolderByServerRelativeUrl('/sites/mysite/mypath/import')/Folders
{"odata.metadata":"https://mycomp.sharepoint.com/sites/mysite/_api/$metadata#SP.ApiData.Folders1","value":[]}
https://mycomp.sharepoint.com/sites/mysite/_api/Web/GetFolderByServerRelativeUrl('/sites/mysite/mypath/import')/Files
{"odata.metadata":"https://mycomp.sharepoint.com/sites/mysite/_api/$metadata#SP.ApiData.Files12","value":[]}
What am I missing here? Why can't I see my "testdir" folder and my two .csv files?
Could it be that I have different permissions through the API compared to the web UI (e.g., no directory-listing permission in the API, for security reasons)?
UPDATE: As an additional test, I tried this after creating an empty file in "testdir":
https://mycomp.sharepoint.com/sites/mysite/_api/Web/GetFolderByServerRelativeUrl('/sites/mysite/mypath/import/testdir')/Files
And I get this error (the French message means "File not found"):
{"odata.error":{"code":"-2147024894, System.IO.FileNotFoundException","message":{"lang":"fr-FR","value":"Fichier introuvable."}}}
What is happening here? Is there a delay before newly created folders/files can be queried? Is there a "commit" command in SharePoint to finalize the creation of folders/files?
UPDATE 2:
If I use this syntax: "?$expand=Folders,Files",
I get (the non-confidential part of the response):
Using "%20" for all spaces gives exactly the same response.

Related

Azure Data Factory Counting number of Files in Folder

I am attempting to determine if a folder is empty.
My current method is to use a Get Metadata (GetMeta) activity and run the following to set a Boolean.
#greater(length(activity('Is Staging Folder Empty').output.childItems), 0)
This works great when files are present.
When the folder is empty (a state I want to test for), I get "The required Blob is missing".
Can I trap this condition?
What alternatives are there to determine if a folder is empty?
I have reproduced the above and got the same error.
This error occurs when the folder is empty and the source is Blob storage. You can see that it works fine for me when the source is ADLS.
For the sample I have used a Set variable activity inside the False branch of the If condition, i.e. the branch taken when the folder is empty:
Can I trap this condition?
What alternatives are there to determine if a folder is empty?
One alternative is to use ADLS instead of Blob storage as the source.
(or)
If you want to avoid this error while keeping Blob storage as the source, you can do the following: add an If Condition activity on the failure path of the Get Metadata activity and check the error message in its expression.
#startswith(string(activity('Get Metadata1').error.message), 'The required Blob is missing')
In the True activities (the expected error occurred, meaning the folder is empty), I have used a Set variable activity for the demo.
In the False activities (any other error occurred), use a Fail activity to fail the pipeline.
Fail Message: #string(activity('Get Metadata1').error.message)
On success of the Get Metadata activity there is no need to check the count of childItems, because Get Metadata fails when the folder is empty; so on success, simply continue with your normal activity flow.
An alternative would be:
Blob:
Dataset:
Here test is the container and test is also the folder inside the container that I am trying to scan (which, ideally, does not exist, as seen above).
Use a Get Metadata activity to check whether the folder exists:
If false, exit; else count the files.
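If checking outside the pipeline is also an option, the same "is this folder empty" test can be sketched in Python with the azure-storage-blob SDK (the connection string, container and folder names below are placeholders; this is just another alternative alongside the Get Metadata approaches above):

from azure.storage.blob import ContainerClient

# Placeholders: use your own connection string, container and folder prefix.
conn_str = "<storage account connection string>"
container = "test"
folder_prefix = "test/"  # trailing slash so only blobs inside the folder match

client = ContainerClient.from_connection_string(conn_str, container_name=container)

# list_blobs is lazy, so fetching a single item is enough to know whether the prefix is empty.
first_blob = next(iter(client.list_blobs(name_starts_with=folder_prefix)), None)
print("folder is empty" if first_blob is None else "folder has files")

Because blob storage has no real directories, an "empty folder" is simply a prefix with no blobs under it, which is presumably also why Get Metadata reports the blob as missing.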

Azure Synapse Analytics - deleting pipeline Folder

I am new to Synapse and I have to make a pipeline that will delete files from folders in a hierarchy like the attached image (expected hierarchy). The red half circles mark the files I would like to delete, for example files older than 2 months.
For now I have made a pipeline for a single folder, and using a ForEach loop I can get to the files and delete the corresponding ones. It works, but since I have about 60-70 folders and even more files, I wanted to go one level higher and make a pipeline that executes for each folder. And here is the problem: when I use a Get Metadata activity on the top folder and a ForEach loop to take the folder names, I cannot access the files inside each folder, only the folders themselves. Could someone help me solve this?
Deleting pipeline for a single folder using a ForEach loop
We can achieve this using nested ForEach activities with the help of an Execute Pipeline activity. As mentioned, Get Metadata with wildcards returns all files without folders, and the Delete activity cannot recognize wildcard folder paths (Folder/*).
I have created a similar folder structure for the demo. In my pipeline, I first created an array parameter req_files (sample1.csv and sample2.csv) with the names of the files to keep.
Note: If you want to do this dynamically, you can use an Append variable activity to build the required file names (file09/22 and file08/22).
I used one Get Metadata activity to get the folder names (which are inside the root folder). I iterate through its output in my ForEach activity (the Items value is #activity('root folder contents').output.childItems).
Inside my ForEach, I used another Get Metadata activity on each of the sub-folders (to get their file contents).
Now I have each folder name and the list of files inside it. I am going to use an Execute Pipeline activity to implement the nested ForEach. Create 3 parameters, current_folder, folder_files and files_needed, in a new pipeline called delete_pipeline (where I perform the delete).
Pass the following dynamic content for each of them from parent pipeline.
current_folder: #item().name
folder_files: #activity('sub folder contents').output.childItems
files_needed: #pipeline().parameters.req_files
Now in delete_pipeline, I have a ForEach loop over the list of files we are passing (the Items value is #pipeline().parameters.folder_files).
Inside this ForEach, I use an If Condition activity, because I want to delete only the files that are not in my req_files parameter (the array from the parent pipeline, which we passed to the files_needed parameter of delete_pipeline). The condition for the If Condition activity is the following:
#contains(pipeline().parameters.files_needed,item().name)
We need to delete a file only when it is not present in req_files (files_needed), so when the condition is false, we perform the delete.
I have created 2 parameters, file_namepath_of_file_to_delete and file_name_to_delete, in the dataset I am using for the Delete activity, with the following dynamic content.
file_namepath_of_file_to_delete: Folder/#{pipeline().parameters.current_folder}
file_name_to_delete: #item().name
When I run the pipeline, it keeps the required files and deletes the rest. The following are output images for reference.
Debug output: https://i.imgur.com/E6GNVHW.png
My folder after I run the pipeline: https://i.imgur.com/bqN00Dw.png
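For comparison, the same nested "keep-list" logic can be sketched outside the pipeline in Python with the azure-storage-file-datalake SDK (connection string, file system, folder and file names below are placeholders; the pipeline above is the actual implementation described in this answer):

from azure.storage.filedatalake import DataLakeServiceClient

# Placeholders: use your own connection string, file system (container) and root folder.
service = DataLakeServiceClient.from_connection_string("<connection string>")
fs = service.get_file_system_client("my-filesystem")
root_folder = "Folder"
req_files = {"sample1.csv", "sample2.csv"}  # files to keep, like the req_files parameter

# Outer loop: sub-folders of the root folder (the parent pipeline's ForEach).
for entry in fs.get_paths(path=root_folder, recursive=False):
    if not entry.is_directory:
        continue
    # Inner loop: files inside each sub-folder (delete_pipeline's ForEach).
    for child in fs.get_paths(path=entry.name, recursive=False):
        file_name = child.name.rsplit("/", 1)[-1]
        if not child.is_directory and file_name not in req_files:
            fs.delete_file(child.name)  # delete anything not on the keep-list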

Azure Copy Files from AZDatalake to AZBlob storage Dynamic

I want to copy files from Azure Data Lake Storage to Azure Blob Storage. When I use a static file name it copies successfully, but it does not succeed when using a dynamic file name.
I have followed the same process as in the blog referenced below, but I am getting a String type conversion error.
ErrorMessage:
Error code: 2200
Failure type: User configuration issue
Details:
ErrorCode=UserErrorInvalidValueInPayload,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Failed to convert the value in 'fileName' property to 'System.String' type. Please make sure the payload structure and value are correct.,Source=Microsoft.DataTransfer.DataContracts,''Type=System.InvalidCastException,Message=Object must implement IConvertible.,Source=mscorlib,'
Reference Blog : https://www.c-sharpcorner.com/article/extract-file-names-and-copy-from-source-path-in-azure-data-factory/
Set the "WILD CARD" filtering in the copy activity.
Set the wildcard folder path : rootFolder/subFolder/*/*/DestinationFolder
Choose the wildcard file name = * or *.json or *.txt or *.csv.., etc according to the requirement.
Check the following link for detailed understanding :
https://learn.microsoft.com/en-in/azure/data-factory/connector-azure-data-lake-store?tabs=data-factory#folder-and-file-filter-examples
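As a rough local sanity check of what the two * levels in such a pattern stand for, here is a small Python sketch using pathlib (this only mimics one-folder-per-* matching for illustration; refer to the documentation linked above for ADF's exact wildcard rules):

from pathlib import PurePosixPath

pattern = "rootFolder/subFolder/*/*/DestinationFolder"

# In this local check, each * stands for exactly one folder level.
paths = [
    "rootFolder/subFolder/2023/01/DestinationFolder",     # two levels under subFolder
    "rootFolder/subFolder/2023/DestinationFolder",        # only one level
    "rootFolder/subFolder/2023/01/02/DestinationFolder",  # three levels
]
for p in paths:
    print(p, "->", PurePosixPath(p).match(pattern))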

How to delete files based older than specified date in Azure Data lake

I have data folders created on a daily basis in the data lake. The folder path is dynamic, coming from a JSON format.
Source Folder Structure
SAPBW/Master/Text
Destination Folder Structure
SAP_BW/Master/Text/2019/09/25
SAP_BW/Master/Text/2019/09/26
SAP_BW/Master/Text/2019/09/27
..
..
..
SAP_BW/Master/Text/2019/10/05
SAP_BW/Master/Text/2019/09/06
SAP_BW/Master/Text/2019/09/07
..
..
SAP_BW/Master/Text/2019/09/15
SAP_BW/Master/Text/2019/09/16
SAP_BW/Master/Text/2019/09/17
I want to delete the folders created more than 5 days ago, for each folder of SinkTableName.
So, in Data Factory, I have built the folder path in a ForEach loop as
#concat(item().DestinationPath,item().SinkTableName,'/',item().LoadTypeName,'/',formatDateTime(adddays(utcnow(),-5),item().LoadIntervalFormat),'/')
I need the syntax to delete the files in each folder based on the JSON.
I am unable to find a way to delete folder by folder, and to set up the Delete activity based on dates more than five days before now.
I see that you are doing a concatenation, which I think is the way to go. But I see that you are using the expression formatDateTime(adddays(utcnow(),-5)), which will give you something like 2019-10-15T08:23:18.9482579Z, which I don't think is what you want. I suggest trying #formatDateTime(adddays(utcnow(),-5),'yyyy/MM/dd'). Let me know how it goes.
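To see the difference between the two expressions, here is the same date arithmetic as a short Python sketch (purely illustrative; in the pipeline, the ADF functions adddays and formatDateTime do this work):

from datetime import datetime, timedelta, timezone

# Same date arithmetic as adddays(utcnow(), -5), done locally for illustration.
five_days_ago = datetime.now(timezone.utc) - timedelta(days=5)

# Without a format string you get a full timestamp, e.g. 2019-10-15T08:23:18.948257+00:00
print(five_days_ago.isoformat())

# With a 'yyyy/MM/dd'-style format you get a path fragment such as 2019/10/15,
# which can be appended to a prefix like SAP_BW/Master/Text/ to target the folder to delete.
print("SAP_BW/Master/Text/" + five_days_ago.strftime("%Y/%m/%d") + "/")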

how to access puppet variable in custom facts?

I am extracting a zip at a location; in the extracted folder directory there is an install.jar. I want to copy a file to the directory where install.jar is located. Now, the zip I am extracting may have a different folder structure every time, so I cannot use:
file { 'EXTRACTED_PATH/ant.properties':
  ensure => present,
}
So I wrote a custom fact that finds the path of the install jar, and I access its value in the manifest like this:
$install_jar_location = $::getinstallerpath
Now, in the fact file I have to give the path to search, and I want to pass this path as a parameter.
For that, I have declared a variable in the manifest file:
$installer_location = "/home/oracle/Installer"
How can I access it in my custom fact? I tried the following, but the $installer_location variable value comes up blank.
Facter.add(:getinstallerpath) do
  setcode do
    Facter::Util::Resolution.exec("echo \"$(dirname \"$(find $installer_location -name \*install*.jar)\")\" ")
  end
end
