What is the best way to extract .zip file/folder structure then commit to Github using Github API?

I'm trying to place some config files under version control using Github. The ONLY way to get these particular config files is via a GET request to the hosting server. The server responds with a .zip file.
The .zip file structure looks similar to:
|-Folder1
--|-File1.json
--|-File2.json
--|-Pic1.jpg
|-Folder2
--|-File3.json
--|-Folder3
----|-Pic2.jpg
|-File4.json
|-File5.json
As you can see, it's a mix of text and binary files, with a nested folder structure.
I need to:
1. Commit this .zip file to Github using the Github REST API (no problem there)
2. But BEFORE committing: how can I programmatically extract the folders and files, LEAVING THE FOLDER STRUCTURE IN PLACE?
3. Commit the results to Github
I've checked the questions/solutions mentioned here, but they don't quite match my case or are a few years old.
Anyone have a workflow for doing this? I have to imagine SOMEONE has come across similar requirements.

I ended up using Power Automate to handle this.
Basically:
1. GET the .zip file
2. Add it to OneDrive for Business
3. Extract it there (the folder structure is "flattened", but luckily OneDrive "keeps" it by renaming each file)
Example: Folder1/File1.json in the .zip becomes Folder1_File1.json once extracted
4. When committing to Github via the API, I just did a replace on each filename, turning the _ back into / as part of the path
Example URL: https://api.github.com/repos/SeaDude/replace(items('FOR_EACH_app')?['properties']?['DisplayName'], ' ', '-')/contents/replace(items('FOR_EACH_source_file_2')?['Name'],'_','/')
Apparently, you can specify a path that doesn't exist yet as part of an API commit. If the directory is missing, Github will create it!
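For anyone not using Power Automate, here's a minimal Python sketch of the same workflow: download the .zip, walk it in memory so the nested paths survive, then PUT each file through the Contents API. The owner, repo, token and download URL below are placeholders:

```python
import base64
import io
import zipfile

import requests

OWNER = "your-user"                          # placeholder
REPO = "your-repo"                           # placeholder
TOKEN = "ghp_..."                            # personal access token (placeholder)
ZIP_URL = "https://example.com/configs.zip"  # the hosting server's endpoint (placeholder)

headers = {"Authorization": f"token {TOKEN}",
           "Accept": "application/vnd.github+json"}

# Download the .zip and open it in memory; zipfile preserves the nested paths.
archive = zipfile.ZipFile(io.BytesIO(requests.get(ZIP_URL).content))

for name in archive.namelist():
    if name.endswith("/"):     # skip directory entries
        continue
    data = archive.read(name)  # raw bytes, works for JSON and JPG alike
    # PUT /repos/{owner}/{repo}/contents/{path} creates the file, and any
    # missing directories in the path, in a single commit. (Updating an
    # existing file additionally requires its current blob sha.)
    url = f"https://api.github.com/repos/{OWNER}/{REPO}/contents/{name}"
    resp = requests.put(url, headers=headers, json={
        "message": f"Add {name}",
        "content": base64.b64encode(data).decode("ascii"),
    })
    resp.raise_for_status()
```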

Related

Extracting data from depot files generated by Perforce

Many years ago I had a Perforce server hosting an Unreal Engine 4 project, but it's no longer active and I unfortunately don't have access to it. All I have left are some depot folders. There's a specific folder with a bunch of FBX files that I need access to, but each file shows as a folder named something like file.uasset,d or file.fbx,d, and within them are zip files.
Is there any way for me to convert these folders into actual FBX files? Any tools or anything out there? Or do I need a server to upload these onto a depot for Perforce to understand what to do with them? Any help would be greatly appreciated!
I've tried opening them in Perforce without a workspace or server and there wasn't much I could do with them.
If you have the server root folder (the one with the db.* files), you may be able to start up P4D and just access the depot normally. If you have a checkpoint file, you can use that to reconstruct the db.* files.
If all you have are the depot archives, you can unzip the files inside them (using regular old gzip or similar) to retrieve the original content. E.g. if you have file.fbx,d/1.1234.gz, you can unzip that and you'll have the content of file.fbx as of change 1234. Each gzip file is a complete revision on its own; you don't need to glue them together or anything like that.
Note that without the database (the db.* files), you may not be able to put together the exact original structure of the depot; the back-end archive files don't exactly correspond to the depot layout since archive files may be "lazy copied" to multiple locations in the depot.
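If it helps, here is a minimal Python sketch of that recovery step, under the layout described above (a file.fbx,d directory holding one gzip per revision, named like 1.1234.gz); the depot and output paths are placeholders:

```python
import gzip
import pathlib
import shutil

depot = pathlib.Path("DepotBackup")  # placeholder: folder containing the ,d dirs
out = pathlib.Path("recovered")

# Each "file.fbx,d" directory holds one gzip file per stored revision,
# e.g. 1.1234.gz is the content of file.fbx as of change 1234.
for rev in depot.rglob("*,d/*.gz"):
    original_name = rev.parent.name[:-2]  # strip the ",d" suffix
    change = rev.stem.split(".")[-1]      # "1.1234" -> "1234"
    target = out / rev.parent.relative_to(depot).parent / f"{change}_{original_name}"
    target.parent.mkdir(parents=True, exist_ok=True)
    with gzip.open(rev, "rb") as src, open(target, "wb") as dst:
        shutil.copyfileobj(src, dst)      # each gzip is a complete revision
```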

What is the best way to retrieve a single folder from a github repository in a python script?

I need to download a single folder from a github repository in a Python 3 script.
Listing all raw file URLs to download is tedious, and any new files would need to be added manually.
Downloading the whole repository as a zip takes rather long; there are lots of unneeded files. [This is how I do it at the moment]
I have read about web services that do what I would need, such as downgit.github.io, but the problem is that generating the relevant URL and fetching it via urllib.request.urlretrieve() downloads the website rather than the actual file.
What can I do? Is there a web service that provides raw file links that I can download as described above?
I just found a solution: the fsspec library may be used to download a single folder from a GitHub repository using Python. See https://sebastianwallkoetter.wordpress.com/2022/01/30/copy-github-folders-using-python/
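For reference, a minimal sketch of that approach using fsspec's GitHub filesystem; the org, repo and folder names below are placeholders:

```python
import fsspec

# fsspec ships a read-only "github" filesystem backed by the GitHub API.
fs = fsspec.filesystem("github", org="some-org", repo="some-repo")  # placeholders

# Recursively copy one folder from the default branch to a local directory.
fs.get("path/to/folder/", "local_folder/", recursive=True)
```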

How can I completely delete a file that was uploaded to Gitlab in an issue comment?

Someone uploaded (attached) a file in a Gitlab issue comment. They did not mean to share that file publicly. I can delete the comment, but the file is still available via the original direct url. The file is at:
https://gitlab.com/<username>/<repo>/uploads/<hash>/<filename>
Is there any way to completely remove files from this uploads directory?
Short version: There's server-side Uploads administration | GitLab, but little to nothing else.
TLDR:
For the owner of a repository, there seems to be no way to get hold of these uploads directly; there doesn't even seem to be a way to list all uploads pertaining to a specific repository (or user/owner), let alone modify them.
use-cases where this would be desirable:
deletion of data that should not be exposed but has been erroneously.
down-scaling of oversized files (images, PDFs, etc.)
replacing files with updated versions
deleting space-hogs that are no longer needed.
deleting files that got uploaded accidentally by trigger-happy mice or when the result of a previous upload didn't show in time for the impatient user.
Making these files changeable would cause several issues rooted in their current/previous immutable status:
Users aware of this status will frequently re-use the url to an already uploaded file for perusal in other issues, or the associated wiki (even across projects) to avoid duplication. Afaik, there is no such thing as a link-count for upload items, so deleting an item might result in orphaned references, and changing an uploaded file might render other references out-of-context.
It would solve the serious issue of leaked information, though. The only way I have found so far to remove a file is to send a prayer to the administrator of the GitLab server and ask them to take care of the uploads directory on the server, as described in Uploads administration | GitLab.

Unzip and Rename underlying File using Azure Logic App

Is it possible to rename an underlying file while unzipping using a Logic App? I am calling an HTTP activity to download a ZIP file. That ZIP contains only one underlying file with some value appended to the name. I want to store the unzipped file under a better name so that it can be used further. Is it possible?
Incoming ZIP File --> SAMPLEFile.ZIP
Underlying File --> SampleTextFile20200824121212.TXT
Desired File --> SampleTextFile.TXT
Suggestions ?
As far as I know, we can't implement this requirement directly in the "Extract archive to folder" action. We can only rename the file by copying it from one folder to another.
You can create a new ticket on the feedback page to ask the Azure team for this feature.
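If a bit of code is an option (for example an Azure Function called from the Logic App), the rename can happen during extraction. A minimal Python sketch, using the file names from the example above:

```python
import zipfile

# SAMPLEFile.ZIP contains a single file whose name carries a timestamp,
# e.g. SampleTextFile20200824121212.TXT; write it out under a fixed name.
with zipfile.ZipFile("SAMPLEFile.ZIP") as archive:
    member = archive.namelist()[0]  # the single underlying file
    with archive.open(member) as src, open("SampleTextFile.TXT", "wb") as dst:
        dst.write(src.read())
```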

Can we save files in local when developing custom functions?

I am following this link to try custom functions.
First I put customfunctions.js and customfunctions.html in my local folder, and then replaced https://<INSERT-URL-HERE> in the manifest file with their path: I tried \\SOFTTIMUR9FDC\Users\SoftTimur\tmp\EXCEL-CUSTOM-FUNCTIONS and \\Mac\Home\tmp\EXCEL-CUSTOM-FUNCTIONS\, but I could not see any application under SHARED FOLDER in Excel.
Then I put these 2 files on a website and replaced https://<INSERT-URL-HERE> in the manifest file with their HTTPS address. Now it worked: I could see the application under SHARED FOLDER in Excel and the custom functions worked.
So is this expected? In other words, when we test custom functions, can we not keep these files LOCAL; do we have to host them on a website?
PS: when we develop a normal Excel add-in, there is no problem saving the source files locally.
It's possible to host your customfunctions.html file locally, yes. From your description, it sounds like there's something wrong with how you're deploying the manifest, or with the manifest itself. Verify that your only change was the URL for those files, and that it worked properly otherwise.
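One way to rule out the hosting side is to serve the two files locally over HTTPS (Office add-ins generally require HTTPS). A minimal Python sketch, assuming you already have a self-signed certificate the machine trusts; cert.pem and key.pem are placeholders:

```python
import http.server
import ssl

# Serve the folder containing customfunctions.js / customfunctions.html
# at https://localhost:8443 for local testing.
server = http.server.HTTPServer(("localhost", 8443),
                                http.server.SimpleHTTPRequestHandler)
context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.load_cert_chain("cert.pem", "key.pem")  # placeholder self-signed cert
server.socket = context.wrap_socket(server.socket, server_side=True)
server.serve_forever()
```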
