GitLab copy/paste upload folder - gitlab

I'm creating a wiki in GitLab. I copy/pasted an image into a page, and it automatically uploaded the image and referenced it like this:
![image](uploads/84329e7811b5d2efb31b764c4767770d/image.png)
How do I access these uploads via the web browser to update or manage them? I've tried the documentation, but it just goes on about default physical locations accessed via a shell, which I don't have access to (this is a private GitLab installation).
Also, does anyone know if this is a permanent location, or something that gets wiped (e.g. after a server restart)?
I've tried all variations of 'uploads' in my URL.
Thanks.

Related

Kentico Update Media Library Direct Path to Azure

We have moved the media storage to Azure. Any new files are successfully uploaded to Azure and show the correct URL,
but how do I change the direct path of the old files? I have already uploaded all the files to Azure; I just need to know how to update the direct path of the old files.
You need to either change the links with the old direct-path URLs manually, or create a script which checks the DB tables and changes the URLs. Regrettably, there is no tool or feature available out of the box to do this.
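A minimal sketch of such a script, assuming SQL Server and a Node.js runtime with the mssql package; the table/column names and the URL prefixes below are placeholders, not Kentico's actual schema, so adjust them to wherever your old links are actually stored:

// One-off sketch (not a Kentico feature): rewrite old direct-path media URLs
// to the new Azure blob URLs. "YourContentTable"/"YourHtmlColumn" are
// placeholders - inspect your own DB to find where the old links live.
const sql = require('mssql');

async function rewriteMediaUrls() {
  const pool = await sql.connect({
    server: 'localhost',
    database: 'KenticoCMS',          // placeholder connection details
    user: 'db_user',
    password: 'db_password',
    options: { trustServerCertificate: true }
  });

  const result = await pool.request()
    .input('oldPrefix', sql.NVarChar, '/MySite/media/')                                   // old direct path (placeholder)
    .input('newPrefix', sql.NVarChar, 'https://myaccount.blob.core.windows.net/media/')   // new Azure URL (placeholder)
    .query(
      "UPDATE YourContentTable " +
      "SET YourHtmlColumn = REPLACE(YourHtmlColumn, @oldPrefix, @newPrefix) " +
      "WHERE YourHtmlColumn LIKE '%' + @oldPrefix + '%'"
    );

  console.log('Rows updated:', result.rowsAffected[0]);
  await pool.close();
}

rewriteMediaUrls().catch(console.error);

Back up the database and test on a copy first; which tables are involved depends on where editors inserted the old media links.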

Node Webkit Desktop App - Browser default caching of PDF files

I have built a desktop app using node-webkit and need to cache PDF files that are viewed via the app when online so that they are also available offline. I haven't found a solution yet, but during testing I noticed that files I had previously viewed online were available offline, even though I hadn't written any code for this yet. Therefore these must already be cached automatically. I did a search to find where exactly the files are being saved but couldn't find anything.
Can anyone explain this, or point me in the direction of information on it, so that I understand how it works and can make sure my app utilises the browser's default caching behaviour?
UPDATE:
I have found a solution to store the PDFs locally; however, this isn't my question. I am looking for an explanation as to HOW the PDFs are available offline without the code I have written. The files must automatically be stored somewhere, otherwise how would they display?
The default caching behavior of node-webkit is controlled by the page-cache property in package.json:
"webkit": {
  "page-cache": true
},
Only typical web resources can be cached this way (scripts, style sheets, etc.). To be able to view PDF files offline, you can store them manually.
There are several ways to do that:
Save a file directly to disk (the simplest solution: just store the files in App.dataPath; a short sketch follows below)
Use a database
Use Web Storage
Use the application cache
All of these are documented here: Save persistent data in app
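As a rough illustration of the first option (a sketch only; the URL, folder and file names below are placeholders, not part of your app), a node-webkit app can stream a PDF into App.dataPath with the standard fs module:

// Sketch: download a PDF once and keep it under node-webkit's per-app
// data directory (gui.App.dataPath) so it can be opened while offline.
var fs    = require('fs');
var path  = require('path');
var https = require('https');
var gui   = require('nw.gui');                  // node-webkit GUI/App module

var pdfDir = path.join(gui.App.dataPath, 'pdf-cache');
if (!fs.existsSync(pdfDir)) fs.mkdirSync(pdfDir);

function cachePdf(url, fileName, done) {
  var target = path.join(pdfDir, fileName);
  if (fs.existsSync(target)) return done(target); // already cached locally
  var file = fs.createWriteStream(target);
  https.get(url, function (res) {
    res.pipe(file);
    file.on('finish', function () {
      file.close(function () { done(target); });
    });
  });
}

// Usage with a placeholder URL: view the local copy whether on- or offline.
cachePdf('https://example.com/docs/manual.pdf', 'manual.pdf', function (localPath) {
  window.open('file://' + localPath);
});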
The default location for your app's cached files is determined by your package.json manifest file. When the app is initialized, the settings in your manifest file are loaded by default. Since cached files cannot be accessed programmatically, you can overwrite the default files manually.
To get the application's data path in the user's directory on Windows (it is derived from what you wrote, in JSON format, in your package.json):
Windows: %LOCALAPPDATA%/
You can read about other cache methods in node-webkit's documentation:
http://docs.nwjs.io/en/latest/References/App/#appclearcache
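For example, the App API linked above exposes clearCache() if you ever need to wipe that cache from app code (legacy node-webkit style shown; in current NW.js the same method hangs off nw.App):

// Sketch: clear node-webkit's built-in HTTP cache (the one page-cache uses).
var gui = require('nw.gui');   // legacy node-webkit API
gui.App.clearCache();          // clears the HTTP cache in memory and on disk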

Kentico v7 - Disable 'GetAzureFile' Permanent URL

I'm working on a site in Kentico v7, but I have a problem with the images stored in the media folder. I was trying to get the direct URL of an image in the folder from the CMS, but the link the CMS displays uses the page "GetAzureFile.aspx" to serve the image. I verified in Site Manager -> Content -> Media -> General that the option "Use Permanent URL" is disabled, but the problem still appears.
Any insights would be greatly appreciated!
The Azure projects always use blob storage to store newly uploaded files. This is because the only files physically available in the file system are the ones that were deployed with the project; when an Azure instance restarts, it loses its local file system and only the deployment package is restored on the new instance.
As media library content may change on the fly, Kentico uses GetAzureFile links for all files so that it can serve them regardless of their storage.
You can, however, use hardcoded links directly to the file system for files that were part of the deployment package, e.g. the ones you use for site design (stylesheets, theme images and similar assets deployed with the project).

Unable to upload files; file upload modal displays CP Home

I am tearing my hair out over a weird file upload issue that I have never run into before. For some reason I'm unable to upload images via the file manager (both in the file manager itself and when I upload with a custom field using the "file" fieldtype). Strangely, if I add files directly to any of the file upload directories and sync the files, everything works fine.
After selecting the file and hitting "upload file" (see 01_choose_file.jpg), the modal window displays the CP homepage in an iframe (see 02_upload_progress.jpg).
Has anyone else seen this? Does anyone know how I can start troubleshooting this?
Background Info:
I’m running EECMS v2.5.2 - Build Date: 20120606 in MAMP (only 2 out of the 15 sites I have set up locally are not working)
I have tried uploading images/files using the latest versions of Chrome, Chrome Canary, Safari, and Firefox (OS X 10.7.5)
This issue shows up only on the two latest sites I’ve started dev’ing locally, and on no other site (locally or otherwise)
Things I’ve done:
Checked Apache/PHP error logs; they don’t show anything
Confirmed file upload paths and file upload directory settings are correct – I can sync files that I manually move into the various file upload directories
Permissions are fine; image manipulations and thumbnail creation work fine if I manually add files to the upload directories
Tested various other 2.5.2 installs I dev on locally and they work fine (settings on these two new sites are identical to sites that work)
Only a handful of native add-ons are enabled
Setting “Apply XSS Filtering to uploaded files?” to Yes or No does not make a difference
Huge thanks for any help!
I can't post images so here are links to the images:
01_choose_file.jpg: http://expressionengine.com/?ACT=51&fid=105&aid=16264_Jiof3p0V1gfEEFrpC55G&board_id=5
02_upload_progress.jpg: http://expressionengine.com/?ACT=51&fid=105&aid=16265_mjGH02xK2fIFZJI6kruP&board_id=5
I have sorted this out. I went back through to make sure I had disabled all third-party add-ons, and I had forgotten to uninstall the "Quickee" extension (http://devot-ee.com/add-ons/quickee). For now that seems to be the culprit.
I've submitted the bug to Matt (the developer) and it should be patched up soon.
The ExpressionEngine file manager sends out an AJAX POST request to the following URL:
http://YOUR_ADMIN_CP_URL?S=0&D=cp&C=content_files_modal&M=upload_file
Have you tried loading that URL yourself? You should get a page like this.
But maybe EE is trying to POST to a different URL. You can find it by uploading a large file and, while it's uploading, using Firebug: in the Network tab, at the bottom of the list, you will find the URL EE is posting to.

How to set up IIS 7 with a physical path pointing to Dropbox?

I'm using multiple computers for development and I want to be able to store my files in my Dropbox folder. I went to change the physical path in IIS from c:\inetpub\wwwroot to the Dropbox folder, but I get this error:
The requested page cannot be accessed because the related configuration data for the page is invalid.
I couldn't find the config file, so I was wondering if anyone had done this before, or whether there is a better way to sync everything nicely across several PCs?
I tried it (IIS 7.5, Win 7) and it works just fine to point the physical path of your site at your Dropbox folder. My guess is that your web.config file contains malformed XML (see KB942055).
I'd suggest mapping it to an empty folder containing just an index.html file and seeing if the error still occurs.
As a workaround, I guess you can put Dropbox in your wwwroot folder and set up a virtual directory that points to Dropbox. However, there are some security issues that may hinder you from doing so. I came across a nice tutorial on how to set up Dropbox on IIS as FTP publishing. Hope it helps.
Hodgin's guide on using Dropbox as FTP publishing.
