Bit of a loose question, so if it gets marked down I'll remove it, but:
I'm using PrimeFaces/Spring/Hibernate for a Java server.
My application knows a load of file names I need to upload. Those files are on my local computer. Is it possible to tell the application the root directory of these files, so that it can then set up uploads for each of these files without me needing to browse for each file individually?
I assume this is a browser security issue, i.e. the user needs to explicitly state which files the application is allowed to know about, etc.?
If not, I'll have to do it in a local application, but I was hoping there was a way a mass upload could be kicked off from the browser by just setting the local directory of the files.
I decided to use the PrimeFaces uploader: upload all the files in the directory and let the application sort them out once it has them on the server.
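For reference, a minimal sketch of what the listener behind the PrimeFaces uploader can look like. This assumes PrimeFaces 8+ (where UploadedFile lives in org.primefaces.model.file) and a made-up target directory /srv/uploads; register the bean however your Spring/JSF setup manages beans:

    import java.io.IOException;
    import java.io.InputStream;
    import java.io.Serializable;
    import java.io.UncheckedIOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.nio.file.StandardCopyOption;

    import org.primefaces.event.FileUploadEvent;
    import org.primefaces.model.file.UploadedFile;

    // Backing bean for <p:fileUpload mode="advanced" multiple="true"> with this
    // method set as the upload listener (the attribute is named listener or
    // fileUploadListener depending on the PrimeFaces version).
    public class UploadBean implements Serializable {

        // Hypothetical server-side target directory; adjust to your environment.
        private static final Path TARGET_DIR = Paths.get("/srv/uploads");

        // Called once per selected file; copies the upload into the target
        // directory so the application can sort the files out afterwards.
        public void handleFileUpload(FileUploadEvent event) {
            UploadedFile file = event.getFile();
            try (InputStream in = file.getInputStream()) {
                Files.createDirectories(TARGET_DIR);
                Files.copy(in, TARGET_DIR.resolve(file.getFileName()),
                        StandardCopyOption.REPLACE_EXISTING);
            } catch (IOException e) {
                throw new UncheckedIOException(e);
            }
        }
    }

In real code the client-supplied file name should be sanitized before being used to build the target path.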
I've created a VueJS app for file uploading, which will be my admin panel (for a CMS), with NodeJS as the back-end. Now I want to take the files that were passed to NodeJS and move them to another VueJS app, which is going to be my primary website, so that I can access the files locally. How can I do it? Any suggestions or a different approach will do.
You have many options on how to move the files to another site.
You could store them in a shared bucket or a shared directory between your two back-ends, or you could add another route for downloading the files.
You could configure a cronjob to scp or rsync those files to your target machine.
This is really more a question of how to sync a directory to somewhere else.
I created a Spring Boot web project and have already deployed it to the server (CentOS 7).
Currently, images uploaded to the server are stored inside the static package in the jar file.
This makes the jar file very large and hard to edit.
Can someone give me an idea of how to store the images somewhere else on the server, and how to reference a picture that lives outside the jar from the HTML?
First of all, you have to decide in which directory you are going to store your files, and create it:
mkdir /path/to/your/dir
Then assign the newly created directory to your application user:
chown <your user>:<users group> /path/to/your/dir
Then don't forget to give read/write permissions on the newly created directory to the user under which you run your app, and to deny access to everyone else:
chmod 700 /path/to/your/dir - the owner of a directory needs the execute (search) bit in order to traverse it, so 700 lets your app read and write files in the directory while denying any access to other users (for security reasons).
Then just replace the path you already have with the new one (pointing at the newly created directory).
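As for the part of the question about referencing a picture that lives outside the jar from the HTML: a common approach with Spring Boot is to map the external directory onto a URL path with a resource handler. A minimal sketch, assuming the /path/to/your/dir created above and a made-up URL prefix /images/**:

    import org.springframework.context.annotation.Configuration;
    import org.springframework.web.servlet.config.annotation.ResourceHandlerRegistry;
    import org.springframework.web.servlet.config.annotation.WebMvcConfigurer;

    @Configuration
    public class StaticResourceConfig implements WebMvcConfigurer {

        @Override
        public void addResourceHandlers(ResourceHandlerRegistry registry) {
            // Requests to /images/** are served straight from the external
            // directory, so uploaded pictures no longer have to live inside the jar.
            registry.addResourceHandler("/images/**")
                    .addResourceLocations("file:/path/to/your/dir/");
        }
    }

The HTML can then reference an uploaded picture as <img src="/images/whatever.png">, and the upload code writes into /path/to/your/dir instead of into the jar's static package.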
Please, be aware that there is a lot of security stuff to consider when you're going to store files on your server.
By the way, please consider reading about different storage options like AWS S3, Google Cloud Storage and Ceph.
Please take into account that if you're going to store your files on your own server, then you have to look after them yourself (for example: keep an eye on disk space, make sure you have mirroring across disks, and so on). With AWS S3, for example, you don't need to worry about any of that, and it's very cheap.
I am developing an open-source application using Node.js and Electron that should mount a WebDAV share to a local drive letter, just like NetDrive and WebDrive. At present my application downloads all the files from the WebDAV share, which takes a lot of time and is not reliable for heavy data. Is there another approach so that whenever the user accesses a file, only that particular file is fetched from the WebDAV share? I've tried displaying a dummy metadata structure of the files in a directory and keeping that directory under a file watcher, so that when the user tries to open a file the watcher captures the file-open event, tells me which file the user is trying to access, and a background service is triggered to fetch that particular file using the file path as a reference. But none of the watchers I tried were able to capture the file-open event. Is there another approach? Correct me if I am going in the wrong direction.
Thanks
I think you want a virtual file system, and I recommend the Dokan library.
Dokan is the starting point for Windows virtual file system applications.
Open source: Dokan (https://en.wikipedia.org/wiki/Dokan_Library)
Commercial: EldoS CBFS (https://www.eldos.com/cbfs)
Google and Naver use Dokan; NetDrive and RaiDrive (mine) use CBFS.
We have to fix some security vulnerabilities in our system, and one of the items is to disable execution of scripts/EXEs uploaded through the file upload control.
We have an Excel upload facility. Let's say, hypothetically, a hacker renames an .exe to .xls and uploads it (there are ways to block that, but ignore that for now). Also assume that:
the upload folder is within the public directory from which the website is served in IIS, OR
someone can access that file by specifying the full path of the file through some API endpoint the hacker is aware of.
Now, given that there is an exe or a script which is accessible to the hacker through the above means, is it possible for the hacker to run that script/exe in some way, so that it can cause harm to the server where the site is hosted?
I am not really a security expert, hence I can't think of ways this could be possible. How can a hacker remotely run an exe/script on the server, given that they do not have any access to the server?
One of the things that you should definitely do is remove the script-execution permission from the IIS handlers; otherwise anybody can upload an ".asp" or an ".aspx" or any other script-engine file and then execute it by requesting it. One simple way to test that is to create a "test.asp" file containing "<%= Now() %>"; if requesting it returns the date, then anybody can upload scripts and run them on your server.
The way to disable that in IIS 7+ would be to add a configuration file in a parent directory and edit the permissions for handlers. For example, assuming a child folder called "public", you can drop in the following web.config to disable that:
<configuration>
  <location path="public">
    <system.webServer>
      <handlers accessPolicy="Read" />
    </system.webServer>
  </location>
</configuration>
You can then test that it no longer executes the file and instead blocks it. If you want to allow downloading of those files, you'll need to configure the static file handler (and request filtering) to handle everything instead, but make sure you do that for that folder only, since you don't want people downloading your source code.
Running the script would require remote access to the server, either directly or by exploiting some bug in the website code (similar to SQL injection). The risk here is mostly in hosting malware, especially if you allow user uploads to be downloaded by other users. While getting malware onto a machine is not as simple as just renaming an executable to another file type (it still has to be run as an executable rather than opened as an Excel spreadsheet, for instance, to be able to function), it is possible to embed malware in various types of files such that the act of opening the file causes execution of the malware. In that sense, you really can't tell at a glance whether a file is malware or not. It could look like an Excel file, even open up properly in Excel, and still wreak havoc. The only way to be safe is to scan all user-uploaded files with a good anti-malware application.
As far as running something remotely goes, though, the access to the server required to run the script would provide a much better avenue for mischief than your upload form anyway. So anyone who could manage that kind of access isn't going to be trying to exploit you through your upload form, and anyone who uploads something malicious without that access can't really do anything with it.
So, I decided to try to break my website... I googled my site by typing in site:mysite.com/whatever and, behold, all of the users' uploaded files were available for viewing under a specific directory.
What kind of script/countermeasure should I use to block these files from being viewed? I already have a script that checks the path and the logged-in status; however, this doesn't seem to be working. I've looked all over for solutions... but I can't quite find one. I'm using ColdFusion 8.
This isn't a ColdFusion issue so much as a web server configuration issue.
You should either:
configure your web server not to show a directory of files when using a URL without a filename (e.g., http://www.example.com/files/)
drop a blank default web document (index.html, index.htm, default.htm, index.cfm, whatever) into that directory so that it displays that document rather than the list of files. If you use index.cfm, it'll fire your Application.cfm/cfc in your file path and use whatever other security you've built.
(or, better, do both)
The best way to secure your file listings and the files themselves is to store them in another folder outside of the Web site root folder. You can then serve them up using CFDIRECTORY and CFCONTENT. The pages that display the files can check your access controls and only serve the files to those allowed to see them.