I'm using FileZilla to view both Local files (Rebex Tiny Sftp Server) and remote SFTP files.
My question is:
We can view local (non-SFTP) files like file:///Z:/AzureConsole/ in any browser.
Can we view files the same way with an SFTP URL?
Generally not. The SFTP protocol is significantly different from HTTP (used by browsers), so there is no browser support for it.
In some cases you can view the same files over HTTP (if the SFTP files are also available in the HTTP root of that server), but generally your best bet is to copy the files locally and open them there.
Related
This may seem like an odd/broad question, but how does a server know not to expose Express.js files and their content, similar to how anyone can see a JavaScript file and read the script being executed? Do node servers like Heroku protect them? Sorry, just new to Express and Node. Is it similar to how PHP syntax/scripts are hidden and protected on an Apache server?
It depends on the server configuration. On a poorly configured server, the .js files might be accessible.
With a nodejs/expressjs server you define a base folder that contains the public files, e.g. public, and files outside of that public folder are not visible, because the server doesn't serve them to the outside. If you configure the wrong directory, e.g. ., then the expressjs code files would be available to browsers and served as-is to them, potentially revealing sensitive data like configuration, passwords and so on. Since the default configuration and all code examples make sure that public is defined as the public folder, the risk of accidental misconfiguration is low.
If you run an apache httpd or other webserver on the same host, you have to make sure that the node application is not inside the webroot of any vhost, otherwise the files might also be visible, because to the apache httpd they also look like simple static files, ready to be sent as-is to the browser.
It is different from PHP files, at least in the case of apache httpd or nginx, because those are usually configured so that PHP files are files to be executed, not static files to be served to the outside. However, if the apache httpd or nginx doesn't know about PHP, either because it isn't installed or isn't configured, then PHP files inside the webroot would also be shown to the public as-is. Display of files for the apache httpd can be prevented using .htaccess files.
I have recently launched a website on GoDaddy hosting. I have kept some images and JavaScript files used in the website in separate folders. I want to prevent users from browsing those images and files by simply appending the folder and file name to the website URL. For example
www.example.com/images/logo.png
If I understand correctly, you want to have an HTML file with images that shouldn't be accessible on their own? If so, then it cannot be done. You can check for a correct HTTP Referer header, but it can be easily faked, and it also makes the images inaccessible for browsers that don't send the referrer or have it disabled for privacy reasons.
If you want the files to be accessible only by server-side scripts or via FTP/SCP, then you can try to use .htaccess (if GoDaddy runs on Apache) with the correct configuration: https://httpd.apache.org/docs/2.2/howto/access.html
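If the GoDaddy plan runs Apache and allows .htaccess overrides (an assumption), a sketch of a per-folder rule that blocks direct browser access while leaving server-side scripts able to read the files from disk:

```apache
# images/.htaccess -- deny direct HTTP access to everything in this folder.
# Use ONE of the two blocks, depending on your Apache version.

# Apache 2.4 syntax:
Require all denied

# Apache 2.2 syntax (matches the linked 2.2 documentation):
# Order deny,allow
# Deny from all
```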
Another way could be hiding those files and creating a one-shot token like this:
<img src=<?pseudocode GEN_TOKEN("file.jpg") ?> /> with another script serving these hidden files only for a generated token, then deleting the token from the DB. Nevertheless, this will not prevent anybody from downloading or accessing these files if they really want to...
But, anyway, try to clarify your question better...
If you are keeping images/files in a folder which is open to the public, I guess you kept them in that folder on purpose: you want the public to access those images and files.
How would the public know the image file names? Disable directory listing for your web site.
I am not aware which language you are using on the web server, but in ASP.NET you may write a module/middleware which can intercept incoming requests and, based on your logic (e.g. authentication and authorization), restrict access. All modern languages support this kind of functionality.
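The same intercept-and-check pattern in an Express-style middleware (a plain function, so no framework is required to follow it; `req.user` being set by an earlier authentication step is an assumption):

```javascript
// Guard a folder of private files: only authenticated users get through,
// everyone else receives 403 before any file is read.
function requireAuth(req, res, next) {
  if (req.user) {
    next();               // authenticated: let the static handler serve the file
  } else {
    res.statusCode = 403;
    res.end('Forbidden');
  }
}

// With Express this would be mounted in front of the static handler, e.g.:
//   app.use('/private', requireAuth, express.static('private-files'));
```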
I have used p:fileUpload to upload an image. I don't really need to upload the image to the server; I just need to get the full local path (i.e. c:/.../../..) of the file (image) as saved on the local disk. I tried, but I just got the filename with the extension. This is a web application which is used locally, so both server and client are on the same machine. The path needs to be saved in the database.
For security reasons, browsers don't send the full client-side file path. They only send the file contents and the file name. Ancient browsers and MSIE are the only browsers that expose the security bug of still sending the full client-side file path along with the file upload. You should not be relying on this security bug in your application.
You're supposed to grab the file contents in the form of an InputStream or byte[] and immediately write it to a more permanent storage location yourself via FileOutputStream, or perhaps via a @Lob to a BLOB column in the DB. If necessary, you can use File#createTempFile() to autogenerate a unique filename.
Note that a local disk file system path can't represent a valid HTTP URL which the client could use to obtain the file. Browsers like Firefox refuse to serve file:// URLs when the initial webpage itself is opened via http:// instead of file://. So you really need to serve those uploaded files back via a web server. It's recommended to store only the file name (not the full path!) in the DB. You can then configure the webserver to publish a certain folder to the web, or create a simple servlet to serve a certain folder to the web.
See also:
How to save uploaded file in JSF
How to convert Part to Blob, so I can store it in MySQL?
Load images from outside of webapps / webcontext / deploy folder using <h:graphicImage> or <img> tag
I have one Linux server for my website, which contains the PHP code, database and files.
Files are uploaded and downloaded by end users for their individual tasks. My website is working fine, but as the website evolves the volume of these files will increase, so my webserver will be overloaded.
So I want to use a separate server for the files, so that the burden on the one webserver decreases and files are uploaded and downloaded from the other server.
Can anyone suggest the best way to achieve that? I know the files can be transferred to another server by the FTP functions of PHP just after uploading through the website, but that doesn't seem the correct way.
option 1 (simple): have the upload form post to the second server rather than handling the upload with the server that runs the rest of the application
option 2 (in most cases the wrong approach): have the receiving script on server 1 store the file in the DB, and have server 2, when a download is requested, pull it from the DB and cache it locally.
option 4 (my favorite): put a reverse proxy (for example Varnish) on server 1, let the application run on server 2, and have the proxy cache static files, so the reverse proxy will handle the downloads (and other static files like images, javascript etc.) if available. This also allows a few other tricks to improve your performance (like caching pages that can be cached). https://www.varnish-cache.org/
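A minimal sketch of the Varnish side of this setup (VCL; the hostname, extension list and TTL are assumptions to adapt):

```vcl
vcl 4.0;

# server 2 runs the application; Varnish on server 1 sits in front of it
backend default {
    .host = "server2.example.com";
    .port = "80";
}

sub vcl_recv {
    # strip cookies from requests for static files so they are cacheable
    if (req.url ~ "\.(jpg|png|gif|css|js|zip|pdf)$") {
        unset req.http.Cookie;
    }
}

sub vcl_backend_response {
    # keep static files in the cache for a day (tune to your needs)
    if (bereq.url ~ "\.(jpg|png|gif|css|js|zip|pdf)$") {
        set beresp.ttl = 1d;
    }
}
```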
Where would you store files that are meant for sale on an e-commerce website?
Somewhere out of htdocs/wwwroot/etc. You don't want anyone to link to them directly. You should have a page/script that can read that location and send the file back.
On a secure server in a network zone that is not directly accessible from the internet. Your webserver can then access and retrieve files only for authorised users.
Rule of thumb: Not in htdocs (i.e. not accessible from the internet).
What do you want to do with those files? Offer them for download after a customer has paid? You should manage the credentials in a server-side script (e.g. a PHP script) and give that script access to the file.