Setup:
Using Dreamweaver CS 5.5 / Windows 7
I have turned off 'Preview [in browser] using temporary files'
I am noticing, however, that every CSS and JS file, as well as some HTML and PHP files, has temporary (.tmp) files being created for it.
These files are also displayed in my Files window, and they get pushed to the remote server when I synchronize the local and remote sites.
How can I stop these .tmp files from being created? Or, if that is not possible, how can I hide them and keep them from syncing to the remote server?
For anyone else who runs into this: disable Design Notes from Manage Sites > Advanced Options.
Windows 11 (Home Single Language 21H2, OS build 22000.1281) deletes JS files from the project's folders every time a System Restore Point is used.
It's so frustrating to find out that you are left with only an index.html and the CSS files.
Is there any way to stop that?
The answer from the Microsoft community:
Vista's list of monitored extensions is hardcoded and there is no way to change it.
I have connected my Windows 7 machine via WinSCP (SFTP) to a Cubietruck that runs a Debian Jessie server.
I need to watch, on Windows, the continuously updated log of an app installed on the Linux server.
With WinSCP I can see this log file. I set WinSCP to refresh the remote panel every 10 seconds. I tried to open the log with the following editors:
1) Sublime Text with an auto-refresh plugin
2) Notepad++ with 'Update silently' set
3) Glogg
Unfortunately, even though the remote panel refreshed and I could watch the log file grow in size, the file opened in the editors was not updated.
I also tried the 'Keep local directory up to date' feature, which creates a replica of the file in a local directory on Windows. When I had this file open in the above editors and the remote log file changed, a new file was created on the local disk, but I still did not succeed in watching the updated log.
Does anyone know a solution to this issue, or a working alternative?
You can try klogg. It is a fork of glogg. glogg opens files in a way that may prevent other programs from accessing them, as described here. That has been fixed in klogg (see the issue for details).
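If neither editor works out, another fallback is to poll the log over SFTP yourself and fetch only what has been appended since the last check. Below is a minimal Python sketch using paramiko; the host, credentials, and log path are placeholders, not values from the question, and in practice you would want key-based authentication.

    # Minimal sketch: poll a remote log over SFTP and print newly appended data.
    # Host, credentials, and path are placeholders; adjust for your setup.
    import time
    import paramiko

    HOST = "cubietruck.local"          # hypothetical hostname
    USER = "user"                      # hypothetical user
    PASSWORD = "secret"                # better: key-based auth
    LOG_PATH = "/var/log/myapp.log"    # hypothetical log path

    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(HOST, username=USER, password=PASSWORD)
    sftp = client.open_sftp()

    offset = 0
    try:
        while True:
            size = sftp.stat(LOG_PATH).st_size
            if size < offset:
                offset = 0             # log rotated or truncated; start over
            if size > offset:
                f = sftp.open(LOG_PATH, "r")
                f.seek(offset)
                print(f.read().decode("utf-8", errors="replace"), end="")
                f.close()
                offset = size
            time.sleep(10)             # matches the 10-second refresh mentioned above
    finally:
        sftp.close()
        client.close()

The saved offset is the important part: only the bytes appended since the last poll are transferred, which is exactly what the editors were failing to pick up from the refreshed or replicated copy.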
Bit of a loose question, so if it gets marked down I'll remove it, but...
I'm using PrimeFaces/Spring/Hibernate on a Java server.
My application knows a load of file names I need to upload. Those files are on my local computer. Is it possible to tell the application the root directory of these files, so that it can then set up uploads for each of them without me needing to browse for each file individually?
I assume this is a browser security issue, i.e. the user needs to explicitly state which files the application is allowed to know about?
If not, I'll have to do it in a local application, but I was hoping a mass upload could be kicked off from the browser just by setting the local directory of the files.
I decided to use the PrimeFaces uploader, upload all the files in the directory, and let the application sort them out once it has them on the server.
The question:
Is there any way to "watch" specific folders on my workspace for new files and automatically download them to my local project folder?
I would prefer a solution using only PhpStorm, if that's possible, but I am also fine with a Linux one!
The situation:
I work with PhpStorm 2016.1.1 on Windows 8.1 on several different projects. Some of these projects are developed using Laravel, a very nice PHP framework.
All of my projects are cloned via Git to an openSUSE workspace server on my LAN.
I import every project by using the "Create Project from Existing Files" functionality and choosing the option "Files are accessible via network share or mounted drive".
I created the mounted drive using Samba.
As long as I keep developing in PhpStorm, everything works like a charm. Saved files are uploaded to my workspace automatically so I can debug my PHP projects in the browser very easily.
The problem:
Laravel offers a very nice command line tool to use called "artisan". This tool can, amongst other functionality, create specific classes for your projects like events, jobs, migrations, seeds, and so on.
These files created on the command line are, of course, not visible to me in PhpStorm, because they are not in my local project folder until I manually download them from my workspace.
I do not know if it will help you, but there is a PhpStorm ticket for a similar feature: WI-1284
It is about six years old, so I don't think this is coming soon. Perhaps there is another solution for it.
This could help with synchronization with a remote host: configuring-synchronization-with-a-web-server
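If PhpStorm's built-in synchronization does not cover the remote-to-local direction well enough, a scripted fallback outside PhpStorm is also possible. The following is a rough Python sketch that polls one remote folder over SFTP and downloads any regular file that does not yet exist locally; the host, credentials, and both paths are assumptions for illustration, and subfolders would need recursion.

    # Rough sketch: poll one remote folder over SFTP and download files
    # that are missing locally. Host, credentials, and paths are placeholders.
    import os
    import stat
    import time
    import paramiko

    HOST = "workspace.local"                     # hypothetical workspace server
    USER = "dev"
    PASSWORD = "secret"                          # better: key-based auth
    REMOTE_DIR = "/srv/projects/myapp/app/Jobs"  # hypothetical artisan output folder
    LOCAL_DIR = r"C:\projects\myapp\app\Jobs"    # hypothetical local project folder

    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(HOST, username=USER, password=PASSWORD)
    sftp = client.open_sftp()

    try:
        while True:
            for attr in sftp.listdir_attr(REMOTE_DIR):
                if not stat.S_ISREG(attr.st_mode):
                    continue                     # skip subdirectories
                local_path = os.path.join(LOCAL_DIR, attr.filename)
                if not os.path.exists(local_path):
                    sftp.get(REMOTE_DIR + "/" + attr.filename, local_path)
                    print("downloaded", attr.filename)
            time.sleep(5)
    finally:
        sftp.close()
        client.close()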
So, I decided to try to break my website... I googled my site by typing in site:mysite.com/whatever and, behold, all of the users' uploaded files were available for viewing under a specific directory.
What kind of script/countermeasure should I use to block these files from being viewed? I already have a script that checks the path and the logged-in status; however, this doesn't seem to be working. I've looked all over for solutions, but I can't quite find one. I'm using ColdFusion 8.
This isn't a ColdFusion issue so much as a web server configuration issue.
You should either:
configure your web server not to show a directory listing when a URL without a filename is requested (e.g., http://www.example.com/files/)
drop a blank default web document (index.html, index.htm, default.htm, index.cfm, whatever) into that directory so that it displays that document rather than the list of files. If you use index.cfm, it'll fire your Application.cfm/cfc in your file path and use whatever other security you've built.
(or, better, do both)
The best way to secure your file listings and the files themselves is to store them in another folder outside of the Web site root folder. You can then serve them up using CFDIRECTORY and CFCONTENT. The pages that display the files can check your access controls and only serve the files to those allowed to see them.
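The serving page in this stack would be CFML (CFCONTENT after your login check), but the pattern itself is stack-agnostic: keep the upload folder outside the web root and stream a file only after the access check passes. As a generic illustration only (shown in Python/Flask, with made-up names and paths, not ColdFusion code):

    # Generic illustration of the pattern: uploads live OUTSIDE the web root and
    # are streamed only after an access check. (The original stack is ColdFusion 8,
    # where the same flow would use Application.cfm/cfc plus CFCONTENT.)
    from flask import Flask, abort, session, send_from_directory

    app = Flask(__name__)
    app.secret_key = "change-me"          # placeholder

    UPLOAD_DIR = "/srv/app_uploads"       # hypothetical folder outside the web root

    @app.route("/files/<path:filename>")
    def serve_file(filename):
        if not session.get("logged_in"):  # access check first
            abort(403)
        # send_from_directory also guards against paths escaping UPLOAD_DIR
        return send_from_directory(UPLOAD_DIR, filename)

Because nothing under the upload folder is mapped to a URL directly, a user who is not allowed to see a file cannot reach it just by guessing its path.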