I had some projects in my apps folder that were permanently deleted. How can I recover them from the command line? The apps folder contains folders like project_1, project_2, etc., and project_1 contains files.
That depends on how long ago it happened. If it was just now, shut down the system immediately and take a disk image. Foremost is the usual tool for file recovery, but since your partition is intact you might also try something like extundelete. If you deleted the files a few days ago, they're probably mostly gone.
I accidentally deleted the wrong file in Visual Studio 2019 before committing the changes. I didn't lose much work, but I want to prevent this in the future.
Can I set up VS 2019 so that files are not deleted immediately but kept in a cache for a while?
Supposedly there is already a backup folder that VS uses for deleted files, but it was empty in my case. The fact that VS moves files into the Windows Recycle Bin doesn't help me either, because my repository is not on the system drive.
Do you know of any setting in VS, or is there perhaps an extension?
Not exactly what you are looking for, but this extension could help you. It might not prevent the deletion, but it gives you the opportunity to recover the code.
I haven't tested it, and I'm also not sure whether it works with VS 2019.
A Visual Studio Code plugin for maintaining local history of files.
Every time you modify a file, a copy of the old contents is kept in the local history. At any time, you can compare a file with any older version from the history. It can help you out when you change or delete a file by accident. The history can also help you out when your workspace has a catastrophic problem. Each file revision is stored in a separate file inside the .history folder of your workspace directory (you can also configure another location, see local-history.path).
https://marketplace.visualstudio.com/items?itemName=xyz.local-history
The answer here is to use a version control system (like git) and keep a full history of your project.
Even on simple personal projects it is worth doing.
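As a minimal sketch (the project and file names here are hypothetical), putting an existing project under git and restoring an accidentally deleted file looks like this:

```shell
# One-time setup: put a project under version control.
mkdir -p my_project && cd my_project
echo 'class Program {}' > Program.cs
git init -q
git add -A
git -c user.name=me -c user.email=me@example.com commit -q -m "Initial snapshot"

# Later: the file is deleted by accident...
rm Program.cs
# ...but the last commit still has it, so restore it:
git checkout -- Program.cs
```

Once a commit exists, `git checkout -- <file>` brings back any tracked file, and `git log` / `git show` give you access to every older revision.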
I've been searching around and can't find any clear answers to this. I need a small amount of data (kilobytes, probably never reaching the megabyte range) available as a file on my Azure instance, outside the web app itself, for a web job to work with. I won't get into why this is necessary, but it is (alternatives have been explored), and the question is now where to put those files. The obvious answer seems to be to connect via FTP, create a directory, plop the files there, and work with them from there.
I did a quick test and I'm able to create a "downloads" directory within the "data" directory, drop some files in it, and work with them there. It works great for this very small, simple need that I have.
How long will that data stay there? Is that directory purged at any point automatically by the servers? Is that directory part of any backups that are maintained? How "safe" is something I manually put outside the wwwroot folder?
It will never be purged. The only folder that can get purged is the %TEMP% folder. All other folders that you have write access to will be persisted forever.
How can I instruct RSYNC server to keep a copy of the old versions of the files that were updated?
Background info:
I have a simple RSYNC server running on Linux which I am using as a backup of a large file system (many TB). Let's call it the backup server.
On the source server, we run daily:
$ rsync -avzc /local/folder user@backup_server::remote_folder
In theory, no files should change on the source server; we should only receive new files. Nonetheless, it is possible that some updates are legitimate (very, very seldom). If rsync detects a change, it overwrites the old version of the file on the backup server with the new one. Here is the problem: if the change was a mistake, I lose the data and have no way to recover it.
Ideally, I'd like that rsync server keeps a backup of the replaced files. Is there a way to configure that?
My backups are local to the same machine (but on a different drive, mounted at /backup/).
I use --backup-dir=/backup/backups-`date +%F`/ but then it starts nesting things rather than keeping a flat set of backups-yyyy-mm-dd/ directories in the /backup/ folder.
If someone has a similar issue, there is an easy solution:
run a simple cron job that changes the access rights on the destination machine.
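A sketch of such a cron job (the path and schedule are assumptions, and this only guards against a non-root rsync, since root can overwrite read-only files):

```shell
# crontab entry on the backup server: every day at 03:00, make the
# received backups read-only so a later rsync run cannot silently
# overwrite them.
0 3 * * * chmod -R a-w /backup/remote_folder
```

The trade-off is that a legitimate update then requires restoring write permission first.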
What I'm trying to do:
I want to deploy files to a .NET-based website. Any time the DLLs change, Windows recycles the web app. When I rsync files over, the app can recycle several times because of the delay, instead of the preferred single time. This takes the site out of commission for a longer period.
How I tried to solve it:
I attempted to remedy this with --delay-updates, which is supposed to stage all of the file changes in temporary files before swapping them in. This appeared to be exactly what I wanted; however, the --delay-updates argument does not appear to behave as advertised. There is no discernible difference in the output (with -vv), and the end behavior is identical (the app recycles multiple times rather than once).
I don't want to run Cygwin on all of the production machines for stability reasons, otherwise I could rsync to a local staging directory, and then perform a local rsync, which would be fast enough to be "atomic".
I'm running Cygwin 1.7.17, with rsync 3.0.9.
I came across atomic-rsync (http://www.opensource.apple.com/source/rsync/rsync-40/rsync/support/atomic-rsync), which accomplishes this by rsyncing to a staging directory, renaming the existing directory, and then renaming the staging directory. Sadly this does not work on Windows, because you cannot rename folders that contain running DLL files (permission denied).
You can remove folders containing running binaries, but that recycles the app on every deployment rather than only when the DLLs change, which is worse.
Does anyone know how to either:
Verify that --delay-updates is actually working
Accomplish my goal of updating all the files atomically (or rather, very very quickly)?
Thanks for the help.
This is pretty ancient, but I eventually discovered that --delay-updates was actually working as intended. The app only appeared to be recycling multiple times due to other factors.
I accidentally deleted files in Dreamweaver.
How can I recover them?
I have used Dreamweaver for many years and I don't know of such a function. Here are the possibilities I see:
You could recover the file from your repository, if you use one.
Or you could try to recover the file with the Windows recovery functions.
You could try to run a file recovery program.
Recover the file from its original source, if it is not your own work.
Recover the file from the webspace where it is online.
If it's a local file, it ends up in your Recycle Bin (on Windows at least).
Files deleted through FTP are lost and can only be recovered with storage recovery software.
(I know the question is 2 years old)