ownCloud "Remove All Files" prompt

I have an ownCloud server and the ownCloud desktop client. What I want is to be able to delete things on the server side and have them automatically deleted from the PC. The problem is that when files are deleted from the server, the ownCloud client displays a "Remove All Files?" warning with the choices to remove all files or to keep them. Is there a way to suppress the prompt and automatically remove all files?

In version 2.2.3 (and possibly earlier), you can change the configuration file to disable the prompt.
See the client source where the prompt is invoked and where the configuration property is read.
If you edit (on Windows) c:\Users\myuser\AppData\Roaming\ownCloud\owncloud.cfg and add the following under the [General] section, you will no longer get the prompt:
promptDeleteAllFiles=false
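
For context, a minimal owncloud.cfg containing just this change might look like the following (a real install writes many other keys, omitted here):

[General]
promptDeleteAllFiles=false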

The short answer: You cannot change this currently.
The long answer: The dialog was added as a safeguard because there were cases where you could lose all your files unintentionally, e.g. if your admin re-created your account and left it empty. The client would assume the files were gone and, since it could not know better, would replicate the removal locally. The code is still there today just to be safe.
If you are fearless, you can patch Folder::slotAboutToRemoveAllFiles(). Alternatively, you could open a bug report so we can solve this for everyone. What is your motivation for wanting to do this without a prompt?
PS: The sources can be found on GitHub. URL and build instructions at http://doc.owncloud.org/desktop/1.5/building.html.

I have a script that processes the files that someone drops into ownCloud and then moves them to their final storage location. However, this prompt stops the client from syncing until I manually log in to acknowledge it... I guess I will learn how to patch this. Dropbox doesn't do this; Google Drive doesn't do this. But since I can't use those cloud services (compliance issues), I have to use this solution until I can build a new secure upload mechanism.
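
For illustration, that kind of post-processing script could be as simple as the sketch below; the paths are made up and the whole setup is my assumption, not the asker's actual script.

import shutil
from pathlib import Path

SYNC_DIR = Path("/srv/owncloud/drop")     # hypothetical: folder the client syncs
FINAL_DIR = Path("/srv/storage/final")    # hypothetical: final storage location

def process_dropped_files():
    for item in SYNC_DIR.iterdir():
        if item.is_file():
            # ... validate or transform the file here ...
            shutil.move(str(item), str(FINAL_DIR / item.name))
            # Moving the file out of the sync folder is what triggers the
            # server-side delete that the desktop clients then mirror.

if __name__ == "__main__":
    process_dropped_files()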

Related

Python Windows Explorer Force Refresh

I have code, but it doesn't do what I want. I'm trying to delete the thumbnail cache via a Python script. If the cache exists and Explorer is open, some of the .db files won't delete because they're in use by Explorer. In my experimenting, the only way to delete them is to change the advanced folder setting for displaying icons/thumbnails and then restart Explorer. I can change the setting via the registry from a script and then restart Explorer, BUT restarting Explorer halts the script. I also tried changing the setting and then broadcasting the WM_SETTINGCHANGE message via SendMessageTimeout(), but that didn't do the trick.
So, does anyone have an idea how to unlock files (that are safe to delete) from Explorer without restarting it? This could also apply to other in-use files. I understand this is a sort of dumb project, but I have my reasons for doing it and it's what I want to do.
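
For reference, the WM_SETTINGCHANGE broadcast described above would look roughly like this in Python (a sketch; the constants are standard Win32 values):

import ctypes

HWND_BROADCAST = 0xFFFF
WM_SETTINGCHANGE = 0x001A
SMTO_ABORTIFHUNG = 0x0002

result = ctypes.c_ulong()
# Broadcast the settings change to all top-level windows, giving each
# window up to 5 seconds to respond before moving on.
ctypes.windll.user32.SendMessageTimeoutW(
    HWND_BROADCAST, WM_SETTINGCHANGE, 0, 0,
    SMTO_ABORTIFHUNG, 5000, ctypes.byref(result))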
"If I have cache and Explorer is open, some of the .db files won't delete because they're in use by Explorer."
Because thumbcache.dll still has an open handle to the local thumbs.db file and does not currently implement a mechanism to release the handle to the file in a more dynamic and timely fashion.
They are only generated for compatibility with outdated applications, and are not required for Windows operations.
To work around the issue, enable the user Group Policy setting "Turn off the caching of thumbnails in hidden thumbs.db files":
Refer to http://support.microsoft.com/kb/2025703
Alternatively, you can edit the registry directly; see the "Let me fix it myself" section of that KB article.
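
If you'd rather do the registry edit from the script itself, a sketch in Python follows. The value name DisableThumbsDBOnNetworkFolders is, to my understanding, what that Group Policy setting writes; verify it against the KB article before relying on it.

import winreg

# Assumption: this is the value behind "Turn off the caching of thumbnails
# in hidden thumbs.db files" -- confirm against KB2025703 before using.
KEY_PATH = r"Software\Policies\Microsoft\Windows\Explorer"

with winreg.CreateKey(winreg.HKEY_CURRENT_USER, KEY_PATH) as key:
    winreg.SetValueEx(key, "DisableThumbsDBOnNetworkFolders", 0,
                      winreg.REG_DWORD, 1)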
Related: "The action can't be completed because the file is open in Windows Explorer"

I've accidentally used browser-sync without going to the local folder

Am I in trouble?
I'm very new to coding and just learning how to update my repository on GitHub.
Anyway, I went and tried to use browser-sync and ran the following command:
browser-sync start --server --directory --files "*"
only for my firewall to prompt me about blocking some features of the program nodejs. As shown in a video I was following, a tab opened in Chrome, but instead of some specific folder it showed ALL the folders in my local user directory, like Documents, Pictures, Downloads, etc. I think this was because I didn't cd to the specific folder I wanted to serve.
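
For reference, the usual pattern is to change into the project folder first so browser-sync serves only that directory (the folder name here is made up):

cd ~/projects/my-site
browser-sync start --server --directory --files "*"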
My PC is new, so there really wasn't much on it, but I'm scared I may have compromised my security. In a panic, I just closed both the firewall prompt and the tab, so I wasn't able to take a screenshot of it.
I don't know if I've left some vital info out, but that's it. What should I do? I don't care much about my files, but I've saved some of my passwords in Chrome (mostly for social media sites, a few for my job, nothing related to payments). Should I just change all of them? What steps should I take?

Azure does not update files on a web server (Pyramid wsgi-app)

I have a Pyramid application project. I store it in git and pull the branch to the server when I need to update. Until now I was working on Koding, but lately I decided to check out Azure and its developer benefits.
After I created an Ubuntu server virtual machine (which is actually what runs under Koding), I downloaded my project using git pull, but forgot to change the branch to the one I'm currently working on. So I checked it out, but the server still shows me the old page (as if I hadn't checked out the other branch). I then checked over SFTP, and the files show they have been updated.
Why am I still seeing the old page?
Now I know the reason! (At least I think so, but please correct me if I'm wrong.)
I noticed that there was a .pyc file for every .py file; those are "compiled" (a bit of a simplification?) Python files, as I understand it. It seemed to me that they were not compiled at app launch, but rather by setup.py; the edit dates suggest that.
So the reason I didn't see my code changes was that the HTTP server was using the old compiled files instead of the source files! Is that normal/expected behaviour? I don't know. There are many other questions now, but the main question was answered, so I'll mark this as the answer until someone gives a better one.
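
In case it helps someone hitting the same thing: a common fix is to delete the stale bytecode and restart the app process so Python recompiles from the sources. The path and service name below are made up for illustration.

cd /srv/mypyramidapp              # hypothetical project root on the VM
find . -name "*.pyc" -delete      # remove stale bytecode next to the .py files
sudo systemctl restart myapp      # restart whatever serves the app (name is hypothetical)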

Cyberduck uploads break node-dev/up/hotnode

I've been using Cyberduck 4.2.1 to connect to my EC2 instance to edit my Node projects. I've used node-dev to reload my project/server as files are updated, but if I save the files through Cyberduck's Edit command, the server never properly reloads and usually crashes.
I've tested with a few different editors (TextMate, Dashcode) with the same result. node-dev restarts correctly when I edit files from the terminal. I have tried a few other tools that do roughly the same thing, hotnode and up. They all work when editing via the terminal but fail when I edit files through Cyberduck. I think it has something to do with the way Cyberduck replaces the remote files on save.
Does anyone know what might be causing this, and can anyone suggest changes to these GitHub projects? If not, are there better Mac FTP clients that might not have this issue?
I don't know about node-dev, but my educated guess is that it crashes because it reads a partially uploaded file. I suggest trying the "Upload with temporary filename" feature, available as a hidden option in Cyberduck.
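
If I recall the hidden preference correctly, it can be turned on from Terminal as follows; check the key name against the Cyberduck wiki before using it:

defaults write ch.sudo.cyberduck queue.upload.file.temporary true

With this set, Cyberduck uploads to a temporary name and renames the file once the transfer completes, so watchers like node-dev never see a half-written file.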
You can try Cyberduck 6.6.2. It works for me.

What's the best way to move to a new Perforce server?

My home Perforce server died. I set up a new one.
The project I set it up to support died in the planning phase. The contents of the depot at that point were some prototype code, and we never got around to setting up a disaster recovery plan.
The dev machines still have the existing code on them. As much as possible, I'd like the change of servers to be transparent to the developers: use the same depots and the same directories, just change the name of the server to connect to and get back to work.
What do I need to do in order to make this happen?
I assume you don't have access to the Perforce depot files from your dead server? I assume you know that you will lose all your history.
If that's the case, all you need to do is set up the new server, create a user/client with the same client root path as your original client spec used on your dev machine, and check all the files into Perforce. Pretty simple, really...
You may also need to rebind any SCM bindings you have in tools like Visual Studio, but that's about it.
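
A sketch of those steps with the p4 command line; the server address, client name, and depot path are placeholders, not values from the question:

cd /path/to/existing/code
export P4PORT=newserver:1666
p4 client myclient                # set Root: to this directory in the spec
p4 reconcile -a //depot/...       # open every local file for add (server 2012.1+)
p4 submit -d "Repopulate depot from dev machine"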
What Shane suggested will populate the depot with one person's version of the files, but if you have another user who also has a copy, you'll need a couple of extra steps.
Firstly, just set one machine up as suggested by Shane.
You now need to get the second user set up. If you are confident that the version of the code user 2 has exactly matches what you put on the new server, then just create a client spec (probably with the same name as before) and sync using the "Force" flag. This will overwrite all the files on user 2's machine and, more importantly, ensure Perforce knows which versions you really have.
However, if you are in any doubt about differences in the code, do not do the initial sync from the second user's machine. Instead, set up the client spec, then use the "Reconcile Offline Work" option (in P4V, select the workspace; it's a right-click option) and walk through the subsequent dialog to sort out what you need.
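
The command-line equivalents of those two paths, for reference (the depot path is a placeholder):

p4 sync -f //depot/...         # force sync: overwrite local files and fix the have-list
p4 reconcile -n //depot/...    # preview what differs before opening any files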
Finally, if you want a very quick-and-dirty backup system for your server, I've posted some notes on my blog; it should take you just a couple of minutes to set up.
