Cyberduck uploads break node-dev/up/hotnode - node.js

I've been using Cyberduck 4.2.1 to connect to my EC2 instance to edit my Node projects. I've used Node-dev to reload my project/server as files are updated, but if I save the files through Cyberduck's Edit command, the server never really reloads and usually crashes.
I've tested with a few different editors (TextMate, Dashcode) with the same result. Node-dev restarts correctly when I edit files from the terminal. I have tried a couple of others that do roughly the same thing, hotnode and up. They all work when editing via the terminal, but fail when I edit files through Cyberduck. I think it has something to do with the way Cyberduck replaces the remote file when it is saved.
Does anyone know what might be causing this, and can you maybe suggest some changes to these GitHub projects? If not, are there better Mac FTP clients that might not have this issue?

I don't know about Node-dev, but my educated guess is that it crashes because it reads a partially uploaded file. I suggest trying the Upload with temporary filename feature, available as a hidden option in Cyberduck.
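In case it helps: Cyberduck's hidden options are set from the Terminal with defaults write. If I remember correctly the relevant preference is queue.upload.file.temporary, but check the Cyberduck help/wiki for your version before relying on the exact key name:

defaults write ch.sudo.cyberduck queue.upload.file.temporary true

With this enabled, the file is uploaded under a temporary name and only renamed to its final name once the transfer completes, so a watcher like node-dev should only ever see complete files.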

You can try Cyberduck 6.6.2. It works for me.

Related

I've accidentally used browser-sync without going to the local folder

Am I in trouble?
I'm very new to code and just learning how to update my repository or something on GitHub.
Anyway, I went and tried to use browser-sync with the following command:
browser-sync start --server --directory --files "*"
only for my firewall to prompt me about blocking some features of the program nodejs. Like it said in a video, a tab opened in Chrome, but instead of some specific folder it showed ALL the folders in my local user account, like Documents, Pictures, Downloads, etc. I think this was because I didn't cd to the specific folder I wanted to show.
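(In hindsight, I guess I should have changed into the project folder first before running it, something like

cd <path to my project folder>
browser-sync start --server --directory --files "*"

so that only that folder got served.)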
My PC is new, so there really wasn't much on it, but I'm scared I may have compromised my security. In a panic, I just exited both the firewall prompt and the tab, so I wasn't able to snip it.
I don't know if I left some vital info out, but that's just it. What should I do? I don't care much about my files, but I've saved some of my passwords in Chrome (social media sites mostly, a few for my job, nothing on payments). Should I just change all of them? What steps should I take?

Azure does not update files on a web server (Pyramid wsgi-app)

I have a Pyramid application project. I store it in git and pull the branch to the server when I need to update. Until now I was working on Koding, but lately I decided to check out Azure and its developer benefits.
After I created an Ubuntu server virtual machine (which is actually what runs under Koding), I downloaded my project using git pull, but forgot to change the branch to the one I'm working on at the moment. So I did, but the server still shows me the old page (as if I didn't check out the other branch). Then I checked over SFTP, and the files show they have been updated.
Why am I still seeing the old page?
Now I know the reason why! (At least I think so, but please correct me if I'm wrong.)
I noticed that there was a .pyc file for every .py file, and those are "compiled" (a bit of a simplification?) Python files, as I understood it. It seemed to me that they were not "compiled" on app launch, but rather compiled by setup.py... the edit dates suggest that.
So the reason why I didn't see the changes I made in the code was that http.server was using the old "compiled" files instead of the source files! But is that normal/expected behaviour? I don't know. There are many other questions now, but the main question was answered, so I'll mark this as the answer until someone gives a better one.
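If anyone else runs into this, a quick way to rule stale bytecode in or out is to delete the .pyc files and restart whatever serves the app; Python simply regenerates them from the current sources. Run this from the project root (paths here are just the generic case):

find . -name "*.pyc" -delete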

How can I work on files on my server and keep them in sync?

I have set up a development web server using VMware and Debian. It's all set up fine, but I have a problem.
I need to be able to work with the files on the server, or a copy of them, but it's important that both sets of files stay in sync. For example, if I'm working on index.php in my text editor, I don't want to have to upload it over FTP after every change, and I don't want to manually keep track of which files I've edited, etc.
Any ideas on how I can achieve this?
Besides version control, you can achieve this with sshfs. It basically lets you mount a remote directory on your local system.
More info:
http://en.wikipedia.org/wiki/SSHFS
https://www.digitalocean.com/community/tutorials/how-to-use-sshfs-to-mount-remote-file-systems-over-ssh
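A rough example, assuming SSH access to the VM and that the site lives under /var/www (the user, host and paths are placeholders to adjust):

mkdir -p ~/devserver-www
sshfs user@devserver:/var/www ~/devserver-www

You then edit the files in ~/devserver-www as if they were local, and unmount with fusermount -u ~/devserver-www (or umount on OS X) when you are done.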
After much searching I felt the best solution for my case is to use lsyncd to upload files to the development server anytime a change is made.
Although I use Git, I felt that setting up a Git server and having to commit and push every time I make a change isn't what I want to be doing. Using lsyncd, I'm still able to use Git on my local machine to keep track of the project.
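For anyone wanting to do the same, a rough sketch of what I mean (the local path, user, host and remote path are placeholders for your own setup):

lsyncd -rsyncssh /home/me/myproject user@devserver /var/www/myproject

lsyncd watches the local directory with inotify and pushes each change to the server over rsync/SSH a moment after the file is saved.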

OwnCloud Remove all files prompt

I have an ownCloud server and the ownCloud desktop client. What I want to do is be able to delete things on the server side and have them automatically deleted from the PC. The problem is that when files are deleted from the server, the ownCloud client displays a "Remove All Files?" warning with the choices of removing all files or keeping the files. Is there a way to not have the prompt come up and to automatically remove all files?
In version 2.2.3 (maybe earlier), you can change the configuration file to disable the prompt.
See the code where the prompt is invoked and the code showing the configuration file property.
If you edit c:\Users\myuser\AppData\Owncloud\owncloud.cfg (on Windows) and add the following under the [General] section, you will no longer get the prompt.
promptDeleteAllFiles=false
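In context, that part of owncloud.cfg ends up looking roughly like this (the rest of the file stays as it is):

[General]
promptDeleteAllFiles=false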
The short answer: You cannot change this currently.
The long answer: The dialog was added as a safeguard because there were cases where you could lose all your files unintentionally, e.g. if your admin re-created your account and left it empty. The client would assume the files were gone (it could not know better) and would replicate the removal locally. The code is still there today just to be safe.
If you are fearless, you can patch Folder::slotAboutToRemoveAllFiles(). Alternatively, you could open a bug report so we can solve this for everyone. What is your motivation to be able to do this without a prompt?
PS: The sources can be found on GitHub. URL and build instructions at http://doc.owncloud.org/desktop/1.5/building.html.
I have a script that processes the files that someone drops into ownCloud and then moves them to their final storage place. However, this prompt stops the client from syncing until I manually log in to acknowledge it... I guess I will learn how to patch this. Dropbox doesn't do this. Google Drive doesn't do this. But since I can't use cloud services (compliance issues), I have to use this solution until I can build a new secure upload method.

Any workarounds for getting swfupload.js working in Linux?

SWFUpload doesn't work on Ubuntu. I can see various mentions of this throughout the internet, but I'm wondering if anyone here has found any workarounds?
I'm developing on Windows, so the code executes fine for me. But my colleague is running Ubuntu, and SWFUpload crashes instantly. Has anyone encountered this and found a workaround? I've tried a couple of things, like commenting out code that causes known issues such as progress, but to no effect.
Any help appreciated.
Dave.
Be sure you are not behind a proxy that does not support HTTP/1.1, as you will receive an error.
Make sure the folder that will receive the uploads has chmod 777 (read/write/execute) permissions.
If you type: ls -lsa
you can view all your files and folders. You will see the permissions, names, owners, etc.
In my example, contents is the folder into which I will upload the files.
4 drwxrwxrwx 2 maryon maryon 4096 2010-10-26 11:21 contents
As you can see, the contents folder I have has rwx (read/write/execute) for all users. This will allow users to upload files through SWFUpload.
If you don't have these permissions, you can try running this:
sudo chmod a=rwx contents
The 'a' stands for all users, which will have 'rwx' (read/write/execute).
Note: contents is the folder where you are saving your files. 777 is not very secure; you can change it later and restrict other users (those you don't want to allow to upload) from uploading files.
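For example, one tighter setup, assuming the uploads are written by the web server process and that it runs as www-data (adjust the user and group to your distribution):

sudo chown www-data:www-data contents
sudo chmod 755 contents

That way only the web server user can write into contents, while everyone else can still read and list it.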
I have problems only with large files. With bigger files the Flash movie freezes after some time, I need to reload the page, and nothing gets uploaded. It is the same in Firefox and also in Google Chrome. I use Kubuntu 9.10. On Windows there is no issue at all...
So I think it is a Flash problem, because on Linux the uploadStart and uploadProgress events are not fired...
Do you have any workaround for this? The Flash version is 10.0 r45 (32-bit on a 64-bit system).
I am using Ubuntu 9.04 with SWFUpload and everything works just fine. It might be an old Flash Player.
