I've accidentally used browser-sync without going to the local folder - node.js

Am I in trouble?
I'm very new to coding and am just learning how to update my repository on GitHub.
Anyway, I went and tried to use browser-sync and used the following command:
browser-sync start --server --directory --files "*"
only for my firewall to prompt me about blocking some features of the program node.js. Like in the video I was following, a tab opened in Chrome, but instead of some specific folder, it showed ALL the folders of my local user: Documents, Pictures, Downloads, etc. I think this was because I didn't cd to the specific folder I wanted to serve.
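I think what I should have done was cd into the project folder first, something like this (my-project is just a made-up example name, not my real folder):
# serve only the project folder, not the whole user directory
cd ~/my-project
browser-sync start --server --directory --files "*"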
My PC is new so there really wasn't much on it, but I'm scared I may have compromised my security. In a panic, I just closed both the firewall prompt and the tab, so I wasn't able to take a screenshot.
I don't know if I left any vital info out, but that's it. What should I do? I don't care much about my files, but I have saved some of my passwords in Chrome (mostly social media sites, a few related to my job, nothing payment-related). Should I just change all of them? What steps should I take?

Related

Using Haxe to edit a remote file?

I've searched haxelib for a library I could use to remotely edit a file on a server over an SSH connection with Haxe, or to list the files in a directory, but found nothing.
Has anyone done this with Haxe?
I want to build a desktop app: a YAML editor that will change the settings files of several servers, using a frontend like haxe-ui.
Ok, there are probably a lot of ways you could do it, but I would suggest separating your concerns:
desktop app to create a yaml editor
Ok, that's a fine use case for Haxe / a programming language. Build an editor, check.
change settings files (located on) several servers
Ok, so you have options here. Either
Make the remote files appear as local files via some network file system, or
Copy the files locally, edit them, and copy them back, or
Roll your own network-enabled service that runs on each server, receives commands, and modifies the files.
Random aside: Given that these are settings files, you probably also want to restart some service after changes are made.
I'd say option 2 is the easiest. There are even many ways to do that:
Use scp to bring the settings files to a local location, edit them locally, and then push them back (see the sketch after this list). And if you set up SSH keys, you won't have to bother with passwords.
Netcat is another tool for pushing bytes (aka files) over the network. It's simpler than scp, but with no security measures.
Or, get creative / crazy, and say, "my settings files will all be stored in a git repo. The 'sync' process will be a push / pull setup."
There are simply lots of ways to get this done.
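Here's a minimal sketch of the scp round-trip from option 2. The host, user, paths, and service name are all placeholders, not anything specific to your setup:
# pull the settings file down from the server
scp admin@server1.example.com:/etc/myapp/settings.yaml ./settings.yaml
# ... edit settings.yaml locally in your editor ...
# push the edited file back
scp ./settings.yaml admin@server1.example.com:/etc/myapp/settings.yaml
# and, per the aside above, restart whatever service reads it
ssh admin@server1.example.com 'sudo systemctl restart myapp'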

Can SVG-Edit be made to work in a standalone/offline context?

Because SVG-Edit is such a unique and appealing program, I've been searching for an answer to this question for years, but have come up dry.
After a major struggle, I was able to get it to work by installing Windows IIS, then setting up a web server, etc. However, this is far from ideal.
Is there some reason why it won't (or shouldn't) run in a fully standalone/offline mode? Specifically, what I'd like to do is extract the GitHub zip file to a local folder and open "svg-editor.html" in a browser. In general, this produces either a blank window or (in some previous versions) a window with various missing items.
There had been a race condition that was causing svgedit to err, evident in Chrome when loading from file:// URLs; it is now fixed in the master branch on GitHub.
You won't be able to load svg-editor-es.html locally from a file:// URL (svg-editor-es.html being the original source, which relies on ES6 Modules to load its files; these are problematic because they are not permitted to load locally, causing origin errors to show in the console), but the svg-editor.html file (the backward-compatible way to use svgedit) appears to be working now after the fix, at least for some basic functionality like making drawings.
Some functionality may still not work, however, due to limitations related to the limited permissions of file:// URLs, e.g., loading some images. (I seem to recall browsers previously preventing pages from loading files outside of their own directory or its children; that restriction does not seem to apply now, though I do see some warnings about Ajax not being able to load some images which svgedit attempts to load.)
As such, even with the above-mentioned recent fix, svgedit might not work fully offline, unless perhaps you opt to disable the security restrictions in your browser, something one should not do lightly. But it does appear to work for some basic drawings at least.
While I figure this may address your direct question about why it doesn't work without a server, there is also another approach to working "offline": although it needs a server to initially serve the files, it may allow svgedit to store the application files so that it works completely offline the next time you visit that URL in the browser, without running into browser security restrictions. Browsers nowadays can work offline even for pages served from a server, via something called "service workers" (see https://caniuse.com/#feat=serviceworkers for the browsers that support them).
Service workers are, however, not all that easy to cobble together, and though you should be able to track any future progress by subscribing to the issue at https://github.com/SVG-Edit/svgedit/issues/243 (it is already a requested feature), no one is currently undertaking to implement it. Hopefully someone will be inspired to do so.
By the way, if you install svgedit using npm (a tool which becomes available if you install Node), svgedit has a start script which you can invoke from the command line with npm start from within the svgedit folder. That will run a local (Node) server for you, specifically a simple static file server, which will simply allow you to load svgedit from http URLs (i.e., http://localhost:8000/editor/svg-editor.html or http://127.0.0.1:8000/editor/svg-editor.html; you can also use the ES6 Modules file if you are on a modern browser: http://localhost:8000/editor/svg-editor-es.html), without your needing to install any other server.
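Putting that together, the steps would look roughly like this (a sketch; the clone URL comes from the project's GitHub page above, and running npm install before npm start is my assumption):
# grab the sources and install dependencies
git clone https://github.com/SVG-Edit/svgedit.git
cd svgedit
npm install
# start the bundled static file server
npm start
# then browse to http://localhost:8000/editor/svg-editor.html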

OwnCloud Remove all files prompt

I have an ownCloud server and the ownCloud desktop client. What I want is to be able to delete things server-side and have them automatically deleted from the PC. The problem is that when files are deleted from the server, the ownCloud client displays a "Remove All Files?" warning with the choice to remove all files or to keep them. Is there a way to stop the prompt from coming up and automatically remove all files?
In version 2.2.3 (maybe earlier), you can change the configuration file to disable the prompt.
See the code where the prompt is invoked and the code showing the configuration file property.
If you edit (on Windows) C:\Users\myuser\AppData\Roaming\ownCloud\owncloud.cfg and add the following under the [General] section, you will no longer get the prompt:
[General]
promptDeleteAllFiles=false
The short answer: You cannot change this currently.
The long answer: The dialog was added as a safeguard because there were cases where you could lose all your files unintentionally, e.g. if your admin re-created your account and left it empty. The client would assume the files were gone and, having no way to know better, would replicate the removal locally. The code is still there today just to be safe.
If you are fearless, you can patch Folder::slotAboutToRemoveAllFiles(). Alternatively, you could open a bug report so we can solve this for everyone. What is your motivation to be able to do this without a prompt?
PS: The sources can be found on GitHub. URL and build instructions at http://doc.owncloud.org/desktop/1.5/building.html.
I have a script that processes the files that someone drops into ownCloud and then moves them to their final storage place. However, this prompt stops the client from syncing until I manually log in to acknowledge it, so I guess I will learn how to patch this. Dropbox doesn't do this; Google Drive doesn't do this. But since I can't use those cloud services (compliance issues), I have to use this solution until I can build a new secure means of upload.

Uploading Sawtooth Software ACA Survey to web using personal website hosting

The software created a Web Upload folder for me, which I uploaded to the site using an FTP client (specifically WS_FTP). The first line of the Perl files is "#!/usr/bin/perl", which I changed to "#!/home/calakpsi/perl". However, when I open the HTML file, it looks for the script under "/C:/Users/myname/AppData/Roaming/Ipswitch/WS_FTP/Storage/cgi-bin/ciwweb.pl". I made sure the file it's looking for was in that folder, but for some reason the webpage still would not execute.
Any help or step by step solution (since I do not have an in depth technical background) would be much appreciated.
I think the problem is that your server is not configured properly to run Perl scripts. Have a look at this question to see if it helps; the answer by Dave Sherohman should help you out.
Once you are able to run Perl scripts, it should work (barring other issues which are script-specific).
Overall, the steps required to execute Perl scripts are as follows (a rough sketch follows the list). You can look up the details on the internet, as I don't know them all myself.
Install any mods required by the server, for instance mod_perl; on Ubuntu it would be something like sudo apt-get install libapache2-mod-perl2. If you are on Windows, perhaps use one of those Bitnami or similar LAMP-style installers; they should come with it installed.
Configure the server/virtual host so that Perl files in the directory are executed.
Ensure the scripts have the correct permissions (and you should be all set).
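On a Debian/Ubuntu Apache setup, those steps might look something like this (a sketch only; the module and commands are standard Apache ones, but the cgi-bin path is an assumption about your layout):
# install the Perl module and enable CGI handling
sudo apt-get install libapache2-mod-perl2
sudo a2enmod cgi
# make the script executable so Apache may run it
chmod +x /var/www/cgi-bin/ciwweb.pl
# restart Apache to pick up the changes
sudo systemctl restart apache2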

Cyberduck uploads break node-dev/up/hotnode

I've been using Cyberduck 4.2.1 to connect to my EC2 instance and edit my Node projects. I've used node-dev to reload my project/server as files are updated, but if I save the files through Cyberduck's Edit command, the server never actually reloads and usually crashes.
I've tested with a few different editors (TextMate, Dashcode) with the same result. node-dev restarts correctly when I edit files from the terminal. I have tried a few others that do roughly the same thing, hotnode and up. They all work when editing via the terminal, but fail when I edit files through Cyberduck. I think it has something to do with the way Cyberduck replaces the remote files on save.
Does anyone know what might be causing this, and maybe suggest some changes to these github projects? If not, are there better Mac FTP clients that might not have this issue?
I don't know about node-dev, but my educated guess is that it crashes because it reads a partially uploaded file. I suggest trying the "Upload with temporary filename" feature, available as a hidden option in Cyberduck.
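If I remember right, Cyberduck's hidden options on macOS are set through defaults write; the exact preference key below is an assumption on my part, so double-check it against the Cyberduck wiki before relying on it:
# assumed key: upload to a temporary filename, then rename on completion
defaults write ch.sudo.cyberduck queue.upload.file.temporary true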
You can try Cyberduck 6.6.2. It works for me.
