The problem is not programming.
I'm using old machines, and Microsoft has stopped releasing patches for them.
Does anyone have a solution for the SMB protocol vulnerability that uses port 445, which is a well-known target for attackers?
Any tricky solution?
In other words, I want to keep using SMB, and I can't replace the machines because that would be too expensive.
Get a third-party SMB stack. MoSMB has one for sure, and there are others to choose from. Some support Windows, some are Linux-only.
http://mosmb.com
In general I'd reconsider what you're doing: lack of patches, EOSL and so on are a big red flag for having something in production, unless it's absolutely necessary, of course.
Then you should check whether you are vulnerable with this tool from ESET: https://help.eset.com/eset_tools/ESETEternalBlueChecker.exe
Microsoft stopped patches for the software, not for your machine ;-)
If you use XP (I think that's your OS), you can look on the net for the "KB4012598" security patch!
You can use the NQE server for Windows.
Is it possible to create a GUI firewall that works like its Windows and Mac counterparts? On a per-program basis, with a popup notification window when a specific program wants to send/receive data over the network.
If not, why? What does the Linux kernel lack that prevents such programs from existing?
If yes, why aren't there such programs?
P.S. This is a programming question, not a user one.
Yes, it's possible. You will need to set up firewall rules to route traffic through a userspace daemon; it'll involve quite a bit of work (see the rough sketch after this answer).
N/A
Because they're pretty pointless: if users understand which programs they should block from network access, they could just as well use one of the many existing, friendly netfilter/iptables frontends to configure this.
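To make the userspace-daemon approach above concrete, here is a minimal sketch in Python using the NFQUEUE mechanism. It assumes the third-party python-netfilterqueue package is installed and that an iptables rule such as "iptables -I OUTPUT -j NFQUEUE --queue-num 1" is in place; the blocked port is just an invented example of a policy decision, not part of any existing tool.

    # Sketch only: every outgoing packet queued by the iptables rule above is
    # handed to this userspace daemon, which decides to accept or drop it.
    import struct
    from netfilterqueue import NetfilterQueue  # third-party package (assumed installed)

    BLOCKED_TCP_PORTS = {6667}  # example policy: block outgoing IRC

    def handle(pkt):
        data = pkt.get_payload()          # raw IPv4 packet bytes
        ihl = (data[0] & 0x0F) * 4        # IP header length in bytes
        if data[9] == 6 and len(data) >= ihl + 4:   # protocol 6 = TCP
            dport = struct.unpack("!H", data[ihl + 2:ihl + 4])[0]
            if dport in BLOCKED_TCP_PORTS:
                pkt.drop()                # a GUI firewall would prompt the user here
                return
        pkt.accept()

    nfq = NetfilterQueue()
    nfq.bind(1, handle)                   # queue number must match the iptables rule
    try:
        nfq.run()
    except KeyboardInterrupt:
        pass
    finally:
        nfq.unbind()

Most of the real work in a per-program firewall lies in what this sketch leaves out: mapping each queued packet back to the process that owns the socket (for example via /proc/net/tcp and the file descriptors under /proc/<pid>/fd) and showing the popup.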
It is possible, there are no restrictions and at least one such application exists.
I would like to clarify a couple of points though.
If I understood this correctly, the firewalls mentioned here so far, and the iptables this question is tagged under, are packet filters: they accept or drop packets based mostly on the IP addresses and ports they come from or are sent to.
What you describe looks more like mandatory access control to me. There are several utilities for that purpose in Linux: SELinux, AppArmor, TOMOYO.
If I had to implement the graphical utility you describe, I would pick, for example, AppArmor, which supports whitelists and, to some extent, dynamic profiling, and try to make a GUI for it (a minimal example profile is sketched below).
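For a sense of what such a tool would generate under the hood, here is a minimal, hypothetical AppArmor profile that lets a program run but denies it all network access; the path /usr/bin/someapp is made up for the example, and a real profile would normally contain many more file rules.

    # /etc/apparmor.d/usr.bin.someapp -- illustrative sketch only
    #include <tunables/global>

    /usr/bin/someapp {
      #include <abstractions/base>

      # deny the program any IPv4/IPv6 network use
      deny network inet,
      deny network inet6,

      # allow it to read and map its own binary
      /usr/bin/someapp mr,
    }

A GUI frontend would essentially automate writing rules like these, loading them with apparmor_parser -r, and toggling the profile between complain and enforce mode.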
openSUSE's YaST features a graphical interface for AppArmor setup and 'learning', but it is specific to that distribution.
So Linux users and administrators have several ways to control network (and file) access on a per-application basis.
Why the graphical frontends for MAC are so few is another question. Probably it's because Linux desktop users tend to trust the software they install from repositories and have less reason to control it this way (if an application is freely distributed, it has less reason to call home, and packages are normally reviewed before they reach the repositories), while administrators and power users are fine with the command line.
As desktop Linux gets more popular and people install more software from the AUR, PPAs, or even gnome-look.org, where packages and scripts are not reviewed that carefully (if at all), demand for this type of software (user-friendly, simple-to-configure MAC) might grow.
To answer your third point:
There is such a program, which provides Zenity popups; it is called Leopard Flower:
http://sourceforge.net/projects/leopardflower
Yes. Everything is possible.
There are real antiviruses for Linux, so there could also be firewalls with a GUI. But as a Linux user I can say that such a firewall is not needed.
I reached this question as I am currently trying to migrate from a Mac to Linux. There are a lot of applications I run on my Mac and on my Linux PC. Some of them I trust fully, but others I do not. Whether or not they are installed from a source that checks them, do I have to trust them just because someone else did? No, I am old enough to decide for myself.
In times when privacy is getting more and more complicated to achieve, and distributions exist that show we should not trust everyone, I like to be in control of what my applications do. This control might not end at the connection to the network/Internet, but that is what this question (and mine) is about.
I have used Little Snitch on Mac OS X in past years, and I was surprised how often an application tries to access the internet without me even noticing: to check for updates, to call home, ...
Now that I would like to switch to Linux, I tried to find the same thing, as I want to be in control of what leaves my PC.
During my research I found a lot of questions about this topic. This one, in my opinion, best describes what it is about, and my question is the same: I want to know when an application tries to send or receive information over the network/internet.
Solutions like SELinux and AppArmor might be able to allow or deny such connections, but they involve a lot of manual configuration and do not inform you when a new application tries to connect somewhere. You have to know in advance which application you want to deny network access.
The existence of Douane (How to control internet access for each program? and DouaneApp.com) shows that there is a need for an easy solution. There is even a distribution that seems to have such a feature included. I am not sure what Subgraph OS (subgraph.com) is using, but they state something like this on their website, and it reads exactly like the initial question: "The Subgraph OS application firewall allows a user to control which applications can initiate outgoing connections. When an unknown application attempts to make an outgoing connection, the user will be prompted to allow or deny the connection on a temporary or permanent basis. This helps prevent malicious applications from phoning home."
As it seems to me, there are only two options at the moment: one, compile Douane manually myself, or two, switch distributions to Subgraph OS. As one of the answers states, everything is possible, so I am surprised there is no other solution. Or is there?
For remote desktop sessions in Linux, I want to know whether there is something available equivalent to what TeamViewer does for Windows.
The main advantage I find in TeamViewer is that it can bypass firewalls and needs no NAT configuration or port-forwarding rules to be set up in the router.
One of the VNC family?
You will have to make the computer visible to the client machine; if you don't want to mess around with firewalls, you will need a third-party reflector service to connect the two of you.
The price of dog food being what it is, we should probably plug Copilot, although there are probably a bunch of free ones.
Erm, TeamViewer is not only for Windows: besides a full Mac implementation, it also has Linux support (although it's beta). I haven't tried it, but... did you?
Are there any secure alternatives to XDMCP (A Linux remote desktop protocol)?
I'd like to set up some thin clients -- UI heads (old computer + mouse + keyboard) connected to VMs on a fast server. ssh -Y doesn't quite cut it, since this would be for non-savvy computer users. I'd like it integrated with kdm/gdm if possible (this seems to rule out Nomachine NX, and I don't like closed source).
I am on a private network, so I guess I'll probably end up going with XDMCP, but it would seem kinda sorry if there aren't any secure open-source alternatives.
This seems like a question for Server Fault, but couldn't you just set up a VPN between the client computer and the server? That way, all traffic between the two machines will be encrypted.
Why not use ssh -X? You could auto-log-on locally with a generic user and then autorun a script displaying a form for entering a username/password, which connects to a session using ssh -X (a rough sketch follows)...
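As a rough sketch of that idea (everything here is hypothetical: the host name "appserver", the desktop command, and the use of Zenity for the login form), the autorun script could look something like this in Python; password handling, e.g. SSH keys or an SSH_ASKPASS helper, is deliberately left out.

    # Kiosk-style login sketch: ask for a user name, then start a forwarded X session.
    import subprocess

    result = subprocess.run(
        ["zenity", "--entry", "--title=Log in", "--text=Username:"],
        capture_output=True, text=True,
    )
    user = result.stdout.strip()

    if user:
        # -X enables X11 forwarding so the remote session displays on this head
        subprocess.run(["ssh", "-X", f"{user}@appserver", "startxfce4"])

Whether that is friendly enough for non-savvy users is another question, but it keeps everything on top of plain ssh.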
Check out NoMachine NX, which is a secure version of X. They reduced the chattiness of the X protocol in a neat way and tunnel it through ssh. It works really well (but disclaimer: my company does resell the software). Available in a free-as-in-beer single-user version, or a paid-for enterprise version. There's also FreeNX, which is a GPL implementation of the server (the protocol, at least in version 3.x, is GPL).
I know similar questions were already asked and answered, but not exactly the same.
I'm looking for an FTP client for Linux with a nice GUI that can do TLS/SSL connections and SFTP as well. That is the main requirement, though tabbed sessions are a plus.
FTPRush is my idol for FTP-ing on Windows; something similar on Linux would be a rock star.
FileZilla? http://filezilla-project.org/
If you use GNOME, then I'd recommend just using Nautilus. It will do SFTP and FTP; I'm not sure about FTP with SSL. It will also do tabs.
Konqueror can do SFTP as well as FTP over SSH.
The FireFTP Firefox extension.
Try CrossFTP
Wikipedia shows (that table) with "information about what internet protocols the clients support" and (another one) with "the operating systems the clients can run on".
Just merge them!
HTH, and good luck!
gFTP should do as well
A couple of possibles:
Kasablanca (KDE-based): http://kasablanca.berlios.de
Another one: Igloo FTP: http://www.iglooftp.com/unix/
Not sure if you can do this, but maybe run FTPRush in Wine? Just a thought.
I've had this same problem, so I can safely say that you probably won't find the silver bullet you are looking for. The FTP clients currently available on Linux just don't compare to some of the Windows clients. Having searched and searched, I had to settle for gFTP and FileZilla. However, there is another decent one that hasn't been mentioned yet: FireFTP. It is an extension to Firefox, so that in itself has its own set of pros and cons.
Before you give up though, you should test drive CrossOver and use it to install FTPRush and see if it works. It's at least worth a try.
Gigolo also works for this. It's very useful when Nautilus is not available (e.g. when using Linux with LXDE). Gigolo establishes the connection, and afterwards, with a file manager, it is possible to open it as a local folder.
Ubuntu 9.04, Fedora 11, Red Hat...
What are the differences from a web server/development standpoint?
None. They differ only in how they package things, but they're all essentially the same - same operating system, same software. Some people get quite emotional about this choice, but I've used several, and there's nothing to pick between them these days.
I like to choose linux distros based on whichever ones have the most help available online. I'd probably go with CentOS or Ubuntu for that reason.
Use whatever your hardware vendor is happy to support. If you're serious about running a production system, you will use a supported OS.
Having said that, most vendors don't officially support CentOS; however, it is sufficiently similar (i.e. almost identical) to Red Hat Enterprise Linux that they ignore the difference.
Your code might run anywhere, but your hardware vendor's tools probably won't. You'll want to use those.
For Tomcat there isn't any difference, but as a server: Ubuntu is more cutting-edge in terms of kernel and packages, and Ubuntu's package management is superior and easier. If you prefer Ubuntu, use the Server edition; it is optimized for server use. CentOS is said to be solid, but I don't have much experience with it. If you are considering a virtual server, keep in mind that different distros have different levels of support for different virtualization technologies.
If you are new to Linux, then I definitely recommend Ubuntu. You can be up and running in no time with apt-get.