Setting up a developers' environment in a second domain - Linux

This may be the wrong forum to ask this question. If so, I'd love links or suggestions as to where to post it.
Background:
In our current environment, developers have Windows desktop machines with decent though not crazy specs and wired Ethernet. Enough to develop, compile, test, commit, etc.
We'll be moving to a new environment (not our choice!) where the desktops are replaced by wifi-only laptops and no fixed workstations. That means, e.g., a dev cannot start a long job at 5 pm, go home while it runs, and have it finish by 8 am the next day, since the wifi connection will be broken. This is an issue for us. An additional complication is that the work is often cross-AD-domain.
Questions:
We're brainstorming other options. Some ideas that have come up:
1. Keep the old desktops, put 'em in a closet, and let the devs RDP to them (mostly Windows, a few Linux via SSH). Wire the desktops to a second domain.
2. Put in a beefy server or two and allow multiple concurrent logins. Install the required software, etc. Wire the servers to a second domain.
3. Set up a VM per dev on a suitable host in the second domain.
4. Containers?
Is #4 even possible? Can you spin up a container that you can remote into in GUI mode in either Linux or Windows?
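For #4, the kind of thing we have in mind on the Linux side is a container that runs its own RDP (or VNC) server. A rough, untested sketch, where the image name, user, and password are all made up for illustration:
# Rough sketch, untested: a Linux "desktop in a container" reachable over RDP.
cat > Dockerfile <<'EOF'
FROM ubuntu:22.04
ENV DEBIAN_FRONTEND=noninteractive
RUN apt-get update \
 && apt-get install -y --no-install-recommends xfce4 xrdp dbus-x11 \
 && adduser --disabled-password --gecos "" dev \
 && echo "dev:changeme" | chpasswd
EXPOSE 3389
# xrdp needs its session manager; keep xrdp itself in the foreground
CMD ["sh", "-c", "/usr/sbin/xrdp-sesman && exec /usr/sbin/xrdp --nodaemon"]
EOF
docker build -t dev-desktop .
docker run -d --name alice-desktop -p 3389:3389 dev-desktop
# connect with mstsc/Remmina to <docker-host>:3389 and log in as "dev"
As far as we know, Windows containers don't support interactive GUI/RDP sessions, so on the Windows side this would push us back towards options 1-3.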
What other options are worth looking at? What have others done? Trying to learn from others' experiences and not re-invent the wheel.

Related

Can you install LAMP onto a network HDD?

I have a WD MyBook Live network-attached-storage device. I want to install a home LAMP server on it, just for testing while building a website. I need to have MySQL as well. Is this an option?
You can install things on an external hard drive, but you cannot run them there. You can only run them on the computer to which you've attached the external hard drive. In other words, you need a computer to run a LAMP stack, and if you have a computer, it doesn't matter if the computer is using an internal or external hard drive. The only difference between an internal hard drive and an external drive is that external hard drives are often (but not always) slower.
Take a look at some forums specifically dedicated to your NAS drive. As I noted in the comment above, the newest version of the MyBook is running a fully featured Debian build.
For example:
http://mybookworld.wikidot.com/mybook-live
It looks like it has some tips and instructions on doing exactly what you want. One thing to watch out for: it's not a standard Intel-based 32/64-bit CPU, it's more like a customized ARM SoC, so the availability of tools and build procedures is going to be more limited.
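If the device really does give you a usable Debian userland with apt, the install itself would look like the usual LAMP commands. A sketch, assuming SSH access and working apt sources (the php5-era package names match the old Debian release these devices shipped with, so treat them as illustrative rather than exact):
# Sketch only; a release that old may need archived mirrors in /etc/apt/sources.list
apt-get update
apt-get install apache2 mysql-server php5 php5-mysql libapache2-mod-php5
# quick smoke test: drop a phpinfo page into the web root
echo '<?php phpinfo();' > /var/www/index.php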
EDIT: I also wanted to add that it might save a lot of headaches to just rent an economy Linux hosting plan at GoDaddy (or wherever) for some personal web space. I did this years ago and have used it for tons of things like this. But if you're the kind of person who likes this sort of project (hacking your electronics to put them to work, etc.), then go for it; just know that sometimes this stuff is not for the faint of heart.

Linux per-program firewall similar to Windows and Mac counterparts

Is it possible to create a GUI firewall that works like its Windows and Mac counterparts? On a per-program basis, with a popup notification window when a specific program wants to send/receive data over the network.
If not, why? What does the Linux kernel lack that prevents such programs from existing?
If yes, why aren't there such programs?
P.S. This is a programming question, not a user one.
Yes, it's possible. You will need to set up firewall rules to route traffic through a userspace daemon; it'll involve quite a bit of work.
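To sketch the plumbing (one possible approach, not a finished design): netfilter can hand new outgoing connections to a userspace program via the NFQUEUE target; that daemon can then map each packet back to the owning process, pop up the allow/deny dialog, and return a verdict.
# Sketch: queue new outgoing connections to userspace queue 0; a daemon bound
# to that queue (e.g. via libnetfilter_queue) inspects each packet and returns
# an ACCEPT or DROP verdict after asking the user.
iptables -A OUTPUT -m conntrack --ctstate NEW -j NFQUEUE --queue-num 0 --queue-bypass
# --queue-bypass lets traffic through instead of blackholing it whenever the
# daemon isn't running.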
Because they're pretty pointless: if the user understands which programs he should block from net access, he could just as well use one of the many existing friendly netfilter/iptables frontends to configure this.
It is possible, there are no restrictions and at least one such application exists.
I would like to clarify a couple of points though.
If I understood this article correctly, the firewalls mentioned here so far, and the iptables this question is tagged with, are packet filters: they accept or drop packets depending mostly on the IP addresses and ports they come from or are sent to.
What you describe looks more like mandatory access control to me. There are several utilities for that purpose in Linux: SELinux, AppArmor, TOMOYO.
If I had to implement the graphical utility you describe, I would pick, for example, AppArmor, which supports whitelists and, to some extent, dynamic profiling, and try to make a GUI for it.
openSUSE's YaST features a graphical interface for AppArmor setup and 'learning', but it is specific to that distribution.
So Linux users and administrators have several ways to control network (and file) access on a per-application basis.
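To give a feel for what such a GUI would generate under the hood, here is a sketch of an AppArmor profile that denies network access to one binary (/usr/bin/example is a made-up placeholder); a frontend would essentially write and reload files like this:
# Sketch only: deny all IPv4/IPv6 network access to /usr/bin/example
sudo tee /etc/apparmor.d/usr.bin.example <<'EOF'
#include <tunables/global>
/usr/bin/example {
  #include <abstractions/base>
  deny network inet,
  deny network inet6,
  /usr/bin/example mr,
}
EOF
sudo apparmor_parser -r /etc/apparmor.d/usr.bin.example   # load/replace the profile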
Why the graphical frontends for MAC are so few is another question. Probably it's because Linux desktop users tend to trust software they install from repositories and have fewer reasons to control it this way (if an application is freely distributed, it has fewer reasons to call home, and packages are normally reviewed before they get to repositories), while administrators and power users are fine with the command line.
As desktop Linux gets more popular and people install more software from the AUR, PPAs, or even gnome-look.org, where packages and scripts are not reviewed that carefully (if at all), demand for this type of software (user-friendly, simple-to-configure MAC) might grow.
To answer your third point:
There is such a program, which provides zenity popups; it is called Leopard Flower:
http://sourceforge.net/projects/leopardflower
Yes. Everything is possible.
There are real antivirus products for Linux, so there could be firewalls with a GUI as well. But as a Linux user I can say that such a firewall is not needed.
I reached this question as I am currently trying to migrate from a Mac to Linux. There are a lot of applications I run on my Mac and on my Linux PC. Some of them I trust fully, but others I do not. Whether or not they were installed from a source that checks them, do I have to trust them just because someone else did? No, I am old enough to decide for myself.
In times when privacy is getting more and more complicated to achieve, and distributions exist that show we should not trust everyone, I like to be in control of what my applications do. This control might not end at the connection to the network/Internet, but that is what this question (and mine) is about.
I have used Little Snitch on Mac OS X over the past years, and I was surprised how often an application likes to access the internet without me even noticing: to check for updates, to call home, ...
Now that I would like to switch to Linux, I have tried to find the same thing, as I want to be in control of what leaves my PC.
During my research I found a lot of questions about this topic. This one, in my opinion, best describes what it is about, and my question is the same: I want to know when an application tries to send or receive information over the network/internet.
Solutions like SELinux and AppArmor might be able to allow or deny such connections, but configuring them means a lot of manual work, and they do not inform you when a new application tries to connect somewhere. You have to already know which application you want to deny network access to.
The existence of Douane (How to control internet access for each program? and DouaneApp.com) shows that there is a need for an easy solution. There is even a distribution which seems to have such a feature included. I am not sure what Subgraph OS (subgraph.com) is using, but they state something like this on their website, and it reads exactly like the initial question: "The Subgraph OS application firewall allows a user to control which applications can initiate outgoing connections. When an unknown application attempts to make an outgoing connection, the user will be prompted to allow or deny the connection on a temporary or permanent basis. This helps prevent malicious applications from phoning home."
As it seems to me, there are only two options at the moment: one, compile Douane manually myself, or two, switch distributions to Subgraph OS. As one of the answers states, everything is possible, so I am surprised there is no other solution. Or is there?

Generic way to know whether a laptop is located in the office or not?

I develop software running on laptops from various companies. The employees are allowed to bring these laptops home or take them on holiday. I want to be able to reliably detect whether the laptops are in the office or not. The laptops are connected to the company network via some kind of VPN (though various solutions are used), so I cannot say that if they can access the internet, they are in the office. To make this question even more interesting, please note that a company might have multiple locations.
Edit: I need to detect this on the laptop.
Speculation: one thing you could look at is the IP addresses allocated to the machine. If you run a VPN at home, then there is probably one IP for the internet connection and one for the VPN.
I think the answer from Rob is close, but maybe you should also take into account the gateway used by the NIC.
And if you have enough time, a tracert to a known server in your office.
That will give you the route and the intermediate hops between the laptop and the known server.
In that case you only have to make sure that, at the office location, the route to the VPN concentrator is different, but that should be possible with a clever DNS/DHCP setup.
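As a rough illustration of the gateway/traceroute idea above (Linux flavor; the gateway addresses and the internal hostname are made-up placeholders, one set per office site):
# Sketch, run on the laptop: compare the current default gateway and the first
# hop towards a known internal server against the per-site office values.
OFFICE_GATEWAYS="10.1.1.1 10.2.1.1"     # placeholder: one gateway per office site
KNOWN_SERVER="intranet.example.com"     # placeholder: host reachable from everywhere
current_gw=$(ip route show default | awk '{print $3; exit}')
first_hop=$(traceroute -n -m 1 "$KNOWN_SERVER" 2>/dev/null | awk 'NR==2 {print $2}')
for gw in $OFFICE_GATEWAYS; do
    if [ "$current_gw" = "$gw" ] || [ "$first_hop" = "$gw" ]; then
        echo "in office"; exit 0
    fi
done
echo "not in office"
The same check translates to Windows with route print and tracert, which is the flavor the answer above refers to.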
You might try a more specific question on serverfault.com
This cannot be done reliably, because branch offices can be set up the same as a home network. And from experience, I'm not saying "almost the same as a home network". I mean literally the same, with clueless managers buying network equipment from the cheapest local shop and running copies of Windows XP Home.

Advantages of Using Linux as primary developer desktop [closed]

I want to get some input on the advantages of developers using Linux as their primary development desktop on a daily basis, as opposed to Windows. This is particularly relevant when your Dev, QA, and Production environments are Linux.
The analogy that I keep coming back to is this: if I build my demo car as a Ford Escort, but my project car is a Ford Mustang, it doesn't make sense at all.
I'm currently at an IT department that allows dual-booting Windows and Linux; some run Linux, while the vast majority use Windows.
Here are several advantages that I've come up with since using Linux as a primary desktop.
Exact same operating system as Dev, QA, and Production
Same scripts (.sh) instead of maintaining both (.bat and .sh). Somewhat mitigated by using Cygwin, but still a bit different.
Team learns simple commands such as: cd, ls, cat, top
Team learns advanced commands like: pkill, pgrep, chmod, su, sudo, ssh, scp
Full access to the install formats typical for Linux, such as RPM and DEB packages, just like the target environments.
The list could go on and on, but I want to get some feedback on anything that I may have missed, or even any disadvantages (of course there are some). To me it makes sense to migrate an entire team over to Linux and use VirtualBox with Windows XP VMs to test functional items on what 95% of the world uses.
There is a similar but slightly different thread going on here as well.
I have to say that being forced into SSH access to a Linux development box for PHP/MySQL development has been one of my greatest and fastest growth experiences as a developer (I formerly worked only in Windows XP as a dev environment). It also bridged some of the knowledge gap between development and sysadmin tasks, which is great for developers to understand more about, especially if you ever end up in a one-man-army kind of situation.
I was all about Windows/Eclipse and point-and-click, and now I am all about Vim and keyboard shortcuts. The color coding/tab-completion stuff is pretty good these days.
Where I work we use Rackspace Cloud servers for production and development. I imaged the production server (2 GB RAM, CentOS 5.2 stack) for a dev server (so the environment IS EXACTLY THE SAME, not close but EXACT) and run it on the smallest instance (256 MB RAM), which is only about $12/month for my dev box. My buddy had a Mac he did local dev on for the same codebase, and he experienced subtle bugs in the code due to the Mac environment that I do not experience on my cloud dev box (or production).
So what I am getting at is that with this type of shift (to the cloud for Linux dev with no GUI), portability, quick recovery from hardware failure, and productivity (keyboard shortcuts rule over point/click/drag-select) are some other major advantages. Obviously you can learn keyboard shortcuts in Windows too, but when forced to work in only a terminal window, you learn a lot more of them out of necessity. I run Windows 7 on a laptop (essentially as a dumb terminal to my cloud dev box), but I SSH into my dev box with PuTTY and work on code with Vim and manage it with git. If my laptop ever fails or gets stolen, all I really need is ANY computer that has an SSH client (and an internet connection), and I can be productive on a temporary loaned computer within 30 minutes until my preferred hardware is fixed or replaced. (All my passwords on the laptop are in a KeePass-encrypted DB, which is backed up to dropbox.com as well as an external HD, plus the occasional Gmail to myself.) And of course I configure PuTTY with nice fonts/font size and full-screen window size.
In contrast, getting a Windows box from a clean install to a dev environment tweaked exactly how you want might take a couple of full-time days, plus a couple of hours here and there for a month, and still not replicate the production environment to your needs.
OK, end of biased rant. I guess my point is that I didn't know what I was missing as a Windows guy, and simple non-GUI Linux tools for web development have proven superior for how we work. But also note my laptop is Windows 7, so when work is done or I need to do some IE testing, I'm on a "normal" OS. However, I doubt a lot of people would be willing to make such a change if there is no perceptible gain or immediate need.
I just switched from Windows XP to Ubuntu; here's what I found.
Pros of Linux
Linux is less likely to be affected by viruses. I lost some time to viruses when I used XP.
As you said, same environment as Dev/QA/Prod which is nice. It's no longer a change of mindset when I connect to one of those machines
Linux is more stable. I usually rebooted XP every week or two.
You get to use the Unix tools (find, pkill, grep, etc.). Cygwin is a workaround but seems quite a bit slower than running Unix natively.
Performance seems quite a bit better on Linux. This is probably the biggest win for me, I have a memory-intensive Dev environment.
Cons of Linux
OpenOffice is a bit of a shock to the system compared to Word/Excel (which I have been using for many years).
I miss Notepad++.
I need to run VirtualBox to host my local SQL Server dev database.
I need to run VirtualBox when running Internet Explorer.
It's more of a pain to copy/paste text between SQL Server Management Studio and IE if needed, because they run in VirtualBox.
Remote Desktop is more of a pain. Microsoft's Remote Desktop allowed me to not have to log out from work before working at home and vice versa.
I have one app that only runs under Wine and won't work at all for me when remote-desktopping into Linux.
I agree with the poster who said it's good to give developers a choice - they will appreciate that instead of having one or the other OS rammed down their throats. An added benefit is that you'll then be able to differentiate the good devs from the bad :) Just kidding.
At my first job, we worked on HP-UX systems, so I really learned to love the power of the console and its elegance:
Use find to work on loads of files
less for really big log or data files without delay
for loops with substring handling to rename thousands of files in seconds (see the sketch after this list).
and many other nice shell hacks to save you time and nerves...
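A small example of the kind of one-liner meant by the renaming point above (made-up filenames; bash parameter expansion does the substring work):
# Rename every *.log in the current directory to *.txt
for f in *.log; do
    mv -- "$f" "${f%.log}.txt"
done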
But not many people seemed to agree at my later jobs...
However, I only once had the possibility to use a Fedora Linux box for development, several years ago. It was a 64-bit system in the first years of their existence; maybe that was the problem. I was looking forward to using a proper shell again, but was disappointed, as Eclipse did not run stably and had a lot of bugs. That was a pity and a no-go. Since then I have never again had the chance to use Linux as a development OS.
As I start a new job in a few days, I am really thinking about giving it another try. What do you think, is it still unstable? I can hardly imagine that.
You won't have to use Visual Studio.
Since that doesn't seem to be an issue for you, you might provide more details: what languages are you developing in? If it's Java, then you'll be spending most of your time in Eclipse, NetBeans, etc., so it really won't make much difference. What is your budget for the changeover, or what savings do you hope to get?
From your reasons it seems that you're pretty committed to UNIX already.
Why not give the developers a choice?
git runs faster.
...
Okay, not that much of an advantage...
Linux boxes are easier to containerize with solutions like Docker so that you can more easily share your environment with other devs or QA.
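For example (a sketch; the gcc:13 image and the make target are placeholders for whatever toolchain your team actually shares), the whole build environment can live in a versioned image instead of on each desktop:
# Build inside a shared toolchain image; the only local dependency is Docker.
docker run --rm -it -v "$PWD":/src -w /src gcc:13 make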
Also, if you need multiple boxes talking to each other for your dev setup, then Linux is a more practical solution. I was working on a Windows machine with a .NET solution which had to talk to services on a different box. I chose to install a couple of VMs using the steps described here (http://mytakeon.it/the-complete-steps-to-having-a-virtual-box-up-and-running-on-your-computer/). The Linux VMs were so lightweight, easy to manage, and faster to boot.

Will I be able to successfully run this Ubuntu (linux) setup in Virtualbox?

I have 4 DVI outputs. It seems I will hopefully have driver support for this. Details online about supporting 4 outputs are sketchy, but it seems possible.
My question is for the Linux folks and VirtualBox pros: will the seamless mode of VirtualBox allow me to use all 3 of my monitors for multiscreen? I'd like to stick with Ubuntu and run Visual C# and other tools in my VirtualBox. Compiz effects are just too amazing to want Aero Glass.
What do you think, will my system be able to use the multiple monitors with VirtualBox and this graphics card? I've googled for hours and am still searching for answers.
Edit:
I tried VirtualBox last night. Pretty slick, though I had an error installing Visual C# .NET. However, it wouldn't let me drag between multiple screens. Is this something the host must resolve, or does the guest session need special settings for multiple monitors? I haven't been able to find anything on Google about supporting multiple monitors with VirtualBox.
You should be able to configure your screens just fine. I don't know the exact details for an ATI setup, but you should be able to use Xinerama to create a single large virtual desktop, and then just run VirtualBox (though honestly, I prefer KVM, which runs on modern CPUs that provide native virtualization support) full-screen on one of those monitors. You would then be able to have three screens dedicated to Ubuntu, and the fourth dedicated to Windows.
You might want to look into the non-Xinerama method of multiple displays. Each display is then treated as its own screen (so you'd have :0.0, :0.1, :0.2, and :0.3 for your X displays). You cannot move applications between the screens, but you get four independent desktops. I personally find that more useful than the idea of a single stretched desktop over multiple displays; when I used a laptop as my primary system, that's what I did, and when I get a second monitor for my computer, I'll likely return to that means of doing things. You'll have to investigate the specifics for such a setup with ATI, but the X server supports it, so it's just a matter of looking at your ATI driver's documentation to put the pieces together.
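For reference, a sketch of the relevant xorg.conf fragment for the separate-screens layout described above (identifiers are placeholders and only two screens are shown; the matching Device/Monitor/Screen sections have to come from your ATI driver's documentation):
# Sketch only: a separate-screens (non-Xinerama) layout; extend the pattern to
# four screens. Appends to xorg.conf; adjust if a ServerLayout already exists.
sudo tee -a /etc/X11/xorg.conf <<'EOF'
Section "ServerLayout"
    Identifier  "MultiScreen"
    Screen      0  "Screen0"
    Screen      1  "Screen1" RightOf "Screen0"
    Option      "Xinerama" "off"   # "on" would merge them into one large desktop
EndSection
EOF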

Resources