I'm going to ask the question very generically, so please bear with me.
The current setup I'm working with is an old Linux computer in the office controlling millions of dollars' worth of equipment in the fab. Because of firewalls and the computer's age, we can't remote-desktop to the machine, and replacing or modifying the current system is not an option. Many problems that come up can be solved in 10 minutes, but they require a 30-minute commute to the office.
Is there a way to control the input to the current system with a modern PC? The idea would be to hardwire the modern PC into the Linux system and then remote into that PC. Ideally we could log in to the PC remotely and see and control the Linux system, while the Linux system and the fab don't know anything has changed.
I'm having a bizarre problem with some virtual servers created to record podcasts. They run on Amazon AWS as Windows Server 2012 instances, and a small C# app tells FFmpeg to do the heavy lifting of capturing from the virtual screen and reading from the virtual sound card (Virtual Audio Cable: https://en.wikipedia.org/wiki/Virtual_Audio_Cable) via DirectShow filters.
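For reference, the FFmpeg invocation is roughly of this shape (a simplified sketch: FFmpeg's built-in gdigrab stands in for the screen source here, and the audio device name is just whatever DirectShow reports for the Virtual Audio Cable endpoint on each machine):

    # simplified sketch of the capture command; the audio device name is a per-machine placeholder
    ffmpeg -f gdigrab -framerate 25 -i desktop -f dshow -i audio="Line 1 (Virtual Audio Cable)" -c:v libx264 -c:a aac recording.mp4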
The problem is that if I leave the machine to do its work unattended, the recordings are sometimes silent. If I log in via VNC and watch it working, the audio is recorded just fine. All other aspects of the test are the same, and the virtual machine is shut down between successive recordings, so each one should theoretically be a clean slate. The app runs under a logged-in session (hence the use of VNC rather than RDP).
I'm now wondering if there is some optimisation in the Windows sound engine whereby it doesn't bother playing audio if it thinks no one is listening. The confusing thing is that not every virtual machine suffers from this; some of them record fine in unattended mode, and they're all created from the same seed virtual hard disk image.
I'm asking this question with the aim of putting together a list of things I can check, look into, or debug. I don't have much knowledge of how MME/DirectSound/WASAPI work internally...
I have several Linux (mostly Debian) servers running on a Proxmox platform. All of them connect to the Internet through an ADSL line, with only one public IP.
One of them has been running OMD (Open Monitoring Distribution) for more than a year to monitor an EXTERNAL server (on another network, monitored through that ADSL Internet connection).
Now I have received a message from the owners of the remote server saying that they detected a port scan run during the night from my ADSL public IP, probing their open ports.
It's the second time this has happened to me with a Debian system :(
I need to detect the process running that scan.
How can I find out what process is launching that port scan from the offending Linux box? The difficulty is that I'd need to run something to catch the process while the scan is taking place, which can happen at some point during the night.
Is there a way to get a list of processes that were launched and then finished between two times (i.e. new processes started from 23:00 to 03:00)?
Thanks in advance
I would recommend that you place an IDS (Intrusion Detection System) on your network in order to identify the offending system. Snort is a widely used IDS that would be suitable for the job. It would probably be easiest to configure a span port on your router to send all outgoing traffic to the IDS. The Snort rules you can download from their website will generate alerts if port scanning occurs.
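Port-scan alerts in Snort 2.x typically come from the sfportscan preprocessor rather than an ordinary rule; a minimal snort.conf entry looks roughly like this (verify the exact options against the snort.conf shipped with your version):

    # enable Snort's port-scan detection preprocessor (Snort 2.x); tune sense_level as needed
    preprocessor sfportscan: proto { all } scan_type { all } sense_level { low }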
While the IDS is running, you should simultaneously monitor processes on the Linux systems you suspect of conducting the unwanted scanning. A simple script could save process listings at a regular interval, or log newly seen processes with timestamps. If something more polished is desired, there is freely available process-logging software.
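A minimal sketch of such a script (run it as root so it sees every user's processes; the log path and 60-second interval are arbitrary choices):

    #!/bin/sh
    # append a timestamped process listing to a log file once a minute
    while true; do
        { date '+%F %T'; ps -eo pid,ppid,user,lstart,args; echo; } >> /var/log/ps-snapshots.log
        sleep 60
    done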
Run the IDS and process monitoring overnight. Find the unwanted port scanning in the IDS alerts to identify the system. Cross-reference the time of the alert with the process logging in order to track down what is creating this traffic.
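If the scanner turns out to be a short-lived process that slips between snapshots, the kernel audit framework can log every exec instead; a rough sketch, assuming auditd is installed:

    # record every program execution (64-bit syscall ABI), tagged with a searchable key
    auditctl -a always,exit -F arch=b64 -S execve -k scan-hunt
    # the next morning, list everything executed since yesterday in readable form
    ausearch -k scan-hunt -ts yesterday -i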
Good luck.
I want to get some input on the advantages of developers using Linux as their primary development desktop on a daily basis, as opposed to Windows. This is particularly relevant when your Dev, QA, and Production environments are Linux.
The analogy I keep coming back to is this: if I build my demo car as a Ford Escort, but my project car is a Ford Mustang, it doesn't make sense at all.
I'm currently in an IT department that allows dual-booting Windows and Linux; some people run Linux, but the vast majority use Windows.
Here are several advantages that I've come up with since switching to Linux as my primary desktop:
The exact same operating system as Dev, QA, and Production
The same scripts (.sh) instead of maintaining both .bat and .sh versions. Somewhat mitigated by using Cygwin, but still a bit different.
The team learns simple commands such as cd, ls, cat, top
The team learns advanced commands like pkill, pgrep, chmod, su, sudo, ssh, scp
Full access to the installers typically targeted at Linux, such as RPM and DEB packages, just like the target environments
The list could go on and on, but I want to get some feedback on anything I may have missed, or even any disadvantages (of course there are some). To me it makes sense to migrate an entire team over to Linux and use VirtualBox with Windows XP VMs to test the functionality that 95% of the rest of the world uses.
There is a similar, but a little different, thread going on here as well: link text
I have to say that being forced into SSH access to a Linux development box for PHP/MySQL development has been one of my greatest and fastest growth experiences as a developer (I formerly worked only in Windows XP as a dev environment). It also bridged some of the knowledge gap between development and sysadmin tasks, which is great for developers to understand more about, especially if you ever end up in a one-man-army kind of situation.
I was all about Windows/Eclipse and point-and-click, and now I am all about Vim and keyboard shortcuts. The color coding and tab-completion stuff is pretty good these days.
Where I work we use Rackspace Cloud servers for production and development. I imaged the production server (2 GB RAM, CentOS 5.2 stack) for a dev server (so the environment IS EXACTLY THE SAME, not close but EXACT) and run it on the smallest instance (256 MB RAM), which is only about $12/month for my dev box. My buddy had a Mac he did local dev on for the same codebase, and he experienced subtle bugs due to the Mac environment that I do not experience on my cloud dev box (or production).
So what I am getting at is that with this type of shift (to the cloud for Linux dev with no GUI), portability, quick recovery from hardware failure, and productivity (keyboard shortcuts rule over point/click/drag-select) are some other major advantages. Obviously you can learn keyboard shortcuts in Windows too, but when forced to work in only a terminal window, you learn a lot more of them out of necessity. I run Windows 7 on a laptop (essentially as a dumb terminal to my cloud dev box), but I SSH into my dev box with PuTTY, work on code with Vim, and manage it with git. If my laptop ever fails or gets stolen, all I really need is ANY computer with an SSH client (and an internet connection) and I can be productive on a temporary loaned computer within 30 minutes until my preferred hardware is fixed or replaced. (All my passwords on the laptop are in a KeePass-encrypted DB, which is backed up on dropbox.com as well as an external HD, with the occasional Gmail to self.) And of course I configure PuTTY with nice fonts, a good font size, and a full-screen window.
In contrast, getting a Windows box from a clean install to a dev environment tweaked exactly how you want it might take a couple of full-time days, plus a couple of hours here and there for a month, and still not replicate the production environment to your needs.
OK, end of biased rant. I guess my point is that I didn't know what I was missing as a Windows guy, and simple non-GUI Linux tools for web development have proven superior for how we work. But also note that my laptop is Windows 7, so when work is done or I need to do some IE testing, I'm on a "normal" OS. However, I doubt a lot of people would be willing to make such a change if there were no perceptible gain or immediate need.
I just switched from Windows XP to Ubuntu; here's what I found:
Pros of Linux
Linux is less likely to be affected by viruses. I lost some time to viruses when I used XP.
As you said, it's the same environment as Dev/QA/Prod, which is nice. It's no longer a change of mindset when I connect to one of those machines.
Linux is more stable. I usually rebooted XP every week or two.
You get to use the unix tools (find, pkill, grep, etc.). Cygwin is a workaround but seems quite a bit slower than running unix natively.
Performance seems quite a bit better on Linux. This is probably the biggest win for me, I have a memory-intensive Dev environment.
Cons of Linux
Open Office is a bit of a shock to the system compared to Word/Excel (which I have been using for many years).
I miss Notepad++
I need to run VirtualBox to host my local SQL Server dev database
I need to run VirtualBox when running Internet Explorer
It's more of a pain to copy/paste text between SQL Server Management Studio and IE when needed, because they run in VirtualBox
Remote Desktop is more of a pain. Microsoft's Remote Desktop meant I didn't have to log out at work before working from home, and vice versa.
I have one app that only runs under Wine and won't work at all for me when remote-desktopping into Linux.
I agree with the poster who said it's good to give developers a choice - they will appreciate that instead of having one or the other OS rammed down their throats. An added benefit is that you'll then be able to differentiate the good devs from the bad :) Just kidding.
At my first job, we worked on HP-UX systems, so I really learned to love the power of the console and its elegance:
Use find to work on loads of files
less for really big log or data files without delay
for loops with substring handling to rename thousands of files in seconds (see the sketch after this list).
and many other nice shell hacks to save you time and nerves...
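Here is the rename sketch referred to above (adjust the patterns and suffixes to your case):

    # rename every *.log to *.log.bak by stripping and re-adding the suffix
    for f in *.log; do
        mv "$f" "${f%.log}.log.bak"
    done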
But not many people seemed to agree at my later jobs...
However, I only once had the chance to use a Fedora Linux box for development, several years ago. It was a 64-bit system in the first years of their existence; maybe that was the problem. I was looking forward to using a proper shell again, but was disappointed because Eclipse did not run stably and had a lot of bugs. That was a pity and a no-go. Since then I have never again had the chance to use Linux as a development OS.
As I start a new job in a few days, I'm really thinking about giving it another try. What do you think, is it still unstable? I can hardly imagine that.
You won't have to use Visual Studio.
Since that doesn't seem to be an issue for you, you might provide more details: what languages are you developing in? If it's Java, then you'll be spending most of your time in Eclipse, NetBeans, etc., so it really won't make much difference. What is your budget for the changeover, or what savings do you hope to get?
From your reasons it seems that you're pretty committed to UNIX already.
Why not give the developers a choice?
git runs faster.
...
Okay, not that much of an advantage...
Linux boxes are easier to containerize with solutions like Docker, so you can more easily share your environment with other devs or QA.
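For example, sharing an environment can be as simple as this (a sketch, assuming Docker is installed and a Dockerfile describing your stack sits in the current directory; the image tag is arbitrary):

    # build the shared environment once from the Dockerfile
    docker build -t team-dev-env .
    # any dev or QA box can then drop into an identical shell
    docker run -it --rm team-dev-env bash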
Also, if you need multiple boxes talking to each other for your dev setup, then Linux is a more practical solution. I was working on a Windows machine with a .NET solution which had to talk to services on a different box. I chose to install a couple of VMs using the steps described here (http://mytakeon.it/the-complete-steps-to-having-a-virtual-box-up-and-running-on-your-computer/). The Linux VMs were lightweight, easy to manage, and faster to boot.
Hopefully this still falls within Stack Overflow's umbrella!
I'm looking to create a quick-booting Linux laptop for my wife. All it really needs to be able to do is browse the internet (with Flash and video, etc.).
Are there any distros that are made for this, or any guides out there that show good ways to speed stuff up? I've read that I should "remove stuff from the kernel that I don't use" but that's a little out of my skill set.
Thanks!
If you're using Ubuntu (or a variant, like Xubuntu or Kubuntu), there is a package called BootUp-Manager. There's an article about it over at Lifehacker. It lets you check and uncheck things in the startup and shutdown scripts to optimize the boot (such as turning off checking for new hardware, or whatever).
You may also be able to gain a simple speed-up by going into System->Administration->Services and disabling any services you don't need.
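If you prefer the command line, the per-service equivalent looks roughly like this on sysvinit-era Ubuntu (bluetooth is just an example service name):

    # stop the service now and keep it from starting at boot
    sudo /etc/init.d/bluetooth stop
    sudo update-rc.d -f bluetooth remove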
If you'd like to see how much time is being spent on each part, install the Bootchart package. It will give you a detailed profile of everything that goes on during startup, let you focus on the most time-consuming parts, and measure your progress as you tune the system.
I believe Xubuntu is designed for a low memory footprint, fast booting, and whatnot, while still having a decent set of features. I'm not a Linux user, but it just sticks out in my head.
Some guys got an EEE PC netbook booting in 5 seconds running a modified version of Fedora. Might be a good starting point: http://lwn.net/Articles/299483/
Try Damn Small Linux: a very versatile 50 MB mini desktop-oriented Linux distribution.
Alternatively, get an Asus motherboard with ExpressGate - it has an onboard Linux (Splashtop) that boots in 3 seconds. It's designed for quick web surfing, IM, music, etc. whilst still letting you boot into your main OS.
If you really want it to boot fast, I would suggest creating an initrd containing exactly the software you need to do what you want it to do. The initrd gets read from the disk once as one large file, and then everything runs out of RAM.
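The packing step itself is short; a rough sketch, assuming you have already assembled a minimal root filesystem (with its own /init script) under /tmp/tiny-root:

    # pack the minimal root into a compressed initramfs image the kernel can boot
    cd /tmp/tiny-root
    find . | cpio -o -H newc | gzip > /boot/tiny-initrd.img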
This is not an easy solution; the easiest solution would be jishi's suggestion of using a Live CD, but that won't be the fastest.
I have been using Kubuntu 14.04.4/5 on a Dell laptop with dual boot. I'm not really happy with it at the moment. You are all correct about the slowness of a live CD.
There are live CD versions of working Linux distros with a browser and Flash and Java already installed.
Check out the Live CD article:
http://en.wikipedia.org/wiki/Live_CD
There you will find download links to the different flavours.
There are also USB-drive versions.