I have a WD MyBook Live network-attached-storage device. I want to install a home LAMP server on it, just for testing while I build a website. I also need MySQL. Is this an option?
You can install things on an external hard drive, but you cannot run them there. You can only run them on the computer to which you've attached the external hard drive. In other words, you need a computer to run a LAMP stack, and if you have a computer, it doesn't matter if the computer is using an internal or external hard drive. The only difference between an internal hard drive and an external drive is that external hard drives are often (but not always) slower.
Take a look at some forums specifically dedicated to your NAS drive. As I noted in the comment above, the newest version of the MyBook runs a fully featured Debian build.
For example:
http://mybookworld.wikidot.com/mybook-live
Looks like it has some tips and instructions on doing exactly what you want. One thing to watch out for: it's not a standard Intel 32/64-bit CPU; it's more of a customized SoC architecture, so the availability of tools and build procedures is going to be more limited.
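If the wiki's instructions pan out and you can get SSH access to the Debian system on it, the LAMP pieces install the usual way. A minimal sketch, assuming the package names below still exist in the device's (older) Debian release for its architecture:

    # over SSH on the MyBook Live, as root
    apt-get update
    apt-get install apache2 mysql-server php5 php5-mysql libapache2-mod-php5
    /etc/init.d/apache2 restart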
EDIT: I also wanted to add that it might save a lot of headaches to just rent an economy Linux hosting plan at GoDaddy (or wherever) for some personal web space. I did this years ago and have used it for tons of things like this. But if you're the kind of person who likes this sort of project (hacking your electronics to put them to work, etc.), then go for it - just know that this stuff isn't always for the faint of heart.
This may be the wrong forum to ask this question. If so, I'd love links or suggestions as to where to post it.
Background:
In our current environment, developers have Windows desktop machines with decent though not crazy specs and wired Ethernet. Enough to develop, compile, test, commit, etc.
We'll be moving to a new environment (not our choice!) where the desktops are replaced by wifi-only laptops and no fixed workstations. That means, for example, a dev cannot start a long job at 5 pm, go home while it runs, and have it finished by 8 am the next day, since the wifi connection will be broken. This is an issue for us. An additional complication is that the work is often cross-AD-domain.
Questions:
We're brainstorming other options. Some ideas that have come up:
1. Keep the old desktops, put 'em in a closet, and let the devs RDP to them (mostly Windows, a few Linux via SSH). Wire the desktops to a second domain.
2. Put in a beefy server or two and allow multiple concurrent logins. Install the required software, etc. Wire the servers to the second domain.
3. Set up a VM per dev on a suitable host in the second domain.
4. Containers?
Is #4 even possible? Can you spin up a container that you can remote into in GUI mode in either Linux or Windows?
What other options are worth looking at? What have others done? Trying to learn from others' experiences and not re-invent the wheel.
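For concreteness, the kind of thing I'm picturing for #4 on the Linux side would be roughly the following (the image name is purely hypothetical, and it assumes an image with sshd and the toolchain baked in):

    # run a container that exposes SSH on the build host
    docker run -d --name devbox -p 2222:22 my-dev-image
    # then remote in with X11 forwarding so GUI tools can display on the laptop
    # (assuming the laptop runs an X server)
    ssh -X -p 2222 dev@build-host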
I have a project whose files are on a guest OS (Red Hat Enterprise Linux) running in VirtualBox; my host OS is macOS. I used to code right in RHEL with the Atom editor, but my boss told me that it's inefficient to code in a guest OS. That makes sense, because macOS or Windows is more responsive than Linux, so I changed my workflow:
1. Copy the whole project located on RHEL to a shared folder between macOS and RHEL using rsync.
2. Code with Atom on macOS.
3. Copy the project in the shared folder back to the original project on RHEL with rsync.
I'm using Atom (not Vim in RHEL) because it can edit the whole project in one window, which is convenient for my situation. But there is a problem: after copying the project back in step 3, git status shows that everything has changed even though I only edited a few files, which is a little annoying. (The commands I run for steps 1 and 3 are sketched below.)
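For reference, steps 1 and 3 are roughly the following (paths simplified, so treat them as placeholders):

    # step 1: copy the project from RHEL into the VirtualBox shared folder
    rsync -r ~/project/ /media/sf_share/project/
    # step 3: copy the edited files back from the shared folder to the original location
    rsync -r /media/sf_share/project/ ~/project/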
Is there a better way to code in such an environment? Any advice is appreciated.
BretzL's suggestion to use shared folders is a good one, but I think it's important to address the underlying issue: your boss' assumption about coding being inefficient or slow just because you're working on a VM is simply not true.
It sounds like your new workflow, which was instituted as a result of that advice, is making development harder than it was on the VM. The shared folders will help with that, but if the VM is configured with access to enough cores and memory, its performance for most tasks will be fine, and there may not be any problem with developing on the VM directly. I do a significant amount of development on a VM and haven't had any issues. You may see slower builds on the VM if you're building whole kernels or other large projects, but otherwise it should be fine.
If you didn't have any performance or productivity problems before forcing yourself to work outside of the VM, then... it wasn't a problem.
(I also have an issue with the assumption that Linux is always less responsive than Windows or Mac OS, but that's a debate for a different day.)
VirtualBox supports shared folders, so you don't need to rsync back and forth. Just mount the shared folder where your application server on the RHEL guest expects the code.
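A minimal sketch of that, assuming the Guest Additions are installed in the guest and the share is named project (adjust the name and mount point to your setup):

    # on the RHEL guest: mount the VirtualBox shared folder where the code is expected
    sudo mkdir -p /srv/myproject
    sudo mount -t vboxsf project /srv/myproject
    # if git still flags permission-only changes on files living on the share, this quiets it
    git config core.fileMode false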
I also recommend you take a look at https://www.vagrantup.com/ for managing developer VMs.
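Getting started with it is only a few commands. A rough sketch, using the public centos/7 box purely as a stand-in for your RHEL setup:

    vagrant init centos/7   # centos/7 stands in for an RHEL-like base box
    vagrant up              # boots the VM; by default the project folder is synced to /vagrant inside it
    vagrant ssh             # shell into the guest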
I'm studying for a CS degree, and I need to install Ubuntu for a computer systems class. We are going to do low-level assembly optimizations and things like that, so they don't want us to install it in VMware.
Now, I don't want to do a regular dual-boot install, because I already did one on my previous computer a couple of years ago and wrecked my hard disk with the partitioning. Wikipedia says you can use Wubi to boot Ubuntu from an ISO, or install it to a flash drive and boot it from there, thus removing the need for partitioning.
Now, my question is: how different is it to program on Ubuntu booted from a regular hard-disk partition, from a Wubi ISO, and from an SD card? I guess programs will work the same on all three options, but since we're going to play with low-level assembly optimizations - can I expect any differences in that department?
I'm not sure I'd go this route if it were me personally.
You should be fine in terms of whatever bare metal type stuff you want to do -- you're working with memory, the cache, the chip, etc, so your disk drive shouldn't matter (unless you're doing stuff to the filesystem or something).
Where I think you might get annoyed is the logistics of setting up your development environment. Every time you boot from your USB stick, you're going to need to sudo apt-get install GCC, SciTE, et al., load your files into the directories you want, and then get started. That's a hassle. You could optimize this somewhat by creating a custom ISO of your environment with some kind of tool (you might be able to do it with Clonezilla), but still... yuck.
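To give a sense of the repetition, each boot from the stick would need something like this (the package names are only examples):

    sudo apt-get update
    sudo apt-get install build-essential gdb nasm scite   # compiler, debugger, assembler, editor
    # ...then pull your working files back into place from wherever you keep them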
I would suggest (speaking of Clonezilla) that you snapshot your hard drive, go ahead and install Ubuntu as a dual boot, and then you have a backup if anything goes wrong. Or, I'd think you could get by using the school's machines. Don't they have any Linux boxes that you can SSH into, if not use in the labs?
Anyway, good luck. :)
I want to get some input on the advantages of developers using Linux as their primary development desktop on a daily basis, as opposed to Windows. This is particularly relevant when your Dev, QA, and Production environments are Linux.
The analogy I keep coming back to is this: if I build my demo car as a Ford Escort but my project car is a Ford Mustang, it doesn't make sense at all.
I'm currently at an IT department that allows dual boot with Windows and Linux, but some run Linux while the vast majority use Windows.
Here are several advantages I've come up with since using Linux as my primary desktop:
The exact same operating system as Dev, QA, and Production
Same scripts (*.sh) instead of maintaining both *.bat and *.sh. Somewhat mitigated by using Cygwin, but still a bit different.
The team learns simple commands such as cd, ls, cat, top
The team learns advanced commands like pkill, pgrep, chmod, su, sudo, ssh, scp
Full access to the install mechanisms typical of Linux (RPM, DEB packages), just like the target environments
The list could go on and on, but I want to get some feedback on anything I may have missed, or even any disadvantages (of course there are some). To me it makes sense to migrate an entire team over to Linux and use VirtualBox to run Windows XP VMs for testing the functional items on the platform that 95% of the world uses.
There is a similar but slightly different thread going on here as well:
link text
I have to say that being forced into SSH access to a Linux development box for PHP/MySQL development has been one of my greatest and fastest growth experiences as a developer (I formerly worked only in Windows XP as a dev environment). It also bridged some of the knowledge gap between development and sysadmin tasks, which is great for developers to understand, especially if you ever end up in a one-man-army kind of situation.
I was all about Windows/Eclipse and point-and-click, and now I am all about Vim and keyboard shortcuts. The color coding and auto-tab-complete stuff is pretty good these days.
Where I work we use Rackspace Cloud servers for production and development. I imaged the production server (2 GB RAM, CentOS 5.2 stack) for a dev server (so the environment IS EXACTLY THE SAME - not close, but exact) and run it on the smallest instance (256 MB RAM), which is only about $12/month for my dev box. My buddy had a Mac he did local dev on for the same codebase, and he experienced subtle bugs in the code due to the Mac environment that I do not experience on my cloud dev box (or production).
So what I'm getting at is that with this kind of shift (to the cloud for Linux dev with no GUI), portability, quick recovery from hardware failure, and productivity (keyboard shortcuts beat point/click/drag-select) are some other major advantages. Obviously you can learn keyboard shortcuts in Windows too, but when forced to work in only a terminal window, you learn a lot more of them out of necessity. I run Windows 7 on a laptop (essentially as a dumb terminal to my cloud dev box), but I SSH into the dev box with PuTTY, work on code with Vim, and manage it with git. If my laptop ever fails or gets stolen, all I really need is ANY computer with an SSH client and an internet connection, and I can be productive on a temporary loaned machine within 30 minutes until my preferred hardware is fixed or replaced. (All my passwords on the laptop are in a KeePass-encrypted DB, which is backed up on dropbox.com as well as an external HD, with the occasional Gmail-to-self.) And of course configure PuTTY with nice fonts, font size, and a full-screen window size.
In contrast, getting a Windows box from a clean install to a dev environment tweaked exactly how you want might take a couple of full-time days, plus a couple of hours here and there for a month, and it still won't replicate the production environment to your needs.
OK, end of biased rant - I guess my point is that I didn't know what I was missing as a Windows guy, and simple non-GUI Linux tools for web development have proven superior for how we work. But note that my laptop is Windows 7, so when work is done or I need to do some IE testing, I'm on a "normal" OS. However, I doubt many people would be willing to make such a change if there's no perceptible gain or immediate need.
I just switched from Windows XP to Ubuntu; here's what I found:
Pros of Linux
Linux is less likely to be affected by viruses. I lost some time to viruses when I used XP.
As you said, same environment as Dev/QA/Prod which is nice. It's no longer a change of mindset when I connect to one of those machines
Linux is more stable. I usually rebooted XP every week or two.
You get to use the unix tools (find, pkill, grep, etc.). Cygwin is a workaround but seems quite a bit slower than running unix natively.
Performance seems quite a bit better on Linux. This is probably the biggest win for me, I have a memory-intensive Dev environment.
Cons of Linux
OpenOffice is a bit of a shock to the system compared to Word/Excel (which I had been using for many years).
I miss Notepad++
I need to run VirtualBox to host my local SQL Server dev database
I need to run VirtualBox when running Internet Explorer
It's more of a pain to copy/paste text between SQL Server Management Studio and IE if needed, because they run in VirtualBox
Remote Desktop is more of a pain. Microsoft's Remote Desktop allowed me to not have to log out at work before working at home, and vice versa
I have one app that only runs under Wine and won't work at all for me when remote-desktopping into Linux
I agree with the poster who said it's good to give developers a choice - they will appreciate that instead of having one or the other OS rammed down their throats. An added benefit is that you'll then be able to differentiate the good devs from the bad :) Just kidding.
At my first job, we worked on HP-UX systems, so I really learned to love the power of the console and its elegance:
Use find to work on loads of files
less for really big log or data files without delay
for loops with substring handling to rename thousands of files in seconds (roughly like the sketch below)
and many other nice shell hacks to save you time and nerves...
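Roughly the kind of one-liner I mean (the extensions here are just an example):

    # rename thousands of *.dat files to *.dat.bak in one go, using substring handling
    for f in *.dat; do mv "$f" "${f%.dat}.dat.bak"; done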
But not many people seemed to agree at my later jobs...
However, I only once had the opportunity to use a Fedora Linux box for development, several years ago. It was a 64-bit system in the early years of 64-bit's existence; maybe that was the problem. I was looking forward to using a proper shell again, but I was disappointed because Eclipse did not run stably and had a lot of bugs. That was a pity and a no-go. Since then I have never again had the chance to use Linux as a development OS.
As I am starting a new job in a few days, I am really thinking about giving it another try. What do you think - is it still that unstable? I can hardly imagine it.
You won't have to use Visual Studio.
Since that doesn't seem to be an issue for you, you might provide more details - what languages are you developing in? If it's Java, then you'll be spending most of your time in Eclipse, NetBeans, etc., so it really won't make much difference. What is your budget for the changeover, or what savings do you hope to get?
From your reasons it seems that you're pretty committed to UNIX already.
Why not give the developers a choice?
git runs faster.
...
Okay, not that much of an advantage...
Linux boxes are easier to containerize with solutions like Docker so that you can more easily share your environment with other devs or QA.
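To give an idea of what sharing an environment that way looks like (the image and tag names are placeholders):

    # describe the toolchain once in a Dockerfile, then everyone runs the same image
    docker build -t myteam/dev-env:1.0 .
    docker push myteam/dev-env:1.0           # so other devs/QA can pull the identical environment
    docker run -it --rm -v "$PWD":/src myteam/dev-env:1.0 bash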
Also, if you need multiple boxes talking to each other for your dev setup, then Linux is a more practical solution. I was working on a Windows machine with a .NET solution that had to talk to services on a different box. I chose to install a couple of VMs using the steps described here (http://mytakeon.it/the-complete-steps-to-having-a-virtual-box-up-and-running-on-your-computer/). The Linux VMs were so lightweight, easy to manage, and faster to boot.
Hopefully this still falls under Stack Overflow's umbrella!
I'm looking to create a quick-booting Linux laptop for my wife. All it really needs to be able to do is browse the internet (with Flash and video, etc.).
Are there any distros that are made for this, or any guides out there that show good ways to speed stuff up? I've read that I should "remove stuff from the kernel that I don't use" but that's a little out of my skill set.
Thanks!
If you're using Ubuntu (or a variant like Xubuntu or Kubuntu), there is a package called BootUp-Manager. There's an article about it over at Lifehacker. It lets you check and uncheck things in the startup and shutdown scripts to optimize things (such as turning off checking for new hardware, or whatever).
You may also be able to gain a simple speed-up by going into System->Administration->Services and disabling any services you don't need.
If you'd like to see how much time is being spent on each part, install the package Bootchart, and that should give you a detailed profile of everything that goes on during startup, and let you focus on the most time-consuming parts, and measure your progress as you tune the system.
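On Ubuntu of that era, the command-line equivalents look roughly like this (the service name is just an example; only disable what you're sure you don't need):

    sudo apt-get install bum bootchart      # BootUp-Manager and the boot profiler
    sudo update-rc.d -f bluetooth remove    # example: stop an unneeded service from starting at boot
    # after the next reboot, Bootchart leaves a chart of the boot sequence (typically under /var/log/bootchart)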
I believe Xubuntu is designed for a low memory footprint, fast booting, and the like, while still having a decent set of features. I'm not a Linux user, but it sticks out in my head.
Some guys got an EEE PC netbook booting in 5 seconds running a modified version of Fedora. Might be a good starting point: http://lwn.net/Articles/299483/
Try Damn Small Linux: a very versatile, 50 MB, mini desktop-oriented Linux distribution.
Alternatively, get an ASUS motherboard with Express Gate - it has an onboard Linux (Splashtop) that boots in 3 seconds. It's designed for quick web surfing, IM, music, etc., while still letting you boot into your main OS.
If you really want it to boot fast, I would suggest creating an initrd containing exactly the software you need to do what you want it to do. The initrd gets read from the disk once as one large file, and then everything runs out of RAM.
This is not an easy solution; the easiest solution would be jishi's suggestion of using a live CD, but that won't be the fastest.
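Very roughly, the packing step looks like this once you've assembled a minimal root filesystem tree containing just the browser and an init script (assembling that tree is the hard part):

    # pack the minimal tree into a compressed cpio archive the kernel can use as an initrd
    cd minimal-rootfs
    find . | cpio -o -H newc | gzip > ../initrd.img
    # point the bootloader's initrd entry at initrd.img; everything then runs from RAM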
I have been using Kubuntu 14.04.4/5 on a Dell laptop with dual boot, and I'm not really happy with it at the moment. You are all correct about the slowness of a live CD.
There are live-CD versions of working Linux distros with a browser and Flash/Java already installed.
Check out the Live CD article:
http://en.wikipedia.org/wiki/Live_CD
There you will find links to different flavours, with downloads.
There are also USB-drive versions.