Install frameworks in IDE or system wide? - linux

When installing Node.js or PyTorch, for example, I can either install them in the regular system-wide Linux terminal or use the PyCharm or VS Code terminal.
With PyCharm, as I understand it, I can create virtual environments to manage different Python versions. Is it the same in VS Code?
If not, is there a difference between installing Node.js through VS Code and through the system terminal?

A virtual environment is much better, because you control the version. This matters when, for example, you have one version in production and one in development, and you want to see whether you can upgrade your tools.
PyCharm is now collaborating with Anaconda to improve integration between the two tools, so with a conda environment you can manage not only Python virtual environments but also npm and other non-Python programs and utilities.
The disadvantage: a few more commands to learn (and it is more complex). But you will make mistakes, and with a virtual environment you can simply remove the environment and start again. With a system-wide install it is much harder to know what you installed versus what came with the base system.
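A minimal sketch of that throwaway-environment workflow with Python's built-in venv module (the project path here is just a placeholder):

```shell
# Create an isolated environment for one project
proj=$(mktemp -d)                  # stand-in for your project directory
python3 -m venv "$proj/.venv"

# Use the environment's own interpreter; pip here touches nothing system-wide
"$proj/.venv/bin/python" -m pip --version

# Made a mess? Delete the directory and start fresh
rm -rf "$proj/.venv"
```

Deleting and recreating the directory is the whole "uninstall" story, which is exactly the advantage over cleaning up a system-wide install.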

All the options have their pros and cons.
Installing anything through an IDE makes your development dependent on that IDE. Furthermore, you lose the opportunity to learn the package management of the software you use.
Learning npm or pip is really not hard; in this sense the IDE shortcuts are for absolute beginners.
However, in many cases they may help your IDE "know" about the packages better.
I would strongly advise against installing anything system-wide. It can have unwanted interactions with your system.
It is best to install them as a user, into your home directory, but independently of your IDE. However, this requires the most learning.
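For the per-user route with pip, the install prefix lives in your home directory; a quick way to see where (assuming a stock CPython):

```shell
# pip install --user puts packages under this prefix, not under /usr
python3 -m site --user-base            # typically /home/<you>/.local

# Console scripts from --user installs land in <user-base>/bin, so that
# directory needs to be on PATH, e.g. via a line like this in ~/.profile:
#   export PATH="$HOME/.local/bin:$PATH"
```

Node can be kept out of the system the same way, e.g. by unpacking a release tarball under your home directory and adding its bin directory to PATH.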

Related

Where to install a Python package - system wide or in a virtual env?

I have Python 3.7 installed (system-wide) along with packages like Numpy, Pandas, pptx, xlsxwriter, and some more. Recently, I learned about virtual environments (yes I am very late to the party), and clearly see the benefits. But I am confused about a few things. Hopefully I can get answers here.
In most of my projects, I use common packages like Numpy, Pandas, Matplotlib, mysql.connector, etc. Should I install these system-wide? Or should I install them in each virtual environment that I create for each project, which, for example, amounts to installing Pandas 10 times in 10 different virtual environments?
If I install packages (system wide) that are not part of the Python Standard Library, for example, pptx and mysql.connector, AND I create a virtual environment, will I have access to these packages from within a virtual environment, or should I install them in the virtual environment as well?
What about a module like jupyter notebook, where it is not part of any project specifically, but I love using it for simple code development and testing (simple plots, etc.). Should this be installed system wide?
I am considering un-installing Python 3.7 and all the packages from my computer, and doing a fresh install of Python 3.8. I want to approach this the "right" way. I would think packages like Numpy and Pandas (I do quite a bit of data manipulation/plotting) should be installed system wide, and each virtual environment should have access to it automatically, and a more specialized package (for me at least) like pptx should be installed in a particular virtual environment for a particular project. Is my assumption correct about this?
Please advise on how to best approach package installation in the context of having virtual environments.
EDIT: based on Ni's reply, I would ask one more question: Are there modules (for example, python-dateutil, that might be used in many projects) and/or circumstances where it makes sense to install a module system-wide?
In general, I never install packages system wide.
You might install packages in your environments that require specific versions of Numpy. In those cases, if the environment relies on the system-wide Numpy and you update it, the package in the environment might break without you ever knowing it happened.
You can access them from a virtual environment only if you create it with access to system packages (for example, python -m venv --system-site-packages); a default virtual environment does not see system-wide packages. Either way, in general, don't install packages system-wide.
Again, I wouldn't install that system-wide. For example, you might have environments running different Python versions, which might not be compatible with the same version of Jupyter.
Seems like you're doing a lot of data science work - you might want to use Anaconda to help you manage your virtual environments and package installations
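On the "installing Pandas 10 times" worry: pip keeps a local download cache, so repeated installs across environments cost little. The usual pattern is one environment per project plus a pinned requirements file the project carries with it (paths below are illustrative):

```shell
# One environment per project
proj=$(mktemp -d)                     # stand-in for a project directory
python3 -m venv "$proj/.venv"

# Record the project's exact package versions alongside the code
"$proj/.venv/bin/python" -m pip freeze > "$proj/requirements.txt"

# Recreate the same environment anywhere later with:
#   python3 -m venv .venv && .venv/bin/pip install -r requirements.txt
```

This also answers the "right way" question for the 3.8 reinstall: the system Python stays bare, and each project's requirements.txt is the source of truth.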

Deploy Python Command-Line Application

So I wrote a little Command-Line App and want to provide an easy way to clone and use it. Problem: My app has some dependencies like Hashlib and Pyperclip.
Is there a best practice to get missing packages installed on a different machine?
Can I do so without people needing to install global pip packages?
Is it a good idea to ship it as a venv?
Can I compile everything into a single Python file (somewhat like a binary)?
If I cannot do so, how do I get packages installed on a different machine?
Is there a nice routine to allow the user to completely remove my app once installed?
You could use the py2exe module to make a standalone application for the Windows platform. You can use py2app to make standalone apps on the Mac platform. For Linux, you should prepare a package using the .deb format to target the Ubuntu/Debian environment (which is one of the most popular flavors of *nix). Each other *nix flavor has its own packaging system that you would have to follow to target it as a platform.
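On the single-file question specifically: for a pure-Python app, the standard-library zipapp module can bundle a source directory (plus any dependencies copied in with pip install --target) into one runnable archive. A sketch with a made-up app:

```shell
# Build a throwaway app directory with an entry point
app=$(mktemp -d)/myapp
mkdir -p "$app"
cat > "$app/__main__.py" <<'EOF'
print("hello from a single-file app")
EOF

# Zip it into one executable archive with a shebang line
python3 -m zipapp "$app" -o myapp.pyz -p "/usr/bin/env python3"
python3 myapp.pyz
```

Removal is then trivial for the user: the whole app is one file, and anything bundled with pip install --target lives inside it.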

Does anyone know of a good set of installation instructions for Node.js?

I'm trying to get started with Node.js on a Windows machine. Yes, I found the installer on their site. That worked just fine and I can run it. However, after that there's no instructions or requirements. Some issues I ran into:
I learned that most of these cool modules need to be built locally.
I was told I needed Git installed
I found I needed Python to build modules
I discovered I needed Visual Studio to compile
Once things are built they should be executable. However, they are not natively found in the path. I discovered them under %APPDATA%\npm, but there's no mention of adding that to the PATH.
What else am I going to discover? Is there a guide to this anywhere?
Although I might suggest you develop Node on a Unix-based OS (Ubuntu 12.04 + WebStorm is my favorite combination, for many reasons I could mention), I found myself in your situation at work, where Windows 7 is a must.
I found this video really helpful.
Once you have Node installed on your machine (Windows or any other), I (and most of the community) would recommend using WebStorm as your IDE. It contains every bit of support to make your development process easy and clean: it manages your global and local modules and lets you build/debug your code easily.
It sounds like you've actually installed Node.js fine, but are having problems with the packages built by people in the community, some of which use Python or a native C compiler. Git shouldn't be necessary unless you're perhaps cloning projects from a remote repository? Or maybe the packages have dependencies on projects hosted in GitHub?
Keep in mind that Node is separate from all the modules and packages available in the community, accessed through the npm registry. Node provides you the ability to execute JavaScript locally, additional APIs, and an ecosystem to build additional packages which can do, as you've said, really cool things. But each of these packages can have unique installation requirements.
Most packages have dependencies of their own, and are often installed using the npm install command. This (usually) downloads other packages from https://www.npmjs.org/, and in some cases requires compiling additional files. This might have been the issue you hit.
The other thing to keep in mind is that a lot of people might assume things are installed and available since they have them installed, or are running a different operating system than you. I've often found that folks hard-code the / path separator somewhere in their scripts, which causes problems on Windows-based systems. This can lead to problems with the executables created as part of community Node packages.
To better understand what Node has and what's available, I'd recommend the nodeschool.io projects. These cover some of the main areas provided by the base Node platform, and get you used to playing around with things from GitHub and npm. Maybe if you run into specific issues there folks can help more directly.

RHEL5 Qt compiler/linker/qmake issues... advice?

I have a few problems with a new install of the Qt SDK. I probably only need advice, but specific answers are also welcome. Before I begin a mini-story: I am running RHEL5 on an academic license under VirtualBox on OS X 10.6, using Qt version 4.5.3. This is my situation...
1.) I couldn't compile because g++ wasn't found. I fixed this by creating a link: g++ -> g++34. This allowed me to compile, but it generated more errors at link time. I had unintentionally installed the framework in my home directory, so I uninstalled/reinstalled the entire SDK to /usr/local/qt.
2.) At this point I could compile but the linker complained about a missing freetype package. I had that already installed but wasn't sure why it couldn't be found. So I installed a few packages that I thought might be missing like libqt4-devel and libqt4-devel-debug. I also installed a few other general programming packages for later use.
3.) Somewhere in this process I lost the ability to run qmake. I ran it before, and it is installed at /usr/local/qt/qt/bin/qmake. I could create a link to it (though I shouldn't have to; or I could ensure that the location is in the PATH variable). However, at this point Qt Creator says there's no Qt installation found. I re-pointed it to the installation location (using Tools/Options), but it still won't run qmake, or anything else for that matter...
I only need this linux install to compile and test my Qt projects which I am developing in OSX. So my question is, should I just wipe this RHEL install and start over? And if so, should I use something else like Ubuntu? I am having plenty of hassles that I don't want to deal with as is. Note, this project will require good OpenGL support.
Is there a particular reason that you don't simply use the Qt package that's part of RHEL?
If for some reason you need to build your own, you can get all of the build dependencies with:
$ yum install yum-utils
$ yum-builddep <whatever the qt package's name is>
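If you do keep the hand-installed SDK, the qmake failure in point 3 is usually just PATH: put the SDK's bin directory on PATH (the path below matches the question's install location) rather than symlinking individual tools:

```shell
# Make the SDK's tools visible to this shell (and anything launched from it)
export PATH=/usr/local/qt/qt/bin:$PATH

# Sanity check; prints the qmake location if the SDK is actually there
command -v qmake || true
```

Qt Creator keeps its own Qt-version setting, so after a move like this it still needs re-pointing under Tools/Options.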
#scotchi is right: you should try to use the Qt package that comes with your system unless you need a very different version. I don't know what version of Qt comes with RHEL, but if it's not up-to-date enough for you (and it might not be, see below) then you could consider changing OS versions. I would only do this after trying his suggestion, though, because you may be able to get things working without the hassle of a full OS install.
Now, as to why you might want to switch: RHEL is, as its name ("Enterprise Linux") indicates, aimed at companies that want to run servers or large deployments of desktops. It emphasizes stability and reliability over being cutting edge. Fairly often the versions of the compiler and development libraries lag a little behind the curve. This is what their clients want: a stable, thoroughly tested platform they can develop against and run programs on for a period of time, without constantly needing to keep up with the latest changes. But for people doing development at home it may not be necessary to stay that conservative. I don't know if this is for work, school, or personal programming, but it sounds to me like you should move to one of the more desktop-oriented distros. Ubuntu is great, as is Fedora. If you prefer a RHEL-like environment, then choose Fedora.

Linux Development C/C++/bash/python on windows-7

Before resorting to Stack Overflow, I spent a lot of time looking for solutions. I have been a Linux user/developer for a few years and am now shifting to Windows 7.
I am looking to set up a development environment (mainly C/C++/bash/Python) on my Windows machine. Solutions I tried:
VirtualBox (latest), with grml-medium (a very light Debian-based distro)
I somehow managed to install it in VBox, but there are still lots of issues with Guest Additions, file sharing, and screen resolutions. I'm tired of it now.
MinGW
I installed it and added it to %PATH%, along with GVim. Now I can use PowerShell and run gvim, vim, and MinGW from the shell as bash. But there are no manpages, and it is very convenient to have them available locally and offline. Still, I think it gives me a gcc development environment.
Do I need MSYS now? I would install it if it provides me with manpages and ssh.
Cygwin
I have avoided it till now. But I think it will give me manpages, gcc utils, and the latest Python.
Something called Interix
Any takers for that? Is it recommended?
What are the best practices? What are you guys following? I don't have a Linux box to ssh to; well, if the VBox thing works at some point, I can ssh to my VBox. I have lost a lot of time setting it up, so I'm abandoning it for a while.
I think only the VirtualBox solution will let me try things like iptables or other Linux system frameworks.
I checked this:
Best setup for Linux development from Windows?
Do you recommend coLinux or its derivatives? If yes, any advice or considerations before I try it?
I recommend VirtualBox+Ubuntu. Cygwin just doesn't cut it for certain tasks and is in beta for Win7.
Here is what I do for Python development on Windows:
EasyEclipse for Python (includes eclipse, subclipse, pydev)
GNU Win32 Native Windows ports for GNU tools
Vim and Emacs (for non-IDE editing work)
I would see if MSysGit can provide what you want first. Also, since manpages aren't really anything hugely impressive, it might be possible to just copy them over. I've had problems with Cygwin, although to be honest I'm not happy with MSys, MSysGit, or Cygwin. I wish someone would build one that was more... Linux-like. I would if I had to use Windows every day; fortunately I only have to use Windows sparingly.
IMO I'd say VirtualBox + Gentoo Linux + KDevelop4, Gentoo will give you the control you need over your environment.
I'm doing exactly the opposite of you: I have gcc/qt4 installed under Wine to compile for Windows, and I use Linux primarily.
If you want to do development of POSIX applications (mostly command line), with all the familiar Linux tools, then cygwin is your best bet.
It probably includes everything you are used to.
But if you are going to do Windows development (anything with a UI, drivers, services), then Visual Studio is really the gold standard.
And in general Visual Studio is just great for anything, if you want to spend the time and money. Good IDE, great debugger. I highly recommend it. And when in Rome, do as the Romans do :-)
I would recommend Bloodshed DevC++ as a good basic non-microsoft specific Windows solution for developing ANSI C/C++ code. Personally I just use Visual Studio 2008 and ignore all the Microsoft specific extensions.
For Python there is the wonderful Komodo Edit software that is free, personally the IDE version is what I prefer, but I use an old 3.5.3 version that works for me. And they have a very popular Python package called ActivePython as well, that has a bunch of Windows specific extension modules.
Personally cygwin just feels and acts like a hack to me and is painful to setup and maintain. I think running Linux/Unix in a Virtual Machine is much less hassle if you are looking for a *nix environment. Getting a really genuine *nix environment feel is going to be very hard under Windows.
The following suggestions hold if you are not going to do complex template programming, as the C++ IDEs other than Visual Studio SUCK at it: they cannot efficiently index modern C++ code (e.g. the Boost library).
I would suggest using Netbeans (it has far better support for C++ than eclipse/CDT) with the following two build environments. Both are important if you want to cross-compile and test against POSIX and win32. This is not a silver-bullet, you should test on different variants of UNIX once in a while:
I would suggest installing MinGW and MSYS for Windows development; it's nice when you can use awk, grep, sed, etc. on your code :D Generative programming is easier with shell tools as well -- writing generative build scripts is a pain to do effectively from the command line in Windows (PowerShell might have changed this).
I would ALSO suggest installing Cygwin and using that on the side. Mingw is for programming against the win32 low-level API, Cygwin is for programming against the POSIX standard. Cygwin also compiles a lot of software that you would otherwise have to port.
Also, once you get your project up and running you can use CMake as the build environment; it's the best thing since sliced bread :P You can get it to spit out build definitions for anything and everything -- including Visual Studio.
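For anyone who hasn't tried it, a CMake project really is that small; a minimal sketch (the project and file names are made up):

```cmake
# CMakeLists.txt for a hypothetical two-file C++ project
cmake_minimum_required(VERSION 3.10)
project(hello CXX)

add_executable(hello main.cpp util.cpp)

# Generate build files for your tool of choice from the same definition:
#   cmake -G "Unix Makefiles" .          # make, from MSYS or Linux
#   cmake -G "Visual Studio 16 2019" .   # an MSVC solution
```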
