In a Node.js project, I'm using Go for a critical part that Node isn't well suited to handle. I want to split the Go code into a sockets package and a main package, with sockets containing the structs/interfaces the main package needs to run. The problem I'm having is that, from what I can gather from Go's documentation, I can only use external packages like sockets remotely from github/gopkg. I don't want to split the project's repository into one containing the Go code and one containing Node's. How can I make the sockets package available for main to import locally, while making it possible to rebuild the binaries for the two packages whenever their source code is updated?
Edit: importing the packages is no longer an issue, but rebuilding the packages on update remains one.
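For context, a minimal sketch of a layout that makes the local import work, assuming a modules-aware Go toolchain; example.com/myproject is a placeholder module path and sockets.New / Listen are hypothetical names standing in for whatever the sockets package exports:

    go/
      go.mod          (module example.com/myproject)
      sockets/
        sockets.go    (package sockets: the shared structs/interfaces)
      main.go         (package main)

    // main.go
    package main

    import "example.com/myproject/sockets" // resolved locally, no remote fetch

    func main() {
        srv := sockets.New()  // hypothetical constructor
        srv.Listen(":9000")   // hypothetical method
    }

After any change to either package, running go build ./... from go/ rebuilds both; on the Node side that can be wired up as an npm script (e.g. "build:go": "cd go && go build ./...", name illustrative).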
The same thing happened to my team, and we ended up using the vendor folder; it's a pretty easy way to manage all the external packages. Whoever checks out your repo will then have all the packages inside vendor.
Understanding and using the vendor folder
Please also refer to this page; there are lots of other options out there too:
Golang Package Management Tools
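If you take the vendor route with a modules-aware toolchain, the whole flow is a few commands; a sketch, run from the directory containing go.mod:

    go mod vendor     # copies all external dependencies into ./vendor
    git add vendor    # commit it, so a fresh checkout builds without fetching
    go build ./...    # Go >= 1.14 uses ./vendor automatically when it exists

On older toolchains, pass -mod=vendor to go build explicitly.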
What happens to a Golang project when a dependency package's owner removes the repository from GitHub?
I'm new to Golang; I come from Node.js and I'm a little concerned about how dependency management works.
In Node you can rest assured that an npm dependency will never cease to be available, since it's hosted on npmjs.com and owners aren't allowed to remove packages. On GitHub, however, an owner could pretty much remove the entire repo and leave every project in the world that depends on it unusable.
I'd like to know how this works. Is there a mirror on Golang's side that keeps the packages safe? Or is there a way to achieve something similar to Node's approach without having to host the packages inside your project?
Nothing dramatic.
If you are not using a Module Proxy and the package moved to a different hosting site: Replace the import paths.
If you are not using a Module Proxy, the package moved to a different hosting site, and the package/module used a vanity import path that is kept constant: No action required.
If you use a Module Proxy: No action required.
Most likely you are using the default proxy already. The situation is far less problematic than anything in the npm world.
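For the first case (no proxy, repository moved), a replace directive in go.mod spares you from editing every import in the source; the paths below are placeholders:

    module example.com/app

    go 1.21

    require github.com/olduser/somelib v1.2.3

    // old path on the left, new home on the right
    replace github.com/olduser/somelib => github.com/newuser/somelib v1.2.3

To check whether you are on the default proxy, go env GOPROXY should print https://proxy.golang.org,direct; the proxy caches every module version it has served, which is why a deleted upstream repo usually stops mattering.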
I alter some code in a package at
C:\Users\usr\AppData\Local\.meteor\packages\accounts-ui-unstyled\1.3.0\web.browser\login_buttons.js
The thing is, after I alter the code and run "meteor" in the command line, the changes are not applied. I even deleted the whole package mentioned above and ran the app, and it was like ... nothing happened. It's as if the application has some sort of cache of the packages, so it doesn't have to go to that path to get them; instead it uses what it had from before.
Can anyone please explain this to me? What's happening here?
The correct way of "changing" a package is to git clone the package (or otherwise retrieve its source) into either a project-internal /packages folder or a project-external folder (the latter requires the environment variable METEOR_PACKAGE_DIRS).
If the package is, as in your case, a Meteor-internal package, you can also copy only that package into your project and even add it to your version control.
You then apply your changes in this copy; it will be used in favor of the Atmosphere package.
A good practice is to also increment the package version, so everyone knows a custom version is in use.
Why should you not change packages inside the user's \Users\...\.meteor installation packages folder?
This is the path to the packages that are used as defaults for every new Meteor project you create. Changes there can do deep damage to your projects, since changing a package will affect all dependent projects.
Think also about project-specific customization; the method described above allows that, too.
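A sketch of what that looks like on disk for the package from the question (paths are illustrative):

    myproject/
      .meteor/
      packages/
        accounts-ui-unstyled/    <- your patched copy; Meteor uses it instead
          package.js                of the published one. Bump the version
          login_buttons.js          here, e.g. 1.3.0 -> 1.3.1, and edit freely.

    # or, for patched packages shared across projects (path is a placeholder):
    export METEOR_PACKAGE_DIRS=/home/me/custom-meteor-packages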
I lately helped out on a project, where I added a really small dependency - in fact, it only contained a regular expression (https://www.npmjs.com/package/is-unc-path).
The feedback I got from the project's developer was that he tries to minimize third-party dependencies when they can be implemented easily; if I understand correctly, he is asking me to just copy the code instead of adding another dependency.
To me, adding a new dependency looks just like putting some lines of code into an extra file in the repo. In addition, the developers will be informed via an update if the code needs a change.
Is it just a quasi-religious conviction that drives a developer to do this? Or are there actual costs (performance- or space-wise, etc.) to adding a dependency?
I also once had disputes with my managers concerning third-party libraries; the problem was even greater: he believed that you should version the node_modules folder.
The source of any conflict is usually ignorance.
His arguments were:
you should deliver a working product to the client without requiring him to do any extra jobs like npm install
if GitHub or npm is down at the moment you run npm install on the server, what will you do?
if the library you install has a bug, who will be responsible?
My arguments were:
versioning node_modules is not going to work because of how package dependencies work: each library downloads its own node_modules dependencies, so your git repository will rapidly grow to hundreds of MB. Deploys will become slower and slower; downloading half a GB of code every time takes time. npm uses a module caching mechanism, so if nothing has changed it will not re-download code needlessly.
the problem with left-pad was painful, but after that npm implemented a locking system, and now each package is pinned to a specific version and integrity hash (see the sketch after this list).
And GitHub and npm are not single-instance services; they run in the cloud.
When installing a dependency there are always some considerations, and there are community best practices; usually they come down to: 1. Does the repo have unit tests? 2. The download count. 3. When was the latest update?
The Node.js ecosystem is built on modularity; it is not that Node is so popular because of luck, but because of how it was designed for creating and reusing modules. Sometimes working in a Node.js environment feels like putting Lego pieces together to build your toy. This is the main cause of super-fast development in Node.js: people just reuse stuff.
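Regarding the locking system mentioned above: a sketch of what package-lock.json records per package (the integrity hash is elided here, not a real value):

    "dependencies": {
      "left-pad": {
        "version": "1.3.0",
        "resolved": "https://registry.npmjs.org/left-pad/-/left-pad-1.3.0.tgz",
        "integrity": "sha512-..."
      }
    }

npm refuses to install a tarball whose contents do not match the recorded integrity hash, so a re-published or tampered version cannot silently slip in.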
In the end, he stuck to his own ideas, and I left the project :D.
Is there a way to install meteor packages globally?
That way, packages installed globally once would be installable without an internet connection for projects created later, avoiding repeated downloads, among other benefits one may imagine.
Like in Node.js, where using the npm command with the -g flag (npm install -g) installs node packages into a global directory; when JavaScript programs want to load them, they are loaded from there if available, in addition to looking in and loading packages from the project's node_modules folder.
Meteor already downloads packages into a global repository that all your local apps benefit from.
So if you meteor add iron:router@1.0.7, it is downloaded and added to your project. The next time another project requires the same version, it is used from that same spot.
Also, there is a PACKAGE_DIRS environment variable (METEOR_PACKAGE_DIRS in newer releases) which, when set, allows you to keep your own local packages centrally, so that you can share them among projects. In fact, you can keep them on a network drive (NFS) which your whole team can mount and consume centrally.
Yet there is an inherent problem. Meteor's version resolver looks for updates unless you pin down your package dependency versions, which is exactly why Meteor seems so desperate to stay connected.
Even if you pin your dependencies, the packages you depend on may not have pinned theirs (which apparently is the case for most packages), so Meteor keeps looking for updates to the whole package tree and downloads whatever it deems to satisfy the version constraint resolver.
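For completeness, a sketch of both knobs (the directory path is a placeholder):

    # .meteor/packages -- @= pins an exact version; a plain @1.0.7
    # constraint still lets the resolver pick newer compatible releases
    iron:router@=1.0.7

    # shared local package directory (PACKAGE_DIRS on older Meteor releases)
    export METEOR_PACKAGE_DIRS=/mnt/team-packages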
The good news is that they are constantly improving the tooling: fewer lookups, faster builds, better search, etc.
All in all, in essence, there is not much you can do unless Meteor provides some way of hosting an entire mirror of its package repository for you to consume offline. And I guess it is very unlikely to happen.
Meteor is a tool for the connected world and it does assume your connectivity. Heck, the whole journey begins with a curl https://install.meteor.com/ | sh
And yes, it would be great if we could hack away on a remote beach, or on the 12-hour flight to that beach.
Until then, happy coding online ;)
I am in the process of introducing NuGet into our software dev process, both for external binaries (eg Moq, NUnit) and for internal library projects containing shared functionality.
TeamCity is producing NuGet packages from our internal library projects, and publishing them to a local repository. My modified solution files use the local repository for accessing the NuGet packages.
Consider the following source code solutions:
Company.Interfaces.sln builds Company.Interfaces.1.2.3.7654.nupkg.
Company.Common.sln contains a reference to Company.Interfaces via its NuGet package, and builds Company.Common.1.1.1.7655.nupkg, with Company.Interfaces.1.2.3.7654 included as a dependency.
Company.DataAccess.sln uses the Company.Common nupkg to add Company.Interfaces and Company.Common as references. It builds Company.DataAccess.1.0.8.7660.nupkg, including Company.Common.1.1.1.7655 as a dependency.
Company.Product.A is a website solution that contains references to all three library projects (added by selecting the Company.DataAccess NuGet package).
Questions:
If there is a source code change to Company.Interfaces, do I always need to renumber and rebuild the intermediate packages (Company.Common and Company.DataAccess) and update the packages in Company.Product.A?
Or does that depend on whether the source code change was
a bug fix, or
a new feature, or
a breaking change?
In reality, I have 8 levels of dependent library packages. Is there tooling support for updating an entire tree of packages, should that be necessary?
I know about Semantic Versioning.
We are using VS2012, C#4.0, TeamCity 7.1.5.
It is a good idea to update everything on each check-in, in order to test it early.
What you're describing can be easily managed using artifact dependencies (http://confluence.jetbrains.com/display/TCD7/Artifact+Dependencies) and "Finish Build" build triggers (or even solely a "NuGet Dependency Trigger").
We wrote our own build configuration on the base project (Company.Interfaces.sln in this case) which builds and updates the whole tree in one go, checking in updated packages.config and .nuspec files along the way. I can't overstate what a time-saver this turned out to be for us, even if it might sound like overkill at first.
One thing to watch out for: the script we wrote checks in the files even if the chain fails somewhere in between, to give us the chance to fix the problem on a local machine, check in the fix, and restart the publishing.
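On the renumbering question itself: the dependency version attribute in a .nuspec accepts ranges, so you can encode how far a change should ripple; identifiers below mirror the question, and the range is illustrative:

    <dependencies>
      <!-- any Company.Interfaces 1.x at or above 1.2.3 satisfies this, so
           bug-fix and feature releases do not force Company.Common to be
           republished; only a breaking 2.0.0 release does -->
      <dependency id="Company.Interfaces" version="[1.2.3,2.0.0)" />
    </dependencies>

If instead each dependency pins the exact build number, every upstream change ripples through all eight levels, which is where the tree-rebuild configuration described above pays off.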