"Fody not properly installed" error exception. (Xamarin.forms) - svg

I'm building an app with Xamarin.Forms (a PCL project).
Today I added three new solution packages, SVG.Forms.Plugin.Abstractions, SVG.Forms.Plugin.iOS, and SVG.Forms.Plugin.Android, downloaded from GitHub, to my workspace.
I have been using Realm for Xamarin.
But after I added the new packages, a Realms.RealmException is thrown.
The message is "Fody not properly installed. allbX.Baby is a RealmObject but has not been woven."
Is the problem in Fody, in Realm, or in the new packages (the SVG control)?
And could you let me know how to solve it?

Better Answer
The check that produces that message fires because Fody is not running.
The component you added may contain a RealmObject, but Fody isn't run when your solution builds, so weaving doesn't occur.
The easiest fix is to use NuGet to add Fody to your main application project. That should install it in the right place for the solution.
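For context, here is a minimal sketch of the kind of model class the weaver has to process (the class and property names are hypothetical stand-ins for the obfuscated allbX.Baby in the error message):

    using Realms;

    // A plain model class. At build time Fody's RealmWeaver rewrites these
    // automatic properties so they read from and write to the database;
    // if weaving never ran, Realm throws the "has not been woven" exception
    // the first time the class is used.
    public class Baby : RealmObject
    {
        public string Name { get; set; }
        public int AgeMonths { get; set; }
    }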
Background
NuGet manages dependencies, so if a package relies on Realm, installing it will in turn install Realm. Realm itself relies on Fody, so that in turn triggers a Fody installation.
You can manually install Realm but it is a little fiddly, having to add a couple of lines to your csproj to specify imports. We have chosen to only document installation via NuGet at this stage.
If you want to manually add Realm to another solution without using NuGet, I suggest you take a new clean solution, save a copy, and diff with the changes made to that solution by adding Realm via NuGet. You will then see the lines to copy into your existing solution.
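For illustration, the kind of line you would find in such a diff looks roughly like this (the version number and relative path are hypothetical; take the exact values from your own diff):

    <!-- hypothetical csproj import added by a NuGet install of Fody;
         the real path and version come from your packages folder -->
    <Import Project="..\packages\Fody.1.29.2\build\Fody.targets"
            Condition="Exists('..\packages\Fody.1.29.2\build\Fody.targets')" />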

Related

Packages are not updated when running meteor

I alter some code in a package at
C:\Users\usr\AppData\Local\.meteor\packages\accounts-ui-unstyled\1.3.0\web.browser\login_buttons.js
The thing is, after I alter the code and run "meteor" on the command line, the changes are not applied. I even deleted the whole package mentioned above, ran the app, and nothing happened. It's as if the application keeps some sort of cache of the packages, so it doesn't have to go to that path to get them; instead it uses what it had before.
Can anyone please explain this to me? What's happening here?
The correct way of "changing" a package is to git clone the package from git (or otherwise retrieve it's source) into either a project internal /packages folder or a project external folder (requires environment variable METOER_PACKAGE_DIRS).
If the package is, as in your case, a Meteor internal package, you can also copy only the package into your project and even add it to your versioning.
In this package you then apply your changes. It will be used in favor of the atmosphere package.
A good practice is to also increment the package version, so it is known for everyone that a custom version is in use.
Why you should not change packages inside the users \Users\...\.meteor installation packages folder?
This is the path to packages, that will be used as defaults for every new meteor project you create. Deep changes can create deep damage to your projects since changing a package will apply to all dependent projects.
Think also about project specific customization. The above described method will allow this, too.
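A rough sketch of the project-internal route, using Unix-style commands and the package from the question (the paths and clone location are illustrative, not prescriptive):

    # from the project root: Meteor prefers packages found in ./packages
    mkdir -p packages
    # fetch the package *source* (core packages live in the meteor/meteor repo)
    git clone --depth 1 https://github.com/meteor/meteor.git /tmp/meteor
    cp -r /tmp/meteor/packages/accounts-ui-unstyled packages/
    # apply your changes there and bump `version` in
    # packages/accounts-ui-unstyled/package.js, then run the app as usual
    meteor
    # alternatively, keep overrides outside the project and point Meteor at them:
    # export METEOR_PACKAGE_DIRS="$HOME/meteor-packages"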

How do I remove a package from Package Control but not uninstall it for current users?

I have an ST3 package hosted on GitHub and available through Package Control. It has been superseded by a new package that I wrote, but I keep getting bug reports for the old one since many people are still using it.
What is the correct way to remove the option to install the original package from Package Control, and ideally from GitHub if possible, without messing anything up for users who currently have the old package installed?
Specifically, will submitting a pull request to Package Control to remove the old package, and/or deleting the old package's GitHub repo, cause the old package to disappear from people's Sublime Text?
I strongly suggest reading through the package developer docs, especially the section entitled Renaming a Package, as they explain everything in detail. Essentially, the easiest path is to follow the directions for renaming a package, and at the same time change the URL to your new GitHub repo. This way, the old packagecontrol.io page will no longer be available, and upon restart users of the old package should be upgraded to the new one.
I'd also recommend reading through the Package Control Channel's issues to see if this issue has come up before. Worst case scenario, you submit your PR and it gets rejected for some reason, but they'll explain what you need to do differently.
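For concreteness, the rename amounts to editing your package's entry in the Package Control channel repository so that it looks roughly like this (the names and URL are placeholders; the exact schema is in the docs mentioned above):

    {
        "name": "MyNewPackage",
        "details": "https://github.com/me/my-new-package",
        "previous_names": ["MyOldPackage"],
        "releases": [
            {
                "sublime_text": "*",
                "tags": true
            }
        ]
    }

The previous_names field is what lets Package Control treat existing installs of the old package as the same package and upgrade them in place.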

Resolving missing references using NuGet failed

I checked out a project from Team Foundation. As you can see in the picture, I used NuGet to restore the missing packages. However, the reference problems are not resolved at all.
When I right-click on my solution and choose Manage NuGet Packages for Solution, here is what I get.
I thought this meant that I had downloaded all the packages, but they are not added to my project, because there are still many build errors. If I use the Package Manager Console to download each package separately, the version conflicts with the original one. I would like to know whether there is an automatic way to resolve this problem.
Thanks in advance
Remember to check out the packages folder as well; it lives outside the project folder.
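If the IDE-driven restore keeps failing silently, it can also help to run the restore from the command line, where errors are printed (the solution name and folder layout here are hypothetical):

    # run from the folder that contains the solution and its packages folder
    .\nuget.exe restore MySolution.sln -PackagesDirectory packages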

NuGet: Difference in behavior between Update-Package and nuget.exe update?

I'm using NuGet to create a 'web framework' package containing code, master pages, css, javascript, etc.
In an attempt to speed up the build/test process I'm running nuget.exe update packages.config, but I've noticed that it behaves differently from the Package Manager Console's Update-Package command.
nuget.exe update seems to leave the previous version of the package installed, resulting in multiple versions being installed at once. This usually doesn't cause problems, but the Package Manager Console's Get-Package command shows many versions installed, and sometimes the project fails to build.
Update-Package actually uninstalls the package and then reinstalls it; this is cleaner but slower.
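For concreteness, the two operations being compared (the project and package names are hypothetical):

    # from a normal shell, against one project's packages.config:
    .\nuget.exe update .\MyProject\packages.config -RepositoryPath .\packages
    # from the VS Package Manager Console:
    Update-Package MyWebFramework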
My questions are:
1. Is there documentation about the difference / relationship between these commands?
2. Is the nuget.exe update behavior of installing multiple versions a bug?
3. Is there a better method for creating a package in one project and updating it in another project in a fast, automated manner?
Unfortunately, there isn't much official guidance or documentation beyond what you can piece together from forum and work-item threads.
The current Package Manager Console behavior was first included as a result of discussion in this thread, which later turned into a work item (sorry, apparently I don't have enough rep to post more links).
However, as others have already noted, the behavior is not consistent with nuget.exe, which has no such switch.
So, in answer to your questions:
The VS Package Manager Console and nuget.exe do have different behaviors and seem to be updated independently (which is very unfortunate).
The nuget.exe update behavior of installing multiple versions side by side has been a design feature from the start, as you can find in a comment on David Ebbo's blog about the NuGet command line (again, I would have given you the link, but SO still doesn't trust me).
Unfortunately I haven't found anything about using Package Manager Console cmdlets during a build. What you could try is manually deleting all folders with your package ID on a build event and then packaging and installing using nuget.exe; essentially, replicate what Update-Package does manually, since, as David Ebbo says, the way you uninstall a package through the command-line interface is by, well, deleting the folder (again, I can't post a reference, which is a bit annoying...).
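A sketch of that manual workaround, assuming a hypothetical package ID MyWebFramework (it just mimics the delete-then-reinstall that Update-Package performs):

    # remove every installed version of the package (uninstalling via the
    # command line really is just deleting the folder)
    Remove-Item -Recurse -Force .\packages\MyWebFramework.* -ErrorAction SilentlyContinue
    # then pull the latest version back in from the feed
    .\nuget.exe update .\MyProject\packages.config -Id MyWebFramework -RepositoryPath .\packages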

Project references v NuGet dependencies

I am in the process of introducing NuGet into our software dev process, both for external binaries (eg Moq, NUnit) and for internal library projects containing shared functionality.
TeamCity is producing NuGet packages from our internal library projects, and publishing them to a local repository. My modified solution files use the local repository for accessing the NuGet packages.
Consider the following source code solutions:
Company.Interfaces.sln builds Company.Interfaces.1.2.3.7654.nupkg.
Company.Common.sln contains a reference to Company.Interfaces via its NuGet package, and builds Company.Common.1.1.1.7655.nupkg, with Company.Interfaces.1.2.3.7654 included as a dependency.
Company.DataAccess.sln uses the Company.Common nupkg to add Company.Interfaces and Company.Common as references. It builds Company.DataAccess.1.0.8.7660.nupkg, including Company.Common.1.1.1.7655 as a dependency.
Company.Product.A is a website solution that contains references to all three library projects (added by selecting the Company.DataAccess NuGet package).
Questions:
If there is a source code change to Company.Interfaces, do I always need to renumber and rebuild the intermediate packages (Company.Common and Company.DataAccess) and update the packages in Company.Product.A?
Or does that depend on whether the source code change was a bug fix, a new feature, or a breaking change?
In reality, I have 8 levels of dependent library packages. Is there tooling support for updating an entire tree of packages, should that be necessary?
I know about Semantic Versioning.
We are using VS2012, C#4.0, TeamCity 7.1.5.
It is a good idea to update everything on each check-in, in order to test it early.
What you're describing can be easily managed using artifact dependencies (http://confluence.jetbrains.com/display/TCD7/Artifact+Dependencies) and "Finish Build" build triggers (or even solely "Nuget Dependency Trigger").
We wrote our own build configuration on the base project (Company.Interfaces.sln in this case) which builds and updates the whole tree in one go, checking in updated packages.config files and .nuspec files along the way. I can't say enough how much of a time-saver this ended up being for us, even if it might sound like overkill at first.
One thing to watch out for: the script we wrote checks in the files even if the chain fails somewhere in between, to give us the chance to fix the problem on a local machine, check in the fix, and restart the publishing.
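We can't reproduce that build configuration here, but a heavily simplified sketch of the idea might look like this (the solution list, version source, and feed URL are all placeholders):

    # walk the chain bottom-up: refresh dependencies, rebuild, repack, republish
    $chain = "Company.Interfaces", "Company.Common", "Company.DataAccess"
    foreach ($name in $chain) {
        .\nuget.exe update "$name\$name.sln"                 # pull latest internal packages
        msbuild "$name\$name.sln" /p:Configuration=Release
        .\nuget.exe pack "$name\$name.nuspec" -Version $env:BUILD_NUMBER
        .\nuget.exe push "$name.$($env:BUILD_NUMBER).nupkg" -Source http://nuget.internal.example/feed
        # (the real script also checks in the updated packages.config/.nuspec files)
    }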
