Preventing JHipster import-jdl from overwriting changes when updating entities - jhipster

With my current workflow I have to check the git diff and manually carry over changes, like added functions, after every import-jdl run that touches an entity I changed.
Is there a way to add functions to the classes JHipster creates without actually changing the files? Something like code generation with annotations, or extending JHipster-created classes? I feel like I am missing some important documentation from JHipster; I would be grateful for pointers in the right direction.
Thanks!

I faced this problem in one of my projects and I'm afraid there is no easy way to tell JHipster not to overwrite your changes.
The good news is you have two ways of mitigating this and both will make your life much easier.
Update your entities in a separate branch
The idea is to update your entities (execute the import-jdl command) in a different branch and then, once the whole process is finished, merge the changes back to master.
This requires no extra changes to your code. The problem I had with this approach is that sometimes the merges were not trivial and I still had to go through a lot of code just to be sure that everything was still in place and working properly.
Do not change the generated code
This is known as the side-by-side practice. The general idea is that you never change the generated code directly, instead you put your custom code in new files and extend the original ones whenever possible.
This way you can update your entities and JHipster will never remove or modify your custom code.
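As a rough illustration of the idea (the names below are placeholders rather than anything JHipster generated for you, and it assumes the generated FooService uses the usual constructor injection of its repository), your custom behaviour can live in a subclass that Spring picks over the generated bean:

    package com.example.app.service.custom;

    import com.example.app.domain.Foo;                // placeholder generated entity
    import com.example.app.repository.FooRepository;  // placeholder generated repository
    import com.example.app.service.FooService;        // placeholder generated service
    import java.util.List;
    import org.springframework.context.annotation.Primary;
    import org.springframework.stereotype.Service;

    // Custom code lives in its own file and package, extending the generated
    // class, so re-running import-jdl never touches it.
    @Service
    @Primary // injected wherever the generated FooService is requested
    public class CustomFooService extends FooService {

        private final FooRepository fooRepository;

        public CustomFooService(FooRepository fooRepository) {
            super(fooRepository);
            this.fooRepository = fooRepository;
        }

        // Added behaviour goes here instead of inside the generated class.
        public List<Foo> findAllCustom() {
            return fooRepository.findAll(); // replace with a real query in your project
        }
    }

Marking the subclass @Primary makes Spring inject it wherever the generated service was already used, so the generated controllers keep working without edits.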
There are two videos available that will teach you (with examples) how to manage this:
Custom and Generated Code Side by Side by Antonio Goncalves
JHipster side-by-side in practice by David Steiman
In my opinion this is the best approach.
I know this probably isn't the answer you were looking for, but to my knowledge there's no better way.

Related

Isolating scenarios in Cabbage

I am automating acceptance tests defined in a specification written in Gherkin using Elixir. One way to do this is an ExUnit addon called Cabbage.
ExUnit seems to provide a setup hook, which runs before each individual test, and a setup_all hook, which runs once before the whole suite.
Now when I try to isolate my Gherkin scenarios by resetting the persistence within the setup hook, it seems that the persistence is purged before each step definition is executed. But one scenario in Gherkin almost always needs multiple steps which build up the test environment and execute the test in a fixed order.
The other option, the setup_all hook, on the other hand, resets the persistence once per feature file. But a feature file in Gherkin almost always includes multiple scenarios, which should ideally be fully isolated from each other.
So the aforementioned hooks seem to allow me to isolate single steps (which I consider pointless) and whole feature files (which is far from optimal).
Is there any way to isolate each scenario instead?
First of all, there are alternatives, for example: whitebread.
If all your features need some similar initial step, maybe background steps are something to look into. Sadly those changes were mixed into a much larger rewrite of the library that never got merged in. There is another PR which is also mixed in with other functionality and is currently waiting on a companion library update. So currently that doesn't work.
I haven't tested how the library behaves with setup hooks, but setup_all should work fine.
There is such a thing as tags, which I think haven't been published in a release yet but are in master. They work with callback tags. You can take a closer look at the example in the tests.
Things are currently a little bit of a mess; I don't have as much time for this library as I would like.
Hope this helps you a little bit :)

How to write feature files, and when to convert them to step definitions, to adapt to changing business requirements?

I am working on a BDD web development and testing project with other team members.
At the top we write feature files in Gherkin and run Cucumber to generate step functions. At the bottom we write Selenium page models and action library scripts. The rest is just filling in the step functions with Selenium scripts and finally running the Cucumber cases.
Sounds simple enough.
The problems start when we write the feature files.
Problem 1: Our client's requirements keep changing every week as the project proceeds, with old ones being removed and new ones added.
Problem 2: On top of that, for some features the detailed steps keep changing too.
The problem gets really bad if we try to generate updated step functions from the updated feature files every day. There is quite a lot of housekeeping to do to keep step functions and feature files in sync.
To deal with problem 2, I remembered that one basic rule of writing Gherkin feature files is to use business-domain language as much as possible. So I tried to persuade the BA to write the feature files a little more vaguely and not include too many UI-specific steps, so that we don't need to modify the feature files/step functions as often. But she hesitates because the client's requirement document includes the details and she just tries to follow it.
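As a sketch of what that looks like in practice (hypothetical names, shown with Cucumber-JVM; the same idea applies to any Cucumber flavour): the feature file keeps a stable, business-level step, and the volatile UI detail lives in the step definition and the Selenium page model:

    import io.cucumber.java.en.When;

    public class OrderSteps {

        // CheckoutPage is a hypothetical Selenium page model owned by the team.
        private final CheckoutPage checkoutPage = new CheckoutPage();

        // The feature file only says:
        //   When the customer places an order for "Widget"
        // so renamed buttons or extra screens only touch this class and the
        // page model, never the feature file.
        @When("the customer places an order for {string}")
        public void theCustomerPlacesAnOrderFor(String product) {
            checkoutPage.addToCart(product);
            checkoutPage.checkout();
        }
    }

The more the steps talk about intent rather than widgets, the less a UI change ripples back into the feature files.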
To deal with problem 1, I have no solution.
So my questions are:
Is there a good way to write feature files so that they are less affected by changes in the client's requirements? Can we write them vaguely, omitting details that may change (that way at least we can stabilize the step function prototypes), and if so, how far can we go?
When is a good time to generate the step definitions and fill in their content? From the beginning, or after the features stabilize a little? How often should we do it if the features keep changing? And is there a convenient way to clean up outdated step functions?
Any thoughts are appreciated.
Thanks,
If your client has specific UI requirements for which you are contracted to provide automated tests, then you ought to be writing those using actual test automation tools. Cucumber is not a test automation tool. If you attempt to use it as such, you are simply causing yourself a lot of pain for naught.
If, however, you are only contracted to validate that your application complies with the business rules provided by your client, during frequent and focused discovery sessions with them, then Cucumber may be able to help you.
In either case, you are going to ultimately fail if there's no real collaboration with your client. If they're regularly throwing new business rules or new business requirements over a transom through which you have limited or no visibility, then you are in a no-win situation.

Tracking changes to a (functional) design document

I am looking for a good way to keep a design document up to date with the latest decisions.
We are a small team (two developers, game designer, graphic designer, project manager, sales guy). Most of our projects last a couple of months. At the start of the project a design is made but we generally find ourselves making changes or new decisions throughout the project. Most of these changes are improvements, so we want to keep our process like that. (If the changed design results in more time needed this is generally taken care of, so that part is OK)
However, at the moment we have no nice way of capturing the changes to the initial design document and this results in the initial design quickly being abandoned as a source while coding. This is of course a waste of effort.
Currently our documents are OpenOffice/Word, and the best way to track changes in those documents would probably be adding a change list to the top of the document and making the changes in the text in parallel, which is not an option I'd consider ideal.
I've looked at requirements management software, but that looks way too specialized. The documents could be stored in Subversion, but I think that is a bit too low-level to give insight into the changes.
Does anyone know a good way to track changes like these and keep the design document a valuable resource throughout the project?
EDIT: At the moment we mostly rely on changes to the original design being put in the bugtracker, that way they are at least somewhere.
EDIT: Related question
Is version control (ie. Subversion) applicable in document tracking?
I've found a wiki with revision logging works well as a step-up from Word documents, provided the number of users is relatively small. Finding one that makes it easy to make quick edits is helpful in ensuring it's kept up to date.
Both OpenOffice and Word include capabilities for showing/hiding edits to your document. Assuming there's resistance to change, that's your best option: either that, or export to text and put it into any source control software.
Alternatively, maintain a separate (diffable using the appropriate tool) document for change-description text, and save archive versions at appropriate points in time.
This problem has been a long-standing issue in our programming shop too. The funny thing is that programmers tend to look at this from the wrong optimization angle: "keep everything in one place". In my opinion, you have two main issues:
The changes' descriptions must be easy to read ("So what's new?").
The process should be optimized for writing the specification to be agreed upon, and then getting to work already!
Imagine how this problem is solved in another environment: government law making. The lawbook is not rewritten with "track changes" turned on every time the government adds another law, or changes one...
The best way is to never touch a released document. Don't stuff everything into the same file, or you'll get:
the dreaded version history table,
the eternal status "draft",
scattered inconsistencies,
horribly rushed sentences, and
a foul-smelling blend of authors' styles.
Instead, release an addendum, describing only the changes in detail, and possibly replacing full paragraphs/pages of the original.
With the size of our project, this can never work, can it?
In my biggest project so far, I released one base spec, and 5 consecutive addenda. Each of around 5 pages. Worked like a charm!
I don't know any good, free configuration management tools, but why not place your design under source control? Just add it to SVN, CVS, or whatever you are using. This is good because:
1) It is always up to date (if you check it in, of course)
2) It is centralized
3) You can keep track of changes by using the built-in compare feature, available in almost any source control system
It may not be the 'enterprisish' solution you'd want, but you are a small team of developers anyway, so for that situation, it is more than perfect.
EDIT: I see now that you already mentioned a source control system, my mistake. Still, I think it should work well.
Use Google Docs. It's free, web-based, multi-user in real time, you can choose who has access to your documents, and it keeps versions. You can also upload all your Word documents and it will convert them for you.
For more information: http://www.google.com/google-d-s/intl/en/tour2.html

SubSonic: What if something changes?

I am using SS 2.1 and just starting out with it. I got everything loaded and it works for the simple tests I've been doing, but I have a general question: is there a way to update my build without having to rebuild the entire thing? An example would be if we change the layout of a table. Let's say we have an id and a name, and later on add id, name and disabled. Is SS smart enough to pick that up, or would it require a new build? Thank you very much for your time.
Cheers
I believe you use a command-line app to generate your mapping files, so that command-line app would have to be re-run for that to happen. Second, the mapping code would have to be recompiled on the fly after the change... most .NET applications do not do this.
But the biggest reason you would not want the mappings to be generated on the fly is speed. It takes time to do that, several seconds at least. And how would you schedule it? Not on every call -- that would be insane. Once a day? When during the day?
So no, SubSonic only generates the mapping files when you ask it to. If you change the database you risk breaking your application.
If you are using the build provider with ASP.NET, building your project will make SubSonic catch the change and update the generated classes.
Otherwise you will need to use SubCommander to generate the classes again.

Should I keep solutions and features in a 1-1 ratio?

I have a complex SharePoint deployment with multiple EventReceivers and Workflows.
I also have schema changes to existing lists, adding new columns of metadata and changing existing columns.
Should I package each feature, event receiver or workflow into its own solution, or should I put multiple features inside a single solution, since they all work together?
One major reason I am asking is future code upgrades. If the features are separated, then an upgrade in one portion of the code would not require a re-deploy of all the features in the solution. Is this something I should worry about, or does "stsadm -o upgradesolution" take care of any issues with upgrading a solution that contains many features?
Let me know if this makes sense to any SharePoint gurus out there.
Thank you,
Keith
Update:
Looking at the website drax referenced, I found this reference site: http://msdn.microsoft.com/en-us/library/aa543659.aspx
This statement seems to put a large handicap on upgrading features in solutions:
Solution upgrade can only be used to replace files. You can add new files in a solution upgrade and remove old versions of the files, but you cannot install Features or use Feature event handlers to run code for Feature installation and activation. The following operations are not supported in solution upgrade:
Removing old Features in a new version of a solution.
Adding new Features in a solution upgrade.
Updating or changing the receiver assembly for existing Features in a new version of a solution.
Adding or changing Feature elements (Element.xml files) in a new version of a solution.
Adding or changing Feature properties in a new version of a solution.
Changing the ID or scope of old Features in a new version of a solution.
Removing Feature elements (Element.xml files) in a new version of a solution.
Removing Feature properties in a new version of a solution.
So... What can you do with a solution upgrade?
I would advise against splitting everything into multiple solutions. Maintaining that can quickly become a nightmare. Try to structure your project, which is used to create the WSP, in the same manner as SharePoint's 12 folder. Then you can use WSPBuilder; the last stable version brings a lot of useful stuff.
Also, I've not noticed any problems with redeploying solutions. According to this article, and in my experience, deployment of a WSP takes care of synchronization between versions. So if you add some new features they will appear, and if you remove or change features they will be modified accordingly.
EDITED:
So I did some quick research on MOSS Updating topic. According to MS there are two ways of updating solutions:
In-place update
Incremental update
Basically, an in-place update is the standard way of updating, meaning you rely on the built-in functionality described in this document (the same one posted before). The problem with this approach is that it lacks quite a lot of functionality (versioning, changing feature IDs, ...).
An incremental update (this is probably what MS calls it) doesn't rely on the built-in functionality. That means it is up to everybody to implement it themselves :(. What is even better, I was not really able to find any guidelines for this approach. I suppose the approach you would like to take (splitting the project into many independent solutions) is an example of an incremental update.
Also note that an incremental update is not officially supported by MS.
So I don't really know what advice I should give you. A single WSP is more maintainable than a bunch of them, and if you are only making minor changes, updates work perfectly. But if you need to make bigger structural changes, problems start to show.
I'll probably wait and see if people with more MOSS expertise can say something about this topic.
Basically (for the reasons you've mentioned), you should think of solutions as you would .NET assemblies: atomic units of code that can be deployed separately from others. Using upgradesolution will cause a redeploy of all the contained features; if nothing's changed, then nothing should change for the sites that use those features. But if that makes you nervous, consider splitting it up.
UpgradeSolution is really handy if you are just updating the assembly and leaving the provisioned files intact.
Unless you specify -local, upgradesolution will perform a full iisreset across your infrastructure. This is really worth noting when you are planning the right time to perform upgrades.
