Is there a systematic way to incorporate data migration scripts with an SSDT publish?
For instance, deploying a new feature that requires a new permission be added to the database in the form of a series of inserts.
I know I can just edit the generated publish script, but if the script needs to be regenerated you have to remember to re-add the data migration scripts.
The recommended practice is to use a post-deployment script. This blog post describes the approach further:
http://blogs.msdn.com/b/ssdt/archive/2012/02/02/including-data-in-an-sql-server-database-project.aspx
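For the permissions example above, the key detail is that a post-deployment script runs on every publish, so it has to be written to be idempotent. A minimal sketch (the table and permission names below are made up for illustration):

```sql
-- Script.PostDeployment.sql (Build Action = PostDeploy).
-- Runs after every publish, so it must be safe to re-run.
MERGE INTO dbo.Permission AS target
USING (VALUES
    ('CanViewReports'),
    ('CanExportData')
) AS source (Name)
ON target.Name = source.Name
WHEN NOT MATCHED BY TARGET THEN
    INSERT (Name) VALUES (source.Name);
```

Because the script lives in the project, it is included automatically every time the publish script is generated, which avoids the re-adding problem from the question.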
Related
I want to automate my RDB. I usually use SQLDeveloper to compile, execute and save my PL SQL scripts to the database. Now I wish to build and deploy the scripts directly through gitlab, using ci/cd pipeline. I am supposed to use Oracle Cloud for this purpose. I don't know how to achieve this, any help would be greatly appreciated.
Requirements: Build and deploy PL-SQL scripts to the database using gitlab, where the password and username for the database connection are picked from vault on the cloud, not hardcoded. Oracle cloud should be used for the said purpose.
If anyone knows how to achieve this, please guide.
There are tools like Liquibase and Flyway. Those tools do not work miracles, though.
Liquibase has a list of changes (XML or YAML) to be applied on a database schema (eventually with undo step).
Then it has a journal table in each database environment, so it can track which changes were applied and which were not.
It cannot do powerful schema comparisons the way SQL Developer or Toad can.
It also cannot prevent situations where a DML change applied to the production database blows up, because the change was only tested successfully on a data set 1000x smaller.
But it is still better than nothing, and it can be integrated with Ansible/GitLab and other CI/CD tools.
There is a functional sample using Liquibase integration with SQLcl in my project, Oracle CI/CD demo.
To be totally honest:
It's a little out of date, because I used a trick for rollback; at the time of writing, Liquibase tagging was not supported. It is supported now.
The final integration with Jenkins is not done, but it is straightforward.
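As a sketch of what the GitLab side can look like: a job that pipes Liquibase commands into SQLcl, with the credentials injected as masked CI/CD variables (or populated from OCI Vault by an earlier step) rather than hardcoded. The job name, image, variable names, and changelog file below are all assumptions to adapt to your setup:

```yaml
# .gitlab-ci.yml -- illustrative only; image and variable names are
# assumptions, and the vault integration that fills DB_USER/DB_PASSWORD
# is configured outside this file.
deploy-db:
  stage: deploy
  image: my-registry.example.com/oracle/sqlcl:latest
  script:
    # Credentials come from masked CI/CD variables, never from the repo.
    - echo "lb update -changelog-file controller.xml" | sql -L "$DB_USER/$DB_PASSWORD@$DB_CONNECT_STRING"
```

The exact SQLcl Liquibase subcommand syntax has changed between releases, so verify it against the SQLcl version in your image.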
I created my app with JHipster in July, and I have put it into production.
Now I need to edit an entity.
When I run the jhipster entity subgenerator, it updates the initial XML changelog generated for the entity, but that is not the correct behavior: it should create a new XML changelog containing only the update, like mvn liquibase:diff does.
Searching the web, I found this answer: Add new field to existing entity with JHipster. At a certain point the user says:
create a new migration file to add only the new column (see addColumn
documentation), revert the creation migration to its original content
using git, run your app to apply changes to your database. This is
what you would do when your app is in production.
Regarding that last sentence, is it true? Does the JHipster entity subgenerator not support database updates, but only the initial creation?
Thanks
Yes, that's correct.
Either you write a manual Liquibase migration or you use liquibase:diff to help you in this process.
See official docs: https://www.jhipster.tech/development/#database-updates-with-the-maven-liquibasediff-goal
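The workflow from the quoted answer can be sketched as follows (the changelog path and file name are typical JHipster defaults, shown here as placeholders — check them against your own project):

```shell
# 1. Re-run the subgenerator; it overwrites the entity's original changelog.
jhipster entity Foo

# 2. Restore the original creation changelog so the Liquibase history
#    already applied in production stays intact (file name is illustrative).
git checkout -- src/main/resources/config/liquibase/changelog/20190701000000_added_entity_Foo.xml

# 3. Generate a new incremental changelog from the schema diff.
./mvnw liquibase:diff

# 4. Register the generated changelog in master.xml, then start the app
#    so Liquibase applies only the new change to the database.
```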
We are looking at removing developers from production and want a simple deployment management tool. One suggestion, which some members are already using with Salesforce, is Jenkins. I have never used Jenkins or any kind of deployment tool before; I normally just copied my code from the IDE and updated the file in the SuiteScript file cabinet.
Does Jenkins work for NetSuite? Or what do you recommend for this purpose?
We are planning to use Bit Bucket (which runs Git in the background) as our version control in case that matters.
Thank you for any help
IMO the greatest challenge in integrating with any CI environment (be it Jenkins or any other) is the fact that you can move code files from one system to another using code/APIs, but NOT things like Script records, custom records, fields and their deployments, etc., for which you need a bundling process and hence manual intervention.
At the recent SuiteWorld 2015, NetSuite said that it is coming up with "Change Management", which would allow you to put everything that is part of your app into a version control system such as Git. Please see SuiteAnswer Id 42387. When this feature is rolled out, you can integrate with your CI tool to automatically copy/deploy your app to another NetSuite account, run your tests there, and pass/fail your build accordingly.
Why do you want to remove developers from Production? This will severely hamper their ability to create solutions for your NetSuite account and will create a ton of overhead for them.
If you must have them out of Production, then probably your "best" option would be to have them build their solutions in Sandbox and then use SuiteBundles for deployment to Production. A Production Admin would need to update the appropriate Bundle(s) for all Production migrations.
NetSuite has also built a SuiteCloud IDE plugin for Eclipse which allows uploading and downloading files (no copy-paste necessary), so if you're not using that I would recommend it.
We are using Jenkins for our own internal automated testing, but not for deployment into NetSuite. I do not know if someone has already built a NetSuite plugin for Jenkins; it is likely you would have to build your own file upload mechanism using the NetSuite Web Services SOAP API, but that would still only allow deployment of source files. Developers will most likely also need to be creating and updating custom records, fields, lists as well as Script records and Script Deployment records, which you will not be able to do through Jenkins or any other tool that I know of.
We are running CRM in an enterprise-style environment, with 4 environments that changes must propagate through before reaching production. We are also dealing with a very large, dynamic dataset and complex logic implemented in SQL Server that works from this dataset, alongside the CRM components.
Our environments are as follows:
Development
Test
Staging
Production
To facilitate development, testing, etc we restore our dataset and CRM from Production back to the various environments as needed, in Development we restore very frequently (usually nightly).
One of the challenges we have encountered is that when restoring the CRM tenant automatically using the PowerShell tools, we need an accurate UserMap XML file containing all the users that exist in our production environment. Currently, whenever a new user is added to production, our automated restore fails and we have to update the file manually.
In looking to automate file creation, I was unable to find the answer to the following question: Does the UserMap.xml file need to also include disabled users?
To resolve this we produced a new UserMap.xml file from the production environment and manually checked it, confirming that disabled users ARE in fact included in the UserMap.xml file.
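Since the goal is to automate file creation, the regeneration step can be sketched as a small script that rebuilds the map from a user list pulled from production, deliberately keeping disabled users in. The element and attribute names below are a simplified, hypothetical structure — verify them against a UserMap.xml produced by your own deployment tooling before relying on this:

```python
# Sketch: rebuild a user-map file from a production user list.
# Element/attribute names are illustrative, not the exact CRM schema.
import xml.etree.ElementTree as ET

def build_user_map(users):
    """users: list of dicts with 'domain_name' and 'is_disabled' keys.
    Disabled users are included on purpose, since the restore expects them."""
    root = ET.Element("UserMappings")
    for user in users:
        # No filtering on is_disabled: dropping disabled users is exactly
        # what made the automated restore fail.
        ET.SubElement(root, "UserMapping", {
            "OldUser": user["domain_name"],
            "NewUser": user["domain_name"],
        })
    return ET.tostring(root, encoding="unicode")

users = [
    {"domain_name": "CORP\\alice", "is_disabled": False},
    {"domain_name": "CORP\\bob", "is_disabled": True},  # disabled, still mapped
]
xml_text = build_user_map(users)
print(xml_text)
```

Running this nightly against the current production user list (instead of keeping a hand-edited file) removes the failure mode where a newly added user breaks the restore.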
Is there a way to rollback a Windows Azure Website and SQL deployment / publishing?
I published a website, now it's causing a lot of errors, and I want to get back to the previous state and work on the code some more.
Is this possible?
If you use Git or TFS you can take advantage of deployment rollback by selecting the previous deployment (as explained by Nathan Totten).
To roll back the database you can do several things. The easiest is probably to use EF Migrations:
Run EF Migrations from your Application_Start (not sure if this is something you want to do).
(Manually) call migrate.exe:
Migrate.exe MyApp.exe /startupConfigurationFile="MyApp.exe.config" /targetMigration="MyPreviousVersion"