I'm new to Liferay Portlet development. We have an existing development pipeline that is basically local development until the code is committed and PR'd. Once a PR is approved, Jenkins builds an artifact and Puppet pushes that artifact to an instance of JBoss running in a development environment. Our deployments to QA and production environments use the same artifact.
From my (admittedly limited) understanding of Liferay administration, I know there's a way to upload an artifact for a portlet and then upload a new version through the Liferay control panel. Unfortunately, requiring a human to log in and deploy new versions of JAR files for each changed portlet won't work for our build pipeline.
What is the recommended way to deploy new versions of portlets to Liferay without having to use the UI?
You'll just copy them to Liferay's deploy folder, conveniently located in Liferay's home folder.
The natural state of the deploy folder is "empty": Liferay picks up, processes, and removes anything dropped there, so the Liferay process needs the appropriate permissions on that folder so that deployed plugins can be deleted from it.
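For example (purely a sketch; the host name and paths below are assumptions that depend on your environment), the final step of the pipeline can be as simple as:

# push the built plugin from the CI workspace into Liferay's auto-deploy folder
scp build/libs/my-portlet.war deploy@liferay-dev:/opt/liferay/deploy/
# optionally wait until Liferay has consumed the file (the folder empties once
# the plugin has been picked up and processed)
ssh deploy@liferay-dev 'while [ -e /opt/liferay/deploy/my-portlet.war ]; do sleep 2; done'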
My company has recently moved to a self-hosted GitLab instance, and now I'm trying to wrap my head around it to see how we could use its CI/CD features for our use cases. For instance:
We have a PHP-based front-end project consisting of several PHP, CSS and JS files that are currently copied to our Apache2 folder in the same structure we use for development.
Is it possible to configure GitLab to do this for us, say, by implementing the following workflow:
we make a DEPLOY branch in our repository
when we’re done with changes in other branches, we merge those branches into DEPLOY
GitLab CI/CD detects the new activity and automatically puts the latest version of the files from this branch into our Apache2 directory (connected as a remote folder)
Any advice and guidance is appreciated. I'm currently lost in the many manuals that describe Docker-based deployment and the like, which we don't use in our project yet.
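For what it's worth, I imagine the configuration would look roughly like the following hypothetical .gitlab-ci.yml (the branch name, the CI/CD variables and the rsync target are guesses on my part):

# hypothetical sketch; assumes a runner that can reach the web server over SSH
# and CI/CD variables APACHE_USER, APACHE_HOST and APACHE_DOCROOT
stages:
  - deploy

deploy_to_apache:
  stage: deploy
  only:
    - DEPLOY        # run only for activity on the DEPLOY branch
  script:
    # copy the repository contents into the Apache folder, keeping the structure
    - rsync -av --delete --exclude='.git' ./ "$APACHE_USER@$APACHE_HOST:$APACHE_DOCROOT/"

Is something along these lines feasible?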
Where is the destination of deployed portlets (WARs) defined?
When I run the command blade deploy, it deploys all portlets inside my Liferay workspace and puts their WARs into the osgi/war folder. I want these WARs in the deploy folder instead, because I start this Liferay application with Docker, and Docker expects them in /mnt/liferay/deploy.
Thanks.
In order to deploy plugins to the runtime environment, you copy them into the ${liferay-home}/deploy directory (in your case, that's /mnt/liferay/deploy; where that lives on the Docker host is something you'll have to figure out). The runtime environment will process your plugin (the processing differs for WAR and JAR plugins) and move it to a location that you shouldn't need to look at any more.
In order to deploy: just copy the plugins to where they need to go. The directory will be empty again once your plugin is deployed. The final destination is irrelevant, as you can't change anything within it anyway (or rather: don't expect any live changes there to have an effect on the running system).
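For illustration, assuming the official liferay/portal image and a host folder mounted at /mnt/liferay (the image tag and all paths below are assumptions):

# start the container with a host folder mounted at /mnt/liferay
docker run -d -p 8080:8080 \
  -v "$(pwd)/liferay-mount:/mnt/liferay" \
  liferay/portal:latest
# copy the WARs built in your workspace into the mounted deploy folder;
# it will be empty again once the container has processed them
cp modules/my-portlet/build/libs/*.war liferay-mount/deploy/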
We have a new project in which we are trying to make use of the built-in continuous integration in Kentico for tracking changes to templates, page types, transformations, etc.
We have managed to get this working locally between two instances of a Kentico database: making changes in one, syncing the changes through CI, and then restoring them into the second database using the Continuous Integration application that sits in the bin folder of the Kentico site.
The issue we are having is when it comes to deploying our changes to our dev and live environments.
Our sites are hosted as Azure App Services and we deploy to them using VSTS (Azure DevOps) build and release workflows. However, because these tasks run on an agent, any PowerShell script we try to run to trigger the CI application fails, since it is not running in the site/server context.
My question is: has anyone managed to successfully run Kentico CI in the context of an Azure App Service? Alternatively, how can I trigger a PowerShell script on the site following a deployment?
Yes, I've got this running in Azure DevOps within the release pipeline itself. It's something that we're currently rolling out as a business where I work.
The key steps to getting this working for me were as follows:
1. I don't want to deploy ContinuousIntegration.exe or the CI repository folders, so I need to create a second artefact set from source control (to my knowledge this is currently only possible with Azure Repos and GitHub).
2. Unzip your deployment package and copy the CMS folder to a working directory; this is where you're going to run CI. I did this because I need the built assemblies available.
3. From the repo artefact in step 1, copy ContinuousIntegration.exe and the CI repository folders into the correct place in your unzipped working folder.
4. Ensure that the connection string in your unzipped folder actually works for the target DB; if necessary, you may want to change your VS build options with regard to how the web.config is handled.
5. From here, you should be able to run CI in the new working folder against your target database, as sketched below.
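A minimal sketch of that last step as a command-line task in the release pipeline (WORKING_CI is an assumed variable pointing at the unzipped working folder, and the CMS\bin location is an assumption about where the exe was copied in step 3):

rem hypothetical release-pipeline step; variable and paths are assumptions
cd /d "%WORKING_CI%\CMS\bin"
rem -r performs a full restore of the CI repository into the database
rem that the web.config in the working folder points at
ContinuousIntegration.exe -r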
In my instance, as I don't have CI running on the target site it means that everything is restored every time.
I'm in the process of writing this up in more detail, so I'll share it here when I've done that.
Edit:
- I finally wrote this up in more detail: https://www.ridgeway.com/blog/web-development/using-kentico-12-mvc-with-azure-devops
We do, but with no CI. VSTS + Git. We store virtual objects in the file system and use Git for version control. We have our own custom library that does import/export of the Kentico objects (the ones that are not controlled by Git). Essentially we have a JSON "publishing manifest" file where we specify which objects need to be exported (i.e. moved between environments).
There is a step from Microsoft called 'PowerShell on Target Machines'; I guess you can look into that.
P.S. Also take a look at Three Ways to Manage Data in Kentico Using PowerShell
Deploy your CI files to the Azure App Service, and then use an Azure WebJob to run ContinuousIntegration.exe.
If you place a file called KenticoCI.bat in the directory \App_Data\jobs\triggered\ContinuousIntegration, this will automatically create a web job that you can trigger:
KenticoCI.bat
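rem # Renames 'App_Offline.bak' to 'App_Offline.htm' to take the site offline while CI runs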
cd D:\home\site\wwwroot
ren App_Offline.bak App_Offline.htm
rem # run Kentico CI Integration
cd D:\home\site\wwwroot\bin
ContinuousIntegration.exe -r
rem # Removes the 'App_Offline.htm' file to bring the site back online
cd D:\home\site\wwwroot
ren App_Offline.htm App_Offline.bak
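Once the web job exists, you can also kick it off from a release step instead of the portal, for example via the Kudu WebJobs REST API (site name and credentials below are placeholders; use the deployment credentials from your publish profile):

# trigger the 'ContinuousIntegration' triggered web job after a deployment
curl -X POST -u '$mysite:PUBLISH_PASSWORD' \
  "https://mysite.scm.azurewebsites.net/api/triggeredwebjobs/ContinuousIntegration/run"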
The company I currently work for has a solution with 3 projects.
_Common... which was originally a Web project, but then changed to a Class Library.
Website1... Which is a website
Website2... Completely different website.
In Azure, we have a Deployment configuration so whenever our BitBucket Repo gets a checkin, it should build and deploy the solution.
Unfortunately, it seems like _Common is getting built as the target project, which is breaking Azure.
Also unfortunately, we have two different products (Website1 and Website2) which live in two different Azure Apps. They both look at the same BitBucket repo, and both build whenever a check-in happens... but both are building _Common.
Can I have our "Website1" Azure App build the solution with Website1 as the startup project, and have our "Website2" Azure App build the solution with "Website2" As the startup project?
Setting the default project in GitHub obviously won't work, since we still need two different builds with two different startup projects.
You certainly can!
Go to your "Website1" Web App and go to Settings > Application Settings.
If you are using .NET Framework 4.5 or lower
Add a setting called Project whose value points to the .csproj file you want to build, using the full path from the repository root folder.
If you are using ASP.NET Core 1.0 / ASP.NET 5
Add a setting called Project whose value points to the folder that contains the project.json file of the project you want to deploy; do not include the filename in the path.
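If you prefer scripting over the portal, the same setting can be added with the Azure CLI, for example (resource group, app name and project path are placeholders):

# point the Website1 app's deployment at Website1's project file
az webapp config appsettings set \
  --resource-group MyResourceGroup \
  --name Website1-App \
  --settings Project="Website1/Website1.csproj"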
I have an SVN repository on my server (Windows 7) and my application is running on Tomcat.
Every time, I check out my application on the server (which was committed from different machines) and then manually place the files in the webapps folder to deploy the app onto Tomcat.
Is there a way I can link SVN with the webapps folder, so that whenever users commit their code it is deployed directly into the Tomcat webapps folder?
This question comes up often enough that it's in the FAQ.
Aside from doing it strictly via a hook script (which requires admin-level access to the repository, on the repository server), you can set up a Continuous Integration server to monitor the repository and perform your deployments that way.
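As a rough illustration of the hook-script route, a post-commit.bat in the repository's hooks folder could export the committed code straight into the webapps folder (all paths are assumptions, and Tomcat may still need to redeploy the application to pick the changes up):

rem hypothetical post-commit.bat sketch; Subversion calls this hook with the
rem repository path (%1) and the new revision (%2), which are not needed here
rem export the latest trunk straight into Tomcat's webapps folder;
rem --force overwrites files that already exist there
svn export --force "file:///C:/svnrepos/myproject/trunk" "C:\tomcat\webapps\myapp"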