Suppose we have three environments for Liferay (LR): Development, Staging, and Production.
For a content-driven application, assume we have created all the content in the dev site. Now we have to move it to Staging for customer verification, and then to Production.
Yes, we can do this using a LAR file, but is there any alternative to that?
IBM WebSphere Portal has an option called syndication; on clicking it, the content and relevant artifacts are moved automatically and quickly. Is there an equivalent in LR? The LAR file approach is a very manual process, so I want to automate it. Please share any ideas.
First, please note that content creation is not a development activity! Liferay has staging (local or remote) that will help you get this done on sites that contain production content. But you must understand that there is quite a difference between code that you deploy and database content (i.e., content in the usual sense of the word).
That being said, up until Liferay 6.2 staging was a single step (from the staging system to the production system), while Liferay 7 implements a multi-step staging process.
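If the goal is automation across separate servers, remote staging is the closest equivalent to WebSphere's syndication: once the remote live connection is configured, publishing pushes content and related artifacts to the target server. As a hedged sketch only, remote staging requires the tunneling servlet to be secured on both servers via portal-ext.properties (property names as in the Liferay 6.2 documentation; the secret and host values below are placeholders):

    # portal-ext.properties on both the source and the target server
    tunneling.servlet.shared.secret=<same-secret-on-both-servers>
    tunnel.servlet.hosts.allowed=127.0.0.1,<ip-of-the-other-server>

The remote live connection itself (target host, port, and site ID) is then configured per site in the Staging UI.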
I just started working with EPiServer (Sitecore before), and I am looking for a way to synchronize content between environments automatically (developer-to-developer and developer-to-QA environments).
We have our QA environment on an Azure virtual machine and need to synchronize content during CI/CD.
EPiServer DXC Service doesn't meet the requirements because we are not working on a web app service.
Any ideas? Is there an existing way to achieve this?
Automatic synchronization between environments has more or less gone away within the Episerver platform. The old way of doing this was mirroring, but that's not available in the DXP and is eventually being removed from the platform in favor of other strategies:
For moving bulk data (content items, content types, categories, visitor groups, etc.) between different environments, without touching code or the database, use the "Import Data" and "Export Data" tools within Admin mode. More information here: http://webhelp.episerver.com/latest/en/cms-admin/exporting-importing-data.htm
For bigger bulk migrations of data between environments, a database backup and restore between environments is typical. Obviously, this is a bit riskier when a production environment is involved.
If the content (or a content type change) is required as part of a deployment, you can build a content migration step. More information here: https://www.gulla.net/en/blog/renaming-an-episerver-page-property-using-a-migration-step/ and https://world.episerver.com/documentation/developer-guides/CMS/Content/Refactoring-content-type-classes/
If you are simply wanting to move authored content from a staging environment to production, it's suggested to create all content in production and use Episerver's Projects feature. More information here: https://webhelp.episerver.com/latest/en/cms-edit/projects.htm
If you are using Azure DevOps for CI/CD, you might want to look into the Epinova DXP Extension for Azure DevOps - it uses the Episerver deployment API but makes it easier to set up your pipelines. See https://www.epinova.no/en/folg-med/blog/2020/episerver-dxp-content-harmonization-with-epinova-dxp-deployment/ for more info.
Take a look at the Deployment API. There might be something in there for you. https://world.episerver.com/documentation/developer-guides/digital-experience-platform/deploying/episerver-digital-experience-cloud-deployment-api/
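If your environments are (or later move to) the DXP, the deployment API is scriptable. Here is a minimal sketch using the EpiCloud PowerShell module that wraps it (cmdlet names as documented for that module; the key, secret, and project ID are placeholders obtained from the DXP portal):

    # Install and authenticate against the deployment API
    Install-Module EpiCloud -Scope CurrentUser
    Connect-EpiCloud -ClientKey "<key>" -ClientSecret "<secret>" -ProjectId "<project-guid>"

    # Copy content (database and blobs) from one environment to another
    Start-EpiDeployment -SourceEnvironment Integration -TargetEnvironment Preproduction `
        -SourceApp cms -IncludeBlob -IncludeDb -Wait

The same cmdlets can run from an Azure DevOps pipeline, which is essentially what the Epinova extension mentioned above packages up for you.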
I am in the process of making my first Drupal 7 website and am searching for web hosting. I found that several hosts advertise 1-click Drupal installation. But while searching the net for the standard practice of site development, I found many sources explaining that you should develop the site in a local environment and then transfer it to the web server (which includes transferring the database and the whole Drupal installation with modules). This is quite convincing: you develop locally and transfer to the web, and the site works there as it did locally.
On the other hand, what is the use of a 1-click Drupal installation on the web server? I believe it installs a fresh Drupal core, so I would have to start developing all over again, installing each module: back to square one.
So, which is the BEST practice for making a website live: shall I develop locally first, or develop directly on the web server?
At the same time, what is the best practice for maintaining a site? I read that there should be one live site where visitors come, a second test site similar to the live one, and one local site. What is the standard practice for this, and how is it maintained?
Many thanks in advance.
In point 2 of my previous answer I outlined four servers: DEV, STAGE, QA, and PROD. This is usually the process at a biggish company where lots of people might be working in the infrastructure, development, and QA departments. That said, if you are not working in a complex environment, you might just have two Drupal installations: one for testing on DEV (e.g., dev.mysite.com) and one live (e.g., mysite.com). The different URLs can be arranged from your cPanel or personal panel in the case of a shared server.
They might run on the same server; however, the dev site is the one you will be working on while creating the site, and then you will clone dev and make it live once the site is ready.
You will keep the dev site as a space to test new features, fix bugs, and test updates of modules or core files. Once these new features are implemented, or bugs fixed, you will replicate the same steps on the live site.
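As an illustration, that clone step can be scripted with Drush (Drupal 7 era); this minimal sketch assumes you have configured site aliases @dev and @live:

    # Push the dev database to the live site (destructive on the target!)
    drush sql-sync @dev @live

    # Copy the files directory (uploaded images, attachments) across
    drush rsync @dev:%files @live:%files

    # Clear all caches on the live site so it picks up the changes
    drush @live cc all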
Git is a version control system: it allows you to keep track of the code you are working on. You might want to create two branches: DEV and MASTER.
You will be working on DEV to create the site, update files, or fix bugs, and once the code is stable you will merge it to MASTER and pull it on the live server. I hope this clarifies things a bit.
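A minimal sketch of that branch flow, using the branch names described above:

    git checkout -b dev      # create the working branch and develop on it
    # ...commit work on dev...
    git checkout master
    git merge dev            # promote stable work to master

    # On the live server, only ever pull the stable branch:
    git pull origin master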
1) One-click installations are usually offered on shared servers, which might have lower performance and memory limits than your local LAMP setup. It is good to check which versions of PHP and MySQL run on the server, as well as the maximum upload file size and connection time limits. I prefer to start working locally and then publish to my server, BUT if you install on your server first, you will get a good idea of how Drupal will perform in the real world; you can always clone the site and database to your local machine, and you will also avoid the ugly surprise of trying to move your site from local to your server and finding bugs or migration issues.
2) In an enterprise dev environment you might have three or four steps: DEV ("wild west"), STAGE (release candidate), QA (quality assurance server), and PROD (live site). You usually sync (e.g., with Git) your local environment to DEV or STAGE, then you push to QA, and then, if all is good, to PROD.
We are using Sitecore 8.1 with the Lucene search provider, one CM and two CDs. The solution is hosted in Azure Web Apps.
We noticed that when a content author publishes or updates an article, the changes are seen by some users/browsers and not by others.
I suspect this is due to the index not being updated on one of the CDs (as the History Engine is not enabled). In the past I could troubleshoot this by RDPing to the Azure Web Role VM or similar and analysing the Lucene index files' date/time.
The above is not possible with a Web App, as you can't RDP or FTP to specific instances.
So:
Is there a way in Sitecore to find out whether the index has been 100% built on each of N CDs?
Is it true that the History Engine MUST be turned on if we have more than one CD?
If there are N (where N > 1) CDs, does one of the CDs get its index updated instantly after publishing ends? This is what we have noticed, and it confuses me.
Is there any reason why the History Engine section might be missing out of the box?
Thanks.
Don't know.
My understanding is that you need to have the History Engine "on" if you have ANY CDs.
A combined instance (CM and CD on the same instance) does not need the History Engine, as its index gets updated instantly.
I would expect it to be missing, as the default installation is not intended for scaling. Also, I would mention that you need all the CD instances you publish to explicitly listed in web.config (or added through an Include file). Please see this post from Alen Pelin: http://blog.alen.pw/2012/06/lucene-index-isnt-updated-on-cd.html
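For reference, the commonly cited snippet for enabling the History Engine on the web database looks like the following (the 30-day EntryLifeTime is just the usual example value; treat this as a sketch and check it against the scaling guide for your Sitecore version):

    <!-- inside the <database id="web"> element in web.config,
         or delivered through a config include patch -->
    <Engines.HistoryEngine.Storage>
      <obj type="Sitecore.Data.$(database).$(database)HistoryStorage, Sitecore.Kernel">
        <param connectionStringName="$(id)" />
        <EntryLifeTime>30.00:00:00</EntryLifeTime>
      </obj>
    </Engines.HistoryEngine.Storage>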
Live to Development Migration
We are currently migrating some SharePoint sites from external live environments to development environments hosted on VMs. The sites are a mixture of websites and intranets. We have not had access to the live environments, so we cannot specify the structure of the sites.
The sites do have some customisations applied. Some customisations are packaged as WSP packages for which we have the source code (somewhere; previous developers have left it and we need to find it).
We have no knowledge of the sites' setup, so the objective is to restore live back to a development VM so we can fix bugs and make enhancements moving forward.
What steps should we go through for this?
We have outlined the following steps:
Take a copy of the content database(s)
Take a copy of the WSP packages straight from the live environment (using PowerShell)
Create the site collections from live on dev
Restore the content databases from live onto these
Deploy the WSPs from live onto dev
Activate the features from live on our development VMs
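For illustration, a hedged PowerShell sketch of steps 2 and 4-6 using the standard SharePoint 2010 cmdlets (server names, URLs, and solution/feature names are placeholders):

    Add-PSSnapin Microsoft.SharePoint.PowerShell

    # Step 2, on the live farm: save every farm solution (.wsp) to disk
    (Get-SPFarm).Solutions | ForEach-Object {
        $_.SolutionFile.SaveAs("C:\wsp\" + $_.Name)
    }

    # Step 4, on the dev farm: attach the restored content database
    Mount-SPContentDatabase -Name "WSS_Content_Live" `
        -DatabaseServer "DEVSQL" -WebApplication "http://dev-sp"

    # Steps 5 and 6: deploy a copied solution and re-activate a feature
    Add-SPSolution -LiteralPath "C:\wsp\MySolution.wsp"
    Install-SPSolution -Identity "MySolution.wsp" -GACDeployment -AllWebApplications
    Enable-SPFeature -Identity "MyFeature" -Url "http://dev-sp/sites/intranet"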
What other steps are missing? I am sure there are some.
What I would add here:
Make sure your notifications don't go to the users of the live environment
Make sure your BDC and custom connectivity components don't modify or otherwise load production external data sources
Document and verify (using PowerShell; see the sketch after this list) that all assets are deployed accurately, because sometimes you'll face issues such as event receiver registration, etc.
Make sure your InfoPath forms are reconfigured to use the updated data sources
Make sure your Alternate Access Mappings and Incoming/Outgoing Email settings are adequate
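For the PowerShell verification and the notification point above, a minimal sketch (the URLs and SMTP host are placeholders):

    # Verify every solution copied from live is actually deployed
    Get-SPSolution | Select-Object Name, Deployed, DeploymentState

    # List the features activated on the restored site collection
    Get-SPFeature -Site "http://dev-sp/sites/intranet" | Select-Object DisplayName, Id

    # Check that Alternate Access Mappings point at dev URLs, not live ones
    Get-SPAlternateURL | Select-Object IncomingUrl, Zone, PublicUrl

    # Point outgoing e-mail at a harmless dev SMTP host so live users get no alerts
    $wa = Get-SPWebApplication "http://dev-sp"
    $wa.UpdateMailSettings("dev-smtp.local", "noreply@dev.local", "noreply@dev.local", 65001)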
I am working on a SharePoint 2010 server, and I have the following items in my SharePoint solution:
A couple of web parts
A state machine workflow (which will be integrated with an InfoPath form library)
InfoPath task edit forms
Let's say this solution is deployed to http://[SharePoint2010Server]:[PortNumber-x]/
This is the only server I have (no extra server for UAT), and whatever has been done so far needs to be handed over for user acceptance testing (UAT). I may create one more site at http://[SharePoint2010Server]:[PortNumber-y]/ for it.
My problem is: how do I maintain two copies of my source code? I will have to continue development on the same source code for the next UAT release. I cannot simply create copies of the source code, as they would have the same assembly names, feature IDs, etc., and if I did so, any changes to the in-development source code would affect the UAT assemblies.
One thing I can do is create separate projects for UAT and development with different feature and assembly names, but isn't that too much unnecessary work?
What would be the best approach in such a situation?
I would suggest using sandboxed solutions: create two site collections, one for UAT and the other for production, and deploy your solution individually to each. Sandboxed solutions are not deployed to the GAC, so you would not have trouble with code duplication.
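A minimal sketch of that deployment, assuming two site collections on different ports (the URLs and .wsp name are placeholders):

    Add-PSSnapin Microsoft.SharePoint.PowerShell

    # Upload and activate the same sandboxed .wsp in each site collection's
    # solution gallery; the two copies run independently of each other
    foreach ($site in "http://server:8081/", "http://server:8082/") {
        Add-SPUserSolution -LiteralPath "C:\builds\MySolution.wsp" -Site $site
        Install-SPUserSolution -Identity "MySolution.wsp" -Site $site
    }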
There is no best approach for this. You need to have different environments for development and UAT.
If machine availability is the limitation, one possible solution is to have a VM environment on the same machine for development and the actual environment for UAT.
That obviously requires a sophisticated hardware configuration to run both simultaneously.