I am aware that I could use Task Scheduler to schedule an unattended installation. However, is it possible to run the Setup, select options and then have it start the install process at a set time or with a countdown timer? If possible, how could this functionality be created?
You basically need two installers.
An actual installer.
A wrapper installer that will just store/install the actual installer somewhere and schedule a task to run the actual installer at a certain time.
It can actually be just one installer (binary), running with a different set of (command-line) arguments. But that's a bit more difficult to implement.
See also How to add a scheduled task with Inno Setup
Or consider using the RunOnce registry key to schedule the upgrade for the next login (if that helps).
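For example, the wrapper's last step could be a small PowerShell script along these lines (a rough sketch only; the paths, task name, and /SILENT switch are placeholders, not tied to any particular installer):

# Copy the real installer somewhere permanent (placeholder paths)
$installer = 'C:\ProgramData\MyApp\RealSetup.exe'
Copy-Item "$PSScriptRoot\RealSetup.exe" $installer -Force

# Option A: run it at a fixed time via a scheduled task (here: 3 AM tomorrow)
$action  = New-ScheduledTaskAction -Execute $installer -Argument '/SILENT'
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date).Date.AddDays(1).AddHours(3)
Register-ScheduledTask -TaskName 'MyAppDeferredInstall' -Action $action -Trigger $trigger -RunLevel Highest

# Option B: run it once at the current user's next login via the RunOnce key
Set-ItemProperty 'HKCU:\SOFTWARE\Microsoft\Windows\CurrentVersion\RunOnce' `
    -Name 'MyAppDeferredInstall' -Value "`"$installer`" /SILENT"

A countdown timer would be the same idea, just computing the trigger time as "now plus N minutes" before registering the task.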
I am new to Azure DevOps and trying to create my first Azure pipeline. I have an ASP.NET MVC project, and there are a few NuGet packages that need to be restored before the MSBuild step.
Unfortunately, the NuGet restore is failing with the following error:
The pipeline is not valid. Job Job_1: Step 'NuGetCommand' references
task 'NuGetCommand' at version '2.194.0' contains an execution handler
that relies on NodeJS version '6' which is restricted by your
administrator.
NodeJS 6 came disabled out of the box, so we are not going to enable it.
My Questions:
Is there an alternative to NuGet restore that does not use NodeJS?
Is there a way to update NodeJS 6 to a higher version?
Update 23-Nov-2021:
I have found a workaround for the time being. I am using a custom PowerShell script to restore the NuGet packages and build the Visual Studio project:
$msBuildExe = 'C:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\MSBuild\Current\Bin\MSBuild.exe'
Write-Host "Restoring NuGet packages" -foregroundcolor green
& "$($msBuildExe)" "$($path)" /p:Configuration=Release /p:platform=x86 /t:restore
Note: $path here is the path to my .csproj file
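The build step of the script isn't shown above; presumably it is just another MSBuild call with the same placeholder variables and without the restore target, along these lines (a guess, not the exact script):

Write-Host "Building project" -foregroundcolor green
& "$($msBuildExe)" "$($path)" /p:Configuration=Release /p:platform=x86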
Apparently, other people are also getting the same issue, and it is just a matter of time before the task is updated by the open-source community.
Here are some similar issues being faced in other tasks as well:
https://github.com/microsoft/azure-pipelines-tasks/issues/15526
https://github.com/microsoft/azure-pipelines-tasks/issues/15511
https://github.com/microsoft/azure-pipelines-tasks/issues/15516
https://github.com/microsoft/azure-pipelines-tasks/issues/15525
It's Azure DevOps' NuGetCommand task that uses NodeJS, not NuGet itself. Therefore, you can find a way to restore without using Azure DevOps' NuGetCommand task.
Idea 1: use the DotNetCoreCLI task instead. However, this probably won't work for you, since you said your project is ASP.NET MVC rather than ASP.NET Core. Also, it appears to need NodeJS to run.
Idea 2: Use MSBuild restore. You can test on your local machine whether or not this works by clearing your global packages folder, or temporarily configuring NuGet to use a different path, and then running msbuild -t:restore My.sln from a Developer PowerShell For Visual Studio prompt. If your project uses packages.config, rather than PackageReference, you'll need to also pass -p:RestorePackagesConfig=true (although maybe this is currently broken). I'm not an expert on Azure Pipelines tasks, so I don't know what it means that this task defines both PowerShell and Node execution entry points, but maybe it means it will work even if your CI agent doesn't allow NodeJS.
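That local test could look something like this (a minimal sketch; My.sln is a placeholder, and the NUGET_PACKAGES environment variable is only set so the test doesn't reuse your real global packages folder):

# Point NuGet at a throwaway packages folder for the duration of this session
$env:NUGET_PACKAGES = 'C:\temp\nuget-restore-test'
# Restore; -p:RestorePackagesConfig=true is only needed for packages.config projects
msbuild -t:restore -p:RestorePackagesConfig=true My.sln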
Idea 3: Don't use any of the built-in tasks, just use - script: or - task: PowerShell@2, but even then it's a little questionable whether it'll work, since even the PowerShell task defines a Node execution entry point. I'm guessing it will work, but I don't have access to a CI agent where NodeJS is forbidden, so I couldn't test even if I wanted to. Anyway, if this works, then you can run MSBuild yourself (but it might also be your responsibility to find msbuild.exe if it's not on the path). Or you can download nuget.exe yourself and execute it in your script, as in the sketch below. The point is, if you can get Azure Pipeline's script task working, you can run any script and do everything you need yourself.
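A rough sketch of such a script step (the solution name and temp paths are placeholders; dist.nuget.org is the official download location for nuget.exe, and vswhere.exe is only present on machines with Visual Studio 2017 or later):

# Download a standalone nuget.exe and restore with it, no built-in task involved
$nuget = Join-Path $env:TEMP 'nuget.exe'
Invoke-WebRequest 'https://dist.nuget.org/win-x86-commandline/latest/nuget.exe' -OutFile $nuget
& $nuget restore 'MySolution.sln'

# Locate MSBuild with vswhere if it isn't on the PATH, then build
$vswhere = "${env:ProgramFiles(x86)}\Microsoft Visual Studio\Installer\vswhere.exe"
$msbuild = & $vswhere -latest -requires Microsoft.Component.MSBuild -find 'MSBuild\**\Bin\MSBuild.exe' | Select-Object -First 1
& $msbuild 'MySolution.sln' /p:Configuration=Release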
Idea 4: Use Microsoft-hosted agents. They have documented all the software they pre-install on the machines, which includes NodeJS. The downside is that once you exceed the free quota it costs money, and I've worked for companies where it's easier to get money to buy hardware once-off (and pretend that maintenance of the server is free, even though it reduces team productivity) than to pay for a monthly service. So I'll totally understand if this is not an option for you.
Idea 5: Talk to whoever maintains your CI agents and convince them to allow and install NodeJS. It's clearly a fundamental part of Azure Pipelines. The tasks are open source on GitHub, and you can see that pretty much all of them use NodeJS to orchestrate whatever work they do. Frankly, I thought the agent software itself was a NodeJS application, so I'm surprised that it runs without NodeJS.
We have a self-hosted build agent on an on-prem server.
We typically have a large codebase, and in the past followed this mechanism with TFS2013 build agents:
Daily check-ins were built to c:\work\tfs\ (taking about 5 minutes)
Each night a batch file would run that did the same build to those folders, using the same sources (they were already 'latest' from the CI build), and built the installers. It then copied the files to a network location and sent an email to the team detailing the build successes/failures. (Taking about 40 minutes.)
The key thing there is that for the nightly build there would be no need to get the latest sources, and the disk space required wouldn't grow much, just by the installer sizes.
To replicate this with Azure Devops, I created two pipelines.
One pipeline that does the CI using MSBuild tasks in the classic editor - works great.
Another pipeline in the classic editor that runs our existing PowerShell script, scheduled at 9 pm - works great.
However, even though my agent doesn't support parallel builds, what's happening is that:
The CI pipeline's folder is c:\work\1\
The Nightly build folder is c:\work\2\
This doubles the amount of disk space we need (10 GB to 20 GB)
They are the same code files, just built differently.
I have struggled to find a way to say to the agent "please use the same sources folder for all pipelines"
What setting is this? Otherwise we have to pay our service provider for extra GB of storage.
Or do I need to change my classic pipelines into YAML and somehow conditionally branch the build so it knows it's being scheduled and do something different?
Or maybe stop using a pipeline for the scheduled build, and use Task Scheduler in Windows as before?
(I did try looking for the same question - I'm sure I can't be the only one).
There is a "workingDirectory" directive available for running scripts in a pipeline. This link has the details: https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/utility/command-line?view=azure-devops&tabs=yaml
The number ('1', '2', ... '6') in the work folders c:\work\1\, c:\work\2\, ... c:\work\6\ on your build agent stands for a particular pipeline.
Agent.BuildDirectory
The local path on the agent where all folders for a given build
pipeline are created. This variable has the same value as
Pipeline.Workspace. For example: /home/vsts/work/1
If you have two pipelines, there will also be two corresponding work folders. This is expected behavior. We cannot configure pipelines to share the same build folder; this is by design.
If you need to use less disk space to save cost, I'm afraid that stopping using a pipeline for the scheduled build and using Task Scheduler in Windows as before is the better way.
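If it helps to see exactly where the space goes, a quick PowerShell script step can print each work folder on the agent and its size (a rough sketch assuming a Windows self-hosted agent; Agent.BuildDirectory is exposed to scripts as the AGENT_BUILDDIRECTORY environment variable):

# Show this pipeline's build directory and the sizes of all work folders on the agent
Write-Host "Agent.BuildDirectory: $env:AGENT_BUILDDIRECTORY"
Get-ChildItem (Split-Path $env:AGENT_BUILDDIRECTORY) -Directory | ForEach-Object {
    $gb = (Get-ChildItem $_.FullName -Recurse -File -ErrorAction SilentlyContinue |
           Measure-Object Length -Sum).Sum / 1GB
    '{0,-30} {1,6:N1} GB' -f $_.Name, $gb
}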
I am working on a B2C project.
We are using customized bundle implementation in our project.
However, we were struggling to make bundle synchronization work.
With some effort, we managed to find the ImpEx to synchronize the Bundle:
UPDATE CatalogVersionSyncJob;code[unique=true];roottypes(code)[mode=append];
;$syncJobCode;BundleTemplate,BundleTemplateStatus;
However, I am worried whether this will also synchronize the ChangeProductPriceBundleRule available out of the box.
Also, how do I synchronize this from the Backoffice? Is running this cronjob from the Backoffice the only solution? How do Backoffice users see a Synchronize button? Is creating a separate node necessary?
How to synchronize from the Backoffice?
Go to Backoffice
System -> Background Processes -> Jobs
Search for your sync job and use the Sync button in the panel
Is running this cronjob from the Backoffice the only solution?
No; you can schedule this job to run at a specific interval (as a CronJob)
Is creating a separate node necessary?
Not necessarily, but you can customize it if you want.
Hope it helps!
I've just started to get to grips with Jenkins. It currently performs the following tasks:
Pulls the latest codebase from git
Uploads the codebase via sftp to my environment
Sends a notification email to the testers and the PM to inform them of a completed deployment.
However for it to be truly useful I need it to perform two more tasks:
Delete the robots.txt and .htaccess files which exist in the git repo and replace them with predefined versions which are specific to the server
Go through all the code and remove specific code blocks (perhaps something in between comments, e.g. /** Dev only **/ Code to be removed goes here /** Dev only **/ or something like that).
Are there any plugins which can accomplish these things, or would I have to read up on writing Groovy scripts for this sort of thing (I don't know anything about those yet)?
On a related note: I'd also love it if it could combine kit and SASS files. However, I can't see a plugin for these things, so I assume I can just install Compass on my build server and then run it via the command line in the build process. Is that correct?
Instead of putting your build tasks directly into the Jenkins job, I recommend writing a build script to accomplish your publishing/deployment tasks.
Jenkins is great for having a single point of automation that is easy to run, can publish build results, and can track successes and failures. In my experience though, you're better off not putting your individual tasks and configuration steps into the Jenkins job configuration. At some point, you'll want to be able to run this job without Jenkins, either because you want to test local changes, or you want to handle multiple jobs and trying to keep job configurations in sync is not fun, or because you're moving to another build/deployment system. Also, putting the build script into a file allows you to put it into your source control system and track changes.
My advice: choose a scripting language (Python, Ruby, Perl, whatever you're comfortable with) or build system (SCons and Rake are options) and write a build script. In Python, Ruby, and Perl, it's easy to manipulate files (#1), and all have a wide choice of templating systems that will accomplish #2. Then the Jenkins job becomes running your build script on the command line (or executing it through a language-specific builder). And the build script can include running any of the tasks that you decide to put in your build (Compass, etc.).
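As a rough sketch of what such a script could do for #1 and #2, here in PowerShell with placeholder paths and the marker comments from your question (a plain regex pass stands in for a real templating system; any scripting language works the same way):

# 1. Replace the repo's robots.txt/.htaccess with server-specific versions
Copy-Item 'deploy\server\robots.txt' 'build\robots.txt' -Force
Copy-Item 'deploy\server\.htaccess'  'build\.htaccess'  -Force

# 2. Strip everything between /** Dev only **/ markers before uploading
Get-ChildItem 'build' -Recurse -File -Include *.php,*.js,*.css | ForEach-Object {
    $text = Get-Content $_.FullName -Raw
    $stripped = [regex]::Replace($text, '(?s)/\*\* Dev only \*\*/.*?/\*\* Dev only \*\*/', '')
    if ($stripped -ne $text) { Set-Content $_.FullName $stripped -NoNewline }
}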
How can I schedule a build without a tag across Windows, Linux and WCE in Hudson, using a shell script, and generate a report that will be sent to a specified server?
The conditions are:
1. How can I create the build without creating a new tag?
2. How is it possible to execute .sh scripts on Windows and WCE (Windows Mobile)? Is it simply by going through Cygwin? Moreover, does having a cross-platform (3 platforms) build mean that I must run the build 3 times?
3. How do I generate a report and save it in a directory of a server that I'm authorized to access?
I know that I asked many questions at once. It is because this is my first use of Hudson and these are kind of details. Moreover, I don't want to make a mistake by creating new tags during my tests. The 1st and 3rd questions are the most important. If anyone gives me the right answer to them, I'll choose it as the right answer.
Thanks a lot.
First, people nowadays mostly use Jenkins instead of Hudson (open source, better support).
A build can be started manually in Hudson/Jenkins; just click the green arrow. It will create a new build but won't change your repository (unless the last step of your build is creating a tag; in that case, just remove that step for testing).
Usually, .sh scripts run in shell executables (ash, sh, bash, csh...) and are not supported by the shell on Windows. You'll have to go through Cygwin or have a platform-specific build command.
This one is not entirely clear to me. If you set up a matrix build in Jenkins (with the matrix axis being your target platform), you'll automatically get a nice report in Jenkins itself (the status of each build). You can keep artifacts (use the post-build action: archive the artifacts) or use another plugin to publish the files you like (example: FTP reporting).
Sorry for not being able to be more precise; that's as far as I understand your questions.