I'm extremely new to this, so apologies if it's a dumb question, but I couldn't find anything about it here, at help.octopusdeploy.com, or on Google.
Additionally, I'm a DevOps engineer, not a developer, and have been using TeamCity and Octopus for about three weeks. I'm loving it so far, but it's probably best if you consider me a total rookie ;)
I currently have a build configuration in TeamCity that, on a successful build, creates a release in Octopus and deploys the project to a test server. Each feature build is kept separate but deployed alongside the master build. So, in IIS it looks like:
IIS Sites
site.domain.com (master build)
featurebuild1-site.domain.com (feature branch 1)
featurebuild2-site.domain.com (feature branch 2)
etc...
Obviously, this makes life really easy for the devs when testing their feature builds, but it leaves a hell of a mess on the test and integration servers. I can go in and clean them up manually, but I'd vastly prefer it to not leave crap lying around after they've removed the branch in TeamCity.
So, the Project in TeamCity looks like:
Project Name
Feature
/Featurebuild1
/Featurebuild2
/Featurebuild3
Master
Assuming all three feature builds run successfully, I will have 3 feature build IIS sites on the test server alongside the master. If they decide they're done with Featurebuild3 and remove it, I want to somehow automate the removal of featurebuild3-site.domain.com in IIS on my test server. Is this possible? If so, how?
My initial thought is to have another Octopus project that will go in and remove the site(s), but I can't figure out whether, or how, I can trigger it.
Relevant details:
TeamCity version: 9.1.1 (build 37059)
Octopus Deploy version: 3.0.10.2278
Ok, it took me a little while to figure it out, but here's what I ended up doing (just in the event that anyone else is attempting to do the same thing).
I ended up bypassing TeamCity entirely and using our Stash repositories as the source. Also, as I didn't need it to clean up immediately upon deletion, I was happy to have it run nightly. Once I'd decided that, it came down to a bunch of nested REST API calls to loop through each project and team to enumerate all the different repositories (apologies if I'm butchering terminology here).
$stashroot = "http://<yourstashsite>/rest/api/1.0"
$stashsuffix = "/repos/"
$stashappendix = "/branches"
# Enumerate the projects/teams first (curl here is PowerShell's alias for Invoke-WebRequest)
$teamquery = curl ($stashroot + "/projects") -ErrorAction SilentlyContinue
At this point, I started using jq (https://stedolan.github.io/jq/) to do some better parsing of the JSON I was getting back:
$teams = $teamquery.Content | jq -r ".values[].link.url"
Foreach ($team in $teams)
{
    # Get the list of repositories for this project/team
    $project = $stashroot + $team + $stashsuffix
    $projectquery = curl $project -ErrorAction SilentlyContinue
    $repos = $projectquery.Content | jq -r ".values[].name"
    Foreach ($repo in $repos)
    {
        Try
        {
            # Feature branch URL format looks like: http://<yourstashsite>/projects/<projectname>/repos/<repositoryname>/branches
            $repository = $stashroot + $team + $stashsuffix + $repo + $stashappendix
            $repositoryquery = curl $repository -ErrorAction SilentlyContinue
            $branchnames = $repositoryquery.Content | jq -r ".values[].displayId"
            Foreach ($branchname in $branchnames)
            {
                #write-host $team "/" $repo "/" $branchname -ErrorAction SilentlyContinue
                $NewObject = New-Object PSObject
                $NewObject | Add-Member -MemberType NoteProperty -Name "Team" -Value $team
                $NewObject | Add-Member -MemberType NoteProperty -Name "Repository" -Value $repo
                $NewObject | Add-Member -MemberType NoteProperty -Name "Branch" -Value $branchname
                $NewObject | Export-Csv <desiredfilepath> -NoType -Append
            }
        }
        Catch{} # Yes, I know this is terrible; it makes me sad too :(
    }
}
After that, it was simply a matter of running Compare-Object against the CSV files from two different days (I have logic in place to look for a pre-existing CSV and rename it to append "_yesterday"), outputting to a file all the repositories/branches that have been nuked since yesterday.
After that, it strips out the feature branch names (which we use to prefix the test site names in IIS) and loops through looking for any sites in IIS that match that prefix, removes them and their associated application pools, and deletes the directories on the server that stored the site content.
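Here's roughly what that cleanup step looks like; this is a minimal sketch, and the CSV paths, column names, and site-naming convention are illustrative assumptions rather than my exact script:

$yesterday = Import-Csv "D:\audit\branches_yesterday.csv"   # assumed file locations
$today = Import-Csv "D:\audit\branches.csv"
# Branches present yesterday but gone today (the "<=" side of Compare-Object)
$removed = Compare-Object $yesterday $today -Property Team, Repository, Branch |
    Where-Object { $_.SideIndicator -eq "<=" }

Import-Module WebAdministration
Foreach ($branch in $removed)
{
    # Assumed convention: the feature branch name prefixes the IIS site name
    $prefix = $branch.Branch + "-"
    Foreach ($site in (Get-ChildItem IIS:\Sites | Where-Object { $_.Name -like "$prefix*" }))
    {
        $pool = $site.applicationPool
        $path = $site.physicalPath
        Remove-Website -Name $site.Name
        Remove-WebAppPool -Name $pool
        Remove-Item $path -Recurse -Force
    }
}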
I'm sure there are far better ways to achieve this, especially if you know how to code. I'm just a poor little script monkey though, so I have to make do with what I have :)
I am trying to create a site script from an existing team site, but when I run the script that I followed from this Microsoft document, it prompts me for the WebUrl (see the screenshot), even though it is in the script; then, when I provide it at the prompt, it gives me an error (also in the screenshot).
I am not new to site scripts but have not used them much. I would like to create one from an existing site that I have created for the PM team. Please advise. I am using the latest SharePoint Online Management Shell and I am logged in. I am also a Global Admin.
Any assistance would be helpful as I have done everything I can think of to do and Googled my heart out but cannot figure out what is going on.
Welcome to StackOverflow!
From the screenshot you posted, it looks like you did not include the ` (backtick) at the end of each line in your command. Those are necessary in order for PowerShell to understand the entirety of a command that spans multiple lines, as in the following example:
Get-SPOSiteScriptFromWeb `
-WebUrl https://contoso.sharepoint.com/sites/template `
-IncludeBranding `
-IncludeTheme `
-IncludeRegionalSettings `
-IncludeSiteExternalSharingCapability `
-IncludeLinksToExportedItems `
-IncludedLists ("Shared Documents", "Lists/Project Activities")
An alternative would be to enter the entire command and all parameters on a single line, like this:
Get-SPOSiteScriptFromWeb -WebUrl https://contoso.sharepoint.com/sites/template -IncludeBranding -IncludeTheme -IncludeRegionalSettings -IncludeSiteExternalSharingCapability -IncludeLinksToExportedItems -IncludedLists ("Shared Documents", "Lists/Project Activities")
Hope this helps :)
Dragan
Good morning Team.
Need some guidance on New-AzVM -AvailabilitySetName.
All the normal params work just fine, but to use an Azure Load Balancer, you need a scale set or an availability set for the VMs.
A normal CentOS VM build works 100% without -AvailabilitySetName.
When I run Get-AzAvailabilitySet I can see my pre-staged set listed.
When I pass the availability set name as given by Get-AzAvailabilitySet, I get "Parameter cannot be resolved".
Does anybody have a working example? TechNet only states it must be passed as a string.
I validated my variable with gm (Get-Member) and it is a string. What am I missing?
Found my own answer after two days of playing around. For those who might have a similar problem: it appears New-AzVM -AvailabilitySetName does not work. You need to reference the availability set by its .Id where you declare the other VM basics like size, using the -AvailabilitySetId parameter, like this:
# Define the parameters for the new virtual machine.
$VirtualMachine = New-AzVMConfig -VMName $azureVmName -VMSize $azureVmSize -AvailabilitySetId $azureAvailSetID
Oh, and like all the others, it needs to be a string.
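For completeness, here's a minimal sketch of how the pieces fit together (the resource group and availability set names are placeholders):

# Look up the pre-staged availability set and grab its Id, which is a string
$availSet = Get-AzAvailabilitySet -ResourceGroupName "MyResourceGroup" -Name "MyAvailabilitySet"
$azureAvailSetID = $availSet.Id

# Pass the Id (not the name) into the VM configuration
$VirtualMachine = New-AzVMConfig -VMName $azureVmName -VMSize $azureVmSize -AvailabilitySetId $azureAvailSetID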
My scenario is that we want to run tests on DevOps multithreaded. Mainly, things should run and pass, so no PNGs or video recordings are needed.
BUT, say we have issues: I have a runsettings file with the settings to record videos. It seems to create folders for every test that has run, but only the main window that is displayed is actually being recorded. So, my solution is to turn off the Parallelize option for the tests, to see if the video recorder can then start and stop on each individual test. How would I change that setting based on the runsettings file, and how can I turn it off at the application level? Or how do I successfully record video for each test run in parallel?
I actually found another way to do it. During the DevOps process, after checking out the code, I run a bit of PowerShell, which is below. Then only a single test runs at a time.
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      (Get-Content -Path '$(Build.Repository.LocalPath)\code\properties\AssemblyInfo.cs') |
      ForEach-Object {$_ -replace 'Workers = 0','Workers = 1'} |
      Out-File '$(Build.Repository.LocalPath)\code\properties\AssemblyInfo.cs'
    errorActionPreference: 'continue'
I've changed the paths (polling URIs) to the XML data, but Windows still requests the old XML URL.
I tried updating the XML URL with the following steps:
Turn live tiles off
Unpin tile
MS Edge browser cache and history clearing
Delete all content within C:/Users/user_name/AppData/Local/Packages/Microsoft.MicrosoftEdge_randomized_hash/LocalState/PinnedTiles
Delete file iconcache.db inside C:/Users/user_name/AppData/Local
Disk Cleanup
So I start MS Edge again and pin the tiles to the Start menu. Then I can see in the server logs that Windows still requests the old XML path.
How do I update it? There must be some system cache, I suppose...
I've spent a lot of time and would appreciate any advice!
Microsoft Support replied to my question. The reason turned out to be the MS Edge cache.
These steps helped me; I hope they'll help someone else.
Please try the steps below to reset the Edge browser and check again. Please know that resetting Microsoft Edge will remove your bookmarks and history. Follow the instructions provided below and check.
a. Navigate to the location:
C:\Users\%username%\AppData\Local\Packages\Microsoft.MicrosoftEdge_8wekyb3d8bbwe
b. Delete everything in this folder.
c. Type Windows Powershell in search box.
d. Right click on Windows Powershell and select Run as administrator.
e. Copy and paste the following command.
Get-AppXPackage -AllUsers -Name Microsoft.MicrosoftEdge | Foreach {Add-AppxPackage -DisableDevelopmentMode -Register "$($_.InstallLocation)\AppXManifest.xml" -Verbose}
The above worked for me. However, I have since discovered that after unpinning the tile(s) you don't want, you can simply delete the unrequired tile folder(s) here: C:\Users\YOUR USERNAME\AppData\Local\Packages\Microsoft.MicrosoftEdge_8wekyb3d8bbwe\LocalState\PinnedTiles
Not to hijack the topic, but if you'd like to see all your bookmarks/favorites, here's a starter powershell script to give you that information.
You'll need the Newtonsoft.Json.dll
cls;
# Load Newtonsoft.Json so we can deserialize the favourites files
[Reflection.Assembly]::LoadFile("C:\Users\<YOUR USER FOLDER>\Documents\WindowsPowerShell\Newtonsoft.Json.dll") | Out-Null;
$source = "C:\Users\<YOUR USER FOLDER>\AppData\Local\Packages\Microsoft.MicrosoftEdge_8wekyb3d8bbwe\RoamingState";
$filter = "{*}.json";
$files = Get-ChildItem -Recurse -Path $source -Filter $filter -File;
foreach ($f in $files)
{
    # -Raw reads the file as a single string instead of an array of lines
    $json = Get-Content -Path $f.FullName -Raw
    $result = [Newtonsoft.Json.JsonConvert]::DeserializeObject($json);
    $result.Title.ToString()
    $result.URL.ToString()
}
I am very new to PowerShell and have a small amount of Linux bash scripting experience. I have been looking for a way to get a list of files on a server that contain Social Security Numbers. I found the following in my research, and it performed exactly as I wanted when I tested it on my home computer, except that it did not return results from my Word and Excel test documents. Is there a way to use a PowerShell command to get results from the various Office documents as well? This server is almost all Word and Excel files with a few PowerPoints.
PS C:\Users\Stephen> Get-ChildItem -Path C:\Users -Recurse -Exclude *.exe, *.dll | `
Select-String "\d{3}[-| ]\d{2}[-| ]\d{4}"
Documents\SSN:1:222-33-2345
Documents\SSN:2:111-22-1234
Documents\SSN:3:111 11 1234
PS C:\Users\Stephen> Get-childitem -rec | ?{ findstr.exe /mprc:. $_.FullName } | `
select-string "[0-9]{3}[-| ][0-9]{2}[-| ][0-9]{4}"
Documents\SSN:1:222-33-2345
Documents\SSN:2:111-22-1234
Documents\SSN:3:111 11 1234
Is there a way to use a PowerShell command to get results from the various Office documents as well? This server is almost all Word and Excel files with a few PowerPoints.
When interacting with MS Office files, the best way is to use COM interfaces to grab the information you need.
If you are new to Powershell, COM will definitely be somewhat of a learning curve for you, as very little "beginner" documentation exists on the internet.
Therefore I strongly advise starting off small:
First, focus on opening a single Word doc and reading its contents into a string.
Once you have that working, focus on extracting the relevant info (the PowerShell -match operator is very helpful).
Once you are able to work with a single Word doc, locate all files named *.docx in a folder and repeat the process on them: foreach ($file in (ls *.docx)) { # work on $file }
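To make that first step concrete, here's a minimal sketch that reads one Word document's text through COM and scans it for the same SSN-style pattern as in the question. The file path is a placeholder, and it assumes Word is installed on the machine:

# Start Word invisibly via COM (requires Word to be installed)
$word = New-Object -ComObject Word.Application
$word.Visible = $false
# Open(FileName, ConfirmConversions, ReadOnly)
$doc = $word.Documents.Open("C:\temp\test.docx", $false, $true)
$text = $doc.Content.Text
# Report any SSN-like matches in the document text
[regex]::Matches($text, "\d{3}[- ]\d{2}[- ]\d{4}") | ForEach-Object { $_.Value }
$doc.Close($false)
$word.Quit()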
Here's some reading (admittedly, all of this is for Excel, as I build automated Excel charting tools, but the lessons will be very helpful for automating any Office application):
Powershell and Excel - Introduction
A useful document from a deleted link (link points to the Google Cache for that doc) - http://dev.donet.com/automating-excel-spreadsheets-with-powershell
Introduction to working with "Objects" in PS - CodeProject
If you only want to restrict this to .docx and .xlsx, you might also consider simply unzipping the files and searching through their contents, ignoring any XML tags (i.e. allowing one or more XML elements between each digit).
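As a rough sketch of that idea (the path is a placeholder, and the pattern only approximates "digits possibly separated by XML tags"):

# Treat the .docx as a zip archive and read the main document part
Add-Type -AssemblyName System.IO.Compression.FileSystem
$zip = [IO.Compression.ZipFile]::OpenRead("C:\temp\test.docx")
$entry = $zip.GetEntry("word/document.xml")
$reader = New-Object IO.StreamReader($entry.Open())
$xml = $reader.ReadToEnd()
$reader.Close()
$zip.Dispose()

# Allow zero or more XML tags between each digit of the SSN pattern
$t = "(?:<[^>]*>)*"
$pattern = "(?:\d$t){3}[- ]$t(?:\d$t){2}[- ]$t(?:\d$t){4}"
[regex]::Matches($xml, $pattern) | ForEach-Object { $_.Value }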