How to extract the current version of Windows through KQL in Azure?

We are planning to upgrade all our 2012 R2 servers to Server 2019, and to do this we need to get an overview of all our 2012 servers in Azure.
We are currently using the KQL query below to get the installed version, but we noticed that it only includes the Windows version that was FIRST installed, not the currently installed Windows version. Is there any way to get the current Windows version out of an Azure KQL script?
Resources
| where type =~ 'Microsoft.Compute/virtualMachines'
| project resourceGroup, version = properties.storageProfile.imageReference.sku, minorversion=properties.storageProfile.imageReference.exactVersion
| join ResourceContainers on resourceGroup
| project resourceGroup, tags.Description, version, minorversion

I found a solution with the script below:
Resources
| where type =~ 'Microsoft.Compute/virtualMachines'
| project resourceGroup, name, initial_OSVersion = properties.storageProfile.imageReference.sku, current_OS = properties.extended.instanceView.osName, current_OSVersion = properties.extended.instanceView.osVersion
| join ResourceContainers on resourceGroup
| project customer_Name = tags.Description, ComputerName = name, current_OS,initial_OSVersion, current_OSVersion, resourceGroup
| order by tostring(current_OS) asc
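If you prefer to run the same query from PowerShell instead of the portal, here is a minimal sketch using the Az.ResourceGraph module (an assumption; install it first if it is not present) that additionally filters the results down to the 2012 machines:
# Hedged sketch: run the Resource Graph query from PowerShell and keep only 2012 servers
# Assumes the Az.ResourceGraph module is available (Install-Module Az.ResourceGraph if not)
$query = @"
Resources
| where type =~ 'Microsoft.Compute/virtualMachines'
| extend current_OS = tostring(properties.extended.instanceView.osName),
         current_OSVersion = tostring(properties.extended.instanceView.osVersion)
| where current_OS contains '2012'
| project name, resourceGroup, current_OS, current_OSVersion
"@
Search-AzGraph -Query $query | Format-Table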

Related

Alternative to New-SqlAzureKeyVaultColumnMasterKeySettings

I currently have a script that is similar to the one described at https://learn.microsoft.com/en-us/sql/relational-databases/security/encryption/configure-always-encrypted-keys-using-powershell?view=sql-server-ver15
However, the script uses PowerShell 5, which isn't available to me on the Linux agents in our Azure DevOps environment. Because of this, a majority of the Azure SQL commands aren't available to us. Is there an alternative to these SqlServer module cmdlets?
The recommended alternative is to use the cloud console, but you may be able to get away with Linux PowerShell like so:
I can't guarantee that this all works, but that specific command only creates a SqlColumnMasterKeySettings object that contains information about the location of your column master key. You can probably just create one manually, but you'll need to know the exact values. I would recommend running it from a Windows machine first to see what the values should be for your environment.
# On Windows [Optional]
$cmkSettings = New-SqlAzureKeyVaultColumnMasterKeySettings -KeyURL $akvKey.Key.Kid
$cmkSettings | Format-List KeystoreProviderName,KeyPath
KeystoreProviderName : # Take these strings
KeyPath : # And use them in your script on linux
# Now on Linux, using the values above:
$cmkSettings = [Microsoft.SqlServer.Management.PowerShell.AlwaysEncrypted.SqlColumnMasterKeySettings]::new("YourKeystoreProviderName","YourKeyPath")
New-SqlColumnMasterKey -Name 'CMK1' -InputObject $database -ColumnMasterKeySettings $cmkSettings
# Success!
The key settings properties are just strings that get saved to your SQL Instance, so this should work fine. The harder part is authenticating to Azure to create keys from your master key, but you can try importing the desktop version of the commands like so:
# Start a NEW powershell session without the sqlserver module:
pwsh
# Get the module directory:
$d = (Get-Item (Get-Module SqlServer).path).DirectoryName
# Import the desktop version of these assemblies:
Import-Module "$d/Microsoft.SqlServer.Diagnostics.Strace.dll"
Import-Module "$d/Microsoft.SqlServer.Management.PSSnapins.dll"
Import-Module "$d/Microsoft.SqlServer.Management.AzureAuthenticationManagement.dll"
Import-Module "$d/Microsoft.SqlServer.Management.AlwaysEncrypted.Types.dll"
# Then import the module normally (there will be errors - you can ignore these)
Import-Module SqlServer
# Now you may be able to authenticate to Azure to generate new keys:
# Required to generate new keys
# NOTE: -Interactive fails on linux
Add-SqlAzureAuthenticationContext -ClientID "YourID" -Secret "YourSecret" -Tenant "YourTenant"
# Create a key using your master key:
New-SqlColumnEncryptionKey -Name 'CEK1' -InputObject $database -ColumnMasterKey 'CMK1'
This worked on my installation of CentOS 7 / pwsh 7.1.3. Make sure you have SqlServer version 21.1.18245 (only 10 days old at the moment), as many new SQL commands got ported to pwsh 7.1.
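If you are not sure which SqlServer module version the agent has, a quick sketch like this (standard PowerShellGet cmdlets; the pinned version is the one mentioned above) can confirm and install it:
# Hedged sketch: check the installed SqlServer module and pin the version referenced above
Get-Module SqlServer -ListAvailable | Select-Object Name, Version
Install-Module SqlServer -RequiredVersion 21.1.18245 -Scope CurrentUser -Force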

Prisma 2 query relation returns null

I migrated my project from Prisma 1.
Then I added new models related to my Live model.
I try to query the new models through their relations (one-to-one and one-to-many), but the relations return null.
Here is my schema.prisma.
The query live.questionnaire{} works but questionnaire.live{} doesn't, and both questionnaire.questionnaireField{} and questionnaireField.questionnaire{} don't work.
Here is an example from the playground:
live.questionnaire is OK.
questionnaire.live is not.
My schemas:
Live schema preview, where the relation live.questionnaire is OK.
Each of questionnaire.live, questionnaire.fields and questionnaireField.questionnaire is down.
My query resolver, where only createdAt works.
In Prisma Studio everything seems to be good...
If someone could help, thanks.
Your resolvers are not clear to me, but I will give you a way to build your GraphQL server with Prisma in just minutes using the Pal.js CLI. You can start a new project with my CLI, just put in your schema and run one command (pal g), and you will get everything ready for you with a full CRUD system.
Install my CLI:
yarn global add @paljs/cli
# or
npm install -g @paljs/cli
Start a new project:
> pal c
.______ ___ __ __ _______.
| _ \ / \ | | | | / |
| |_) | / ^ \ | | | | | (----`
| ___/ / /_\ \ | | .--. | | \ \
| | / _____ \ | `----.| `--' | .----) |
| _| /__/ \__\ |_______| \______/ |_______/
✔ Please select your start example · apollo-sdl-first
full-stack-nextjs
full-stack-gatsbyjs
apollo-nexus-schema
❯ apollo-sdl-first // select this example
graphql-modules
✔ please enter your project name · great-project
✔ please enter your project description · new NodeJs Prisma GraphQL TypeScript project
✔ please enter your project author · Ahmed Elywa
✔ please enter your project repository · https://github.com/paljs/prisma-tools
✔ please select your package manager · yarn
❯ yarn
npm
✔ Skip package installation · no
❯ yes
no
Now open your project folder and replace the ./prisma/schema.prisma file with yours.
Now generate all the resolvers:
pal g
Your GraphQL server is now ready.
Start your server:
yarn dev
or
npm run dev
That is it.

Transfer file from Windows to Linux without using 3rd party software and using Shell.Application only

How can I transfer a file from Windows Server to Linux without using 3rd party software? I can only use a pure PowerShell script to transfer the zip file.
I'm using PowerShell v2.0 (I know it's pretty old, and I don't have the privilege to update to a current version; I can only use Shell.Application scripting).
Telnet connects successfully.
The destination server has a private/public key installed (which I generated from my server using PuTTYgen, but I have no privilege to install PuTTY or WinSCP).
$timestamp = (Get-Date).AddMonths(-1).ToString('yyyy-MM-dd')
$todaysDate = (Get-Date).AddDays(-1)
$source = "D:\Testing\*.csv", "D:\Testing\*.csv"
$target = "D:\Testing\bin\$timestamp.zip"
$housekeepZipFile = "D:\Testing\bin\*.zip"
$locationToTransfer = "D:\Testing\bin\*.zip"
$mftFileTransfer = "UserName@192.168.0.50:/UserName/Outbox"
Get-ChildItem -Path $locationToTransfer -Recurse | Where-Object {
$_.LastWriteTime -gt (Get-Date).AddDays(-1)
} | Copy-Item -Destination $mftFileTransfer -Force
Is my syntax correct? I just tried it, and it seems no file is received.
Using Windows Server 2008.
As Ansgar already commented, keys are used with SSH/SFTP. There's no support for SSH/SFTP in PowerShell or in Windows 2008. If you need to use SSH/SFTP, you have to use 3rd-party software or a library.
And as already said above, you do not need install privileges to use WinSCP or PuTTY/psftp.
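For example, the portable pscp.exe that ships alongside PuTTY can simply be copied into a folder and run from PowerShell without any installation. A minimal sketch, where the tool path, key file and remote account are assumptions to adjust to your environment:
# Hedged sketch: push the zips with the portable pscp.exe (no installation, just the .exe)
# The paths, the .ppk key file and the remote account below are placeholders
$pscp   = "D:\Tools\pscp.exe"
$key    = "D:\Keys\transfer.ppk"          # private key converted to PuTTY .ppk format
$source = "D:\Testing\bin\*.zip"
$dest   = "UserName@192.168.0.50:/UserName/Outbox"
& $pscp -batch -i $key $source $dest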

Heptio Velero unable to take backup of persistent volumes in Azure AKS

I am using:
- AKS
- k8s version 1.12.5
- Velero version v0.11.0
- Documents referred from the link
I installed Velero on the server:
Installed the prereqs, i.e. 00-prereqs.yaml. It installs the velero namespace, the velero service account, RBAC rules, etc.
Created an Azure storage account and a container in it (I used Terraform to create the storage account and the AZ CLI to create the storage container). It is all based on the documentation available.
Created the secret:
kubectl create secret generic cloud-credentials \
  --namespace velero \
  --from-literal AZURE_SUBSCRIPTION_ID="" \
  --from-literal AZURE_TENANT_ID="" \
  --from-literal AZURE_CLIENT_ID="" \
  --from-literal AZURE_CLIENT_SECRET="" \
  --from-literal AZURE_RESOURCE_GROUP="name of the resource group where my VMs etc. are created (typically starts with MC_ in Azure)"
Applied the remaining k8s resources present at
Executed the backup command (sketched below).
I observed that this command created files for my backup in my storage account as well.
A similar structure was created for other backups as well.
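The exact backup command is not quoted above; a minimal sketch of what it typically looks like with the Velero 0.11 CLI, where the --include-namespaces value is an assumption and the backup name is the one that appears in the logs further down:
# Hedged sketch - the --include-namespaces value is an assumption
velero backup create d042203191536 --include-namespaces default
# Inspect the result and the logs of that backup
velero backup describe d042203191536
velero backup logs d042203191536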
While checking the pod logs, I observed the following information:
time="2019-03-22T14:38:02Z" level=info msg="Executing takePVSnapshot"
backup=velero/d042203191536 group=v1 groupResource=pods
logSource="pkg/backup/item_backupper.go:378"
name=pvc-6dd56a3d-4c90-11e9-bc92-1297bc38e414 namespace=default
time="2019-03-22T14:38:02Z" level=info msg="label
\"failure-domain.beta.kubernetes.io/zone\" is not present on
PersistentVolume"
And again:
level=error msg="Error getting block store for volume snapshot
time="2019-03-22T14:38:02Z" level=info msg="PersistentVolume is not a
supported volume type for snapshots, skipping."
backup=velero/d042203191536 group=v1 groupResource=pods
logSource="pkg/backup/item_backupper.go:436"
and the following error as well:
level=error msg="backup failed" controller=backup
error="[clusterroles.rbac.authorization.k8s.io
\"system:auth-delegator\" not found,
clusterroles.rbac.authorization.k8s.io \"system:auth-delegator\" not
found]" key=velero/d042203191618
logSource="pkg/controller/backup_controller.go:202"
I observed all these logs after executing backups at multiple time intervals.
I am not sure if I am missing anything; any pointers to resolve these problems would be really helpful.
These are the currently supported volume providers:
| Provider | Maintainer | Contact |
| --- | --- | --- |
| [Azure Managed Disks][3] | Ark Team | [Slack][10], [GitHub Issue][11] |
| [Google Compute Engine Disks][4] | Ark Team | [Slack][10], [GitHub Issue][11] |
| [Restic][1] | Ark Team | [Slack][10], [GitHub Issue][11] |
| [Portworx][6] | PortWorx | |
| [DigitalOcean][7] | StackPointCloud | |
Make sure your volume type is compatible with the Velero plugins; a quick way to check what type your PersistentVolumes actually are is sketched below.
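A minimal sketch (kubectl piped into PowerShell; the property names checked are the standard PV spec fields) for listing the backing type of each PersistentVolume:
# Hedged sketch: list each PersistentVolume and whether it is backed by an Azure disk or Azure Files
kubectl get pv -o json | ConvertFrom-Json |
    ForEach-Object { $_.items } |
    Select-Object @{n='Name';e={$_.metadata.name}},
                  @{n='AzureDisk';e={[bool]$_.spec.azureDisk}},
                  @{n='AzureFile';e={[bool]$_.spec.azureFile}}
# Volumes that are not Azure managed disks (for example azureFile) are skipped by the Azure
# block store plugin, which is one common cause of the "not a supported volume type" message.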

Node application, manage module dependencies

I'm new to Node development and I'm doing a little bit of training.
My folder structure looks like this:
node-test
|
|
+---build
| |
| +---node_modules
| |
| +---public
| |
| +---server
| |
| \---main.js
|
|---src
|
+---node_modules
|
+---public
|
+---server
|
|---.gitignore
|
\---main.ts
I'm using Gulp to build the source code into the build folder, maintaining the structure.
I'm facing some problems related to Node dependencies.
Initially I used gulp-npm-files to copy the required dependencies into build\node_modules, following the examples found on the package page, but some dependency was always missing. So for the moment Gulp copies the entire src\node_modules into build\node_modules. This is certainly not the right way, but Express works and I can see the static HTML.
I also have another problem with dependencies:
this.app.use(express.static(RouterContants.PublicDirectory));
This line sets the only folder that the browser can access, right? OK, then how can I access node_modules?
How should dependencies be managed correctly?
Any suggestions and advice are welcome.
Thank you
