Write-SqlTableData to import CSV into an Azure SQL Server table - azure

The command below creates a new table, test, for me, but it doesn't insert any data into it.
Write-SqlTableData -TableName test -ServerInstance myservername -DatabaseName mydb -SchemaName dbo -Credential $mycreds -InputData $data -Force
This is the error message I get:
Write-SqlTableData : Cannot access destination table '[Mydb].[dbo].
[test]'.
At line:1 char:1
+ Write-SqlTableData -TableName test -ServerInstance myinstance
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : WriteError: ([dbo].[test]:Table) [Write-SqlTableData], InvalidOperationException
+ FullyQualifiedErrorId : WriteToTableFailure,Microsoft.SqlServer.Management.PowerShell.WriteSqlTableData
Any ideas are appreciated.
UPDATE
This is the code to populate data.
$data = import-csv 'C:\Users\azure-user\myfile.csv'
This is what the file looks like -
"State","ProviderNbr","Entity","Address","City","Zip","Phone","Ownership","StarRating"
"AL","017000","ALABAMA DEPARTMENT OF HEALTH HOME CARE","201 MONROE STREET, SUITE 1200", "ALABAMA", "32423", "3233233233", "Alabama", "4"

This is a weird one - as you say, in Azure Read-SqlTableData works but Write-SqlTableData doesn't. From the discussion on MSDN here I think it's something to do with the Azure environment making it hard for the cmdlet to interpret the 'ServerInstance' parameter.
Example 5 in Microsoft's Write-SqlTableData documentation, "Write data to an existing table of an Azure SQL Database", shows the way forward - we need to instantiate an SMO reference to the table and feed that to the cmdlet instead. Unfortunately the example Microsoft gives contains a small error: you can't do $table = $db.Tables["MyTable1"] to get the table; it doesn't work.
Here's a modified version of that example:
$csvPath = "C:\Temp\mycsv.csv"
$csvDelimiter = ","
# Set your connection string to Azure SQL DB.
$connString = 'Data Source=server;Persist Security Info=True'
$cred = Get-Credential -Message "Enter your SQL Auth credentials"
$cred.Password.MakeReadOnly()
$sqlcred = New-Object -TypeName System.Data.SqlClient.SqlCredential -ArgumentList $cred.UserName,$cred.Password
# Create a SqlConnection and finally get access to the SMO Server object.
$sqlcc = New-Object -TypeName System.Data.SqlClient.SqlConnection -ArgumentList $connString,$sqlcred
$sc = New-Object -TypeName Microsoft.SqlServer.Management.Common.ServerConnection -ArgumentList $sqlcc
$srv = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server -ArgumentList $sc
# Get access to table 'MyTable1' on database 'MyDB'.
# Note: both objects are assumed to exist already.
$db = $srv.Databases["MyDB"]
$tableSmo = $db.Tables | Where-Object {$_.Name -eq "MyTable1"}
# leading comma makes an array with one item
# this makes PowerShell pass the entire contents of the file directly to the Write-SqlTableData cmdlet, which in turn can do a bulk-insert
, (Import-Csv -Path $csvPath -Delimiter $csvDelimiter) | Write-SqlTableData -InputObject $tableSmo
$sc.Disconnect()
If you're using integrated security you can omit all the $cred and $sqlcred steps and just create the SqlConnection with $sqlcc = New-Object -TypeName System.Data.SqlClient.SqlConnection -ArgumentList $connString
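A minimal sketch of that variant (my assumption: the connection string gains Integrated Security=True, and 'server' remains a placeholder for your server name):
# Integrated-security variant of the setup above (sketch)
$connString = 'Data Source=server;Integrated Security=True;Persist Security Info=True'
$sqlcc = New-Object -TypeName System.Data.SqlClient.SqlConnection -ArgumentList $connString
$sc = New-Object -TypeName Microsoft.SqlServer.Management.Common.ServerConnection -ArgumentList $sqlcc
$srv = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server -ArgumentList $sc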
Note: this worked for me with "a SQL Server running on a VM in Azure" - I was getting the same 'cannot access destination table' error as you, and this approach fixed it. I haven't tested it with an "Azure SQL Database" (i.e. SQL Server as a service), but from the Microsoft documentation it sounds like it should work.

According to my test, we cannot use the Write-SqlTableData command to import a CSV file into Azure SQL; we can only use it to import a CSV file into an on-premises SQL Server.
So if you want to import a CSV file into Azure SQL with PowerShell, you can use the Invoke-SqlCmd command to insert the records one by one. For example:
$Database = ''
$Server = '.database.windows.net'
$UserName = ''
$Password = ''
$CSVFileName = ''
$text = "CREATE TABLE [dbo].[Colors2](
[id] [int] NULL,
[value] [nvarchar](30) NULL
) "
Invoke-SQLCmd -ServerInstance $Server -Database $Database -Username $UserName -Password $Password -Query $text
$CSVImport = Import-CSV -Path $CSVFileName
ForEach ($CSVLine in $CSVImport){
$Id = [int] $CSVLine.Id
$Value = $CSVLine.value
$SQLInsert = "INSERT INTO [dbo].[Colors2] (id, value)
VALUES('$Id', '$Value');"
Invoke-SQLCmd -ServerInstance $Server -Database $Database -Username $UserName -Password $Password -Query $SQLInsert
}
Invoke-SQLCmd -ServerInstance $Server -Database $Database -Username $UserName -Password $Password -Query "select * from [dbo].[Colors2]"
Besides, you can also use other approaches (such as BULK INSERT) to implement it. For further information, please refer to https://learn.microsoft.com/en-us/sql/relational-databases/import-export/import-data-from-excel-to-sql?view=azuresqldb-current.
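For illustration, a rough sketch of the BULK INSERT route, assuming the CSV has already been uploaded to an Azure Blob Storage container and an external data source for it already exists in the database (the data source name 'MyAzureBlobStorage' and the file path are hypothetical):
# Sketch only: requires the CSV in blob storage and an external data source created beforehand
$bulkInsert = "BULK INSERT [dbo].[Colors2]
FROM 'data/colors.csv'
WITH (DATA_SOURCE = 'MyAzureBlobStorage', FORMAT = 'CSV', FIRSTROW = 2);"
Invoke-SqlCmd -ServerInstance $Server -Database $Database -Username $UserName -Password $Password -Query $bulkInsert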

Related

Query ObjectId of ConditionalAccessLocationCondition

I am writing a script to write to Azure; I basically want to find a user, create a network location, and create a conditional access policy. This is what I have so far. The trouble is that $secmon_guid and $location_policy_guid do not work. If I manually put the values in, it works.
# Run these commands first to connect and install without the #
Install-Module -Name AzureAD -AllowClobber -Force # Answer Y to install NuGet. Run once on workstation running script.
Install-Module -Name Microsoft.Graph.Identity.SignIns -Force # Install this to allow us to setup a trusted location. Run once on workstation running script.
Install-Module MSOnline -Force #Allow us to edit users. Run once on workstation running script.
Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope LocalMachine #Set execution policy to allow our script to do things.
Import-Module -Name AzureAD #The following 3 commands are ran for each client.
Connect-AzureAD # Use GA credentials from Glue
Connect-MsolService #Reauthenticate if necessary.
Get-AzureADMSConditionalAccessPolicy #This will list out all of the existing CA policies. This is a good opportunity to get them into documentation.
Connect-MgGraph #This enabled graph, you will need to approve the request in the popup window.
#Set variable for account name
Set-Variable -name "account" -Value "secmon"
#Create named location for the IP address
$ipRanges = New-Object -TypeName Microsoft.Open.MSGraph.Model.IpRange
$ipRanges.cidrAddress = "IP ADDR"
New-AzureADMSNamedLocationPolicy -OdataType "#microsoft.graph.ipNamedLocation" -DisplayName "Blackpoint IP Address for SecMon" -IsTrusted $true -IpRanges $ipRanges
#Disable MFA for secmon
Get-MsolUser -SearchString "secmon" | Set-MsolUser -StrongAuthenticationRequirements @()
#Get the Azure AD GUID for use later
$secmon_guid = Get-MsolUser -SearchString "secmon" | Select ObjectID
#Name the policy
$name = "Allow Secmon Only from Blackpoint IP"
#Enable the policy. Set to Disabled to test.
$state = "Enabled"
#Get location GUID and save to variable
$location_policy_guid = Get-AzureADMSNamedLocationPolicy | Where-Object -Property DisplayName -Contains 'Blackpoint IP Address for SecMon' | Select-Object -Property Id
#Working on this
#Create the overarching condition set for CA, this is the container.
$conditions = New-Object -TypeName Microsoft.Open.MSGraph.Model.ConditionalAccessConditionSet
#Include all applications - This might be able to be removed?
$conditions.Applications = New-Object -TypeName Microsoft.Open.MSGraph.Model.ConditionalAccessApplicationCondition
$conditions.Applications.IncludeApplications = 'All'
#Create the user condition and include secmon
$conditions.Users = New-Object -TypeName Microsoft.Open.MSGraph.Model.ConditionalAccessUserCondition
$conditions.Users.IncludeUsers = $secmon_guid
#Add new location policy to CA policy
$conditions.Locations = New-Object -TypeName Microsoft.Open.MSGraph.Model.ConditionalAccessLocationCondition
$conditions.Locations.IncludeLocations = $location_policy_guid
#Grant access control to CA policy
$controls = New-Object -TypeName Microsoft.Open.MSGraph.Model.ConditionalAccessGrantControls
$controls._Operator = "OR"
$controls.BuiltInControls = "block"
#End work
New-AzureADMSConditionalAccessPolicy `
-DisplayName $name `
-State $state `
-Conditions $conditions `
-GrantControls $controls
The error I get is due to poorly formatted GUIDs; the values I am pulling are not correct. How can I fix this? Any help is greatly appreciated!
New-AzureADMSConditionalAccessPolicy : Error occurred while executing NewAzureADMSConditionalAccessPolicy
Code: BadRequest
Message: 1054: Invalid location value: @{Id=1234GUID}.
InnerError:
RequestId: 5678GUID
Where you define the variables, you need to use -ExpandProperty on the Select-Object statement, e.g.:
$secmon_guid = Get-MsolUser -SearchString "secmon" | Select -ExpandProperty ObjectID
Otherwise, you would have to access your current variable like so:
$conditions.Users.IncludeUsers = $secmon_guid.ObjectID
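Applying the same fix to the location variable from the question (mirroring the code already shown there):
$secmon_guid = Get-MsolUser -SearchString "secmon" | Select-Object -ExpandProperty ObjectID
$location_policy_guid = Get-AzureADMSNamedLocationPolicy | Where-Object -Property DisplayName -Contains 'Blackpoint IP Address for SecMon' | Select-Object -ExpandProperty Id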

Assign the result of a query to a variable in powershell in Azure-Runbooks

I have a runbook using automation in Azure. It is getting a single integer result from a table and it can return the correct value. The code I am using is below and it works.
$SQLServerCred = Get-AutomationPSCredential -Name "SqlCredential"
#Import the SQL Server Name from the Automation variable.
$SQL_Server_Name = Get-AutomationVariable -Name "SqlServer"
#Import the SQL DB from the Automation variable.
$SQL_DB_Name = Get-AutomationVariable -Name "Database"
$Query = "select max(je.ExecutionOrder) as LastStepExecuted
from PPoint.JobExecutionHistory je
where je.EventType = 'Start'
and je.JobRunId = PPoint.fnGetJobRunID()"
invoke-sqlcmd -ServerInstance "$SQL_Server_Name" -Database "$SQL_DB_Name" -Credential $SQLServerCred -Query "$Query" -Encrypt
The next step for me is to assign the result from the query to a variable and then evaluate it to see if it should call another runbook. So I want to have a variable named LastStep and assign it the integer result of LastStepExecuted from the query above. I then want to do something along these lines (this is pseudocode):
if LastStep = 2147483647
call another runbook
else
do nothing - end the runbook
I have tried several ways to capture the LastStepExecuted in a variable but I can't figure it out. Can anyone help?
Any help or advice much appreciated.
You can use the Start-AzAutomationRunbook cmdlet to trigger a child runbook from another runbook inside the same Automation account.
$SQLServerCred = Get-AutomationPSCredential -Name "SqlCredential"
#Import the SQL Server Name from the Automation variable.
$SQL_Server_Name = Get-AutomationVariable -Name "SqlServer"
#Import the SQL DB from the Automation variable.
$SQL_DB_Name = Get-AutomationVariable -Name "Database"
$Query = "select max(je.ExecutionOrder) as LastStepExecuted
from PPoint.JobExecutionHistory je
where je.EventType = 'Start'
and je.JobRunId = PPoint.fnGetJobRunID()"
# Invoke-Sqlcmd returns a DataRow, so pull out just the LastStepExecuted column for the comparison below
$LastStep = (invoke-sqlcmd -ServerInstance "$SQL_Server_Name" -Database "$SQL_DB_Name" -Credential $SQLServerCred -Query "$Query" -Encrypt).LastStepExecuted
if($LastStep -eq 2147483647)
{
Start-AzAutomationRunbook -AutomationAccountName "MyAutomationAccount" -Name "Test-ChildRunbook" -ResourceGroupName "LabRG" -DefaultProfile $AzureContext -Parameters $params -Wait
}
else
{
Write-Output "conditionfailed the value of $LastStep"
}
You can refer to this documentation about modular (child) runbooks in Azure Automation.
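Note that $params and $AzureContext in the example above are placeholders. A minimal sketch of defining them inside the runbook, assuming the Automation account uses a system-assigned managed identity and the child runbook takes a hypothetical parameter named TargetDatabase:
# Authenticate with the Automation account's managed identity and capture the context
$AzureContext = (Connect-AzAccount -Identity).Context
# Hypothetical parameter hashtable for the child runbook
$params = @{ "TargetDatabase" = $SQL_DB_Name }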

Continuous integration and Deploy SSAS tabular to Azure Analysis Services

I am trying to deploy an SSAS Tabular model to Azure Analysis Services through an MSBuild and Release process.
I am able to successfully execute Invoke-ProcessASDatabase, but I am having a problem deploying new objects to the Azure server.
I am using Command Line to deploy the tabular model using below command
"C:\Program Files (x86)\Microsoft SQL Server\140\Tools\Binn\ManagementStudio\Microsoft.AnalysisServices.Deployment.exe"
and it fails with error -
Authentication failed: User ID and Password are required when user interface is not available.
I don't see a way to provide credentials in my command-line task.
I was also trying to automate model deployment.
Here is the PowerShell script I wrote. Hope this helps.
$msBuildPath = Get-MSBuildToPath
$Microsoft_AnalysisServices_Deployment_Exe_Path = Get-Microsoft_AnalysisServices_Deployment_Exe_Path
# Build the .smproj project
& $msBuildPath $projPath "/p:Configuration=validation" /t:Build
Get-ChildItem $binPath | Copy -Destination $workingFolder -Recurse
$secureStringRecreated = ConvertTo-SecureString -String $AnalysisServerPassword -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential($AnalysisServerUserName, $secureStringRecreated)
#$plainText = $cred.GetNetworkCredential().Password
#region begin Update Model.deploymenttargets
# Read Model.deploymenttargets
[xml]$deploymenttargets = Get-Content -Path $deploymenttargetsFilePath
$deploymenttargets.DeploymentTarget.Database = $AnalysisDatabase
$deploymenttargets.DeploymentTarget.Server = $AnalysisServer
$deploymenttargets.DeploymentTarget.ConnectionString = "DataSource=$AnalysisServer;Timeout=0;UID=$AnalysisServerUserName;Password=$AnalysisServerPassword;"
$deploymenttargets.Save($deploymenttargetsFilePath);
#endregion
#region begin Update Model.deploymentoptions
# Read Model.deploymentoptions
[xml]$deploymentoptions = Get-Content -Path $deploymentoptionsFilePath
# Update ProcessingOption to DoNotProcess, otherwise the correct xmla file won't be generated.
$deploymentoptions.Deploymentoptions.ProcessingOption = 'DoNotProcess'
$deploymentoptions.Deploymentoptions.TransactionalDeployment = 'false'
$deploymentoptions.Save($deploymentoptionsFilePath);
#endregion
# Create xmla deployment file.
& $Microsoft_AnalysisServices_Deployment_Exe_Path $asdatabaseFilePath /s:$logFilePath /o:$xmlaFilePath
#region begin Update .xmla
#Add password to the .xmla file.
$xmladata = Get-Content -Path $xmlaFilePath | ConvertFrom-Json
foreach ($ds in $xmladata.createOrReplace.database.model.dataSources){
$ds.Credential.AuthenticationKind = 'Windows'
$ds.Credential.Username = $AnalysisServerUserName
#Add password property to the object.
$ds.credential | Add-Member -NotePropertyName Password -NotePropertyValue $AnalysisServerPassword
}
$xmladata | ConvertTo-Json -depth 100 | Out-File $xmlaFilePath
#endregion
#Deploy model xmla.
Invoke-ASCmd -InputFile $xmlaFilePath -Server $AnalysisServer -Credential $cred
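For completeness, the fragment above assumes several variables that aren't shown (and Get-MSBuildToPath / Get-Microsoft_AnalysisServices_Deployment_Exe_Path are custom helpers). A purely hypothetical set of definitions, with placeholder paths and names, might look like:
# Hypothetical inputs for the deployment fragment above - adjust to your project
$projPath = "C:\Build\MyTabularModel\MyTabularModel.smproj"
$binPath = "C:\Build\MyTabularModel\bin\validation"
$workingFolder = "C:\Build\Deploy"
$asdatabaseFilePath = Join-Path $workingFolder "Model.asdatabase"
$deploymenttargetsFilePath = Join-Path $workingFolder "Model.deploymenttargets"
$deploymentoptionsFilePath = Join-Path $workingFolder "Model.deploymentoptions"
$logFilePath = Join-Path $workingFolder "deploy.log"
$xmlaFilePath = Join-Path $workingFolder "Model.xmla"
$AnalysisServer = "asazure://region.asazure.windows.net/myserver"
$AnalysisDatabase = "MyTabularModel"
$AnalysisServerUserName = "deploy-user@contoso.com"
$AnalysisServerPassword = "<supply from a secure pipeline variable>"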

"Unable to get the Open property of the Workbooks class" when automated

I have some PowerShell scripts that invoke SQL commands, take the results and put them into a CSV file; the CSV file is then put into an Excel workbook and emailed out to a distribution list. I had no issues running these reports through Windows Scheduled Tasks on my older Windows 2008 server running SQL 2008, but I have migrated over to Windows 2016 running SQL 2016. Now when I run this process through Scheduled Tasks I get the following error:
Unable to get the Open property of the Workbooks class
At C:\PowerShell\scrpits\ArtiReport3.ps1:607 char:1
+ $workbook = $excel.Workbooks.Open($csvFilePath)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : OperationStopped: (:) [], COMException
+ FullyQualifiedErrorId : System.Runtime.InteropServices.COMException
If I run the PowerShell script manually I have no issues and everything runs fine. I'm using the same login to run the scripts manually as I do through scheduled tasks. Here is the script.
$query = "*SQL Query runs here*"
#Edit these parameters for the server this will be running on#
$smtpServer = "*server*";
$smtpFrom = "I3Reports@server.com";
$smtpTo = "*email list here*";
$messageSubject = "I3 Report";
#create anonymous login for sending e-mail#
$User = "anonymous";
$PWord = ConvertTo-SecureString -String "anonymous" -AsPlainText -Force
$Creds = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $user, $pword
$date = (Get-Date).AddDays(-1).ToString('yyyy-MM-dd')
$date = $date+"_I3Report.xls";
$csvFilePath = "c:\Scripts\queryresults.csv"
$excelFilePath = "c:\scripts\$date"
$instanceName = "*server*"
Import-Module "sqlps"
$results = Invoke-Sqlcmd -QueryTimeout 7200 -Query $query -ServerInstance $instanceName
# Output to CSV
$results | export-csv $csvFilePath -Delimiter " " -NoTypeInformation
#this line will remove all the quotation marks from the csv file
(Get-Content $csvFilePath) | % {$_ -replace '"', ""} | out-file -FilePath $csvFilePath -Force
# Convert CSV file to Excel
$excel = New-Object -ComObject excel.application
$excel.visible = $False
$excel.displayalerts=$False
$workbook = $excel.Workbooks.Open($csvFilePath) #<-- Program fails here
$workSheet = $workbook.worksheets.Item(1)
#$workSheet.cells.item(3,3) = "HOPLA"
#for freezing pane#
$workSheet.application.activewindow.splitcolumn = 0
$workSheet.application.activewindow.splitrow = 1
$workSheet.Range("A2").application.activewindow.freezepanes = $true
$resize = $workSheet.UsedRange
$resize.EntireColumn.AutoFit() | Out-Null
$xlExcel8 = 43
$workbook.SaveAs($excelFilePath, $xlExcel8)
$workbook.Close()
$excel.quit()
$excel = $null
send-mailmessage -from $smtpFrom -to $smtpTo -subject "$messageSubject" -body "Attachment" -Attachments $excelFilePath -smtpServer $smtpServer -Credential $creds;
As mentioned, this works when I run it in PowerShell manually, but through Scheduled Tasks it gets that error and it references the section I noted in the code. I've been working on this for days and I can't seem to figure out what is causing the issue. Any help or suggestions are welcome. Thanks for taking the time.
I found the solution to this problem here:
Powershell Excel Automation - Save/Open fails in Scheduled Task
Creating the folders and gaining access to the directories that they are in did it for me.
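For reference, the folders in question are the systemprofile Desktop directories that Excel COM automation expects when it runs without an interactive session. A minimal sketch of creating them (a commonly reported workaround rather than official guidance):
# Create the Desktop folders Excel COM automation looks for in non-interactive sessions
New-Item -ItemType Directory -Force -Path "C:\Windows\System32\config\systemprofile\Desktop"
New-Item -ItemType Directory -Force -Path "C:\Windows\SysWOW64\config\systemprofile\Desktop"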

Office 365 - how to manage many users

I have 200 unsorted users in Office 365. I want to find an easy way to manage who they are and what security group each user belongs to.
Is there an easy way to export each username and the groups that user belongs to?
I am quite new to PowerShell...
But I want to export a CSV file with users and groups.
Is this possible?
Or do you recommend any other way to quickly get an overview of all users and the groups they belong to?
Some users need to be in multiple groups, and I suspect some users are missing from groups they should be in.
Thanks for any tips I can get.
################################################################################################################################################################
# Script accepts 2 parameters from the command line
#
# Office365Username - Optional - Administrator login ID for the tenant we are querying
# Office365Password - Optional - Administrator login password for the tenant we are querying
#
#
# To run the script
#
# .\Get-DistributionGroupMembers.ps1 [-Office365Username admin@xxxxxx.onmicrosoft.com] [-Office365Password Password123]
#
#
# Author: Alan Byrne
# Version: 2.0
# Last Modified Date: 16/08/2014
# Last Modified By: Alan Byrne alan@cogmotive.com
################################################################################################################################################################
#Accept input parameters
Param(
[Parameter(Position=0, Mandatory=$false, ValueFromPipeline=$true)]
[string] $Office365Username,
[Parameter(Position=1, Mandatory=$false, ValueFromPipeline=$true)]
[string] $Office365Password
)
#Constant Variables
$OutputFile = "DistributionGroupMembers.csv" #The CSV Output file that is created, change for your purposes
$arrDLMembers = @{}
#Remove all existing Powershell sessions
Get-PSSession | Remove-PSSession
#Did they provide creds? If not, ask them for it.
if (([string]::IsNullOrEmpty($Office365Username) -eq $false) -and ([string]::IsNullOrEmpty($Office365Password) -eq $false))
{
$SecureOffice365Password = ConvertTo-SecureString -AsPlainText $Office365Password -Force
#Build credentials object
$Office365Credentials = New-Object System.Management.Automation.PSCredential $Office365Username, $SecureOffice365Password
}
else
{
#Build credentials object
$Office365Credentials = Get-Credential
}
#Create remote Powershell session
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://ps.outlook.com/powershell -Credential $Office365credentials -Authentication Basic -AllowRedirection
#Import the session
Import-PSSession $Session -AllowClobber | Out-Null
#Prepare Output file with headers
Out-File -FilePath $OutputFile -InputObject "Distribution Group DisplayName,Distribution Group Email,Member DisplayName, Member Email, Member Type" -Encoding UTF8
#Get all Distribution Groups from Office 365
$objDistributionGroups = Get-DistributionGroup -ResultSize Unlimited
#Iterate through all groups, one at a time
Foreach ($objDistributionGroup in $objDistributionGroups)
{
write-host "Processing $($objDistributionGroup.DisplayName)..."
#Get members of this group
$objDGMembers = Get-DistributionGroupMember -Identity $($objDistributionGroup.PrimarySmtpAddress)
write-host "Found $($objDGMembers.Count) members..."
#Iterate through each member
Foreach ($objMember in $objDGMembers)
{
Out-File -FilePath $OutputFile -InputObject "$($objDistributionGroup.DisplayName),$($objDistributionGroup.PrimarySMTPAddress),$($objMember.DisplayName),$($objMember.PrimarySMTPAddress),$($objMember.RecipientType)" -Encoding UTF8 -append
write-host "`t$($objDistributionGroup.DisplayName),$($objDistributionGroup.PrimarySMTPAddress),$($objMember.DisplayName),$($objMember.PrimarySMTPAddress),$($objMember.RecipientType)"
}
}
#Clean up session
Get-PSSession | Remove-PSSession
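The script above covers distribution groups. If you also want security group membership per user, a rough sketch with the AzureAD module (used elsewhere on this page) could look like the following, assuming you have already run Connect-AzureAD; the output file name is just an example:
# Sketch: export each user's display name, UPN and group memberships to CSV
Get-AzureADUser -All $true | ForEach-Object {
    $user = $_
    $groups = (Get-AzureADUserMembership -ObjectId $user.ObjectId |
        Where-Object { $_.ObjectType -eq "Group" } |
        Select-Object -ExpandProperty DisplayName) -join ";"
    [PSCustomObject]@{
        DisplayName       = $user.DisplayName
        UserPrincipalName = $user.UserPrincipalName
        Groups            = $groups
    }
} | Export-Csv -Path "UserGroupMembership.csv" -NoTypeInformation -Encoding UTF8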
