PowerShell Dynamic Creation of ADUsers from imported Excel sheet

First of all, I want to say that I'm very, very new to PowerShell; these are the first PS scripts I have written.
I'm currently working on a set of PS scripts for AD administration. The scripts for adding/deleting SMB shares, adding or removing users from groups, and so on, are already done.
I already had a working script for creating the users in AD, but it wasn't dynamic: all the variables were hard coded and had to be entered into a New-ADUser command. As the code will be used for more than one specific set of parameters, it has to be dynamic.
I'm working with Import-Excel and found a great function here, but I'm having two problems with it.
$sb = {
    param($propertyNames, $record)
    $propertyNames | ForEach-Object -Begin { $h = @{} } -Process {
        if ($null -ne $record.$_) { $h[$_] = $record.$_ }
    } -End { New-ADUser @h -Verbose }
}
Use-ExcelData -Path $Path -HeaderRow 1 -ScriptBlock $sb
The dynamic part of this is that the table headers are used as the parameter names for New-ADUser. The only thing that needs to change when the set of parameters changes is to add or delete a column in the Excel sheet; each column header must have the same name as the corresponding New-ADUser parameter.
[Screenshot of the Excel table]
My problem now is the "Type" header I've got in column A. It is needed to specify the type of the user, for adding the user to specific AD groups. But because the function above uses all headers as parameters, this doesn't work.
Does anyone have an idea how to change the function $sb so that it starts with the second column? I've experimented with skipping the first column (e.g. -Skip 1) and a lot of other workarounds, but with my inexperience nothing seemed to come close to what I need.
SOLVED (problem below): adding -DataOnly to Use-ExcelData fixed it.
The second problem is that the function does not stop trying to create users once there are no more values for the parameters. To experiment, I deleted the column "Type". When trying to create the two users testuser and testuser2, PowerShell creates both users with no problems, but then prompts for the name of another New-ADUser:
VERBOSE: Performing the operation "New" on target "CN=Test User,CN=Users,DC=****,DC=**".
VERBOSE: Performing the operation "New" on target "CN=Test2 User2,CN=Users,DC=****,DC=**".
Cmdlet New-ADUser at command pipeline position 1
Supply values for the following parameters:
Name:
Thank you in advance, sorry for my English, and please tell me if I did something wrong forum-wise.

If it were me, I would save the Excel sheet as a CSV file and then import it. It's faster and easier to consume: the headers become your property names and the import behaves like any other object.
$csvData = Import-Csv -Path <path to csv file>
From here, iterate the rows and access the values as properties of each row. There is no need to import the data into a hashtable; it's already accessible via the property names defined by the header row.
foreach ($row in $csvData) {
    Write-Host $row.Name
    Write-Host $row.Path
}
Once the loop reaches the end of the file, it stops trying to create users.
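To tie this back to the original question, here is a minimal sketch of how the CSV route could also handle the "Type" column: splat every column except "Type" into New-ADUser, then use "Type" separately for group membership. The file path, the SamAccountName column, and the group-name mapping are assumptions on my part, and any New-ADUser parameters that aren't strings (such as -Enabled) would still need converting.
# Sketch: build a splatting hashtable per row, excluding the "Type" column
$csvData = Import-Csv -Path 'C:\Temp\users.csv'   # path is an assumption
foreach ($row in $csvData) {
    $params = @{}
    foreach ($prop in $row.PSObject.Properties) {
        # Skip the "Type" column and any empty cells
        if ($prop.Name -ne 'Type' -and -not [string]::IsNullOrWhiteSpace($prop.Value)) {
            $params[$prop.Name] = $prop.Value
        }
    }
    New-ADUser @params -Verbose
    # Hypothetical mapping from Type to a group name
    if ($row.Type) {
        Add-ADGroupMember -Identity "Group_$($row.Type)" -Members $params['SamAccountName']
    }
}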
FYI, the use of single-letter variables is going to make your code very difficult to maintain. My eyes hurt just looking at it.

Related

Editing a number of lists of a SharePoint site together

I don't know if this is the right place to ask, but I am creating a Power Apps canvas app. My data sources are saved as SharePoint lists. I have to change the column type from 'Single line of text' to 'Choice' for all those lists. As I have a very large number of lists (around 100) and each list contains 30-40 columns, changing them one by one seems impossible. Is there a way I can change the column type in all the lists at once? Even a way to change the column type of all columns of a single list at once would ease the work a bit! Please help. If it's not possible directly, is it possible in JavaScript? Can you nudge me in the right direction? Thanks.
The easiest way is to run a PnP PowerShell script to update all the column types.
# Set variables
$SiteURL = "{siteUrl}"
$ListNames = @("List1", "List2")
$ColumnName = "Single"

# Connect to PnP Online
Connect-PnPOnline -Url $SiteURL -UseWebLogin

foreach ($listName in $ListNames) {
    $field = Get-PnPField -List $listName -Identity $ColumnName -Includes FieldTypeKind
    $field.FieldTypeKind = [Microsoft.SharePoint.Client.FieldType]::Choice
    $field.Update()
}
Invoke-PnPQuery
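For the follow-up about changing all columns of a single list at once, a hedged variation of the same idea might look like this (the list name is an example, and it assumes the text fields you want to convert are not hidden system fields):
# Sketch: convert every non-hidden single-line-of-text field of one list to Choice
$fields = Get-PnPField -List "List1" | Where-Object {
    $_.FieldTypeKind -eq [Microsoft.SharePoint.Client.FieldType]::Text -and -not $_.Hidden
}
foreach ($field in $fields) {
    $field.FieldTypeKind = [Microsoft.SharePoint.Client.FieldType]::Choice
    $field.Update()
}
Invoke-PnPQuery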

Export-Csv in PowerShell with custom column type

I'm exporting an AD report via PowerShell using the code below.
Get-ADUser -Filter 'enabled -eq $true' -SearchBase "OU=Staff,OU=Users,OU=OSA,DC=domian,DC=org" -Properties mail, employeeID |
    Select-Object employeeID, mail, ObjectGUID |
    Export-Csv "C:\Reports\ADExports\Students.csv" -NoTypeInformation
It outputs the CSV file and everything looks fine, except that the data type of all columns is set to 'Short Text'.
I require the Employee ID column type to be 'Number'. Is it possible to export a CSV with a custom field type?
I hope this makes sense.
Thanks in advance.
CSVs are plain text and do not contain type information. However, you can use the ImportExcel module, which provides an Export-Excel cmdlet. This cmdlet takes various Excel parameters, including -NumberFormat.
$x | Export-Excel -NumberFormat 'Number' -Path 'test.xlsx' # This worked for me.
You will probably have to play around with it a little depending on your exact use case. Good luck!
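Put together with the query from the question, a sketch might look like the following (untested against your environment; -NumberFormat sets the default number format for the sheet):
# Sketch: export the AD report straight to .xlsx with a numeric default format
# Requires the ImportExcel module (Install-Module ImportExcel)
Get-ADUser -Filter 'enabled -eq $true' -SearchBase "OU=Staff,OU=Users,OU=OSA,DC=domian,DC=org" -Properties mail, employeeID |
    Select-Object employeeID, mail, ObjectGUID |
    Export-Excel -Path 'C:\Reports\ADExports\Students.xlsx' -NumberFormat 'Number'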
OK guys, I'm going to assume that you cannot export a CSV via PowerShell with custom data types.
However, I found a way around my issue: I converted the data type when importing the CSV into Access, which solved my problem.
If anyone's interested, you can find the exact issue and solution here - Joining Two Tables with Different Data types MS ACCESS - 'type mismatch in expression' error
Thank you for bringing the ImportExcel module to my attention. Now I know there's a module available for this, and you can do quite a bit of Excel manipulation with it.
Thank you all for your comments/answers.
Thanks.

Issue with PowerShell Excel attempting to cast .csv values

At this point, I believe it may be a file I/O issue.
While using a PowerShell script that invokes Excel methods to go through a .csv file from a website, PowerShell attempts to cast the placeholder "#######" that Excel shows for data too wide for a cell, instead of the date and time actually contained within the cell (search engines may need 'pound sign' or 'hashtag' to reach this result).
Below is the offending portion of the script.
[DateTime]$S = $sheet.Cells.Item($rowS + $i, $colS).Text
[DateTime]$G = $sheet.Cells.Item($rowG + $i, $colG).Text
[DateTime]$A = $sheet.Cells.Item($rowA + $i, $colSWScan).Text
The data should exist as MM/DD/YYYY HH:MM, but is read by PowerShell/PSExcel as "#######", which is also what the Excel GUI displays when opening the file.
This is only a portion of what the entire script does. Any suggestions on how to resolve the error while still using the PSExcel module would be most helpful.
Stack Overflow seems to have an issue with me posting the verbose error message, and this is my first post. Let me know if that would be helpful for troubleshooting.
Edit for comment #1:
# Create an instance of Excel.Application
$objExcel = New-Object -ComObject Excel.Application
# Open the file
$workbook = $objExcel.Workbooks.Open($file)
# Get the worksheet by name
$sheet = $workbook.Worksheets.Item($sheetName)
$objExcel.Visible = $false
After getting my head out of 'Excelland', I realized it may be easier to rewrite the script to use the .csv structure directly (the file originally imported by the script was a .xlsx), but I am admittedly unfamiliar with .csv scripting. However, the original question still stands while I rewrite the code, as I may need to switch back to .xlsx input. Thank you for the suggestion, J E Carter II.
Answer:
$objExcel.Cells.EntireColumn.AutoFit()
Credit to J E Carter II
When you open an Excel file as a COM object under Windows, you're launching Excel.
You might be able to add the following commands to your Excel object handle to get the data to display correctly.
$objExcel.Cells.Select()
$objExcel.Cells.EntireColumn.AutoFit()
Do this before getting values from cells. If that doesn't work, let me know and I can find some CSV handling examples.
Those might work better on the $workbook object. I can't remember which is implicit when recording a macro (which is how I got those).
It's also possible you may need to precede those two lines with something like
$workbook.Sheets("sheetname").Select()
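If the rewrite to .csv input goes ahead, note that "#######" is purely an Excel display artifact; reading the file as text sidesteps it entirely. A minimal sketch, where the column name "ScanDate" and the path are placeholders:
# Sketch: read the dates straight from the .csv, no Excel COM involved
$rows = Import-Csv -Path 'C:\Temp\export.csv'
foreach ($row in $rows) {
    # The question says the data is formatted as MM/DD/YYYY HH:MM
    $stamp = [DateTime]::ParseExact($row.ScanDate, 'MM/dd/yyyy HH:mm', [cultureinfo]::InvariantCulture)
    $stamp
}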

Trying to export a CSV list of users using the Active Directory Module for Windows PowerShell

So the below is where I'm at so far:
Import-Module ActiveDirectory
$domain = "ourdomain"
Get-ADUser -Filter { enabled -eq $true } -Properties whenCreated, EmailAddress, CanonicalName |
    Select-Object Name, EmailAddress, CanonicalName, whenCreated |
    Export-Csv C:\Data\test.csv
Unfortunately, when I run the above I get dates in two different formats in the CSV, e.g.:
01/01/2017
1/01/2017 8:35:56 PM
The issue this poses is that there isn't really a clean way to sort them. Excel's formatting doesn't change either of these formats to be more like the other, both because one includes the time and the other doesn't, and because the time-inclusive format doesn't use leading zeroes on single-digit numbers while the time-exclusive format does.
We have an existing script that captures users using the LastLogonTimestamp attribute that does this correctly by changing the bottom line to the following:
Select-Object Name,EmailAddress,CanonicalName,@{Name="Timestamp"; Expression={[DateTime]::FromFileTime($_.whenCreated).ToString('yyyy-MM-dd_hh:mm:ss')}}
For some reason this expression runs properly when we query the LastLogonTimestamp attribute, but when we run this version querying the whenCreated attribute, we get an entirely blank column underneath the Timestamp header.
I'm not particularly knowledgeable about PowerShell itself, and my colleague who found the original script for LastLogonTimestamp just found it online and adapted it as minimally as possible to make it work for us, so I don't know if something in this line works with one of these attributes but not the other. It seems strange to me, though, that two date attributes in the same program would store their values in different formats, so I'm not convinced that's it.
In any case, any help anyone can offer to get a uniform date format in the output of this script would be greatly appreciated - it needn't include the time if that's easier, though if they're equally easy we may as well keep it.
whenCreated is already a [DateTime]. Notice the difference between the properties when you run something like this:
Get-ADUser TestUser -Properties lastlogon, whenCreated | Select-Object lastlogon, whenCreated | Format-List
(Get-ADUser TestUser -Properties lastlogon).lastlogon | Get-Member
(Get-ADUser TestUser -Properties whenCreated).whenCreated | Get-Member
This means that you don't have to convert it to a DateTime before calling the ToString() method:
Select-Object @{Name="Timestamp"; Expression={$_.whenCreated.ToString('yyyy-MM-dd_hh:mm:ss')}}
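Applied to the script from the question, the whole export becomes something like this sketch ('HH' is swapped in for 24-hour time; keep 'hh' if 12-hour time is intended):
Import-Module ActiveDirectory
Get-ADUser -Filter { enabled -eq $true } -Properties whenCreated, EmailAddress, CanonicalName |
    Select-Object Name, EmailAddress, CanonicalName,
        @{Name = "Timestamp"; Expression = { $_.whenCreated.ToString('yyyy-MM-dd_HH:mm:ss') }} |
    Export-Csv C:\Data\test.csv -NoTypeInformation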

Adding a header to a '|'-delimited CSV file in PowerShell?

I was wondering if anybody knows a way to achieve this without breaking/messing with the data itself.
I have a CSV file delimited by '|', which was created by retrieving data from SharePoint using an SPQuery and exporting it with Out-File (Export-Csv is not an option, since I would have to store the data in a variable, which would eat at the server's RAM; querying remotely unfortunately will not work either, so I have to do this on the server itself). Nevertheless, I have the data I need, but I want to perform some manipulations, move and auto-calculate certain data within an Excel file, and export that Excel file.
The problem I have right now is that I sort of need a header for the file. I have tried using the following code:
$header ="Name|GF_GZ|GF_Title|GF_UniqueId|GF_OldId|GFURL|GF_ITEMREDIRECTID"
$file = Import-Csv inputfilename.csv -Header $header | Export-Csv D:\outputfilename.csv
The issue here is that when I perform the second Export-Csv, it delimits at anything containing a comma and thus mangles the data; I need the data to remain intact.
I have tried playing with the -Delimiter '|' setting on both the import and the export side, but no matter what I do it seems to cut off the data. Is there a better way to simply add a row at the top (a header) without messing with the already existing file structure?
I have found that using a delimiter such as -Delimiter '°' or some other special character removes my problem entirely, but I can never be sure such a character won't show up in the dataset, and thus (as stated already) I am looking for a more "elegant" solution.
Thanks
One option you have is to create the original CSV with the headers first. Then, when you export the SharePoint data, use the -Append switch on the Out-File command to append the SP data to the CSV.
I wouldn't even bother messing with it in CSV format.
$header ="Name|GF_GZ|GF_Title|GF_UniqueId|GF_OldId|GFURL|GF_ITEMREDIRECTID"
$in_file = '.\inputfilename.csv'
$out_file = '.\outputfilename.csv'
$x = Get-Content $in_file
Set-Content $out_file -Value $header,$x
There's probably a more elegant/refined two-liner for some of this, but this should get you what you need.
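If you do want the Import-Csv route after all, the likely culprit in the original attempt is that -Header expects an array of column names (one string per column, not one '|'-joined string), and the pipe delimiter has to be given on both sides. A sketch:
# Sketch: round-trip with an explicit pipe delimiter and a proper header array
$header = 'Name','GF_GZ','GF_Title','GF_UniqueId','GF_OldId','GFURL','GF_ITEMREDIRECTID'
Import-Csv .\inputfilename.csv -Delimiter '|' -Header $header |
    Export-Csv .\outputfilename.csv -Delimiter '|' -NoTypeInformation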
