Editing a number of lists of a SharePoint site together - sharepoint

I don't know if this is the right place to ask, but I am creating a Power Apps canvas app. My data sources are saved as SharePoint lists. I have to change the column type from Single line of text to Choice for all those lists. As I have a very large number of lists (around 100) and each list contains 30-40 columns, changing them one by one seems impossible. Is there a way I can change the column type of all the lists at once? Even a way to change the column type of all columns of a single list at once would ease the work a bit! Please help. If it's not possible directly, is it possible in JavaScript? Can you nudge me in the right direction? Thanks.
list

The easiest way is to run a PnP PowerShell script to update the column type in every list.
# Set variables
$SiteURL = "{siteUrl}"          # your site URL
$ListNames = @("List1","List2") # the lists to update
$ColumnName = "Single"          # internal name of the column to convert

# Connect to PnP Online
Connect-PnPOnline -Url $SiteURL -UseWebLogin

# Switch the column to Choice in each list, then execute the queued updates
ForEach ($listName in $ListNames)
{
    $field = Get-PnPField -List $listName -Identity $ColumnName -Includes FieldTypeKind
    $field.FieldTypeKind = [Microsoft.SharePoint.Client.FieldType]::Choice
    $field.Update()
}
Invoke-PnPQuery
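The snippet above converts one named column per list. If the goal is to convert every single-line-of-text column in every list, a rough sketch along the same lines could look like the following (the list names are placeholders, and whether SharePoint accepts a direct Text-to-Choice conversion can vary per column, so test against a copy of one list first):

# Sketch: convert all single-line-of-text columns in each list to Choice
Connect-PnPOnline -Url $SiteURL -UseWebLogin
ForEach ($listName in $ListNames)
{
    # Get all user-editable text columns in this list
    $textFields = Get-PnPField -List $listName | Where-Object {
        $_.FieldTypeKind -eq [Microsoft.SharePoint.Client.FieldType]::Text -and
        -not $_.Hidden -and -not $_.ReadOnlyField
    }
    ForEach ($field in $textFields)
    {
        $field.FieldTypeKind = [Microsoft.SharePoint.Client.FieldType]::Choice
        $field.Update()
    }
    Invoke-PnPQuery
}

You could also enumerate the lists themselves with Get-PnPList instead of hard-coding $ListNames.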

Related

Filter on folder and item type in ArcGIS Online

I want to change the extent of all webmaps in a given folder in ArcGIS Online. However, the standard query command (gis.content.search) does not have a property that allows searching only in a folder.
On the other hand, once I have a list of all items in a given folder, I am unable to only filter all webmaps out. This is the script I used (see also the image):
folderowner = 'my_username'
updatefolder = 'testfolder'
user = gis.user.get(username=my_username)
user
rightFolder = user.items(folder = updatefolder)
rightFolder
for item in rightFolder:
    print(item.type)
for item in rightFolder:
    if (item_type = "Web Map"):
        print(item.title)
I can print the item types. But once I try to list only Web Maps, I get a syntax error. I have tried many different notations.
I can pass the types to a print command, but I can't use the types as a way to filter. What I am basically looking for is a way to get only web maps in a given folder.

PowerShell Dynamic Creation of ADUsers from imported Excel sheet

First of all, I want to say that I'm very, very new to PowerShell. These are the first PS scripts I've written.
I'm currently working on a PS script for AD administration. The scripts for adding/deleting SMB shares, adding or removing users from groups, and so on, are already done.
I already had a working script for creating the users in AD, but it wasn't dynamic: the variables were hard coded and all had to be entered into a New-ADUser command. As the code will be used for more than one specific set of parameters, it has to be dynamic.
I'm working with Import-Excel and found a great function here, but I'm having two problems with this function.
$sb = {
    param($propertyNames, $record)
    $propertyNames | ForEach-Object -Begin { $h = @{} } -Process {
        if ($null -ne $record.$_) { $h[$_] = $record.$_ }
    } -End { New-ADUser @h -Verbose }
}
Use-ExcelData -Path $Path -HeaderRow 1 -ScriptBlock $sb
The dynamic part of this is that the table headers are used as the parameter names for New-ADUser. The only thing one needs to change if the number of parameters changes is to add or delete a column in the Excel sheet. The column header always needs the same name as the corresponding New-ADUser parameter.
Screenshot of excel table
My problem now is the "Type" header I've got in column A. It is needed to specify the type of user so the user can be added to specific AD groups. But because the function above uses all headers as parameters, this doesn't work.
Does anyone have an idea how to change the function $sb so that it starts with the second column? I've tried around with skip 1 and a lot of other workarounds, but with my lack of experience nothing seemed to come close to what I need.
SOLVED PROBLEM BELOW: added -DataOnly to Use-ExcelData and now it works.
The second problem is that the function does not stop trying to create users once there are no more values for the parameters. For testing, I deleted the column "Type". In the example of trying to create the two users testuser and testuser2, PowerShell creates the users without problems but then asks for a name for another New-ADUser.
VERBOSE: Performing the operation "New" on target "CN=Test User,CN=Users,DC=****,DC=**".
VERBOSE: Performing the operation "New" on target "CN=Test2 User2,CN=Users,DC=****,DC=**".
Cmdlet New-ADUser at command pipeline position 1
Supply values for the following parameters:
Name:
Thank you in advance, sorry for my English, and please tell me if I did something wrong forum-wise.
I would save the Excel sheet as a CSV file and then import it. It's faster and easier to consume. The headers become your parameter names and the import behaves like any other object.
$csvData = Import-Csv -Path <path to csv file>
From here, iterate the rows and access the values as properties of the row. No need to import the data into a hashtable, it's already accessible with property names defined by the header row.
foreach ($row in $csvData) {
    Write-Host $row.Name
    Write-Host $row.Path
}
Once the loop reaches the end of the file, it stops trying to create users.
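If you go the CSV route but still need the extra Type column for the group logic, one possible sketch is to copy every column except Type into a hashtable and splat that into New-ADUser. The SamAccountName column referenced in the comment is only an assumption about your sheet:

$csvData = Import-Csv -Path $Path
foreach ($row in $csvData) {
    # Copy every CSV column except Type into a hashtable for splatting
    $params = @{}
    foreach ($prop in $row.PSObject.Properties) {
        if ($prop.Name -ne 'Type' -and -not [string]::IsNullOrEmpty($prop.Value)) {
            $params[$prop.Name] = $prop.Value
        }
    }
    New-ADUser @params -Verbose
    # $row.Type is still available here for adding the user to the right groups, e.g.
    # Add-ADGroupMember -Identity $row.Type -Members $params['SamAccountName']
}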
FYI, the use of single-letter variables is going to make your code very difficult to maintain. My eyes hurt just looking at it.

PowerShell: update O365 AD bulk attributes through csv file

We are trying to bulk update our Azure Active Directory. We have an Excel CSV list of UserPrincipalNames for which we will update the Title, Department, and Office attributes.
# Get list of Clinical CMs
$PATH = "C:\Users\cs\Documents\IT Stuff\Project\Azure AD Update\AD-Update-ClinicalCMs-Test.csv"
$CMs = Import-Csv $PATH
# Pass CMs into the loop
ForEach ($UPN in $CMs) {
    # Do AD update task here
    Set-MsolUser -UserPrincipalName $UPN -Title "Case Manager" -Department "Clinical" -Office "Virtual"
}
The CSV:
User.1@domain.com
User.2@domain.com
User.3@domain.com
The Set-MsolUser command will work on its own, but it is not working as intended in this loop. Any help or insight is greatly appreciated.
As Jim Xu commented, here is my comment as an answer.
The input file you show us is not a CSV file; instead, it is a list of UPN values, each on a separate line.
To read these values as a string array, the easiest thing to do is to use Get-Content:
$PATH = "C:\Users\cs\Documents\IT Stuff\Project\Azure AD Update\AD-Update-ClinicalCMs-Test.csv"
$CMs = Get-Content -Path $PATH
Of course, although massive overkill, it can be done using the Import-Csv cmdlet:
$CMs = (Import-Csv -Path $PATH -Header upn).upn
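Putting that together with the loop from the question, a minimal sketch (assuming you have already connected with Connect-MsolService) would be:

$PATH = "C:\Users\cs\Documents\IT Stuff\Project\Azure AD Update\AD-Update-ClinicalCMs-Test.csv"
# Each line of the file is one UserPrincipalName
$CMs = Get-Content -Path $PATH
ForEach ($UPN in $CMs) {
    Set-MsolUser -UserPrincipalName $UPN -Title "Case Manager" -Department "Clinical" -Office "Virtual"
}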

export-csv powershell with custom column type

I'm exporting an AD report via PowerShell using the code below.
Get-ADUser -Filter 'enabled -eq $true' -SearchBase "OU=Staff,OU=Users,OU=OSA,DC=domian,DC=org" -Properties mail, employeeID |
    Select-Object employeeID, mail, ObjectGUID |
    Export-Csv "C:\Reports\ADExports\Students.csv" -NoTypeInformation
It outputs the CSV file and everything looks fine, except the data type of all columns is set to 'Short Text'.
I require the Employee ID column type to be 'Number'. Is it possible to export a CSV with a custom field type?
I hope this makes sense.
Thanks in advance.
CSV files are plain text and do not contain type information. However, you can use the ImportExcel module, which provides an Export-Excel cmdlet. This cmdlet takes various Excel parameters, including -NumberFormat.
$x | Export-Excel -NumberFormat 'Number' -Path 'test.xlsx' # This worked for me.
You will probably have to play around with it a little depending on your exact use case. Good luck!
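As a rough sketch of how that could look with the query from the question (assuming the ImportExcel module is installed; the .xlsx output path is just an example):

Get-ADUser -Filter 'enabled -eq $true' -SearchBase "OU=Staff,OU=Users,OU=OSA,DC=domian,DC=org" -Properties mail, employeeID |
    Select-Object employeeID, mail, ObjectGUID |
    Export-Excel -Path "C:\Reports\ADExports\Students.xlsx" -NumberFormat 'Number'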
OK guys, I'm going to assume that you cannot export a CSV via PowerShell with custom data types.
However, I found a way around my issue: I converted the data type when importing the CSV into Access, which solved my problem.
If anyone's interested, you can find the exact issue and solution here - Joining Two Tables with Different Data types MS ACCESS - 'type mismatch in expression' error
Thank you for bringing the 'ImportExcel' module to my attention. Now I know there's a module available for this and that you can do quite a bit of Excel manipulation with it.
Thank you all for your comments/answers.
Thanks.

Trying to Export a CSV list of users using Active Directory Module for Windows Powershell

So the below is where I'm at so far:
import-module activedirectory
$domain = "ourdomain"
Get-ADUser -Filter {enabled -eq $true} -Properties whenCreated,EmailAddress,CanonicalName |
select-object Name,EmailAddress,CanonicalName,whenCreated | export-csv C:\Data\test.csv
Unfortunately, when I run the above I get dates in two different formats in the CSV, e.g.:
01/01/2017
1/01/2017 8:35:56 PM
The issue this poses is that there isn't really a clean way to sort them. Excel's formatting doesn't convert either format to match the other, both because one includes the time and the other doesn't, and because the time-inclusive format doesn't zero-pad single-digit numbers while the time-exclusive format does.
We have an existing script that captures users using the LastLogonTimestamp attribute that does this correctly by changing the bottom line to the following:
select-object Name,EmailAddress,CanonicalName,@{Name="Timestamp"; Expression={[DateTime]::FromFileTime($_.whenCreated).ToString('yyyy-MM-dd_hh:mm:ss')}}
For some reason this expression runs properly when we query the LastLogonTimestamp attribute, but when we run this version querying the whenCreated attribute, we get an entirely blank column underneath the Timestamp header.
I'm not particularly knowledgeable about PowerShell itself, and the colleague who found the original script for LastLogonTimestamp just found it online and adapted it as minimally as possible to make it work for us, so I don't know whether something in this line works with one of these attributes and not the other. It seems strange to me, though, that two attributes holding dates in the same system would be stored in different formats, so I'm not convinced that's it.
In any case, any help anyone can offer to help us get a uniform date format in the output of this script would be greatly appreciated - it needn't have the time included if it's easier to do away with it, though if they're equally easy we may as well keep it.
whenCreated is already a [DateTime]. Notice the difference between the properties when you run something like this:
Get-ADUser TestUser -Properties lastlogon,whenCreated | select lastlogon,whenCreated | fl
(Get-ADUser TestUser -Properties lastlogon).lastlogon | gm
(Get-ADUser TestUser -Properties whenCreated).whenCreated | gm
This means that you don't have to convert to a DateTime before running the toString() method.
select-object @{Name="Timestamp"; Expression={$_.whenCreated.ToString('yyyy-MM-dd_hh:mm:ss')}}
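So a corrected version of the original export, as a sketch, would be (keeping the question's format string; use HH instead of hh if you want 24-hour times):

Get-ADUser -Filter {enabled -eq $true} -Properties whenCreated,EmailAddress,CanonicalName |
    Select-Object Name,EmailAddress,CanonicalName,@{Name="Timestamp"; Expression={$_.whenCreated.ToString('yyyy-MM-dd_hh:mm:ss')}} |
    Export-Csv C:\Data\test.csv -NoTypeInformation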
