I'm stuck coding the requirement below.
I have two Excel (.xls) files: an old and a new users list. Each file has four fields: "Userid", "UserName", "Costcenter", and "Approving Manager". I need to check whether each Userid from the new user list exists in the old user list. If it does, I have to copy the "Costcenter" and "Approving Manager" values from the matching row of the old list into the new list. If there is no matching record in the old list, I have to highlight the entire row for that Userid in the new list. And finally, last but not least, I have to save the new user list. There are 2000+ Userids.
Below is the start of my code, which reads the Userid list from the new user list into an array; I will do the same for the old user list. From there, how do I go about modifying the new user list as explained above?
$objExcel = New-Object -ComObject Excel.Application
$UserWorkBook = $objExcel.Workbooks.Open("O:\UserCert\New_Users.xls")
$UserWorksheet = $UserWorkBook.Worksheets.Item(1)
$NewUsers = @()  # @() creates an empty array; #() is a syntax error. Renamed from $OldUsers since this reads the new list.
$intRow = 2      # starting from 2 since I have to exclude the header
do {
    $NewUsers += $UserWorksheet.Cells.Item($intRow, 1).Value2  # += appends; plain = would overwrite on each pass
    $intRow++
} while ($UserWorksheet.Cells.Item($intRow, 1).Value2 -ne $null)
Any help Greatly appreciated...
If the Userids in each list are in some kind of regular order, then it should be easy to open both workbooks at the same time, maintain two row pointers (Row_for_old_userlist and Row_for_new_userlist), and compare the new Userid with the old one as you step down both sheets.
If they aren't in some semblance of order, then for each item in the new user list you'll have to scan the entire old user list to find it and then take your action.
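For the unordered case, a common way to avoid rescanning the old list for every new Userid is to index the old list in a hashtable first. Here is a minimal sketch of that matching logic with made-up in-memory sample data (in the real script the records would come from the two worksheets, and the actual cell writes and row highlighting are separate steps):

```powershell
# Index the old list by Userid for constant-time lookups (sample data, not real worksheet reads)
$oldUsers = @(
    @{ UserId = 'U001'; CostCenter = 'CC-10'; Manager = 'Alice' }
    @{ UserId = 'U002'; CostCenter = 'CC-20'; Manager = 'Bob' }
)
$oldIndex = @{}
foreach ($u in $oldUsers) { $oldIndex[$u.UserId] = $u }

$newUsers = @(
    @{ UserId = 'U001'; CostCenter = ''; Manager = '' }
    @{ UserId = 'U999'; CostCenter = ''; Manager = '' }
)

$unmatched = @()
foreach ($u in $newUsers) {
    if ($oldIndex.ContainsKey($u.UserId)) {
        # Match found: copy the old values into the new record
        $u.CostCenter = $oldIndex[$u.UserId].CostCenter
        $u.Manager    = $oldIndex[$u.UserId].Manager
    } else {
        # No match: remember this Userid so its row can be highlighted later
        $unmatched += $u.UserId
    }
}
```

With 2000+ rows this turns roughly 2000 x 2000 comparisons into a single pass over each list. The `$unmatched` Userids identify the rows to highlight (via `Interior.ColorIndex`, as shown further down) before saving the workbook.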
I'm not sure that saving as CSV is a valid approach for your requirements: CSV can't store formatting, so the highlighted rows would be lost.
However, I think you're really asking how to set the value of an Excel spreadsheet cell.
If so, here's some code I use to insert a list of defects into a spreadsheet and then set columns A and B of the last row to a different color. Strangely enough, that "Value2" part is non-negotiable: it's the property that reads and writes a cell's underlying value without Currency/Date formatting applied.
for ($x = 0; $x -lt $defects.Count; $x++)
{
    $row++
    [string]$ENGIssue = $defects[$x]
    $sheet.Cells.Item($row, 1).Value2 = $ENGIssue
}
# Highlight cells A..B of the last row written (ColorIndex 6 = yellow)
$range = $sheet.Range(("A{0}" -f $row), ("B{0}" -f $row))
$range.Select() | Out-Null
$range.Interior.ColorIndex = 6
I am new to Google Apps Script and have just begun to understand how it works. A team member wrote a simple script for some work I was doing. The script triggers when any of a permitted set of users (which can vary) submits input to a 'Form Responses 1' spreadsheet via a Google Form.
Basically, I have a form that users complete and then submit. Upon submission, the script finds the active row, adds 1 to the number in the counter cell (a 'do not edit' cell; X2 in the script below), writes the new number back, then checks whether the Unique ID field on the active row is empty, and if so fills it with a concatenated alphanumeric ID: a fixed alphabetical prefix plus the numeric counter value.
The script was working perfectly until the team member left and I removed her access from the Google Sheet, with no change to the script at all. I've been scrambling to figure out what happened since then, because I haven't made any changes to my code. I have searched many different places and cannot find what is wrong.
If I put it on a new Google Sheet, it works fine, but not on this sheet, which already has around 900 critical entries.
Any guidance is welcome. The script is below. HELP!
//Logger.log(SpreadsheetApp.getActiveSpreadsheet().getUrl());
//Logger.log(SpreadsheetApp.getActive().getUrl());
function addUniqueId() { // enclosing function line was missing from the post; name here is a guess
  // Get the active sheet
  var sheet = SpreadsheetApp.getActiveSheet();
  // Get the active row
  var row = sheet.getActiveCell().getRow(); // getRowIndex() is deprecated in favor of getRow()
  // Get the next ID value. NOTE: cell X2 holds the last record counter value
  var id = sheet.getRange("X2").getValue() + 1;
  Logger.log("HKG0" + id);
  // Check if the ID column is empty
  if (sheet.getRange(row, 1).getValue() == "") {
    // Save the new counter value (column 24 = X, so this writes X2)
    sheet.getRange(2, 24).setValue(id);
    // Set the new ID value
    sheet.getRange(row, 1).setValue("HKG0" + id);
  }
}
If your code runs off of a form-submit trigger, then this should work for you:
function formSubmit(e) { // install as an "On form submit" trigger
  Logger.log(JSON.stringify(e)); // inspect the event object
  var sheet = e.range.getSheet();
  var id = sheet.getRange("X2").getValue() + 1;
  if (sheet.getRange(e.range.rowStart, 1).getValue() == "") {
    sheet.getRange(2, 24).setValue(id); // column 24 = X, i.e. X2
    sheet.getRange(e.range.rowStart, 1).setValue("HKG0" + id);
  }
}
The Logger.log call will help you learn what the event object contains; the Apps Script documentation on event objects covers the rest.
If you're looking for a unique ID for each submission, try: const id = new Date(e.values[0]).valueOf(); that's the number of milliseconds since Jan 1, 1970.
I have to loop through Excel using the COM object (no additional modules allowed in this environment aside from what comes installed with PowerShell 5).
In each iteration I have to look through a worksheet (from a list of variables) for a particular set of values and pull and append data accordingly.
My problem isn't so much accomplishing it as the performance hit I take every time I do a Find on Value2 in each worksheet.
With an expected massive increase in the number of worksheets, and existing ones gaining more and more columns to parse through, how can I make this smoother and faster?
What I currently do is the following:
$Exl = New-Object -ComObject "Excel.Application"
$Exl.Visible = $false
$Exl.DisplayAlerts = $false
$WB = $Exl.Workbooks.Open($excel)
foreach ($name in $names) {
    # Locate the worksheet whose name contains $name
    $ws = $WB.Worksheets | Where-Object { $_.Name -like "*$name*" }
    $range = $ws.Range("C:C")
    # Find the first cell in column C matching $item
    $findstuff = $range.Find($item)
    $stuffrow = $findstuff.Row
    $stuffcolumn = $findstuff.Column
}
This last part is what takes a LOT of time, and with each additional sheet and column I only see it growing; it can take 10-20 minutes.
What can be done to optimize this?
On a side note: while I only need the first row and column result right now, Find only returns the first match. If in the future I need all the rows and columns where Value2 equals my search value, what should I do? (That's less important; I ask in case it's related.)
Any time the pipeline is used, there's a performance hit. Instead of using Where-Object, try something like this (using an if statement):
foreach ($name in $names) {
    # Pick the matching worksheet without the pipeline
    $ws = $null
    foreach ($candidate in $WB.Worksheets) {
        if ($candidate.Name -like "*$name*") { $ws = $candidate; break }
    }
    $range = $ws.Range("C:C")
    $findstuff = $range.Find($item)
    $stuffrow = $findstuff.Row
    $stuffcolumn = $findstuff.Column
}
Also note that your line may have a typo: *where {$_.name -like "*$names*"}* should probably read *where {$_.name -like "*$name*"}*.
I found my basis from the following bookmark I had: http://community.idera.com/powershell/powershell_com_featured_blogs/b/tobias/posts/speeding-up-your-scripts
So I found a very simple answer, which is somehow simultaneously EXTREMELY obvious and EXTREMELY unintuitive.
When defining the $range variable, add a pipe to select ONLY the properties you need.
Instead of:
$range = $ws.Range("C:C")
do:
$range = $ws.Range("C:C") | Select-Object Row, Text, Value2, Column
Why is this unintuitive?
1) Normally piping makes things slower, especially if you're pushing many objects through to filter down to a few.
2) One would expect the COM object to actually fetch the data at the moment the variable is set. But that is not what happens here: the data is gathered the moment the variable is first used (I tested this and saw resource usage spike only at that point), and it is cached after that first access. (Which is very weird.)
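Another angle on the same problem: each `Range.Find` is a separate COM round-trip, and those round-trips dominate the run time. Reading the whole column into PowerShell once via `Value2` (which returns a 1-based 2-D array for a multi-cell range) and scanning it in memory replaces thousands of COM calls with one, and also yields every matching row rather than just the first. A sketch; the sample 1-based array below stands in for what `$ws.Range(...).Value2` would return:

```powershell
# In the real script the array would come from Excel in a single COM call:
#   $lastRow = $ws.Cells($ws.Rows.Count, 3).End(-4162).Row   # -4162 = xlUp constant
#   $data = $ws.Range("C1:C$lastRow").Value2
# Stand-in: build a 1-based 2-D array shaped like Value2's result (rows 1..3, column 1)
$data = [System.Array]::CreateInstance([object], [int[]](3, 1), [int[]](1, 1))
$data[1, 1] = 'alpha'
$data[2, 1] = 'target'
$data[3, 1] = 'target'

# Scan the column in memory, collecting EVERY matching row number
$item = 'target'
$matchRows = @()
for ($r = $data.GetLowerBound(0); $r -le $data.GetUpperBound(0); $r++) {
    if ($data[$r, 1] -eq $item) { $matchRows += $r }
}
```

This also answers the side question about multiple matches: instead of stopping at the first hit (as `Find` does), `$matchRows` accumulates all of them.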
I have a list in SharePoint (List B) which has a column whose value is a lookup (to information already on this site) from List A.
In an attempt to connect List A to a database, the data was deleted from List A and then restored.
That being said, List A is populated and correct, but List B is now blank in that particular lookup column.
Does anyone have any suggestions for restoring that column's values?
The issue was that connecting List A to the database actually deleted all items and then reinserted them. The lookup column in List B was therefore pointing at the old items, whose IDs no longer existed.
I ended up deleting the data that had been reinserted into List A, then restored the previous List A items from the recycle bin.
Probably not what you're looking for, but it might be better than nothing.
Do you have any columns you can use to match the items? Something along the lines of "if a List B item has x in column xc, then the corresponding looked-up item has y in column yc". If so, you could repopulate the list with a PowerShell script like:
asnp *share*  # shorthand for Add-PSSnapin Microsoft.SharePoint.PowerShell
[System.Reflection.Assembly]::Load("Microsoft.SharePoint, Version=15.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null
$ErrorActionPreference = "Stop"
$webUrl = "<your Url>"
$listBTitle = "<Title of list B>"
$listATitle = "<Title of list A>"
$listBLookUpColumnName = "<lookup Column name in list b>"
$listAValueToShow = "<Column name of the value that should be displayed in the lookup field>"
$matchColumnA = "<Title of match column list A>"
$matchColumnB = "<Title of match column list B>"
$web = Get-SPWeb $webUrl
$listB = $web.Lists[$listBTitle]
$listA = $web.Lists[$listATitle]
$listB.Items | % {
    $lookingItem = $_
    # Find the List A item whose match column equals this item's match column
    $itemToLookIn = $listA.Items | ? { $_[$matchColumnA] -eq $lookingItem[$matchColumnB] } | Select-Object -First 1
    if ($itemToLookIn -ne $null) {
        $lookUpValue = New-Object Microsoft.SharePoint.SPFieldLookupValue
        $lookUpValue.LookupId = $itemToLookIn.ID
        $lookUpValue.LookupValue = $itemToLookIn[$listAValueToShow]
        $lookingItem[$listBLookUpColumnName] = $lookUpValue
        $lookingItem.Update()
    }
}
If not, is version control activated on List B? If so, maybe write a script to revert the items one version back, or to read the lookup value from a previous version if other work has been done on the items since.
The workflow status data that I want is in the column with the same name as the document library. However, I cannot access the data in that column using the display name shown in SharePoint; I need the internal column name if I am going to access it with the code below.
$list = $SPWeb.Lists["document library name here"]
$items = $list.Items
$count = 0
foreach ($item in $items)
{
    # Put the internal column name you want where SPWorkflowStatusColumnName is
    if ($item["SPWorkflowStatusColumnName"] -eq "Completed")
    {
        $count = $count + 1
    }
}
In my case the internal column name turned out to be eight characters long: the display name with spaces and non-alphanumeric characters removed, truncated to eight characters.
Alternatively, to find the internal name of a column, click the column header in the list view and check the URL: the value after SortField= is the internal name of that column.
For example, in my case the URL looks like this:
http://[server]/[sitecollection]/[site]/listname/Forms/AllItems.aspx?View={view GUID}&SortField=TradingS&SortDir=Asc
and my field's internal name is "TradingS".
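The truncation rule described above (strip spaces and non-alphanumerics, keep the first eight characters) can be expressed as a small helper. The display name "Trading Status" used here is a guess, since the post gives only the internal name; and note that internal names are not always formed this way (SharePoint often encodes spaces as `_x0020_` instead), so when in doubt enumerate `$list.Fields` and read each field's `InternalName` property directly:

```powershell
function Get-ShortInternalName {
    param([string]$DisplayName)
    # Strip spaces and non-alphanumeric characters, keep the first eight characters
    $clean = $DisplayName -replace '[^A-Za-z0-9]', ''
    if ($clean.Length -gt 8) { $clean = $clean.Substring(0, 8) }
    return $clean
}

# Hypothetical display name "Trading Status" -> "TradingS",
# matching the SortField value seen in the URL above
$internal = Get-ShortInternalName 'Trading Status'
```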
I have a custom SharePoint list with a Title column that contains server names, and I have a lot of .csv files with the same names as the titles.
Is it somehow possible to upload them to the list items as attachments with a script? Maybe PowerShell?
I have over 100 files, so it would be a pain to do it all manually.
The Title column and the file names look like this:
Title: Server1.domain.com
File:  Server1.domain.com.csv
I have been looking around for quite a while but haven't been able to find anything I could use.
Approach:
Iterate over the SharePoint list items.
Load the corresponding CSV file by the SPListItem.Title
Add the file as attachment to the SPListItem.
Script:
$w = Get-SPWeb http://List/Location/Url
$l = $w.Lists["TheList"]
foreach ($item in $l.Items)
{
    # Assumes the CSV files are in the current directory
    $file = Get-Item "$($item.Title).csv"
    $stream = $file.OpenRead()
    $buffer = New-Object byte[] $stream.Length
    [void]$stream.Read($buffer, 0, $stream.Length)
    # AddNow attaches and commits without a separate $item.Update()
    $item.Attachments.AddNow($file.Name, $buffer)
}
Considerations:
Disposing of the file stream
Disposing of the SPWeb object
Handle missing files
Edit
Two mistakes in the first attempt:
SPAttachmentCollection.Add expects a byte array.
SPAttachmentCollection.AddNow should be used to add the attachment directly (without an update of the SPListItem).
Updated the code above accordingly.
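For the "missing files" and "stream disposal" considerations, the file-reading half of the loop can be hardened like this. The demo is self-contained (it creates its own sample CSV in the temp folder); in the real script the final byte array would go to `$item.Attachments.AddNow(...)` as in the answer above:

```powershell
# Self-contained demo: create a sample CSV to read back
$path = Join-Path ([System.IO.Path]::GetTempPath()) 'Server1.domain.com.csv'
Set-Content -Path $path -Value 'hostname,status'

if (-not (Test-Path $path)) {
    # Handle missing files instead of letting Get-Item throw
    Write-Warning "No CSV found for this item, skipping: $path"
} else {
    $stream = [System.IO.File]::OpenRead($path)
    try {
        $buffer = New-Object byte[] $stream.Length
        [void]$stream.Read($buffer, 0, $stream.Length)
        # Real script: $item.Attachments.AddNow((Split-Path $path -Leaf), $buffer)
    } finally {
        $stream.Dispose()   # always release the file handle, even on error
    }
}
```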