Remove all site collections under one specific path - SharePoint

I have a web application with 100 site collections under the /sites/ path,
plus a root site collection at the root /.
I need a PowerShell script to delete all of them except the root site collection.
I know the PowerShell command is Remove-SPSite, but I don't want to type 100 URLs by hand.

Get-SPSite: Returns all site collections that match the given criteria.
Get-SPWeb: Returns all subsites that match the given criteria.
Remove-SPWeb: Completely deletes the specified Web.
By piping one command into the next, you should be able to delete all of the webs. The trailing -WhatIf parameter shows what would happen without actually deleting anything.
Get-SPSite | Get-SPWeb | Remove-SPWeb -whatif
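Since the question is about removing the site collections themselves with Remove-SPSite (leaving the root site collection alone), a minimal sketch along those lines, assuming http://webapp stands in for your web application URL, would be:

Get-SPSite -WebApplication http://webapp -Limit All |
    Where-Object { $_.Url -like "*/sites/*" } |
    Remove-SPSite -WhatIf

Keep -WhatIf until the list of sites looks right, then drop it (and add -Confirm:$false) to actually delete them; the filter on */sites/* means the root site collection is never piped to Remove-SPSite.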


The attempted operation is prohibited because it exceeds the list view threshold enforced by the administrator while renaming site column

I am getting
"The attempted operation is prohibited because it exceeds the list view threshold enforced by the administrator"
while renaming a site column in SharePoint Online using CSOM. I have faced this issue in the past while fetching items from a large list, but this is a different scenario: here I am just trying to rename the site column.
This issue is caused by the item count exceeding the list view threshold. Whether you retrieve items or rename a site column with CSOM, it will throw this exception once the list is over the limit.
For SharePoint Online, here are some ways to work around this limitation:
Use an indexed column (see the sketch after this answer).
Reduce the number of list items and create multiple views, making sure each view returns fewer items than the list view threshold.
For more information, please refer to:
Office 365: How SharePoint Online handles List View Threshold
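If you go the indexed-column route, a minimal PnP PowerShell sketch (assuming the PnP.PowerShell module is installed and the site, list, and column names are placeholders) could look like this:

# Connect to the site (authentication options vary by PnP.PowerShell version)
Connect-PnPOnline -Url "https://tenant.sharepoint.com/sites/yoursite" -Interactive
# Mark the column as indexed on the large list so queries against it stay under the threshold
# ("LargeList" and "MyColumn" are placeholder names)
Set-PnPField -List "LargeList" -Identity "MyColumn" -Values @{ Indexed = $true }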
I faced the same issue accessing folders from SharePoint Online. One of my subfolders under the site's root folder had 6000+ subfolders, resulting in the threshold limit error. So instead, I used an alternative way to access only the specific folder I need, using the GetFolderByServerRelativeUrl function. The steps were:
Get the online context.
Get the list of items. It will also provide the root folder.
Using GetFolderByServerRelativeUrl, get only the specific folder within this root folder. The code below may help further.
private Folder GetSubFolder(Web web, Folder rootFolder, string subFolderName)
{
    Folder subFolder = null;
    try
    {
        // If the folder exists, get it from SharePoint Online
        subFolder = web.GetFolderByServerRelativeUrl(rootFolder.ServerRelativeUrl + "/" + subFolderName);
        web.Context.Load(subFolder);
        web.Context.ExecuteQuery();
    }
    catch (ServerException)
    {
        // Folder does not exist (or is not accessible); return null
        subFolder = null;
    }
    return subFolder;
}

Microsoft Graph API SharePoint search

I have been trying to search through my SharePoint site. I am able to get results for a single drive:
xxx.sharepoint.com,xxxxxx-xxxx-xxxx-xxxx-xxxxxx,xxxxx-xxxx-xxxx-xxxx-xxxxxx/drives/xxxxxxxxx/search(q='{content}')
But if I do the same search at drive/root, I don't get any result:
xxx.sharepoint.com,xxxxxx-xxxx-xxxx-xxxx-xxxxxx,xxxxx-xxxx-xxxx-xxxx-xxxxxx/drive/root/search(q='{content}')
We basically want to perform a search across the entire subsite.
A bit late, but I've just discovered that you can use the /sites endpoint to retrieve all items in a site by expanding relationships of Graph objects. Could you try:
https://graph.microsoft.com/v1.0/sites/root/sites?$expand=lists($expand=items)
This seems to return all list items in all subsites under the root site. You should then be able to filter further by subsite, list, field values, etc.
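For reference, a minimal sketch of calling that endpoint from PowerShell, assuming you already have an OAuth access token in $token (for example from an Azure AD app registration with Sites.Read.All):

$headers = @{ Authorization = "Bearer $token" }
$uri = "https://graph.microsoft.com/v1.0/sites/root/sites?`$expand=lists(`$expand=items)"
$response = Invoke-RestMethod -Uri $uri -Headers $headers -Method Get
# Each entry in $response.value is a subsite with its lists and list items expanded inline
$response.value | ForEach-Object { $_.lists.items }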

SharePoint 2013 Retrieving documents without DirName and ListID

I would like to check how to retrieve documents that do not belong to any list or document library.
Currently, the document URL in the search results is something like this: "http://example.com/file.doc"
I suspect the documents ended up at this location due to a data import using a PowerShell script, where the script was unable to resolve the path it was uploading to.
I would like to get the details of documents like this and delete them.
Thank you.
Use PowerShell:
$web = Get-SPWeb http://example.com
$web.Files | select Url
If you would like to retrieve the files from all webs in the site collection, the code would be:
foreach ($inWeb in $web.Site.AllWebs)
{
    $inWeb.Files | select Url
}
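To then delete one of those files (the question also asks to delete them), a minimal sketch, assuming the file sits directly in the root folder of the web as the http://example.com/file.doc URL suggests:

$web = Get-SPWeb http://example.com
# "file.doc" is the file name taken from the search result URL
$file = $web.GetFile("file.doc")
if ($file.Exists) {
    $file.Delete()
}
$web.Dispose()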

SharePoint list content from parent site to subsite

I have a list with multiple records in it. I want to filter some records from the list and display them on a subsite web page. I googled it and found out that there is no direct way to use a list from the parent site in a subsite. I am using SharePoint 2013 and I have full control access to the site.
As far as I'm aware, you're correct--there doesn't seem to be a way to pull list items from a parent site to a subsite, at least not in the sense that you most likely want. This is a shot in the dark, but if you're looking to pull in static list items, such as street suffixes or product names, you can create a content type in your parent site that will pass down values you define in a choice field. If you really need dynamic linkage, give this a look: http://www.boostsolutions.com/cascaded-lookup.html
You have two options to display a SharePoint list from the parent site in a subsite:
Using Data View Web Part.
Using Content Query Web Part.
Check the detailed steps at:
How to display a sharepoint list from parent site in subsite?
SP 2016 - Display list from Parent site in sub-site

How can I search through a single SharePoint list in a site collection using a search scope?

I have added a Web Address rule to a search scope and set the folder URL to the following, for searching through a single list in the site collection:
http://svrmosstest3/sites/asmtportal/Lists/SearchList
I added this scope to the search dropdown. The search is working fine and returns results from that list only, but it returns one extra item, which is an entry for the list view page itself:
http://svrmosstest3/sites/asmtportal/Lists/SearchList/AllItems.aspx
because this page will always fall under the rule URL.
Is there any other method to create a search scope that will search only through the items of a single SharePoint list in a site collection?
Any advice from SharePoint experts would be appreciated.
You will most likely need to add a crawl rule to exclude that one page from being added to the index (which will then prevent it from being included in your search scope).
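If you are on SharePoint 2010 or later, that exclusion crawl rule can also be created with the Search PowerShell cmdlets (on MOSS 2007 you would add it through the Search Administration UI instead); a minimal sketch, assuming a single Search service application:

$ssa = Get-SPEnterpriseSearchServiceApplication
New-SPEnterpriseSearchCrawlRule -SearchApplication $ssa `
    -Path "http://svrmosstest3/sites/asmtportal/Lists/SearchList/AllItems.aspx" `
    -Type ExclusionRule
# Run a full crawl afterwards so the AllItems.aspx entry drops out of the index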
