I have set up a SharePoint Online site and created a provider-hosted app. One of the features of the app is to create subsites, and there are times when a subsite needs to be renamed, including renaming the subsite URL. I can use CSOM to create the subsite without any problems, but when I try to rename the URL I get an access denied error. If I only change the title and description of the subsite there is no problem. If I log into SharePoint Online via the browser (using the same user account!) and use the UI to rename the URL, it works without any problem. The page in SharePoint I use to rename the URL is https://tenant.sharepoint.com/testproject/_layouts/15/prjsetng.aspx
I have tried this on both a Microsoft 365 Developer subscription (where I am doing most of my development and testing) and the main SharePoint Online site where the solution will eventually be deployed. I don't know many of the details for the main SPO site; other people set it up and I was given an account to test renaming the subsite. To be clear, I am able to rename the subsite URL via the UI in both the developer and main SharePoint Online sites.
Is there something I'm doing wrong? Is there a limitation to renaming a subsite URL via code in SharePoint Online? Is there a bug in SharePoint Online that prevents renaming a subsite URL using code?
The exception thrown includes ServerErrorTypeName = "Microsoft.SharePoint.SPException". I can get the correlation id but from what I understand that's of no use in SharePoint Online. The exception Message is literally "Access denied." There is no inner exception.
Here is the code I'm using to rename the subsite:
SharePointContext spContext = SharePointContextProvider.Current.GetSharePointContext(HttpContext);
ClientContext clientContext = new ClientContext(spContext.SPHostUrl)
{
Credentials = new SharePointOnlineCredentials("SPUserName", "SPPassword".ToSecureString())
};
var webUrl = request.OldProjectUrl;
var subweb = clientContext.Site.OpenWeb(webUrl);
clientContext.Load(subweb);
clientContext.ExecuteQuery();
subweb.Title = request.ProjectName;
subweb.Description = request.ProjectName;
subweb.ServerRelativeUrl = "/HardcodedForTesting"; // <-- if I skip this line there is no error
subweb.Update();
clientContext.ExecuteQuery();
I was trying to achieve the same result but encountered the same error.
I was able to solve this by disabling the NoScriptSite setting of the site collection.
Using the PnP.PowerShell module:
Set-PnPSite -NoScriptSite:$false
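If you would rather flip that setting from CSOM (since the rest of your solution already uses it), something along these lines should also work. This is only a sketch: it assumes the Microsoft.Online.SharePoint.Client.Tenant package is referenced, that the account is a SharePoint administrator, that the URLs are placeholders for your tenant, and it reuses the ToSecureString() helper from your own code. NoScriptSite corresponds to the DenyAddAndCustomizePages property of the site collection.
using Microsoft.SharePoint.Client;
using Microsoft.Online.SharePoint.TenantAdministration;
using Microsoft.Online.SharePoint.TenantManagement;

// Connect to the tenant admin site, not the host web.
using (var adminContext = new ClientContext("https://tenant-admin.sharepoint.com"))
{
    adminContext.Credentials = new SharePointOnlineCredentials("SPUserName", "SPPassword".ToSecureString());

    var tenant = new Tenant(adminContext);
    var siteProps = tenant.GetSitePropertiesByUrl("https://tenant.sharepoint.com/sites/yoursite", true);
    adminContext.Load(siteProps);
    adminContext.ExecuteQuery();

    // Equivalent of Set-PnPSite -NoScriptSite:$false
    siteProps.DenyAddAndCustomizePages = DenyAddAndCustomizePagesStatus.Disabled;
    siteProps.Update();
    adminContext.ExecuteQuery();
}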
Also, the value you give to the ServerRelativeUrl property must be correctly constructed. I found two allowed formats:
/sites/site-collection-path/new-subsite-path
new-subsite-path
Just did a test on my environment; I could rename the subsite URL via CSOM code normally, using the same code as yours.
For your issue, you'd better create a service request with Microsoft.
I'm trying to get the OneNote notebook information that is linked to my organization's CRM accounts. Each account has a OneNote book created for it that can be accessed inside of CRM.
From what I understand, I can use the SharePointDocumentLocation endpoint (found here: https://learn.microsoft.com/en-us/dynamics365/customer-engagement/web-api/sharepointdocumentlocation?view=dynamics-ce-odata-9) to get the location of the specific file if I ask for location type to be 1.
However, SharePointDocumentLocationId and SiteCollectionId don't seem to be pointing to anything on my company's sites. Should I be getting my data somewhere else?
I started searching through my company's SharePoint structure to see if I can get any hints as to where these documents may be located. My initial Postman request (getting the sites off of the root site) doesn't show the site that hosts our CRM documents (sites/crmdocs). I was able to find where this was stored eventually, but trying to get the OneNote notebooks stored there returns an error since we have more than 20,000 notebooks there, so I can't fetch them all. As far as I know, I'm able to get notebooks if I have the specific ID I want.
Once I fetch the CRM information, I try to send a request like this:
https://graph.microsoft.com/v1.0/sites/{myCompanyUrl},{siteCollectionId},{sharepointDocumentLocationId}/onenote/notebooks/
SiteCollectionId and SharePointDocumentLocationId are from my CRM SharePointDocumentLocation request
The error I receive is:
The requested site was not found. Please check that the site is still accessible.
Assuming your environment is using the out-of-the-box SharePoint site and SharePoint document location hierarchy, you can access OneNote files using the following link structure:
[SharePointAbsoluteUrl]/[EntityLogicalName]/[RelativeUrl]_[RegardingObjectId]/[RelativeUrl]
How to get [SharePointAbsoluteUrl]:
Querying for sharepointdocumentlocations is actually not enough because Dynamics 365 stores this information in another entity called sharepointsite. This is how you can obtain it:
var query = new QueryExpression("sharepointsite")
{
ColumnSet = new ColumnSet("absoluteurl")
};
query.Criteria.AddCondition("IsDefault", ConditionOperator.Equal, true);
var entityCollection = _service.RetrieveMultiple(query);
var absoluteUrl = entityCollection[0].Attributes["absoluteurl"];
In Web API it is equivalent to:
GET https://[Your Org]/api/data/v9.0/sharepointsites?$select=absoluteurl&$filter=isdefault%20eq%20true
There can only be one default SharePoint site, so this query will return a single record.
How to get the remaining parts:
Fetch the sharepointdocumentlocation records that have the Location Type dedicated to OneNote integration:
var query = new QueryExpression("sharepointdocumentlocation")
{
ColumnSet = new ColumnSet("regardingobjectid", "relativeurl")
};
query.Criteria.AddCondition("locationtype", ConditionOperator.Equal, 1);
var entityCollection = _service.RetrieveMultiple(query);
In the Web API it is equivalent to the following GET request; don't forget to add Prefer: odata.include-annotations="*" to your HTTP request headers so that the response includes the lookuplogicalname annotation for the lookup field:
GET https://[Your Org]/api/data/v9.0/sharepointdocumentlocations?$select=relativeurl,_regardingobjectid_value&$filter=locationtype%20eq%201
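For example, a minimal HttpClient sketch of that request (the org URL and token acquisition are placeholders you would replace with your own):
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

// Hypothetical helper: runs the sharepointdocumentlocations query with the Prefer header set.
static async Task<string> GetOneNoteDocumentLocationsAsync(string orgUrl, string accessToken)
{
    using (var client = new HttpClient { BaseAddress = new Uri(orgUrl) })
    {
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);
        // Ask the service to include annotations such as lookuplogicalname in the response.
        client.DefaultRequestHeaders.Add("Prefer", "odata.include-annotations=\"*\"");

        var response = await client.GetAsync(
            "/api/data/v9.0/sharepointdocumentlocations?$select=relativeurl,_regardingobjectid_value&$filter=locationtype%20eq%201");
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync();
    }
}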
This query can return many records; I've only used the first one in the examples below for explanation purposes.
[EntityLogicalName] will be your ((EntityReference)entityCollection.Entities[0].Attributes["regardingobjectid"]).LogicalName;
In the Web API it will be your value._regardingobjectid_value@Microsoft.Dynamics.CRM.lookuplogicalname value.
[RelativeUrl] will be your entityCollection.Entities[0].Attributes["relativeurl"];
In the Web API it will be your value.relativeurl value.
[RegardingObjectId] can be obtained with this expression: ((EntityReference)entityCollection.Entities[0].Attributes["regardingobjectid"]).Id.ToString().Replace("-", "").ToUpper();
In the Web API the id will be your _regardingobjectid_value value, and you have to remove the dashes and convert it to upper case in whatever language you are making the request in.
You should end up with a URL like this: https://mycompany.sharepoint.com/account/A Datum Fabrication_A56B3F4B1BE7E6118101E0071B6AF231/A Datum Fabrication
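To make the assembly step concrete, here is a small C# sketch that puts the parts together (variable names are illustrative; entityCollection is the result of the sharepointdocumentlocation query above, and absoluteUrl comes from the sharepointsite query):
// Take the first OneNote document location as an example.
var location = entityCollection.Entities[0];
var regarding = (EntityReference)location.Attributes["regardingobjectid"];

string entityLogicalName = regarding.LogicalName;                  // e.g. "account"
string relativeUrl = (string)location.Attributes["relativeurl"];   // e.g. "A Datum Fabrication"
string regardingId = regarding.Id.ToString("N").ToUpper();         // GUID without dashes, upper case

// [SharePointAbsoluteUrl]/[EntityLogicalName]/[RelativeUrl]_[RegardingObjectId]/[RelativeUrl]
string oneNoteUrl = $"{absoluteUrl}/{entityLogicalName}/{relativeUrl}_{regardingId}/{relativeUrl}";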
We have developed an internal portal with multiple subsites using SharePoint Online (Office 365). We created our own page layouts/master page for the sites, and most things like the menus, page body, and page logos are customized (unique for all sites/subsites).
Each page has a header logo and a URL assigned to that logo (the logo describes the site or subsite name). We have written a JavaScript file to load the logo and URL, and we call it from the master page. Now the problem is that the logo and link should load based on the Global Navigation setting.
Example:
Is the site using the same navigation items as the parent site?
Yes - pull the logo and link from the site above
No - pull the logo and link according to the site name
If I get the GlobalNavigation setting value, then I can do this in the JavaScript file. Is there a way to get this GlobalNavigation setting value in a JavaScript file? I googled this but didn't find enough information.
Thanks in advance,
Amarnath
--------UPDATED-------
I am using the below code but getting the error "sp.runtime.js:2 Uncaught Error: The property or field 'Source' has not been initialized. It has not been requested or the request has not been executed. It may need to be explicitly requested."
Used code:
var ctx = SP.ClientContext.get_current();
var web = ctx.get_web();
//Navigation Provider Settings for Current Web.
var webNavSettings = new SP.Publishing.Navigation.WebNavigationSettings(ctx, web);
//Global Navigation (Top Navigation Bar)
var navigation = webNavSettings.get_globalNavigation();
navigation.set_source(1);
webNavSettings.update();
ctx.executeQueryAsync(onSuccess, onFailure);
I haven't found much information online about how to utilize the social part of SharePoint 2013 through the Server Object Model. To understand and follow my question better, I recommend going to this site.
Let's say I have a feature receiver that fires when a certain site is created, and I would like to take advantage of the Follow Content feature so that, every time a site is created, the person who created the site automatically follows it.
Does anyone have experience working with the social functionality in SharePoint 2013? If so, a summary of how to use the different social methods would be awesome.
Social Actor
From what I've understood from reading about this, you need to create an "Actor" to represent the item, or in my case the site: a SocialActorInfo, which takes two properties, ContentUri and ActorType.
SocialActorInfo actorInfo = new SocialActorInfo();
actorInfo.ContentUri = contentUrl;
actorInfo.ActorType = contentType;
Find and Check if that Actor is followed by the current user
Then you have to check if that SocialActor is Followed by the current user.
ClientResult<bool> isFollowed = followingManager.IsFollowed(actorInfo);
Follow/Unfollow the Site/Item
ClientResult<SocialFollowResult> result = followingManager.Follow(actorInfo);
clientContext.ExecuteQuery();
"followingManager.UnFollow(actorInfo);"
Questions:
If I want to follow a site, what kind of ActorTypes are there?
How do I do this with server-side code?
Additional Information
Microsoft says: When users follow documents, sites, or tags, status updates from documents, conversations on sites, and notifications of tag use show up in their newsfeed. The features related to following content can be seen on the Newsfeed and the Following content pages.
SharePoint Server 2013 provides the following APIs that you can use to programmatically follow content:
Client object models for managed code
.NET client object model
Silverlight client object model
Mobile client object model
JavaScript object model
Representational State Transfer (REST) service
Server object model
Link to Follow content in SharePoint 2013; I can only find how to do it with REST or CSOM.
Just wanted to share: this solved the task. It's just a Follow method that takes an SPWeb object and an SPUser object.
// Get the service context for the site's User Profile service
SPServiceContext serverContext = SPServiceContext.GetContext(web.Site);
UserProfileManager profileManager = new UserProfileManager(serverContext);

string userString = user.LoginName.ToString();
UserProfile userProfile = profileManager.GetUserProfile(userString);

if (userProfile != null)
{
    SPSocialFollowingManager manager = new SPSocialFollowingManager(userProfile);

    // Describe the site as a social actor and follow it on behalf of the user
    SPSocialActorInfo actorInfo = new SPSocialActorInfo();
    actorInfo.ContentUri = new Uri(web.Url);
    actorInfo.AccountName = user.LoginName;
    actorInfo.ActorType = SPSocialActorType.Site;

    manager.Follow(actorInfo);
}
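For completeness, a rough sketch of how this could be wired into a web-scoped feature receiver (illustrative only; it assumes the user activating the feature is the person who created the site):
using Microsoft.SharePoint;

public class FollowSiteFeatureReceiver : SPFeatureReceiver
{
    public override void FeatureActivated(SPFeatureReceiverProperties properties)
    {
        // For a web-scoped feature, Parent is the SPWeb the feature was activated on.
        SPWeb web = properties.Feature.Parent as SPWeb;
        if (web == null)
            return;

        // Assumption: the activating user is the site creator.
        Follow(web, web.CurrentUser);
    }

    private static void Follow(SPWeb web, SPUser user)
    {
        // ... the Follow method shown above ...
    }
}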
So let's say I have a full URL into a SharePoint website.
In the past when I wanted to get the weburl and doc url, I used the "url to web url" method of the Front Page Server Extensions. (http://msdn.microsoft.com/en-us/library/ms460544.aspx).
So for example if you had a site at
http://webapp/site1/childsite/a.doc
I want a method in CSOM that will return /site1/childsite as the weburl.
I see the Web.WebUrlFromPageUrlDirect() method in CSOM but I'm not sure I'm getting what I need back from it. In the Uri class I get back, would I use the "AbsolutePath" property for the weburl?
What is the correct way to do this?
I would also like to get the doc URL that is usually retrieved by a call to "url to web url" via the FrontPage extensions.
The fileurl will be the file location relative to the web site. So if the document is stored in the document library called "Documents", you will get a value of /Documents/file.ext.
Well, I've reverted to using the FrontPage extensions for URLToWebURL since I just wasn't sure what I was getting back from the CSOM methods. Though check the link below for answers I received elsewhere.
Information Pertaining to this issue
We create all our site collections programmatically with a custom site definition/template. Everything works as expected, except for the crawler, which is apparently denied access to the sites. The crawl log says:
http://server.localnetwork.lan/somesites/siteName
The object was not found. (The item was deleted because it was either not found or the crawler was denied access to it.)
And in the log files I'm getting this:
08/11/2009 14:20:34.01  OWSTIMER.EXE (0x0674)  0x1560  Search Server Common  MS Search Administration  7hmh  High  exception in SearchUpgradeProvisioner Keyword Config
System.InvalidOperationException: jobServerSearchServiceInstance is null
   at Microsoft.Office.Server.Search.Administration.SearchUpgradeProvisioner..ctor(SearchServiceInstance searchServiceInstance)
   at Microsoft.Office.Server.Search.Administration.OSSPrimaryGathererProject.ProvisionContentSources()
If I create a site collection manually the crawler is able to access it. The same users/accounts have the same access on both sites, so that shouldn't be the issue.
The code we use to actually create the site collection looks a little like this:
SPWebApplication app = SPWebApplication.Lookup(new Uri("WebApplicationUrl"));
app.FormDigestSettings.Enabled = false;
// placeholder values; the overload used here expects the LCID as a uint, e.g. 1033
app.Sites.Add("url", "title", "description", 1033, "SiteTemplateName", "Owner.Username", "Owner.Fullname", "Owner.Email");
app.FormDigestSettings.Enabled = true;
The code has been slightly altered to protect the innocent... ;)
Any idea what we're doing wrong?
(Please note, I'm not sure if this is a programming error or a config/setup error, so I'm cross-posting with Serverfault)
If you receive this error whilst the crawler account (the default content access account) has read permission to all your sites then you most likely need to disable the loopback check.
http://support.microsoft.com/kb/896861
http://koenvosters.wordpress.com/2009/06/15/access-denied-when-using-hostname-search-and-site-on-moss-2007/
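For reference, the loopback check those articles describe is controlled by the DisableLoopbackCheck registry value. A quick C# sketch of the change (a sketch only: run it elevated on the server hosting the crawled web application, and a reboot is typically needed afterwards):
using Microsoft.Win32;

// Equivalent of the manual registry edit described in KB896861.
using (RegistryKey lsa = Registry.LocalMachine.OpenSubKey(@"SYSTEM\CurrentControlSet\Control\Lsa", true))
{
    lsa.SetValue("DisableLoopbackCheck", 1, RegistryValueKind.DWord);
}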