Download and display images from SharePoint (Xamarin.Forms PCL)

I'm currently developing an app based on the SharePoint REST API and Microsoft Graph. My app is mainly composed of a TabbedPage, and on the first tab I want to display news from SharePoint. To do so I want to get the images from the news items, but using the URL of an image as an Image Source returns a "forbidden" error. So I decided to find a way to download those images. I've been looking for a way for a few days now and found nothing...
Does anyone know a way to do that, or why I'm getting a forbidden access error?
Some more detail: I'm using an HttpClient to authenticate to the tenant with the REST API:
// Authenticate against the tenant and attach the bearer token to the shared HttpClient
client = new HttpClient();
Uri mUri = new Uri(App.ReturnUri);
client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));

// Platform-specific sign-in via DependencyService; the result carries the access token
var data = await DependencyService.Get<IAuthenticator>()
    .Authenticate(App.LoginAuthority, App.RestResourceUri, App.ClientId, mUri);
App.AuthenticationResult = data;
client.DefaultRequestHeaders.Authorization =
    new AuthenticationHeaderValue("Bearer", App.AuthenticationResult.AccessToken);
From there, I hope someone knows whether there is a way to download the images, or to access the URLs so I can use them as a Source. I managed to retrieve the URLs but not to display the images, and I can retrieve any other data from SharePoint without issues.
Thanks for any help.

The Image component can use a URL as its source, but that only works for images that aren't behind authentication; your SharePoint images probably are, which is also why you're getting a Forbidden error.
var webImage = new Image { Aspect = Aspect.AspectFit };
webImage.Source = ImageSource.FromUri(new Uri("https://xamarin.com/content/images/pages/forms/example-app.png"));
To add the authentication layer for SharePoint you could implement your own UriImageSource-style implementation. In the end an ImageSource only needs a Stream, so getting that from either the internet or local storage will work, for example as in the sketch below.
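A minimal sketch, assuming you reuse the authenticated HttpClient from the question (the method name and the imageUrl parameter are made up for illustration):
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;
using Xamarin.Forms;

// Downloads the image bytes with the bearer token that is already attached to
// the HttpClient, then wraps them in an ImageSource Xamarin.Forms can display.
static async Task<ImageSource> GetProtectedImageAsync(HttpClient client, string imageUrl)
{
    byte[] bytes = await client.GetByteArrayAsync(imageUrl);
    return ImageSource.FromStream(() => new MemoryStream(bytes));
}

// Usage: webImage.Source = await GetProtectedImageAsync(client, newsImageUrl);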
To download and store images you will have to write platform-specific code, because local storage works differently on each platform. Check out this blog post as a starting point, combined with your own piece of code:
https://blog.falafel.com/xamarin-forms-storing-and-retrieving-photos/
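For completeness, a rough sketch of the cross-platform shape this usually takes (the IImageStore interface and its members are hypothetical; each platform project would provide its own implementation and register it with DependencyService):
using System.Threading.Tasks;

// Shared (PCL) contract; the iOS and Android projects implement it against
// their own local storage locations.
public interface IImageStore
{
    Task SaveAsync(string fileName, byte[] bytes);
    Task<byte[]> LoadAsync(string fileName);
}

// Usage from shared code:
// await DependencyService.Get<IImageStore>().SaveAsync("news1.png", bytes);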

Related

I'm looking for an example of writing to a file from a Chrome Extension [duplicate]

I'm currently creating an extension for Google Chrome which can save all images, or links to images, to the hard drive.
The problem is I don't know how to save a file to disk with JS or with the Google Chrome extension API.
Do you have any ideas?
You can use the HTML5 FileSystem features to write to disk using the Download API. That is the only way to download files to disk, and it is limited.
You could take a look at an NPAPI plugin. Another way to do what you need is to simply send a request to an external website via an XHR POST and then another GET request to retrieve the file back, which will appear as a save-file dialog.
For example, for my browser extension My Hangouts I created a utility to download a photo from an HTML5 canvas directly to disk. You can take a look at the code in capture_gallery_downloader.js; the part that does that is:
var url = window.webkitURL || window.URL || window.mozURL || window.msURL;
var a = document.createElement('a');
a.download = 'MyHangouts-MomentCapture.jpg';
a.href = url.createObjectURL(dataURIToBlob(data.active, 'jpg'));
a.textContent = 'Click here to download!';
a.dataset.downloadurl = ['jpg', a.download, a.href].join(':');
If you would like to see how to convert a data URI to a Blob in HTML5, here is how I did it:
/**
* Converts the Data Image URI to a Blob.
*
* @param {string} dataURI base64 data image URI.
* @param {string} mimetype the image mimetype.
*/
var dataURIToBlob = function(dataURI, mimetype) {
  var BASE64_MARKER = ';base64,';
  var base64Index = dataURI.indexOf(BASE64_MARKER) + BASE64_MARKER.length;
  var base64 = dataURI.substring(base64Index);
  var raw = window.atob(base64);
  var rawLength = raw.length;
  var uInt8Array = new Uint8Array(rawLength);
  for (var i = 0; i < rawLength; ++i) {
    uInt8Array[i] = raw.charCodeAt(i);
  }
  // BlobBuilder is deprecated; the Blob constructor is the modern replacement
  return new Blob([uInt8Array], { type: mimetype });
};
Then, after the user clicks on the download button, the HTML5 download attribute on the anchor saves the blob URL to a file.
I had long been wishing to make a Chrome extension for myself to batch-download images. Yet every time I got frustrated because the only seemingly applicable option was NPAPI, which both Chrome and Firefox seem to have no desire to support any longer.
I suggest that those who still want to implement 'save-file-on-disk' functionality have a look at this Stack Overflow post; the comments below it helped me a lot.
Since Chrome 31, the chrome.downloads API has been stable. We can use it to download files programmatically. If the user hasn't enabled the "Ask where to save each file before downloading" advanced option in Chrome's settings, we can save a file without prompting the user to confirm!
Here is what I use (in the extension's background page):
// remember to add "permissions": ["downloads"] to manifest.json
// this snippet is inside a onMessage() listener function
var imgurl = "https://www.google.com.hk/images/srpr/logo11w.png";
chrome.downloads.download({url:imgurl},function(downloadId){
console.log("download begin, the downId is:" + downloadId);
});
Though it's a pity that Chrome still doesn't provide an event for when the download completes; chrome.downloads.download's callback function is called when the download begins successfully (not when it completes).
The official documentation for chrome.downloads is here.
The solution isn't my original idea, but I'm posting it here hoping it may be of some use to someone.
There's no way that I know of to silently save files to the user's drive, which is what it seems like you're hoping to do. I think you can ASK for files to be saved one at a time (prompting the user each time) using something like:
function saveAsMe(filename) {
  document.execCommand('SaveAs', null, filename);
}
If you wanted to only prompt the user once, you could grab all the images silently, zip them up in a bundle, then have the user download that. This might mean doing XmlHttpRequest on all the files, zipping them in Javascript, UPLOADING them to a staging area, and then asking the user if they would like to download the zip file. Sounds absurd, I know.
There are local storage options in the browser, but they are only for the developer's use, within the sandbox, as far as I know. (e.g. Gmail offline caching.) See recent announcements from Google like this one.
Google Webstore
Github
I made an extension that does something like this, if anyone here is still interested.
It uses an XMLHttpRequest to grab the object, which in this case is presumed to be an image, then creates an object URL for it, creates a link to that object URL, and clicks the imaginary link.
Consider using the HTML5 FileSystem features that make writing to files possible using Javascript.
It looks like reading and writing files from browsers has become possible. Some newer Chromium-based browsers can use the "Native File System API". This 2020 blog post shows code examples of reading from and writing to the local file system with JavaScript.
https://blog.merzlabs.com/posts/native-file-system/
This link shows which browsers support the Native File System API.
https://caniuse.com/native-filesystem-api
Since JavaScript hitch-hikes to your computer with web pages from just about anywhere, it would be dangerous to give it the ability to write to your disk. It's not allowed.
Are you expecting the Chrome extension to require user interaction? Otherwise it might fall into the same category.

What is the Sharepoint Document Location endpoint really returning?

I'm trying to get the OneNote notebook information that is linked to my organization's CRM accounts. Each account has a OneNote book created for it that can be accessed inside of CRM.
From what I understand, I can use the SharePointDocumentLocation endpoint (found here: https://learn.microsoft.com/en-us/dynamics365/customer-engagement/web-api/sharepointdocumentlocation?view=dynamics-ce-odata-9) to get the location of the specific file if I ask for location type to be 1.
However, SharePointDocumentLocationId and SiteCollectionId don't seem to be pointing to anything on my company's sites. Should I be getting my data somewhere else?
I started searching through my company's SharePoint structure to see if I could get any hints as to where these documents may be located. My initial Postman request (getting the sites off of the root site) doesn't show the site that hosts our CRM documents (sites/crmdocs). I was eventually able to find where this was stored, but trying to get the OneNote notebooks stored there returns an error, since we have more than 20,000 notebooks and I can't fetch them all. As far as I know, I'm able to get a notebook if I have the specific ID I want.
Once I fetch the CRM information, I try to send a request like this:
https://graph.microsoft.com/v1.0/sites/{myCompanyUrl},{siteCollectionId},{sharepointDocumentLocationId}/onenote/notebooks/
SiteCollectionId and SharePointDocumentLocationId are from my CRM SharePointDocumentLocation request
The error I receive is:
The requested site was not found. Please check that the site is still accessible.
Assuming your environment is using the out-of-the-box SharePoint site and SharePoint document location hierarchy, you can access OneNote files using the following link structure:
[SharePointAbsoluteUrl]/[EntityLogicalName]/[RelativeUrl]_[RegardingObjectId]/[RelativeUrl]
How to get [SharePointAbsoluteUrl] :
Querying for sharepointdocumentlocations is actually not enough because Dynamics 365 stores this information in another entity called sharepointsite. This is how you can obtain it:
var query = new QueryExpression("sharepointsite")
{
ColumnSet = new ColumnSet("absoluteurl")
};
query.Criteria.AddCondition("IsDefault", ConditionOperator.Equal, true);
var entityCollection = _service.RetrieveMultiple(query);
var absoluteUrl = entityCollection[0].Attributes["absoluteurl"];
In Web API it is equivalent to:
GET https://[Your Org]/api/data/v9.0/sharepointsites?$select=absoluteurl&$filter=isdefault%20eq%20true
There can only be one default SharePoint site, so this query will return a single record.
How to get the remaining parts:
Fetch the sharepointdocumentlocations whose Location Type is dedicated to OneNote integration (locationtype = 1):
var query = new QueryExpression("sharepointdocumentlocation")
{
ColumnSet = new ColumnSet("regardingobjectid", "relativeurl")
};
query.Criteria.AddCondition("locationtype", ConditionOperator.Equal, 1);
var entityCollection = _service.RetrieveMultiple(query);
In the Web API it is equivalent to the following GET request; don't forget to add Prefer: odata.include-annotations="*" to your HTTP request headers so that it returns the lookuplogicalname annotation for the lookup field:
GET https://[Your Org]/api/data/v9.0/sharepointdocumentlocations?$select=relativeurl,_regardingobjectid_value&$filter=locationtype%20eq%201
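A rough sketch of issuing that request from C# (httpClient is assumed to already carry a valid bearer token for Dynamics 365, and orgUrl is a placeholder for your organization URL):
using System.Net.Http;
using System.Threading.Tasks;

// Queries OneNote document locations via the Web API, asking for annotations
// so that the lookuplogicalname of regardingobjectid is included in the JSON.
static async Task<string> GetOneNoteDocumentLocationsAsync(HttpClient httpClient, string orgUrl)
{
    var request = new HttpRequestMessage(HttpMethod.Get,
        orgUrl + "/api/data/v9.0/sharepointdocumentlocations" +
        "?$select=relativeurl,_regardingobjectid_value&$filter=locationtype%20eq%201");
    request.Headers.Add("Prefer", "odata.include-annotations=\"*\"");

    var response = await httpClient.SendAsync(request);
    response.EnsureSuccessStatusCode();
    return await response.Content.ReadAsStringAsync();   // JSON payload with a value[] array
}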
This query can return many records, I've only used the first one in the examples below for explanation purposes.
[EntityLogicalName] will be ((EntityReference)entityCollection[0].Attributes["regardingobjectid"]).LogicalName;
In the Web API it will be the value._regardingobjectid_value@Microsoft.Dynamics.CRM.lookuplogicalname value.
[RelativeUrl] will be entityCollection[0].Attributes["relativeurl"];
In the Web API it will be the value.relativeurl value.
[RegardingObjectId] can be obtained with the expression ((EntityReference)entityCollection[0].Attributes["regardingobjectid"]).Id.ToString().Replace("-", "").ToUpper();
In the Web API the id will be the _regardingobjectid_value value, and you have to remove the dashes and convert it to upper case in whatever language you are making the request from.
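Putting the pieces together, a minimal sketch of assembling the final link from the first returned record (this reuses absoluteUrl and entityCollection from the snippets above):
// Build [SharePointAbsoluteUrl]/[EntityLogicalName]/[RelativeUrl]_[RegardingObjectId]/[RelativeUrl]
var regarding = (EntityReference)entityCollection[0].Attributes["regardingobjectid"];
var relativeUrl = (string)entityCollection[0].Attributes["relativeurl"];
var regardingId = regarding.Id.ToString().Replace("-", "").ToUpper();

var oneNoteUrl = $"{absoluteUrl}/{regarding.LogicalName}/{relativeUrl}_{regardingId}/{relativeUrl}";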
You should end up with a URL like this: https://mycompany.sharepoint.com/account/A Datum Fabrication_A56B3F4B1BE7E6118101E0071B6AF231/A Datum Fabrication

How to access Rich Text Field Image outside SalesForce?

I'm using a Salesforce object to store user information, including a profile image. I'm able to insert images into a Rich Text Field by sending a Base64 string along with an <img> tag. The image gets stored and displayed on the Salesforce page. No problem with that.
I'm also able to get the image URL to the client side (a mobile app) with the help of the REST API (Node.js). The problem is that the image URL is only accessible in a browser if a Salesforce account is logged in; otherwise it redirects to the Salesforce login page (when I try with another browser).
What I want is to display the images in my mobile app, which doesn't have Salesforce login access. I can provide additional information to the client side if needed.
My image URL looks like this:
https://c.ap5.content.force.com/servlet/rtaImage?eid=a037F00000YSybT&feoid=00N7F00000Pupl6&refid=0EM7F000000lIN0
Also, I have some security information on my server side (Node) which I got when I successfully logged in to Salesforce. BTW, I'm using the node-salesforce package to establish the Salesforce connection:
1) organizationId
2) instanceUrl
3) userID
4) accessToken
I have been stuck on this problem for a week and I'm running up against a deadline. Please help!
Thanks!
I ended up rewriting the URLs in the rich text on the fly when returning them from my API, so that they refer to my API, which I use as an image proxy through this Salesforce REST endpoint with an integration user: https://developer.salesforce.com/docs/atlas.en-us.api_rest.meta/api_rest/resources_sobject_rich_text_image_retrieve.htm
private void rewriteImageUrls(RichTextRecord richTextRecord) {
    Document doc = Jsoup.parse(richTextRecord.getBody());
    doc.select("img").forEach(imgElement -> {
        try {
            URL originalUrl = new URL(imgElement.absUrl("src"));
            String newUrl = appEnv.getBackendUrl() + "/api/v1/rich_text_records/" + richTextRecord.getSfid()
                    + "/image?refid=" + UrlUtils.getQueryMap(originalUrl.getQuery()).get("refid");
            imgElement.attr("src", newUrl);
        } catch (MalformedURLException e) {
            e.printStackTrace();
        }
    });
    richTextRecord.setBody(doc.outerHtml());
}
Far from ideal but it works!

Get a weburl from Sharepoint Client Side Object Model

So let's say I have a full URL into a SharePoint website.
In the past when I wanted to get the web URL and doc URL, I used the "url to web url" method of the FrontPage Server Extensions (http://msdn.microsoft.com/en-us/library/ms460544.aspx).
So for example if you had a document at
http://webapp/site1/childsite/a.doc
I want a method in CSOM that will return /site1/childsite as the web URL.
I see the Web.WebUrlFromPageUrlDirect() method in CSOM, but I'm not sure I'm getting what I need back from it. In the Uri I get back, would I use the "AbsolutePath" property for the web URL?
What is the correct way to do this?
I also would like to get the doc URL that is usually retrieved by a call to "url to web url" via the FrontPage extensions.
The fileurl will be the file location relative to the web site. So if the document is stored in the document library called "Documents", you will get a value of /Documents/file.ext.
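For reference, a minimal sketch of calling Web.WebUrlFromPageUrlDirect through CSOM (assuming the Microsoft.SharePoint.Client SDK; the entry-point site URL is a placeholder and error handling is omitted):
using System;
using Microsoft.SharePoint.Client;

// Resolves the web URL that hosts a given page/document URL.
static Uri GetWebUrl(string fullDocumentUrl)
{
    using (var context = new ClientContext("http://webapp"))   // any reachable site in the web application
    {
        ClientResult<Uri> result = Web.WebUrlFromPageUrlDirect(context, new Uri(fullDocumentUrl));
        context.ExecuteQuery();
        return result.Value;   // e.g. http://webapp/site1/childsite
    }
}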
Well, I've reverted to using the FrontPage extensions when doing URLToWebURL since I just wasn't sure what I was getting using CSOM methods. Though check the link below for answers I received elsewhere.
Information Pertaining to this issue

How to get sharepoint site collection url using javascript?

I need to get the site collection URL using JavaScript.
I have written a simple function, shown below:
function getSiteCollectionUrl() {
  var pageUrl = window.location.href;
  var protocol = pageUrl.split(":")[0];
  var addr = pageUrl.split("//")[1];
  var webUrl = addr.split("/")[0];
  var siteColleUrl = protocol + "://" + webUrl;
  return siteColleUrl;
}
Let's say the site address is "http://mysite/trialsite/default.aspx";
then it will return "http://mysite".
But I don't think this is the proper way to get the site collection URL.
Please suggest if you have any other ideas.
Just by looking at the SharePoint tag, right below this question, there is exactly the same question: How to get site collection url using javascript?
The answers are there. Luckily, you can get the SharePoint site collection URL without invoking the SharePoint API, because there is an L_Menu_BaseUrl variable available that contains it.
At http://server/documents it returns:
console.log(L_Menu_BaseUrl)
/documents
You cannot obtain the site collection URL from pure JavaScript without invoking the SharePoint API.
If you need to do it in JavaScript, you can invoke the SharePoint API through its web services and get the site collection URL.
There is already a JavaScript library available called SPServices. You can use the following function:
SPGetCurrentSite
