SharePoint REST - Can we update file metadata while uploading the file itself?

Using the endpoint below we can upload a file to SharePoint:
https://domain.example.com/_api/web/GetFolderByServerRelativeUrl("FolderRelativeUrl")/Files/add(url="File",overwrite=true)
Using the endpoint below we can update the metadata of a specific file:
https://domain.example.com/_api/web/GetFileByServerRelativeUrl(URL)/ListItemAllFields
Is it possible to update the metadata while uploading the file itself?
The same applies to retrieval: we need to fetch the metadata along with the file.
Basically I am trying to avoid 2 separate calls. Does the SharePoint API support this?

SharePoint doesn't provide a REST API to achieve this in a single call.
As a workaround, we can use CSOM (C#) to achieve it.
public bool UploadDocument(string fileName, string filePath, List<string> metaDataList)
{
    using (ClientContext ctx = new ClientContext("http://yoursharepointURL"))
    {
        Web web = ctx.Web;

        // Upload the file.
        FileCreationInformation newFile = new FileCreationInformation();
        newFile.Content = System.IO.File.ReadAllBytes(filePath);
        newFile.Url = "/" + fileName;
        List docs = web.Lists.GetByTitle("Shared Documents");
        Microsoft.SharePoint.Client.File uploadFile = docs.RootFolder.Files.Add(newFile);
        ctx.Load(uploadFile);
        ctx.ExecuteQuery();

        // Set the metadata on the associated list item.
        ListItem item = uploadFile.ListItemAllFields;
        item["Title"] = fileName;
        item.Update();
        ctx.ExecuteQuery();
        return true;
    }
}
If you want to call a web service using Ajax from the UI, you can create a custom web service with CSOM (C#), then consume that web service using Ajax.

It's frustrating, but you can upload the initial version and set metadata in a single call; uploading a new version and setting metadata, however, only works as separate calls. I am transferring files from a DMS which can have multiple versions, and the version history in SharePoint would not match. To keep it consistent, I also transfer the initial version and its metadata as two calls. The customer is informed, and the version history is fine; the file import shows as an empty version.
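If you do stay with REST, the two calls from the question can at least be chained in one helper. Below is a minimal sketch using HttpClient; it assumes client is already authenticated against SharePoint (credentials and X-RequestDigest header handled elsewhere), and the entity type name SP.Data.Shared_x0020_DocumentsItem is a placeholder that depends on your library.
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

public static async Task UploadWithMetadataAsync(HttpClient client, string siteUrl, byte[] fileData)
{
    // Call 1: upload the file into the library.
    string uploadUrl = siteUrl +
        "/_api/web/GetFolderByServerRelativeUrl('Shared Documents')/Files/add(url='TestFile.doc',overwrite=true)";
    (await client.PostAsync(uploadUrl, new ByteArrayContent(fileData))).EnsureSuccessStatusCode();

    // Call 2: update metadata through ListItemAllFields (MERGE updates only the supplied fields).
    string itemUrl = siteUrl +
        "/_api/web/GetFileByServerRelativeUrl('/Shared Documents/TestFile.doc')/ListItemAllFields";
    var body = new StringContent(
        "{ '__metadata': { 'type': 'SP.Data.Shared_x0020_DocumentsItem' }, 'Title': 'New Title' }",
        Encoding.UTF8);
    body.Headers.ContentType = MediaTypeHeaderValue.Parse("application/json;odata=verbose");
    var update = new HttpRequestMessage(HttpMethod.Post, itemUrl) { Content = body };
    update.Headers.Add("X-HTTP-Method", "MERGE");
    update.Headers.Add("IF-MATCH", "*");
    (await client.SendAsync(update)).EnsureSuccessStatusCode();
}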

Related

Does Azure Blob Storage support partial content (206) by default?

I am using Azure Blob Storage to store all my images and videos. I have implemented the upload and fetch functionality and it's working quite well. I am facing one issue while loading videos: when I use the URL generated after uploading a video to Azure Blob Storage, it downloads all the content before rendering it to the user. So if the video size is 100 MB, it downloads the whole 100 MB, and until then the user can't see the video.
I have done a lot of R&D and learned that to render the video progressively, I need to fetch partial content (status 206) rather than the whole video at once. After adding the request header "Range:bytes-500", I tried to hit the blob URL, but it still downloaded the whole content. I then checked some open-source video URLs, hit them with the same "Range" request header, and they returned a 206 response status, meaning they correctly gave me partial content instead of the full video.
I read some forums saying that Azure Storage supports partial content but that it needs to be enabled from the properties. I have checked all the options under the Azure storage account but didn't find anything to enable this functionality.
Can anyone please help me resolve this, or point out anything on the Azure portal that I need to enable? I have been doing R&D on this for a week now. Any help would be really appreciated.
Thank you! Stay safe.
If Accept-Ranges is not enabled, then (as I learned from this blog) you need to set the default version of the blob service.
Below is sample code to implement it.
// Uses the legacy Microsoft.WindowsAzure.Storage SDK.
var credentials = new StorageCredentials("account name", "account key");
var account = new CloudStorageAccount(credentials, true);
var client = account.CreateCloudBlobClient();
// Raise the default service version so responses include Accept-Ranges.
var properties = client.GetServiceProperties();
properties.DefaultServiceVersion = "2019-07-07";
client.SetServiceProperties(properties);
Below is a response header comparison after setting the property: before the change the response had no Accept-Ranges header; after, it includes Accept-Ranges: bytes. (Before/after screenshots omitted.)
Assuming the video content is MPEG-4, the issue may be the media itself: the moov atom needs to be moved from the end of the file to the beginning. The browser won't render the video until it finds the moov atom, so you want to make sure the atom is at the start of the file, which can be accomplished using ffmpeg's "faststart" flag. Here's a good article with more detail: HERE
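For reference, a typical ffmpeg invocation for this (assuming an MP4 input; file names are placeholders) remuxes without re-encoding and moves the moov atom to the front:
ffmpeg -i input.mp4 -c copy -movflags +faststart output.mp4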
You just need to update your Azure Storage default service version. Range requests will work automatically after the update.
Using Azure CLI
Just run:
az storage account blob-service-properties update --default-service-version 2021-08-06 -n yourStorageAccountName -g yourStorageResourceGroupName
List of available versions:
https://learn.microsoft.com/en-us/rest/api/storageservices/previous-azure-storage-service-versions
To see your current version, fetch a file and inspect the x-ms-version response header.
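To verify the behavior end-to-end, you can also send a ranged request yourself and check for a 206 status; the account, container, and blob names below are placeholders:
curl -s -D - -o /dev/null -H "Range: bytes=0-1023" "https://youraccount.blob.core.windows.net/yourcontainer/yourvideo.mp4"
A correctly configured account returns HTTP/1.1 206 Partial Content with a Content-Range header.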
The following is the SDK (Azure.Storage.Blobs) I used to download the contents:
var container = new BlobContainerClient("UseDevelopmentStorage=true", "sample-container");
await container.CreateIfNotExistsAsync();
BlobClient blobClient = container.GetBlobClient(fileName);
Stream stream = new MemoryStream();
var result = await blobClient.DownloadToAsync(stream, cancellationToken: ct);
which DOES download the whole file right away! The solutions provided in the other answers seem to reference another SDK, so for the SDK that I use, the solution is the OpenReadAsync method:
long kBytesToReadAtOnce = 300;
long bytesToReadAtOnce = kBytesToReadAtOnce * 1024;
// OpenReadAsync returns a stream that downloads ranges lazily as it is read.
var result = await blobClient.OpenReadAsync(0, bufferSize: (int)bytesToReadAtOnce, cancellationToken: ct);
By default it fetches 4 MB at a time, so you have to override the value with a smaller amount if you want your app to have a smaller memory footprint.
I think that internally the SDK sends the requests with the byte range already set. So all you have to do is enable range processing in your Web API response, like this:
return new FileStreamResult(result, contentType)
{
EnableRangeProcessing = true,
};
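Putting the two pieces together, here is a rough sketch of a complete ASP.NET Core action; the route, content type, and DI wiring for the container client are assumptions for illustration:
using System.Threading;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Microsoft.AspNetCore.Mvc;

public class VideosController : Controller
{
    private readonly BlobContainerClient _container;

    public VideosController(BlobContainerClient container) => _container = container;

    [HttpGet("videos/{fileName}")]
    public async Task<IActionResult> Get(string fileName, CancellationToken ct)
    {
        BlobClient blobClient = _container.GetBlobClient(fileName);
        // OpenReadAsync fetches ranges lazily instead of downloading the whole blob.
        var stream = await blobClient.OpenReadAsync(0, bufferSize: 300 * 1024, cancellationToken: ct);
        return new FileStreamResult(stream, "video/mp4")
        {
            EnableRangeProcessing = true, // lets the browser issue Range requests against this action
        };
    }
}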

How to upload images from a local folder to Sitecore

webClient.UploadFile("http://www.myurl.com/~/media/DCF92BB74CDA4D558EEF2D3C30216E30.ashx", @"E:\filesImage\Item.png");
I'm trying to upload images to Sitecore using the WebClient.UploadFile() method by sending my Sitecore address and the path of my local images, but I'm not able to upload them. I have to do this without any APIs or Sitecore instances.
The upload process would be the same as with any ASP.NET application. However, once the file has been uploaded you need to create a media item programmatically. You can do this from an actual file in the file system, or from a memory stream.
The process involves using a MediaCreator object and its CreateFromFile method.
This blog post outlines the whole process:
Adding a file to the Sitecore Media Library programmatically
If you're thinking simply about optimizing your developer workflow, you could use the Sitecore PowerShell Extensions with the Remoting API, as described in this blog post.
If you want to use the web service approach, there are a number of options:
a) Sitecore Rocks WebService (if you are allowed to install it, or it is already available).
b) Sitecore Razl service (a third-party tool which needs a license).
c) Sitecore PowerShell Remoting (this needs the Sitecore PowerShell Extensions to be installed on the Sitecore server).
d) The Sitecore web service found under sitecore\shell\WebService\Service.asmx (this is the legacy predecessor of the newer SitecoreItemWebAPI).
e) My enhanced SitecoreItemWebAPI (this needs SitecoreItemWebApi 1.2 as a prerequisite).
In the end, except for option (d), you need to install something extra in order to upload the image over HTTP; you also need valid credentials to use any of the above methods.
If your customers upload the image on the website, you need to create the item in your master database (this needs access and write rights on the master database). Depending on your security requirements, you might consider not building this with custom code, but instead using the Sitecore Web Forms for Marketers module with its out-of-the-box file upload: create a form with an upload field and use the WFFM web services.
If you don't want to use the Sitecore API, then you can do the following:
Write code that uploads images into this folder: [root]/upload/
You might need to create a folder structure that represents how the images are stored in Sitecore; e.g. images uploaded into [root]/upload/Import/ will be stored in /sitecore/media library/Import (see the sketch below).
Sitecore will automatically upload these images into the Media Library.
Hope this helps
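As a hedged sketch of that approach (all paths are assumptions for illustration), copying files into the upload folder is plain file I/O:
using System.IO;

string sourceDir = @"E:\filesImage";
// The folder structure under /upload mirrors the media library structure.
string uploadDir = @"C:\inetpub\wwwroot\YourSite\upload\Import";

Directory.CreateDirectory(uploadDir);
foreach (string file in Directory.GetFiles(sourceDir, "*.png"))
{
    File.Copy(file, Path.Combine(uploadDir, Path.GetFileName(file)), overwrite: true);
}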
Option: you can use the Item Web API for this. No reference to any Sitecore DLL is needed; you only need access to the host and the ability to enable the Item Web API.
References:
Upload the files using it: http://www.sitecoreinsight.com/how-create-media-items-using-sitecore-item-web-api/
Enable the Item Web API: http://sdn.sitecore.net/upload/sdn5/modules/sitecore%20item%20web%20api/sitecore_item_web_api_developer_guide_sc66-71-a4.pdf#search=%22item%22
I guess that is pretty much what you need, but as Jay S mentioned, putting more details in your question helps in finding the best option for your particular case.
private void CreateImageItemInSitecore()
{
    string filePath = @"C:\Sitecore\Website\ImageTemp\Pic.jpg";
    // SecurityDisabler bypasses the security model so we can write to the master database.
    using (new SecurityDisabler())
    {
        Database masterDb = Sitecore.Configuration.Factory.GetDatabase("master");
        Sitecore.Resources.Media.MediaCreatorOptions options = new Sitecore.Resources.Media.MediaCreatorOptions();
        options.FileBased = true;
        options.AlternateText = Path.GetFileNameWithoutExtension(filePath);
        options.Destination = "/sitecore/media library/Downloads/";
        options.Database = masterDb;
        options.Versioned = false; // do not create a versioned item
        options.KeepExisting = false;
        Sitecore.Data.Items.MediaItem mediaItemImage = new Sitecore.Resources.Media.MediaCreator().CreateFromFile(filePath, options);
        Item imageItem = masterDb.GetItem(mediaItemImage.ID.ToString());
        imageItem.Editing.BeginEdit();
        imageItem.Name = Path.GetFileNameWithoutExtension(filePath);
        imageItem.Editing.EndEdit();
    }
}

Lucene.NET and storing data on Azure Blob Storage

The question I am asking is specifically because I don't want to use AzureDirectory project. I am just trying something on my own.
var cloudStorageAccount = CloudStorageAccount.Parse("DefaultEndpointsProtocol=http;AccountName=xxxx;AccountKey=xxxxx");
var blobClient = cloudStorageAccount.CreateCloudBlobClient();
// Used to test connectivity.
IEnumerable<CloudBlobContainer> containers = blobClient.ListContainers();
foreach (var item in containers)
{
    Console.WriteLine(item.Uri);
}
// State the file location of the index.
string indexLocation = containers.Last().Name;
Lucene.Net.Store.Directory dir = Lucene.Net.Store.FSDirectory.Open(indexLocation);
// Create an analyzer to process the text.
Lucene.Net.Analysis.Analyzer analyzer = new Lucene.Net.Analysis.Standard.StandardAnalyzer(Lucene.Net.Util.Version.LUCENE_30);
// Create the index writer with the directory and analyzer defined.
bool indexExists = Lucene.Net.Index.IndexReader.IndexExists(dir);
Lucene.Net.Index.IndexWriter indexWriter = new Lucene.Net.Index.IndexWriter(dir, analyzer, !indexExists, Lucene.Net.Index.IndexWriter.MaxFieldLength.UNLIMITED);
// Create a document and add in the fields.
Lucene.Net.Documents.Document doc = new Lucene.Net.Documents.Document();
string path = "D:\\try.html";
TextReader reader = new StreamReader(path);
doc.Add(new Lucene.Net.Documents.Field("url", path, Lucene.Net.Documents.Field.Store.YES, Lucene.Net.Documents.Field.Index.NOT_ANALYZED));
doc.Add(new Lucene.Net.Documents.Field("content", reader.ReadToEnd(), Lucene.Net.Documents.Field.Store.YES, Lucene.Net.Documents.Field.Index.ANALYZED));
indexWriter.AddDocument(doc);
indexWriter.Optimize();
indexWriter.Commit();
indexWriter.Close();
Now the issue is that after indexing completes, I am not able to see any files created inside the container. Can anybody help me out?
You're using the FSDirectory there, which is going to write files to the local disk.
You're passing it the name of a container in blob storage. Blob storage is a service made available over a REST API and is not addressable directly from the file system, so FSDirectory is not going to be able to write your index to blob storage.
Your options are:
Mount a VHD disk on the machine, and store the VHD in blob storage. There are instructions on how to do this here: http://blogs.msdn.com/b/avkashchauhan/archive/2011/04/15/mount-a-page-blob-vhd-in-any-windows-azure-vm-outside-any-web-worker-or-vm-role.aspx
Use AzureDirectory, which you refer to in your question. I have rebuilt AzureDirectory against the latest storage SDK: https://github.com/richorama/AzureDirectory
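For reference, FSDirectory only ever works against a local path, so the minimal change to make the original snippet write files at all is to point it at a local directory (the path below is an assumption):
// FSDirectory needs a real local filesystem path, not a blob container name.
string indexLocation = @"D:\LuceneIndex";
System.IO.Directory.CreateDirectory(indexLocation);
Lucene.Net.Store.Directory dir = Lucene.Net.Store.FSDirectory.Open(indexLocation);
The files written there would still have to be copied to and from blob storage yourself, which is essentially what AzureDirectory automates.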
Another alternative for people looking around: I wrote a directory implementation that uses the Azure shared cache (preview), which can be an alternative to AzureDirectory (albeit for bounded search sets):
https://github.com/ajorkowski/AzureDataCacheDirectory

SharePoint 2010 Client Object Model: getting a list of site URLs

I'm trying to learn the SharePoint Client Object Model, specifically how to get a list of all SharePoint site URLs using a remote connection. This is possible using web services, but I want to do it using the client object model.
I've figured out how to get the lists of a specific SharePoint site using the following code (client object model):
ClientContext ctx = new ClientContext(server);
ctx.AuthenticationMode = ClientAuthenticationMode.Default;
ctx.Credentials = new WindowsAuthenticationCredentials(username, password);
Web w = ctx.Web;
var lists = ctx.LoadQuery(w.Lists);
ctx.ExecuteQuery();
// Enumerate the results.
foreach (List theList in lists)
{
    Console.WriteLine(theList.Title);
}
Output:
Announcements, Master Collection Pages… etc…
How can I do the same to get a list of site URLs?
In web services you can call the following to achieve that, but as I said, I'm just trying to figure out how to do the same using the client object model. If you can provide C# code, that would be greatly appreciated.
WSPSitedata.SiteData sitedata = new SiteData();
sitedata.Url = SharePointBaseURL + "_vti_bin/sitedata.asmx";
sitedata.Credentials = our_credentials;
_sSiteMetadata metaData = new _sSiteMetadata();
_sWebWithTime[] webWithTime;
sitedata.GetSite(out metaData, out webWithTime, out users, out groups, out vgroups);
The SharePoint Client Object Model (CSOM) is designed to remotely interact with your site collection. Sure, it is possible to connect to various site collections, but it's not possible to loop over all site collections within a SPWebApplication.
In 2010 you can still use the ASMX web services which were available in earlier versions of SharePoint.
To get a better understanding of the CSOM you should have a look at the MSDN site: http://msdn.microsoft.com/en-us/library/ee537247.aspx
Did you really mean a list containing all site collection URLs, or was that a misunderstanding?
Thorsten
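Within a single site collection, though, CSOM can enumerate subsite URLs. Here is a hedged sketch in the style of your snippet (it loads only the immediate children; recurse into each subweb's Webs collection for deeper levels):
ClientContext ctx = new ClientContext(server);
ctx.Credentials = new WindowsAuthenticationCredentials(username, password);
Web root = ctx.Web;
// Load only the Url property of the immediate subsites.
ctx.Load(root.Webs, webs => webs.Include(w => w.Url));
ctx.ExecuteQuery();
foreach (Web subweb in root.Webs)
{
    Console.WriteLine(subweb.Url);
}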

Upload a file to SharePoint through the built-in web services

What is the best way to upload a file to a Document Library on a SharePoint server through the built-in web services that version WSS 3.0 exposes?
Following the two initial answers...
We definitely need to use the Web Service layer as we will be making these calls from remote client applications.
The WebDAV method would work for us, but we would prefer to be consistent with the web service integration method.
There is additionally a web service to upload files, painful but works all the time.
Are you referring to the “Copy” service?
We have been successful with this service’s CopyIntoItems method. Would this be the recommended way to upload a file to Document Libraries using only the WSS web service API?
I have posted our code as a suggested answer.
Example of using the WSS "Copy" Web service to upload a document to a library...
public static void UploadFile2007(string destinationUrl, byte[] fileData)
{
    // List of destination URLs; just one in this example.
    string[] destinationUrls = { Uri.EscapeUriString(destinationUrl) };
    // Empty field information. This can be populated, but not for this example.
    SharePoint2007CopyService.FieldInformation information = new SharePoint2007CopyService.FieldInformation();
    SharePoint2007CopyService.FieldInformation[] info = { information };
    // To receive the result XML.
    SharePoint2007CopyService.CopyResult[] result;
    // Create the Copy web service instance configured from the web.config file.
    SharePoint2007CopyService.CopySoapClient CopyService2007 = new CopySoapClient("CopySoap");
    CopyService2007.ClientCredentials.Windows.ClientCredential = CredentialCache.DefaultNetworkCredentials;
    CopyService2007.ClientCredentials.Windows.AllowedImpersonationLevel = System.Security.Principal.TokenImpersonationLevel.Delegation;
    CopyService2007.CopyIntoItems(destinationUrl, destinationUrls, info, fileData, out result);
    if (result[0].ErrorCode != SharePoint2007CopyService.CopyErrorCode.Success)
    {
        // ...
    }
}
Another option is to use plain ol' HTTP PUT:
WebClient webclient = new WebClient();
webclient.Credentials = new NetworkCredential(_userName, _password, _domain);
webclient.UploadFile(remoteFileURL, "PUT", FilePath);
webclient.Dispose();
Where remoteFileURL points to your SharePoint document library...
There are a couple of things to consider:
Copy.CopyIntoItems needs the document to be already present on some server. The document is passed as a parameter of the web service call, which will limit how large the document can be. (See http://social.msdn.microsoft.com/Forums/en-AU/sharepointdevelopment/thread/e4e00092-b312-4d4c-a0d2-1cfc2beb9a6c)
The 'HTTP PUT' method (i.e. WebDAV) will only put the document in the library, but not set field values.
To update field values you can call Lists.UpdateListItems after the 'HTTP PUT' (see the sketch after this list).
Document libraries can have directories; you can make them with 'HTTP MKCOL'.
You may want to check in files with Lists.CheckInFile.
You can also create a custom web service that uses the SPxxx .NET API, but that new web service would have to be installed on the server. It could save trips to the server.
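As a hedged sketch of the Lists.UpdateListItems call after an 'HTTP PUT' (the generated proxy name, the list name, and the FileRef trick for addressing a document by URL rather than by integer ID are all assumptions to verify against your environment):
var lists = new ListsService.Lists
{
    Url = "http://servername/sitename/_vti_bin/Lists.asmx",
    UseDefaultCredentials = true
};
var batch = new System.Xml.XmlDocument();
// Addressing the item via FileRef avoids having to query for its ID first.
batch.LoadXml(
    @"<Batch OnError='Continue'>
        <Method ID='1' Cmd='Update'>
          <Field Name='ID'/>
          <Field Name='FileRef'>http://servername/sitename/doclibrary/filename</Field>
          <Field Name='Title'>New Title</Field>
        </Method>
      </Batch>");
System.Xml.XmlNode result = lists.UpdateListItems("doclibrary", batch.DocumentElement);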
public static void UploadFile(byte[] fileData)
{
    var copy = new Copy
    {
        Url = "http://servername/sitename/_vti_bin/copy.asmx",
        UseDefaultCredentials = true
    };
    string destinationUrl = "http://servername/sitename/doclibrary/filename";
    string[] destinationUrls = { destinationUrl };
    // Field values to apply while uploading.
    var info1 = new FieldInformation
    {
        DisplayName = "Title",
        InternalName = "Title",
        Type = FieldType.Text,
        Value = "New Title"
    };
    FieldInformation[] info = { info1 };
    CopyResult[] copyResults;
    copy.CopyIntoItems(destinationUrl, destinationUrls, info, fileData, out copyResults);
}
NOTE: Changing the 1st parameter of CopyIntoItems to the file name, Path.GetFileName(destinationUrl), makes the unlink message disappear.
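That is, the call in the sample above becomes:
copy.CopyIntoItems(
    Path.GetFileName(destinationUrl), destinationUrls, info, fileData, out copyResults);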
I've had good luck using the DocLibHelper wrapper class described here: http://geek.hubkey.com/2007/10/upload-file-to-sharepoint-document.html
From a colleague at work:
Lazy way: use the Windows WebDAV filesystem interface. It is bad as a programmatic solution because it relies on the WebClient service running on your OS, and it only works on websites running on port 80. Map a drive to the document library and get on with the file copying.
There is additionally a web service to upload files, painful but works all the time.
I believe you are able to upload files via the FrontPage API, but I don't know of anyone who actually uses it.
Not sure exactly which web service to use, but if you are in a position where you can use the SharePoint .NET API DLLs, then using SPList and SPLibrary.Items.Add is really easy.
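For completeness, a hedged sketch of that server-side approach (this must run on the SharePoint server itself; the URLs and names are placeholders):
using (SPSite site = new SPSite("http://servername/sitename"))
using (SPWeb web = site.OpenWeb())
{
    byte[] fileData = System.IO.File.ReadAllBytes(@"C:\TestFile.doc");
    SPFolder library = web.GetFolder("Shared Documents");
    // The third argument overwrites an existing file with the same name.
    SPFile file = library.Files.Add("TestFile.doc", fileData, true);
}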
