File upload with metadata using SharePoint Web Services - sharepoint

I am trying to upload a file with metadata using SharePoint Web Services. The first approach I took was to upload the file with the WebRequest/WebResponse objects and then update the metadata using the Lists.asmx UpdateListItems method. This works just fine, but it creates two versions of the file. The second approach was to use the Copy.asmx web service's CopyIntoItems method, which copies the file data along with the metadata. This also works and creates v1.0, but when I try to upload the same file with some changes to the metadata (again using Copy.asmx), it does not update anything. Has anybody come across the same issue, or does anyone have other ideas for implementing the required functionality?
Thanks,
Kiran

This might be a bit off topic (sorry), but I'd like to point you to a real time-saving shortcut when working with SharePoint remotely: http://www.bendsoft.com/net-sharepoint-connector/
It enables you to work with SharePoint lists and document libraries with SQL and stored procedures.
Uploading a file as a byte array
...
string sql = "CALL UPLOAD('Shared Documents', 'Images/Logos/mylogo.png', #doc)";
byte[] data = System.IO.File.ReadAllBytes("C:\\mylogo.png");
SharePointCommand cmd = new SharePointCommand(sql, myOpenConnection);
cmd.Parameters.Add("#doc", data);
cmd.ExecuteNonQuery();
...
Uploading from a stream input:
using (var fs = System.IO.File.OpenRead("c:\\150Mb.bin")) {
string sql = "CALL UPLOAD('Shared Documents', '150Mb.bin', #doc)";
SharePointCommand cmd = new SharePointCommand(sql, myOpenConnection);
cmd.Parameters.Add("#doc", fs);
cmd.ExecuteNonQuery();
}
There are quite a few methods to simplify remote document management:
UPLOAD(listname, filename, data)
DOWNLOAD(listname, filename)
MOVE(listname1, filename1, listname2, filename2)
COPY(listname1, filename1, listname2, filename2)
RENAME(listname, filename1, filename2)
DELETE(listname, filename)
CREATEFOLDER(listname, foldername)
CHECKOUT(list, file, offline, lastmodified)
CHECKIN(list, file, comment, type)
UNDOCHECKOUT(list, file)
Cheers

Related

I was trying to convert my Word document to PDF using Spire.Doc and kept running into the exception "A generic error occurred in GDI+"

I am trying to fetch a Word document from different sources (via URL, shared link, or shared drives) and save it to a memory stream. The memory stream gets the data every time, but when converting it to PDF via Spire.Doc it throws "A generic error occurred in GDI+". This happens only in production, not on localhost; the app is hosted in the Azure cloud.
I put in a few logs and found that line 3 of the code below is causing the issue.
fileContentStream and finalStream are the memory streams.
The code which I used is:
Spire.Doc.Document spireDoc = new Spire.Doc.Document();
spireDoc.LoadFromStream(fileContentStream, Spire.Doc.FileFormat.Auto);
spireDoc.SaveToStream(finalStream, Spire.Doc.FileFormat.PDF);
To convert Word to PDF in the Azure cloud using Spire.Doc, you need to enable the PS conversion option, along these lines:
Spire.Doc.Document spireDoc = new Spire.Doc.Document();
spireDoc.LoadFromStream(fileContentStream, Spire.Doc.FileFormat.Auto);
ToPdfParameterList tpl = new ToPdfParameterList
{
    UsePSCoversion = true
};
spireDoc.SaveToStream(finalStream, tpl);

Support for uploading a dictionary of synonyms in Azure Search

I was looking for a way to upload a text file containing a dictionary of synonyms to Azure Search; the nearest I could find was:
https://azure.microsoft.com/en-in/blog/azure-search-synonyms-public-preview/
https://learn.microsoft.com/en-us/azure/search/search-synonyms
I know it is not a good idea to compare products from different companies, but if there is a way to upload a dictionary of synonyms to Azure Search like there is in Elasticsearch, it would be of great help and might save a lot of time and rework.
Please help me figure out how to upload a synonym dictionary to Azure Search.
The latest .NET SDK for Azure Cognitive Search has this capability. From this sample:
// Create a new SearchIndexClient
Uri endpoint = new Uri(Environment.GetEnvironmentVariable("SEARCH_ENDPOINT"));
AzureKeyCredential credential = new AzureKeyCredential(
    Environment.GetEnvironmentVariable("SEARCH_API_KEY"));
SearchIndexClient indexClient = new SearchIndexClient(endpoint, credential);

// Create a synonym map from a file containing country names and abbreviations
// using the Solr format, with each entry on a new line (\n), for example:
// United States of America,US,USA\n
string synonymMapName = "countries";
string synonymMapPath = "countries.txt";

SynonymMap synonyms;
using (StreamReader file = File.OpenText(synonymMapPath))
{
    synonyms = new SynonymMap(synonymMapName, file);
}

await indexClient.CreateSynonymMapAsync(synonyms);
The SDKs for Java, Python, and JavaScript also support creating synonym maps. The Java SDK accepts a string rather than a file stream, so you'd have to read the file contents yourself. Unfortunately, the Python and JavaScript SDKs seem to require a list of strings (one for each line of the file), which is something we should improve. I'm following up with the Azure SDK team to make these improvements.
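For the JavaScript case, that "list of strings" is straightforward to build from the same Solr-format text file used in the C# sample above. A minimal sketch (the helper name is my own, and the @azure/search-documents call is shown only as a comment, since the exact client shape may differ by SDK version):

```javascript
// Turn the raw text of a Solr-format synonym file (one rule per line)
// into the array of rule strings the JavaScript SDK expects.
function synonymRulesFromText(text) {
  return text
    .split(/\r?\n/)          // handle both Unix and Windows line endings
    .map(line => line.trim())
    .filter(line => line.length > 0); // drop blank lines
}

// Hypothetical usage with the @azure/search-documents package:
//
//   const fs = require("fs");
//   const { SearchIndexClient, AzureKeyCredential } = require("@azure/search-documents");
//   const client = new SearchIndexClient(endpoint, new AzureKeyCredential(apiKey));
//   await client.createSynonymMap({
//     name: "countries",
//     synonyms: synonymRulesFromText(fs.readFileSync("countries.txt", "utf8")),
//   });
```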

How to check if two files have the same content

I am working with a Node.js application. I query an API endpoint and store the retrieved data in a database. Everything works well; however, there are instances where some data is not pushed to the database. In that case, what I normally do is manually query the endpoint, passing the application the date when the data was lost, and retrieve it, since it is stored on a server which automatically deletes the data after 2 days. The API and database fields are identical.
The following is not the problem, but for context: I would like to automate this process by making the application retrieve all the data for the past 48 hours and save it in a .txt file inside the app. I will do the same with my MSSQL database, querying it for the data from the past 48 hours.
My question is: how can I check whether the contents of my api.txt file are the same as those of db.txt?
You could make use of buf.equals(), as detailed in the docs:
const fs = require('fs');
var api = fs.readFileSync('api.txt');
var db = fs.readFileSync('db.txt');
// Returns a bool
api.equals(db)
So that:
if (api.equals(db)) {
    console.log("equal");
} else {
    console.log("not equal");
}

Azure Table Storage access time - inserting/reading from

I'm making a program that stores data from CSV files in Azure tables and reads it back. The CSV files can have a varying number of columns, and between 3k and 50k rows. What I need to do is upload that data to an Azure table. So far I have managed to both upload the data and retrieve it.
I'm using the REST API, and for uploading I'm creating XML batch requests with 100 rows per request. That works fine, except that it takes a bit too long to upload; e.g., for 3k rows it takes around 30 seconds. Is there any way to speed that up? I noticed that most of the time is spent processing the response (the ReadToEnd() call). I read somewhere that setting the proxy to null could help, but it doesn't do much in my case.
I also found somewhere that it is possible to upload the whole XML request to a blob and then execute it from there, but I couldn't find any example of doing that.
using (Stream requestStream = request.GetRequestStream())
{
    requestStream.Write(content, 0, content.Length);
}
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
{
    Stream dataStream = response.GetResponseStream();
    using (var reader = new StreamReader(dataStream))
    {
        String responseFromServer = reader.ReadToEnd();
    }
}
As for retrieving data from Azure tables, I managed to get 1000 entities per request; that takes around 9s for a CSV with 3k rows. Again, most of the time is spent reading from the stream, in this part of the code (the ReadToEnd() call once more):
response = request.GetResponse() as HttpWebResponse;
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
{
    string result = reader.ReadToEnd();
}
Any tips?
Since you are using the REST API, you have to write extra code and rely on your own methods to implement performance improvements, differently than when using the client library. In your case, using the storage client library would be best, as you can use already-built features to expedite insert, upsert, etc., as described here.
If you do use the Storage Client Library with ADO.NET, the article below, written by the Windows Azure Table team, is the supported way to improve Azure access performance:
.NET and ADO.NET Data Service Performance Tips for Windows Azure Tables

SharePoint 2007: How to check if a folder exists in a list using web services?

Using SharePoint 2007 web services, or even WebDAV, how can I check whether a folder exists in a list (not a document library) in SharePoint?
I would also like to check for subfolders...
Does anyone have any idea how this is done? I've asked Microsoft, and their official stance is that they provide no documentation on this, so any help will be most welcome...
Thanks in advance...
I have this code that creates a folder, but I'm not sure how to modify it to check whether the folder exists; I'm also not even sure whether this will work with subfolders:
private void CreateFolderUsingWebService(string listName, string folderName)
{
    // Check whether the Databox folder exists
    //string folderAddress = siteAddress + @"/lists/" + listAddress + @"/" + folderName;
    //wsDws.CreateFolder(folderAddress);

    var doc = new XmlDocument();
    XmlElement batch = doc.CreateElement("Batch");
    string item = "<Method ID=\"1\" Cmd=\"New\">" +
                  "<Field Name=\"ID\">New</Field>" +
                  "<Field Name=\"FSObjType\">1</Field>" +
                  "<Field Name=\"BaseName\">" + folderName + "</Field></Method>";
    batch.SetAttribute("ListVersion", "1");
    //batch.SetAttribute("ViewName", "{GUID of View, including braces}");
    batch.InnerXml = item;
    wsLists.UpdateListItems(listName, batch);
}
OK - this info might help the next SharePoint developer:
The function above works, and will even create a directory structure, BUT you need to pass the list name, not the list URL. This means that if you localize your code, you need to pass the localized list name to the function.
I didn't bother adding an exists check, because the call seems to NOT create duplicates or fail if the directory already exists. I know this isn't a great solution, but I just don't have 2-3 weeks to research how to do it properly, so if you have any suggestions, comments are welcome.
Lastly, any Microsoft representative reading this might want to consider why there isn't any really good documentation on this, with how-tos, from MS.
I went as far as downloading the MOSS Web Services SDK, and it contains one very vague example of how to use one function in the Lists web service. This simply is not enough information for those of us trying to put together robust solutions in MOSS. We need much more documentation, please.