Uploading a file to a SharePoint library throws an unhandled exception (HRESULT: 0x80020009, error code: -2147352567) with an empty error message

I'm using the following code sample to upload multiple files (using the SharePoint object model, no web service) to a document library, but sometimes it throws an exception with HRESULT 0x80020009, error code -2147352567, and an empty error message, even though the file is uploaded to the document library successfully. It mostly happens only the first time, i.e. when uploading the first document; after that everything runs smoothly and no further exceptions occur. If I swallow that exception, it works fine. Can anyone help me trace the problem? I can't understand why it throws an exception when the file has in fact been uploaded to the document library. I want to know the actual reason and what I should do to avoid this problem.
Code:

.....
SPFolder folder = web.GetFolder(folderUrl);
foreach (.....)
{
    folder.Files.Add(folderUrl + "/" + fileName, file.Data, true);
}
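
For what it's worth, the "swallow it" workaround described above can at least be narrowed to this specific HRESULT so that any other failure still surfaces. This is only a minimal sketch, assuming the error reaches managed code as a COMException (0x80020009 is DISP_E_EXCEPTION, i.e. -2147352567):

using System.Runtime.InteropServices;

// 0x80020009 (DISP_E_EXCEPTION) corresponds to -2147352567.
const int DISP_E_EXCEPTION = unchecked((int)0x80020009);

try
{
    folder.Files.Add(folderUrl + "/" + fileName, file.Data, true);
}
catch (COMException ex)
{
    if (ex.ErrorCode != DISP_E_EXCEPTION)
        throw; // only ignore the known spurious error
    // The file has already landed in the library at this point (per the question), so log and continue.
}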

Try using the code provided below; it should help you out:
using (SPSite spsite = new SPSite("http://SPS01"))
{
    using (SPWeb spweb = spsite.OpenWeb())
    {
        spweb.AllowUnsafeUpdates = true;

        // "Site" here stands for the site URL prefix (not defined in this snippet).
        SPFolder spfolder = spweb.Folders[Site + "/Shared Documents/"];

        // Read the local file into a byte array.
        byte[] content = null;
        using (FileStream filestream = new FileStream("C:/Sample.docx", System.IO.FileMode.Open))
        {
            content = new byte[(int)filestream.Length];
            filestream.Read(content, 0, (int)filestream.Length);
        }

        // Add the file to the library, overwriting it if it already exists.
        SPFile spfile = spfolder.Files.Add("Sample.docx", content, true);

        // To upload into a subfolder instead:
        //SPFile spfile = spfolder.SubFolders["Demonstration Folder"].Files.Add("Sample.docx", content, true);

        spfile.Update();
        spweb.AllowUnsafeUpdates = false;
    }
}

Related

Copying files from FTP to Azure Blob Storage

I have created my FTP site (ftp://xyz.in) with a user ID and credentials.
I have created an ASP.NET Core API application that will copy files from the FTP site to Azure Blob Storage.
My API solution is placed in the C://Test2/Test2 folder.
Below is my code:
FtpWebRequest request = (FtpWebRequest)WebRequest.Create("ftp://xyz.in");
request.Method = WebRequestMethods.Ftp.UploadFile;

// This example assumes the FTP site uses anonymous logon.
request.Credentials = new NetworkCredential("pqr@efg.com", "lmn");

// Copy the contents of the file to the request stream.
byte[] fileContents;

// Getting error in below line.
using (StreamReader sourceStream = new StreamReader("ftp://xyz.in/abc.txt"))
{
    fileContents = Encoding.UTF8.GetBytes(sourceStream.ReadToEnd());
}

request.ContentLength = fileContents.Length;

using (Stream requestStream = request.GetRequestStream())
{
    requestStream.Write(fileContents, 0, fileContents.Length);
}

using (FtpWebResponse response = (FtpWebResponse)request.GetResponse())
{
    Console.WriteLine($"Upload File Complete, status {response.StatusDescription}");
}
But on the line
using (StreamReader sourceStream = new StreamReader("ftp://xyz.in/abc.txt"))
I am getting the error: System.IO.IOException: 'The filename, directory name, or volume label syntax is incorrect : 'C:\Test2\Test2\ftp:\xyz.in\abc.txt''
I am not able to understand where the 'C:\Test2\Test2' string gets appended to my FTP path from.
Test2 is the folder where my .NET Core application is placed.
StreamReader() doesn't take a URL/URI; it takes a file path on your local system (read the docs):
https://learn.microsoft.com/en-us/dotnet/api/system.io.streamreader.-ctor?view=net-5.0
StreamReader is interpreting the string you've supplied ("ftp://xyz.in/abc.txt") as a file name and looking for it in the current running folder, "C:\Test2\Test2". If your string were "abc.txt", it would look for a file called "abc.txt" in the current folder, e.g. C:\Test2\Test2\abc.txt.
What you want is to get the file using WebClient or something similar:
WebClient request = new WebClient();
string url = "ftp://xyz.in/abc.txt";
request.Credentials = new NetworkCredential("username", "password");
try
{
    byte[] fileContents = request.DownloadData(url);
    // Do Something...
}
catch (WebException ex)
{
    // Handle the failed download...
    Console.WriteLine(ex.Message);
}
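Once the bytes have been downloaded, writing them to Azure Blob Storage is a separate call. Below is a minimal sketch using the classic Microsoft.WindowsAzure.Storage SDK; the connection string, container name, and blob name are placeholders, not taken from the question:

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

// Placeholders: substitute your own connection string, container, and blob name.
CloudStorageAccount account = CloudStorageAccount.Parse("{connectionString}");
CloudBlobClient blobClient = account.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("{container-name}");
CloudBlockBlob blob = container.GetBlockBlobReference("abc.txt");

// Upload the byte array downloaded from the FTP server.
blob.UploadFromByteArray(fileContents, 0, fileContents.Length);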

Azure Blob Storage - 404 when I save to a file

I have another problem with Azure Blob Storage, this time with downloading. I get a list of files without a problem; unfortunately, when I try to download one, I get a 404 error saying the file was not found.
using System.IO;
using System.Linq;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

namespace BlobStorage
{
    class Program
    {
        static void Main(string[] args)
        {
            CloudStorageAccount backupStorageAccount = CloudStorageAccount.Parse("{connectionString}");
            var backupBlobClient = backupStorageAccount.CreateCloudBlobClient();
            var backupContainer = backupBlobClient.GetContainerReference("{container-name}");

            var list = backupContainer.ListBlobs(useFlatBlobListing: true);
            foreach (var blob in list)
            {
                var blobFileName = blob.Uri.Segments.Last();
                CloudBlockBlob blockBlob = backupContainer.GetBlockBlobReference(blobFileName);
                string destinationPath = string.Format(@"D:\" + blobFileName + ".txt");
                blockBlob.DownloadToFile(destinationPath, FileMode.OpenOrCreate);
            }
        }
    }
}
Error message:
Microsoft.WindowsAzure.Storage.StorageException: "The remote server returned an error: (404) Not found."
Inner exception WebException: The remote server returned an error: (404) Not found.
It points to the line:
blockBlob.DownloadToFile(destinationPath, FileMode.OpenOrCreate);
A file like this definitely exists in blob storage. When I open the blob in the portal and copy the URL to the file, I can download it through the browser without any problem. Unfortunately, I cannot download it from the application because of the 404 error.
So why does it behave as if the file does not exist?
The issue is how you're getting the blob name in the following line of code:
var blobFileName = blob.Uri.Segments.Last();
Considering a blob path like tempdata/ExampleIotHub/02/2019/05/14/39, the blob's name is ExampleIotHub/02/2019/05/14/39 (assuming your container name is tempdata); however, the blobFileName you're getting is just 39. Since there is no blob with the name 39, you're getting the 404 error.
I suggest you try something like the following:
foreach (var blob in list)
{
    var localFileName = blob.Uri.Segments.Last();
    CloudBlockBlob blockBlob = blob as CloudBlockBlob;
    if (blockBlob != null)
    {
        string destinationPath = string.Format(@"D:\" + localFileName + ".txt");
        blockBlob.DownloadToFile(destinationPath, FileMode.OpenOrCreate);
    }
}
Please note that I have not tried running this code so there may be some errors.

Using HttpClient to upload files to ServiceStack server

I can't use the ServiceStack client libraries, so I've chosen to use the HttpClient PCL library instead. I can do all my REST calls (and other JSON calls) without a problem, but I'm now stuck on uploading files.
A snippet of what I am trying to do:
var message = new HttpRequestMessage(restRequest.Method, restRequest.GetResourceUri(BaseUrl));
var content = new MultipartFormDataContent();

foreach (var file in files)
{
    byte[] data;
    bool success = CxFileStorage.TryReadBinaryFile(file, out data);
    if (success)
    {
        var byteContent = new ByteArrayContent(data);
        byteContent.Headers.ContentDisposition = new ContentDispositionHeaderValue("attachment")
        {
            FileName = System.IO.Path.GetFileName(file),
        };
        content.Add(byteContent);
    }
}

message.Content = content;
The problem now is that I get a null reference exception (status 500) when posting. It doesn't get into the service. I see the call in the request filter, but that's it.
So I'm wondering what I'm doing wrong and how I can pinpoint what is going wrong. How can I catch the correct error on the ServiceStack layer?
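For reference, MultipartFormDataContent has an Add overload that takes the form field name and the file name explicitly, which is the usual way to send a file part with HttpClient. This is a minimal sketch only; the field name "upload", the endpoint, and the local path are assumptions, not taken from the question:

using System.Net.Http;
using System.Threading.Tasks;

static async Task UploadAsync(string path)
{
    using (var client = new HttpClient())
    {
        var form = new MultipartFormDataContent();
        byte[] data = System.IO.File.ReadAllBytes(path);

        // This overload sets both the form field name and the file name
        // on the part's Content-Disposition header.
        form.Add(new ByteArrayContent(data), "upload", System.IO.Path.GetFileName(path));

        HttpResponseMessage response = await client.PostAsync("http://example.org/files", form);
        response.EnsureSuccessStatusCode();
    }
}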

Upload a file to a document library in SharePoint 2010 programmatically in client-server application

I am using the code below to upload a file to a SharePoint 2010 library:
String fileToUpload = @"C:\YourFile.txt";
String sharePointSite = "http://yoursite.com/sites/Research/";
String documentLibraryName = "Shared Documents";

using (SPSite oSite = new SPSite(sharePointSite))
{
    using (SPWeb oWeb = oSite.OpenWeb())
    {
        if (!System.IO.File.Exists(fileToUpload))
            throw new FileNotFoundException("File not found.", fileToUpload);

        SPFolder myLibrary = oWeb.Folders[documentLibraryName];

        // Prepare to upload
        Boolean replaceExistingFiles = true;
        String fileName = System.IO.Path.GetFileName(fileToUpload);
        FileStream fileStream = File.OpenRead(fileToUpload);

        // Upload document
        SPFile spfile = myLibrary.Files.Add(fileName, fileStream, replaceExistingFiles);

        // Commit
        myLibrary.Update();
    }
}
This worked well on my machine. But when I deploy it on the server and use the snippet above to upload a file to the library from my machine, it gives an error: it does not pick up the file location (C:\YourFile.txt) from the local (client) machine.
When you run it on the server, your code runs under a different account (the app pool identity), which does not have permission to read the C: drive.
I don't know why you would want to read and upload a file from the same server; if you are simply testing the SharePoint object model, that is fine.
If you are expecting some other app or service to keep an updated file for SharePoint, it should be moved to the web directory, i.e. \wwwroot\wss\VirtualDirectories\80, and then you can use your code to read it and update your document library (myLibrary) as you are doing.
Are you running this in a console app or "in SharePoint"?
Could it be that the account running the code doesn't have read permissions on C:\?
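If the file genuinely lives on the client machine, another option is to read it there and push it with the SharePoint 2010 client object model, so the local path is resolved on the client rather than on the server. A minimal sketch, assuming Microsoft.SharePoint.Client is referenced and reusing the URLs from the question as placeholders:

using Microsoft.SharePoint.Client;

// Runs on the client machine, so C:\YourFile.txt is read locally.
using (var ctx = new ClientContext("http://yoursite.com/sites/Research/"))
{
    List library = ctx.Web.Lists.GetByTitle("Shared Documents");

    var fileInfo = new FileCreationInformation
    {
        Content = System.IO.File.ReadAllBytes(@"C:\YourFile.txt"),
        Url = "http://yoursite.com/sites/Research/Shared Documents/YourFile.txt",
        Overwrite = true
    };

    library.RootFolder.Files.Add(fileInfo);
    ctx.ExecuteQuery();
}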

Maximum file upload size in SharePoint

I use the following method to upload a document into a SharePoint document library.
However, upon executing the query, I get the following error:
Message = "The remote server returned an error: (400) Bad Request."
The files fail above 1 MB, so I tested via the SharePoint UI and the same file uploaded successfully.
Any thoughts on what the issue is? Is it possible to stream the file over rather than send it as one large chunk? The file in question is only 3 MB in size.
private ListItem UploadDocumentToSharePoint(RequestedDocumentFileInfo requestedDoc, ClientContext clientContext)
{
    try
    {
        var uploadLocation = string.Format("{0}{1}/{2}", SiteUrl, Helpers.ListNames.RequestedDocuments,
            Path.GetFileName(requestedDoc.DocumentWithFilePath));

        //Get Document List
        var documentslist = clientContext.Web.Lists.GetByTitle(Helpers.ListNames.RequestedDocuments);

        var fileCreationInformation = new FileCreationInformation
        {
            Content = requestedDoc.ByteArray,
            Overwrite = true,
            Url = uploadLocation //Upload URL
        };

        var uploadFile = documentslist.RootFolder.Files.Add(fileCreationInformation);
        clientContext.Load(uploadFile);
        clientContext.ExecuteQuery();

        var item = uploadFile.ListItemAllFields;
        item["Title"] = requestedDoc.FileNameParts.FileSubject;
        item["FileLeafRef"] = requestedDoc.SharepointFileName;
        item.Update();
    }
    catch (Exception exception)
    {
        throw new ApplicationException(exception.Message);
    }
    return GetDocument(requestedDoc.SharepointFileName + "." + requestedDoc.FileNameParts.Extention, clientContext);
}
EDIT: I did find the following Microsoft page about my issue (which seems identical to the one raised there), http://support.microsoft.com/kb/2529243, but it does not appear to provide a solution.
OK, found the solution here:
http://blogs.msdn.com/b/sridhara/archive/2010/03/12/uploading-files-using-client-object-model-in-sharepoint-2010.aspx
I'll need to store the document on the server hosting the file and then use the FileStream upload process shown in my code below:
private ListItem UploadDocumentToSharePoint(RequestedDocumentFileInfo requestedDoc, ClientContext clientContext)
{
    try
    {
        using (var fs = new FileStream(string.Format(@"C:\[myfilepath]\{0}", Path.GetFileName(requestedDoc.DocumentWithFilePath)), FileMode.Open))
        {
            File.SaveBinaryDirect(clientContext, string.Format("/{0}/{1}", Helpers.ListNames.RequestedDocuments, requestedDoc.FileName), fs, true);
        }
    }
    catch (Exception exception)
    {
        throw new ApplicationException(exception.Message);
    }
    return GetDocument(requestedDoc.SharepointFileName + "." + requestedDoc.FileNameParts.Extention, clientContext);
}
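For completeness: if the upload has to keep going through the client object model's Files.Add() path, the usual server-side knob for the 1-2 MB ceiling is the client request service's maximum message size. A sketch only, assuming farm-admin access to the SharePoint 2010 server object model (this is not part of the answer above):

using Microsoft.SharePoint.Administration;

// Run on the SharePoint server; raises the size limit for client object model requests.
SPWebService contentService = SPWebService.ContentService;
contentService.ClientRequestServiceSettings.MaxReceivedMessageSize = 10485760; // 10 MB
contentService.Update();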
