Whenever I try to get the image file using the document ID, I get the following error:
Could not load file or assembly 'Kofax.CEBPM.ThinClient.DocumentServices, Version=1.0.0.0, Culture=neutral, PublicKeyToken=cf95ca7471e897ff' or one of its dependencies. The system cannot find the file specified.
FYI: the document ID is valid and I am able to find the image in KTA's Repository Browser.
I have tried different ways to get the image, but all of them fail. Any help?
private static void GetDocument(string sessionId, string docId)
{
try
{
CaptureDocumentService captureDocumentService = new CaptureDocumentService();
ReportingData reportingData = new ReportingData();
// Using simple GetDocument method
Document document = captureDocumentService.GetDocument(sessionId, null, docId);
// Using GetDocumentAsFile with valid path
captureDocumentService.GetDocumentAsFile(sessionId, docId, @"C:\ValidPath\", "dummy.abc");
}
catch (Exception ex)
{
Console.WriteLine();
Console.WriteLine(ex.Message);
}
}
Judging from the error message, there is either an issue with permissions (check your path and the ACLs in KTA) or with the kind of file in your repository (are you sure it's a TIFF?).
Personally, I'd go with the GetDocumentFile method - it returns a stream, which may give you more flexibility. Here's an example:
public string ExportDocumentImages(string sessionId, string documentId, string outputFolder, string extension)
{
var cds = new CaptureDocumentService();
var doc = cds.GetDocument(sessionId, null, documentId);
Directory.CreateDirectory(outputFolder);
var fileName = Path.Combine(outputFolder, doc.Id + extension);
using (var fs = File.Create(fileName))
{
var s = cds.GetDocumentFile(sessionId, null, doc.Id, "");
s.CopyTo(fs);
}
return fileName;
}
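For completeness, a call could then look something like this; the session id, document id, output folder, and extension below are just placeholders:
// Placeholder values - use a real session id (from a Logon call) and a real document id.
var exported = ExportDocumentImages("<sessionId>", "<documentId>", @"C:\Temp\KtaExport", ".tif");
Console.WriteLine("Wrote " + exported);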
Related
I am facing an encoding issue when downloading a file from SharePoint Online in an Azure Function. I have an HTTP-triggered Azure Function that calls SharePoint Online to retrieve a file and download it. Here is how I call SharePoint:
public dynamic DownloadFile(Guid fileUniqueId)
{
const string apiUrl = "{0}/_api/web/GetFileById('{1}')/$value";
try
{
var fileInfo = GetFileInfo(fileUniqueId);
if (string.IsNullOrEmpty(_sharepointSiteUrl)) return null;
string api = string.Format(apiUrl, _sharepointSiteUrl, fileUniqueId.ToString());
string response = new TokenHelper().GetAPIResponse(api);
if (string.IsNullOrEmpty(response)) return null;
return new {
fileInfo.FileName,
Bytes = Encoding.UTF8.GetBytes(response)
};
}
catch (Exception ex)
{
throw ex;
}
}
And here is the Azure Function that calls it:
string guidString = req.Query["id"];
if (!Guid.TryParse(guidString, out var fileId))
return new BadRequestResult();
var fileManager = new FileManager();
dynamic fileData = fileManager.DownloadFile(fileId);
if (null == fileData) return new NotFoundResult();
var fileName = (string)fileData.FileName;
var contentType = (fileName.ToUpper().EndsWith(".PNG") || fileName.ToUpper().EndsWith(".JPEG") || fileName.ToUpper().EndsWith(".JPG")) ? "image/jpeg" : "application/octet-stream";
return new FileContentResult(fileData.Bytes, contentType)
{
FileDownloadName = fileData.FileName
};
The file downloads successfully, but it seems corrupted: opening it says the file type is not recognised. I think it's an encoding issue. Does somebody see what I'm doing wrong?
Your code reads the response as a string and then calls Encoding.UTF8.GetBytes() on it. Round-tripping binary content through a UTF-8 string corrupts any byte sequences that are not valid UTF-8, which is why the downloaded file is unreadable. Use the CSOM method OpenBinaryDirect() instead and copy the raw stream, like this:
var fileRef = file.ServerRelativeUrl;
var fileInfo = Microsoft.SharePoint.Client.File.OpenBinaryDirect(clientContext, fileRef);
using (var fileStream = System.IO.File.Create(fileName))
{
fileInfo.Stream.CopyTo(fileStream);
}
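Adapted to your DownloadFile method, a sketch could look like the following. How you build the authenticated ClientContext depends on your setup, and Web.GetFileById is assumed to be available in your CSOM version; otherwise resolve the server-relative URL however GetFileInfo already does.
public dynamic DownloadFile(Guid fileUniqueId)
{
    // Assumes _sharepointSiteUrl and GetFileInfo() from your existing class,
    // and Microsoft.SharePoint.Client (CSOM) for ClientContext/OpenBinaryDirect.
    using (var clientContext = new ClientContext(_sharepointSiteUrl))
    {
        // clientContext.Credentials = ...; // whatever authentication you already use

        var fileInfo = GetFileInfo(fileUniqueId);
        var file = clientContext.Web.GetFileById(fileUniqueId);
        clientContext.Load(file, f => f.ServerRelativeUrl);
        clientContext.ExecuteQuery();

        var fileInformation = Microsoft.SharePoint.Client.File.OpenBinaryDirect(
            clientContext, file.ServerRelativeUrl);

        using (var ms = new MemoryStream())
        {
            fileInformation.Stream.CopyTo(ms); // raw bytes, no text decoding
            return new
            {
                fileInfo.FileName,
                Bytes = ms.ToArray() // pass straight to FileContentResult
            };
        }
    }
}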
Can someone tell me why I keep getting a read and write timeout in this function? It is a code-behind handler for a button's click event. Everything looks fine as far as the data goes until I get to the stream section; the code still steps through, but when I inspect the Stream object in the debugger, its Read Timeout/Write Timeout entries show System.InvalidOperationException.
protected void SubmitToDB_Click(object sender, EventArgs e)
{
if (FileUploader.HasFile)
{
try
{
if (SectionDropDownList.SelectedValue != null)
{
if (TemplateDropDownList.SelectedValue != null)
{
// This gets the full file path on the client's machine ie: c:\test\myfile.txt
string strFilePath = FileUploader.PostedFile.FileName;
//use the System.IO Path.GetFileName method to get specifics about the file without needing to parse the path as a string
string strFileName = Path.GetFileName(strFilePath);
Int32 intFileSize = FileUploader.PostedFile.ContentLength;
string strContentType = FileUploader.PostedFile.ContentType;
//Convert the uploaded file to a byte stream to save to your database. This could be a database table field of type Image in SQL Server
Stream strmStream = FileUploader.PostedFile.InputStream;
Int32 intFileLength = (Int32)strmStream.Length;
byte[] bytUpfile = new byte[intFileLength + 1];
strmStream.Read(bytUpfile, 0, intFileLength);
strmStream.Close();
saveFileToDb(strFileName, intFileSize, strContentType, bytUpfile); // or use FileUploader.SaveAs(Server.MapPath(".") + "filename") to save to the server's filesystem.
lblUploadResult.Text = "Upload Success. File was uploaded and saved to the database.";
}
}
}
catch (Exception err)
{
lblUploadResult.Text = "The file was not updloaded because the following error happened: " + err.ToString();
}
}
else
{
lblUploadResult.Text = "No File Uploaded because none was selected.";
}
}
The Read Timeout/Write Timeout values you see in the debugger are expected: a stream that does not support timeouts (and the posted file's input stream does not) throws InvalidOperationException from those properties, so that by itself is not your error. Try something like this:
using (var fileStream = FileUploader.PostedFile.InputStream)
{
using (var reader = new BinaryReader(fileStream))
{
byte[] bytUpfile = reader.ReadBytes((Int32)fileStream.Length);
// SAVE TO DB...
}
}
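The saveFileToDb method isn't shown; if the target is a varbinary(max) (or legacy image) column in SQL Server, a minimal sketch might look like this. The "MyDb" connection string and the Uploads table with its columns are assumptions.
// Requires System.Data, System.Data.SqlClient and System.Configuration.
// "MyDb" and the Uploads table/columns are assumptions - adjust to your schema.
private void saveFileToDb(string fileName, int fileSize, string contentType, byte[] content)
{
    var connectionString = ConfigurationManager.ConnectionStrings["MyDb"].ConnectionString;
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(
        "INSERT INTO Uploads (FileName, FileSize, ContentType, Content) " +
        "VALUES (@name, @size, @type, @content)", conn))
    {
        cmd.Parameters.AddWithValue("@name", fileName);
        cmd.Parameters.AddWithValue("@size", fileSize);
        cmd.Parameters.AddWithValue("@type", contentType);
        cmd.Parameters.Add("@content", SqlDbType.VarBinary, -1).Value = content; // -1 = varbinary(max)
        conn.Open();
        cmd.ExecuteNonQuery();
    }
}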
I am uploading documents to Azure Blob Storage, which works perfectly, but I want to be able to link an ID to each uploaded document.
Below is my code for uploading the file:
[HttpPost]
public ActionResult Upload(HttpPostedFileBase file)
{
try
{
var path = Path.Combine(Server.MapPath("~/App_Data/Uploads"), file.FileName);
string searchServiceName = ConfigurationManager.AppSettings["SearchServiceName"];
string blobStorageKey = ConfigurationManager.AppSettings["BlobStorageKey"];
string blobStorageName = ConfigurationManager.AppSettings["BlobStorageName"];
string blobStorageURL = ConfigurationManager.AppSettings["BlobStorageURL"];
file.SaveAs(path);
var credentials = new StorageCredentials(searchServiceName, blobStorageKey);
var client = new CloudBlobClient(new Uri(blobStorageURL), credentials);
// Retrieve a reference to a container. (You need to create one using the management portal, or call container.CreateIfNotExists())
var container = client.GetContainerReference(blobStorageName);
// Retrieve a reference to a blob named after the uploaded file.
var blockBlob = container.GetBlockBlobReference(file.FileName);
// Create or overwrite that blob with the contents of the local file.
using (var fileStream = System.IO.File.OpenRead(path))
{
blockBlob.UploadFromStream(fileStream);
}
System.IO.File.Delete(path);
return new JsonResult
{
JsonRequestBehavior = JsonRequestBehavior.AllowGet,
Data = "Success"
};
}
catch (Exception ex)
{
throw;
}
}
I have added a ClientID field to the search index, but I have no idea how to get this value into the index from the upload. This is all still new to me, so a little guidance would be appreciated.
Thanks in advance.
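One way to tie an ID to the uploaded document is to store it as blob metadata right after the upload; an Azure Search blob indexer can surface that metadata and map it onto the ClientID field in your index. A minimal sketch against your Upload action, where clientId is an assumption (pass it in however your app identifies the client):
// clientId is assumed to come from your own request/model.
var blockBlob = container.GetBlockBlobReference(file.FileName);
using (var fileStream = System.IO.File.OpenRead(path))
{
    blockBlob.UploadFromStream(fileStream);
}
// Attach the ID as user-defined metadata so the search indexer can pick it up.
blockBlob.Metadata["ClientID"] = clientId.ToString();
blockBlob.SetMetadata();
In the indexer definition you would then add a field mapping from the ClientID metadata property to the ClientID field of the index.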
I am trying to set the proper content type on almost 2000 images that I uploaded before realizing I had to set their ContentType property. Fortunately I noticed this before moving on from the .png files to other file types.
Here is my method:
private static void ChangeImageTypeInAzureStorage()
{
var client = GetAzureClient();
var blobContainer = client.GetContainerReference("accessibleimages");
var list = blobContainer.ListBlobs().OfType<CloudBlockBlob>().ToList();
if (!list.Any()) return; //log no entries returned
try
{
foreach (var item in list)
{
if (Path.GetExtension(item.Uri.AbsoluteUri) == ".png")
{
item.Properties.ContentType = "image/png";
}
item.SetProperties();
}
}
catch (Exception ex)
{
//log exceptions with your own methods
Console.WriteLine(ex);
}
Console.WriteLine("Done... press a key to end.");
Console.ReadKey();
}
I don't get why nothing is returned to the list. The client and blobContainer are correct, and I had no problem uploading those images to that same container. Needless to say, the method does nothing because the list always has a count of 0.
Any help appreciated.
Well, I found the answer after a lot of googling. The boolean parameter useFlatBlobListing of the ListBlobs method has to be set to true: without flat listing, blobs that sit inside virtual folders are returned as CloudBlobDirectory entries, which OfType&lt;CloudBlockBlob&gt;() filters out, so the list ends up empty.
var list = blobContainer.ListBlobs(null, true).OfType<CloudBlockBlob>().ToList();
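For reference, the corrected method could look like this (same logic, with SetProperties moved inside the extension check so only blobs that were actually changed are updated):
private static void ChangeImageTypeInAzureStorage()
{
    var client = GetAzureClient();
    var blobContainer = client.GetContainerReference("accessibleimages");

    // Flat listing so blobs inside virtual folders come back as CloudBlockBlob.
    var list = blobContainer.ListBlobs(null, true).OfType<CloudBlockBlob>().ToList();
    if (!list.Any()) return; // log: no entries returned

    foreach (var item in list)
    {
        if (Path.GetExtension(item.Uri.AbsoluteUri) != ".png") continue;
        item.Properties.ContentType = "image/png";
        item.SetProperties(); // only touch blobs we actually changed
    }
}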
I want to download the files from a SharePoint document library through code, as there are thousands of files in the library.
I am thinking of creating a console application that I will run on the SharePoint server to download the files. Is this approach correct, or is there a more efficient way to do it?
Any help with code will be highly appreciated.
Like SigarDave said, it's perfectly possible to achieve this without writing a single line of code. But if you really want to code a solution, it would look something like this:
static void Main(string[] args)
{
// Change to the URL of your site
using (var site = new SPSite("http://MySite"))
using (var web = site.OpenWeb())
{
var list = web.Lists["MyDocumentLibrary"]; // Get the library
foreach (SPListItem item in list.Items)
{
if (item.File != null)
{
// Concat strings to get the absolute URL
// to pass to an WebClient object.
var fileUrl = string.Format("{0}/{1}", site.Url, item.File.Url);
var result = DownloadFile(fileUrl, "C:\\FilesFromMyLibrary\\", item.File.Name);
Console.WriteLine(result ? "Downloaded \"{0}\"" : "Error on \"{0}\"", item.File.Name);
}
}
}
Console.ReadKey();
}
private static bool DownloadFile(string url, string dest, string fileName)
{
    try
    {
        using (var client = new WebClient())
        {
            // Change the credentials to a user that has the necessary permissions on the library
            client.Credentials = new NetworkCredential("Username", "Password", "Domain");
            var bytes = client.DownloadData(url); // Download the file content
            using (var file = File.Create(dest + fileName))
            {
                file.Write(bytes, 0, bytes.Length); // Write the file to disk
                return true;
            }
        }
    }
    catch (Exception)
    {
        return false;
    }
}
Another way, without using any scripts, is to open the document library in IE and click Open in File Explorer on the ribbon; you can then drag and drop the files right onto your desktop!