I was trying to convert my Word document to PDF using Spire.Doc and kept running into the exception "A generic error occurred in GDI+"

I am trying to fetch a Word document from different sources via URL/shared link/shared drives and save it to a memory stream. My memory stream gets the data every time, but converting it to PDF via Spire.Doc throws "A generic error occurred in GDI+". This happens only in production, not on localhost. The app is hosted in the Azure cloud.
I added a few logs and found that line 3 (the SaveToStream call) is causing the issue.
fileContentStream and finalStream are the memory streams.
The code which I used is:
Spire.Doc.Document spireDoc = new Spire.Doc.Document();
spireDoc.LoadFromStream(fileContentStream, Spire.Doc.FileFormat.Auto);
spireDoc.SaveToStream(finalStream, Spire.Doc.FileFormat.PDF);

To convert Word to PDF on the Azure cloud using Spire.Doc, use the code below, which enables PS conversion instead of the default GDI+-based path:
Document document = new Document();
document.LoadFromStream(fileContentStream, Spire.Doc.FileFormat.Auto);
ToPdfParameterList tpl = new ToPdfParameterList
{
    UsePSCoversion = true
};
document.SaveToStream(finalStream, tpl);
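Putting the two pieces together, here is a rough end-to-end sketch (the download URL, helper name and method signature are illustrative, not from the original post):

using System.IO;
using System.Net.Http;
using System.Threading.Tasks;
using Spire.Doc;

public static class WordToPdf
{
    // Downloads the Word file into a MemoryStream and converts it to PDF,
    // using the PS conversion flag recommended above for Azure.
    public static async Task<MemoryStream> ConvertAsync(string documentUrl)
    {
        using (var http = new HttpClient())
        using (var fileContentStream = new MemoryStream(await http.GetByteArrayAsync(documentUrl)))
        {
            var finalStream = new MemoryStream();

            var document = new Document();
            document.LoadFromStream(fileContentStream, FileFormat.Auto);

            var tpl = new ToPdfParameterList { UsePSCoversion = true };
            document.SaveToStream(finalStream, tpl);

            finalStream.Position = 0;
            return finalStream;
        }
    }
}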

Related

Why can't Azure Search import JSON blobs?

When importing data using the configuration found below, Azure Cognitive Search returns the following error:
Error detecting index schema from data source: ""
Is this configured incorrectly? The files are stored in the container "example1" and in the blob folder "json". When creating the same index with the same data in the past there were no errors, so I am not sure why it is different now.
Import data:
Data Source: Azure Blob Storage
Name: test-example
Data to extract: Content and metadata
Parsing mode: JSON
Connection string:
DefaultEndpointsProtocol=https;AccountName=EXAMPLESTORAGEACCOUNT;AccountKey=EXAMPLEACCOUNTKEY;
Container name: example1
Blob folder: json
.json file structure:
{
    "string1": "value1",
    "string2": "value2",
    "string3": "value3",
    "string4": "value4",
    "string5": "value5",
    "string6": "value6",
    "string7": "value7",
    "string8": "value8",
    "list1": [
        {
            "nested1": "value1",
            "nested2": "value2",
            "nested3": "value3",
            "nested4": "value4"
        }
    ],
    "FileLocation": null
}
Here is an image of the screen with the error when clicking the "Next: Add cognitive skills (Optional)" button:
To clarify there are two problems:
1) There is a bug in the portal where the actual error message is not showing up for errors, hence we are observing the unhelpful empty string "" as an error message. A fix is on the way and should be rolled out early next week.
2) There is an error when the portal attempts to detect index schema from your data source. It's hard to say what the problem is when the error message is just "". I've tried your sample data and it works fine with importing.
I'll update the post once the fix for displaying the error message is out. In the meantime (again we're flying blind here without the specific error string) here are a few things to check:
1) Make sure your firewall rules allow the portal to read from your blob storage
2) Make sure there are no extra characters inside your JSON files. Check that the whitespace characters are actual whitespace (you should be able to open the file in VSCode and check).
Update: The portal fix for the missing error messages has been deployed. You should be able to see a more specific error message should an error occur during import.
It seems to me that this is a problem related to the list1 data type. Make sure you select "Collection(Edm.String)" for it during index creation.
For more info, please check step 5 of the following link: https://learn.microsoft.com/en-us/azure/search/search-howto-index-json-blobs
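If you define the index outside the portal wizard, the same field shape can be declared explicitly, for example with the Microsoft.Azure.Search .NET SDK (a minimal sketch; the service name, admin key and field list are placeholders I made up):

using Microsoft.Azure.Search;
using Microsoft.Azure.Search.Models;

var serviceClient = new SearchServiceClient("example-search-service", new SearchCredentials("EXAMPLE-ADMIN-KEY"));
var index = new Index
{
    Name = "test-example",
    Fields = new[]
    {
        // Every index needs exactly one key field.
        new Field("id", DataType.String) { IsKey = true },
        new Field("string1", DataType.String),
        // list1 holds multiple values, so it is declared as a collection.
        new Field("list1", DataType.Collection(DataType.String))
    }
};
serviceClient.Indexes.Create(index);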
I have been in contact with Microsoft, and this is a bug in the Azure Portal. The issue is that the connection string wizard does not append the endpoint suffix correctly. They recommended manually pasting the connection string, but this still does not work for me. So this is the answer suggested by Microsoft, but I don't believe it is completely correct, because the portal outputs the same error message:
Error detecting index schema from data source: ""
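For anyone who wants to try that workaround, a manually pasted connection string with the suffix appended would look like this (same placeholder account name and key as above):
DefaultEndpointsProtocol=https;AccountName=EXAMPLESTORAGEACCOUNT;AccountKey=EXAMPLEACCOUNTKEY;EndpointSuffix=core.windows.net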

Getting PDF_VALIDATION_FAILED exception when trying to create envelope in DocuSign sample recipe code

I am trying to run the code available on GitHub.
The issue is that when I try to create an envelope I get an exception saying "PDF_VALIDATION_FAILED".
Can anyone help me out with this issue?
Really, you should create a new issue to log the new error you are getting, or modify your original post. In any case, the issue is most likely the file extension: the default is pdf, so if you want to send a document in a different format you can do the following:
// Add a document to the envelope
Document doc = new Document();
doc.DocumentBase64 = System.Convert.ToBase64String(fileBytes);
doc.Name = Path.GetFileName("/PATH/TO/DOC/TEST.DOCX");
doc.DocumentId = "1";
doc.FileExtension = "docx";
I managed to get a fix for this one. Apparently the file I was uploading was corrupt. However, now I am getting a different error, 'UNABLE_TO_LOAD_DOCUMENT', when I try to upload a file in any format other than PDF.
Can anyone help me with this query? Also, which file formats does DocuSign support?
Also, one of the previous libraries, 'DocuSign.Integrations.Client', seems to work fine with Word document uploads. Should it be used instead of 'DocuSign.eSign.Api', 'DocuSign.eSign.Client' and 'DocuSign.eSign.Model'?
This is in response to the second error you mentioned:
I think you are missing the assignment of FileExtension to the required format,
something like: doc.FileExtension = "docx"
Once you do that, you will get rid of the UNABLE_TO_LOAD_DOCUMENT error and the document can be sent successfully.
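A small helper along those lines (a sketch; the helper name and path handling are mine, not from the thread) that derives FileExtension from the file being sent:

using System;
using System.IO;
using DocuSign.eSign.Model;

public static class EnvelopeDocuments
{
    // Builds an eSign Document whose FileExtension always matches the source file,
    // so non-PDF formats are not rejected with UNABLE_TO_LOAD_DOCUMENT.
    public static Document Build(string filePath, string documentId)
    {
        byte[] fileBytes = File.ReadAllBytes(filePath);
        return new Document
        {
            DocumentBase64 = Convert.ToBase64String(fileBytes),
            Name = Path.GetFileName(filePath),
            DocumentId = documentId,
            FileExtension = Path.GetExtension(filePath).TrimStart('.').ToLowerInvariant()
        };
    }
}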

Core Data/iCloud seeding with local XML file throwing errors in iOS 8

Hopefully this is something simple, but I haven't been able to track down a fix yet. I have an application in which I'm trying to implement both iCloud and Core Data. I'd like it to run on iOS 7 and iOS 8.
The application is a checklist/tableview application for collectibles.
Essentially, the application has a pre-seeded XML file with about 50,000 items in it. The SQLite/Core Data store is initially configured to have just 1 item. Users can, from a table view, select groups to add to the Core Data store (so that not all 50,000 items are included). When the user selects a group that has 1-50 items, it parses the XML for those items and writes them into the Core Data store. When a user selects a group with a larger number of items, it parses and adds them, but also throws some random "no document at URL" errors during the parsing process. The application doesn't crash, and all items seem to be added, but the application stops syncing with iCloud. The exact error is:
__45-[PFUbiquityFilePresenter processPendingURLs]_block_invoke(439): CoreData: Ubiquity:
Librarian returned a serious error for starting downloads Error Domain=BRCloudDocsErrorDomain Code=5
"The operation couldn’t be completed. (BRCloudDocsErrorDomain error 5 - No document at URL)"
UserInfo=0x7fd7f54abea0 {NSDescription=No document at URL,
NSFilePath=/Users/zacharyfisher/Library/Developer/CoreSimulator/Devices/4B70FCFC-4704-4C83-B848- 0D52D833E28A/data/Library/Mobile Documents/iCloud~com~xxxxx~xxxxxxx/CoreData/iCloud/nobody~sim43DA22C4-427B-5FCD-9B61-90CE79638F6B/iCloud/PZbSJk1f2RNB6ucDj0Y6VqL1KgXYAxi4LcApXONjvnQ=/C45FA553-6CA0-4C26-845B-B478EF7EAD60.1.cdt,
NSUnderlyingError=0x7fd7f54aa200 "The operation couldn’t be completed. No such file or directory"}
with userInfo {
NSDescription = "No document at URL";
NSFilePath = "/Users/zacharyfisher/Library/Developer/CoreSimulator/Devices/4B70FCFC-4704-4C83-B848-0D52D833E28A/data/Library/Mobile Documents/iCloud~com~xxxxxxx~xxxxxxx/CoreData/iCloud/nobody~sim43DA22C4-427B-5FCD-9B61-90CE79638F6B/iCloud/PZbSJk1f2RNB6ucDj0Y6VqL1KgXYAxi4LcApXONjvnQ=/C45FA553-6CA0-4C26-845B-B478EF7EAD60.1.cdt";
NSUnderlyingError = "Error Domain=NSPOSIXErrorDomain Code=2 \"The operation couldn\U2019t be completed.
No such file or directory\" UserInfo=0x7fd7f5433240 {NSDescription=No such file or directory}";
} for these urls: (
"file:///Users/zacharyfisher/Library/Developer/CoreSimulator/Devices/4B70FCFC-4704-4C83-B848-0D52D833E28A/data/Library/Mobile%20Documents/iCloud~com~xxxxxxx~xxxxxxx/CoreData/iCloud/nobody~sim43DA22C4-427B-5FCD-9B61-90CE79638F6B/iCloud/PZbSJk1f2RNB6ucDj0Y6VqL1KgXYAxi4LcApXONjvnQ=/C45FA553-6CA0-4C26-845B-B478EF7EAD60.1.cdt"
)
Then I will get a "move" error as well (sometimes after the parse is complete):
[PFUbiquityTransactionLog moveFileToPermanentLocationWithError:](761): CoreData: Ubiquity:
CoreData: Ubiquity: Error writing export log to file: file:///Users/zacharyfisher/Library/Developer/CoreSimulator/Devices/4B70FCFC-4704-4C83-B848-0D52D833E28A/data/Library/Mobile%20Documents/iCloud~com~xxxxxxx~xxxxxxx/CoreData/iCloud/nobody~sim43DA22C4-427B-5FCD-9B61-90CE79638F6B/iCloud/PZbSJk1f2RNB6ucDj0Y6VqL1KgXYAxi4LcApXONjvnQ=/ABE37211-02B7-4F20-B631-B5D91B23E9BE.1.cdt
error: Error Domain=NSCocoaErrorDomain Code=516 "The operation couldn’t be completed. (Cocoa error 516.)"
UserInfo=0x7fd7f49cdfd0 {NSSourceFilePathErrorKey=/Users/zacharyfisher/Library/Developer/CoreSimulator/Devices/4B70FCFC-4704-4C83-B848-0D52D833E28A/data/Library/Mobile Documents/iCloud~com~xxxxxx~xxxxxxxx/CoreData/iCloud/nobody~sim43DA22C4-427B-5FCD-9B61-90CE79638F6B/tempLogs.nosync/iCloud/PZbSJk1f2RNB6ucDj0Y6VqL1KgXYAxi4LcApXONjvnQ=/ABE37211-02B7-4F20-B631-B5D91B23E9BE.1.cdt,
NSUserStringVariant=(
Move
), NSDestinationFilePath=/Users/zacharyfisher/Library/Developer/CoreSimulator/Devices/4B70FCFC-4704-4C83-B848-0D52D833E28A/data/Library/Mobile Documents/iCloud~com~xxxxxx~xxxxxxx/CoreData/iCloud/nobody~sim43DA22C4-427B-5FCD-9B61-90CE79638F6B/iCloud/PZbSJk1f2RNB6ucDj0Y6VqL1KgXYAxi4LcApXONjvnQ=/ABE37211-02B7-4F20-B631-B5D91B23E9BE.1.cdt,
NSFilePath=/Users/zacharyfisher/Library/Developer/CoreSimulator/Devices/4B70FCFC-4704-4C83-B848-0D52D833E28A/data/Library/Mobile Documents/iCloud~com~xxxxxx~xxxxxx/CoreData/iCloud/nobody~sim43DA22C4-427B-5FCD-9B61-90CE79638F6B/tempLogs.nosync/iCloud/PZbSJk1f2RNB6ucDj0Y6VqL1KgXYAxi4LcApXONjvnQ=/ABE37211-02B7-4F20-B631-B5D91B23E9BE.1.cdt,
NSUnderlyingError=0x7fd7f497f430 "The operation couldn’t be completed. File exists"}
userInfo: {
NSDestinationFilePath = "/Users/zacharyfisher/Library/Developer/CoreSimulator/Devices/4B70FCFC-4704-4C83-B848-0D52D833E28A/data/Library/Mobile Documents/iCloud~com~xxxxxx~xxxxxx/CoreData/iCloud/nobody~sim43DA22C4-427B-5FCD-9B61-90CE79638F6B/iCloud/PZbSJk1f2RNB6ucDj0Y6VqL1KgXYAxi4LcApXONjvnQ=/ABE37211-02B7-4F20-B631-B5D91B23E9BE.1.cdt";
NSFilePath = "/Users/zacharyfisher/Library/Developer/CoreSimulator/Devices/4B70FCFC-4704-4C83-B848-0D52D833E28A/data/Library/Mobile Documents/iCloud~com~xxxxxxx~xxxxxxx/CoreData/iCloud/nobody~sim43DA22C4-427B-5FCD-9B61-90CE79638F6B/tempLogs.nosync/iCloud/PZbSJk1f2RNB6ucDj0Y6VqL1KgXYAxi4LcApXONjvnQ=/ABE37211-02B7-4F20-B631-B5D91B23E9BE.1.cdt";
NSSourceFilePathErrorKey = "/Users/zacharyfisher/Library/Developer/CoreSimulator/Devices/4B70FCFC-4704-4C83-B848-0D52D833E28A/data/Library/Mobile Documents/iCloud~com~xxxxxxx~xxxxxxx/CoreData/iCloud/nobody~sim43DA22C4-427B-5FCD-9B61-90CE79638F6B/tempLogs.nosync/iCloud/PZbSJk1f2RNB6ucDj0Y6VqL1KgXYAxi4LcApXONjvnQ=/ABE37211-02B7-4F20-B631-B5D91B23E9BE.1.cdt";
NSUnderlyingError = "Error Domain=NSPOSIXErrorDomain Code=17 \"The operation couldn\U2019t be completed. File exists\"";
NSUserStringVariant = (
Move
);
}
Any thoughts on how to fix this? Am I trying to make too many changes at once, and that is breaking the Core Data/iCloud syncing? Any thoughts or pointers would be appreciated.
Zack
Zack, this isn't an answer (I don't have the rep to comment), but it might help you get on the right track. I'm implementing iCloud core data and am running into what looks like the same problem with iOS 8... same "No document at URL" error (no move error for me), and same breakdown in data sync. Two observations:
When I am running my app on two devices, core data sync initially works very well... like for a few minutes and a few updates. Then I get the "no document at URL" error.
My .sqlite database is very small and the updates I'm trying to make are modest (e.g. adding a single new entity), so I don't think file sizes or update complexity are factors.
On the device where these errors are logging, the store stops importing changes from iCloud. But changes I make on that device do continue to persist to the other device. So the effect is like a one-way breakdown.
Hope this helps. Would appreciate posts on any progress you make. I've been wrestling with this for several weeks and am close to giving up and shipping the app (which is universal) without iCloud data sync.

Azure Table Storage access time - inserting/reading from

I'm making a program that stores data from CSV files into Azure tables and reads it back. The CSV files can have a varying number of columns and between 3k and 50k rows. What I need to do is upload that data into an Azure table. So far I have managed to both upload data and retrieve it.
I'm using the REST API, and for uploading I'm creating an XML batch request with 100 rows per request. That works fine, except it takes a bit too long to upload, e.g. for 3k rows it takes around 30 seconds. Is there any way to speed that up? I noticed that most of the time is spent processing the response (the ReadToEnd() call). I read somewhere that setting the proxy to null could help, but it doesn't do much in my case.
I also found somewhere that it is possible to upload the whole XML request to a blob and then execute it from there, but I couldn't find any example of doing that.
using (Stream requestStream = request.GetRequestStream())
{
requestStream.Write(content, 0, content.Length);
}
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
{
Stream dataStream = response.GetResponseStream();
using (var reader = new StreamReader(dataStream))
{
String responseFromServer = reader.ReadToEnd();
}
}
As for retrieving data from Azure tables, I managed to get 1000 entities per request; it takes me around 9 seconds for a CSV with 3k rows. Again, most of the time is spent reading from the stream, in this part of the code (again ReadToEnd()):
response = request.GetResponse() as HttpWebResponse;
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
{
string result = reader.ReadToEnd();
}
Any tips?
As you mentioned you are using the REST API, you may have to write extra code and rely on your own methods to implement performance improvements, unlike when using the client library. In your case, using the Storage Client Library would be best, as you can use already built features to expedite insert, upsert, etc., as described here.
However, if you were using the Storage Client Library and ADO.NET, you could use the article below, written by the Windows Azure Table team, as the supported way to improve Azure access performance:
.NET and ADO.NET Data Service Performance Tips for Windows Azure Tables
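For illustration, a minimal batch-insert sketch with the classic Storage Client Library (Microsoft.WindowsAzure.Storage); the entity type and table name are placeholders, and note that an entity group transaction holds at most 100 operations, all sharing the same PartitionKey:

using System.Collections.Generic;
using System.Net;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

public class CsvRowEntity : TableEntity
{
    public CsvRowEntity() { }
    public CsvRowEntity(string partitionKey, string rowKey) : base(partitionKey, rowKey) { }
    public string Data { get; set; }
}

public static class TableUploader
{
    // Inserts entities (all sharing one PartitionKey) in batches of up to 100.
    public static void BatchInsert(string connectionString, string tableName, IEnumerable<CsvRowEntity> entities)
    {
        // Client-side tweaks that often help when issuing many small HTTP requests.
        ServicePointManager.Expect100Continue = false;
        ServicePointManager.UseNagleAlgorithm = false;
        ServicePointManager.DefaultConnectionLimit = 100;

        CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
        CloudTable table = account.CreateCloudTableClient().GetTableReference(tableName);
        table.CreateIfNotExists();

        var batch = new TableBatchOperation();
        foreach (var entity in entities)
        {
            batch.Insert(entity);
            if (batch.Count == 100)
            {
                table.ExecuteBatch(batch);
                batch = new TableBatchOperation();
            }
        }
        if (batch.Count > 0)
            table.ExecuteBatch(batch);
    }
}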

File upload with metadata using SharePoint Web Services

I am trying to upload a file with metadata using SharePoint Web Services. The first approach I took was to use the WebRequest/WebResponse objects and then update the metadata using the Lists.asmx UpdateListItems method. This works just fine, but it creates two versions of the file. The second approach was to use the Copy.asmx web service and its CopyIntoItems method, which copies the file data along with the metadata. This works fine and creates v1.0, but when I try to upload the same file with some changes to the metadata (using Copy.asmx), it does not update anything. Has anybody come across the same issue, or does anyone have other ideas for implementing the required functionality?
Thanks,
Kiran
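For reference, the CopyIntoItems call looks roughly like this (a trimmed sketch; the server URL, file path and field values are placeholders, and the proxy types come from a web reference to /_vti_bin/copy.asmx):

Copy copyService = new Copy();
copyService.Url = "http://server/site/_vti_bin/copy.asmx";
copyService.Credentials = System.Net.CredentialCache.DefaultCredentials;

byte[] fileBytes = System.IO.File.ReadAllBytes(@"C:\report.docx");
string destinationUrl = "http://server/site/Shared Documents/report.docx";

// The metadata travels with the file in the same call.
FieldInformation titleField = new FieldInformation
{
    DisplayName = "Title",
    InternalName = "Title",
    Type = FieldType.Text,
    Value = "Quarterly report"
};

CopyResult[] results;
copyService.CopyIntoItems(destinationUrl, new[] { destinationUrl },
    new[] { titleField }, fileBytes, out results);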
This might be a bit off topic (sorry), but I'd like to point you to a real timesaving shortcut when working with SharePoint remotely: http://www.bendsoft.com/net-sharepoint-connector/
It enables you to work with SharePoint lists and document libraries with SQL and stored procedures.
Uploading a file as a byte array
...
string sql = "CALL UPLOAD('Shared Documents', 'Images/Logos/mylogo.png', #doc)";
byte[] data = System.IO.File.ReadAllBytes("C:\\mylogo.png");
SharePointCommand cmd = new SharePointCommand(sql, myOpenConnection);
cmd.Parameters.Add("#doc", data);
cmd.ExecuteNonQuery();
...
Upload stream input
using (var fs = System.IO.File.OpenRead("c:\\150Mb.bin")) {
string sql = "CALL UPLOAD('Shared Documents', '150Mb.bin', #doc)";
SharePointCommand cmd = new SharePointCommand(sql, myOpenConnection);
cmd.Parameters.Add("#doc", fs);
cmd.ExecuteNonQuery();
}
There are quite a few methods to simplify remote document management:
UPLOAD(listname, filename, data)
DOWNLOAD(listname, filename)
MOVE(listname1, filename1, listname2, filename2)
COPY(listname1, filename1, listname2, filename2)
RENAME(listname, filename1, filename2)
DELETE(listname, filename)
CREATEFOLDER(listname, foldername)
CHECKOUT(list, file, offline, lastmodified)
CHECKIN(list, file, comment, type)
UNDOCHECKOUT(list, file)
Cheers
