SharePoint ItemAdded and SPFile.OpenBinary(), zero bytes

I have an event receiver tied to a SharePoint 2010 picture library. When a picture is uploaded, I want to open it for processing. Files uploaded with the web interface work fine, but files copied via Windows Explorer view return zero bytes. Simplified code below.
public override void ItemAdded(SPItemEventProperties properties)
{
    SPListItem item = properties.ListItem;
    SPFile file = item.File;
    byte[] buffer = file.OpenBinary(); // buffer has zero bytes for files copied in Windows Explorer!
}
If I insert a delay before opening, it works.
public override void ItemAdded(SPItemEventProperties properties)
{
    SPListItem item = properties.ListItem;
    SPFile file = item.File;
    System.Threading.Thread.Sleep(2000);
    byte[] buffer = file.OpenBinary(); // buffer now populated correctly
}
But I thought that ItemAdded was only called after everything was done, including the file upload.
I also found that file.CanOpenFile(true) always returns true, whether or not OpenBinary works.
How can I make sure the file is ready to open before I call OpenBinary()?
I don't like the Thread.Sleep solution, because I'm sure larger files or a busier server would require a longer wait. The time required can't be predicted, and I don't want to loop and retry forever.
Update: I originally thought the failure to open was caused by larger files. The question has been updated to reflect the Explorer view as the cause. I also find that a Windows Explorer copy triggers ItemUpdated as well (twice), and I am able to open the file there. It's a little messy to have three event firings and two handlers to do one thing, so I am still open to suggestions.
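For now the best I can come up with is a bounded retry instead of one fixed sleep. This is only a rough sketch of the idea (the attempt count and backoff are arbitrary values, and re-fetching the item each pass via GetItemById mirrors the answer below):
public override void ItemAdded(SPItemEventProperties properties)
{
    const int maxAttempts = 10; // bounded, so we never loop forever

    for (int attempt = 1; attempt <= maxAttempts; attempt++)
    {
        // Re-fetch the item so we get a fresh SPFile each pass; a stale
        // reference seems to keep returning the original empty content.
        SPListItem item = properties.List.GetItemById(properties.ListItemId);
        SPFile file = item.File;

        if (file.Exists && file.Length > 0)
        {
            byte[] buffer = file.OpenBinary();
            // ... process buffer ...
            return;
        }

        System.Threading.Thread.Sleep(500 * attempt); // simple backoff
    }
    // maxAttempts exhausted: the file never got any content, give up.
}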

I just encountered this issue today on SharePoint 2013. I've taken the suggestions listed here and improved upon them.
It's fine making the thread sleep for 2 seconds, but what happens when you have a large file? You're going to run into the same issue.
My code fix is as follows:
//Check if the SPContext is null, since Explorer View uploads run outside the web context
if (SPContext.Current == null)
{
    //If the initial file length is 0, pause the thread for 2 seconds
    if (properties.ListItem.File.Length == 0)
    {
        System.Threading.Thread.Sleep(2000);

        //Since our item exists, run GetItemById to instantiate a new and updated SPListItem object
        var spFile = properties.List.GetItemById(properties.ListItemId);

        //SharePoint places an exclusive lock on the file while the data is being loaded into it
        while (spFile.File.LockType != SPFile.SPLockType.None)
        {
            System.Threading.Thread.Sleep(2000);
            spFile = properties.List.GetItemById(properties.ListItemId);

            //We need to check if the file still exists, otherwise this loops forever if someone cancels the upload
            if (!spFile.File.Exists)
                return;
        }

        //If someone thought it was a good idea to actually upload a 0-byte file, don't do anything else
        if (spFile.File.Length == 0)
            return;
    }
}

I face the same problem with SP2010 and SP2013. Any idea how to solve this?
Somehow this has something to do with bigger files. Small files work without any problems; bigger files (400 KB) don't always work.
I have only one hint. If you copy & paste a file to the library via Windows Explorer (WebDAV), the event handler (ItemAdded) fires as soon as the file is created. But this doesn't mean the file is already filled with data. I saw this once: my debugger hit my breakpoint while Windows was still busy with the copy process.
It would be great to know when the copy process is finished. I thought I could just call spFile.OpenBinary(), and if the result is empty, wait 2 seconds and try again until it returns more than 0 bytes. But this doesn't work! It only works if you wait BEFORE calling OpenBinary() the first time; every later call to OpenBinary() returns the same empty result.

Related

I am having problems trying to display an RTF data file in a rich text edit control in an MFC dialog

I saw a discussion here about displaying an RTF file in a rich text edit control. Maybe what I am trying to do is too much.
In my dialog class I define a static method:
static DWORD CALLBACK MyStreamInCallback(DWORD dwCookie, LPBYTE pbBuff, LONG cb, LONG* pcb)
{
    std::ifstream* pFile = (std::ifstream*)dwCookie;
    pFile->read((char*)pbBuff, cb);
    *pcb = (LONG)pFile->gcount(); // report how many bytes were actually read
    return 0;
}
In my dialog's OnInitDialog I try to display the data:
std::ifstream File("d:\\RevisionHistoryTest.rtf");
EDITSTREAM es = { 0 };
es.dwCookie = (DWORD)&File;
es.pfnCallback = MyStreamInCallback;
::SendMessage(m_rtfEdit, EM_STREAMIN, SF_RTF, (LPARAM)&es);
Now, here is a link to a sample project. I don't know where I can put this project in the long term, but my DropBox will do for now. The project does not include the RTF file. This is how I created it:
I went to the following URL in a browser.
I selected all the Revision History content and pasted it into a Microsoft Word file.
I copied the Microsoft Word content and pasted it into a WordPad session and saved it.
Interestingly, when I subsequently open my RTF file in WordPad I get a popup message warning about the file's content, with an option to unblock it.
If I select Unblock then it still opens in the editor. I assume that is because all of the images are still linked to those on my website. And I think this is related to the issue with my test project, because when I stream the file in I get no errors or anything. It just reads the first line and stops.
I am trying to find the easiest way to display my HTML history in a RTF window.
My original intention was to use a CHtmlView control instead (it makes sense to do that) and directly read my Revision History file from the internet. But my help system is designed to permanently show the contents pane on the left. This is why I thought that RTF might be a suitable alternative, but I am struggling with it.
Update
Based on the comments made about 64-bit builds, I located this tutorial, which works for both 32-bit and 64-bit builds. Both still display "Revision History" only.
BOOL FillRichEditFromFile(HWND hwnd, LPCTSTR pszFile)
{
    BOOL fSuccess = FALSE;
    HANDLE hFile = CreateFile(pszFile, GENERIC_READ,
                              FILE_SHARE_READ, 0, OPEN_EXISTING,
                              FILE_FLAG_SEQUENTIAL_SCAN, NULL);
    if (hFile != INVALID_HANDLE_VALUE)
    {
        EDITSTREAM es = { 0 };
        es.pfnCallback = MyStreamInCallback;
        es.dwCookie = (DWORD_PTR)hFile;
        if (SendMessage(hwnd, EM_STREAMIN, SF_RTF, (LPARAM)&es) && es.dwError == 0)
        {
            fSuccess = TRUE;
        }
        CloseHandle(hFile);
    }
    return fSuccess;
}
And in OnInitDialog:
FillRichEditFromFile(m_rtfEdit.GetSafeHwnd(), _T("d:\\RevisionHistoryTest.rtf"));
But my initial issue still remains.
Update
I had forgotten to set the control to multiline! That was part of the issue.
At least all of the text is visible now, just not the images. And I don't like the way some of the links are displayed.
I have been able to edit the file and bring the indents over to the left to make it look better, but the images still won't show.
Update
As a workaround I realised that I could simply duplicate my HTML Revision History as a standalone page. Then I can use CHtmlView.
The benefit is that the display is consistent with what the user sees in the help system.

FileSystemWatcher reports file available on network share but file cannot be found

BACKGROUND
I have a server that has a shared folder \\Server\Share with 4 subfolders:
OutboundFinal
OutboundStaging
InboundFinal
InboundStaging
All folders reside on the same physical disk and partition, no junction points used.
I also have several WinForms clients (up to 10) that write and read files to this share; each client works on multiple threads (up to 5). Files are written by the clients (up to 50 threads altogether) into the \\Server\Share\OutboundStaging folder. Each file has the name of a GUID, so there's no overwriting. Once a file is completely written, it is moved by the client to the \\Server\Share\OutboundFinal folder. A Windows service running on the same server will pick it up, delete it, process it, then write the file with the same name into the \\Server\Share\InboundStaging folder. Once the file is completely written, it is moved to the \\Server\Share\InboundFinal folder by the service.
This \\Server\Share\InboundFinal folder is monitored by each thread of each WinForms client using FileSystemWatcher.WaitForChanged(WatcherChangeTypes.Changed | WatcherChangeTypes.Created, timeOut);
The FileSystemWatcher.Filter is set to the GUID filename of the file a certain thread expects to see in the \\Server\Share\InboundFinal folder, so the FileSystemWatcher waits until a specific file shows up in the folder.
I have read several SO questions about FileSystemWatcher behaving erratically and not reporting changes on UNC shares. This is however not the case for me.
The code I use looks like this:
FileSystemWatcher fileWatcher = new FileSystemWatcher();
fileWatcher.Path = InboundFinalFolder;
fileWatcher.Filter = GUIDFileName; // contains full UNC path AND the file name
fileWatcher.EnableRaisingEvents = true;
fileWatcher.IncludeSubdirectories = false;
var res = fileWatcher.WaitForChanged(WatcherChangeTypes.Changed | WatcherChangeTypes.Created, timeOut);
if (!fileWatcher.TimedOut)
{
using (FileStream stream = fi.Open(FileMode.Open, FileAccess.Read, FileShare.Read)) {
byte[] res = new byte[stream.Length];
stream.Read(res, 0, stream.Length);
return res;
}
It's the using line that throws the exception.
THE PROBLEM
I would assume that fileWatcher.WaitForChanged would return only once the file with the proper GUID name is in the \\Server\Share\InboundFinal folder. This is exactly how FileSystemWatcher works on local folders, but not with file shares accessed over the network (local files, even accessed via a share, also tend to work). FileSystemWatcher reports that the file the thread is waiting for is in the \\Server\Share\InboundFinal folder. However, when I try to read the file, I get a FileNotFoundException. The reading thread has to wait 3-15 seconds before the file can be read. I try to open the file with a FileStream with Read sharing.
What could cause this behavior? How do I work around it? Ideally, FileSystemWatcher.WaitForChanged(WatcherChangeTypes.Changed | WatcherChangeTypes.Created, timeOut); should only continue execution if the file can be read or a timeout happens.
The FileSystemWatcher has a bad reputation, but actually, it is not that bad...
1.)
Your code sample does not compile. I tried this:
FileSystemWatcher fileWatcher = new FileSystemWatcher();
fileWatcher.Path = "X:\\temp";
fileWatcher.Filter = "test.txt";
fileWatcher.EnableRaisingEvents = true;
fileWatcher.IncludeSubdirectories = false;
var res = fileWatcher.WaitForChanged(WatcherChangeTypes.Changed |
                                     WatcherChangeTypes.Created, 20000);
if (!res.TimedOut)
{
    FileInfo fi = new FileInfo(Path.Combine(fileWatcher.Path, res.Name));
    using (FileStream stream = fi.Open(FileMode.Open, FileAccess.Read, FileShare.Read))
    {
        byte[] buf = new byte[stream.Length];
        stream.Read(buf, 0, (int)stream.Length);
    }
    Console.WriteLine("read ok");
}
else
{
    Console.WriteLine("time out");
}
I tested this where X: is a SMB share. It worked without problems (for me, see below).
But:
You should open / read the file with retries (sleeping for 100 ms after every unsuccessful open). This is because you may run into a situation where the FileSystemWatcher detects a file, but the move (or another write operation) has not yet finished, so you have to wait until the file create / move is really done.
Or you do not wait for the "real" file but for a flag file which the file-move task creates after closing the "real" file.
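A minimal sketch of such a retry loop (the 50-attempt cap and the 100 ms sleep are arbitrary values, not anything FileSystemWatcher prescribes):
static FileStream OpenWithRetries(string path, int maxAttempts = 50)
{
    for (int attempt = 1; ; attempt++)
    {
        try
        {
            // Throws FileNotFoundException / IOException while the file is
            // still being moved or written by the other process.
            return new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read);
        }
        catch (IOException)
        {
            if (attempt >= maxAttempts)
                throw; // still not readable after ~5 seconds, give up
            System.Threading.Thread.Sleep(100);
        }
    }
}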
2.)
Could it be that the move task did not close the file correctly?
3.)
Some years ago I had some tools (written in Perl) where one script created a flag file and another script waited for it.
I had some nasty problems on an SMB 2 share. I found out that this was due to SMB client-side caching.
See
https://bogner.sh/2014/10/how-to-disable-smb-client-side-caching/
File open fails initially when trying to open a file located on a win2k8 share but eventually can succeeed
https://technet.microsoft.com/en-us/library/ff686200.aspx
Try this (on the client):
Windows Registry Editor Version 5.00
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\services\LanmanWorkstation\Parameters]
"DirectoryCacheLifetime"=dword:00000000
"FileNotFoundCacheLifetime"=dword:00000000
Save this as disablecache.reg and run regedit disablecache.reg
Then reboot.

setCacheDuration on Wicket DownloadLink

I am currently using a DownloadLink in Wicket to let a user download a generated Excel file, which is deleted after the download. When this is done over SSL, IE gives me an error:
"Unable to download.
Internet Explorer was unable to open this site. The requested site is either unavailable or cannot be found. Please try again later."
After doing some reading from this Microsoft support link:
http://support.microsoft.com/kb/323308
it seems it is because the download happens over SSL and I have:
response.setHeader("Cache-Control", "no-cache, max-age=0, must-revalidate, no-store");
I set my downloadLink like so:
private void setupDownloadLink()
{
    IModel excelFileModel = new AbstractReadOnlyModel()
    {
        public Object getObject()
        {
            return excelCreator();
        }
    };
    auditDownloadlink = new DownloadLink("auditDownloadlink", excelFileModel);
    auditDownloadlink.setOutputMarkupPlaceholderTag(true);
    auditDownloadlink.setDeleteAfterDownload(true);
    auditDownloadlink.setCacheDuration(Duration.NONE);
    auditDownloadlink.setVisible(false);
    findUserForm.add(auditDownloadlink);
}
However, it seems to work if I do: auditDownloadlink.setCacheDuration(Duration.Minute);
I guess I am confused about what is happening with this. Does setCacheDuration mean how long the file will be available after it is created? Or does it mean how long in total the file will be available from the moment it is declared?
Inside the excelCreator() method I call File excelfile = new File("Access.xls"); and then go ahead and do all of the Excel work and create the spreadsheet. At the end of the method I call:
FileOutputStream output = new FileOutputStream(excelfile);
workbook.write(output);
output.close();
Will the duration I set start from the moment I call File excelfile = new File("ssaUserIDAccess.xls")?
What is the best duration and setup I should use for this scenario? The files can get quite huge and can take some time to create.
Thanks!
I do not remember the reason, but we had the same problem with SSL/IE, and we just set the cache duration to 1 second, which is enough. It just cannot be NONE. We never found another solution. (As far as I understand, setCacheDuration only controls the caching headers Wicket sends with the response, not how long the file exists on the server.)
auditDownloadlink.setCacheDuration(Duration.ONE_SECOND)

Core Data code and multithreading

The following code is the fetch-data (fill data for the first time) part of my tableViewController. I am using a UIManagedDocument's managedObjectContext to pre-populate my database. The source is an array that I clean up from my TXT file, which sits directly in Xcode's resources folder. After this creation, I handle the document cases: closed / open and normal.
The following code inputs and fetches my data onto the table correctly with a fetched results controller request. However, while the data is loading in the thread that is meant to free the UI from this one-time, heavy data creation (26854 object names written into the managedObject.name attribute), the table view and my UI are frozen, which I think is the time spent populating document.managedObjectContext for the first time.
After 10-15 seconds the data is loaded and shows correctly. However, when I stop the simulator and restart the app in the simulator, the table view shows empty, although I save the document as seen in the code below and I use the same fetched results controller setup (and request). The table view is at least scrollable in this case. The document state shows open and normal at this stage, and the file path is the same; I checked. It seems like neither autosave nor the explicit saveForOverwriting I use works. Or is it something else? I tried a lot of things and I'll go crazy soon. I think it has something to do with my multithreading.
self.managedObjectNames is the array property in the table view, and I set it from the TXT file during my table view's loadView:
Is there anybody out there who can spot the mistake here? Is it that I pass self.managedObjectNames into the entity-creation category method?
Thanks!
- (void)fetchDataIntoDocument:(UIManagedDocument *)document {
    dispatch_queue_t fetchQ = dispatch_queue_create("Data fetcher", NULL);
    dispatch_async(fetchQ, ^{
        [document.managedObjectContext performBlock:^{
            for (int i = 0; i < 26854; i++) {
                [managedObject managedObjectWithId:[NSNumber numberWithInt:i] andArray:self.managedObjectNames inManagedObjectContext:document.managedObjectContext];
            }
            // NSLog(@"Save baby!!?");
            [document saveToURL:document.fileURL forSaveOperation:UIDocumentSaveForOverwriting completionHandler:nil];
        }];
    });
    dispatch_release(fetchQ);
}
The reason why your UI is blocked for 10-15 seconds is that document.managedObjectContext has been created with NSMainQueueConcurrencyType. That means the performBlock: method will be executed on the main queue.
Creating the fetchQ in your code serves no purpose as written. It would make sense if fetching the data took a considerable amount of time while adding it were fast (e.g. creating/modifying only a few objects):
dispatch_async(fetchQ, ^{
    // fetch data here (e.g. fetchAttribute may take a few seconds)
    NSString *attribute = fetchAttribute();
    [document.managedObjectContext performBlock:^{
        MyObject *o;
        o = [NSEntityDescription insertNewObjectForEntityForName:@"MyObject"
                                          inManagedObjectContext:document.managedObjectContext];
        o.myAttribute = attribute;
    }];
});
However I don't know answer to your main question.

Why does my SharePoint workflow fail when the client is running Vista or Windows 7?

I have a similar situation to this question.
I have a custom sequential SharePoint workflow, developed in Visual Studio 2008. It is associated with an InfoPath form submitted to a form library. It is configured to start automatically when an item is created.
It works sometimes. Sometimes it just fails to start.
Just like the question linked above, I checked in the debugger, and the issue is that the InfoPath fields published as columns in the library are empty when the workflow fires. (I access the fields with workflowProperties.Item["fieldName"].) But there appears to be a race condition, as those fields actually show up in the library view, and if I terminate the failed workflow and restart it manually, it works fine!
After a lot of head-scratching and testing, I've determined that the workflow will start successfully if the user is running any version of IE on Windows XP, but it fails if the same user submits the same form data from a Vista or Windows 7 client machine.
Does anyone have any idea why this is happening?
I have used another solution, which waits only until the InfoPath property is available (or 60 seconds at most):
public SPWorkflowActivationProperties workflowProperties =
    new SPWorkflowActivationProperties();

private void onOrderFormWorkflowActivated_Invoked(object sender, ExternalDataEventArgs e)
{
    SPListItem workflowItem;
    workflowItem = workflowProperties.List.GetItemById(workflowProperties.ItemId);

    int waited = 0;
    int maxWait = 60000; // Max wait time in ms
    while (workflowItem["fieldName"] == null && (waited < maxWait))
    {
        System.Threading.Thread.Sleep(1);
        waited++;
        workflowItem = workflowProperties.List.GetItemById(workflowProperties.ItemId);
    }

    // For testing: write the delay time into a Workflow History event
    SPWorkflow.CreateHistoryEvent(
        workflowProperties.Web,
        workflowProperties.WorkflowId,
        (int)SPWorkflowHistoryEventType.WorkflowComment,
        workflowProperties.OriginatorUser, TimeSpan.Zero,
        waited.ToString() + " ms", "Waiting time", "");
}
In the code above, workflowProperties.Item will never see the InfoPath property; workflowProperties.List.GetItemById(workflowProperties.ItemId) will, after some delay.
This occurs because Vista/7 saves InfoPath forms through WebDAV, whereas XP uses another protocol (sorry, I can't remember which one at the moment). SharePoint catches the ItemAdded event before the file is actually uploaded (that is, the item is already created, but the file upload is still in progress).
What you can do as a workaround is to add a delay activity as the first thing in your workflow and wait for 10 seconds (it will actually be longer than ten seconds due to the way workflows are scheduled in SharePoint). This way the upload will already have finished when you read the item. To inform the users about what's happening, you can add a "logToHistoryList" activity before the delay.
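For reference, a minimal code-beside sketch of that workaround (the field name initialDelay is hypothetical; it refers to a DelayActivity dropped right after onWorkflowActivated in the designer, and the duration can also simply be set in the Properties window):
// Assumes a DelayActivity named initialDelay placed immediately after
// onWorkflowActivated in the workflow designer (the name is hypothetical).
private void initialDelay_InitializeTimeoutDuration(object sender, EventArgs e)
{
    // The workflow will idle for at least this long before continuing,
    // giving WebDAV time to finish uploading the file.
    this.initialDelay.TimeoutDuration = TimeSpan.FromSeconds(10);
}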
