One of our customers has a problem that we cannot reproduce. We programmatically copy a document's properties to a destination file using SPFile.Properties. However, for some reason the file's properties do not match the metadata specified on the list the file is stored in.
Now, we can probably solve this by copying SPFile.Item.Properties (not tested yet), but I am just wondering under what circumstances SPFile.Properties is unequal to SPFile.Item.Properties.
Update: We have just received an update from our customer. Using SPFile.Item.Properties always returns the up-to-date information. However, we would still like to understand the original question.
There is a subtle difference between SPFile.Properties and the fields of SPFile.Item, and the former is much, much slower to access.
You have most probably seen a Microsoft Office document's "Properties" window (this one - http://dradisframework.org/images/tutorial/custom_document_properties.png). These are the properties that are read when you access SPFile.Properties. Reading them is slow because infrastructure code has to parse the binary DOC file to find the properties (up to roughly 30 milliseconds per property access). See more here: http://msdn.microsoft.com/en-us/library/microsoft.sharepoint.spfile.properties.aspx
In SharePoint, every item is an SPListItem, and its field values (I deliberately avoid the word "properties" here) are stored in SharePoint's content database. So when you access SPFile.Item.Properties, you are actually looking at the SPListItem to which the file is attached and reading its values from SharePoint's content database.
What happens behind the scenes, when you upload a file that has some "Office properties" set, is that SharePoint copies them to same-named fields in the SPListItem. (Some information about it here: http://weblogs.asp.net/bsimser/archive/2004/11/22/267846.aspx)
This is why the two typically have the same values, BUT that only happens if SharePoint knows how to read the metadata from your file and write it back. So if you put a .txt file in your SharePoint store, you will not get any SPFile.Properties back.
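To make the two access paths concrete, here is a minimal sketch using the server object model; the site URL, library name and file name are placeholders invented for this example:

    using System;
    using Microsoft.SharePoint;

    class PropertyComparison
    {
        static void Main()
        {
            // Placeholder site and file URLs - adjust for your environment.
            using (SPSite site = new SPSite("http://server/sites/demo"))
            using (SPWeb web = site.OpenWeb())
            {
                SPFile file = web.GetFile("Shared Documents/report.docx");

                // Slow path: properties parsed out of the document file itself.
                System.Collections.Hashtable officeProps = file.Properties;

                // Fast path: field values of the backing SPListItem, read from the content database.
                SPListItem item = file.Item;

                Console.WriteLine("SPFile.Properties[\"Title\"]: {0}", officeProps["Title"]);
                Console.WriteLine("SPListItem[\"Title\"]:        {0}", item["Title"]);
            }
        }
    }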
The user will always see the list item properties, not the SPFile properties, in a document library. So using the list item properties in the copy is the way to go.
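For example, here is a hedged sketch of copying a file together with its list item metadata within one web; the URLs, the target folder and the field handling are simplified assumptions, not a complete implementation:

    using System;
    using Microsoft.SharePoint;

    class CopyWithMetadata
    {
        static void Main()
        {
            // Placeholder URLs; the "Archive" destination is assumed to exist.
            using (SPSite site = new SPSite("http://server/sites/demo"))
            using (SPWeb web = site.OpenWeb())
            {
                SPFile source = web.GetFile("Shared Documents/report.docx");
                SPFolder destination = web.GetFolder("Archive");

                // Copy the binary content of the file.
                SPFile copy = destination.Files.Add(
                    destination.Url + "/" + source.Name, source.OpenBinary(), true);

                // Copy the list item field values - the metadata the user actually sees.
                SPListItem srcItem = source.Item;
                SPListItem dstItem = copy.Item;
                foreach (SPField field in srcItem.Fields)
                {
                    if (field.ReadOnlyField || field.InternalName == "Attachments")
                        continue;
                    dstItem[field.Id] = srcItem[field.Id];
                }
                dstItem.SystemUpdate(false); // don't bump Modified or create a new version
            }
        }
    }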
I believe this issue is related to SharePoint's property promotion/demotion feature, which enables document properties to be embedded in the physical MS Office file so that they travel with it to the client etc. To my knowledge, this is currently only supported for Office file types.
Jonathan
Trying to find the "officially documented" anything for SharePoint is pretty much undoable. :-D The online docs suck; you are better off using blog entries etc.
P.S. I agree with Alex here. Although an SPFile never exists in a list without an accompanying SPListItem, the connection between the two can get corrupted (i.e. you can still edit the list item, but the file cannot be opened). This indicates to me that information about the two is stored in different locations in the content database. I have had this happen before.
I'm familiar with the "Manage Views" tool in Domino Administrator, and wondered if anyone had come up with a way to access the view index size programmatically. Ideally I'd use LotusScript, but any other method would be great too.
I'm hoping to analyze which views are actually used across a large number of databases, without having to extract the info manually.
AFAIK this is not possible in LotusScript. But the index size could be retrieved with the C-API function NIFGetCollectionData.
You need to open the database with NSFDbOpen, get the view with NIFFindView, open the view with NIFOpenCollection and finally get the size with NIFGetCollectionData. And of course you need to call NIFCloseCollection and NSFDbClose to clean up.
Maybe there is an easier way, but you'll probably have to develop the necessary tooling yourself. A small quest in 6 steps:
open the log.nsf database for your server
open the view Usage/by Size
open a usage document for a fairly large database, e.g. mail/yourmail.nsf
the document shows all views and their (index) sizes
open the Document Properties box (Alt-Enter)
the field AllViewInfo (RichText) contains that data
Downsides:
AFAIK there is no ready-made LotusScript library for log.nsf.
those sizes are collected during the night, so they're not up to date
going through a rich-text's data can be daunting
HTH
We are using SharePoint 365.
In a Document Library, when we change the metadata (Properties), the date-modified in Windows Explorer (OneDrive-connected to SharePoint) will change.
In my mind, metadata is a layer on top of the document, and hence changes in metadata should NOT change the modified date of the file.
Is my logic totally wrong?
Best regards
It is normal.
“Metadata” is the term used to describe additional data that gives information about your files. Metadata you may already be familiar with includes “Created Date”, “Modified By”, “File Size”, “File Type”, etc. "Date modified" (called "Modified" in a SharePoint library) is also metadata, and it is updated automatically whenever other metadata is changed.
How do I search in MS Access (ver 2010) for data in files attached to records? If I do a "Find" and specify text I KNOW is in a .txt file attached to a particular record, there are no hits, while if I have the same data in a Text field or Memo field, Access finds it. I understood from one of the Access help screens that it is possible to search attachments from within Access, but I have not been able to do this yet.
BTW, I did try using the query tool and searching for text I knew was in the attachment, but it was not successful, although it did find the same text within a memo field in another record.
Thx,
jmb
I'm fairly certain that there is no mechanism in Access to find records based on text within a file attachment. A bit of web searching found an earlier question here and the responses seem to agree that there isn't.
One reference from Microsoft here says
By using attachments, you open documents and other non-image files in their parent programs, so from within Access, you can search and edit those files.
but I think that statement could be misinterpreted. I believe what they meant to say was that
"...from within Access you can open an attachment in its parent program and then work on it as usual (e.g., edit it, search it, print it, and so on)."
You can use FileSystemObject, open the file as a string, and search it sequentially. That's as close as you'll get.
I have to programmatically move (archive) a document from a document library in one site collection to a document library in another site collection in SharePoint 2010, when a specific value is set for a column in the document library.
Would it be possible to write code for this scenario in an event receiver? Is there any other way?
If anybody has any relevant piece of code or links, please share.
Thanks in advance!
You could perhaps do a copy operation, then delete the original file.
Have a look at the following link, which discusses copying a file from one site to another:
http://msdn.microsoft.com/en-us/library/microsoft.sharepoint.spfilecollection.aspx
The example uses one site collection. However, if you convert the source document to a byte array you can always instantiate the target site collection and add the binary data to a document library within that site collection.
Certainly the copy operation should work within the event receiver. However, I'm not certain what will happen if you try to delete the file within the receiver (there may be concurrency issues). If the delete does not work, consider firing a one-time timer job to delete the file (which would occur in a different process).
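A hedged sketch of what that could look like in an ItemUpdated receiver; the site URL, the target folder, and the "Status" column with its trigger value are invented for this example:

    using System;
    using Microsoft.SharePoint;

    // Sketch only: archives a document to a library in another site collection
    // when a (hypothetical) "Status" column is set to "Archive".
    public class ArchiveReceiver : SPItemEventReceiver
    {
        public override void ItemUpdated(SPItemEventProperties properties)
        {
            SPListItem item = properties.ListItem;
            if (item == null || item.File == null)
                return;
            if (!string.Equals(item["Status"] as string, "Archive", StringComparison.OrdinalIgnoreCase))
                return;

            // Convert the source document to a byte array.
            byte[] content = item.File.OpenBinary();

            // Instantiate the target site collection and add the binary data there.
            using (SPSite targetSite = new SPSite("http://server/sites/archive"))
            using (SPWeb targetWeb = targetSite.OpenWeb())
            {
                SPFolder targetFolder = targetWeb.GetFolder("Archive Documents");
                targetWeb.AllowUnsafeUpdates = true;
                targetFolder.Files.Add(targetFolder.Url + "/" + item.File.Name, content, true);
                targetWeb.AllowUnsafeUpdates = false;
            }

            // Delete the original; if this proves unreliable inside the receiver,
            // defer it (for example to a one-time timer job) as suggested above.
            item.File.Delete();
        }
    }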
You can try SharePoint's SPExport class. As per the article "Copy or Move SharePoint items", it looks like a few of the operations we do in the SharePoint UI use this API internally to achieve the task. Also, this approach depends on whether you are trying to do it one time or it is going to be a repetitive process.
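If you go down the deployment API route, a rough sketch using SPExport might look like the following; the source URL, the export location and the item reference are placeholders, and the resulting package would then be brought into the target site collection with SPImport:

    using System;
    using Microsoft.SharePoint;
    using Microsoft.SharePoint.Deployment;

    static class ArchiveExporter
    {
        // Sketch only: exports a single list item (and its file) to a package on disk.
        public static void ExportItem(SPListItem itemToArchive)
        {
            SPExportSettings settings = new SPExportSettings
            {
                SiteUrl = "http://server/sites/source",   // placeholder source site
                ExportMethod = SPExportMethodType.ExportAll,
                FileLocation = @"C:\Temp\Export",         // placeholder package location
                BaseFileName = "archive.cmp",
                IncludeSecurity = SPIncludeSecurity.All
            };

            settings.ExportObjects.Add(new SPExportObject
            {
                Id = itemToArchive.UniqueId,              // GUID of the list item to export
                Type = SPDeploymentObjectType.ListItem
            });

            new SPExport(settings).Run();

            // The exported package can then be imported into the other site
            // collection with SPImportSettings / SPImport in a similar fashion.
        }
    }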
I have a method which downloads SharePoint documents to the local disk. I use the SPFile.OpenBinary() method to get the physical file, but it contains all the fields of the parent document library. Does anybody know how to strip these fields from the file (a doc file)? The only way I have found uses the Word interop library (the method described here: http://maxim-dikhtyaruk.blogspot.com/2009/05/trim-sharepoints-documents.html), but it doesn't fit my requirements because it only works when Microsoft Office is installed on the machine...
You may want to read this to understand what's going on.
I do not know if this can be turned off or not, but it happens only with Word 2007 documents (docx).
You could do any of the following to turn this off:
Create a new content type and associate it with a document library. Use this document library from now on.
Look into some Open XML library or the Open XML SDK published by Microsoft.
This isn't a bug, it's supposed to be a feature! :-) Seriously though, you need to edit the Word document programmatically to remove these additional fields completely (I think even a content type will leave some behind).
For documents prior to Word 2007, you could use a toolkit like Aspose. I almost needed to do a similar thing once and would have used this product to do it. I'm sure there are other options out there.
For Word 2007 and higher, as SharePoint Newbie says, you should be able to use the Office Open XML formats to edit the document. Here's an MSDN intro article.
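As a hedged illustration of the Open XML route, one possible approach is to remove the custom XML parts in which the SharePoint document information panel data is stored; the file path is a placeholder and the Open XML SDK (DocumentFormat.OpenXml) is assumed to be referenced:

    using System.Linq;
    using DocumentFormat.OpenXml.Packaging;

    class StripSharePointProperties
    {
        static void Main()
        {
            string path = @"C:\Temp\report.docx"; // placeholder path to a downloaded copy

            // Sketch only: delete the custom XML parts, which is where the
            // SharePoint/Word 2007 document information panel data lives.
            using (WordprocessingDocument doc = WordprocessingDocument.Open(path, true))
            {
                MainDocumentPart main = doc.MainDocumentPart;
                main.DeleteParts(main.CustomXmlParts.ToList());
                // Changes are saved when the document is disposed.
            }
        }
    }

Depending on how the document was authored, content controls bound to those parts may still remain in the body and need separate cleanup.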