Is there any way to upload a file to a SharePoint document library together with all its properties in one REST call? I recently found that if I upload a file and then set its properties (it doesn't matter which properties), SharePoint treats that as a new version of the file, and the previous version keeps consuming storage. So if I upload a large file (4 GB) and then set some custom properties, the file appears to consume 8 GB of storage, regardless of whether the file itself changed.
With the SharePoint SOAP web services this is possible, but with REST it seems it is not.
Thanks
You have to do them separately, but it doesn't actually create a copy of the file. SharePoint stores the deltas, and in your case there wouldn't be a change in the document. The version history can be misleading because you will see that version 1.0 is 4GB and version 2.0 is 4GB, which makes it look like you are consuming 8GB, but that is not the case. If you were to add a third version that was 5GB, you wouldn't have 13GB used. Instead, the database only stores 5GB of data, and SharePoint essentially pieces the file together from the database.
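For completeness, a minimal PowerShell sketch of that two-call pattern against the SharePoint REST API. The site URL, the token, the list item type name (SP.Data.Shared_x0020_DocumentsItem) and the field name are placeholders you would replace for your environment:

    # Call 1: upload the binary into the library
    $site    = "https://contoso.sharepoint.com/sites/demo"
    $headers = @{ Authorization = "Bearer $accessToken"; Accept = "application/json;odata=verbose" }
    $bytes   = [System.IO.File]::ReadAllBytes("C:\temp\report.docx")
    $upload  = "$site/_api/web/GetFolderByServerRelativeUrl('/sites/demo/Shared Documents')/Files/Add(url='report.docx',overwrite=true)"
    Invoke-RestMethod -Uri $upload -Method Post -Headers $headers -Body $bytes

    # Call 2: set the item's properties with a MERGE update (no new binary is written)
    $itemUrl = "$site/_api/web/GetFileByServerRelativeUrl('/sites/demo/Shared Documents/report.docx')/ListItemAllFields"
    $merge   = $headers + @{ "Content-Type" = "application/json;odata=verbose"; "X-HTTP-Method" = "MERGE"; "IF-MATCH" = "*" }
    # The __metadata type below is specific to your library; check /ListItemAllFields for yours
    $payload = '{ "__metadata": { "type": "SP.Data.Shared_x0020_DocumentsItem" }, "MyCustomField": "some value" }'
    Invoke-RestMethod -Uri $itemUrl -Method Post -Headers $merge -Body $payload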
Related
I'm administering our organization's SharePoint Online right now, and storage is running low. I noticed there are files taking up a lot of space because of version history (for example, a PowerPoint deck with embedded videos); such a file can sometimes grow to over 1 GB. I manually deleted the version history of some files and freed up almost 50 GB of storage.
Is there a built-in way to do this in bulk? Or is there a built-in tool in SharePoint (something like Storage Metrics) that traverses all files and shows each file's size both with and without its version history?
As far as I know, there is no built-in tool that shows the size of version histories, and no built-in way to delete version histories in bulk.
As a workaround, you can delete version histories in bulk using PowerShell.
References:
https://www.sharepointdiary.com/2016/02/sharepoint-online-delete-version-history-using-powershell.html
https://social.msdn.microsoft.com/Forums/en-US/870e2f03-abf3-44b8-a2b6-71cb2aade2ef/powershell-script-to-delete-older-versions-of-documents-in-a-sharepoint-online-library?forum=sharepointdevelopment
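For illustration, a minimal sketch of what such a script can look like, assuming the PnP.PowerShell module and a library named "Documents" (cmdlet names and parameters may differ across module versions; test on a copy first):

    # Trim the version history of every file in one library
    Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/demo" -Interactive

    foreach ($item in Get-PnPListItem -List "Documents" -PageSize 500) {
        if ($item.FileSystemObjectType -eq "File") {
            # Removes all previous versions, keeping only the current one
            Remove-PnPFileVersion -Url $item.FieldValues["FileRef"] -All -Force
        }
    }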
As Emily mentioned, there is no native function for bulk-deleting version history in SharePoint. Out of the box, SharePoint only lets you delete the versions of a single selected document.
There are generally two approaches to this: a PowerShell script or a third-party tool. The scripts from sharepointdiary.com look good and can be helpful.
There is a tool, DMS Shuttle for SharePoint. It provides a UI and can delete versions in bulk for a particular document, for all documents in a library or sub-folder, or even for the whole site.
The tool allows you to specify the number of latest versions to keep.
It is commercial, but there is a trial version, and it is free for students. Disclaimer: I work for the vendor.
Take a look at this article: https://dms-shuttle.com/documentation/delete-version-history-in-sharepoint-online-office-365/
Q1: Where is the upload file location in Cognos Analytics 11?
I need to know the upload file location in order to back up the files.
Q2: How do I set the upload file size limit for a single user?
I know how to set it globally via the admin UI.
I'm pretty sure that Cognos 11 stores uploaded files in the Content Store database, but I haven't been able to pin down exactly where. There are files in the "data files location" (see Cognos Configuration), but when I traced a specific uploaded .xlsx file in the Content Store, I couldn't confirm that it corresponds to any of the files on the file system. IBM would tell you that you need to use the SDK to get to this content in any automated way outside the Cognos Analytics UI.
I don't think there is a per-user setting for the upload limit. The documentation makes no mention of this setting being available per user.
Q1
It is not clear what you mean by "I need to know the upload file location to back up files."
i. The uploaded files are stored in the content store. If you want to back them up, create a deployment; you can also back up the Content Manager database itself.
ii. If the question is how to recover data from an uploaded file that has gone missing: you can create a report that includes all the columns of the uploaded file and run it as Excel or CSV. There may not be 100% fidelity of data types, etc. The proper answer is to establish proper governance around the use of spreadsheets and the like, so that this sort of spreadsheet risk is lessened.
iii. By default, the upload location is the user's My content folder. You can change that under Manage > Customization > Profile, where there is a setting for the default upload location. Keep in mind that a user may not be able to upload files to a particular folder if they do not have write permissions on it.
Q2:
It is not possible to set a per-user upload file limit.
We have a SharePoint website, and as part of a functional process across the site a lot of documents get uploaded. Currently they are stored in a database, which results in a very bulky table. My initial approach was to use SharePoint and store the documents in a document library. Does anybody think the database is the wiser option, and why? Or is there another approach that is more performant and better suited to storing confidential files?
Using a database to store documents is not a recommended approach: not only will it grow large, it will also be hard to maintain and will hurt performance.
If you have a SharePoint server, why not use one or more libraries to store the documents? With SharePoint you get the following advantages:
1. Permission management: you can set up access to documents and choose who accesses what.
2. Search: if a search service is running, you can search across your libraries.
3. OWA: Office Web Apps can be used to open documents in the browser.
4. Audits: you can enable audit logs to see who does what.
Remember, SharePoint is a CMS (and there are other options, like MMS, etc.), and it stores the documents in a database too, but it is designed well, so you don't have to worry much about it. If you go with a custom solution, you will have to do a lot of custom development and testing.
I never recommend saving files in the database. The easiest approach is to store them on the server in a directory and only save the file names in the database. That also makes it easy to serve them via a URL in a browser. Create a table with a column for the OriginalFileName and one for the ActualFileName. When I save an uploaded file to the server, I usually change its name so there are never complications with duplicate file names: I use a GUID as the actual file name on disk and save the original file name in the database alongside it, so you can get both back.
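As a hypothetical sketch of that pattern (the directory and column names are whatever your app uses):

    # Save an uploaded file under a GUID name; return both names for the database row
    function Save-UploadedFile {
        param(
            [string]$SourcePath,   # the uploaded file (e.g. a temp path)
            [string]$StorageDir    # server directory that holds the documents
        )
        $originalName = [System.IO.Path]::GetFileName($SourcePath)
        $extension    = [System.IO.Path]::GetExtension($SourcePath)
        $actualName   = "$([guid]::NewGuid())$extension"   # collision-free on-disk name

        Copy-Item -Path $SourcePath -Destination (Join-Path $StorageDir $actualName)

        # These two values go into the OriginalFileName / ActualFileName columns
        [pscustomobject]@{ OriginalFileName = $originalName; ActualFileName = $actualName }
    }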
I've seen that in Liferay we can specify the maximum upload size of a file:
dl.file.max.size=
But I haven't found a way to limit the number of files a user (or community) can upload. Obviously, we don't want a user or community uploading massive numbers of files and filling up our shared drive. Is there a way to do this?
There's no such quota, which is why you didn't find it. However, you can extend Liferay to contain and honor one. This has been demonstrated with this app; the source code is linked, and you might be able to take it as the basis for building your number-of-files quota. It doesn't take the number of files into account either, but it does keep an eye on the volume already stored, and it should be easy to extend.
I'm assuming the Spanish user group would happily accept pull requests if you come up with a good extension to this plugin.
We are using RBS to store SharePoint files outside the SharePoint content database. This worked nicely with SP2010, but when we moved to SharePoint 2013 we found that extra data gets added to the files in the RBS directory. Is there a way to tell SharePoint not to add this extra binary data? Our users access the RBS store through a read-only shared folder on the network, and we have built business processes that depend on that.
You're probably referring to shredded storage. In SP2010, there is a single FILESTREAM blob (.rbsblob) for each file version stored on an SP site (the file is stored in the content database, but its content is offloaded to RBS). In SP2013, shredded storage can shred a single file version into multiple smaller pieces and pad them with extra bytes, which means you can't easily access a single file from the RBS itself (and you're not supposed to, either). So when you upload a file "document.docx", you don't get a single blob but multiple smaller blobs (depending on size and settings) that can't easily be merged back together; what you see in RBS are these shreds. The best you can do is prevent files from getting shredded, but it's a dirty way: How to disable Shredded Storage in SharePoint 2013? Note that this only affects newly uploaded files and has an impact on storage and performance (shredded storage lets SharePoint store only the deltas of a document; if you have 10 document versions that are 50% identical, those 50% are shared across all versions as shared shreds instead of being stored multiple times, as they would be in SP2010).
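For reference, the workaround from that linked question typically looks like the following farm-level tweak (run in the SharePoint 2013 Management Shell; treat it as a sketch and test first, since it only affects newly written files and negates the storage savings described above):

    # Dirty workaround: raise the write chunk size so new files are stored as a single blob
    $service = [Microsoft.SharePoint.Administration.SPWebService]::ContentService
    $service.FileWriteChunkSize = 2147483647   # ~2 GB; files below this size are no longer shredded
    $service.Update()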
One option that might be useful is to map the SharePoint site to a network drive and let users work with that instead of the RBS directory directly; that way, files are accessed through RBS without explicitly exposing the blobs (shreds) themselves.
Another option is to download/open a specific RBS-stored file by using SPFile.OpenBinary(), which will merge the RBS-stored shreds and return a single (original) file that you can then store elsewhere (e.g. into another shared folder) - this way, you're duplicating files, but that's pretty much how it's supposed to be anyway. For example, this way you can open a file "document.docx" that is visible on an SP site, but stored in RBS as 5 .rbsblob shreds, and save it as "document.docx" elsewhere (outside of SP).
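A minimal server-side sketch of that approach (SharePoint 2013 Management Shell; the URLs and export path are placeholders):

    # Re-assemble a shredded file and write it out as one document
    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

    $web   = Get-SPWeb "http://sp2013/sites/demo"
    $file  = $web.GetFile("Shared Documents/document.docx")
    $bytes = $file.OpenBinary()    # merges the RBS shreds back into the original binary
    [System.IO.File]::WriteAllBytes("\\fileserver\export\document.docx", $bytes)
    $web.Dispose()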
RBS storage is internal to SharePoint and you should NOT access its files directly. There are no guarantees that the data will remain in the original format as stated here:
Although RBS can be used to store BLOB data externally, accessing or changing those BLOBs is not supported using any tool or product other than SharePoint 2013. All access must occur by using SharePoint 2013 only.
Also, I'm not aware of any RBS-related configuration option for the described behaviour.