List of documents in an entire SharePoint site

I am trying to get a list of all documents within my SharePoint site, along with the owner and the date last modified.
It is a SharePoint 2010 Enterprise environment.
It's a site which has many subsites.
All solutions I have tried only let me get documents at that specific site level, not in subsites.
I must admit I can get the required information by querying the SQL database directly, but I don't want to go down that path, as it's unsupported by Microsoft.
Any help would be appreciated.

You can get this information quite easily with PowerShell; you just need to iterate through all subsites and document libraries (see the sketch after the links below).
Take a look at these pages:
http://blog.falchionconsulting.com/index.php/2010/08/getting-an-inventory-of-all-sharepoint-documents-using-windows-powershell/
(This one is almost exactly what you require; there are a few extra lines that you don't need.)
http://www.sharepoint-journey.com/get-all-document-libraries-in-a-site-collection.html
https://sharepoint.stackexchange.com/questions/126397/powershell-get-a-list-of-all-the-document-libraries-for-a-web-application-incl
http://blogs.msdn.com/b/varun_malhotra/archive/2012/02/08/sharepoint-2010-powershell-download-all-files-in-document-library-to-network-share-file-share.aspx
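
In case it helps, here is a minimal sketch of that loop, assuming it runs in the SharePoint 2010 Management Shell on a farm server; the site collection URL and output path are placeholders you would replace:

    # Load the SharePoint snap-in when running from a plain PowerShell prompt.
    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

    $site = Get-SPSite "http://yourserver/sites/yoursite"   # placeholder URL

    $results = foreach ($web in $site.AllWebs) {
        # Only document libraries, not ordinary lists.
        foreach ($list in ($web.Lists | Where-Object { $_.BaseType -eq "DocumentLibrary" })) {
            foreach ($item in $list.Items) {
                New-Object PSObject -Property @{
                    Web      = $web.Url
                    Library  = $list.Title
                    Name     = $item.Name
                    Owner    = $item["Created By"]
                    Modified = $item["Modified"]
                }
            }
        }
        $web.Dispose()
    }
    $site.Dispose()

    $results | Export-Csv "C:\temp\DocumentInventory.csv" -NoTypeInformation

Enumerating every item this way is slow on very large libraries, but for a one-off inventory it keeps the script simple.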

Related

SharePoint 2013 document last modified date code snippet

I am using SharePoint 2013. I have a page that does not show the document library itself, but has links to documents within it. Next to each link I would like to show the date the document was last modified.
The text is not a link; only the PDF is a link to a file within a document library. They are not all in the same library, though; some are in other SharePoint 2013 sites, which I own as well.
Is this even possible? I have been searching for a few days but have not found anything close to what I am looking to do. Most of what I am finding is related to getting the date in applications outside of SharePoint.
Yes, it is possible, depending on which SharePoint API you use and where the documents are located.
If you use the SharePoint JSOM, it's possible as long as all the sites in which the documents are stored are in the same site collection.
If you use the SharePoint server object model with the correct privileges, there is no limitation on whether the documents are in the same or different site collections (though you may have to use different context objects).
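
To illustrate the server object model route, here is a minimal sketch written as PowerShell against the server API rather than as the page or web part code you would ultimately deploy; the document URLs are placeholders:

    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

    # Documents may live in different site collections, so open a separate
    # SPSite/SPWeb (context) per document URL.
    $docUrls = @(
        "http://server/sites/siteA/Shared Documents/report.pdf",
        "http://server/sites/siteB/Documents/guide.pdf"
    )

    foreach ($url in $docUrls) {
        $site = New-Object Microsoft.SharePoint.SPSite($url)   # resolves the containing site collection
        $web  = $site.OpenWeb()
        $file = $web.GetFile($url)
        "{0} last modified {1}" -f $file.Name, $file.TimeLastModified
        $web.Dispose()
        $site.Dispose()
    }

The JSOM route would follow the same pattern, but with one ClientContext per site instead of one SPSite per site collection.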

Can no longer find SharePoint Site Assets list via Graph API

I'm using the Microsoft Graph API to query SharePoint. Until recently I was able to find the "Site Assets" document library via the Graph API. I can no longer find the list.
These are the queries I have tried:
https://graph.microsoft.com/v1.0/sites/{siteid}/lists?select=weburl
No URL matches the list of Site Assets. Next:
https://graph.microsoft.com/v1.0/sites/{siteid}/drives?select=weburl
Again, no URL matches the Site Assets list. In the past I was always able to find the assets using the second query. I switched both queries to beta, also without result.
I've looked at the changelog in the Graph API, but nothing relevant is listed.
How can I (nowadays) find the "Site Assets" list on any SharePoint Site?
By default both the lists and drives enumerations attempt to hide system objects, but unfortunately doing so in SharePoint is non-trivial. As a result some system lists were still coming through until recently when we made sure they didn't.
You can still see them, but you'll need to explicitly ask for them by requesting the system facet.
https://graph.microsoft.com/v1.0/sites/{siteid}/lists?select=weburl,system
https://graph.microsoft.com/v1.0/sites/{siteid}/drives?select=weburl,system
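
For example, here is a minimal PowerShell sketch of the same call; $accessToken and $siteId are placeholders you would supply:

    # Ask Graph for every list, explicitly requesting the system facet so that
    # hidden system lists such as "Site Assets" come back in the response.
    $headers = @{ Authorization = "Bearer $accessToken" }
    $uri = "https://graph.microsoft.com/v1.0/sites/$siteId/lists?select=webUrl,system"

    $response = Invoke-RestMethod -Uri $uri -Headers $headers -Method Get

    # Lists that carry the "system" facet are the system/hidden ones.
    $response.value |
        Where-Object { $_.PSObject.Properties.Name -contains 'system' } |
        Select-Object webUrl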

SharePoint Search 2013 - is there any way to index a list of URLs stored in a database?

I have a database table with a list of URLs that I would like SharePoint Search 2013 to index so they show up in search results. The URLs point to a mixture of content types: web pages, Word documents, PDFs, etc.
All the URLs are internal to my network but are not SharePoint pages or files stored in SharePoint.
I am using SharePoint 2013 Enterprise Search on a Windows Server 2008 R2 server.
Does anyone have any ideas on how to achieve this?
I have searched for options but can't seem to find anything relevant. BDC and BCS have come up a lot, but they seem to be more about indexing content returned by the connector; what I want is to use the data returned from the table as pointers to the items to be indexed.
I'm very new to SharePoint and SharePoint Search and am at a bit of a loss on how to go about this (to make it even more difficult, I would also like to apply ACLs to the results, and the ACLs are in another table, but that's another question!). Given my experience level I would like the answer to be as basic as possible, but any help would be appreciated.
BDC and BCS are the proper way to do it, but they're very complicated. If you want something simple, create a small script that writes all the URLs to a single HTML document, then use the web crawler to crawl that document. It will follow the links and crawl the content.
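
A minimal sketch of such a script, assuming a hypothetical table dbo.UrlList with a Url column, a hypothetical connection string, and a file share the crawl account can reach:

    # Read the URLs from the database.
    $connectionString = "Server=SQLSERVER;Database=LinksDb;Integrated Security=True"   # placeholder
    $connection = New-Object System.Data.SqlClient.SqlConnection($connectionString)
    $connection.Open()
    $command = $connection.CreateCommand()
    $command.CommandText = "SELECT Url FROM dbo.UrlList"
    $reader = $command.ExecuteReader()

    $links = @()
    while ($reader.Read()) {
        $url = $reader["Url"]
        $links += "<li><a href=""$url"">$url</a></li>"
    }
    $reader.Close()
    $connection.Close()

    # Write one HTML page of links; point a Search content source at this file
    # (or at a URL that serves it) and let the crawler follow the links.
    $html = "<html><body><ul>" + ($links -join "`r`n") + "</ul></body></html>"
    Set-Content -Path "\\fileshare\crawl\links.html" -Value $html   # placeholder path

Schedule it to run before each crawl so that new URLs in the table get picked up.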

SharePoint 2010 - Questions regarding basic concepts

I am beginning SharePoint development and have some quick questions concerning basic terms.
How do I find out whether a particular site is a site collection or just a site, JUST BY THE URL? Is there a PowerShell command to do this?
I was creating some sites in SharePoint. Some sites were appended with /sites/sitename, whereas others were just under the base URL of SharePoint. What is the difference between the two? And how do I recreate the ones under the sites node? For some reason, I can't find the option to create under the sites node again. Please explain this concept, as all the MSDN tutorials are very confusing for beginners like me (they are good once you get the hang of the basics).
Please provide an analogy to help understand web application, site collection, site, web site, etc.
Is there a way to use NEWFORM.aspx for a document library instead of UPLOAD.aspx?
A site collection is at the root level of your web application.
So http://abc.com/ => site collection.
Using PowerShell, open the SharePoint Management Shell and run Get-SPSite to get all site collections (see the sketch below).
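
For example, a minimal sketch; the URL is a placeholder, and the single-URL check uses Get-SPWeb plus the IsRootWeb property:

    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

    # List every site collection in the farm.
    Get-SPSite -Limit All | Select-Object Url

    # Check one URL: if it resolves to the root web of its site collection,
    # the URL is a site collection; otherwise it is a subsite (web).
    $web = Get-SPWeb "http://abc.com/sites/teamsite/subweb" -ErrorAction SilentlyContinue
    if ($web) {
        if ($web.IsRootWeb) { "Site collection (root web of $($web.Site.Url))" }
        else                { "Subsite of the site collection $($web.Site.Url)" }
        $web.Dispose()
    }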
The /sites/ part is called a managed path.
It can be defined in Central Administration for every web application.
The option to select /sites/ will be available only when you create the second site collection under the web application (the first one takes the / by default).
Have a look at the TechNet article.
A document library is for uploading files, not for storing user-submitted data; for that you need to create a list.
1) A Document Set is used in cases where multiple documents share the same properties; it's like putting all of those documents in a folder and then providing attributes to that folder, which are in turn applied to each document in the folder.
In your case, if all the files have the same values for the 8 fields, then a document set is the correct way to go.
2) If there is additional metadata associated with the files, it can be added either to the content type (e.g. the document or document set content type) or to columns in the library itself; you don't need to create a separate list to hold that data. Adding data to the content type ensures consistency across all the document libraries within that site collection, while adding columns to the library affects only that library.
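
As a rough illustration of that last difference, here is a sketch using the server object model run against the root web of the site collection; the site URL, library name, and field name are all hypothetical:

    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
    $web = Get-SPWeb "http://abc.com/sites/teamsite"   # root web of the site collection

    # Option A: add the column to one library only - affects just this library.
    $list = $web.Lists["Project Documents"]
    $list.Fields.Add("ProjectCode", [Microsoft.SharePoint.SPFieldType]::Text, $false) | Out-Null
    $list.Update()

    # Option B: add it as a site column and attach it to a content type - every
    # library using that content type gets it, keeping metadata consistent.
    $fieldName   = $web.Fields.Add("ProjectCode", [Microsoft.SharePoint.SPFieldType]::Text, $false)
    $contentType = $web.ContentTypes["Document"]
    $fieldLink   = New-Object Microsoft.SharePoint.SPFieldLink($web.Fields.GetFieldByInternalName($fieldName))
    $contentType.FieldLinks.Add($fieldLink)
    $contentType.Update($true)   # push the change down to derived content types

    $web.Dispose()

In practice you would usually create your own content type that inherits from Document rather than modifying the built-in one.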

My Sites-like page that lists all doc libraries across all SharePoint sites?

We'd like to create a web page that will list all document libraries across all SharePoint sites for the user currently accessing the page. We'd also like to offer an all-site search for the user, that is, across all sites they have access to.
We currently do not have My Sites enabled, nor do we want to.
Is it possible to code this?
All site search is easy. If you are using the non-free version of SharePoint 2007 or 2010, then that capability is baked into the product. Users can use the search scopes to search across all content in the SharePoint farm. It will automatically trim search results that users don't have access to.
As for your list of all document libraries, this would probably be too much effort to generate in real time for any non-trivial SharePoint environment. You are most likely going to have to gather this information ahead of time and then display the appropriate summary of the data in a Web Part or some other similar interface. Code to crawl every web application, every site, every subsite and then every document library isn't hard; actually, it is very straightforward. What will be a little tricky is that you will need to collect ACL entries for each of these lists so that you can compare them to the current end user. The real trick is that the ACLs might contain SharePoint group names and Active Directory group names instead of individual end user names, which will make your reporting task more difficult.
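
A minimal sketch of that gathering step, assuming SharePoint 2010 and a farm-level account; the output path is a placeholder:

    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

    $inventory = foreach ($webApp in Get-SPWebApplication) {
        foreach ($site in $webApp.Sites) {
            foreach ($web in $site.AllWebs) {
                foreach ($list in ($web.Lists | Where-Object { $_.BaseType -eq "DocumentLibrary" })) {
                    # Role assignments may name SharePoint groups or AD groups rather
                    # than individual users, which is what makes the trimming tricky.
                    $principals = ($list.RoleAssignments | ForEach-Object { $_.Member.Name }) -join "; "
                    New-Object PSObject -Property @{
                        WebApplication = $webApp.Url
                        Web            = $web.Url
                        Library        = $list.Title
                        GrantedTo      = $principals
                    }
                }
                $web.Dispose()
            }
            $site.Dispose()
        }
    }

    $inventory | Export-Csv "C:\temp\LibraryInventory.csv" -NoTypeInformation

You would then filter this data against the current user's group memberships when rendering the Web Part.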
