Using CAML or some other query system, how can I find items that use custom code and organize by modified date?
Some background: my group is looking to upgrade a 2007 SharePoint installation to SPO. The problem we're having is a lack of clarity as to what on the site is junk and what is custom code. The first idea was just to run through manually and make a note of every node; as there are 5,200 nodes, this is pure insanity. I've done some research and found that CAML is how to query a particular site in SharePoint; however, I cannot seem to figure out how to query everything at the same time.
I tried to make a view in the root, but again, it only queries the particular level you're on. I have a feeling that I might need to write a tool for this and spider the site, but am unsure where to begin. After trying a couple of tools (Stramit Caml running in Visual Studio, and SPUD) I seem to be running in place, as I don't understand how the connection works.
Any advice or stories like this?
To scan your environment and detect where custom solutions are used, your best bet is the stsadm operation called preupgradecheck. It is executed from the command line on one of your web front-end servers by invoking stsadm.exe.
From Microsoft:
The Stsadm command provides a rule-based scanning operation to determine whether servers in an existing SharePoint environment meet the core requirements for upgrading from Windows SharePoint Services 3.0 and related products to future releases of SharePoint Products and Technologies.
The pre-upgrade scanning and reporting operation is implemented as Stsadm -o preupgradecheck, and can be run with or without parameters.
Upon execution, the command checks your environment against various rules. The result of each rule check is written to both an XML log file and a text log file, located in the %COMMONPROGRAMFILES%\Microsoft Shared\web server extensions\12\LOGS directory, and when the command finishes it displays an HTML file in the default web browser summarizing the results.
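For reference, a minimal invocation from one of the web front ends looks like this (the path assumes the default 12-hive location):

& "$env:CommonProgramFiles\Microsoft Shared\web server extensions\12\BIN\stsadm.exe" -o preupgradecheck

Run without parameters it evaluates every rule, and the HTML summary it opens at the end should flag custom site definitions, features, and similar server-side customizations, which is exactly the inventory you need before an upgrade.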
On a SharePoint 2013 on-premises installation, we have configured an incremental crawl that runs every 15 minutes.
Now we also need a way to start this crawl manually. Is there a web service we can call to start the crawl manually? Maybe even a way to start the crawl on a smaller scope, like only a list or website?
You may just love Simple HTTP api for Executing PowerShell Scripts
In cases where you want to simply invoke a PowerShell script remotely,
a REST api is a good choice since all modern programming languages
make it simple to perform a HTTP GET operation.
and Simple HTTP api for executing PowerShell scripts (with source code), which is an out-of-the-box implementation of what the above describes. See the Windows PowerShell cmdlets to administer and configure search in SharePoint 2013 for more details.
Please also look at SharePoint Windows PowerShell Remoting if you are interested in more nitty-gritty remote SharePoint control.
Try using the PowerShell interface for this purpose.
It is pretty simple when you think about it.
We can use the Get-SPEnterpriseSearchCrawlContentSource command to get a ContentSource object.
After we have a reference to this object, we can start, stop, and pause crawls to our heart’s content.
This class has a few methods that you might be interested in:
StartFullCrawl, StartIncrementalCrawl, StopCrawl, PauseCrawl, and ResumeCrawl.
What each method does should go without explanation.
If necessary, you can create a wrapper for the PowerShell call; your web service would then invoke PowerShell for this purpose.
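For illustration, a minimal sketch of what such a wrapper would run (the content source name "Local SharePoint sites" is the usual default but is an assumption; adjust it for your farm):

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Get the Search Service Application and the content source to crawl
$ssa = Get-SPEnterpriseSearchServiceApplication
$cs = Get-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa -Identity "Local SharePoint sites"

# Only kick off a new crawl if nothing is currently running for this source
if ($cs.CrawlStatus -eq "Idle") {
    $cs.StartIncrementalCrawl()
}

Note that crawls run at the content source level, so to crawl a smaller scope you would need a dedicated content source whose start address covers just that site; as far as I know there is no way to crawl a single list on its own.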
The SharePoint install is a SP2010 install on a 2008 R2 server. Everything is fully patched. I am running the SP Designer on the SharePoint Server directly.
I have a workflow which is intended to send an email when a new document is created in a custom list. I have deliberately kept the workflow very simple in order to illustrate this problem.
After creating this single step workflow in SP Designer, I click "Check for Errors" and SP Designer reports "The workflow contains no errors".
I then click "Publish" but the Workflow Error dialog is displayed with the message
Errors were found when compiling the workflow. The workflow files
were saved but cannot be run.
Clicking the advanced button reveals more information:
Could not publish the workflow because the workflow configuration file
contains errors
Any suggestions gratefully received
I'll share what fixed it for me: deactivating all workflow features at the site collection level (that is, Workflows, Three-state workflow, Publishing Approval Workflow) and then reactivating them. I was then able to publish my workflow. This post helped; I'm not sure whether it only applies to 365, but it's worth trying first if you are considering a reinstall.
After googling for quite some time, I think it's an authentication issue. How is your SharePoint set up? Do you use HTTPS for authentication? If so, check out this article.
I know this error message from SharePoint. I got it when dealing with multiple lookup fields referring to other lists. Even when I check the workflow for errors, SharePoint says it's all fine, but I can't publish it at all.
Try building a new test site in your site collection. Build a custom document library, leave it standard, and then set up a new simple workflow that just sends a mail.
Fill out the needed fields in the mail using only simple values: send to your own mail address, with a simple subject and a simple body.
Set the workflow to run only manually.
Try to publish the workflow.
When this works, compare it to your existing workflow and change your values by trial and error.
After doing a clean install of the OS and SharePoint, workflows are working flawlessly. I can only conclude that the problems were caused by left over registry settings from MOSS 2007. Thanks for the suggestions that people made.
This can also happen if you change the URL of the web application; all you have to do is click the Design button from the library itself.
When changing the URL from http://server/Site to, for example, http://server.xx1.net/site, publishing still tries the old URL.
What helped in my situation was changing the workflow from starting automatically to starting manually. Sometimes the answer to a critical situation is very easy. Hope it helps, many thanks.
I ran into this problem, and after digging for days with folks suggesting rebuilding the servers, disabling and re-enabling site features, removing previous workflow versions, etc., I tried everything except rebuilding the servers (not practical for a client's production environment). I decided to run some tests and found that this issue only happened on one particular list, no matter how simple or complex the workflow was. When I checked the box to start automatically on item create (or when an item changed), it would fail to publish with the error above, but publishing with just the manual start worked fine.

Finally, after deleting views and some more testing, I discovered there were over 240 columns in this list (I did not create it...) and 50+ workflows set to run on create. Thankfully I had a test environment built out for the client, so I synced the site collection database from production back to the test environment, re-ran my tests, and got the same error. So what resolved the problem, and what was the ultimate cause: there were too many columns defined in the list, and I had to delete several columns before the workflow would publish in the test environment. This comes down to a limit in SQL Server on how much data the list can store; each type of column takes up so much space. Read more about it here:
https://technet.microsoft.com/en-us/library/cc262787(v=office.15).aspx#Column
So what I did in production was work with my client to determine how to break the list up into multiple lists with relationships between them, thus moving some of the columns and data to another list (think database/list normalization). I hope this solution helps someone.
I have installed and configured SharePoint 2010 to run on the same box as the SQL Server it runs from, on Windows Server 2008 R2. Everything is working fine except the search. I have uploaded several documents and tagged several items (documents, tasks, announcements, etc.), however whenever I search the site using the default search I get nothing returned no matter what I search on; I simply get "We did not find any results for [search term]". I know there is setup needed if you wish to use FAST Search, but do I have to do anything to get the standard default search to work?
Found the answer on SharePoint.SE:
After installing the system you need to configure your indexing job.
Navigate to CA > Service Applications > Search.
You will see a link to your Content Sources. If you edit that it will give you the opportunity to setup a schedule for both Full and Incremental indexing.
You can kick off a full crawl; once it completes you will have results, assuming everything is configured correctly.
It works for people search too. If you edit this content source you will see the sps protocol, which is for user profiles.
To make people search work, go to Central Administration > Manage Service Applications and make sure a valid domain\user account is set as administrator on the User Profile Service Application.
To add on to Chensformers' answer, the account has to have "Retrieve People Data for Search Crawlers" enabled. It has to be set via the Administrators button, even if the permission is set to Full Control! Quite misleading.
For some reason my search in the SharePoint site does not work.
I have set up the SSP, the scopes, the crawls, everything, but it still does not work.
Can someone explain to me how to setup the search? Maybe I did something wrong in the process.
It's not the simplest thing in the world to setup, as it's comprised of a number of components.
You need to check each one to determine where your problem is.
Start from the crawl, and work your way forward to the search production on the page.
So check the following:
Check that some servers have been set up to index pages. (You can see this under Services on Server in the Central Administration pages.)
Make sure they're all running correctly, not stuck in a half-started state.
Check your crawl log in your SSP to see if it is indexing anything.
Index different types of content (file shares, web sites, and SharePoint itself) and check each one.
Note that you need a special iFilter plugin to index PDFs.
Check that your index is copied to the front-end server where it is used.
If it's not, it may be because this hasn't been configured (check the services running on servers again).
Then check your site collection setup, and ensure you have a search site configured.
Ensure the site collection search settings are configured to use the search site.
Finally, check that the user doing the searching actually has access to the content being indexed.
Doing all of that should give you some idea of where the problem is.
In addition to Bravax's answer, it's worth checking that you are not getting stung by the local loopback check.
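If the loopback check turns out to be the culprit (typically 401 errors when the crawler hits the local server by its own host name), the commonly cited workaround for single-server dev/test boxes is the DisableLoopbackCheck registry value, roughly:

# Dev/test boxes only; on production the BackConnectionHostNames approach is preferred
New-ItemProperty -Path "HKLM:\System\CurrentControlSet\Control\Lsa" -Name "DisableLoopbackCheck" -Value 1 -PropertyType DWord
# Reboot (or at least restart IIS) afterwards for the change to take effect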
I had a similar problem and ended up using Search Server Express, which is free (see my answer at this link: sharepoint 2010 foundation search not working).
I have installed Search Server Express 2010 on top of SPF, which works great. It has additional features and works well with SharePoint Foundation. Here is a link for upgrade and configuration: http://www.mssharepointtips.com/tip.asp?id=1086
You need to add the website to the content source and then run a full crawl to index the data.
What is the easiest way to extract SharePoint list data to a separate SQL Server table? One condition: you're in a work environment where you don't control the SQL Server behind the SharePoint server, so you can't just pull from the UserData table.
Are there any utilities that you can use to schedule a nightly extract?
Is Microsoft planning any improvement here for "SharePoint 4"?
Update Jan 06, 2009:
http://connectionstrings.com/sharepoint
For servers where Office is not installed you will need:
this download
There is an SSIS SharePoint task you can use to grab the data into a regular data flow:
http://www.codeplex.com/SQLSrvIntegrationSrv
Scraping? As in screen scraping? Are you serious? ;)
2 Options
SharePoint Object Model - http://msdn.microsoft.com/en-us/library/ms441339.aspx
SharePoint Web Services - http://msdn.microsoft.com/en-us/library/ms479390.aspx
specifically the Lists web service
The web services are how Excel/Access communicate with SharePoint to integrate with its lists.
In fact, a bit of Google-fu gives these two results:
Connecting SQL Reporting Services to a SharePoint List
Accessing SharePoint List Items with SQL Server 2005 Reporting Services
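If you go the web services route, a rough sketch of pulling list items via Lists.asmx from PowerShell might look like this (the site URL, list name, and field name are placeholders):

# Connect to the Lists web service on the target site
$uri = "http://server/sites/yoursite/_vti_bin/Lists.asmx?WSDL"
$lists = New-WebServiceProxy -Uri $uri -UseDefaultCredential

# Empty CAML nodes return every item; add a <Where> clause to the Query node to filter
$xml = New-Object System.Xml.XmlDocument
$query = $xml.CreateElement("Query")
$viewFields = $xml.CreateElement("ViewFields")
$queryOptions = $xml.CreateElement("QueryOptions")

$result = $lists.GetListItems("My Custom List", $null, $query, $viewFields, "500", $queryOptions, $null)

# Each item comes back as a z:row element whose columns are ows_* attributes
$result.data.row | ForEach-Object { $_.ows_Title }

The rowLimit argument, plus the ListItemCollectionPositionNext attribute on the returned rs:data element, handles paging through large lists.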
The two-minute answer is to use Data Synchronisation Studio from Simego (http://www.simego.com): just point it at your list and database and it will sync all the changes.
There is an ADO.NET adapter for MOSS 2007/2010 and WSS 3.0/4.0 available which goes under the name Camelot .NET Connector for Microsoft SharePoint. It enables you to query lists in SharePoint through standard SQL language, using SharePoint as a data layer.
Besides the connector, there will be a large number of open source tools and utilities available, such as web parts for exporting data to various formats (XML, MySQL, ..), Joomla plugins, synchronization services, etc.
See http://www.bendsoft.com for more details and to watch webcasts. BendSoft is currently looking for beta testers and encourages all feedback from the community.
Example:
SELECT * FROM My Custom SharePoint List
INSERT INTO Calendar (EventDate,EndDate,Title,Location) VALUES ('2010-11-04 08:00:00','2010-11-04 10:00:00','Morning meeting with Leia','Starbucks')
DELETE FROM Corp Images WHERE Image Name = 'marketing.jpg'
I have written a full article about this with step-by-step screenshots. It does not use any third-party components, only SQL BI tools and SharePoint. Have a look here:
http://macaalay.com/2013/11/01/how-to-archive-sharepoint-list-items-to-sql-server/
As Ryan said, I would also suggest using the object model / web services to store data in a separate SQL database. I think the best approach is to write an event handler that triggers on your list and copies the data the user inserted or updated.
Regarding your query about "SharePoint 4", Bill Gates made some remarks at SharePoint Conference 2008. He suggests enriching SQL tables with SharePoint data, and goes on to mention several other potentially cool things. What exactly he means and whether it will help solve your problem in the future is hard to say until we start seeing betas of WSS4 / MOSS 14.
I would go with the Simego software, but I don't have the money; maybe a 15-day trial is enough!
If you have MOSS installed, the Business Data Catalog can be set up from SharePoint Central Administration to automagically synchronize data for you. This is a very powerful product and is included with MOSS. I love it when a client has it enabled so I can take advantage of it.
But some don't, and for myself, I've found that if they don't have BDC running and available, inevitably they don't give developers many rights to SQL Server, so SSIS is generally out of the question (but maybe that's just me). No problem; for those I'll pull together a lightweight EXE that runs as a scheduled task, queries Lists.asmx, and pushes changes to a SQL Server table. Fairly trivial stuff for a simple list where nothing is deleted. Get yourself Visual Studio 2008, CAML Builder, and prepare for a good time. The Lists.asmx result is a little funny in that each list row comes back as a single node with a lot of attributes and no child nodes ... something like this off the top of my head ... just remember that when coding ...
<z:row ows_Id="1" ows_Field1="A1" ows_Field2="B1"/>
<z:row ows_Id="1" ows_Field1="A2" ows_Field2="B2"/>
Complications in code occur with copying lists where items are deleted, or where there is a parent/child relationship between SP lists. You'd think I'd have some code to send you, but I haven't bothered putting together something I could reuse.
I'm sure there's other ways of handling it, but the scheduled task EXE so far has been reliable for me for multiple apps for multiple years.
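For what it's worth, the "push changes to a SQL Server table" half of such an EXE can be sketched like this (the staging table, its columns, and the connection string are made up, and real code also has to handle updates and deletes):

# $rows is assumed to hold the z:row elements returned by a Lists.asmx GetListItems call,
# e.g. $rows = $listItems.data.row
$conn = New-Object System.Data.SqlClient.SqlConnection("Server=dbserver;Database=Staging;Integrated Security=True")
$conn.Open()

foreach ($row in $rows) {
    $cmd = $conn.CreateCommand()
    # Parameterised insert into a hypothetical staging table
    $cmd.CommandText = "INSERT INTO dbo.ListStaging (SPId, Field1, Field2) VALUES (@id, @f1, @f2)"
    [void]$cmd.Parameters.AddWithValue("@id", $row.ows_Id)
    [void]$cmd.Parameters.AddWithValue("@f1", $row.ows_Field1)
    [void]$cmd.Parameters.AddWithValue("@f2", $row.ows_Field2)
    [void]$cmd.ExecuteNonQuery()
}

$conn.Close()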
I wrote some code to achieve this; you can find it over here:
extract data from moss 2007
Depending on the exact nature of the data you need to extract, it may be possible to just use the auto-generated RSS feed to get the information you want; a process would need to read the RSS and formulate a query.
Otherwise a console app/service could use the object model to do the same thing, but with more control over field information.
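For illustration, reading that auto-generated feed from PowerShell is roughly this (the list GUID in the URL is a placeholder; the real feed URL can be copied from the list's View RSS Feed action):

# Download the list's RSS feed (listfeed.aspx) and parse it as XML
$url = "http://server/sites/yoursite/_layouts/listfeed.aspx?List={00000000-0000-0000-0000-000000000000}"
$client = New-Object System.Net.WebClient
$client.UseDefaultCredentials = $true
[xml]$feed = $client.DownloadString($url)

# Each item's description carries the list columns rendered as HTML, which is
# why some parsing is still needed to get at individual field values
$feed.rss.channel.item | ForEach-Object { $_.title }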
I wish something like this were much easier to do: something that didn't need SSIS and was boiled down to a console tool that reads an XML config file for source/target/map info.
http://blogs.officezealot.com/mtblog/archive/2008/06/03/importing-list-data-into-sql.aspx