We are building a large SharePoint 2007 installation with several hundred site collections over four regionally hosted Web applications. We have approximately 12,000 users, spread out more or less evenly around the globe, and each user may visit many site collections - both on their "home" regional server and on other regional servers.
My question is: how can we allow each user to set his/her time zone once, but allow the time zone to be synchronized to each site collection? Also, if the user moves from one time zone to another, how can we allow him/her to change time zones and apply the changes across all site collections?
We've considered the following:
Update time zone records via the SharePoint API using a scheduled process. Clumsy and slow - we'd prefer changes to take effect more quickly, and our maintenance windows are pretty small already.
Put a trigger on the table that holds time zone information and use a .NET stored procedure to update via the SharePoint API. This definitely runs counter to SharePoint best practices.
Create a workflow that allows a user to set his/her home time zone, and then iterate through the site collections to set the appropriate time zone info. This seems to work for existing site collections, but new site collections wouldn't get the settings.
Store the user's time zone in a cookie; have the master page get the cookie and update the current site collection's time zone setting. This may work, but our users may use multiple machines, and we would rather not incur the overhead of doing this on every page load.
So bottom line is that we're not sure what the best option is for us. Any thoughts would be appreciated.
I would suggest building on your cookie idea:
Store the user's time zone in their profile and provide an interface to change it.
On page load, if a time zone cookie does not exist, create one based on the user profile value.
Compare the cookie value to the time zone set in SPContext.Current.Web.CurrentUser and update accordingly.
As the SPUser object will already exist, and you can use cookies to avoid constantly looking up the profile value, the performance impact should be negligible. You can either add this logic to the master page or use a delegate control to insert a lightweight control (my preference).
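For what it's worth, here is a minimal sketch of the delegate control logic, assuming the time zone ID is kept in the user profile and mirrored in a cookie. The cookie name, the profile lookup helper, and the way a user-scoped SPRegionalSettings is created and assigned are my assumptions about the MOSS 2007 object model, so verify them in your farm before relying on this:

```csharp
// Sketch of a lightweight control registered via a delegate control.
// Assumption: the user's home time zone ID lives in the profile and is mirrored in a cookie.
using System;
using System.Web;
using System.Web.UI;
using Microsoft.SharePoint;

public class TimeZoneSyncControl : Control
{
    private const string TimeZoneCookieName = "UserTimeZoneId"; // hypothetical cookie name

    protected override void OnLoad(EventArgs e)
    {
        base.OnLoad(e);

        SPWeb web = SPContext.Current.Web;
        SPUser user = web.CurrentUser;
        if (user == null) return;

        HttpCookie cookie = Page.Request.Cookies[TimeZoneCookieName];
        if (cookie == null)
        {
            // Cookie missing: fall back to the profile value (lookup not shown).
            string profileTimeZoneId = GetTimeZoneIdFromProfile(user.LoginName); // hypothetical helper
            cookie = new HttpCookie(TimeZoneCookieName, profileTimeZoneId);
            cookie.Expires = DateTime.Now.AddDays(30);
            Page.Response.Cookies.Add(cookie);
        }

        ushort desiredId = ushort.Parse(cookie.Value);

        // Only touch the user's regional settings if they differ from the desired time zone.
        SPRegionalSettings current = user.RegionalSettings;
        if (current == null || current.TimeZone.ID != desiredId)
        {
            // Assumption: user-scoped regional settings can be created and assigned like this.
            SPRegionalSettings settings = new SPRegionalSettings(web, true);
            settings.TimeZone.ID = desiredId;
            user.RegionalSettings = settings;
            user.Update();
        }
    }

    private string GetTimeZoneIdFromProfile(string loginName)
    {
        // Placeholder: read the stored time zone ID from your user profile store.
        throw new NotImplementedException();
    }
}
```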
Why must a web crawler have robustness, politeness, scalability, quality, freshness and extensibility?
Robustness: a web crawler must be robust to changes in website content. A search engine needs to retrieve and index every new web page as soon as possible, but if a website has only just come online, the crawler needs time to work through all the entries already at the front of the frontier queue before it gets to the new site. To address this, the crawler is run as a distributed system in which different nodes index different web pages according to different specifications.
Politeness: a web crawler must respect every web server's policies on re-indexing its pages. If a web server asks the crawler not to crawl a page aggressively, the crawler can put that page into a priority queue and re-index it only when it reaches the top of the queue.
Scalability: new web pages are added to the internet every day, and the crawler must index every page as soon as possible. For this it needs fault tolerance, distributed systems, extra machines, etc. If a crawler node fails, the other nodes can divide up its work and index the pages it was responsible for.
Quality: the search engine's ability to return useful web pages to every user. If a page contains content far from a user's recent searches or interests, the search engine should use previous user behaviour to predict what kind of content the user is likely to want.
Freshness: the crawler's ability to fetch and index fresh copies of each page. News websites, for example, are updated every second and need to be re-indexed urgently; for this, the crawler keeps a separate priority queue for such time-sensitive content so those pages are re-indexed within a short period (a rough frontier sketch follows this list).
Extensibility: over time, new data formats, languages, and protocols are introduced. The crawler's ability to cope with new and unseen data formats and protocols is called extensibility, and it suggests that the crawler architecture must be modular, so that changes in one module do not affect the others. If a website contains a data format unknown to the crawler, the crawler can still fetch the data, but human intervention is required to add the format's details to the crawler's indexing module.
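As a rough illustration of the priority queue mentioned under politeness and freshness, here is a small, made-up frontier sketch: URLs carry a priority score, and each host has a politeness delay before it may be contacted again. The names and the fixed two-second delay are assumptions, not how any particular crawler works:

```csharp
// Illustrative crawl frontier: priority-ordered URLs plus a per-host politeness delay.
using System;
using System.Collections.Generic;
using System.Linq;

class CrawlFrontier
{
    private readonly List<(Uri Url, double Priority)> _queue = new List<(Uri Url, double Priority)>();
    private readonly Dictionary<string, DateTime> _nextAllowedFetch = new Dictionary<string, DateTime>();
    private readonly TimeSpan _politenessDelay = TimeSpan.FromSeconds(2);

    // Frequently changing pages (e.g. news sites) would be enqueued with a higher priority score.
    public void Enqueue(Uri url, double priority) => _queue.Add((url, priority));

    // Returns the highest-priority URL whose host may currently be contacted, or null if none is eligible.
    public Uri Dequeue()
    {
        DateTime now = DateTime.UtcNow;
        var eligible = _queue
            .Where(e => !_nextAllowedFetch.TryGetValue(e.Url.Host, out DateTime next) || next <= now)
            .OrderByDescending(e => e.Priority)
            .ToList();
        if (eligible.Count == 0) return null;

        var chosen = eligible[0];
        _queue.Remove(chosen);
        _nextAllowedFetch[chosen.Url.Host] = now + _politenessDelay; // back off before hitting this host again
        return chosen.Url;
    }
}
```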
I'm currently running Sharepoint 2013 Enterprise and would like to know the following:
Which pages each user has accessed over the last X Days
How much time each user has spent on each page (can't average, I need to know per user)
It is like Google Analytics, but at a "user" level. Any clues on how to do that?
I searched Stack Overflow a lot and found nothing. Maybe I'm using the wrong terminology.
Under the site collection administration there is "Popularity and Search Reports"; you can find some of the information there.
According to MSDN blog "Due to less than optimal performance running service at scale in large enterprises, web analytics has been discontinued and is not available in SharePoint 2013."
Web analytics was available in 2010.
If allowed, you can choose Google Analytics; it's just a JavaScript reference that you add to your default master page.
This is definitely not web analytics, but user-specific usage reporting based on sessions.
Assuming that this is cleared by your legal department (which, at least in the EU, most likely would not happen), what you need is a simple database of page views (with session info) and each action time-stamped.
This would then be reported per user as a list of sessions, each containing a list of pages, with the time per session calculated as the difference between the first and last interaction.
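A minimal sketch of that kind of logging (my own illustration, not an out-of-the-box SharePoint feature) could be an HTTP module that writes one row per page request; the connection string and PageViews table are assumptions, and in practice you would probably batch or queue these writes:

```csharp
// Illustrative only: logs user, URL and timestamp per page request so that sessions
// and time-per-page can be reported later.
using System;
using System.Data.SqlClient;
using System.Web;

public class PageViewLoggingModule : IHttpModule
{
    private const string ConnectionString = "..."; // assumption: your own logging database

    public void Init(HttpApplication application)
    {
        application.PostAuthenticateRequest += OnPostAuthenticateRequest;
    }

    private void OnPostAuthenticateRequest(object sender, EventArgs e)
    {
        HttpContext context = ((HttpApplication)sender).Context;

        // Only log page requests, not images, scripts, etc.
        if (!context.Request.Path.EndsWith(".aspx", StringComparison.OrdinalIgnoreCase))
            return;

        string userName = context.User != null ? context.User.Identity.Name : "(anonymous)";

        using (SqlConnection connection = new SqlConnection(ConnectionString))
        using (SqlCommand command = new SqlCommand(
            "INSERT INTO PageViews (UserName, Url, ViewedAt) VALUES (@user, @url, @time)", connection))
        {
            command.Parameters.AddWithValue("@user", userName);
            command.Parameters.AddWithValue("@url", context.Request.Url.AbsoluteUri);
            command.Parameters.AddWithValue("@time", DateTime.UtcNow);
            connection.Open();
            command.ExecuteNonQuery();
        }
    }

    public void Dispose() { }
}
```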
I have this problem: I have a web page with adult content, and for the past several months I have had PPC advertising on it. I've noticed a big difference between the ad company's statistics for my page, the Google Analytics data, and the AWStats data on my server.
For example, the ad company tells me that I have 10K pageviews per day, Google Analytics tells me I have 15K pageviews, and AWStats shows around 13K pageviews. Which system should I trust? Should I write my own (and reinvent the wheel again)? If so, how? :)
The funny thing is that I have another web page with "normal" content (an MMORPG fan site), and those numbers are roughly equal in all three systems (ad company, GA, AWStats). Do you think that's because it's not an adult-oriented page?
And a final question, which is totally off-topic: do you know of an ad company that pays per impression and doesn't mind adult sites?
Thanks for the answers!
First, you should make sure not to mix up »hits«, »files«, »visits« and »unique visits«. They all have different meanings and are sometimes named differently. I recommend looking up some definitions if you are confused about the terms.
AWStats probably has the most accurate statistics, because it has access to the access.log on the web server. Unfortunately, a cached page (perhaps cached by the browser, a proxy at an ISP, or your own caching server) might not produce a hit on the web server. Especially if your site is served with good caching hints which don't force a revalidation and you are running your own web cache (e.g. Squid) in front of your site, the number will be considerably lower, because it only measures the work of the web server.
On the other hand, Google Analytics is only able to count requests from users who haven't blocked Google Analytics and have JavaScript enabled (but it will count pages served by a web cache). So this count can be influenced by the user, but isn't affected by web caches.
The ad company is probably simply counting the number of requests it gets from your site (probably based on its own access.log). So, to get counted there, the ad must not be cached and must not be blocked by the user.
So, as you can see, it's not that easy to get a single correct value. But as long as you use the measured values in comparison to those from the previous months, you should get at least a (nearly) correct rate of growth.
And your porn site probably serves a large amount of static content (e.g. images from disk), and most web servers are really good at adding caching hints automatically for static files. Your MMORPG site, on the other hand, might mostly consist of dynamic scripts (PHP?) which don't send any caching hints at all, and web servers aren't able to determine caching headers for dynamic content automatically. That's at least my explanation, without knowing your application and server configuration :)
Is there an out-of-the-box solution to check the validity of documents? Let's say that when a document has been in a document library for 1 year, the author should get a warning, for example an e-mail, to revise the document.
I didn't find this in SharePoint. So I was thinking of creating my own feature for this:
A timer job which runs every night and checks all the documents in the site collection
The timer job can be configured through an admin page in the central admin, for example to configure on which site collections in a web application the job should run.
My concern is: when running this in a heavily used environment, doesn't it burden the servers too much? Take, for example, an environment with 100,000 documents spread out over 5 site collections. And what about looping through all those document libraries in the various SPWebs: use an SPSiteDataQuery to retrieve all those documents and loop through that collection? Because opening each document library in each SPWeb in 5 SPSites...
Or is there another option to accomplish this? With workflows? Because in the end, the owner of the document receives a warning and needs to confirm that the document is still valid. I haven't touched workflows much, to be quite honest.
I would like to hear your thoughts about this.
Maarten.
This SO question may give you some ideas (workflow / timer jobs / 3rd party etc.), as in essence your requirement for email alerts when documents are 1 year old is basically the same as 'a task is overdue':
Dated reminders in sharepoint calendars
Re: load - well, I can't give you specifics, as every situation is different, but you've got the ability to run this overnight, so I can't imagine it would really be much of a problem.
Also remember you're not actually retrieving/parsing the documents themselves, just the records containing each document's metadata, such as title, location, modified date, assigned to, etc.
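To make the SPSiteDataQuery idea from the question concrete, here is a rough sketch of the kind of metadata-only query the nightly job could run (the field names and cutoff handling are my assumptions, and the notification step is left out):

```csharp
// Sketch: query every document library in a site collection for documents
// not modified in the last year, returning only metadata (no document contents).
using System;
using System.Data;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Utilities;

public static class StaleDocumentFinder
{
    public static DataTable FindDocumentsOlderThanOneYear(SPSite site)
    {
        string cutoff = SPUtility.CreateISO8601DateTimeFromSystemDateTime(DateTime.Now.AddYears(-1));

        SPSiteDataQuery query = new SPSiteDataQuery();
        query.Lists = "<Lists ServerTemplate='101' />";       // 101 = document libraries
        query.Webs = "<Webs Scope='SiteCollection' />";       // all webs in the site collection
        query.ViewFields = "<FieldRef Name='Title' /><FieldRef Name='Modified' /><FieldRef Name='Author' />";
        query.Query =
            "<Where><Lt>" +
            "<FieldRef Name='Modified' />" +
            "<Value Type='DateTime' IncludeTimeValue='TRUE'>" + cutoff + "</Value>" +
            "</Lt></Where>";

        // One row per matching document; from here you could email the author
        // or create a review task for each row.
        return site.RootWeb.GetSiteData(query);
    }
}
```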
This sounds like a job for PowerShell.
Write a little script that queries the document libraries for documents that are older than one year.
Then send an email alert or create a task for the user to update the document.
Also, I would not worry about having thousands of workflows running. Windows Workflow Foundation is an enterprise product; I have had over 60,000 running without any problems.
I have the following situation:
MOSS 2007 Server Environment A -> Intranet
MOSS 2007 Server Environment B -> Collaboration Environment (approx. 150 site collections for various issues)
Both environments are on different infrastructures but we use the same Active Directory and the same groups. Now we would like to implement the following 2 things:
An overview page within the intranet with all available site collections on environment B.
An overview page within the intranet with only those site collections the user has access to.
Now I'm searching for some good ideas on what would be the best way to realise something like this.
Thanks in advance for any response.
The main thing to be careful of in a solution like this is performance, particularly for your second requirement. That would require looping through every site collection and retrieving permission data, either using the web services or the object model.
I would recommend writing a custom timer job (or two, one for each requirement, if that makes more sense) to execute at a low-traffic time and aggregate this information for storage in a custom SQL database. If there is never a low-traffic period, then delay your requests to reduce the impact on the server.
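A rough outline of such a timer job is below; the way per-user access is resolved, and where the rows go, are left as placeholders because they depend on whether you use groups, the object model, or the web services:

```csharp
// Sketch of a custom timer job that walks every site collection in its web application
// and records one row per site collection in a custom database for the overview web part.
using System;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Administration;

public class SiteCollectionInventoryJob : SPJobDefinition
{
    public SiteCollectionInventoryJob() : base() { }

    public SiteCollectionInventoryJob(string jobName, SPWebApplication webApplication)
        : base(jobName, webApplication, null, SPJobLockType.Job) { }

    public override void Execute(Guid targetInstanceId)
    {
        foreach (SPSite site in this.WebApplication.Sites)
        {
            try
            {
                using (SPWeb rootWeb = site.OpenWeb())
                {
                    // Requirement 1: record the site collection itself.
                    SaveSiteCollectionRow(rootWeb.Title, site.Url);

                    // Requirement 2: per-user access would be resolved here, e.g. by also
                    // recording the groups/users with access so the web part can filter later.
                }
            }
            finally
            {
                site.Dispose(); // SPSite objects from WebApplication.Sites must be disposed
            }
        }
    }

    private void SaveSiteCollectionRow(string title, string url)
    {
        // Placeholder: insert or update a row in your custom SQL database.
    }
}
```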
A custom web part (or again, two if more appropriate) can then be deployed to both environments. The web part would query the database for the required information and display it to the user.
If the timer job needs to update this data more frequently then you would need to implement some sort of in-memory caching. Depending on your requirements this may need a lot of memory.
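For the web part's caching, the simplest option is probably the ASP.NET cache with an absolute expiration; a small sketch, with an arbitrary key and duration:

```csharp
// Sketch: keep the aggregated site collection list in memory for a few minutes
// so the web part does not hit the custom database on every request.
using System;
using System.Collections.Generic;
using System.Web;
using System.Web.Caching;

public static class SiteOverviewCache
{
    private const string CacheKey = "SiteCollectionOverview"; // arbitrary key

    public static List<string> GetSiteCollections(Func<List<string>> loadFromDatabase)
    {
        List<string> cached = HttpRuntime.Cache[CacheKey] as List<string>;
        if (cached != null)
            return cached;

        List<string> data = loadFromDatabase();
        HttpRuntime.Cache.Insert(CacheKey, data, null,
            DateTime.UtcNow.AddMinutes(5), Cache.NoSlidingExpiration);
        return data;
    }
}
```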