RSS news feed of a subpage but not the whole page - web

Is it possible to get a news feed for a certain section of a webpage? E.g. I want to get the news feed of this page:
http://gulfnews.com/business/property
but the news feed I get is for the whole site. Any help?
Is there any tool where I can control which posts to publish and which to ignore? I am using http://www.rssgraffiti.com/ but it doesn't have such a filtering option in the free package.

If their content management system supported it, you could. But apparently their content management system doesn't support that.
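If the publisher's CMS won't give you a section feed, one workaround is to filter the site-wide feed yourself before republishing it. A minimal C# sketch, assuming the items for that section can be recognised by the /business/property path in their link URLs (the feed URL below is a placeholder, not a known endpoint):

    using System;
    using System.Linq;
    using System.ServiceModel.Syndication;
    using System.Xml;

    class FeedFilter
    {
        static void Main()
        {
            // Placeholder feed URL; substitute the site's real RSS endpoint.
            var feedUrl = "http://gulfnews.com/rss";

            using (var reader = XmlReader.Create(feedUrl))
            {
                var feed = SyndicationFeed.Load(reader);

                // Keep only items whose link falls under the /business/property section.
                var propertyItems = feed.Items.Where(item =>
                    item.Links.Any(link =>
                        link.Uri != null &&
                        link.Uri.AbsolutePath.StartsWith("/business/property",
                            StringComparison.OrdinalIgnoreCase)));

                foreach (var item in propertyItems)
                    Console.WriteLine("{0} - {1}", item.PublishDate, item.Title.Text);
            }
        }
    }

You could then serialize the filtered items back out as a new feed (for example with Rss20FeedFormatter) and point RSS Graffiti at that instead of the site-wide feed.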

Related

Liferay search results

I am currently involved in a project where we are using Liferay (6.1 GA2).
It seems that Liferay search results provide links to Web Content Fragments instead of to the pages containing them.
Have any of you gone through this issue? Do you know how to solve it?
Thanks a lot pals.
Best, Alberto
You can have a lot more content in the backend than actually displayed on any page. Further, you can display any article on multiple pages at once.
A way to work around this is to specify in the "Web Content Search" portlet that you're only interested in content that is actually published. However, this does not solve your second problem: The content can still be published on many different pages.
Every piece of content can have a "Display Page"; the setup of such a display page is well explained in the UI (see the Web Content Editor), so you'll actually see a proper page in the search results.
If you actually want to search for pages only instead of content (you might miss out on some metadata), I'd recommend going with a spider solution that crawls your website, indexes the pages independently of their construction elements (articles), and searches that external index.
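As a very rough illustration of that spider approach (nothing Liferay-specific), the sketch below fetches a handful of pages and builds a small in-memory index from page title to URL; in practice you would point a real crawler or search appliance at the site instead. The page list and the title extraction are assumptions for the example:

    using System;
    using System.Collections.Generic;
    using System.Net;
    using System.Text.RegularExpressions;

    class PageIndexer
    {
        static void Main()
        {
            // Placeholder list of site pages; a real spider would discover these by following links.
            var pages = new[] { "http://example.com/", "http://example.com/about" };

            var index = new Dictionary<string, string>(); // title -> URL

            using (var client = new WebClient())
            {
                foreach (var url in pages)
                {
                    var html = client.DownloadString(url);

                    // Index by the page's <title>, ignoring how the page was assembled from articles.
                    var match = Regex.Match(html, @"<title>\s*(.*?)\s*</title>",
                        RegexOptions.IgnoreCase | RegexOptions.Singleline);
                    if (match.Success)
                        index[match.Groups[1].Value] = url;
                }
            }

            foreach (var entry in index)
                Console.WriteLine("{0} -> {1}", entry.Key, entry.Value);
        }
    }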

Show Content from other websites using RSS in my Drupal Site

I would like to pull content from other websites into my Drupal site using RSS feeds and show it on a page.
I searched for RSS modules. There are so many. Can you suggest a good one?
I'm trying to show the content from here
http://feeds.feedburner.com/brazen_careerist
Update: I tried installing the Feeds module and gave it this URL:
http://feeds.feedburner.com/brazen_careerist?format=xml
It imported all the items. How do I auto-import only the latest ones, auto-format them, and make them available on my site whenever new content appears at the source?
Thanks a lot
-Vivek
To have it set up to auto-import, you have to play around with the importer's basic settings. Check out the handbook page for Feeds here
Go to "Basic settings". Decide whether
the importer should be used on a
standalone form or by creating a node
("Attached to content type"); decide
whether the importer should
periodically refresh the feed and in
what time interval it should do that
("Minimum refresh period").
I'm not quite sure what you mean by auto-format but I think you might want to look at field mapping (information is on the handbook page as well).
Hope it helps.
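For what it's worth, the "import only the latest ones" behaviour comes down to remembering which items have already been imported and skipping them on the next refresh, which is what Feeds does for you. A minimal sketch of that general mechanism in C# (not Drupal code; the in-memory set stands in for the tracking Feeds keeps in its database):

    using System;
    using System.Collections.Generic;
    using System.ServiceModel.Syndication;
    using System.Xml;

    class IncrementalImporter
    {
        // IDs of items already imported; Feeds persists the equivalent between cron runs.
        static readonly HashSet<string> seenIds = new HashSet<string>();

        static void Poll(string feedUrl)
        {
            using (var reader = XmlReader.Create(feedUrl))
            {
                var feed = SyndicationFeed.Load(reader);
                foreach (var item in feed.Items)
                {
                    // Skip anything we've imported on a previous poll.
                    if (!seenIds.Add(item.Id))
                        continue;

                    Console.WriteLine("Importing new item: " + item.Title.Text);
                }
            }
        }

        static void Main()
        {
            // Call this periodically, like the "Minimum refresh period" setting does.
            Poll("http://feeds.feedburner.com/brazen_careerist?format=xml");
        }
    }

Because the already-seen IDs are persisted, re-running the import doesn't duplicate items; only genuinely new entries come through.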

Software for building a sitemap

I have to create a content inventory for a website that doesn't have a sitemap. I do not have access to modify the website, and the site is very large. How can I build a sitemap of that website without having to browse it entirely?
I tried with Visio's sitemap builder, but it fails badly.
Let's say, for example, I want to create a sitemap of Stack Overflow.
Do you guys know of any software to build it?
You would have to browse it entirely to search every page for unique links within the site and then put them in an index.
Then, for each unique link you find within the site, you need to visit that page and search for more unique links.
You can use a tool such as HtmlAgilityPack to easily fetch pages and extract the links from them.
I have written an article which touches on the extracting links part of the problem:
http://runtingsproper.blogspot.com/2009/11/easily-extracting-links-from-snippet-of.html
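A minimal sketch of that crawl loop with HtmlAgilityPack (the start URL is a placeholder, and a real crawler would also want politeness delays, URL normalisation, and error handling):

    using System;
    using System.Collections.Generic;
    using HtmlAgilityPack;

    class SiteCrawler
    {
        static void Main()
        {
            var start = new Uri("http://example.com/");
            var toVisit = new Queue<Uri>();
            var seen = new HashSet<string>();

            toVisit.Enqueue(start);
            seen.Add(start.AbsoluteUri);

            var web = new HtmlWeb();

            while (toVisit.Count > 0)
            {
                var current = toVisit.Dequeue();
                Console.WriteLine(current);

                var doc = web.Load(current.AbsoluteUri);
                var anchors = doc.DocumentNode.SelectNodes("//a[@href]");
                if (anchors == null) continue;

                foreach (var anchor in anchors)
                {
                    // Resolve relative links against the current page.
                    var href = anchor.GetAttributeValue("href", "");
                    Uri link;
                    if (!Uri.TryCreate(current, href, out link)) continue;

                    // Stay within the site and visit each unique URL only once.
                    if (link.Host == start.Host && seen.Add(link.AbsoluteUri))
                        toVisit.Enqueue(link);
                }
            }
        }
    }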
I would register all your pages in a database and then just output them all on a page (PHP + SQL). Maybe even indexing software could help you. First of all, just make sure all your pages are linked up, and still submit it to Google!
Just googled and found this one.
http://www.xml-sitemaps.com/
Looks pretty interesting!
There is a pretty big collection of XML Sitemaps generators (assuming that's what you want to generate -- not an HTML sitemap page or something else?) at http://code.google.com/p/sitemap-generators/wiki/SitemapGenerators
In general, for any larger site, the best solution is really to grab the information directly from the source, for example from the database that powers the site. By doing that you can get the most accurate and up-to-date Sitemap file. If you have to crawl the site to get the URLs for a Sitemap file, it will take quite some time for a larger site and it will load the server during that time (it's like someone visiting all pages in your site). Crawling the site from time to time to determine if there are crawlability issues (such as endless calendars, content hidden through forms, etc) is a good idea, but if you can, it's generally better to get the URLs for the Sitemap file directly.
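If you can get at the source directly, writing the Sitemap file itself is the easy part. A minimal C# sketch using the standard sitemaps.org XML format; the hard-coded URL list stands in for whatever database query your site supports:

    using System;
    using System.Xml;

    class SitemapWriter
    {
        const string Ns = "http://www.sitemaps.org/schemas/sitemap/0.9";

        static void Main()
        {
            // Stand-in for URLs pulled straight from the site's database.
            var urls = new[] { "http://example.com/", "http://example.com/about" };

            var settings = new XmlWriterSettings { Indent = true };
            using (var writer = XmlWriter.Create("sitemap.xml", settings))
            {
                writer.WriteStartElement("urlset", Ns);

                foreach (var url in urls)
                {
                    writer.WriteStartElement("url", Ns);
                    writer.WriteElementString("loc", Ns, url);
                    writer.WriteElementString("lastmod", Ns,
                        DateTime.UtcNow.ToString("yyyy-MM-dd"));
                    writer.WriteEndElement();
                }

                writer.WriteEndElement();
            }
        }
    }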

How to programmatically disable ECB menu item for a particular list item?

I have a custom document library which is based on the default document library. I'd like to disable "Edit physical document" for some of the documents, depending on their properties.
If you don't mind a client side solution then you can use either a single Content Editor Web Part to inject some JavaScript on a single page or execute the script on every page in the site collection using our free SharePoint Infuser.
Plenty of examples, although none for your particular problem, can be found here.

Create SharePoint web page outside of document library

I'm reasonably new to SharePoint 2007 and trying to move from an ASP.NET to SharePoint way of thinking has been an interesting experience!
I would like to create a page at the same level as the default.aspx page in a subsite. The "SharePoint way" of doing things involves putting the page into a document library. I am reluctant to do this as the breadcrumb navigation of the page then includes the name of the library, but I would like the library to be transparent to the user.
I can create a page in the right place in SharePoint Designer, but I can't find a way to use a SharePoint template. I have tried copying the default.aspx page, but the navigation links are not updated.
Am I missing something or can someone suggest a solution?
I see what you are saying. I would like to share my thoughts on how I would do it.
If the breadcrumb is your only reason for wanting the page outside the library, then I recommend overriding the ContentPlaceHolder that holds the breadcrumb in your page, so the breadcrumb won't appear. The URL will still be there for the user to guess: ./DocLib/default.aspx.
If the reason to hide the document library is to make sure users can't get into the library and change something, I recommend stripping the permissions from the document library, giving all users read-only access, and granting more rights only to the users who you think will need to edit the pages.
And finally, if you want the pages in a specific place, you can try deploying them as a Feature that provisions the pages as Ghostable rather than GhostableInLibrary.
While options 1 and 3 can be packaged in a WSP, option 2 needs a bit of manual or custom code if you are trying to automate the process.
For the steps to create Ghostable pages, you can refer to this
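For reference, the Ghostable provisioning in option 3 comes down to a Module element in the Feature's element manifest. A minimal sketch, assuming a Feature that deploys one page at the web root (the module name and file name are placeholders); Type="Ghostable" is what keeps the page out of a document library:

    <Elements xmlns="http://schemas.microsoft.com/sharepoint/">
      <!-- Provisions custom.aspx at the root of the web, next to default.aspx -->
      <Module Name="CustomPages" Path="" Url="">
        <File Url="custom.aspx" Type="Ghostable" />
      </Module>
    </Elements>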
It doesn't seem to be possible. Subsites can be used to categorise content by topic, but they can't do much beyond that.
