We currently load our site's content into a container on the main page and never reload the whole site. We use jqAddress for deep linking. I've been tasked with creating a breadcrumb solution for the site and looked at MVCSiteMap as a solution. The problem is that when we load the content into the container and change the address, the site map breadcrumb isn't updating because the page isn't re-rendering; we are simply loading the partial inside a view container. My idea was to create another partial and have the jqAddress change event reload that partial, but I'm not sure how to get all the data to the SiteMap helpers to tell them where we just went so they can properly generate the breadcrumb path. Any help/ideas would be very much obliged.
Thanks in advance,
Austin
So it appears that the easiest way to fix this is to just include the site map path on every page that gets loaded into the view container. This will cause it to render the path properly. Now I just have to delve into the depths of the mysteries of dynamic node providers!
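For anyone who lands on this later, here is a minimal sketch of what that looks like, assuming the WebForms view engine and MvcSiteMapProvider's breadcrumb HTML helper (the exact helper call differs between MvcSiteMapProvider versions, so check the one you have installed):

    <%@ Control Language="C#" Inherits="System.Web.Mvc.ViewUserControl" %>
    <%-- Included at the top of every partial that gets loaded into the view
         container, so the path is re-rendered each time jqAddress swaps the content. --%>
    <div class="breadcrumb">
        <%= Html.MvcSiteMap().SiteMapPath() %>
    </div>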
I am currently involved in a project where we are using Liferay (6.1 GA2).
It seems that Liferay search results provide links to Web Content Fragments instead of to the pages containing them.
Have any of you gone through this issue? Do you know how to solve it?
Thanks a lot pals.
Best, Alberto
You can have a lot more content in the backend than actually displayed on any page. Further, you can display any article on multiple pages at once.
A way to work around this is to specify in the "Web Content Search" portlet that you're only interested in content that is actually published. However, this does not solve your second problem: The content can still be published on many different pages.
Every content item can have a "Display Page" - the setup of such a display page is well explained in the UI (see the Web Content Editor). Once a display page is set, search results will actually link to a proper page.
If you actually want to search for pages only instead of content (you might miss out on some metadata), I'd recommend going with a spider solution that crawls your website, indexes the pages independently of their construction elements (articles), and searches that external index.
I've downloaded Drupal 6 and installed it on my local server, and ported a basic web site as a custom theme. I've set this as the default theme and everything works okay; the page appears nice with all the images and layout.
The problem is that now, no matter what I type as the URL I always get to my page. So how can I go to the admin page? /q=user does not work. I can change the theme from the database but that is not what I want. I just want to keep this theme and be able to access all the Drupal functionality.
For the custom page I've created the page.tpl.php and .info file. Along with the CSS file, I've put them all in a new folder in the themes directory.
abhaga's answer is spot on - you've turned the entire site's theme into a single page's HTML, so all the pages are going to be the same.
If you'd like to avoid glitching the admin side of things with a bad template file, you can set one of the core themes as your "administration theme" (in Site Configuration) - that way, the admin backend will always use that theme regardless of the other templates.
Ah! Basically page.tpl.php specifies the overall structure of the site. Look at the original page.tpl.php file - it will be printing a variable called $content somewhere. That is the variable holding all the content of your specific page. You will need to print it at the appropriate place in your page.tpl.php.
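For illustration, a heavily cut-down page.tpl.php could look something like this (a sketch only - the real Drupal 6 template also prints regions, messages, scripts and so on; the key point is that $content must be printed somewhere):

    <?php // page.tpl.php (simplified sketch) ?>
    <html>
      <head>
        <title><?php print $head_title; ?></title>
        <?php print $head; ?>
        <?php print $styles; ?>
      </head>
      <body>
        <!-- your static layout, header and images wrap around this -->
        <div id="content">
          <?php print $content; /* without this, every URL shows the same markup */ ?>
        </div>
      </body>
    </html>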
You should check out the right way of creating a theme here: http://drupal.org/theme-guide/6
The current website structure is setup such that all the ASPX pages are in the main folder. It's becoming increasingly difficult to maintain, so I would like to create new folders and move the relevant pages. This would change the URL from say:
http://mydomain.com/DoStuff.aspx
to
http://mydomain.com/DoingFolder/DoStuff.aspx
I fear that this will skew the Google Analytics results. Is this change recommended? If so, is there a way to link the page locations from before and after the change?
Also, what would happen when I implement URL rewriting? Would I run into the same issue again? Anyone?
So in general I think it is a good idea to add the folder, both so your users can see from the URL which section of the site they are in and to help the search engines figure out the site's areas; who knows, you may even get a (small) SEO benefit out of it.
What I would advise is to set up a second profile in Analytics and then add a filter which removes the folder name from the request; that will leave you with the same flat structure in your reports as you have currently. (NB: do this under a new profile with the same tracking code to avoid major mess-ups that you can't undo.)
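To make that concrete, the filter settings could look something like this (a sketch only; "DoingFolder" is just the example folder name from the URLs above, so substitute your own):

    Filter Type:    Custom Filter -> Search and Replace
    Filter Field:   Request URI
    Search String:  ^/DoingFolder/
    Replace String: /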
Cheers
Z
Hi, I was wondering if there is a way to save a page in SharePoint. I want to save the default page and replicate it on a mirror server. I want the web parts in the same place and the properties of the web parts to stay intact. Any suggestions? Thank you.
You might also look into using SharePoint Content Deployment, which was designed for pushing pages from a master site out to cloned sites.
You can view the page in SharePoint Designer and disassociate it from its Page Layout. That will bring the content of the page into Designer. You can then copy it to another file. However, this is fraught with problems:
Any dependent lists and relative URLs would not be copied over. You would have to move them manually.
By disassociating the page from its layout you are in effect going to slow down the retrieval of the page. Since this will also have to be done on the destination page, both pages are going to slow down.
I have found that it's always better to do an export of the site and then import it into the destination site.
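If you go the export/import route, a rough sketch with stsadm on SharePoint 2007 looks like this (the URLs, file path and switches are examples to adapt to your farm):

    REM export the source site, including security and version history
    stsadm -o export -url http://source/sites/mysite -filename C:\backup\mysite.cmp -includeusersecurity -versions 4

    REM import it on the mirror server
    stsadm -o import -url http://mirror/sites/mysite -filename C:\backup\mysite.cmp -includeusersecurity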
I can't vouch for this product as I haven't used it, but you can also try: http://www.softlow.com/windows/business/management/free/web-part-page-cloner-for-sharepoint.html
I'm reasonably new to SharePoint 2007 and trying to move from an ASP.NET to SharePoint way of thinking has been an interesting experience!
I would like to create a page at the same level as the default.aspx page in a subsite. The "SharePoint way" of doing things involves putting the page into a document library. I am reluctant to do this as the breadcrumb navigation of the page then includes the name of the library, but I would like the library to be transparent to the user.
I can create a page in the right place in SharePoint Designer, but I can't find a way to use a SharePoint template. I have tried copying the default.aspx page, but the navigation links are not updated.
Am I missing something or can someone suggest a solution?
I see what you are saying. I would like to share my thoughts on how I would do it.
1. If the breadcrumb is the only reason you want to move the page out of the library, then I recommend overriding the ContentPlaceHolder that holds the breadcrumb in your page, so the breadcrumb won't be there. The URL will still be there for the user to guess: ./DocLib/default.aspx.
2. If the reason for hiding the document library is to make sure users can't get into the library and change something, I recommend stripping the permissions from the document library, giving all users read-only access, and adding with more rights only the users you think will need to edit the pages.
3. And if you really do want the page in that place, you can try deploying the pages as a Feature that provisions them as Ghostable rather than GhostableInLibrary.
While 1 and 3 can be packaged in a WSP, 2 needs a bit of manual work or custom code if you are trying to automate the process.
For the steps to create Ghostable pages, you can refer to this.
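As a rough sketch of option 3, the Module element in the Feature's Elements.xml could look something like this (the names and paths are made up for illustration; the important part is Type="Ghostable" on the File element, instead of GhostableInLibrary):

    <?xml version="1.0" encoding="utf-8"?>
    <Elements xmlns="http://schemas.microsoft.com/sharepoint/">
      <!-- provisions MyPage.aspx at the root of the web, next to default.aspx,
           rather than inside a document library -->
      <Module Name="SitePages" Url="" Path="">
        <File Url="MyPage.aspx" Type="Ghostable" IgnoreIfAlreadyExists="TRUE" />
      </Module>
    </Elements>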
It doesn't seem to be possible. Subsites can be used to categorise content by topic, but they can't be used for much else.