I am trying to display a link on a page that points to the previous page the user visited, in Drupal.
Previously I was using sessions:
echo $_SESSION['back'];
$_SESSION['back'] = htmlentities($_SERVER['REQUEST_URI']);
This worked fine, but I was told to use variable_get() and variable_set() in Drupal and not to use sessions. So I did this:
global $prev_global;
$prev_global = variable_get($prev_page, 'http://mysite.local');
variable_set($prev_page, htmlentities($_SERVER['REQUEST_URI']));
. . .
echo "PREV:".$prev_global;
But this always points to the current page being viewed. What went wrong here?
I don't know who told you to use variable_get() and variable_set(), but consider never listening to them again. variable_get() and variable_set() act on site-wide persistent variables (Drupal's variable table), not on per-user values.
You had it right the first time. Use $_SESSION: that's what it's there for.
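For reference, a minimal sketch of that per-user approach (the 'back' session key and the fallback URL are just examples):
// Read the URL remembered from this user's previous request, with a fallback.
$back = isset($_SESSION['back']) ? $_SESSION['back'] : 'http://mysite.local';
print 'PREV: ' . $back;
// Remember the current page for the next request. This lives in the user's
// own session, not in a site-wide persistent variable.
$_SESSION['back'] = htmlentities($_SERVER['REQUEST_URI']);
Keep in mind that code like this will not run on pages served from Drupal's page cache for anonymous users.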
On the website I run we have a single search field where you can enter a name or profession. When you search, you are served a page full of results that come from 3 separate sources.
Once you click on one result, e.g. John Doe, you are taken to his page. On that page we have a "back to search" link, but it goes to a blank screen.
I want to go back to the actual search results so the person doesn't have to do it all again, but I'm not sure where to start. Any suggestions?
That's a tricky situation.
There can be many solutions for this issue, but I'll name some of them:
Activate page caching (a quick trick, not suitable for websites that rely on logged-in users): you can go back and your form will show the same results without any issue.
Load John Doe's page via AJAX and #hash references, so you don't reload the page but just manage the states of the HTML (this can be done with JS frameworks such as React).
Depending on the platform you are working on, try to manage the search variables with the Post/Redirect/Get concept; see the sketch below.
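A minimal Post/Redirect/Get sketch in plain PHP (the file names, the 'q' field and the session key are only examples):
// search.php: the search form POSTs here. Store the query, then redirect,
// so the results page is reached with a plain, bookmarkable GET request.
session_start();
if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    $_SESSION['last_query'] = trim($_POST['q']);
    header('Location: results.php?q=' . urlencode($_SESSION['last_query']));
    exit;
}
// results.php: read the query from GET, falling back to the session, then run
// the three searches and render the list. A "back to search" link on John
// Doe's page can then simply point to results.php.
session_start();
$q = isset($_GET['q']) ? $_GET['q'] : (isset($_SESSION['last_query']) ? $_SESSION['last_query'] : '');
Because the results page is reached via GET, the browser's back button (and any "back to search" link) returns to it without a resubmission prompt or a blank page.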
Hope this helps!
Cheers.
I created an admin section for a website of mine using PHPMaker, as I usually do. The website is made from scratch; no WordPress or anything else involved.
In some apparently "innocent" tables, if I try to edit them, I get a 403 error. This has never happened before, and I have used PHPMaker for at least 15 websites of mine, so I am puzzled.
It happens only on 2 tables out of 15, and as said, they are fairly innocent. Nothing fancy compared to the other ones.
This is what I've tried:
adding an empty .htaccess file to the admin directory (there was none)
clearing the cache
visiting the page in private mode
checking that all file/directory permissions are OK
regenerating the project and uploading it again, to a different directory
What else can/should I check?
There is a .htaccess file in the root directory of the server to handle some "pretty URLs", but it shouldn't matter since the admin section is under a specific directory. Or should it?
Thank you
At last, I found the problem: every time I tried to insert an iframe or a script, somewhere, somehow, somebody (...) was parsing it and blocking it, I guess for security reasons. So I let the user insert the script without the <script> tags and add them later in PHP, simply with:
print "<script>$readFromDB</script>";
On each insert action, PHPMaker will parse the word
<script
and convert it to
<s<x>cript
You either have to get the inserted row ID and update the value again, or remove the '<x>' each time you read the value back so that it displays correctly.
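If you go with the second option, a one-line sketch (the variable names are just placeholders):
// $readFromDB is the value as stored by PHPMaker; undo the <s<x>cript
// mangling before printing the script tag.
$clean = str_replace('<x>', '', $readFromDB);
print "<script>$clean</script>";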
Take a look at my answer here: PHPMaker issue
An XPage (listPostits.xsp) has a "View" container control, where one of the columns is set to "show values in this column as links".
Now, here comes the "strange behaviour".
When I work with this application on my own (developer) PC (Win XP, Chrome or IE), Domino generates a link which can't really be processed:
/servername/db/postit/postit.nsf/listPostits.xsp/onePostit.xsp?documentId=many_numbers&action=editDocument
Namely, the listPostits.xsp portion shouldn't be there! That portion is the name of the XPage the View control is in.
When I work with the application from another PC (Mac, Firefox), I get the correct link (the same as above but without the XPage name in between):
/servername/db/postit/postit.nsf/onePostit.xsp?documentId=many_numbers&action=editDocument
Update: let us leave aside for the moment the differences in the generated links between the two machines. The first question is: why is the extra portion inserted into the automatically generated link?
After playing around I think I might have found the reason for this strange behaviour: the "Substitution" rules on the server side. One of them substitutes "*/postit/all" with "/db/postit/postit.nsf/listPostits.xsp".
If I switch it off, the links are generated properly. Still, it's pretty strange to me that these settings influence the way Domino generates the links. I thought it applied them on the fly and that those settings had nothing to do with how links are generated inside the application.
So help is now needed regarding the Web Site Rules topic, but for that, I guess, I have to create another topic. In case somebody has some good info on this, please share it with me. I'm a bit confused at the moment :)
Final update: I spent some more hours testing and the results confirmed the initial idea.
If I open the page with the standard URL, i.e. http://servername/db/postit/postit.nsf/listPostits.xsp, then everything is fine and the links are generated properly. When I open the same page with the short URL http://servername/postit/all, however, the server adds the substituted URL (db/postit/postit.nsf/listPostits.xsp) to every single link it generates automatically for opening/editing the underlying document.
Is it a bug or a feature? I don't know.
As a workaround (because I want to keep simple URLs for the application) I have to generate the links manually.
I have seen many somewhat similar questions, but nothing quite what I'm looking for. So at the risk of being told this is a duplicate... here it goes.
I've found that there are times I have a node that simply contains content that will be displayed somewhere else, but shouldn't be viewed directly. That is, no one should ever go to node/1234, but the content in node 1234 should be displayed somewhere else.
For example, I create an About page with tabbed content using Views. So there are "About Me", "About Us" and "About Them" pages, all displayed on a single page with tabs using Views. I don't want people to get directly to the "About Us" node, because then they wouldn't see the tabs for the other pages. At the same time, I don't want Google giving people a direct link to this node; I want to limit access so users can only get to it through the View (i.e., the tab).
So I need to restrict access to the node, remove it from the Drupal search results, and make sure Google doesn't pick up on it. Any suggestions?
---- Note ----
I've accepted the answer from mingos (thanks btw) because even though it's not a full answer / solution, it gave me some good things to think about. Additional answers are still welcome.
In Drupal 7 you can use: http://drupal.org/project/internal_nodes
Description: Some content/nodes should never be viewed directly; they should only be visible through something else such as Views or Panels. This module denies access to node/[nid] URLs while allowing the content to stay published and otherwise viewable.
Full disclosure: I am the creator and co-maintainer of Internal Nodes. I found this question while searching to see how the module could be found on Google.
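If you'd rather get a similar effect from a small custom module, here is a rough Drupal 7 sketch (not the module's actual code; mymodule and the about_tab content type are placeholder names):
/**
 * Implements hook_menu_alter().
 * Swaps the access callback for the node page so direct node/[nid] URLs
 * can be denied while the node stays published and visible in Views.
 */
function mymodule_menu_alter(&$items) {
  $items['node/%node']['access callback'] = 'mymodule_node_page_access';
  $items['node/%node']['access arguments'] = array(1);
}
/**
 * Access callback for node/%node.
 */
function mymodule_node_page_access($node) {
  // Deny direct viewing of the "internal" content type only.
  if ($node->type == 'about_tab') {
    return FALSE;
  }
  // Everything else falls back to Drupal's normal node access check.
  return node_access('view', $node);
}
Remember to clear the menu/router cache after adding the hook so the altered router item is picked up.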
Tough one.
If you want to have many nodes like this and do the "displaying elsewhere" dynamically, I can't think of anything right now (at 2:20 AM I rarely can).
If there is one such page (or very few), I'd restrict access to it by any available means (Permissions, Nodeaccess, Content Access, TAC, whatever) and then create special themes for the pages where the restricted content should be displayed. The themes would contain database queries fetching content from the restricted nodes.
Another possibility might be to create a special theme for the hidden nodes in question (perhaps all belonging to the same content type?). Make the full node view display nothing (or a message saying access is restricted) and add a ROBOTS meta tag asking Google not to index the page. Make the teaser view available, though: you can display it freely inside a View, but since /node/1234 is the FULL view, the actual content will be unavailable there.
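For the ROBOTS meta tag part, a minimal Drupal 7 sketch (mymodule and the about_tab content type are placeholders; hook_node_view() is just one convenient place to add the tag):
/**
 * Implements hook_node_view().
 * Asks search engines not to index the full-page view of the hidden nodes.
 */
function mymodule_node_view($node, $view_mode, $langcode) {
  if ($node->type == 'about_tab' && $view_mode == 'full') {
    drupal_add_html_head(array(
      '#tag' => 'meta',
      '#attributes' => array(
        'name' => 'robots',
        'content' => 'noindex, nofollow',
      ),
    ), 'mymodule_noindex');
  }
}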
Dunno if this solves your problem, hope it helps at least a bit.
I found this page after running into this same problem.
What I found worked for me might be part of the answer you need:
Take a look at the Page Manager Redirect module: http://drupal.org/project/page_manager_redirect. I just started playing with it.
It uses the Page Manager module from CTools to redirect one page to another. What makes this most powerful is that Page Manager uses contexts, so if you want to redirect all pages of a particular content type, you can do so.
I just started using it (instead of Taxonomy Redirect and Path Redirect) to redirect (with a 301 response code) my taxonomy terms for a particular vocabulary to particular nodes.
In your instance, you should be able to use contexts to filter for specific pages.
Of course this doesn't solve the problem of these nodes coming up in search results.
There is also another module, Rabbit Hole, which has similar functionality to Internal Nodes but works for all entities, not only nodes.
I am having the same problem and am currently thinking of the following solution, where all the content of a node is to be displayed to certain users (permission based):
- unpublish node
- create a new published checkbox
- create a view with fields that shows all the content
Haven't tested it thoroughly yet, but it seems to work.
The node is to be displayed to the creator (the only one with permission 1), some of it to permission 2, and all of it to permission 3.
Any comments on this solution?
I assume this will also exclude it from search, but permissions 2 and 3 need to be able to search it. I still haven't figured that one out.
I used the Rules module with an "entity is of bundle" condition and the built-in "Page redirect" action.
There is a really easy way to do this if you only want to show a content type through a view:
create the content type and keep its nodes unpublished;
create a view and, in the filter options, set the filter to "Content: Published (No)".
The view will give anonymous users access to the content through the view, but they won't have access to the unpublished content at its direct link.
I am working on a specialized instance of MOSS for a client, and what I want to do is hide elements on the master page. In particular, I want to hide the main top navigation bar, the search functionality, and the label in the upper-left-hand corner that tells you the name of the site you are on. So I made a copy of default.master, and then in SharePoint Designer I set the visible attributes of the placeholders for these blocks to "false" in the new master file.
I can then assign the master page to my normal site collection with no problems, and it seems to be working like I want it to. But when I go to look at the system pages (i.e. any of the forms or back-end stuff), they still use the old default master. And when I tried to set the System Master Page to my customized master file, my MOSS instance threw a File Not Found error. Then certain parts of the admin area just started failing in the same way (i.e. I would try to go into Site Settings -> Content and Structure and it would also throw a File Not Found error). Then at one point the whole site collection started throwing "Unknown Error" and there didn't seem to be a way to recover, short of reverting the state of the VM I run MOSS in for development purposes.
So I am curious: what is the best way to create a custom master page and then hide elements on that page? I realized that my web cluster didn't have the proper flag set to show me real ASP.NET error messages, so I am going to change that tonight when I get home and see what SharePoint is really telling me about all of this. I have also read that changing the application.master file is not recommended, but I figured I could get away with making custom pages for the Site and System master pages and not worry about application.master. I have been reading a bunch of Heather Solomon articles, as well as various other things. They all basically say that it's OK to hide elements on a master page, but not to delete them outright, as SharePoint will break if you do. Would it be advisable to use a JS/CSS hack to manually hide the elements instead, rather than actually making a new master page?
You create an asp:PlaceHolder with the Visible attribute set to false and place the ContentPlaceHolders that are to be hidden inside that container. Weird, I know, but it works. As for the System Master Page, you would probably want to make a copy of the master page SharePoint uses there and alter that copy in the same manner.
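A minimal markup sketch of that idea (the placeholder IDs shown are the common MOSS 2007 ones for the search box, top navigation and site name; check your own copy of default.master for the exact names used there):
<%-- The wrapper never renders, but the ContentPlaceHolders stay in the
     control tree, so pages that expect them don't break. --%>
<asp:PlaceHolder id="HiddenPlaceholders" runat="server" Visible="false">
    <asp:ContentPlaceHolder id="PlaceHolderSearchArea" runat="server" />
    <asp:ContentPlaceHolder id="PlaceHolderHorizontalNav" runat="server" />
    <asp:ContentPlaceHolder id="PlaceHolderSiteName" runat="server" />
</asp:PlaceHolder>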
Thank you so much for posting this. Works like a charm. I was so afraid because everyone says not to mess with application.master. All I did was open it with Notepad and add Visible="false". (I wanted to hide the top navigation bar because I have custom tabs that display depending on a user's permissions, which are controlled by code in default.master. But if a user had to upload a file, upload.aspx uses application.master and all the tabs would be displayed.)
I edited this line only:
<wssuc:TopNavBar id="IdTopNavBar" runat="server" ShouldUseExtra="true" Visible="false" />
Works like a charm!
Note that the following pages will also be affected:
Site Settings
View all site content
Workflow settings of a document library
Recycle Bin
Search results