Disable one of the direct download links

We offer some freeware, and I created a text counter to count the downloads, but some visitors or web pages use a direct link instead of my text counter's links. For example:
My text counter's link is:
http://www.example.org/dns_jumper/downloads26.php
The direct link is:
http://www.example.org/dns_jumper/askadmin/AskAdmin.zip
Is it possible to disable the second (direct) link but allow only the first one? (There are more than 20 links like that.)
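One common way to do this (assuming an Apache server, since the counter is a .php page) is an .htaccess rule next to the files that refuses requests for the archives unless they come through your own pages. A hedged sketch; the domain and paths are placeholders taken from the question:

```apache
# /dns_jumper/.htaccess -- refuse hotlinked downloads of the archives.
RewriteEngine On
# Allow requests whose Referer is your own site (e.g. the downloads page) ...
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.org/ [NC]
# ... and block direct requests for the zip files themselves.
RewriteRule \.zip$ - [F]
```

Note that the Referer header is easily stripped or spoofed, so visitors whose browsers or download managers send no Referer will also be blocked. The more robust fix is to move the .zip files outside the web root entirely and have the counter script count the hit and stream the file itself (e.g. with PHP's readfile()), so no direct URL exists.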


Views / edits of the current page in XWiki

Is there a way to publish the number of views/edits of a page on the page itself in XWiki?
I've used parts of this article to create an accessible form to see the usage of different spaces. But I would also like to publish the views/edits on the pages themselves.
Thanks in advance!
Richard
I would suggest you use an UIExtension point (see https://extensions.xwiki.org/xwiki/bin/view/Extension/UIExtension%20Module and the tutorial https://www.xwiki.org/xwiki/bin/view/Documentation/DevGuide/Tutorials/UIXTutorial/) from the list of available ones (https://www.xwiki.org/xwiki/bin/view/Documentation/DevGuide/ExtensionPoint/) to add the extra information to be displayed on each page.
I guess the most suitable UIXP would be the Content Footer one (https://www.xwiki.org/xwiki/bin/view/Documentation/DevGuide/ExtensionPoint/ContentFooterUIX/).
Inside the UIX you add, you can do a simple query to fetch the view/edit values from the statistics module, either with the API, if one exists, or with an HQL query, like in the example you mentioned.
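For the query inside the UIX, a rough Velocity sketch could look like this. Hedged heavily: the DocumentStats entity and its name/pageViews/action properties are assumptions based on the statistics module's default Hibernate mapping, so verify them against your instance before relying on this:

```velocity
{{velocity}}
## Sum the recorded "view" hits for the current document.
## DocumentStats / pageViews / action are assumed names -- check your stats mapping.
#set ($hql = "select sum(stats.pageViews) from DocumentStats stats where stats.name = :docName and stats.action = 'view'")
#set ($views = $services.query.hql($hql).bindValue('docName', $doc.fullName).execute().get(0))
Views of this page: $views
{{/velocity}}
```

The same shape works for edits by filtering on the corresponding save/edit action value instead of 'view'.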

How can I get a website report showing links from each page

I want to get a report that specifies which links are on each page of the website. I tried using different pieces of software, but the problem is that they just list all the links without showing exactly which links are on which page. Also, the website I am trying to report on is very unstructured, so it's not possible to classify links based on the URL's forward slashes. For example, looking at links starting with https://example.com/blog will not give me all the links inside the 'https://example.com/blog' page, because links inside the 'https://example.com/blog' page can be links that don't start with 'https://example.com/blog/'.
What can I do about this?
Thanks.
In Google analytics, there is no such concept as the next page.
Rather, it only knows the previous page.
It is due to the disconnected nature of the web.
You can, however, use the previous page to trace back to get the data you want.
Instead of looking for all links inside https://example.com/blog, you will be looking at getting all links where the previous page is https://example.com/blog.
More detailed explanation
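If Google Analytics data isn't available, another option is to crawl the site yourself and build the per-page report directly. A minimal sketch in Python using only the standard library; the HTML snippet and URLs below are placeholders:

```python
from html.parser import HTMLParser


class LinkCollector(HTMLParser):
    """Collects the href targets of all <a> tags on one page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def links_on_page(html):
    """Return the list of link targets found in the given HTML source."""
    collector = LinkCollector()
    collector.feed(html)
    return collector.links


# In practice you would fetch each page (e.g. with urllib.request.urlopen)
# and record {page_url: links_on_page(html)} to get a per-page report.
page = '<p><a href="/blog/post-1">One</a> <a href="https://other.example/x">Two</a></p>'
print(links_on_page(page))  # ['/blog/post-1', 'https://other.example/x']
```

Because the parser is run once per fetched page, the report naturally shows which links are on which page, regardless of how unstructured the URLs are.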

Magento 2: Get store country full name in CMS pages

I want to show the store country full name in a CMS page.
When using:
{{config path="general/store_information/country_id"}}
It's only showing the two-letter code, like "FR", but I want to show "France".
For phtml/php you can use this solution.
But how to solve this for CMS pages and Blocks?
Not the most elegant solution, but you could create a PHTML file from the solution you linked to, then call that from inside your page/static block with {{block class="Magento\Framework\View\Element\Template" template="Vendor_Module::myfiles/myfile.phtml"}}.
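If you go that route, the PHTML file could look roughly like this. A hedged sketch, not tested against a specific Magento version; the file path is a placeholder, and the template assumes the store-information country code is configured:

```php
<?php
/** @var \Magento\Framework\View\Element\Template $block */
// myfiles/myfile.phtml -- print the store country's full name.
// Direct ObjectManager use is discouraged; a real module would inject these.
$objectManager = \Magento\Framework\App\ObjectManager::getInstance();
$scopeConfig = $objectManager->get(\Magento\Framework\App\Config\ScopeConfigInterface::class);
$countryCode = $scopeConfig->getValue('general/store_information/country_id');
$country = $objectManager->create(\Magento\Directory\Model\Country::class)->loadByCode($countryCode);
echo $block->escapeHtml($country->getName()); // e.g. "France" for "FR"
```

The block call from the question then renders this template inside the CMS page or static block.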

Download/Save a webpage including all its linked sites

In Chrome there's an option to save a complete webpage. I would like to save a complete webpage such that the pages it links to are also saved. Is this possible? I.e., I want to go one step further than simply saving the page I'm looking at. Is it possible to go two steps further, i.e. saving all the linked pages within the linked pages? Is it possible to generalize this to N steps? I realize this would need a lot of storage space, but is there code available to do this task?
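Chrome itself only saves the single page, but wget can do exactly this from the command line: --level controls how many steps of links it follows, so N-step saving is just --level=N. A sketch (the URL is a placeholder):

```shell
# Save the page plus everything it links to (two steps: -l 2),
# along with images/CSS, rewriting links so the copy works offline.
wget --recursive --level=2 --convert-links --page-requisites --wait=1 https://www.example.com/
```

By default wget stays on the starting host; if the linked pages live on other domains you would also need --span-hosts, used with care, since the download size grows very quickly with each extra level.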

How to copy a website

I want to copy a whole (uncopyrighted) website's contents to a Blogspot blog.
Basically if a website has a table of contents like this:
Intro (link)
Chapter 1 (link)
Chapter 2 (link)
etc...
How do I make a program so that the links to these articles are automatically posted to my Blogspot blog, and when I click on the links it goes to posts within my Blogspot blog as opposed to the actual link?
So basically I want a program that does this:
When there is a series of links on a website, open link 1, copy & paste it into a new Blogspot post, open link 2, and repeat until the last link,
and then create a final post that has links, with the same titles as the original links, to all the Blogspot posts.
Blogger/Blogspot isn't the best-suited tool for this. Wouldn't it be easier to just mirror the website's content elsewhere?
# Mirror an entire website (-m), convert links (-k), and wait (-w) 2 seconds between requests.
wget -mk -w 2 http://www.example.com/
Still, if you're adamant about it, you could take a look at the Import/Export feature in Blogger.
There is by definition no uncopyrighted content.
If the site you want to copy relies on JavaScript magic then there is no easy general option; otherwise I agree with John about using wget.
But judging from your question, you should take a basic programming course first and come back in a few months.
