Is there any application/software/web tool that creates a customized search engine?

I wonder if there is any application/software or tool that creates a customized search engine where the results are classified according to websites I've defined beforehand.
What I want is to combine several world news websites (let's say four sites), so that when I search for "global warming" the results are classified according to the sites I specified earlier:
CNN.COM retrieves 509,655 results for "global warming"
BBC.CO.UK retrieves 303,255 results for "global warming"
ABC.COM retrieves 4,588 results for "global warming"
ALJAZEERA.COM retrieves 2,699 results for "global warming"
Is it possible? Is there anything that can do this?

Give Apache Solr a try.
http://lucene.apache.org/solr/
From the website:
Solr is the popular, blazing-fast, open source enterprise search platform built on Apache Lucene™
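For the original question's per-site counts, a single Solr query with field faceting would do it. Below is a minimal Python sketch, assuming a hypothetical core named "news" whose documents carry a "site" field; the URL, field name, and sample response are illustrative, not taken from an actual installation:

```python
import json
from urllib.parse import urlencode

# Build a facet query: rows=0 because we only want per-site counts,
# not the matching documents themselves.
params = urlencode({
    "q": "global warming",
    "rows": 0,
    "facet": "true",
    "facet.field": "site",   # hypothetical field holding the source domain
    "wt": "json",
})
query_url = "http://localhost:8983/solr/news/select?" + params

# A Solr JSON facet response looks roughly like this (abridged, made-up numbers
# matching the question's example):
sample_response = json.loads("""
{"facet_counts": {"facet_fields": {"site":
  ["cnn.com", 509655, "bbc.co.uk", 303255, "abc.com", 4588, "aljazeera.com", 2699]}}}
""")

# Solr returns facet values as a flat [value, count, value, count, ...] list.
flat = sample_response["facet_counts"]["facet_fields"]["site"]
counts = dict(zip(flat[::2], flat[1::2]))
for site, hits in counts.items():
    print(f"{site}: {hits} hits")
```

One crawl over the four news sites plus one faceted query per search term gives exactly the per-site breakdown the question asks for.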

Related

Kentico - Combine search indexes

I have two different sites. Both run on Kentico but have nothing in common: separate servers, separate Kentico setups, licenses, etc. Because the two businesses now belong to one organization, we want to cross-promote content between the two, including in the smart search results. My question: is it possible to add the search index of one site to the other, so that when people search they are searching content on both sites? If yes, how can it be done? Thanks!
Depending on what version you're using, you can create a custom Azure Search index, which will support what you're looking for; however, this is only available in v11 and newer.
With previous versions, you should be able to create a custom index and take advantage of Azure Search as well.
Yes, in theory: you would have to copy the search index from one server to the other, and then manage the results so you know which site each one came from and can link to the other site. I don't think the link in the search index includes the domain, but you could double-check that.
Did you have a chance to take a look at the page crawler index type? https://docs.kentico.com/k10/configuring-kentico/setting-up-search-on-your-website/creating-search-indexes/defining-page-indexes#Definingpageindexes-Configuringpagecrawlerindexes
See if it is possible to configure it for an external domain.

Can system notes be accessed via web services?

I am developing a NetSuite application based on web services (SuiteTalk). I have learned about System Notes, which are a journal of changes on all types of objects. Yet I see no way to access the list of system notes (say, the last N notes) via web services. Do you know whether this is possible and how? If not, what would be an alternative solution?
I know you can do it via a RESTlet/Suitelet; I'm not sure about web services.
In JavaScript you can do a search with a joined field in the result columns, something like:
new nlobjSearchColumn('date', 'systemnotes');
(I think the join id is 'systemnotes'.)
Define the filter criteria in a UI saved search, link that saved search to your script, and from there it's pretty standard.

Intranet search engine frontend?

We are currently using a number of open source and commercial products to store different types of information on our internal network. All these products come with their own repositories (usually a database) and their own search capabilities.
Currently the list of products is as follows:
Wordpress
Jira
Confluence
Sharepoint
Dynamics AX
Moodle
The problem we are facing is that when one needs to search for information, one has to log into all these different systems and run a search in each one.
I Googled for "search engine frontend", "meta search engine", etc., but I was not able to find anything obvious that solves our problem. At this point I should say that we are not interested in building one "central repository" to be searched. Instead, we need a frontend that will accept the query from the user, "package" it into the format that each of the individual search engines understands, receive the responses (JSON or XML), and present them to the user.
Any suggestions on how we could solve this?
Your strategy is right: if you are not interested in building a central index, you will need an application that accepts the query from the user, converts it to the format that each of the individual search engines understands, receives the responses, and presents them to the user. This is exactly what a meta search engine does. Even if you use a framework (e.g. Carrot2), much work will probably remain to write those query and result transformers, and you will probably experience slow results, because a meta search can never be faster than the underlying search modules of the components you search through.
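The fan-out pattern described above can be sketched in a few lines of Python. The two adapter functions below are hypothetical stand-ins; real adapters would call each product's own search API (e.g. Jira's or Confluence's REST endpoints) and normalize the responses into a common shape:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical adapters: each one would translate the user's query into
# the backend's format and normalize that backend's JSON/XML response.
def search_jira(query):
    # placeholder for something like GET /rest/api/2/search?jql=text~"<query>"
    return [{"source": "jira", "title": f"JIRA hit for {query}"}]

def search_confluence(query):
    # placeholder for something like GET /rest/api/content/search?cql=...
    return [{"source": "confluence", "title": f"Confluence hit for {query}"}]

ADAPTERS = [search_jira, search_confluence]

def meta_search(query):
    # Query all backends in parallel; total latency is still bounded by the
    # slowest backend, which is why meta search can't beat the underlying engines.
    with ThreadPoolExecutor(max_workers=len(ADAPTERS)) as pool:
        result_lists = pool.map(lambda adapter: adapter(query), ADAPTERS)
    return [hit for hits in result_lists for hit in hits]

results = meta_search("budget report")
```

Adding a backend then means writing one more adapter function and appending it to the list; the merge/ranking step is where most of the remaining work lives.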
Instead of querying each backend separately you can put your data into one backend.
You could export your data to an Apache Solr server and use a frontend like CorePages, http://www.corepages.biz . You could add a backlink to your data so you can jump directly from a search result to its source, e.g. a Jira ticket or a wiki article.

Any Plone product for counting file downloads and pages view?

I'm building an intranet that will not be accessible from outside the company's network, and they want Plone to display some nice statistics about file downloads and the most viewed pages.
Given the network constraint, I cannot use Google Analytics or any sort of external service, so is there a product that counts file downloads and page views?
I've seen an idea on UserVoice regarding file downloads, and maybe I could extend plone.piwik.now to get page view statistics, but I find it hard to believe that Plone doesn't have a product that (at least partially) suits this use case.
Any tips?
Essentially, you have two options: you can use one of the existing HTTP log analysis tools and scrape the information you need from their reports, or you can write a custom analytics tool in Plone.
We're currently working on a version which we plan to release as open source later this year. Essentially, the pattern we're using is a small piece of JavaScript that passes parameters to our lightweight logging app. We're then able to show results from the reporting app, like "top downloads", in portlets, even filtering by section and keyword.
I don't know about Plone add-ons (nor do I understand why you'd want a Plone add-on for this), but Webalizer and AWStats (http://awstats.sourceforge.net/) are two of the most popular choices.
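The log-analysis route suggested above is easy to prototype: Webalizer and AWStats both work from the web server's access log, which already records every page view and file download. A minimal Python sketch of the same idea, counting successful requests per path from made-up Common Log Format sample lines:

```python
import re
from collections import Counter

# Matches the request path and status code in a Common Log Format line.
LOG_LINE = re.compile(r'"(?:GET|POST) (\S+) HTTP/[\d.]+" (\d{3})')

# Fabricated sample log lines for illustration only.
sample_log = """\
10.0.0.5 - - [01/Jan/2012:10:00:00 +0000] "GET /files/report.pdf HTTP/1.1" 200 1024
10.0.0.6 - - [01/Jan/2012:10:01:00 +0000] "GET /files/report.pdf HTTP/1.1" 200 1024
10.0.0.7 - - [01/Jan/2012:10:02:00 +0000] "GET /news/index.html HTTP/1.1" 200 512
"""

hits = Counter()
for line in sample_log.splitlines():
    m = LOG_LINE.search(line)
    if m and m.group(2) == "200":   # count only successful requests
        hits[m.group(1)] += 1

top = hits.most_common(2)   # e.g. "top downloads" / "most viewed pages"
```

A Plone portlet could then render the resulting counts, which keeps everything inside the internal network as the question requires.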

How to set up SharePoint search?

For some reason, search on my SharePoint site does not work.
I have set up the SSP, the scopes, the crawls, everything, but it still does not work.
Can someone explain how to set up search? Maybe I did something wrong in the process.
It's not the simplest thing in the world to set up, as it's comprised of a number of components.
You need to check each one to determine where your problem is.
Start from the crawl, and work your way forward to the search production on the page.
So check the following:
Check that some servers have been set up to index pages. (You can see this under Services on Server in the Central Administration pages.)
Make sure they're all running correctly, not in a half-started state.
Check the crawl log in your SSP to see if it is indexing anything.
Index different types of content (file shares, websites, and SharePoint itself) and check each one.
(Note that you need a special plugin to index PDFs.)
Check that your index is copied to the front-end server where it is used.
If it's not, that may be because it hasn't been configured (check the services running on servers again).
Then check your site collection setup, and ensure you have a search site configured.
Ensure the site collection search details are configured to use the search site.
Finally check the user doing the searching actually has access to the content being indexed.
Doing all of that should give you some idea of where the problem is.
In addition to Bravax's answer its worth checking that you are not getting stung by the local loopback check.
I had a similar problem and ended up using Search Server Express, which is free (see my answer at this link: sharepoint 2010 foundation search not working).
I have installed Search Server Express 2010 on top of SPF and it works great. It has additional features and works well with SharePoint Foundation. Here is a link for upgrade and configuration: http://www.mssharepointtips.com/tip.asp?id=1086
You need to configure the content source, add the website to it, and then run a full crawl to index the data.
