A mashup of two different types of web search tools could revolutionise the effectiveness of internet searching, academics believe.
Information scientists Liu Wei and Chen Junjie, of the Taiyuan University of Technology in Shanxi, China, have combined two distinct types of computer software to build a search engine that can intelligently crawl other search engines.
"Traditional search engines cannot cope easily with this rapid expansion of information resources," explained Chen, referring to the explosive growth of content on the web.
Chen and his colleagues turned to the concept of 'search agents' to tackle the problem.
Search agents are autonomous software programs that can scan data very quickly, looking for keywords and assessing the context of what they find.
The researchers then combined the search agent idea with the so-called meta search engine. A meta search does not scan a single index, such as Google's or Yahoo's; instead it forwards the user's query to several underlying search engines and aggregates their results.
Sites such as ByteSearch, MetaCrawler and Ixquick are well-known meta search tools.
The Chinese team has developed a new intelligent search agent and combined it with a meta search tool.
The intelligent agent can determine the context of the user's search terms and choose appropriate search engines to scan. It then retrieves the most relevant results.
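The paper itself is not quoted in detail here, but the architecture described above can be illustrated with a minimal sketch: an "agent" picks which back-end engines to query based on the topic of the search terms, fans the query out, then merges and ranks the pooled results. The engine names, topic rules, and ranking heuristic below are all hypothetical stand-ins, not the authors' actual system.

```python
# Hypothetical stand-ins for real engines' indexes: each "engine" is just a
# function mapping a query string to a list of result titles.
ENGINES = {
    "general": lambda q: ["overview of " + q, q + " news"],
    "academic": lambda q: [q + " survey paper", q + " conference proceedings"],
    "code": lambda q: [q + " library on GitHub", q + " API reference"],
}

# Crude context rules (assumed, for illustration): which engines suit which
# topic words appearing in the query.
TOPIC_RULES = {
    "paper": ["academic"],
    "research": ["academic", "general"],
    "api": ["code"],
    "library": ["code", "general"],
}

def choose_engines(query: str) -> list[str]:
    """Pick engines whose topic rules match words in the query; default to all."""
    chosen: list[str] = []
    for word in query.lower().split():
        for engine in TOPIC_RULES.get(word, []):
            if engine not in chosen:
                chosen.append(engine)
    return chosen or list(ENGINES)

def meta_search(query: str) -> list[str]:
    """Fan the query out to the chosen engines, then rank the pooled results
    by how many query words each result title contains."""
    pooled: list[str] = []
    for name in choose_engines(query):
        pooled.extend(ENGINES[name](query))
    words = set(query.lower().split())
    return sorted(pooled, key=lambda r: -len(words & set(r.lower().split())))
```

A real system would replace the stubbed engines with HTTP calls to live search APIs and use a far richer model of query context, but the division of labour is the same: engine selection first, then retrieval and relevance ranking over the merged pool.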
Chen explained that this approach improves on both the precision and the recall of traditional search engines, and so satisfies query requests more fully.
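For readers unfamiliar with the two metrics: precision is the fraction of retrieved results that are relevant, while recall is the fraction of all relevant documents that get retrieved. A toy illustration (the document IDs below are made up):

```python
def precision_recall(retrieved: set[str], relevant: set[str]) -> tuple[float, float]:
    """Precision: share of retrieved results that are relevant.
    Recall: share of all relevant documents that were retrieved."""
    hits = len(retrieved & relevant)
    precision = hits / len(retrieved) if retrieved else 0.0
    recall = hits / len(relevant) if relevant else 0.0
    return precision, recall

# Example: the engine returns 4 results, 3 of them relevant,
# out of 6 relevant documents that exist overall.
retrieved = {"d1", "d2", "d3", "d4"}
relevant = {"d1", "d2", "d3", "d5", "d6", "d7"}
p, r = precision_recall(retrieved, relevant)
# p == 0.75 (3 of the 4 retrieved are relevant)
# r == 0.5  (3 of the 6 relevant were retrieved)
```

The two pull in opposite directions: returning more results tends to raise recall but lower precision, which is why an improvement in both at once is a notable claim.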
The scientists describe their new search robot in Inderscience's International Journal of Agent-Oriented Software Engineering.
Search engine mashup to 'tame' the web
By Robert Jaques on Jul 10, 2007 12:46PM