
Digital Bangladesh

“Digital Bangladesh” is the greatest movement I have ever seen in my life. I am grateful to all the people involved in this movement. I will be proud to be a part of it, because I believe that, like the Renaissance (14th–17th centuries), this movement will change our country...

Web search categories

There are three broad categories that cover most web search queries:

  • Informational queries – Queries that cover a broad topic (e.g., colorado or trucks) for which there may be thousands of relevant results.
  • Navigational queries – Queries that seek a single website or web page of a single entity (e.g., youtube or delta airlines).
  • Transactional queries – Queries that reflect the intent of the user to perform a particular action, like purchasing a car or downloading a screen saver.

Search engines often support a fourth type of query that is used far less frequently (a rough classifier sketch for all four types follows the list):

  • Connectivity queries – Queries that report on the connectivity of the indexed web graph (e.g., Which links point to this URL?, and How many pages are indexed from this domain name?).
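
As a rough illustration of these four query types, the sketch below classifies a query string with simple keyword heuristics. The cue-word lists and the classify_query function are assumptions made for this example only; a real search engine would use learned classifiers over many more signals.

# Heuristic query-type classifier -- an illustrative sketch, not a real system.
# The cue words below are assumptions chosen for the example.
NAVIGATIONAL_CUES = {"youtube", "facebook", "delta airlines", "wikipedia"}
TRANSACTIONAL_CUES = {"buy", "purchase", "download", "order", "price"}
CONNECTIVITY_CUES = {"link:", "site:", "inlinks"}

def classify_query(query: str) -> str:
    """Return one of: connectivity, transactional, navigational, informational."""
    q = query.lower().strip()
    if any(cue in q for cue in CONNECTIVITY_CUES):
        return "connectivity"
    if any(cue in q for cue in TRANSACTIONAL_CUES):
        return "transactional"
    if q in NAVIGATIONAL_CUES:
        return "navigational"
    # Broad topical queries fall back to the informational class.
    return "informational"

if __name__ == "__main__":
    for q in ["colorado", "youtube", "buy a used car", "link:example.com"]:
        print(q, "->", classify_query(q))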

Search engine

A search engine finds and organizes the answers you need so you can make faster, more informed decisions.

Web search engine

A Web search engine is a tool designed to search for information on the World Wide Web. The search results are usually presented in a list and are commonly called hits. The results may consist of web pages, images, and other types of files. Some search engines also mine data available in databases or open directories. Unlike Web directories, which are maintained by human editors, search engines operate algorithmically or use a mixture of algorithmic and human input.
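
As a minimal sketch of how a search engine can operate algorithmically, the example below builds a tiny in-memory inverted index and returns matching documents as a list of hits, best match first. The TinyIndex class and its simple term-count scoring are assumptions made for illustration; production engines use far richer data structures and ranking signals.

# Minimal in-memory inverted index -- an illustrative sketch, not a real engine.
from collections import defaultdict

class TinyIndex:
    def __init__(self):
        self.postings = defaultdict(set)   # term -> ids of documents containing it
        self.docs = {}                     # document id -> original text

    def add(self, doc_id, text):
        self.docs[doc_id] = text
        for term in text.lower().split():
            self.postings[term].add(doc_id)

    def search(self, query):
        """Rank documents by how many distinct query terms they contain."""
        terms = set(query.lower().split())
        scores = defaultdict(int)
        for term in terms:
            for doc_id in self.postings.get(term, ()):
                scores[doc_id] += 1
        # Hits: any document matching at least one term, best score first.
        return sorted(scores, key=scores.get, reverse=True)

index = TinyIndex()
index.add(1, "Dhaka is the capital of Bangladesh")
index.add(2, "A web search engine indexes web pages")
print(index.search("web search"))   # -> [2]

Keeping a postings list per term is what makes fast searches possible: the engine looks up only the documents that contain each query term instead of scanning every page at query time.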

Crawler

A Web crawler is a computer program that browses the World Wide Web in a methodical, automated manner. Other terms for Web crawlers are ants, automatic indexers, bots, worms, Web spiders, Web robots, or, especially in the FOAF community, Web scutters. This process is called Web crawling or spidering. Many sites, in particular search engines, use spidering as a means of providing up-to-date data. Web crawlers are mainly used to create a copy of all the visited pages for later processing by a search engine that will index the downloaded pages to provide fast searches. Crawlers can also be used for automating maintenance tasks on a Web site, such as checking links or validating HTML code. Source: wikipedia.org
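
Below is a minimal sketch of the crawl-and-store loop described above, using only Python's standard library. The seed URL, the page limit, and the in-memory page store are assumptions made for the example; a production crawler would also respect robots.txt, throttle its requests, and hand the saved pages to an indexer.

# Minimal breadth-first Web crawler -- an illustrative sketch only.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkParser(HTMLParser):
    """Collect the href targets of all <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed_url, max_pages=10):
    """Fetch pages breadth-first and return {url: html} for later indexing."""
    frontier = deque([seed_url])          # URLs waiting to be visited
    visited = {}                          # copy of every fetched page
    while frontier and len(visited) < max_pages:
        url = frontier.popleft()
        if url in visited:
            continue
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except Exception:
            continue                      # skip unreachable or non-HTML pages
        visited[url] = html
        parser = LinkParser()
        parser.feed(html)
        for link in parser.links:
            frontier.append(urljoin(url, link))
    return visited

if __name__ == "__main__":
    pages = crawl("https://example.com", max_pages=5)   # seed URL is an assumption
    print(len(pages), "pages fetched")

The queue-based frontier is what makes the crawl methodical and automated: newly discovered links are visited in the order they were found, and each URL is fetched at most once.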