Pretty much any scripting language can fetch a page from a website, so just pick the one you're most comfortable with. PHP is a reasonable default if you aren't sure.
This would be a back-end process, as you've probably guessed. Your user would type the search into a field, which would be passed to the PHP script; the script would then run the query against each search engine you want, collate and filter the results as required, and present the page to the user.
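A rough PHP sketch of that flow might look like this. The two engine URLs are made up; you'd substitute real endpoints or API calls for whichever engines you use:

```php
<?php
// Read the search term submitted from the front-end form.
$query = isset($_GET['q']) ? trim($_GET['q']) : '';
if ($query === '') {
    echo 'Please enter a search term.';
    exit;
}

// Hypothetical engines that return JSON; real ones will differ.
$engines = [
    'EngineA' => 'https://example-engine-a.test/search?q=',
    'EngineB' => 'https://example-engine-b.test/search?q=',
];

$results = [];
foreach ($engines as $name => $base) {
    // file_get_contents is fine for simple GETs; cURL gives more control.
    $raw = @file_get_contents($base . urlencode($query));
    if ($raw === false) {
        continue; // skip engines that fail rather than breaking the page
    }
    $decoded = json_decode($raw, true);
    if (is_array($decoded)) {
        foreach ($decoded as $item) {
            $results[] = ['engine' => $name, 'item' => $item];
        }
    }
}

// Collate and filter here (dedupe by URL, re-rank, etc.), then render.
foreach ($results as $r) {
    echo htmlspecialchars($r['engine']) . ': '
       . htmlspecialchars(json_encode($r['item'])) . "<br>\n";
}
```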
Google and some other search engines provide APIs that let you query them programmatically and specify how you want the results returned.
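For example, Google offers a Custom Search JSON API. Something along these lines would work, assuming you've registered for an API key and a search engine ID (the YOUR_API_KEY and YOUR_CX placeholders below):

```php
<?php
// Query Google's Custom Search JSON API and print title/link pairs.
$query = 'metasearch engines';
$url = 'https://www.googleapis.com/customsearch/v1'
     . '?key=YOUR_API_KEY&cx=YOUR_CX&q=' . urlencode($query);

$raw  = file_get_contents($url);
$data = json_decode($raw, true);

// The API returns matching results in an 'items' array.
if (isset($data['items'])) {
    foreach ($data['items'] as $item) {
        echo $item['title'] . ' - ' . $item['link'] . "\n";
    }
}
```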
Some do not, in which case you would have to fetch the results page and interpret it yourself; this is referred to as 'scraping'. The better scraping libraries seem to be written for Ruby. You could write the whole site in Ruby on Rails to take advantage of that, but check with your host that they support Rails applications.
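That said, if you stick with PHP, its built-in DOM extension will get you a fair way for simple scraping. Here's a rough sketch; the results URL and the markup structure are invented, so you'd need to inspect the real page to find the right elements:

```php
<?php
// Fetch a (hypothetical) results page to scrape.
$html = file_get_contents(
    'https://example-engine.test/search?q=' . urlencode('test')
);

$doc = new DOMDocument();
@$doc->loadHTML($html); // suppress warnings from imperfect real-world HTML

$xpath = new DOMXPath($doc);
// Hypothetical markup: each result is a link inside a div with class "result".
foreach ($xpath->query('//div[@class="result"]//a') as $link) {
    echo $link->textContent . ' - ' . $link->getAttribute('href') . "\n";
}
```

Bear in mind that scraped markup changes without warning, so this kind of code tends to need regular maintenance in a way that API-based code doesn't.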
Have you done much coding so far?