A robots.txt file lets a site ask search engines not to crawl specific webpages. Compliant crawlers skip the disallowed pages, so those pages are usually left out of the search engine's index. Note, however, that robots.txt blocks crawling, not indexing: a disallowed page can still show up in search results if other sites link to it.
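For illustration, a minimal robots.txt placed at the site root might look like the following (the /private/ path and the page name are hypothetical examples, not part of any standard):

    # Ask all crawlers to skip these paths
    User-agent: *
    Disallow: /private/
    Disallow: /drafts/secret-page.html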
Because robots.txt is purely advisory and some crawlers ignore it entirely, it offers no real protection on its own. Password-protecting the page is the most reliable way to keep it hidden, since a crawler that cannot load the page cannot index its contents.
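As a sketch of one common approach, HTTP Basic Authentication can be enabled on an Apache server with a few directives in an .htaccess file (the AuthUserFile path is an assumption for this example; the .htpasswd credentials file must be created separately, e.g. with the htpasswd tool):

    # Require a valid username/password for this directory
    AuthType Basic
    AuthName "Restricted Area"
    AuthUserFile /etc/apache2/.htpasswd
    Require valid-user

Other servers offer equivalent mechanisms (for example, auth_basic in nginx), and application-level logins work just as well; the key point is that the server refuses to serve the page without credentials.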