Robots text tells search engines whether or not to crawl a page. By default, all pages on your site are crawled. To prevent search engines from crawling specific pages, edit the robots.txt file.
Add robots text
- Go to Settings > Robots Text under SEO.
- Enter Disallow: /PageName in the Robots Text box.
- Replace "PageName" with the name of the page you want to block.
- Click Save.
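For example, a robots.txt that blocks crawling of a single page while leaving the rest of the site crawlable might look like this (the page name /pricing is illustrative):

```
# Applies to all crawlers
User-agent: *
# Block this page; replace /pricing with your own page name
Disallow: /pricing
```

Note that a Disallow rule matches by URL prefix, so Disallow: /pricing also blocks paths such as /pricing-faq; add a trailing slash or use a more specific path if you only want to block one page.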