Permanently remove your URL from search engines
To permanently remove content or a URL from Google Search, take one of the following actions:
- Remove or update the content on your site (images, pages, directories) and make sure that your site returns a 404 (Not Found) HTTP status code. Non-HTML files (like PDFs) should be completely removed from your site, since bots can still index them.
- Block access to the content: For example, require a password.
- Indicate that the page should not be indexed by using the noindex meta tag. This is less secure than the other methods.
- Do not use robots.txt as a blocking mechanism.
Use custom code to hide your Kajabi site
You can also use the Removals Tool to hide a URL from search results. Alternatively, if you are on an account plan that includes code editor access, like the Pro Plan, you can add custom code to your site to hide your Kajabi site from search engines.
Add custom code to hide your Kajabi site
Prevent a page from appearing on most search engines by including a noindex meta tag in your page’s code.
How to include a noindex meta tag in your page’s code
- Open the Website tab from the Dashboard.
- Select Modify code from the dropdown option next to your Live site name.
- From the Snippets directory, open the global_head.liquid file.
- Copy the code snippet below and paste between the tags in your file.
- Click Save:
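The snippet referenced in the steps above is the standard robots noindex meta tag; a minimal example (the exact snippet shown in your Kajabi account may differ) is:

```html
<!-- Asks compliant search engine crawlers not to index this page -->
<meta name="robots" content="noindex">
```

The global_head.liquid file renders into each page’s head section, which is where search engines expect to find this tag.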
Add custom code to the legacy template, Premier
If you are using the legacy template, Premier, add the same code snippet to your Premier template.
How to add your code to your Premier template
- Open the Website tab from the Dashboard.
- Select Modify code from the dropdown option next to your Live site name.
- From the Layouts directory, open the theme.liquid file.
- Copy the code snippet below and paste between the tags in your file.
- Click Save:
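As with global_head.liquid, the snippet referenced here is the standard noindex meta tag; a minimal example (your account’s snippet may differ) is:

```html
<!-- Asks compliant search engine crawlers not to index this page -->
<meta name="robots" content="noindex">
```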
Block AI bots - How Kajabi can help
Completely blocking AI bots from scraping your content can be tricky, but Kajabi can help you make it harder and discourage bots from scraping. Many AI scrapers identify themselves with a unique code called a “user agent.” Kajabi leverages Cloudflare to protect your sites and can help you block scrapers with this method. Reach out to security@kajabi.com and we’d be more than happy to help.
What are some web crawlers associated with AI? (This list is not exhaustive.)
- User agent: anthropic-ai
- User agent: CCBot
- User agent: ChatGPT-User
- User agent: GPTBot
- User agent: Omgilibot
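Well-behaved crawlers on this list can also be discouraged with robots.txt rules keyed to their user agents. Note that robots.txt compliance is voluntary, so this only deters cooperative bots; the Cloudflare-based blocking described above is the stronger option. A sketch of such a file:

```
# robots.txt — asks each listed AI crawler not to fetch any page.
# Compliance is voluntary; non-compliant scrapers ignore this file.
User-agent: anthropic-ai
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: ChatGPT-User
Disallow: /

User-agent: GPTBot
Disallow: /
```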