Thread: How to block inner pages using Robots.txt

kevinmax

Posts: 72
Joined: Jun 2015
Reputation: 0

#1
Member
I need to block my website's inner pages using robots.txt. I don't want to block all the pages; I want Google to deindex only the specific inner pages of my website that are listed in the robots.txt file.

Examples please.
maya

Posts: 138
Joined: Jul 2015
Reputation: 0

#2
Member
Let's say I want to block or disallow Google from crawling a directory called TEST and a specific URL called TEST.php. Note that the rules only apply once a User-agent line is present:

Code:
User-agent: *
Disallow: /TEST.php
Disallow: /TEST/
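If it helps, here is a rough sketch of what a fuller robots.txt could look like. The paths /private/ and /checkout.php and the sessionid parameter are made-up examples, so swap in your own inner pages; the User-agent line is required for the Disallow rules to take effect, and Googlebot understands the * wildcard:

Code:
# hypothetical robots.txt served at http://www.yoursite.com/robots.txt
User-agent: Googlebot
Disallow: /private/           # keep Googlebot out of everything under /private/
Disallow: /checkout.php       # block one specific inner page
Disallow: /*?sessionid=       # wildcard: any URL containing ?sessionid=

User-agent: *
Disallow: /private/           # all other crawlers are only blocked from /private/

Keep in mind that robots.txt only stops crawling. If the pages are already in Google's index, they may stay there until Google processes a noindex tag or you use the Remove URLs tool in Search Console. You can check which URLs your rules block with the robots.txt Tester in Google Search Console.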