Thread: How to block inner pages using robots.txt

kevinmax

Posts: 72
Joined: Jun 2015
Reputation: 0

#1
Member
I need to block some of my website's inner pages using robots.txt. I don't want to block all the pages; I want Google to deindex only the specific inner pages that are listed in the robots.txt file.

Examples please.
maya

Posts: 133
Joined: Jul 2015
Reputation: 0

#2
Member
Let's say I want to block or disallow Google from crawling a directory called TEST and a specific URL called TEST.php too:

Code:
User-agent: Googlebot
Disallow: /TEST/
Disallow: /TEST.php
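
If you want to sanity-check rules like these before publishing them, Python's standard library includes a robots.txt parser. Here's a minimal sketch using the rules above (example.com is just a placeholder domain):

Code:
import urllib.robotparser

# The rules from the example above (hypothetical site).
rules = """\
User-agent: Googlebot
Disallow: /TEST/
Disallow: /TEST.php
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules.splitlines())  # parse in place, no network fetch needed

# can_fetch(useragent, url) reports whether that crawler may fetch the URL
print(parser.can_fetch("Googlebot", "https://example.com/TEST/page.html"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/TEST.php"))        # False
print(parser.can_fetch("Googlebot", "https://example.com/index.html"))      # True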
salenaadam

Posts: 86
Joined: Jan 2017
Reputation: 0

#3
Member
The robots exclusion protocol (REP), or robots.txt, is a text file webmasters create to instruct robots (typically search engine crawlers) how to crawl pages on their website. Among other things, it lets you block a specific web crawler from a specific web page. Keep in mind that each search engine interprets REP directives slightly differently.
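
For example, to block one crawler from a single page while leaving the site open to everyone else, you give that crawler its own group. A small sketch checking this behaviour with Python's built-in parser (the crawler name and path are just placeholders):

Code:
import urllib.robotparser

# Hypothetical rules: shut Bingbot out of one page, allow all other crawlers.
rules = """\
User-agent: Bingbot
Disallow: /members-only.html

User-agent: *
Disallow:
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Bingbot is blocked from that one page; other crawlers are not.
print(parser.can_fetch("Bingbot", "https://example.com/members-only.html"))    # False
print(parser.can_fetch("Googlebot", "https://example.com/members-only.html"))  # True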




