Thread: How to block inner pages using Robots.txt

kevinmax

Posts: 72
Joined: Jun 2015
Reputation: 0

#1
Member
I need to block my website's inner pages using robots.txt. I don't want to block all pages; I want Google to deindex only the specific inner pages listed in the robots.txt file.

Examples please.
maya

Posts: 133
Joined: Jul 2015
Reputation: 0

#2
Member
Let's say I want to block or disallow Google from crawling a directory called TEST and also a specific URL called TEST.php. The Disallow lines need a User-agent line above them to form a valid group, and /TEST/ (rather than /TEST*) blocks just that directory without also catching unrelated paths that happen to start with TEST:

Code:
User-agent: Googlebot
Disallow: /TEST.php
Disallow: /TEST/
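If you want to sanity-check rules like these before uploading them, here's a minimal sketch using Python 3's standard-library parser (note it only does prefix matching, no wildcards; example.com stands in for your domain):

Code:
# Quick local check of the rules above using the standard library.
from urllib.robotparser import RobotFileParser

rules = """User-agent: Googlebot
Disallow: /TEST.php
Disallow: /TEST/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Blocked paths print False, allowed paths print True.
print(rp.can_fetch("Googlebot", "https://example.com/TEST.php"))    # False (blocked)
print(rp.can_fetch("Googlebot", "https://example.com/TEST/a.html")) # False (blocked)
print(rp.can_fetch("Googlebot", "https://example.com/other.html"))  # True (allowed)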
salenaadam

Posts: 86
Joined: Jan 2017
Reputation: 0

#3
Member
The robots exclusion protocol (REP), or robots.txt, is a text file webmasters create to instruct robots (typically search engine crawlers) which parts of a site they may crawl. Among other things, it can block a specific web crawler from a specific web page. Keep in mind that, unlike crawler directives, REP tags such as noindex are interpreted somewhat differently by each search engine.
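For instance, blocking one named crawler from one specific page could look like this (the crawler name and path here are just placeholders):

Code:
User-agent: Bingbot
Disallow: /private-page.html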
rawatgoal

Posts: 22
Joined: Jun 2017
Reputation: 0

#4
Junior Member
In my opinion, you should not use robots.txt to hide your web pages from Google Search results. Other pages might link to your page, and it could get indexed that way, bypassing the robots.txt file entirely. If you want to keep a page out of search results, use another method such as password protection or a noindex tag or directive.
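A minimal sketch of the noindex approach (the page must stay crawlable, i.e. not blocked in robots.txt, or Google will never see the tag):

Code:
<!-- In the page's <head>: ask search engines not to index this page -->
<meta name="robots" content="noindex">

For non-HTML files such as PDFs, the same directive can be sent as the HTTP response header X-Robots-Tag: noindex.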
maheshseo

Posts: 54
Joined: May 2017
Reputation: 0

#5
Member
Use Disallow with the path of your inner page, for example:

Code:
User-agent: *
Disallow: /your-inner-page/
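The path is matched as a case-sensitive prefix, so /your-inner-page/ also blocks everything underneath that directory.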




