Meta Tags To Stop Spiders


Sometimes you don't want search engine spiders to crawl your site. For example, you may not want your pages to show up in a Google search. Here is what you can do: use meta tags in your web pages. Meta tags let you control indexing and crawling of your site.

By default, every single page on your site will be indexed by search engine spiders. To change this default behaviour, just use meta tags!

How? Your meta tags must be located in the HTML code of your pages, in the header between the <head> and </head> tags.

So if you want spiders to index your page and follow every link on it, insert:

<meta name="robots" content="index, follow">

For no indexing but following of links, use:

<meta name="robots" content="noindex, follow">

More combinations are possible, for example:

<meta name="robots" content="index, nofollow">

and:

<meta name="robots" content="noindex, nofollow">

Let's say you don't want your pages to be indexed but do want spiders to follow the links on them. Your HTML should start like this:

<html>
<head>
<title>Page title</title>
<meta name="robots" content="noindex, follow">
</head>
<body>body contents</body>
</html>

That's it! Spiders will always check for these meta tags before deciding what to do with your pages.

Hope this helps :)


That is very interesting to know, jcguy. There are tons of things you can do with meta tags, which I have heard about before; they can be very useful for people in certain situations. This topic seems very familiar to me. Were you at Inuration Technologys at all? Because this and that UNI.CC tutorial are familiar to me.


If you're using XHTML, be sure to close the meta tags with a / at the end of each tag, like this example:

<meta name="robots" content="noindex, nofollow" />
If having standards-compliant HTML or XHTML is important to you, also make sure you declare a DOCTYPE at the very top of the document, before the opening <html> tag.
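For example, using the XHTML 1.0 Transitional DOCTYPE (just one of the standard ones, picked here for illustration), the top of the page would look something like this:

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
    "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<title>Page title</title>
<meta name="robots" content="noindex, nofollow" />
</head>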


Does anyone know if there is a way to stop leeches from attaching themselves to files and stealing the bandwidth? Could it be done with a specific meta tag, or is something else required?


Another way to block spiders from visiting and indexing your website, if you have your own domain, is to create a file called "robots.txt" in the root directory of your website. Put this in the text file:

User-agent: *
Disallow: /

This means that all spiders reading the file should not visit or index anything at all.

With this file you can also specify which directories and files you do not want the spiders to index, e.g.

User-agent: *
Disallow: /index.html
Disallow: /animals/
Disallow: /objects/toaster.html
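As a side note, if you only want to keep out one particular spider rather than all of them, you can replace the * wildcard with that spider's user-agent name; Googlebot here is just an example:

User-agent: Googlebot
Disallow: /

All other spiders would then still be free to index the site.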


Ohhh wow, that's really interesting! =) overture, there is cPanel provided by asta host... I was at Inuration so I've used cPanel. I think it's called Web Protect (htaccess editor)? I'm not too sure...


Does anyone know if there is a way to stop leeches from attaching themselves to files and stealing the bandwidth? Could it be done with a specific meta tag, or is something else required?

I assume you're talking about images. Here's what you can do:

1. Create a separate folder and put all your images (or the images you wish to protect) in it.

2. Create a text file called ".htaccess" in that folder.

3. The text file should contain these lines:

SetEnvIfNoCase Referer "^http://www.blah.com/" locally_linked=1
SetEnvIfNoCase Referer "^http://www.blah.com$" locally_linked=1
SetEnvIfNoCase Referer "^http://blah.com/" locally_linked=1
SetEnvIfNoCase Referer "^http://blah.com$" locally_linked=1
SetEnvIfNoCase Referer "^$" locally_linked=1
<FilesMatch "\.(gif|jpe?g)$">
Order Allow,Deny
Allow from env=locally_linked
</FilesMatch>

Replace blah.com with your domain name.

4. Now when someone tries to steal bandwidth by linking to your image from their site, the image will not be displayed.

If there are other files that you want to protect, e.g. zip files, just put them in that special folder and add the zip extension to the FilesMatch line:

<FilesMatch "\.(gif|jpe?g|zip)$">

That's it!
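A side note in case anyone is on a newer server: on Apache 2.4 and later the old Order/Allow directives are deprecated, so (assuming mod_setenvif and mod_authz_core are available, which is usually the case) the equivalent block would look something like this:

<FilesMatch "\.(gif|jpe?g|zip)$">
Require env locally_linked
</FilesMatch>

The SetEnvIfNoCase lines above stay the same; only the allow/deny part changes.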

 

hope this helps :)




dissipate, I was talking about leeches which attach themselves to downloadable files and, when clicked, use up your bandwidth; I know someone who had 20 gigs of bandwidth used in a few days because of them. I was wondering how to stop them. Would the way you suggest work for downloadable files like .zip/.rar/.exe?

Edit: Would I have to change the extension types, like where you have put gif|jpe?g?


overture: Hmm, I'm not really sure what you mean, sorry. Could you elaborate further? Do you mean you're putting up downloadable files like zips, rars, and exes, and you don't want people to download them or something?? *boggle*

