fermin25

Some HTML Ways To Increase Your Ranking In Search Engines
(This topic is based on my own experience and doesn't apply to everyone.)


Your website is looking good and has all the things you've been told it needs: great titles, H1-H3 tags, meta tags, good unique content and easy navigation.

So why aren't you getting the traffic you deserve?

These tips are for beginner to advanced SEOs. I have many other good tips in the 2007 and 2008 Top Tips, and I struggled to come up with 12 more unique tips worthy of being included in anything called "Top Tips", but I think I have found them.
I've laid these tips out differently than in the past. Rather than spit out 12 random tips, I have arranged them into a step-by-step strategy that almost anyone with a bit of knowledge can implement. The first part covers getting your website into shape, and the latter half gives you strategies to take advantage of the work you have done.
1. Make Sure Your Website Is Ready for SEO 
Most website owners have come to the realisation that, one way or another, they are going to have to start injecting content into their website to have any chance at a top 10 ranking. So many site owners set about creating or buying content to add to various sections of their website in an attempt to pacify the search engines. In most cases it lasts a few weeks and is abandoned, because it really takes a lot to create good content on a daily basis. Unfortunately, the road to Google Hell (not ranking) is paved with good intentions. Unless you were smart and integrated Wordpress, Joomla or some other content-generating application early on, you may have pages all over the place, which means you may also have internal navigation issues that cause problems for the robots when they try to crawl your website. If the bots cannot find your pages, or get stuck in a loop as a result of poor navigation, then any work could potentially be a waste of your time and money.
Run Xenu (a free tool) on your website to be sure this isn't the case. It will give you a detailed overview of your website's navigation and show whether you have any problems with your internal linking structure. It will also notify you of any redirects or errors in linking, particularly any 302 redirects or 404 pages being returned throughout your entire website. You want to identify any adverse navigation problems that it finds. 302 redirects, for instance, are frowned upon by Google but are often left in place while working on a site. If you are getting 404s, you'll be able to identify and fix them.
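If you would rather script a quick spot-check than run a desktop tool, the same 302/404 sweep can be sketched in a few lines. This is a minimal sketch, not a Xenu replacement; it assumes Python 3 with the requests and beautifulsoup4 packages installed, and START_URL is a hypothetical placeholder for your own homepage.
```python
# Rough sketch of a Xenu-style crawl: report 301/302/404 responses
# while following internal links. START_URL is a placeholder.
import urllib.parse

import requests
from bs4 import BeautifulSoup

START_URL = "http://www.yoursite.com/"
seen, queue = set(), [START_URL]

while queue:
    url = queue.pop()
    if url in seen:
        continue
    seen.add(url)
    try:
        # allow_redirects=False so 301/302 hops are reported, not silently followed
        resp = requests.get(url, allow_redirects=False, timeout=10)
    except requests.RequestException as exc:
        print("ERROR", url, exc)
        continue
    if resp.status_code in (301, 302, 404):
        print(resp.status_code, url)
    if resp.status_code == 200 and "text/html" in resp.headers.get("Content-Type", ""):
        for a in BeautifulSoup(resp.text, "html.parser").find_all("a", href=True):
            link = urllib.parse.urljoin(url, a["href"]).split("#")[0]
            if link.startswith(START_URL):   # stay on our own domain
                queue.append(link)
```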
Since we are getting our site healthy to maximise the effects of the other 11 steps, let's go to Webmaster Tools and check out your overview. It covers similar ground to Xenu, but it shows you what Google in particular is seeing. The image below is an overview in Google's Webmaster Central. The various errors are problems and need to be fixed.

Not found errors could be pages that have been moved, replaced, or renamed. You may have internal or, even worse, external links pointing to these pages, and those linking pages could have a significant impact on your rankings.
URLs timed out could be caused by many issues, but whatever they are, they need to be resolved. If the Googlebot can't get to the page, then any pages beyond it, and anything on the page, may not be indexed. In that case Google will rely on the cached version or whatever it was able to crawl the last time it reached the page. This can cause ranking problems because it's stale content, or content that cannot be verified as relevant to the backlinks pointing at the page.
Unreachable URLs are internal and external links pointing at a page within your website that cannot be reached. These are links that someone was nice enough to put up for you, but you moved the page or they misspelled the URL. You could: contact the site owner and have it fixed; create the page relevant to the anchor text used; or set up a permanent 301 redirect so that any PageRank or authority is passed on to the new destination page. I have seen websites with thousands of these errors after relaunching or rewriting URLs. This could be huge for some of you!
Once you have fixed these issues, create a new XML sitemap and manually submit it through Webmaster Central.
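If you have a plain list of URLs and no plugin that builds the sitemap for you, generating one yourself is straightforward. Here is a minimal sketch using only Python's standard library; the page URLs and output filename are placeholders.
```python
# Minimal XML sitemap generator; writes sitemap.xml ready to submit.
# The page list below is a hypothetical placeholder.
import xml.etree.ElementTree as ET

pages = [
    "http://www.yoursite.com/",
    "http://www.yoursite.com/products/",
    "http://www.yoursite.com/contact/",
]

urlset = ET.Element("urlset",
                    xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in pages:
    ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = page

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8",
                             xml_declaration=True)
```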
2. Check your internal canonicalization 
Websites can often be reached at more than one URL (e.g. http://yoursite.com, http://www.yoursite.com and http://www.yoursite.com/index.html may all return the same page), and search engines can treat each variation as a separate page.
The nofollow attribute has also created a useful tool for search engine optimisation. We can nofollow links within our website to preserve PR and pass it on to the pages that we want to rank well. The "About Us", "Contact Us" and "Login Here" pages/links are obviously not pages that we care to rank for, so by adding nofollow to these internal links, we can funnel more PR to the important pages.
[Diagram: PR flow without nofollows vs. with nofollows]

Another great way to use this tip is when you are creating new pages for niche phrases that you identified through your logfiles or HitTail. Build a doorway page off your homepage. I use this as a "holding" area for newly identified hotlist terms that I want to rank quickly. Put nofollow on all of the template/navigational links. Add relevant content on the doorway page using your newly found keyword, and anchor the keyword or keyword phrase (turn it into a link) to the new page that you created.
This will funnel all of the available link juice to the new page targeting the new term, and you will see the page rise through the rankings.
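Mechanically, all this tip requires is rel="nofollow" on the utility links in your template. As one way to automate the edit, here is a hedged sketch using Python and BeautifulSoup; the template filename and link paths are hypothetical.
```python
# Sketch: add rel="nofollow" to selected internal links in a template.
# Assumes beautifulsoup4 is installed; filename and paths are hypothetical.
from bs4 import BeautifulSoup

NOFOLLOW_TARGETS = {"/about-us/", "/contact-us/", "/login/"}

with open("template.html", encoding="utf-8") as f:
    soup = BeautifulSoup(f.read(), "html.parser")

for a in soup.find_all("a", href=True):
    if a["href"] in NOFOLLOW_TARGETS:
        a["rel"] = ["nofollow"]   # renders as e.g. <a href="/login/" rel="nofollow">

with open("template.html", "w", encoding="utf-8") as f:
    f.write(str(soup))
```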
7. Test your landing pages and evaluate click-through rates
Google has admitted in the last few months that they do look at click-through rates in their rating process. I already knew Yahoo did this, so it was no surprise that Google does it as well. It just makes sense. Google takes information gathered from their toolbar and Google Analytics and uses it in their algorithm. Not much of this can be realistically proven, but this is something that you should be doing anyway to identify problems or other issues. A typical CTR is between 25-35%. Do A/B or multivariate testing to improve the CTR on your landing page. A PPC account is a great tool for this. The end goal is to get the user to click through to at least one additional page within your website.
Affiliates should be vigilant in implementing this technique, because most affiliate sites that I have seen either have banners and the like with CTAs (calls to action), or they have an immediate redirect to their Provider's website. Neither of these is beneficial to your overall trust or authority rating. Google looks for traffic to stay on your site (at least past the front page), so I optimise landing pages to provoke click-throughs. There are countless ways to implement this. You can incentivise it or use some other technique, but the goal is to get visitors through to a secondary landing page before you send them to your Provider. This technique may not be for everyone, but for those who can use it, I felt it was a top tip that, even if you can't try it out now, you can file away for another day.
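Before declaring a winner in an A/B test, it is worth checking that the difference in click-through rate is bigger than chance. Here is a minimal two-proportion z-test sketch in Python; the click and impression counts are made up for illustration.
```python
# Two-proportion z-test sketch for A/B click-through rates.
# Standard library only; the counts below are invented examples.
import math

clicks_a, views_a = 300, 1000    # variant A: 30% CTR
clicks_b, views_b = 350, 1000    # variant B: 35% CTR

p_a, p_b = clicks_a / views_a, clicks_b / views_b
p_pool = (clicks_a + clicks_b) / (views_a + views_b)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
z = (p_b - p_a) / se

print(f"CTR A={p_a:.1%}  CTR B={p_b:.1%}  z={z:.2f}")
print("significant at 95%" if abs(z) > 1.96 else "not significant yet")
```
With these numbers z comes out around 2.4, so the 5-point lift would pass the usual 95% bar; smaller samples often will not.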
8. Identify Supplemental/Omitted Pages & Get Them Out 
You may have hundreds or thousands of pages in supplemental or omitted results. This usually happens because you have duplicate or similar pages (as is often the case with affiliate websites), the page has not been updated recently, Google has clustered the URLs, there is too little content, there are no backlinks, or internal navigation is poor.
Google has said that they have eliminated Supplemental Results. I believe this is because of the integration of Universal Search (a.k.a. Blended Search), which happened to occur shortly before the change. Since aged or orphan pages could actually be documents, news articles, videos, blogs and forums holding valuable and more relevant information than a new page, this "filter" needed to be changed to include all available resources.
The best way to identify these pages is to enter site:yourwebsite.com into the Google search bar and take note of the number of pages Google shows as indexed:

This search shows Google can see 3,930 pages. 
"Supplemental sites are part of Google's auxiliary index. Google is able to place fewer restraints on sites that we crawl for this supplemental index than they do on sites that are crawled for the main index. For example, the number of parameters in a URL might exclude a site from being crawled for inclusion in the main index; however, it could still be crawled and added to Google's supplemental index.
The index in which a site is included is completely automated; there's no way for you to select or change the index in which your site appears. Please be assured that the index in which a site is included does not affect its PageRank."
Nonsense! 

At the time of this article, Google was already starting to remove the supplemental label from their search results. Until recently, all you had to do was go to the last few pages of your query and locate the pages that had " - Supplemental Result" just after the page size. They aren't showing these anymore. Here's what they had to say: "Since 2006, we've completely overhauled the system that crawls and indexes supplemental results. The current system provides deeper and more continuous indexing. Additionally, we are indexing URLs with more parameters and are continuing to place fewer restrictions on the sites we crawl. As a result, Supplemental Results are fresher and more comprehensive than ever. We're also working towards showing more Supplemental Results by ensuring that every query is able to search the supplemental index, and expect to roll this out over the course of the summer.
The distinction between the main and the supplemental index is therefore continuing to narrow. Given all the progress that we've been able to make so far, and thinking ahead to future improvements, we've decided to stop labeling these URLs as 'Supplemental Results.' Of course, you will continue to benefit from Google's supplemental index being deeper and fresher."
Google then said that the easiest way to identify these pages is like this: "First, get a list of all of your pages. Next, go to the webmaster console [Google Webmaster Central] and export a list of all of your links. Make sure that you get both external and internal links, and concatenate the files.

Now, compare your list of all your pages with your list of internal and external backlinks. If you know a page exists, but you don't see that page in the backlink list, that deserves investigation. Pages with very few backlinks (either from other sites or internally) are also worth checking out."
Nonsense! 
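Still, if you do want to run the comparison Google describes, it only takes a few lines to script. Here is a sketch assuming you have exported both lists to plain-text files with one URL per line; the filenames are hypothetical.
```python
# Sketch of Google's suggested comparison: pages you know exist vs.
# pages that appear in your exported link report. One URL per line;
# filenames are hypothetical placeholders.
def load(path):
    with open(path, encoding="utf-8") as f:
        return {line.strip() for line in f if line.strip()}

all_pages = load("all_pages.txt")
linked_pages = load("links_export.txt")   # internal + external, concatenated

for page in sorted(all_pages - linked_pages):
    print("no known backlinks:", page)
```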

Okay, so now you have identified the pages that are in supplemental results and not showing up anywhere in the results.
Now we need to identify why they are there. The main reasons that a page goes to supplemental results are:

1. Duplicate Content 
2. 301s: redirected pages that have a cache date prior to the 301 being put in place
3. A 404 was returned when Google attempted to crawl it 
4. New Page 
5. Bad Coding 
6. Page Hasn't Been Updated in a While
7. Pages That Have Lost Their Back Links 

According to Matt Cutts of Google, "PageRank is the primary factor determining whether a URL is in the main web index vs. supplemental results."
Now this isn't the be-all and end-all, but it covers about 95% of the reasons that you may be in the supplementals.

So now we know what they are, how to find them, and why they are most likely in the supplemental results. Now let's get them out of there.

Here are the different methods that I use when I find that a page has gone supplemental:
1. Add fresh content to the page 
2. Add navigation to the page from the main page 
3. Move the pages to the first subdirectory if it is not already there 
4. Get a back link to the page and/or create a link from an existing internal page with the anchor text containing the keywords for that page 
5. Do some social bookmarking on the page 
6. Make sure the page is included in my XML sitemap and then resubmit it to Webmaster Central.
7. Lastly, if none of the above seem to be working after 90 days, and I have another page that is relevant, does have PageRank and isn't in the supplementals, I do a 301 (permanent redirect) to it from the supplemental page (see the sketch below).
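The 301 in step 7 is normally a single line in your web server's configuration, but as a self-contained illustration of what a permanent redirect actually does, here is a sketch using only Python's standard library; the old and new paths are hypothetical.
```python
# Minimal 301 sketch with the standard library. In practice you would
# use your web server's config; the paths here are hypothetical.
from http.server import BaseHTTPRequestHandler, HTTPServer

REDIRECTS = {"/old-supplemental-page.html": "/new-page.html"}

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in REDIRECTS:
            self.send_response(301)                  # permanent redirect
            self.send_header("Location", REDIRECTS[self.path])
            self.end_headers()
        else:
            self.send_response(404)
            self.end_headers()

HTTPServer(("localhost", 8000), Handler).serve_forever()
```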

9. Use Geotargeting for Language and Regional Targeting 

The ways that people search and the results the search engines are delivering are evolving rapidly. Smarter queries and more complex algorithms mean that you need to use various techniques to be sure you are showing up in the results. Local search, advanced search, regional search and language-based searches are some of the filters an end-user or a search engine can use in determining who shows up, when they show up and where they show up. 
Geotargeting is one tool Google has refined and one that you can manipulate to a point in order to increase saturation in any market. 
Beyond the obvious on-page considerations, different searches will deliver (in most cases) a different set of results. The results can differ greatly depending on several considerations:
1. The IP of the end-user
2. The server location of the website
3. Any geographically targeted settings in Webmaster Central
4. The relationship between the search filters and the resulting web pages (i.e. did they search for Pages from [region] or Pages in [language]?)
5. Whether the end-user is searching a different extension than the default engine (they manually enter Google.com searching for US or English results in a non-US region)
The other elements that will affect rankings relate to backlinks:
1. Are the links from a TLD that matches the destination URL (i.e. a .nl site linking to a .nl website)?
2. Is the IP of the linking website located in the same region as the linked URL?
3. PageRank, linking anchor text, and additional outbound links on the page linking to you
4. On-page relevancy
5. Language-based meta tags
6. Everything in the above five items relating to the linking website/page
Any one of these elements can give you an edge over your competition. 
Searching any of Google's (non-US) datasets will generally return a variety of websites when no language or location filter is selected. These can include internal pages of a website, subdirectories (http://www.yoursite.com/french), subdomains (http://french.yoursite.com/), and various TLDs (top-level domains like .com and .nl). All 11 of the above factors are present in the automatic algorithm.
The problem is that no one really knows which approach is best, or which algorithmic attribute is the most effective, so what can we do with this? 
What we want to do is look at the existing results using the available search filters, and at the existing websites that are ranking high, and determine the best strategy for your website. This takes deep page analysis of your competitors.
The important thing to note is that there is a hierarchy among these solutions in terms of which is best. Every website has its own individual solution based on its demographics, site mechanics and available resources. What you need to consider is:
1. What is your target market?
2. Do you need geographical targeting?
3. Do you need language-based subdomains or subdirectories?
4. Should you move hosting?
5. Can you afford to do it all?
How & When to Use Geographical Targeting
Here's what to do, depending on your goal.
To geographically target a region:
1. Create a subdomain or a subdirectory in the native language and use Webmaster Central to geographically target it
2. Host the subdomain on a server in the native region and use geographical targeting
3. Build backlinks from similar TLDs
To target a specific language:
1. Create a subdirectory in the native language (i.e. http://www.yoursite.com/nl/)
2. Build backlinks from same-language websites
3. Do not use geographical targeting
The reason you do not want to use geographical targeting along with a language-based strategy is that if the end-user searches in the native language on Google.com, a site using content in that language will be stronger than the same site with geographical targeting in place. (This isn't dependent on whether you use subdirectories or subdomains, unless you host the subdomain in the target region.)
The answer for me is that I want it all...and NOW!! 
I've recently had subdomains with geographical targeting turned on and content in the native language rank top 10 in six weeks. I've had brand-new websites with the appropriate TLDs (i.e. .nl, .de & .es) show up in eight weeks. I've even had a .com hosted in the US without geographical targeting show up in the top 10 results for "Hollywood" terms when it had never been in the UK results before.
You can start with subdomains. Look at your logfiles to determine where your current traffic is coming from; that will tell you what to do first. Bounce rates can also tell you a lot.
For example, if your secondary traffic source is Germany and you have a high bounce rate, start with a language-based subdirectory, then maybe move on to creating a subdomain, hosting it in Germany, and setting the geographical targeting to Germany in Webmaster Central. Then go back and start all over again with the region that has the next-highest contribution.
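A quick way to see which Google properties already send you visitors is to count referrer hostnames in your access log. Here is a rough sketch for Apache combined-format logs; the log path and the regex reflect assumptions about your setup.
```python
# Rough sketch: count Google referrer hosts (google.de, google.nl, ...)
# in an Apache combined-format access log. Path and format are assumptions.
import collections
import re

counts = collections.Counter()
with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        m = re.search(r'"https?://(www\.)?(google\.[a-z.]+)/', line)
        if m:
            counts[m.group(2)] += 1

for host, n in counts.most_common(10):
    print(f"{host:20} {n}")
```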
Important Things to Remember! 
• To target a language using only subdirectories, do not use geographic targeting
• You can target a language with both subdomains and subdirectories, but if you have a top-level TLD (.com), use subdirectories rather than subdomains
• You can use Google's geographical targeting on subdomains and subdirectories
• Your title should be in the native language and/or use regional slang terms where they apply
• Use language-based meta tags whenever targeting language-based searches
• Host subdomains that are for geographical targeting in the target region
• When you implement the subdomain strategy, link to it from the original website
• Create new sitemaps for each subdomain
• When creating meta tags and content, be sure to use native slang (if you sold pants in the US and the UK: in the UK pants are referred to as trousers, and sweaters are referred to as jumpers)
• Get backlinks from matching TLDs (get a .nl link to your .nl site, in the native language)
• If you have a regional TLD (like .nl or .de), do not use geographical targeting; these domains are already associated with their designated region
So in a nutshell, I recommend that if you already have an existing website with a TLD like .com or .co.uk, and it already serves your target market, do not use the geographical targeting option. Start building subdirectories using the top native language, determined by looking at Google Analytics or your log files. Identify your top referrer language. If the languages are close, as is the case between the US, UK, New Zealand and Australia, use native slang in the title, meta tags and content. Build a new XML sitemap and manually submit it through all the main search engines.
The next step is to create a subdomain and get it hosted in the region that you are targeting. Build content in the native language, submit it, and set up the geographical target in Webmaster Central.
By implementing this strategy, you will have a significant advantage over most of your competition (or a little less after this article is released). Whether the search is initiated in or outside the region, whether your site is located in the region or just hosted there, or even if the user searches in the native language or manually enters a specific Google engine like Google.com.mx or Google.es, you will have improved saturation.
10. Use social bookmarking websites for a short-term ranking boost and blogs/forums to establish long-term trust and authority

Social Bookmarking - Wikipedia defines it: In a social bookmarking system, users store lists of Internet resources that they find useful. These lists are either accessible to the public or a specific network, and other people with similar interests can view the links by category, tags, or even randomly. Most social bookmarking services allow users to search for bookmarks which are associated with given "tags", and rank the resources by the number of users who have bookmarked them. Many social bookmarking services have also implemented algorithms to draw inferences from the tag keywords that are assigned to resources, by examining the clustering of particular keywords and the relation of keywords to one another.
GaryTheScubaGuy defines it this way: 
One of the best free ways to get increased ranking, back links and traffic, for very little time commitment other than setup. 
Right now, most search engine algorithms are placing a ton of weight on end-user "bookmarking", "tagging" or other types of end-user-generated highlighting.
Before doing any of this, run a rank report to track your progress. I have tested this on terms showing on page one, on terms ranked 11th through 12th, and on others buried around pages 5-10. It works on them all in different time frames, and the effects last for different periods of time. This you will need to test yourself. Be careful, because you don't want to be identified as a spammer. Be sure to use genuine content that provides a benefit to the user.
Here is how I recommend using social bookmarking:
1. Download Roboform. (It says it will limit you, but I've had as many as 30+ passwords created and stored in the trial version.) This will allow you to quickly fill out signup forms and store passwords for the bookmark sites that I am going to be sending you to.
2. Within Roboform, go to the custom area and enter a username and password, as well as the other information that sites usually ask for when you register. This way, when you are using these different bookmark sites, it's a one-click login and the whole thing becomes a relatively quick and painless procedure.
3. Establish accounts with these Social Bookmark Sites; 
a. Digg 
b. Technorati 
c. Del.icio.us 
d. NowPublic 
e. StumbleUpon 
f. BlinkList 
g. Spurl 
h. Furl 
i. Slashdot 
j. Simpy 
k. Google Toolbar (w/Google Bookmarking) 
4. Internet Explorer, Firefox and most other browsers have an "add a tab" option, but I use Firefox because I can bookmark the login pages in one folder, then "open all in tabs" with one click. From there I click on each tab and in most cases, if you set it up right, Roboform will have already logged you in. Otherwise you're on the login page, and by clicking the Roboform button everything is prefilled; all you need to do is click submit. (Some of the bookmark sites will let you add their button to your browser bar, or you can get a Firefox extension like the Digg add-on to make things quicker.)
5. Lastly, install the Google Toolbar. It has a bookmark function as well, and you can import all your bookmarks from Firefox directly into it. Google looks at many different things when assigning rank and trust. For instance, when you search for something and go into a website, Google will remember how long you stayed, how deep you went, and whether you came back out to the search results to select another site, which means you didn't find what you were looking for. This is all part of the privacy issues that have been in the news.
Here's what Google actually says!
"The Google Toolbar automatically sends only standard, limited information to Google, which may be retained in Google's server logs. It does not send any information about the web pages you visit (e.g., the URL), unless you use Toolbar's advanced features."
They practically spell it out for you. Use their bookmark feature just as you did with the social bookmarking I outlined above. It's just one more click.
Some of the elements that Google looks at when grading a website are:
• How much time did the average visitor spend on the site?
• What is the bounce rate on the landing page?
• How many end-users bookmarked the page?
• How many users returned to the search query and then went on to a different site?
Each time you publish an article, put a Google Alert on a unique phrase. Each time Google sends you an alert, bookmark the page on every bookmark site. This will take some getting used to, but it will eventually become second nature. Remember what I said at the beginning: "One of the best free ways to get links and traffic, for very little time commitment other than setup".

11. Target Universal Search Results 

Universal or "Blended Search" is still fairly new in the search engines, and they are working hard at filtering the bad sites from the good, but they are also delivering much more than websites in the results. You may have seen this when doing a search: a video appears in the top 5 results.

Google has turned off their supplemental filter and each time a query is entered, they virtually search their entire database for relevant results. 
The significant difference now is that the results will often include videos, news articles, .doc, .xls and .pdf files, forum posts and other data in their inventory.

All of these can be optimised for search. 

One example is to use Adobe Pro to convert pages of your website into PDF format. Name the file using your keywords and optimise the document just as you would a web page, using H1-H5 header tags, linked images and keyword anchor text that links back to your website.
Any links within your PDF or Word doc will be credited as links, and the document will deliver additional traffic streams. I used this method a year or so ago on many of my documents; all are indexed and all show in the results. (Search "Top 12 SEO Tips for 2008".)

PDFs are showing up more and more in the top results lately, so this can be significant. Video is incredibly viral. Great examples are videos of people hitting jackpots on slots. Even just a snapshot of a winner has proven to be a huge traffic source for casino and slot websites.

Create separate XML sitemaps for each set of videos or documents and submit them manually through Webmaster Central. Be sure to list each individually in your robots.txt file to tell Google where they are located.
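The robots.txt side of this is just one Sitemap line per file. Here is a small sketch that writes them out; the sitemap URLs are placeholders for your own video and document sitemaps.
```python
# Sketch: emit one "Sitemap:" directive per sitemap file in robots.txt.
# The sitemap URLs are hypothetical placeholders.
sitemaps = [
    "http://www.yoursite.com/sitemap.xml",
    "http://www.yoursite.com/sitemap-videos.xml",
    "http://www.yoursite.com/sitemap-documents.xml",
]

with open("robots.txt", "w", encoding="utf-8") as f:
    f.write("User-agent: *\nDisallow:\n\n")   # empty Disallow allows everything
    f.write("\n".join(f"Sitemap: {url}" for url in sitemaps) + "\n")
```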
12. Create a Link Acquisition Campaign 
If you haven't done this yet, you are already behind. Link building is an acceptable practice if it is done the right way. Here I'll tell you the right way.
You need to set some type of budget. Whether you're an individual with one or two accounts, or an agency with dozens, you need to have some type of budget set aside for this. It can be money or it can be time.
Here is how I segment my campaigns:
15%-25% to purchase one-way backlinks. I create custom/bespoke articles that complement the owner's site and contain my keyword phrase as my anchor text. I also make sure that the site is relevant to my article/anchor text.
25%-30% reciprocal link exchange. Not text links. Again, I create custom/bespoke articles that complement the owner's site and contain my keyword phrase as my anchor text, and I make sure that the site is relevant to my article/anchor text.
25% for blogs and forums. It's considered guerrilla marketing. This takes a little longer because you need to establish yourself within communities and become somewhat of an authority who can post links to relevant and useful content. This will attract actual traffic (and improved rankings), and also create natural backlinks from other end-users.
25% using one of the many automated tools (i.e. IBP 9.0 from Axandra) to find potential link partners.
Now whether you hire students to do these tasks or do them yourself, they need to be part of your daily routine. I have tested dozens of techniques, each with its own merits depending on actual demographics, but every campaign has a planned strategy.

Obviously there are other considerations, such as building good content that people want to link to, creating top 10 lists, how-to guides and reviews, but not all markets can do these in a relevant way. My recommendation in this situation, and really any other, is to do a whois lookup, pick up the phone and start calling. These are the best kind of backlinks.




Well fermin, why did you copy only half the article? I know that you copied this, and since it's stitched together from multiple sources it's really hard for the mods and for us to catch you. You really have learned to vary the text by rehashing it with other content. Anyway, the part you missed is at the end. Why didn't you finish the post with the end of the article? You're in such a hurry, aren't you? :P You know what, I individually searched some passages of this post and found the related article. Now that you've rehashed it, there's no way I can flag it. But remember, this is plagiarism nonetheless. :P
