Tuesday, August 28, 2007

Google Page Rank - important or just another number?

In my last newsletter I wrote about how your website's Alexa rating is not actually that important to the success of your online business. In this issue, I want to look at another popular statistic - Google Page Rank - and ask a similar question: is it really that important?

First, a quick overview of what Google Page Rank actually is...

Google Page Rank (or PR, as it is often referred to) is simply an indication of the number of websites that link to a specific website. It also attempts to indicate the quality of those links. PR ranges from 0 to 10 (with 10 being the 'best' PR and 0 being the 'worst'). The vast majority of small business websites will usually find they have a PR of between 0 and 5.

To calculate a particular site's PR, Google uses a fairly complicated algorithm based on the links it is aware of that point to the site in question. This algorithm also takes into account the PR of the page that is providing the link, so a link from a web page with a PR of 7 will be considered more valuable than a link from a page with a PR of 4.
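
For the technically curious, the core idea is simple enough to sketch in a few lines of code. The toy example below is based on the formula from the original PageRank paper, not Google's actual production algorithm (which has never been published in full), and the 0 to 10 toolbar figure is widely believed to be a roughly logarithmic scale derived from raw scores like these:

# Toy sketch of the published PageRank formula (not Google's real algorithm):
# PR(A) = (1 - d) + d * sum(PR(T) / C(T)) over every page T that links to A,
# where C(T) is the number of links going out from T and d is a damping
# factor, conventionally set to 0.85.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    scores = {page: 1.0 for page in pages}
    for _ in range(iterations):
        new_scores = {}
        for page in pages:
            incoming = (scores[src] / len(targets)
                        for src, targets in links.items() if page in targets)
            new_scores[page] = (1 - damping) + damping * sum(incoming)
        scores = new_scores
    return scores

# Hypothetical three-page web: A and B both link to C, and C links back to A.
print(pagerank({"A": ["C"], "B": ["C"], "C": ["A"]}))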

Because links from higher-PR sites are considered more important, many people are choosing to buy links from websites with high PRs just so that they can increase their own PR. I have seen sites selling a simple text link on their home page for over $700 a month purely on the strength of having a PR of 7 or above. This may seem like a lot of money in itself, but when you consider that the website owners buying these links often have websites that are in no way relevant to the content of the site linking to them, it is absolutely ridiculous.

Take this example: let's say you have a website about health and fitness and you buy a link for $500 a month from a random website because it has a PR of 7. This random website has no relevance to your health and fitness site, so what is going to happen? Well, your own PR may increase as a result of the link. You may get a bit of extra traffic, but probably not much, since people don't click on links that they are not interested in. You will definitely be $500 poorer at the end of the month!

Instead, why not spend the $500 on pay-per-click advertising and benefit from some quality, targeted traffic?

Of course, there is a bit more to it than that and the reason that most people want to increase their PR is because Google takes this statistic into account when determining where a website will be displayed in their search results. Many people assume that a high PR automatically equals a high search engine placement for their chosen keywords. Not so....

PR is just one of over 100 different factors that Google takes into account when deciding where your website will feature (and these factors and the main algorithm change on a very regular basis). It is perfectly possible for a website with a PR of 5 to get a higher ranking than a PR 7 site if it has better content or is more relevant for the search term in question.

Remember that relevance is all important with Google and a link from a website that is not relevant to your own site will be considered far less important than a relevant one (which makes buying links from random sites purely because they have a high PR even more crazy).

I have read several rumours lately that Google hasn't updated PR for a couple of months and is considering phasing it out or modifying it in some way. This is pure speculation, but it wouldn't surprise me in the least. PR is easily manipulated (for example, by purchasing links as described above) and Google doesn't like to have its calculations or search results manipulated. It stands to reason that it will be looking at ways of preventing this.

So, in summary, is Google Page Rank important to your business?

Well, it is a good indicator of how many other sites link to yours and how important Google considers your site to be BUT I personally don't place too much importance on this statistic and I certainly won't be paying out for a link from a website just because it has a high PR.

As I said above, Google changes its rules on a regular basis and I see little point in chasing a particular PR on the basis that it might get you higher search engine rankings. If Google does decide to do away with PR, all your work will have been for nothing.

Instead, concentrate on building quality, relevant links from sites that are connected in some way to your own site content. This will ensure that any traffic you receive via these links will at least have an interest in your site. Building links on this basis will automatically increase your PR over time (without the need to pay out for overpriced, irrelevant links). If you do things this way and Google does scrap the PR indicator, it shouldn't affect you in any way and the links you have in place will continue to benefit you.

Remember, in the same way that a low Alexa rating doesn't guarantee traffic or sales, neither does a high PR. Sure, a high PR is a 'nice to have', but lots of traffic and high sales are even nicer :-)

Copyright 2004 Richard Grady

About the author:
Richard Grady has been helping people earn online since 1998. Find out more about Richard at: http://www.thetraderonline.com Free wholesale search engines: UK- http://www.wholesale118.co.uk and US http://www.thewholesaletrader.com
Circulated by Article Emporium

Sunday, August 26, 2007

Internet Users Benefit From The Search Engine / Seo Cat-and-mouse

A constant cat-and-mouse game between the major Internet search engines and search engine optimization (SEO) companies has an interesting result: As these two groups try to psych each other out to beat the other’s system, search capabilities are constantly improving for the consumer of Internet services.

Here’s what happens: Each search engine develops a formula for producing the most accurate, relevant results according to the Internet user’s keyword search, and then the SEO companies analyze the search engine’s function and develop a system of strategies to allow their customers to take advantage of that formula. Common strategies include providing enriched text blocks (a few paragraphs of text containing multiple keywords likely to be picked up by the search engines), general interest articles, with embedded links, on particular topics relevant to the customer, development of multiple links to the customer’s site from other websites, and a combination of sponsored links and pay-per-click ads (paid ads which appear on other websites). This combination of strategies promises to improve a website’s ranking in the search results, a valuable asset in cyberspace; if a company can land on the first page of a keyword search, the probability of the Internet user visiting its website skyrockets.

Of course, as soon as the SEO companies figure out a particular search engine’s formula, the search engine changes the rules, since the search engine’s honchos want to provide the most accurate, up-to-date, and relevant information, and not allow website owners to game the system. And of course the SEO companies respond with new, more sophisticated strategies. The result to Internet users is a constantly improving ability to search efficiently for the goods, services, and information they need.

Google is the biggest game in town, but other search engines have their following; Alta Vista is popular among college students and serious researchers, for instance. There’s also dogpile.com, a search engine that searches all the other major search engines and pulls up the most relevant results from each. And new search engines pop up all the time – Ask.com and AskJeeves.com are two examples of newer search engines with popular followings. A good SEO company will develop strategies not just for Google, but for all the major search engines.

Search engines aren’t perfect, of course, and one thing they can’t do at the moment is access specific information contained in the multitude of databases available on the Internet. Federal, state, and some local governments have searchable, free databases which allow users to access this information; other databases, such as Ancestry.com, charge a subscription fee for their use. For public information on the federal level in the U.S., FirstGov.gov is a great place to start.

Search strategies exist for the web surfer as well. Just try doing a Google search for tax information, for instance. Type “taxes” into the keyword search bar and click on Search. In the U.S. you’ll see irs.com, and then numerous companies advertising their tax services. But type in “taxes .gov” and you’ll pull up the same IRS site, followed by many more government website pages, some of which may provide more direct access to the information you need than going through irs.com, and without having to wade through a bunch of accounting firms. Type in “taxes UK .gov” and you pull up government information for the United Kingdom, and so on. Doing a keyword search on the general topic you’re researching should pull up relevant database sites. (By the way, if you want to see an example of great search engine optimization strategies, do a search on “genealogy” and look at all the links to Ancestry.com on the websites you find—they’re everywhere!)

As the cat-and-mouse game continues between search engine companies and SEO firms, the result promises to be ever more sophisticated, powerful, and accurate techniques for accessing valuable information over the Internet. As the continuing success of more and more new search engines suggests, we may have only scratched the surface of our Internet information technology capabilities.

About the Author:
Sam Vogel writes about search engine optimization-seo www.albaengine.com
Article Source: NewFREEArticles.com

Wednesday, August 22, 2007

I Can Get you Removed From Search Engines

One of my clients was talking to me today. He is using our search engine submission tool and was very confused as to why his website was showing some weird query strings.

First, some background about his site. It has been online since 2000 and ranks very well on all search engines. The site is very much devoted to real estate.

A couple of days ago he discovered that his site was showing some weird things. To protect my client's website I will use example.com in the examples below.

What we did was Google "site:www.example.com"

This shows you how many URLs of your website Google has indexed. However, one of the results was example.com/?link1=10

He was puzzled as to what the ?link1=10 actually meant, so he called us and asked me about this weird stuff that was happening. I guess sometimes we go the extra mile for our clients.

What has happened?

So what is this /?link1=10? Nothing indicated that he had this page on his website; in fact, this exact page does not even exist, yet Google found it.

A page like /?link1=10, or anything in the form example.com/?something_goes_here, uses what is called a query string. Query strings are normally used to identify things such as products, or individual homes in a real estate search, if customers can view listings on your website.

As a result of Google finding this weird page, Google automatically dropped my client's site into obscurity. Basically my client lost ranking on Google and other search engines.

Why did this happen?

Using query strings you can effectively remove a website from the search results, although it takes time. The target website must first return status code 200 OK for pages that do not exist - in other words, it serves a 'soft' landing page where a 404 error should be.

To give you an example: if you remove a page from your website and then re-visit that page, it will show a 404 error. In some cases you may have set up a custom 404 page, which results in a "not found" message.

To see what you have, just go to your website and type something like "www.mysite.com/anythinguwant.html".

This should return a 404 page or your custom 404 page.

But back to 200 pages. To see whether your website returns a 200 status for pages that don't exist, search for a tool online. Use "header checker" as your search term.

When you get the result it should look similar to this: "HTTP/1.1 200 OK". For the problem described below to arise, it has to show 200 OK for a page that does not exist.
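
If you would rather test this yourself than hunt for an online tool, here is a minimal sketch in Python (my addition, not part of the original article) that does the same header check. It requests a made-up page on your domain and prints the status code the server actually returns; example.com is just a placeholder:

import urllib.request
import urllib.error

# A URL that should not exist on your site (example.com is a placeholder).
url = "http://www.example.com/this-page-should-not-exist.html"
try:
    with urllib.request.urlopen(url) as response:
        print("Server returned:", response.status)   # 200 here means trouble
except urllib.error.HTTPError as error:
    print("Server returned:", error.code)            # 404 is what you want to see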

Here is how it works:

To get pages removed from search engines you need a lot of duplicate content. When I say a lot, we are talking hundreds of pages of duplicate content. Simply put, one page has to be the same as another, a third page the same as a fourth, and so on. They all have to appear to be on your website.

So how did this "hacker" - let's call him that - do it? He did not have access to my client's server, but he was still able to get the site removed.

What he did was check whether the website returned 200 OK for pages that don't exist. When it did, he assigned random query strings to pages on the domain.

For example: example.com/index.html?haha=iwillremoveyou or example.com/index.html?link1page=330, or example.com/index.html?haha=link2page

You get the idea. He assigned a query string such as "haha", then copied all of these URLs onto a page at some free hosting company and simply waited until the search engines picked them up.

From a search engine's point of view these are links that point to the domain, and the more links you have, the quicker search engines return to your website.

Because these links pointed to what looked like new pages on the domain, the search engines thought each was a duplicate page - now there appeared to be two (or more) versions of the main page - and suddenly his website dropped on all search engines.

See how easy that was? Guess what, MSN is very vulnerable to this! I am not sure if they have fixed it yet. If you do this on MSN, it is almost guaranteed that someone's site will just disappear.

Other search engines, such as Google and Yahoo, will just lower your ranking but will not remove you. No matter what, it is a pain to get back to the top search results.

How to fix it

You will need to set up a mod_rewrite rule or a 301 redirect so that these bogus query-string URLs resolve to a single canonical page rather than being indexed as duplicates.
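
The article leaves the fix at that single line, so here is a rough sketch of the 301-redirect idea (my own illustration, using a small Flask app rather than Apache's mod_rewrite): any request that arrives with a query-string parameter the site doesn't actually use is permanently redirected back to the clean URL, so search engines only ever see one canonical version of each page.

# Sketch only - a Flask app standing in for whatever really serves the site.
# On Apache the same effect is normally achieved with mod_rewrite rules.
from flask import Flask, redirect, request

app = Flask(__name__)

ALLOWED_PARAMS = {"page", "listing_id"}   # hypothetical parameters the site genuinely uses

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def serve(path):
    # Any unrecognised query-string key triggers a 301 back to the bare URL.
    if any(key not in ALLOWED_PARAMS for key in request.args):
        return redirect("/" + path, code=301)
    return "Normal page content for /" + path

if __name__ == "__main__":
    app.run()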

About the Author:
Martin Lukac, represents EnginePromoter.com, http://www.EnginePromoter.com search engine marketing web-site for search engine optimization and website submission. Promote your website and get top rotating positions on over 250+ search engines, including a niche website submission. EnginePromoter.com also operates http://www.1AmericanFinancial.com and real estate portal http://www.RateEmpire.com
Posted: 27-02-2007
Article Source: ArticlesBase.com

Saturday, August 18, 2007

5 Things To Keep An Eye On In The SEO World In 2005....

After the latest PR update at Google and MSN's beta search going live, there is one thing for certain in 2005: the world of search is in for some major changes. There has been growing speculation around the SEO world that reciprocal linking is a thing of the past. Rumors abound that PR means less and less, if anything. Bill Gates came out of his cave to say that "Today's search is nothing" and that it won't be that way for long. There are quiet rumblings in the SEO back alleys of a new, state-of-the-art search engine currently indexing the internet. Websites are dropping off the face of the planet. And we're all left to sit here and put together the pieces. So what is in store for 2005?

1) Reciprocal links, while not becoming totally dead, are decreasing in value, and there will most likely be an algorithm update to lessen their importance. The original thought process behind the importance of a link was that it was seen as a "vote" for the linked-to site. Now that reciprocal links are everywhere, it is hardly a great way to count "votes" for a website. Reciprocal linking will continue around the internet, although the number of people who try to get away with one-way links (by never getting back to you once you've added their link) will increase significantly. This will, of course, be an attempt to acquire one-way links, which brings us to our next subject....

2) One-way links and triangle linking, though already quite popular, should explode over the course of 2005. Both are much harder to control and acquire, which makes Google happy. The triangle link "ploy" makes links look like one-way links even though "Site A" is returning the favor to "Site B" through "Site C". There will be attempts to sell triangle linking programs and systems by SEO companies; however, the complexity, difficulty and time involved in this scheme will produce ridiculous prices.

3) What is this about a new search engine that is going to index every site on the internet, EVERY 10 seconds? Become.com has turned a few heads with its claims. Site owners have reported Become Bots spidering "like crazy". It's all quite hush-hush, however, and you need to have an invite in order to test it out. It should be interesting to see what they're capable of if and when they decide to go live. I'll go out on a limb and say that it will be a household name by this time next year.

4) MSN will scrap the "beta" tag from its sparkling new search engine on February 1st. It is currently live at search.msn.com and Bill Gates thinks it will rival Google. There is a lot of debate over this issue, but there is no denying that it is far better than the old chugger they were using before. Love him or hate him, Gates has most likely given a hard right to the chin of Yahoo!, which seems to be suffering from a multitude of quality problems. MSN will be second to Google in total searches in 2005.

5) PR still has importance. However, it is also decreasing in value. PR is only based on the quantity and quality of links (both inbound and outbound) from the given web page. The most obvious reasoning for the declining-importance theory is that on any given Google search, the PR of each page seems to have barely any correlation with its place in the rankings. For all you PR lovers out there, hold on to your toolbars tight, because this could be a bumpy ride.

About the Author:
Bobby Heard is the Vice-President of Abalone Designs (http://www.abalone.ca), which offers Search Engine Optimization at affordable prices.
Article Source: www.iSnare.com

Wednesday, August 15, 2007

Website Accessibility Important For Disabled Visitors

A study released by computing and disability charity, AbilityNet, found that people with disabilities favour the most accessible websites when using the Internet. A poll of over 100 people with disabilities showed that the disabled community use the Internet for information, shopping, banking and leisure, just like everyone else. However, most disabled users will spend their time and money only on businesses that cater to their needs with more accessible sites.

Demographics of UK's Disabled Audience

The International Labour Organisation (ILO) estimates that there are 610 million disabled persons worldwide. The European Parliament estimates that there are 37 million disabled individuals in the EU, over 50% of whom are of working age. 10.3 million UK residents declare that they have a limiting long-term illness. These numbers are significant enough to make anyone realise the potential audience they could be ignoring by creating websites that are hard to access.

Disability Discrimination Act

The Disability Discrimination Act asserts that website owners have had a legal duty, since October 1999, to ensure that all services provided via their website are accessible to disabled people. Any company not complying with the accessibility guidelines could potentially open itself up to litigation and discrimination lawsuits from disabled customers.

Disability Rights Commission: User-friendly websites for all

The Disability Rights Commission launched guidance yesterday on how to build a website that is user-friendly. The document, Publicly Available Specification (PAS) 78, was developed by the British Standards Institution (BSI) and sponsored by the Disability Rights Commission.

Common Accessibility Problems

Typical accessibility problems encountered by most disabled users include:

  • Text size on some sites is hard-coded so that it cannot be easily enlarged
  • Text labels (alt text) attached to images are often uninformative or absent
  • Pictures of text are often used instead of actual text
  • Adverts and features made up of moving images distract visitors with a cognitive impairment
  • Interactive presentations such as 'Flash Movies' present access problems for visitors who cannot use a mouse, are vision impaired or who use speech output or voice recognition software
  • Mini programs embedded in the page, such as JavaScript, can prevent full access to sites.

Accessible websites are also search engine friendly, which means that by ensuring that your website is accessibility compliant you not only increase your potential audience but also optimise your website for higher search engine rankings.

If your website presents any of the above problems for users with disabilities, you are most likely alienating a very large customer segment. AccuraCast provides usability and accessibility consultation and design services. Feel free to contact us to discuss your website's accessibility.


About the Author:
Farhad Divecha
Before founding AccuraCast, Farhad worked for and provided consulting services to a number of large and medium sized enterprises in the UK and USA including 3Com, Proctor & Gamble and Household (HFC, HSBC). He has over 7 years of experience marketing products and services online and offline.
Added: 18 Oct 2006
Article Source: http://articles.simplysearch4it.com/article/39812.html

Saturday, August 11, 2007

Do Search Engines Scare You?

What do the words "Search Engine" make you think of? I get immediate mind-pictures which vary but all have similar themes. Sometimes I see one of those tiny submarines which are used to explore the deepest parts of the ocean bed. I have this picture in my mind of the tiny submarine underwater in complete darkness apart from the beam of its searchlight which is probing the gloom. Other times I see a horse-drawn carriage driving through dense fog with only its weak lamp relieving the darkness. This scene is from a movie where the police are racing through London in search of Jack The Ripper. The final picture is of a line of policemen in old-fashioned uniforms advancing across a moor in darkness through rising mist with only their flashlight-beams to light their way.

It can get a bit scary inside my mind at times. If these were dreams rather than passing thoughts, they would probably be analysed as meaning an obsession with darkness and getting lost. Maybe the lights penetrating the darkness symbolise a fear of ignorance (or maybe a fear of getting found out). The obvious link is that these pictures all relate to a search of one kind or another. The first one might represent a search for knowledge, the second one is obviously the search for a criminal and the final one a search for clues at the scene of a crime.

Why would the words "search engine" conjure up these dark visions? If you look at the words we associate with search engines, I think the connection will be clear. We talk about "submitting" our websites to search engines. We don’t "send" our websites or "apply" to search engines or "register" our websites with search engines. We are submissive and the search engines dominate us.

We adopt this submissive posture and the search engines send their robots to "crawl" over our websites. Doesn't sound very pleasant, does it? The future of your website is entirely in the power of these monsters. You have spent time and perhaps some cash getting your website ready for the visit. Your website, the shop window to your home business, has been tweaked and "search engine optimised"; now you can do nothing except hope that your efforts will be rewarded with approval by these invisible inspectors.

Web site owners act as if the search engines are neighbourhood dignitaries: "We must tidy up, the search engines are coming to visit". An anxious time follows while your website, brushed, polished and optimised to the best of your ability, waits to greet these visitors. The most respected visitor is Googlebot. He causes most anxiety and is renowned for being unpredictable. Webmasters try to analyse the "Googledance" in the hope of making Googlebot's visit enjoyable. If we can get the mighty Googlebot to dance instead of merely crawling, he might give us a good report, but Googlebot can't seem to decide what algorithm he prefers. How are you supposed to get his feet tapping? Unfortunately, the search engines are not the most communicative visitors and you only realise they have carried out their examination when something (either good or bad) happens to your protégé's page rank.

The search engines are not like your school teacher who gives you a class test; they are more like the university Board Of Examiners: when you have passed or failed their test, you will never get to know which questions you got right or where your weakness lies. Further improvement will have to be a matter of guesswork, but suppose your guess is wrong? You might destroy the very things which met with the search engines' approval. Then you hear that the search engines don't agree amongst themselves, so what pleases some of them might lose you points with others. Which ones should you try hardest to impress? Should you turn your website into some sort of private Googledancer and risk offending the rest of the robots? Would it be better to try to please one or two of the larger search engines or a big bunch of the lesser ones? Finally you hear a rumour that the search engines are changing their secret rules anyway but nobody knows what the changes will entail. You despair of ever satisfying the masters of your fate; you feel as if you are stumbling around in the dark. Panic takes you over.

Should you be afraid of search engines?

Anybody in his right mind should be afraid, very afraid.

About the author:
This is one of a series of articles published by the author, Elaine Currie, BA(Hons) at
http://www.Huntingvenus.com
For honest advice on working from home subscribe to Online Profits newsletter by
mailto:huntingvenus@SubscribeMeNow
Circulated by Article Emporium

Thursday, August 09, 2007

How to Generate Quality One way links?

Jack had an amazing idea & started his online business with a vision of making it big someday. But a few years down the line, Jack is left wondering why his business isn't turning out the way he expected it to!

Mark thinks that he has a better service than his competitors, but still fails to understand why their Balance Sheet figures are running into millions, while his are still in the thousands!

All these puzzling facts make one doubt oneself. But my friend, the problem is not with the idea or the service! The problem lies in the promotion strategies. No matter how good an idea or a service is, it won't be successful unless you put it to the right people in the right manner. Good Web Promotion can do wonders for an average idea, never mind what it can do for a good one. Had Jack & Mark adopted the right kind of Internet Promotion strategy, they would have achieved what they wanted to.

Everyone will agree that today's Web Promotion is dominated by Search Engines, or to be precise, Google. If your site is in the top thirty for the right "Keyword", you can be a winner. But how can one make it to the top & stay there for a long time? (Mind you, in this dynamic world called the Internet, holding on to your position is as important as getting there!)

The best way to "optimize" your site is through a new "Organic Method of Search Engine Optimization" (SEO) that Google & other search engines just love: generating "Quality One Way Links". One can generate quality back links by way of "Directory Submission" & "Article Submission".

Online directories provide you with quality back links, which is precisely what you are looking for. Apart from this, you get to submit your site to the most appropriate category with a brief description of your business. Thus, they also send a targeted audience your way!

Another way of generating a quality targeted audience & one way links is by writing quality articles & submitting them to Article Directories! By writing a quality article, you are not just informing your potential customers about your service, but are also sharing your knowledge with them. Again, Article Directories give you an opportunity to inform people about yourself & also provide a link to your site. This way, people get to know about you & your site. And if your article provides a visitor with what he came looking for, you have most definitely found yourself a new client!

Had Jack written an article about his idea & explained his service to his target audience and submitted his site to Hundreds of Search Engine Friendly Directories, he would have been on "Cloud 9" by now. Had Mark written an article informing his potential customers how his service is better than his competitors', & also submitted his site to Online Directories, his Balance Sheet too might have run into Millions. To summarize, when you submit your article or your site to various online directories, this is what you get -

1. You generate hundreds of quality back links to your site.

2. People who search these directories may find your article or your site's description interesting & visit your site. Thus you get a large targeted audience.

The end result is that your site satisfies all the criteria set by various search engines to achieve a favorable Rank. You thus reach high positions in the search engines & stay there for a long time!

About the Author:
Submit2Please
www.submit2please.com
Article Source: http://www.articles411.com

Tuesday, August 07, 2007

Ambatchdotcom Seocontest Seo Tips

Well, here I am going to tell you what I did to get a good position in the ambatchdotcom seocontest. First of all I created a blog post on my personal blog; when I wrote the article I tried to repeat the words ambatchdotcom seocontest many times to increase keyword density. Then, as WordPress does the onsite optimization, I started trying to get backlinks. One of the things I did was link to the entry from my other sites.


Another thing I did to get some links was to write articles and submit them to article directories. That gave me a lot of backlinks, maybe with low PageRank, but from different IPs and related pages, so I think it was a great technique.


The last thing I did was buy some links. I got them pretty cheap; that is why I bought them, hehe.


Well, I am now in the #7 position, but some time ago I was #1. I hope to get back to that position so that I can get the money: $4,000. Second place will get $1,000 and third $500. I would take any of them; I only want to end in the top 3, which is why I am working very hard. If you want to help me you can: simply link to my entry using the anchor text ambatchdotcom seocontest and you will be helping me a lot.


For those who don't know which entry is mine, it is the following one: http://www.estebanpanzera.com/ambatchdotcom-seocontest/
Let's hope I can win.


About the Author:
Esteban Panzera is the owner of Ambatchdotcom Seocontest Blog. You can find more information at www.estebanpanzera.com/ambatchdotcom-seocontest/
Article Source: NewFREEArticles.com

Sunday, August 05, 2007

Seecrets on Website Promotion: Search Engine Wars - a Different Perspective

The objective of all search engine providers is epitomized by the ideal search often portrayed in the popular television and movie series “Star Trek”. When the captain issues a request for all information on a Klingon spaceship, the search engine intuitively understands that he wants military information. It does not provide information on how much the spaceship costs in the commercial market and where it can be obtained, nor the scientific details that would interest an engineering student.

Search engines are similar to rating agencies like Moody's and Standard & Poor's (S&P). These engines rate web pages much as Moody's would rate a company's credit, by giving each a rank. Google's latest patent application describes methods similar to those used in stock charting (technical analysis) - ideas such as rate-of-change, momentum and so forth. It is comparable to Moody's patenting some of the methods it uses for its ratings and thereby restraining S&P from employing similar algorithms.

The upcoming Sony PlayStation 3 will have 1% of the processing power of a human brain. Given that most people use 10% of their brainpower, it is years away, not decades, before machines have equivalent brainpower.

The respective percentages of all searches done in February 2005 are 36% for Google, 31% for Yahoo, 16% for MSN, with the rest shared by the smaller providers. As is evident from these numbers, market share can fluctuate significantly over a few months.

Microsoft is numero uno when it comes to desktops and internet browsers, although it is facing challenges from Mozilla (internet browsers), Sony (the leader in game consoles) and Linux (desktops). Given its large base of customers, this dogged competitor is flexing its muscles against Yahoo and Google. It is foolhardy to write off Microsoft, given its resilience, market savvy, financial resources and history of handling challenges from upstarts. MSN spiders are faster at indexing web pages than its two main rivals.

Yahoo is the perennial internet competitor. It has more than 100 million customers and has a presence in every piece of the internet pie. Its search engine revenues rose and it is closing the gap to Google’s dominance.

Some possible dark horses in the race may include Clusty and Accoona. Clusty has some nifty clustering technology that can provide different categories, while Accoona has search abilities based on artificial intelligence. Accoona also has a large customer base in China, where the internet population is the biggest in the world, surpassing that of the U.S.

This author’s take:

Despite its pronouncements, Google will need a large customer base to be able to provide "more personalized searches". Providing 2GB of free email is the first step, from which it can harvest a lot of raw data - something many privacy advocates strongly oppose.

Google’s plan to digitize all the books in the major libraries may lead the company into a legal quagmire of copyright issues involving publishers and authors.

Maybe we will see the emergence of "intelligent" browsers with built-in artificial intelligence capabilities that can bypass search engine servers altogether - instead, these agents would apply filters or weights according to the wishes of the individual searcher.

Google typifies the successful upstart – self-assured, confident and secretive, causing dismay to analysts and fund managers. During its recent IPO (Initial Public Offering), Google bypassed the traditional route via underwriters and brokers thus denying them a share of the spoils. This slap on the wrist would not go down too well with these lords of the financial world. Should Google face some difficulties or stumbles, this publicly traded company may give new meaning to the expression “When it rains, it pours”.

Expect mergers and takeovers to be the norm. The real leader of the internet would evolve from the diverse parties and it would include the eBay-PayPal component. We live in interesting times.

Stan Seecrets’ Postulate: “The imminent war for world domination will be fought between the gods of the internet and the gods of finance.”

[This article may be freely reprinted providing it is published in its entirety, including the author’s bio and activating the link to the URL below.]

About the Author:
The author, Stan Seecrets, is a veteran software developer with 25+ years experience at (http://www.seecrets.biz) which specializes in digital asset protection. You can email him with comments and criticism via Stan at Seecrets dot biz. © Copyright 2005, Stan Seecrets. All rights reserved.
Article Submitted On: May 29, 2005
Article Source: http://EzineArticles.com/

Saturday, August 04, 2007

Top Ten SEO Factors

These are what I believe to be the top 10 most important things (not necessarily in order) that you need, in order to get your website found in the search engines.

There are many other factors as well, but if you follow these guidelines, you'll stand a much better chance, and you'll be off to a good start.

1. Title Meta Tag

The title tag is what displays as the headline in the SERPs (Search Engine Results Pages). It's also what displays in the top blue band of Internet Explorer when your site is displayed.

The title tag of your website should be easy to read and designed to bring in traffic. By that, I mean that your main keyword phrase should be used toward the beginning of the tag. True, there are websites being found now that do not use the phrase in the title, but the vast majority still do as of this writing.

Don't make the mistake of putting your company name first, unless you are already a household name, like Nascar or HBO. People are likely searching for what you have to offer, not your name.

Your title tag should be written with a capital letter starting the tag, and followed by all lowercase letters, unless you're using proper nouns. Some people prefer to capitalize every word, too.

2. Description Meta Tag

The description tag is the paragraph that people will see when your page comes up in the search results.

Your description tag should be captivating and designed to attract business. It should be easy to read, and compel the reader to act right now and follow your link. Without a description tag, search engines will frequently display the first text on your page. Is yours appropriate as a description of the page?

A proper description tag is what people will see below your title. You should make proper use of punctuation and, keeping readability in mind, work in your subject and geographical references.

3. Keywords Meta Tag

The importance of Meta keyword tags fluctuates from month to month among different search engines. There is a debate in the SEO community as to whether or not they help at all on certain search engines. In fact, in the summer of 2004 it appeared as if they were losing importance altogether.

However, you'll NEVER be penalized on any search engines for using relevant targeted keywords in moderation, and they can only help you with most, especially Yahoo.

Avoid stuffing your keyword metatags with too many keywords. Just use relevant tags that apply directly to the content of that particular page, and don't overdo it.

4. Alt Tags

The small yellow box that comes up when your mouse cursor is placed over an image is called the ALT tag. Every relevant image should have an alt tag with your key words or phrases mentioned in the tag.

A proper ALT tag goes after the file name, and before the Align indicator.
* - The ALT tag is no longer being considered for ranking purposes by some search engines. That said, it still cannot HURT you, and will still help you with some engines. My recommendation is to continue to use them, but be sure to avoid keyword stuffing. Besides, who knows when the pendulum will swing back the other way? (For a quick way to check these first four factors on your own pages, see the short script after point 10 below.)

5. Header Tags

The text of each page is given more weight by the search engines if you make use of header tags and then use descriptive body text below those headers. Bullet points work well too. It is not enough to merely BOLD or enlarge your text headlines.

6. Link Text

Search engine spiders cannot follow image links. In addition to having image links or buttons on your web pages, you should have text links at the bottom or elsewhere. The text that the user sees when looking at the link is called the link text. A link whose text just says "products" does not carry as much weight with the search engines as a link that says "oregon widgets". Link text is very important, and is actually one of the most frequently overlooked aspects of web design that I've seen.

7. Site Map

Using a site map not only makes it easy for your users to see the entire structure of your website, but it also makes it easier for the search engines to spider your site. When the search engine spiders come to visit, they will follow all of the text links from your main index page. If one of those links is to a site map, then the spiders will go right to the sitemap, and consequently visit every page you have text linked to from that site map. On the site map page, try to have a sentence or two describing each page, and not just a page of links.

8. Relevant Inbound Links

By relevant, I mean similar industry or subject related sites. Right now, no single strategy can get your site ranked higher faster than being linked to by dozens of other relevant websites. It used to be that the quantity of incoming links mattered most, but today, it's much better to have three highly relevant links to you from other popular related websites than 30 links from unrelated low ranked sites. If there are other businesses in your industry that you can trade links with, it will help your site enormously. Link to others, and have them link to you. It's proven, and it works. To see who's linking to you, in Google type the following...
link:yourdomain.com

9. Your Content

Not to be forgotten of course, is the actual content of your webpage. It must be relevant helpful information that people want to read. These days, each webpage should be laser focused on one specific product or subject, in order to rank highly for that search phrase. The days of writing one webpage to appeal to dozens of search terms are long gone. Ideally, each page should have between 400 to 650 words on it. Too few, and the search engines won't consider it to be relevant enough. Too many words and the search engine spiders may have a hard time determining the actual subject or focus of the page.

Use your keywords or phrases often, and use them at the beginning of your paragraphs wherever possible. Don't overuse them and make the page sound phony, but don't write a page about a certain subject, and not mention that subject repeatedly either. Reading it out loud to yourself is a great way to judge how natural your text sounds.

Concentrate on writing quality pages that actually appeal to the human reader. Write pages that provide the reader with exactly what they are looking for; that is, information about the exact search phrase they've entered.

10. Avoid Cheating

With all of these tidbits of information, it's tempting to think that you can stuff 100 keywords into your title, or create a page with the phrase oregon widget company being used 100 times in headers, text links, ALT tags, bullet points etc. but that cannot help you. In fact, it can penalize you, and get your website banned from certain search engines.
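
As promised back in point 4, here is a small self-audit sketch (my addition - nothing like it appears in the original article). It fetches a page using only Python's standard library and reports on the title tag, the description and keywords meta tags, and any images missing alt text; the URL is a placeholder for your own site.

from html.parser import HTMLParser
import urllib.request

class SEOAudit(HTMLParser):
    """Collects the title, description/keywords meta tags and alt-less images."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta = {}                 # meta tag name -> content
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and (attrs.get("name") or "").lower() in ("description", "keywords"):
            self.meta[attrs["name"].lower()] = attrs.get("content", "")
        elif tag == "img" and not attrs.get("alt"):
            self.images_missing_alt += 1   # missing or empty alt text

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

# Placeholder URL - point this at one of your own pages.
html = urllib.request.urlopen("http://www.example.com/").read().decode("utf-8", "ignore")
audit = SEOAudit()
audit.feed(html)
print("Title:", audit.title.strip())
print("Description present:", "description" in audit.meta)
print("Keywords present:", "keywords" in audit.meta)
print("Images missing alt text:", audit.images_missing_alt)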

About the Author:
Scott Hendison is an internet consultant that specializes in search engine optimization and internet marketing. He has written over 100 articles that are available on his website. He has also developed a tutorial area for beginning search engine optimization, at 'SEO101'
Submitted: 2006-10-23
Article Source: GoArticles

Friday, August 03, 2007

What is the Google Sandbox Theory?

Ok, so over the past month or so I've been collecting various search engine optimization questions from all of you. Today, I'm going to answer what was the most frequently asked question over the past month.

You guessed it... What is the Google Sandbox Theory and how do I escape it? When you finish reading this lesson, you'll be an expert on the good 'ole Google Sandbox Theory and you'll know how to combat its effects. So, pay close attention. This is some very important stuff.

Before I start explaining what the Google Sandbox theory is, let me make a few things clear:

The Google Sandbox theory is just that, a theory, and is without official confirmation from Google or the benefit of years of observation.

The Google Sandbox theory has been floating around since summer 2004, and has only really gained steam since February 4, 2005, after a major Google index update (something known as the old Google dance).

Without being able to verify the existence of a Sandbox, much less its features, it becomes very hard to devise strategies to combat its effects.

Almost everything that you will read on the Internet on the Google Sandbox theory is conjecture, pieced together from individual experiences and not from a wide-scale, objective, controlled experiment with hundreds of websites (something that would obviously help in determining the nature of the Sandbox, but is inherently impractical given the demand on resources).

Thus, as I'll be discussing towards the end, it's important that you focus on 'good' search engine optimization techniques and not place too much emphasis on quick 'get-out-of-jail' schemes which are, after all, only going to last until the next big Google update.

What is the Google Sandbox Theory?

There are several theories that attempt to explain the Google Sandbox effect. Essentially, the problem is simple. Webmasters around the world began to notice that their new websites, optimized and chock full of inbound links, were not ranking well for their selected keywords.

In fact, the most common scenario to be reported was that after being listed in the SERPS (search engine results pages) for a couple of weeks, pages were either dropped from the index or ranked extremely low for their most important keywords.

This pattern was tracked down to websites that were created (by created I mean that their domain name was purchased and the website was registered) around March 2004. All websites created around or after March 2004 were said to be suffering from the Sandbox effect.

Some outliers escaped it completely, but webmasters on a broad scale had to deal with their websites ranking poorly even for terms for which they had optimized their websites to death.

Conspiracy theories grew exponentially after the February 2005 update, codenamed 'Allegra' (how these updates are named I have no clue), when webmasters began seeing vastly fluctuating results and fortunes. Well-ranked websites were losing their high SERPS positions, while previously low-ranking websites had gained ground to rank near the top for their keywords.

This was a major update to Google's search engine algorithm, but what was interesting was the apparent 'exodus' of websites from the Google Sandbox. This event gave the strongest evidence yet of the existence of a Google Sandbox, and allowed SEO experts to better understand what the Sandbox effect was about.

Possible explanations for the Google Sandbox Effect

A common explanation offered for the Google Sandbox effect is the 'Time Delay' factor. Essentially, this theory suggests that Google releases websites from the Sandbox after a set period of time. Since many webmasters started feeling the effects of the Sandbox around March-April 2004 and a lot of those websites were 'released' in the 'Allegra' update, this 'website aging' theory has gained a lot of ground.

However, I don't find much truth in the 'Time Delay' factor because by itself, it's just an artificially imposed penalty on websites and does not improve relevancy (the Holy Grail for search engines). Since Google is the de facto leader of the search engine industry and is continuously making strides to improve relevancy in search results, tactics such as this do not fit in with what we know about Google.

Contrasting evidence from many websites has shown that some websites created before March 2004 were still not released from the Google Sandbox, whereas some websites created as late as July 2004 managed to escape the Google Sandbox effect during the 'Allegra' update. Along with shattering the 'Time Delay' theory, this also raises some interesting questions. This evidence has led some webmasters to suggest a 'link threshold' theory: once a website has accumulated inbound links of sufficient quantity and quality, it is released from the Sandbox.

While this might be closer to the truth, it cannot be all there is to it. There has been evidence of websites that have escaped the Google Sandbox effect without massive link-building campaigns. In my opinion, link popularity is definitely a factor in determining when a website is released from the Sandbox, but there is one more caveat attached to it.

This concept is known as 'link aging'. Basically, this theory states that websites are released from the Sandbox based on the 'age' of their inbound links. While we only have limited data to analyze, this seems to be the most likely explanation for the Google Sandbox effect.

The link-aging concept is something that confuses people, who usually assume that it is the website that has to age. Conceptually, a link to a website can only be as old as the website itself, yet if you don't have enough inbound links after one year, common experience has it that you will not be able to escape from the Google Sandbox. A quick hop around popular SEO forums (you do visit SEO forums, don't you?) will lead you to hundreds of threads discussing various results - some websites were launched in July 2004 and escaped by December 2004; others were stuck in the Sandbox even after the 'Allegra' update.

How to find out if your website is sandboxed

Finding out if your website is 'Sandboxed' is quite simple. If your website does not appear in any SERPS for your target list of keywords, or if your results are highly depressing (ranked somewhere on the 40th page) even though you have lots of inbound links and almost-perfect on-page optimization, then your website has been Sandboxed.

Issues such as the Google Sandbox theory tend to distract webmasters from the core 'good' SEO practices and inadvertently push them towards black-hat or quick-fix techniques to exploit the search engine's weaknesses. The problem with this approach is its short-sightedness. To explain what I'm talking about, let's take a small detour and discuss search engine theory.

Understanding search engines

If you're looking to do some SEO, it would help if you tried to understand what search engines are trying to do. Search engines want to present the most relevant information to their users. There are two problems in this - the inaccurate search terms that people use and the information glut that is the Internet. To counteract these, search engines have developed increasingly complex algorithms to deduce the relevancy of content for different search terms.

How does this help us?

Well, as long as you keep producing highly-targeted, quality content that is relevant to the subject of your website (and acquire natural inbound links from related websites), you will stand a good chance for ranking high in SERPS. It sounds ridiculously simple, and in this case, it is. As search engine algorithms evolve, they will continue to do their jobs better, thus becoming better at filtering out trash and presenting the most relevant content to their users.

While each search engine will have different methods of determining search engine placement (Google values inbound links quite a lot, while Yahoo has recently placed additional value on Title tags and domain names), in the end all search engines aim to achieve the same goal, and by aiming to fulfill that goal you will always be able to ensure that your website can achieve a good ranking.

Escaping the sandbox...

Now, from our discussion about the Sandbox theory above, you know that at best, the Google Sandbox is a filter on the search engine's algorithm that has a dampening influence on websites. While most SEO experts will tell you that this effect decreases after a certain period of time, they mistakenly attribute it to website aging - basically, to when the website is first spidered by Googlebot. Actually, the Sandbox does 'hold back' new websites but, more importantly, the effects reduce over time not on the basis of website aging, but of link aging.

This means that the time that you spend in the Google Sandbox is directly linked to when you start acquiring quality links for your website. Thus, if you do nothing, your website may not be released from the Google Sandbox.

However, if you keep your head down and keep up with a low-intensity, long-term link building plan and keep adding inbound links to your website, you will be released from the Google Sandbox after an indeterminate period of time (but within a year, probably six months). In other words, the filter will stop having such a massive effect on your website. As the 'Allegra' update showed, websites that were constantly being optimized during the time that they were in the Sandbox began to rank quite high for targeted keywords after the Sandbox effect ended.

This and other observations of the Sandbox phenomenon - combined with an understanding of search engine philosophy - have led me to pinpoint the following strategies for minimizing your website's 'Sandboxed' time.

SEO strategies to minimize your website's "sandboxed" time

Despite what some SEO experts might tell you, you don't need to do anything different to escape from the Google Sandbox. In fact, if you follow the 'white hat' rules of search engine optimization and work on the principles I've mentioned many times in this course, you'll not only minimize your website's Sandboxed time but you will also ensure that your website ranks in the top 10 for your target keywords. Here's a list of SEO strategies you should make sure you use when starting out a new website:

Start promoting your website the moment you create your website, not when your website is 'ready'. Don't make the mistake of waiting for your website to be 'perfect'. The motto is to get your product out on the market, as quickly as possible, and then worry about improving it. Otherwise, how will you ever start to make money?

Establish a low-intensity, long-term link building plan and follow it religiously. For example, you can set yourself a target of acquiring 20 links per week, or maybe even a target of contacting 10 link partners a day (of course, with SEO Elite, link building is a snap). This will ensure that as you build your website, you also start acquiring inbound links and those links will age properly - so that by the time your website exits the Sandbox you will have both a high quantity of inbound links and a thriving website.

Avoid black-hat techniques such as keyword stuffing or 'cloaking'. Google's search algorithm evolves almost daily, and penalties for breaking the rules may keep you stuck in the Sandbox longer than usual.

Save your time by remembering the 20/80 rule: 80 percent of your optimization can be accomplished by just 20 percent of effort. After that, any tweaking left to be done is specific to current search engine tendencies and liable to become ineffective once a search engine updates its algorithm. Therefore don't waste your time in optimizing for each and every search engine - just get the basics right and move on to the next page.

Remember, you should always optimize with the end-user in mind, not the search engines.

Like I mentioned earlier, search engines are continuously optimizing their algorithms in order to improve on the key criterion: relevancy. By ensuring that your website content is targeted on a particular keyword, and is judged as 'good' content based on both on-page optimization (keyword density) and off-page factors (lots of quality inbound links), you will also guarantee that your website will keep ranking highly for your search terms no matter what changes are brought into a search engine's algorithm, whether it's a dampening factor a la Sandbox or any other quirk the search engine industry throws up in the future.

Have you taken a look at SEO Elite yet? If not...
What's stopping you?

Now, get out there and start smoking the search engines!

About the author:
Brad Callen
If you liked the lesson above and want to learn more about SEO, visit http://www.seoelite.com/7DaysToMassiveWebsiteTraffic.htm and get your free copy of "7 Days To Massive Website Traffic!" right now! Brad Callen
SEO Elite
Circulated by Article Emporium

Thursday, August 02, 2007

Strengthening Your Website's Keyword Density

There are many companies offering Search Engine Optimization. Put them to work, and you're bound to see improvement in your search engine rankings. But what if you can't afford to pay for an SEO service? Don't worry, because there are many things you can do yourself to make your site "search engine friendly." Here are a few:

Body Text:
Many site owners use keywords inappropriately. You visit their site and all you see everywhere is keywords repeated endlessly all over the page. This can leave a bad taste in a customer's mouth, especially if they don't realize why exactly you chose to write "dog food" 73 times on your home page. In addition, you might give the search engines the idea that you're "keyword stuffing," which can get your site banned from the search engine altogether. Don't get me wrong, it is important to repeat keywords, but tastefully and strategically. As a whole, keyword density should never be more than 7% of your page. This will ensure that your keywords are listed enough to be effective for Search Engine Optimization, while at the same time keeping the search engines (and your customers) happy.
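
To make the 7% figure concrete, here is a rough sketch (my addition, not from the article) of one way to measure keyword density: count how many of the page's words are taken up by occurrences of the phrase and divide by the total word count. Different tools count slightly differently, so treat the exact formula as illustrative.

import re

def keyword_density(body_text, phrase):
    """Percentage of words on the page consumed by occurrences of the phrase."""
    words = re.findall(r"[a-z0-9']+", body_text.lower())
    phrase_words = phrase.lower().split()
    occurrences = sum(
        words[i:i + len(phrase_words)] == phrase_words
        for i in range(len(words) - len(phrase_words) + 1)
    )
    return 100.0 * occurrences * len(phrase_words) / max(len(words), 1)

text = "Our dog food is the best dog food for happy dogs."
print(round(keyword_density(text, "dog food"), 1))  # about 36.4 - far above the 7% guideline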

Meta Tags:
Meta tags give search engines and browsers a detailed description of your website. They describe everything from the title of your site to the language your site is using. Keywords can and should be used in Meta Tags. "Title," "Subject," "Description" and "Keywords" are specific meta tags where keywords should be used. They should only be used once in each tag, and when writing, remember that "Title," "Description" and "Subject" are often used to display your site in search engine results, so write carefully. (One quick note: Google is one of the search engines that no longer utilizes the "Keywords" Meta Tag, so be sure to use keywords in the other Meta Tags, as well as the Body, Alt and Anchor Text.)
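To illustrate, the head section of a page targeting "dog food" might look something like this (the site name, wording and values are purely illustrative, not a template to copy):

  <head>
    <title>Healthy Dog Food | Acme Pet Supplies</title>
    <meta name="description" content="Natural, healthy dog food delivered to your door.">
    <meta name="keywords" content="dog food, healthy dog food, natural dog food">
    <meta name="subject" content="Dog Food">
  </head>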

Alt Text:
This one is easy, and it's a great way for you to list your keyword phrase an additional three times. Within every image tag, Alt Text should be present. This is the text that sometimes shows up when your mouse hovers over an image, or when a surfer chooses to browse the web without images. Search engines have no idea what images actually look like, so they read Alt Text instead. In short, including keywords in the "alt text" of images will increase keyword density throughout your page. Stick to just three images, and you won't be penalized for "Keyword Spam."
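For example, an image tag on that same hypothetical dog food page might read (file name and wording invented for illustration):

  <img src="dog-food-20lb-bag.jpg" alt="premium dog food - 20lb bag">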

Anchor Text:
Anchor Text is the actual text displayed in a link. Many links do a poor job of describing their destination; by using keywords in your Anchor Text, you help describe your links not only to customers, but to search engines as well. It's not always easy to combine targeted keywords with the title of every internal link, but using this strategy will improve keyword density throughout your page while simultaneously strengthening your site's association with your targeted keyword(s) in the search engines.
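In practice, that simply means writing internal links like the first example below rather than the second (the page names here are made up for illustration):

  <a href="premium-dog-food.html">premium dog food</a>

  <a href="page2.html">click here for more information</a>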

No Script:
No Script tags are widely used to display a message to users who are surfing without JavaScript enabled. When JavaScript is enabled, the message is automatically hidden by the user's browser. In addition to a standard message, a text link navigation menu using keyword Anchor Text can be included. These days, many websites use image, Flash and JavaScript links for navigation menus. This can hurt your site's ability to be indexed correctly, since search engines only recognize text links. Creating a text link navigation menu inside a No Script tag will fix this problem. It will also provide another opportunity to improve keyword density using the Anchor Text method. Be sure to use the No Script tag for legitimate reasons, like the one mentioned above. Any other use is considered "cloaking," and can lead to severe penalties from search engines.
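A minimal sketch of such a fallback menu, reusing the same hypothetical dog food pages as above, might be:

  <noscript>
    <a href="index.html">dog food home</a> |
    <a href="premium-dog-food.html">premium dog food</a> |
    <a href="dog-treats.html">dog treats and snacks</a>
  </noscript>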

Inbound Links:
Inbound links can greatly increase your website's visibility to search engines. A search engine's "spider" has no rhyme or reason as to where it crawls. It simply starts with the pages in its index, and follows text links endlessly around the web. If using Keyword Anchor Text within your site is important, then the same applies for inbound links. When the spider sees your inbound URL, it makes a record of the Anchor Text associated with it. Using Keyword Anchor Text in your inbound links is the #1 way to strengthen your site's relationship with a keyword!
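So when you request or exchange links, ask the linking site to use your keyword in the link text rather than your bare address. For example (example.com stands in for your own domain), prefer

  <a href="http://www.example.com/">premium dog food</a>

over

  <a href="http://www.example.com/">www.example.com</a>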

There are many other things you can do to make your site search engine friendly, but strategic keyword placement should be at the top of your list. Using keywords in your Body Text, Meta Tags, Alt Text and Anchor Text gives you four easy and effective ways to optimize your site.

Publication/Reprint Terms:
-You have permission to publish this article electronically or in print, free of charge, as long as the "About the Author" footer is included. A courtesy copy of your publication would be appreciated.
==================

About the Author:
Jason McElwaine is the owner of P.B. Boston Website Design. He can be contacted at contact@pinbottle.com
Added: 25 Sep 2006
Article Source: http://articles.simplysearch4it.com/article/37653.html

Wednesday, August 01, 2007

Writing SEO Copy – 8 Steps To Success

We all know that the lion’s share of web traffic comes through the search engines. We also know that keywords and links to your site are the two things that affect your ranking in the search engines. Your keywords tell the search engines what you do, and the inbound links tell them how important you are. This combination is what determines your relevance. And relevance is what the search engines are after.

There’s a lot of information around about how to incorporate keyword phrases into your HTML meta tags. But that’s only half the battle. You need to think of these tags as street-signs. That’s how the search engines view them. They look at your tags and then at your copy. If the keywords you use in your tags aren’t used in your copy, your site won’t be indexed for those keywords.

But the search engines don’t stop there. They also consider how often the keyword phrase is used on the page.

To put it simply, if you don’t pepper your site with your primary keywords, you won’t appear in the search results when a potential customer searches for those keywords.

But how do you write keyword-rich copy without compromising readability?

Readability is all-important to visitors. And after all, it’s the visitors that buy your product or service, not search engines.

By following these 8 simple guidelines, you’ll be able to overhaul the copy on your website, ensuring it’s agreeable to both search engines and visitors.

1) Categorise your pages

Before writing, think about the structure of your site. If you haven’t built your site yet, try to create your pages around key offerings or benefits. For example, divide your Second Hand Computers site into separate pages for Macs and PCs, and then segment again into Notebooks, Desktops, etc. This way, you’ll be able to incorporate very specific keyword phrases into your copy, thereby capturing a very targeted market. If you’re working on an existing site, print out each page and label it with its key point, offering, or benefit.
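As a purely illustrative sketch (the file names are invented), the Second Hand Computers site might break down like this:

  index.html          - Second Hand Computers (overview)
  macs.html           - Cheap Second Hand Macs
  macs-notebooks.html - Cheap Second Hand Mac Notebooks
  macs-desktops.html  - Cheap Second Hand Mac Desktops
  pcs.html            - Cheap Second Hand PCs
  pcs-notebooks.html  - Cheap Second Hand PC Notebooks
  pcs-desktops.html   - Cheap Second Hand PC Desktops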

2) Find out what keywords your customers are searching for

Go to http://www.wordtracker.com and subscribe for a day (this will only cost you about AUD$10). Type in the key points, offerings, and benefits you identified for each page, and spend some time analysing what words customers use when they’re searching for these things. These are the words you’ll want to use to describe your product or service. (Make sure you read WordTracker’s explanation of their results.)

3) Use phrases, not single words

Although this advice isn’t specific to the web copy, it’s so important that it’s worth repeating here. Why? Well firstly, there’s too much competition for single keywords. If you’re in computer sales, don’t choose “computers” as your primary keyword. Go to Google and search for “computers” and you’ll see why… Secondly, research shows that customers are becoming more search-savvy – they’re searching for more and more specific strings. They’re learning that by being more specific, they find what they’re looking for much faster. Ask yourself what’s unique about your business. Perhaps you sell cheap second hand computers? Then why not use “cheap second hand computers” as your primary keyword phrase? This way, you’ll not only stand a chance in the rankings, you’ll also display in much more targeted searches. In other words, a higher percentage of your site’s visitors will be people after cheap second hand computers. (WordTracker’s results will help you choose the most appropriate phrases.)

4) Pick the important keyword phrases

Don’t include every keyword phrase on every page. Focus on one or two keyword phrases on each page. For your Macs page, focus on “cheap second hand macs”. For the PCs page, focus on “cheap second hand pcs”, etc.

5) Be specific

Don’t just say “our computers”. Wherever you would normally say “our computers”, ask yourself if you can get away with saying “our cheap second hand Macs” or “our cheap second hand PCs”. If this doesn’t affect your readability too badly, it’s worth doing. It’s a fine balance though. Remember, your site reflects the quality of your service. If your site is hard to read, people will infer a lot about your service…

6) Use keyword phrases in links

Although you shouldn’t focus on every keyword phrase on every page, it’s a good idea to link your pages together with text links. This way, when the search engines look at your site, they’ll see that the pages are related. Once again, the more text links the better, especially if the link text is a keyword phrase. So on your “Cheap Second Hand Macs” page, include a text link at the bottom to “Cheap Second Hand PCs”. If you can manage it without affecting readability, also include one within the copy of the page. For example, “As well as providing cheap second hand Macs, we sell high quality cheap second hand PCs”. TIP: If you don’t want your links to be underlined and blue, include the following in your CSS file:
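For example, a rule along these lines would do it (the class name “bodylink” and the colour value are just placeholders; adapt them to your own stylesheet):

  a.bodylink { color: #333333; text-decoration: none; }
  a.bodylink:hover { text-decoration: underline; }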

Then format the HTML of each link as follows:

As well as providing cheap second hand Macs, we sell high quality cheap second hand PCs.
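Marked up, that sentence might look something like this (the file name and class are the same placeholders used in the CSS sketch above):

  As well as providing cheap second hand Macs, we sell high quality <a class="bodylink" href="cheap-second-hand-pcs.html">cheap second hand PCs</a>.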

7) Use keyword phrases in headings

Just as customers rely on headings to scan your site, so too do search engines. This means headings play a big part in how the search engines will categorise your site. Try to include your primary keyword phrases in your headings. In fact, think about inserting extra headings just for this purpose. Generally this will also help the readability of the site because it will help customers scan-read.
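For instance, on the Macs page you might use headings such as (the wording is illustrative only):

  <h1>Cheap Second Hand Macs</h1>
  <h2>Cheap Second Hand Mac Notebooks</h2>
  <h2>Cheap Second Hand Mac Desktops</h2>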

8) Test keyword phrase density

Once you’ve made a first pass at the copy, run it through a density checker to get some metrics. Visit http://www.gorank.com/analyze.php and type in the domain and keyword phrase you want to analyse. It’ll give you a percentage for all the important parts of your page, including copy, title, meta keywords, meta description, etc. The higher the density the better. Generally speaking, a density measurement of at least 3-5% is what you’re looking for. Any less, and you’ll probably need to take another pass.

Follow these guidelines, and you’ll be well on your way to effective SEO copy.

Just remember, don’t overdo it. It’s not easy to find the balance between copy written for search engines and copy written for customers. In many cases, this balance will be too difficult to achieve without professional help. Don’t worry, though. If you’ve already performed your keyword analysis, a professional website copywriter should be able to work your primary keyword phrases into your copy at no extra charge.

About the Author:
* Glenn Murray is an SEO copywriter and article submission and article PR specialist. He is a director of article PR company, Article PR, and also of copywriting studio Divine Write. He can be contacted on Sydney +612 4334 6222 or at glenn@divinewrite.com. Visit http://www.DivineWrite.com or http://www.ArticlePR.com for further details.
Article Source: www.iSnare.com