Wednesday, April 28, 2010
Making the Most Profit From your Articles
You may be thinking that this is a crazy idea. The point is that the author’s resource box remains unchanged and you still get the recognition.
You will still benefit by getting recognized and getting traffic to your site. Doing it this way will compel ezine publishers and website owners to use your articles because they will benefit from this as well.
This works like a charm if you happen to have affiliates promoting your product, or if you are promoting someone else’s products and have affiliates signed up under you. It actually motivates them to use your articles and will bring you more in commissions.
In lots of cases, it is more appropriate to let the publisher replace the links within the article as well as the resource box. This means they would replace the author’s information with their own information. This all depends on what you are attempting to accomplish with the article in the first place.
Other options include letting the affiliates replace all of the links within the article to their own personal links but leaving the resource box as is. This will give you the recognition of an expert and still benefit you with increased commissions. It will also benefit you in the long run because you will eventually become an established expert in that field.
Selling Your Private Label Rights
There are plenty of ways to profit from your articles. You have the option of writing and distributing articles in an attempt to drive traffic to your site where sales are made. You can allow your affiliates to replace your links with their links or you can compile your articles and sell them as an informational product.
Another way to profit from your articles is to sell the private label rights to them, but then you will not benefit from the links in the article, only from the people who purchase the rights to the article from you. You won’t benefit from the recognition of being an expert either. You will only benefit from the income you make from selling the article and the rights to it.
The ideal way to do it would be to bundle articles in one file. They should touch on the same subject, and the folder should be compressed. Then, you would upload it to your website and link to it from a webpage.
If you write a large number of articles on a regular basis, it would be a great idea to join a monthly membership site. You would pay a monthly fee and continually upload new batches of private label right articles.
If the articles you are writing are of high quality, this can become a very profitable business for you. You wouldn’t even have to write the articles yourself. You can hire a ghostwriter to do them for you, choosing the topics based on the requests of the members.
If you decide to go this route with it, you will need to create a private label rights license, which will be part of all of your articles from that point forward. The license will have to be published within the area of your member’s website also.
In some cases, you may not have to create a membership website at all. It is sometimes a good option to contact the owner of a membership site and sell the batches of articles straight to them. Make sure to establish this relationship before you start producing articles for this purpose. You would hate to write all of those articles with nowhere to sell them. These options are great for consideration and have already been very profitable for those involved.
Saturday, April 24, 2010
What Is a Sitemap and How Does It Work?
What is a Google Sitemap?
A Google Sitemap is a very simple XML document that lists all the pages in your website, but the Google Sitemaps program is actually much more important than that. In fact, the Sitemaps program provides a little peek inside Google's mind - and it can tell you a lot about what Google thinks of your website!
Why Should You Use Google Sitemaps?
Until Google Sitemaps was released in the summer of 2005, optimizing a site for Google was a guessing game at best. A website's page might be deleted from the index, and the Webmaster had no idea why. Alternatively, a site's content could be scanned, but because of the peculiarities of the algorithm, the only pages that would rank well might be the "About Us" page, or the company's press releases.
As webmasters we were at the whim of Googlebot, the seemingly arbitrary algorithmic kingmaker that could make or break a website overnight through shifts in search engine positioning. There was no way to communicate with Google about a website - either to understand what was wrong with it, or to tell Google when something had been updated.
That all changed about a year ago when Google released Sitemaps, but the program really became useful in February of 2006 when Google updated it with a couple new tools.
So, what exactly is the Google Sitemaps program, and how can you use it to improve the position of your website? Well, there are essentially two reasons to use Google Sitemaps:
1. Sitemaps provide you with a way to tell Google valuable information about your website.
2. You can use Sitemaps to learn what Google thinks about your website.
What You Can Tell Google About Your Site
Believe it or not, Google is concerned about making sure webmasters have a way of communicating information that is important about their sites. Although Googlebot does a pretty decent job of finding and cataloging web pages, it has very little ability to rate the relative importance of one page versus another. After all, many important pages on the Internet are not properly "optimized", and many of the people who couldn't care less about spending their time on linking campaigns create some of the best content.
Therefore, Google gives you the ability to tell them on a scale of 0.0 to 1.0 how important a given page is relative to all the others. Using this system, you might tell Google that your home page is a 1.0, each of your product sections is a 0.8, and each of your individual product pages is a 0.5. Pages like your company's address and contact information might only rate a 0.2.
You can also tell Google how often your pages are updated and the date that each page was last modified. For example your home page might be updated every day, while a particular product page might only be updated on an annual basis.
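For illustration, a sitemap entry carries all of this information in just a handful of XML elements per page; the URLs and dates below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2010-04-24</lastmod>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/contact.html</loc>
    <lastmod>2009-11-02</lastmod>
    <changefreq>yearly</changefreq>
    <priority>0.2</priority>
  </url>
</urlset>
```

The home page is marked as updated daily and most important, while the contact page is rated 0.2 and rarely changes, exactly the kind of relative-importance signal described above.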
What Google Can Tell You About Your Site
Having the ability to tell Google all this information is important, but you don't even need to create a sitemap file in order to enjoy some of the perks of having a Google Sitemaps account.
That's because even without a Sitemap file, you can still learn about any errors that Googlebot has found on your website. As you probably know, your site doesn't have to be "broken" for a robot to have trouble crawling its pages. Google Sitemaps will tell you about pages it was unable to crawl and links it was unable to follow. Therefore, you can see where these problems are and fix them before your pages get deleted from the index.
You can also get information on the types of searches people are using to find your website. Of course, most website analytics tools will give this information to you anyway, but if the tool you use doesn't have this feature, then it's always nice to get it for free from Google.
But the best part of the Sitemaps program is the Page analysis section that was added in February of 2006. This page gives you two lists of words. The first list contains the words that Googlebot associates with your website based on content on your site. The second list contains words that Googlebot has found linking to your site!
Unfortunately, Google limits the number of words in each list to 20. As a consequence, the inbound links column is partly wasted by words such as "http", "www", and "com" - terms that apply equally to all websites (hey Google, how about suppressing those terms from the report?). That said, this list does provide you with a way to judge the effectiveness of your offsite optimization efforts.
When you compare these two lists, you can get an understanding of what Google thinks your website is about. If the words on your Site Content column are not really what you want Googlebot to think about your site, then you know you need to tweak your website's copy to make it more focused on your core competency.
If, on the other hand, your inbound links don't contain any keywords that you want to rank well for, then perhaps you should focus your efforts in that direction.
Above all else, you really want these two lists to agree. You want your inbound linked words to match up to the site content words. This means that Google has a clear understanding of the focus of your website.
Additional Benefits of the Sitemaps Program
Google has even started notifying Sitemaps-participating Webmasters if they are breaking any of Google's Webmaster Guidelines. This can be very valuable information if your site suddenly becomes de-listed on Google and you don't know why.
Only Sitemaps participants can get this information, and it is only provided at Google's discretion. In fact, Google will NOT notify you if you are creating worthless websites that offer no original content, or if you are creating thousands of doorway pages that are redirecting to other web sites. Google doesn't want to give the spammers any clues as to how to improve their techniques.
How Do You Get Started with Google Sitemaps?
The first thing you must do is obtain a Google Account. If you already have a Gmail, Adsense, or Adwords account, then you are all set. If not, you can register an account by visiting the Google Accounts page.
Building your sitemap file is pretty easy to do if you are familiar with XML, and if you aren't you can always use a third-party tool such as the ones that are listed on Google's website. Google also has a "Sitemap Generator" that you can download and install on your server, but unless you are fairly adept at managing Python scripts, you should probably stick to the third-party tools.
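If you'd rather roll your own than use a third-party tool, the file format is simple enough to generate with a short script. Here's a minimal sketch in Python, assuming you can list your pages by hand; a real site would crawl itself or read the list from a content management system:

```python
# Minimal sitemap generator: turns a hand-maintained page list into
# the XML format that Google Sitemaps expects.
from xml.sax.saxutils import escape

def build_sitemap(pages):
    """pages: list of (url, lastmod, changefreq, priority) tuples."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url, lastmod, changefreq, priority in pages:
        lines += ['  <url>',
                  '    <loc>%s</loc>' % escape(url),        # escape &, <, > in URLs
                  '    <lastmod>%s</lastmod>' % lastmod,     # YYYY-MM-DD
                  '    <changefreq>%s</changefreq>' % changefreq,
                  '    <priority>%.1f</priority>' % priority,
                  '  </url>']
    lines.append('</urlset>')
    return '\n'.join(lines)

# The home page changes daily and is most important; the contact page barely changes.
pages = [
    ("http://www.example.com/", "2010-04-24", "daily", 1.0),
    ("http://www.example.com/contact.html", "2009-11-02", "yearly", 0.2),
]
print(build_sitemap(pages))
```

Write the result out as sitemap.xml in your site's root directory and it's ready to submit to your Sitemaps account.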
At any rate, once you have your Google Account and your Sitemap file built, the rest is very easy. All you have to do is:
1. Log into your account
2. Type your website's URL into the "Add Site" box and click on "OK"
3. Click on the Manage Sites link for the website you are adding, and add your sitemap file to your account.
Google Sitemaps - An Excellent SEO Tool
Google Sitemaps help Googlebot quickly find new content on your website. They allow you to tell Google what's important, what's new, and what changes often. The tools provided to webmasters through the program can play a vital role in helping you understand how the search engines (especially Google) view your website.
Using this information you can dramatically improve the position of your website and quickly clear up any issues Google finds. You can also use the tools provided by Google to gauge the effectiveness of your off-site optimization efforts so you can better focus your time and energy on activities that bring you the most success.
Search Engines History
SEO began in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all a webmaster needed to do was submit a site to the various engines, which would run spiders, programs to "crawl" the site and store the collected data. Early engines simply scanned an entire webpage for the words in a search, so a page containing many different words matched more searches, and a page containing a dictionary-style listing would match almost any search, limited only by unique names. The search engines then sorted the information by topic and served results based on the pages they had spidered. As the number of documents online kept growing, and more webmasters realized the value of organic search listings, search engines began to sort their listings so they could display the most relevant pages first. This was the start of a friction between search engine and webmaster that continues to this day.
At first, search engines were guided by the webmasters themselves. Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag, which gave a guide to each page's content. When some webmasters began to abuse meta tags, causing their sites to rank for irrelevant searches, search engines abandoned their consideration of meta tags and instead developed more complex ranking algorithms that drew on a more diverse set of on-page factors, including:
Text within the title tag
Domain name
URL directories and file names
HTML tags: headings, bold and emphasized text
Keyword density
Keyword proximity
Alt attributes for images
Text within NOFRAMES tags
But, relying so extensively on factors that were still within the webmasters' control, search engines continued to suffer from abuse and ranking manipulation. In order to provide better results to their users, search engines had to adapt to ensure their SERPs showed the most relevant search results, rather than useless pages stuffed with keywords by unscrupulous webmasters using a bait-and-switch lure to display unrelated webpages. This led to the rise of a new kind of search engine.
The History of Yahoo! - How It All Started...
Yahoo! began as a student hobby and evolved into a global brand that has changed the way people communicate with each other, find and access information and purchase things. The two founders of Yahoo!, David Filo and Jerry Yang, Ph.D. candidates in Electrical Engineering at Stanford University, started their guide in a campus trailer in February 1994 as a way to keep track of their personal interests on the Internet. Before long they were spending more time on their home-brewed lists of favorite links than on their doctoral dissertations. Eventually, Jerry and David's lists became too long and unwieldy, and they broke them out into categories. When the categories became too full, they developed subcategories ... and the core concept behind Yahoo! was born.
The Web site started out as "Jerry and David's Guide to the World Wide Web" but eventually received a new moniker with the help of a dictionary. The name Yahoo! is an acronym for "Yet Another Hierarchical Officious Oracle," but Filo and Yang insist they selected the name because they liked the general definition of a yahoo: "rude, unsophisticated, uncouth." Yahoo! itself first resided on Yang's student workstation, "Akebono," while the software was lodged on Filo's computer, "Konishiki" - both named after legendary sumo wrestlers.
Jerry and David soon found they were not alone in wanting a single place to find useful Web sites. Before long, hundreds of people were accessing their guide from well beyond the Stanford trailer. Word spread from friends to what quickly became a significant, loyal audience throughout the closely-knit Internet community. Yahoo! celebrated its first million-hit day in the fall of 1994, translating to almost 100 thousand unique visitors.
Due to the torrent of traffic and enthusiastic reception Yahoo! was receiving, the founders knew they had a potential business on their hands. In March 1995, the pair incorporated the business and met with dozens of Silicon Valley venture capitalists. They eventually came across Sequoia Capital, the well-regarded firm whose most successful investments included Apple Computer, Atari, Oracle and Cisco Systems. They agreed to fund Yahoo! in April 1995 with an initial investment of nearly $2 million.
Realizing their new company had the potential to grow quickly, Jerry and David began to shop for a management team. They hired Tim Koogle, a veteran of Motorola and an alumnus of the Stanford engineering department, as chief executive officer and Jeffrey Mallett, founder of Novell's WordPerfect consumer division, as chief operating officer. They secured a second round of funding in Fall 1995 from investors Reuters Ltd. and Softbank. Yahoo! launched a highly-successful IPO in April 1996 with a total of 49 employees.
Today, Yahoo! Inc. is a leading global Internet communications, commerce and media company that offers a comprehensive branded network of services to more than 345 million individuals each month worldwide. As the first online navigational guide to the Web, www.yahoo.com is the leading guide in terms of traffic, advertising, household and business user reach. Yahoo! is the No. 1 Internet brand globally and reaches the largest audience worldwide. The company also provides online business and enterprise services designed to enhance the productivity and Web presence of Yahoo!'s clients. These services include Corporate Yahoo!, a popular customized enterprise portal solution; audio and video streaming; store hosting and management; and Web site tools and services. The company's global Web network includes 25 World properties. Headquartered in Sunnyvale, Calif., Yahoo! has offices in Europe, Asia, Latin America, Australia, Canada and the United States.
Who We Are
Founded in 1994 by Stanford Ph.D. students David Filo and Jerry Yang, Yahoo! began as a hobby and has evolved into a leading global brand that changed the way people communicate with each other, conduct transactions and access, share, and create information. Today, Yahoo! Inc. attracts hundreds of millions of users every month through its innovative technology and engaging content and services, making it one of the most trafficked Internet destinations and a world class online media company. Our offerings to users on Yahoo! Properties currently fall into five categories: Integrated Consumer Experiences, Applications (Communications and Communities), Search, Media Products & Solutions, and Mobile. The majority of our offerings are available in more than 30 languages. The company is headquartered in Sunnyvale, California, with a presence in more than 25 countries, provinces, and territories.
What We Do
Yahoo!'s vision is to be the center of people's online lives by delivering personally relevant, meaningful Internet experiences.
How We Make a Difference
Yahoo! is also committed to empowering its users and employees through programs, products, and services that inspire people to make a positive impact on their communities. Yahoo! for Good connects people with causes through our products and services, as well as through partnerships with nonprofits such as Global Green, Network for Good, and DonorsChoose. Yahoo! also channels the generosity of its employees through the Yahoo! Employee Foundation, a grassroots philanthropic organization that brings together the talents, time, and financial resources of Yahoo! employees. The foundation has given millions of dollars in grants to organizations around the world.
Keeping an Eye Out for E-zine Publishers
You definitely want to get to know ezine publishers, especially the ones within your niche. These people are extremely important in your article marketing venture, and you should start building your own database of ezine publishers. You want to stay on good terms with these individuals because they can directly alter your status in article marketing.
You will need to start by visiting the many ezine directories and subscribe to the ones within your niche. You should try to read one or two issues before you start to submit articles. Some ezine directories will even provide information regarding whether or not article submissions are even accepted, where they should be sent to, and the article's guidelines. This is really valuable information and should be treated as such.
Once you make the decision to submit your articles to ezines, you should contact the publisher directly. Ezine companies usually show favoritism to people they have a direct relationship with in comparison to someone who just submits their articles. The ezine publisher may accept article submissions, but you will have a much better shot by establishing a relationship with them. Just submitting articles with no other form of contact does not form any kind of relationship.
You could write a personal email instead. Make it a point to tell the publisher how much you enjoy the site and include what you like best about it. Inform them that you have many articles that you know their readers would be interested in. Then, ask them if they would like you to submit them or not. If you are able to point out a particular subject within your niche, do it. This method is much more effective when you are trying to get your foot in the door.
Do not overwhelm ezine publishers with your articles. These are very busy people that get a boatload of email. At first, send only one email a week. This way they can actually get a good feel of your writing style and amount of content. You will have a much better chance getting anywhere with them this way.
You can also send exclusive emails to individual ezine publishers. Publishers really like exclusives. Once a week, write a quick article exclusively for one ezine. Submit it to the publisher and make sure to let them know that it was written exclusively for them.
Other Article Distribution Strategies
There are tons of ways to get your articles into circulation and you would be well served to use them all. These methods include distributing articles to the article directories or repositories, sending your articles to a list of ezine publishers, using a blog to publish your articles, using other people's blogs to promote your articles, submitting your articles to private sites or member-only sites, and submitting your articles to forums that accept articles. That is a lot of places to use to get your name out there.
There are a few other ways to distribute your articles. Remember, the purpose of the article is to get traffic to your website and increase revenue. Even so, some people forget to put their own articles on their websites. You have written content and websites need content. Make sure that your content makes it to your website.
Another way to distribute your articles is in an eBook where you compiled all the articles that you have for a particular subject. The eBook should be free and should be listed at the many eBook directories on the Internet. This will allow people to give your book away. This is meant to promote your online business so the eBook needs to be done well. The more content and useful information it contains, the more likely that a reader will actually pass it on.
You should also use your articles as part of an email course or series. Again, the series should not cost a thing. Set up an autoresponder for the series or course and put a sign-up sheet on your website. This will build a bigger email list as well as help your article distribution.
The way you do this is by putting all of the articles in text format and into a folder. From there, compress the folder and upload it to your website. Then, on every single email you send out, include a signature file that lets everyone know that they can download the file for free and use the content however they want, as long as the content is not changed and the author's box stays intact.
If you think about it, articles can be used in tons of creative ways. Keep an eye out for new ways to promote and distribute. Never let one of those opportunities pass you by.
Thursday, April 22, 2010
Types of Black Hat SEO Techniques
1- Hidden text – Create modern CSS-based websites with jQuery effects. These often hide large portions of text in layers, displaying them on click or mouseover for usability reasons. Example: CSS pagination.
2- IP delivery – Offer the proper localized content to those coming from a country specific IP address. Offer the user a choice though. Shopping.com does a great job here.
3- 301 redirects – Redirect outdated pages to the newer versions or your homepage. When moving to a new domain use them of course as well.
4- Throw Away Domains – Create exact match micro sites for short term popular keywords and abandon them when the trend subsides. Something like tigerwoodssexrehab.com
5- Cloaking – Hide the heavy Flash animations from Google, show the text-only version optimized for accessibility and findability.
6- Paid links – Donate for charity, software developers etc. Many of them display links to those who donate.
7- Keyword stuffing – Tags and folksonomy. Keyword stuff by adding several tags, or let your users do the dirty work via UGC tagging (folksonomy); every major social site does that.
8- Automatically generated keyword pages – Some shopping search engines create pages from each Google search query and assign the appropriate products to each query. You can do that as well if you have enough content.
9- Misspellings – Define and correct the misspelled term and/or redirect to the correct version.
10- Scraping – Create mirrors for popular sites. Offer them to the respective webmasters. Most will be glad to pay less.
11- Ad only pages – Create all page ads (interstitials) and show them before users see content like many old media do.
12- Blog spam – Don’t spam yourself, get spammed! Install a WordPress blog without Akismet spam protection. Then create a few posts about Mesothelioma, for example, a very profitable keyword. Then let spammers comment-spam it or even add posts (via TDO Mini Forms). Last but not least, parse the comments for your keyword and outgoing links. If they contain the keyword, publish them and remove the outgoing links, of course. Bot-generated user content, so to speak.
13- Duplicate content on multiple domains – Offer your content under a Creative Commons license with attribution.
14- Domain grabbing – Buy old authority domains that failed and revive them instead of putting them on sale.
15- Fake news – Create real news on official looking sites for real events. You can even do it in print. Works great for all kinds of activism related topics.
16- Link farm – Create a legit blog network of flagship blogs. A full time pro blogger can manage 3 to 5 high quality blogs by her or himself.
17- New exploits – Find them and report them, blog about them. You break the story and thus you get all the attention and links. Dave Naylor is excellent at it.
18- Brand jacking – Write a bad review for a brand that has disappointed you or that destroys the planet, or set up a "brand X sucks" page and let consumers voice their concerns.
19- Rogue bots – Spider websites and make their webmasters aware of broken links and other issues. Some people may be thankful enough to link to you.
20- Hidden affiliate links – In fact, hiding affiliate links is good for usability and can be even more ethical than showing them. example.com/ref?id=87233683 is far worse than just example.com. Also, unsuspecting Web users will copy your ad to forums etc., which might break their TOS. The only thing you have to do is disclose the affiliate link as such. I prefer to use [ad] (on Twitter for example) or [partner-link] elsewhere. This way you can strip the annoying “ref” ids and achieve full disclosure at the same time.
21- Doorway pages – Effectively doorway pages could also be called landing pages. The only difference is that doorway pages are worthless crap while landing pages are streamlined to suffice on their own. Common for both is that they are highly optimized for organic search traffic. So instead of making your doorway pages just a place to get skipped optimize them as landing pages and make the users convert right there.
22- Multiple subdomains – Multiple subdomains for one domain can serve an ethical purpose. Just think blogspot.com or wordpress.com – they create multiple subdomains by UGC. This way they can rank several times for a query. You can offer subdomains to your users as well.
23- Twitter automation – There is nothing wrong with Twitter automation as long as you don’t overdo it. Scheduling and repeating tweets, even automatically tweeting RSS feeds from your or other blogs, is perfectly OK as long as the Twitter account has a real person attending it who tweets “manually” as well. Bot accounts can be ethical too, in case they are useful to more than just yourself. A bot collecting news about Haiti in the aftermath of the earthquake would be perfectly legit if you ask me.
24- Deceptive headlines – Tabloids use them all the time, and black hat SEOs do too. There are ethical use cases for deceptive headlines though. Satire is one, of course, and simple humor as well. For instance, I could end this list at 24 items and still declare this post a list of 30 items anyway. That would be a good laugh. I’ve done that in the past, but in a more humorous post.
25- Google Bowling – The bad thing about Google bowling is that you hurt sites you don’t like. You could reverse that: Reverse Google bowling would mean that you push sites of competitors you like to make those you dislike disappear below. In a way we do that all the time linking out to the competition, the good guys of SEO who then outrank the ugly sites we like a lot less.
26- Invisible links – You’d never use invisible links on your sites, would you? You liar! You have. Most free web counters and statistics tools use them. Statcounter is a good example. So when you embed them on your site, you use invisible links.
27- Different content for search engines than users – Do you use WordPress? Then you have the nofollow attribute added to your comment links. This way the search engine gets different content than the user: the user sees and clicks a link, while a search bot sees a no-trespassing sign instead. In white hat SEO it’s often called PageRank sculpting. Most social media add-ons do that by default.
28- Hacking sites – While crackers hack sites, security experts warn site owners about their vulnerabilities. Both discover the same issues. Recently I got an email from someone who warned me to update my WordPress installation. That was a grand idea, I thought.
29- Slander linkbait – Pulling a Calacanis-like “SEO is bullshit” is quite common these days. Why not do it the other way around? The anti-SEO angle doesn’t work that well anymore unless you are as famous as Robert Scoble. In contrast, a post dealing with “100 Reasons to Love SEO Experts” might strike a chord by now.
30- Map spam – Instead of faking multiple addresses all over the place just to appear on Google Maps and Local why don’t you simply create an affiliate network of real life small business owners with shops and offices who, for a small amount of money, are your representatives there? All they need to do is to collect your mail from Google and potential clients.
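On technique 3, for instance, the white-hat redirect can be done with a couple of mod_alias lines in an Apache .htaccess file; all paths and domains below are placeholders:

```apache
# Permanently redirect an outdated page to its replacement
Redirect 301 /old-page.html http://www.example.com/new-page.html

# Send an entire retired section to the homepage
RedirectMatch 301 ^/old-section/ http://www.example.com/
```

The 301 status tells search engines the move is permanent, so the old page's link value is passed along to the new URL rather than lost.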
Top 5 White Hat SEO Techniques
1. Quality Content
When we first started looking at SEO as a separate entity to website build, there was one phrase that we would continually hear, “content is king”, and it’s true. There is nothing more valuable you can do to optimise your site for search engines than offer unique, well-written content. A search engine’s aim is to serve up what it believes to be the most appropriate website for any given search to the end user.
Imagine we are the end user and we are searching for a portable air conditioner for hire. We go to our favourite search engine and search for the phrase “portable air conditioner hire”. In this imaginary scenario, let’s assume there are only 2 websites that target that phrase:
Website 1
Website 1 consists of a single page with 3 paragraphs of text. The text tells us that the company does portable air conditioning hire and gives us a phone number to call.
Website 2
Website 2 contains 30 plus pages all focusing on various portable air conditioning units that we can hire, costs and technical explanations of how portable air conditioning units work.
Which website do you think the search engine is likely to offer to the user first? It’s a rather obvious example but it illustrates the importance of good content so your priority should be good quality content.
2. Use Structural (Semantic) Mark Up and Separate Content from Presentation
Semantically structuring your mark up helps search engines understand the content of your webpage, which is of course a good thing. Making proper use of heading elements is essential because search engines give more weight to the content within heading elements.
Using CSS to separate the design elements from the content makes for much leaner code and makes it easier for search engines to find what they’re looking for, which is content. Remember content is king!
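As a toy illustration of why structure matters, here is a sketch (purely hypothetical, not how any real engine works) of how a crawler could pull heading text out of a page so those words can be weighted more heavily than the rest of the copy:

```python
# Toy illustration: extract heading text from a page the way a crawler
# might, so those words can be given extra weight. The sample page is
# made up; no real search engine works exactly like this.
from html.parser import HTMLParser

HEADINGS = {"h1", "h2", "h3", "h4", "h5", "h6"}

class HeadingExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_heading = None   # heading tag we are currently inside, if any
        self.headings = []       # collected (tag, text) pairs

    def handle_starttag(self, tag, attrs):
        if tag in HEADINGS:
            self.in_heading = tag

    def handle_endtag(self, tag):
        if tag == self.in_heading:
            self.in_heading = None

    def handle_data(self, data):
        if self.in_heading:
            self.headings.append((self.in_heading, data.strip()))

page = "<h1>Portable Air Conditioner Hire</h1><p>Call us today.</p><h2>Prices</h2>"
parser = HeadingExtractor()
parser.feed(page)
print(parser.headings)
# [('h1', 'Portable Air Conditioner Hire'), ('h2', 'Prices')]
```

Note that the heading text survives even when all the presentational CSS is stripped away, which is exactly why separating content from presentation helps crawlers.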
3. Titles and Meta Data
Providing pages with proper titles and meta data is essential. As discussed in the top 5 black hat SEO techniques section, the meta description and meta keywords elements have been so misused in the past that search engines now regard them as less important; it’s still important to use them, and to use them properly. Titles, however, still carry a lot of weight, and when we think of semantic mark up it is obvious why. The title of anything is a declaration of what the content might be, so make sure your page titles are a true representation of the content of the page.
4. Keyword Research and Effective Keyword Use
Create your website with keywords and key phrases in mind. Research keywords and key phrases you think people might use to find your site. Single words are not always the most effective target; try multi-word phrases that are much more specific to your product or service, and you’ll be targeting end users who are much more likely to want what you are offering.
Use the keywords and key phrases you’ve identified effectively throughout your website. Assign each page two or three of the keywords you’ve identified and use them throughout all the important elements of the page. Those are:
Title
Meta Description
Meta Keywords
Heading Elements
Text
Alt Attributes
Title Attributes
Links
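The checklist above lends itself to a quick self-audit. The sketch below is illustrative only; the element names and the sample page are made up for the example:

```python
# Quick on-page self-audit sketch (hypothetical page data): report which
# important elements of a page contain a target keyword, following the
# checklist above. Substring matching keeps the example deliberately simple.
def keyword_placement(page, keyword):
    kw = keyword.lower()
    return {element: kw in text.lower() for element, text in page.items()}

page = {
    "title": "Portable Air Conditioner Hire | Acme Cooling",
    "meta_description": "Hire portable air conditioners by the day or week.",
    "h1": "Portable Air Conditioner Hire",
    "body": "Our units are quiet, efficient and delivered next day.",
    "img_alt": "portable air conditioner unit",
}

report = keyword_placement(page, "air conditioner")
print(report)
# {'title': True, 'meta_description': True, 'h1': True, 'body': False, 'img_alt': True}
```

A report like this makes it obvious where a page is missing its target phrase; here the body copy never mentions it.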
5. Quality Inbound Links
Having inbound links to your website can be likened to receiving votes; but there are good links and bad links, and therefore good votes and bad votes. Good links are links from web pages that are regarded highly by the search engines and are contextually relevant to the content of your page. Bad links are links from web pages that aren’t regarded highly, or are potentially banned by search engines, and have no relevance to the content of your page.
For example;
Imagine we have a website that sells telephones.
Link A: a link on the homepage of the British Telecom website = Good
Link B: a link on John Smith’s Beer and Ale appreciation links page = Bad
The number of quality inbound links to your site therefore has some bearing on how high up the search engine results your site is placed. When sourcing links, you should be thinking of quality over quantity, and deep-linking to pages within your website, not just the home page.
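The "votes" idea can be sketched with a toy PageRank-style iteration. Real ranking algorithms are far more elaborate, and the link graph and site names below are entirely hypothetical:

```python
# Toy "votes" model in the spirit of PageRank (real ranking is far more
# elaborate): pages that are linked to by well-regarded pages score higher,
# so a vote from a well-linked site is worth more than a vote from a
# backwater links page.
def link_scores(links, iterations=20, damping=0.85):
    pages = set(links) | {t for targets in links.values() for t in targets}
    score = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            if targets:
                # each page splits its current score among the pages it links to
                share = damping * score[page] / len(targets)
                for t in targets:
                    new[t] += share
        score = new
    return score

# Hypothetical graph: bt.com and beerlinks.com both link to our phone shop,
# but bt.com itself receives many inbound links, so its vote counts more.
links = {
    "bt.com": ["phoneshop.com"],
    "beerlinks.com": ["phoneshop.com"],
    "a.com": ["bt.com"], "b.com": ["bt.com"], "c.com": ["bt.com"],
}
scores = link_scores(links)
print(scores["bt.com"] > scores["beerlinks.com"])  # True
```

Even in this toy model, the link from the well-regarded site ends up carrying more weight, which is the whole point of chasing quality over quantity.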
Wednesday, April 21, 2010
Black Hat Search Engine Optimization
Black Hat search engine optimization is customarily defined as techniques that are used to get higher search rankings in an unethical manner. These black hat SEO techniques usually include one or more of the following characteristics:
breaks search engine rules and regulations
creates a poor user experience directly because of the black hat SEO techniques utilized on the Web site
unethically presents content in a different visual or non-visual way to search engine spiders and search engine users.
A lot of what is now known as black hat SEO actually used to be legitimate, but some folks went a bit overboard, and these techniques are now frowned upon by the general SEO community. Black hat SEO practices may provide short-term gains in rankings, but if you are discovered using these spammy techniques on your Web site, you run the risk of being penalized by search engines. Black hat SEO is basically a short-sighted solution to a long-term problem: creating a Web site that provides a great user experience and everything that goes with it.
Black Hat SEO Techniques To Avoid
Keyword stuffing: Packing long lists of keywords and nothing else onto your site will get you penalized eventually by search engines. Learn how to find and place keywords and phrases the right way on your Web site with my article titled Learn Where And How To Put Keywords In Your Site Pages.
Invisible text: This is putting lists of keywords in white text on a white background in hopes of attracting more search engine spiders. Again, not a good way to attract searchers or search engine crawlers.
Doorway Pages: A doorway page is basically a “fake” page that the user will never see. It is purely for search engine spiders, and attempts to trick them into indexing the site higher. Read more about doorway pages.
Black Hat SEO is tempting; after all, these tricks actually do work, temporarily. They do end up getting sites higher search rankings; that is, until these same sites get banned for using unethical practices. It’s just not worth the risk. Use efficient search engine optimization techniques to get your site ranked higher, and stay away from anything that even looks like Black Hat SEO.
20 Tools for Tracking Social Media Marketing
Social media websites such as Facebook and Twitter make it easy for people to come together and share opinions, experiences and thoughts on a number of topics. Smart companies understand this and are using the power of social media to connect with and inform their customers, and potential customers. Referred to as "Social Media Marketing", it's a smart way to open the lines of communication between you and your prospects.
Social media activities run the gamut from Blogging, micro blogging sites such as Twitter, social networking communities such as LinkedIn and Facebook, video and music uploading sites, discussion forums, photo sharing and more. With so many different sites and ways to participate, it can be difficult to keep track of all your efforts.
Participating in social media doesn't take a lot of money, but it is very time consuming, and businesses want to know that all of this investment in time is paying off. Before launching a campaign, you should have a firm grasp on what it is you're trying to accomplish. Is it increasing website traffic? Getting more ezine subscribers? Having more people download your free ebook or whitepaper? Or maybe you just want to work on your company's brand image. Whatever it is, you need to have a plan. As the old saying goes, "If you don't know where you're going, you'll never get there". Have your game plan intact before getting started in marketing yourself, or your company, with social media.
There are many different forms of social media, so it's impossible to use them all. Pick three or four, and funnel the majority of your efforts there. Even if you won't be working them all, at the very least you should claim your name or company name on as many social services as possible. You don't want to find out later that someone has the user name that you want. If you need to see if your chosen user name is available try http://Namechk.com which checks dozens of social media networking and bookmarking sites all at once to see if it's available. Claim your name now so you won't end up being sorry later.
So how do you monitor all the buzz? How do you monitor your brand and protect your hard earned reputation? I thought you'd never ask. There isn't one fool-proof method but there are many services and tools out there that will make it easy to see who's talking about you online. Some are free and others will make you pull out your wallet.
These "online reputation management" tools, as they're often referred to, help you define keywords or phrases you wish to track, and then watch for any mention of your company name, products, or services. It's important to defend and monitor your online reputation. Similar to High School reputations, protecting your image online is the name of the game, and just as in real life, everyone has one to maintain.
Let's take a look at some of the measuring and tracking tools at your disposal:
1) http://BackTweets.com : A search engine for Twitter. See who's tweeting your links and more. Can also sign up for email alerts of new findings.
2) http://Addictomatic.com : A little different from the others: you type in a keyword, topic or phrase and out it goes, searching the top blogs, news sites, Google, Technorati, Ask, YouTube, Flickr, Digg, Topix and more. You'll be given a personalized results page to bookmark with everything it finds related to your topic.
3) http://Buzzoo.net : All about Internet buzz, it tracks several different websites to bring you what's "hot" right now.
4) http://Surchur.com : Search for the latest and greatest on topics that are popular right now. Type in a keyphrase and it searches blogs, social news sites, photo and video sites for your chosen topic.
5) http://Commentful.Blogflux.com : This service watches for comments on blog posts, Digg, Flickr, and others and notifies you of any findings.
6) http://AlertRank.com : A better way to organize and sort Google alerts. Get a daily report emailed to you in a spreadsheet format of what it finds.
7) http://BoardTracker.com : A search engine for forums only. Monitor discussion boards and be notified by email when a thread matching your search terms is discovered. Free to use.
8) http://www.google.com/alerts : I've been using this "secret weapon" for years. Simply type in your name or company name and receive daily emails of results found. They do the work, you receive the links. Free and nice.
9) http://BrandsEye.com : An online reputation management tool with a real-time, concise overview of your online reputation. Multiple levels of services and pricing available. Starting at $1.00.
10) http://Twazzup.com : Another Twitter only search engine.
11) http://SiteMention.com : Type in your url and find out what's being said about you. The results returned are gathered from Google Blog Search, Twitter, FriendFeed, YouTube, MySpace, Digg, Delicious and many more.
12) http://Brandwatch.net : This service tracks your brands, your companies, even the competition. Sign up for free weekly updates on any brand. Their detailed reports break down which sites like you, your most talked-about features, and a weekly summary of all blog and forum activity. Very similar to the old "press clipping" service.
13) http://Trackur.com : A tool that scans many websites including blogs, news, image and video sites, forums and notifies you of any mention of your brand, products/services. Easy to use and affordable. Prices vary depending on need, a personal account is only $18.00 a month, corporate account $88.00 a month with other options also available. Try a "personal" account free for 14 days.
14) http://FiltrBox.com : This one searches online news sources, Twitter and others to find out what's being said about you or your company. Pricing is based on the number of users, but there is a free version that provides "5 filters" and 15 days of what they call "article history".
15) http://SocialMention.com/alerts : Just like Google Alerts but for social media. Enter your keyword phrase and email address to be notified of any new findings. Searches blogs, microblogs like Twitter, bookmarks, comments, events, images, news, videos and more.
16) http://BlogPulse.com : A search engine that searches only for data posted to blogs. Enter your keyword, hit submit and off it goes to gather results.
17) http://BackType.com : Billing itself as a "conversational search engine" they index millions of conversations from social networks, blogs and other social media.
18) http://sm2.techrigy.com : Industry insiders claim this to be the leading social media monitoring solution online. Choice of free or paid version. Free is limited to five searches and 1,000 results. There are three paid professional levels: Gold, Diamond, or Platinum.
19) http://ReputationDefender.com : This paid service finds out everything there is to know about you online, and if negative information is found they try to have it removed. Different types of plans are available such as "My Reputation", "My Privacy", starting at only $14.95 a month.
20) http://Topsy.com : Topsy will track your tweets that have been retweeted so you can find out who's been sending you all that "link love". Type in your Twitter user name and you'll be amazed at what you find.
If you'd like to track incoming traffic from your various social media profiles, an easy way to do it using Google Analytics can be found here http://Tinyurl.com/kuc9rL
Just as there are many ways to market your company using social media, as you can see, there's a multitude of tools and services at your disposal to track and see if all of that hard work is paying off.
Smart companies realize the importance of social media in their marketing efforts and are utilizing it on some level.
How smart are you?
SEO Dirty Tricks vs Unconventional Website Marketing
SEO dirty tricks fall into a number of different categories, everything from spamming message board postings with certain keywords and links, to keyword cloaking and lots in between. I am certainly not advocating the use of any dirty trick, or ‘black hat’ (look up the term in Google to get a greater understanding of this rather clichéd phrase) method.
What I am recommending is that sometimes you look just beyond the fold at slightly unconventional methods. For example, contact your nearest competitor and suggest you both do a link swap; some people will like your site, some will like theirs, so you could benefit from each other and drive traffic both ways. Who knows what might come next: shared-cost advertising, or taking both families to Bermuda on vacation (who knows what the future holds for us).
The search engine companies themselves have become much cleverer with their ranking methods, and dirty tricks in the SEO of a website are more often than not picked up on. If you use above-board techniques to raise your site's ranking, aka 'white hat' methods, you will certainly get to where you want to go – but do not forget that most other sites are doing exactly the same thing, so look at what else you can do outside of SEO to build up some much-needed customers.
Think ‘out of the box’ (I hate that phrase, but what else to use!), and by ‘out of the box’ what we mean is this: what car do you drive? Have you thought about spending a couple of hundred bucks to get it re-sprayed as an advertisement for your website? Not just adding your website name to the car itself, because that would be boring, and who actually visits a website address just because they saw the name of it? No, what we are talking about here is a full-blown advertisement for your site on your car, your mate’s car, your dad’s car: who wants it, who drives the furthest, and so on.
Think ‘out of the box’ and certainly undertake ‘white hat’ SEO but also add unconventional methods to your promotional campaigns.
Going back to the dirty-tricks side of things, never forget that Google and other search providers can see right through them. It’s all about coming at it from a different angle and finding a more unconventional way to improve your site’s popularity.
One of these dirty tricks, and probably the best known, is SEO cloaking, which is almost like a deceptive sleight of hand. A consultant can be so adept at cloaking and deceptive practices that an article a consumer thinks will be about the Rock and Roll Hall of Fame ends up being about a rock and geode collection. You can end up thinking you are about to read new information on Bob Dylan and end up getting new information on good old Britney (A Ha) Spears, thanks to SEO cloaking and its deceptive ways!
To re-cap on all of this: avoid ‘black hat’ techniques, but do not avoid unconventional marketing. He who dares, wins!
Where to Use Keywords For Search Engine Optimization (SEO)
Keywords are, simply put, the fuel of SEO. All the effort put into fine-tuning your website for top rankings on search engines will not pay off if the targeted keyword is not right. The key is to use the most relevant and important keywords in different parts of the website. Instead of using all keywords at once, it is advisable to spread them across several portions of your website, because search engines give higher relevance to certain areas of an HTML document.
Choosing the right keyword is the first and the most critical step of any SEO Campaign.
Following are the Best Practices for how to use your keywords:
In the TITLE Tag – Search results always show the title tag as your page title. The title must be short and precise, say 6-12 words. The keyword can be anywhere in the tag, but preferably near the beginning.
In Anchor Text – Anchor text with keyword focus from external links is an important factor; such inbound links count not only for the site but for the keyword as well.
In Headings – Keywords placed in the H1 heading are an important factor, but the page should have actual text relevant to the keyword. Try to keep the heading precise (7-10 words). It is good to have the keyword as the first word of the heading.
In Domain Name – Incorporating vital keywords in the domain name is a very important factor.
In URL – A keyword in the URL helps a lot in ranking well, but the keyword should also appear in other parts of the document. Do not over-optimize, or you might be penalized.
Density – Keyword density is a measure of the relevance of the web content. Keep it below 10%: up to 7-8% for the 2-3 targeted keywords, and 1-2% for minor or secondary keywords on a page.
In the beginning of the document – It is a good practice to have the keyword in first 25-50 words of the document. Be it a table or a paragraph, try to keep it in the beginning of the document.
In Alt Attributes – Label the images on your page with ALT/TITLE attributes containing relevant keywords, as spiders do read them.
In other Headlines – It is a good practice to have the keywords in other headlines (h2 – h6).
In Meta Tags – Yahoo and Bing still rely on them; Google does not. Having these tags properly filled will not harm you, so just do it.
Proximity – This is the closeness between two or more keywords. The closer the keywords, the more likely the page is to rank higher. It is best to have the keywords one after the other, i.e. with no words in between them. E.g. “discount perfume” is better than “discount on perfume”.
In Image Names – E.g. perfume.jpg – it is good to have keywords in image file names.
Secondary Keywords – Targeting and optimizing secondary keywords can be beneficial, as they may have a lot less competition than the most popular keywords. E.g. “discount Calvin Klein perfumes” will get fewer hits but better-targeted traffic.
Synonyms – Using synonyms on web pages is good, as search engines have algorithms that take synonyms into consideration for keyword matching and ranking, especially for sites in English, though this is not the case for many other languages. For instance, on the perfume site, “fragrance” is a synonym. Include synonyms on the page, but do not rush to optimize every synonym.
Worst Practices:
Misspellings – Spelling mistakes do not make a good impression. If your target keywords have popular misspellings, you might want to optimize for them, as they can bring you more traffic; but it's better not to misspell in your visible copy. If you do target popular misspellings, do so in the meta tags.
Dilution – Excessive optimization for unrelated keywords will hurt the performance of all your keywords. Yes, even the major ones will be lost.
Stuffing – A keyword density of more than 10% is keyword stuffing and puts you at risk of being banned by search engines.
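The density rule of thumb above is easy to check mechanically. This sketch counts exact phrase matches; the 10% cutoff mirrors the article's guideline, not any published search engine threshold:

```python
# Rough keyword-density check: what share of the page's words belong to
# exact occurrences of the target phrase? The 10% threshold follows the
# rule of thumb above, not any published search engine cutoff.
import re

def keyword_density(text, keyword):
    words = re.findall(r"[a-z0-9']+", text.lower())
    kw = keyword.lower().split()
    n = len(kw)
    # count exact phrase matches by sliding a window over the word list
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw)
    return 100.0 * hits * n / len(words) if words else 0.0

text = ("Discount perfume for everyone: our discount perfume store stocks "
        "every discount perfume brand you can name.")
density = keyword_density(text, "discount perfume")
print(round(density, 1))  # 37.5
print("stuffed!" if density > 10 else "ok")  # stuffed!
```

Three occurrences of a two-word phrase in a sixteen-word passage gives 37.5%, well past the 10% stuffing line; a page like this would be a candidate for rewriting.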
Tuesday, April 20, 2010
Purchasing Domain Names - What You Should Know
1) Accredited Registrars
All domain names must be sold by accredited Registrars certified to do so by ICANN. Registrars are required to follow the procedures set forth by ICANN, giving consumers a dispute organization in the event one is needed.
Registrars are required to pay a fee to ICANN for the purchase of each domain name. One of the areas that separates registrars is the price charged to the consumer. Domain name registration varies from $4.99 at the low end to $65.00, depending on which registrar you choose. Services offered with domain name purchases also vary depending on the registrar selected, such as DNS, forwarding, email, hosting and parked pages.
The services offered are just as important to purchasing a domain name as the name itself. If you own a .com domain name and purchase the .net name, then free forwarding would be a great deal. If you want to point the domain name via DNS, then not only is free DNS pointing required; if you have no experience with DNS, support to set up DNS records will also be required. All registrars are not created equal, and some sell low-cost domains but with limited support.
When selecting a registrar, look at your short-term and long-term needs, whether it is one domain or multiple domains. Although a low cost looks good up front, do your research and make sure all your needs will be met. When shopping for a domain, contact the registrar and ask questions like: Do you have 24/7 tech support? Does your support cover DNS record setup and advanced record setup? What will you be expected to do, and what will they do for you?
2) Domain Resellers
Resellers are partners of accredited registrars, reselling their products and services, but are not accredited to sell domain names. When purchasing a domain name from a reseller, be prepared to be patient. Most offer very little to no technical support or rely on the accredited registrar for support.
There are also times when the reseller goes out of business without notifying the domain owner, leaving them with little idea of who to contact if the domain requires DNS record changes or even a simple renewal.
Domain resellers are the least-known group of domain sellers on the internet. When searching for a registrar, look at the site you are on and check whether it lists itself as an ICANN-accredited registrar. Domain sellers not displaying this are almost guaranteed to be resellers. Check icann.org and review the list of accredited registrars to see whether the company you are looking to purchase your domain from is accredited or not.
3) Domain Deals
Searching the Internet for a registrar will lead a surfer to many results. Beware of some of the results and domain gimmicks. Yahoo offers domains for a low price with its hosting, but it is not an accredited registrar or a reseller; Yahoo uses a third-party company that is an accredited registrar. Most consumers do not read the Terms of Service when making a domain purchase; doing so would show that Yahoo states it merely assists you in purchasing your low-cost domain.
Unless you read the Terms of Service, you may not know that your domain will be registered somewhere else, which could lead to long-term issues. These issues include missed domain renewal notifications, or confusion when you want to transfer the domain to a new registrar.
Check out all domain offers, read the fine print, and make sure you know who you are really dealing with.
4) Domain Protection?
When you purchase a domain, it is yours to use for the period you selected at registration. Once it is purchased, many registrars automatically apply a service that prevents your domain from being transferred to someone or somewhere else without your permission. This service has different names depending on where your domain is registered, but it may be called domain lock or domain protect.
Domain protection is a valuable service and should always be left on unless you intend to transfer your domain to a new registrar. Never allow anyone to tell you to turn off your domain protection for any reason other than a transfer. Many times a hosting company or web designer will tell a client to turn off the lock so they can set up services, but this is not required to set up any service.
5) Additional Domain Services
When going through your purchase flow you will be offered a number of additional services, none of which are required. Web hosting will be needed if you do not have your own server, but if you just plan on parking the domain or not using it, why get hosting? Email for your domain can be handled by many Internet Service Providers, such as Verizon. Check with your provider and ask whether you can set up your domain and receive email using your current service. If so, there is no need to buy the email service.
Domain registration with a particular registrar does not mean you must use all or any of the services they offer. You can purchase a domain with a registrar and host with a different company.
Private Registration is the only service that must be purchased from your registrar. This product removes your personal information from the WHOIS database and makes ownership of your domain anonymous.
Google's Goal of Quality Search
Google started as a high quality search engine and remains the best search engine today. It has stayed true to its original intent: to be a search engine that not only crawls and indexes the web efficiently but also produces more satisfying results than other existing search engines. To deliver the best search results, Google knew right from the start that it had to be designed to keep up with the web's growth. According to Brin and Page, "In designing Google we have considered both the rate of growth of the Web and technological changes. Google is designed to scale well to extremely large data sets. It makes efficient use of storage space to store the index". They knew they would need a great deal of space to store an ever-growing index.
Google's index size, which started out as 24 million web pages, was large for its time and has grown to around 25 billion web pages, still keeping Google ahead of its competitors. However, Google is a company that doesn't settle for just beating the competitors. They truly aim to give their users the best service there is and that means as a search engine they want to give users access to all or at least most of the quality information that is available on the web.
Google's New System for Indexing More Pages
As mentioned earlier, Google aims to give access to even more information and has been devoting much time and effort to realizing this goal. The new patent entitled 'Multiple Index Based Information Retrieval System', filed by Google employee Anna Patterson, might be the answer to the problem. The patent, published in May of 2006 but filed back in January of 2005, shows that Google may actually be aiming to expand its index to as much as 100 billion web pages or even more.
According to the patent, conventional information retrieval systems, more commonly known as search engines, are able to index only a small part of the documents available on the Internet. According to estimates, the existing number of web pages on the Internet as of last year was around 200 billion; however, Patterson claimed that even the best search engine (that is Google) was able to index only up to 6 to 8 billion web pages.
The disparity between the number of indexed pages and existing pages clearly signaled the need for a new breed of information retrieval system. Conventional information retrieval systems simply were not capable of indexing enough web pages to give users access to a large enough share of the information available on the web.
The Multiple Index Based Information Retrieval System, however, is up to the challenge and is Google's answer to the problem. Two characteristics of the new system make it stand out from conventional systems. One is its "capability to index an extremely large number of documents, on the order of a hundred billion or more". The other is its capability to "index multiple versions or instances of documents for archiving...enabling a user to search for documents within a specific range of dates, and allowing date or version related relevance information to be used in evaluating documents in response to a search query and in organizing search results."
With the new system developed by Patterson, Google now has the ability to expand its index size to unbelievable proportions as well as improve document analysis and processing, document annotation, and even the process of ranking according to contained and anchor phrases.
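The version-aware, date-ranged retrieval the patent describes can be illustrated with a toy sketch. This is not Google's implementation; the documents, dates, and the `search` helper below are invented purely to show the idea of querying multiple archived versions of a page within a date range:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class DocVersion:
    url: str
    crawled: date
    text: str

# A toy "archive" index holding several crawled versions of documents.
index = [
    DocVersion("ex.com/a", date(2004, 3, 1), "old draft about search"),
    DocVersion("ex.com/a", date(2005, 6, 1), "revised page about search"),
    DocVersion("ex.com/b", date(2005, 2, 1), "unrelated page"),
]

def search(term, start, end):
    """Return archived versions containing `term` crawled within [start, end]."""
    return [d for d in index
            if term in d.text and start <= d.crawled <= end]

hits = search("search", date(2005, 1, 1), date(2005, 12, 31))
print([d.url for d in hits])  # ['ex.com/a']
```

Only the 2005 version of `ex.com/a` matches both the query term and the date range, which is exactly the kind of version-and-date filtering the patent says the new system enables.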
History of Google's Index Size
Google started out with an index of around 24 million web pages in 1996. By August of 2000, Google's index had grown to approximately one billion web pages. In September of 2003, Google's front page boasted an index of 3.3 billion web pages. Microdoc, however, revealed that the actual number of web pages Google had indexed by then was already more than five billion. In their article 'Google Understates the Size of Its Database', they emphasized that Google specialized not only in simplicity but also in understating its power and complexity. Google was still ahead of its competitors and continued to surprise everyone with what it had up its sleeve.
As Google's index continued to grow, the number on its front page grew impressively large as well, before plateauing at eight billion web pages. This was around the time Patterson filed the new patent. Then in 2005, with controversy over index sizes growing, Google stopped publishing the count and simply claimed that its index was three times larger than the nearest competitor's. Google also maintained that what mattered was not just the number of indexed pages but how relevant the returned results were.
Then in September of 2005, as part of Google's 7th anniversary, Anna Patterson, the same software engineer who filed the patent on the Multiple Index Based Information Retrieval System, posted an entry on Google's official blog claiming that the index was now 1,000 times larger than the original. This pegged the index at around 24 billion web pages, about a fourth of Google's goal of indexing 100 billion. It seems, then, that Google must have started using the new system in mid-2005. With it in place, we can only wait and see how fast Google reaches the goal of 100 billion web pages in its index. Most likely, though, once Google reaches that goal it will set an even higher one in order to keep providing quality service.
Google will remove all search queries after 18 to 24 months
By the end of 2007, Google expects to purge important identifying information on its computer servers about the sources of virtually all search queries after 18 to 24 months.
Subsequently, the company will have access to only partial records, so that no one can trace the queries back to individual users.
Google's move is intended to comply with various foreign laws and proposed legislation dictating that Web sites must keep user information for up to two years in case it is needed for legal proceedings. Similar rules are under consideration in the United States.
Google is the first major search engine to set a time limit for retention of search information, which can reveal a great deal about an individual such as whether they're sick (as indicated by a number of queries about cancer) and political affiliation (demonstrated by searches for certain blogs).
Until now, the company kept search logs indefinitely, raising criticism that the data could be misused by Google, law enforcement or marketers. Google said the changes are in response to feedback from privacy groups and government agencies, including the Norwegian Data Protection Authority, which raised concerns about Google's existing practices. The new policy, Google said, provides more transparency to users about data retention and better protects their privacy.
Kurt Opsahl, an attorney for the Electronic Frontier Foundation, a digital rights group, gave measured praise to Google's decision, calling it a step in the right direction.
He asked that Google similarly purge information collected about users of its other products, such as YouTube. Retention of search records emerged as a hot-button issue last year after a demand by the Justice Department that several Web sites turn over query data became public.
Yahoo Inc., Time Warner's AOL and Microsoft Corp. handed over the information, to the consternation of many privacy advocates, but Google fought the request in court and ultimately got the amount it had to provide reduced.
Separately, AOL made a high-profile blunder by posting 19 million search queries online as part of a research project. Ostensibly anonymous, the information was used to identify some of the users responsible for the queries, prompting a public apology by the Web site and a series of resignations.
"By taking some technical measures to anonymize this data, there is an extra layer of protection," Opsahl said. "You can't disclose what you don't have."
As part of the new policy, Google will erase eight of the bits that make up an Internet Protocol address, known commonly as an IP address, that identifies the computer used to make a search query. It will also make cookies -- the small files that help track user visits to specific Web sites and preferences -- anonymous.
After the plan is implemented, Google intends to keep the partial records and associated search query terms, explaining that the information will help the company improve its services and help detect fraud.
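The bit-erasing step described above can be sketched in a few lines. This is an illustrative approximation, not Google's actual code: zeroing the final eight bits of an IPv4 address maps it onto a block of 256 possible machines, so the stored value no longer singles out one computer:

```python
import ipaddress

def anonymize_ip(ip: str, bits_to_erase: int = 8) -> str:
    """Zero out the low-order bits of an IPv4 address.

    With the default of 8 bits erased, 256 different source
    addresses collapse onto the same stored value.
    """
    addr = ipaddress.IPv4Address(ip)
    mask = (2**32 - 1) ^ (2**bits_to_erase - 1)  # e.g. 255.255.255.0
    return str(ipaddress.IPv4Address(int(addr) & mask))

print(anonymize_ip("203.0.113.87"))  # 203.0.113.0
```

Because the erased bits are simply gone, even a subpoena cannot recover the exact source address from the retained partial record, which is the "you can't disclose what you don't have" point Opsahl makes above.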
Next Generation SEO Tactics
Many refer to these evolving formats and scripting platforms under the name AJAX, and no, this does not refer to the popular cleaning agent under your kitchen sink! AJAX (Asynchronous JavaScript and XML) brings the kind of rich interactivity usually associated with desktop applications to web-based programs such as Google Maps. If you have used Google Earth, you will realize how powerful and revolutionary these new applications can be; not to mention, they are a whole lot of fun.
Where Did The Name Web 2.0 Come From?
Many point to Tim O'Reilly, the constant innovator of many technological changes on the web. O'Reilly has been at the forefront of discussions and conferences on the nature and substance of the 'meme' of open source platforms dominating the new social media. How all this new media plays out is anyone's guess, but all webmasters should optimize their sites for this new Web 2.0 and take full advantage of all the SEO possibilities presented by this brave new Internet. Here are a few SEO suggestions you can try:
1. RSS/Blogging: You must place a blog and RSS feed on all your sites. This is a fairly simple procedure with free server-based programs such as WordPress. Having a blog and RSS feed will place your site into the whole tagging process. Each category you create in your blog will be seen as a tag by sites such as Technorati. RSS stands for 'Really Simple Syndication', and your RSS feeds will get your content distributed across the web. A simple and easy way to tap into the new Web 2.0 universe.
2. Create some Google Juice!: Join as many of these highly interactive sites as you can: MySpace, YouTube, Del.icio.us, Digg, Wikipedia ... my favorite is Squidoo, where you can create Lenses on different topics that interest you. User-driven content that's utilized by all the major social media sites. Of course, link back to your sites in your posts and creations in these user-created content havens and watch your PR ratings go way up.
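The RSS feed recommended in suggestion 1 is just a small XML document, and if your publishing software doesn't produce one, you can generate it yourself. Here is a minimal sketch using Python's standard library; the site title, link, and post entries are placeholders:

```python
import xml.etree.ElementTree as ET

def build_rss(site_title, site_link, posts):
    """Build a minimal RSS 2.0 feed string from (title, url) pairs."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = site_title
    ET.SubElement(channel, "link").text = site_link
    for title, url in posts:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = title
        ET.SubElement(item, "link").text = url
    return ET.tostring(rss, encoding="unicode")

feed = build_rss("My SEO Blog", "https://example.com",
                 [("Web 2.0 Tactics", "https://example.com/web20")])
print(feed)
```

A real feed would also carry per-item `description` and `pubDate` elements, but even this bare skeleton is enough for aggregators to pick up and syndicate your content.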
3. Use Interactive Scripts: Place interactive JavaScripts and platforms on your own sites. Have membership forums, polls, blogs, feedback forms, user contributions... to build unique content-driven sites. Become the spider!
4. Tagging (Folksonomy): Be constantly aware of the tags (keywords) you're creating with your blogs and sites. This can have a very beneficial effect on your traffic and rankings. Closely relate these tags to the content on your sites and build higher rankings in all the major search engines.
5. The Long Tail: Especially important for affiliate marketers, you need to cover special niches where there is less competition and content. These narrow niches make up a large portion of the whole vast web, and creating content in these unique areas will get your site included in the search engines a lot quicker and keep it there a lot longer.
6. Holistic Web 2.0: Be constantly vigilant in placing your sites in the whole 'Interactive Game', building links and partnerships with the important YouHubs: MySpace, Del.icio.us, YouTube, Digg, Squidoo... the more connections you have, the more your own sites will prosper.
Be The Spider!
No doubt, Web 2.0 will play an ever increasing role in the development and evolution of the web. Make sure your sites are optimized and in the 'You' game. Create blogs, RSS feeds, interactive forums, membership areas, and user-generated content, and truly make your sites interactive havens in their own right. Just remember to tag everything and your sites will reap the benefits of this new Web 2.0 generated SEO gold rush.
Google History
- Age of site
- Length of time domain has been registered
- Age of content
- Frequency of content: regularity with which new content is added
- Text size: number of words above 200-250 (not affecting Google in 2005)
- Age of link and reputation of linking site
- Standard on-site factors
- Negative scoring for on-site factors (for example, a dampening for websites with extensive keyword meta-tags, indicative of having been heavily SEO-ed)
- Uniqueness of content
- Related terms used in content (the terms the search-engine associates as being related to the main content of the page)
- Google Pagerank (Only used in Google's algorithm)
- External links, the anchor text in those external links and in the sites/pages containing those links
- Citations and research sources (indicating the content is of research quality)
- Stem-related terms in the search engine's database (finance/financing)
- Incoming backlinks and anchor text of incoming backlinks
- Negative scoring for some incoming backlinks (perhaps those coming from low value pages, reciprocated backlinks, etc.)
- Rate of acquisition of backlinks: too many too fast could indicate "unnatural" link buying activity
- Text surrounding outward links and incoming backlinks. A link following the words "Sponsored Links" could be ignored
- Use of "rel=nofollow" to suggest that the search engine should ignore the link
- Depth of document in site
- Metrics collected from other sources, such as monitoring how frequently users hit the back button when SERPs send them to a particular page
- Metrics collected from sources like the Google Toolbar, Google AdWords/Adsense programs, etc.
- Metrics collected in data-sharing arrangements with third parties (like providers of statistical programs used to monitor site traffic)
- Rate of removal of incoming links to the site
- Use of sub-domains, use of keywords in sub-domains and volume of content on sub-domains… and negative scoring for such activity
- Semantic connections of hosted documents
- Rate of document addition or change
- IP of hosting service and the number/quality of other sites hosted on that IP
- Other affiliations of linking site with the linked site (do they share an IP? have a common postal address on the "contact us" page?)
- Technical matters like use of 301 to redirect moved pages, showing a 404 server header rather than a 200 server header for pages that don't exist, proper use of robots.txt
- Hosting uptime
- Whether the site serves different content to different categories of users (cloaking)
- Broken outgoing links not rectified promptly
- Unsafe or illegal content
- Quality of HTML coding, presence of coding errors
- Actual click through rates observed by the search engines for listings displayed on their SERPs
- Hand ranking by humans of the most frequently accessed SERPs