I have a long history of working on search engine projects, and nothing reveals the tricks like building a search engine to see how it works. One project was the Initiative for the Evaluation of XML Retrieval, where I worked largely with universities on advanced ontological and syntax mechanisms; another was the GoXML Contextual XML Search Engine, launched in 1998 only days after the XML recommendation was finalized. GoXML is still featured on PatentStorm.
SEO has little to do with indexing the content of a page nowadays. That only gives you a starting reference point: the indexbot parses your page, notes some particulars, and assigns you a weight for various terms.
The key metrics are:
1. Domain name matches the search string (note: since hyphens and periods are removed during the webbot's normalization process, www.ford.com and www.f-or.d.com are treated as equal). Few people know this since they do not write code to parse domain names. The hyphens are removed because few people search on hyphens, and the search engine index needs to be as efficient and lean as possible.
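A minimal sketch of that normalization step, assuming the simple strip-and-compare behavior described above (the real webbot logic is not published):

```python
def normalize_domain(domain: str) -> str:
    """Sketch of the normalization described above: hyphens and periods
    are stripped so that visually different domains collapse to the
    same index key."""
    return domain.lower().replace("-", "").replace(".", "")

# Under this scheme the two domains from the example collapse together:
normalize_domain("www.ford.com") == normalize_domain("www.f-or.d.com")  # True
```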
2. The number of relevant sites pointing at a site is very important. I showed some Adobe colleagues how to use this to our advantage to beat out Microsoft and Sun for the term “Enterprise Developer Resources”. All I did was ask everyone to add a signature to their email that said “Adobe Enterprise Developer Resources – http://www.adobe.com/devnet/lifecycle” and then go about our normal business of posting to public threads. The index assumed we must be relevant, given that the other top sites seemed to have links pointing at Adobe's site. In reality, these were only archived email threads with the signature being treated as a link. All the search engine saw was “
3. How people click on the top ten search results in Google. Google uses an adaptive algorithm, a variation of the GoXML algorithm which I co-wrote; we had 51 unique patent points on this in 1998. When you click on one of the top ten results, Google simply tracks the result via a pass-through URL. You can see this in action by doing any search on Google, then right-clicking a result link and copying it. Where you see www.adobe.com/devnet/livecycle/ (and get that URL if you copy/cut the visible text), the link you copy by right-clicking actually translates to http://www.google.com/url?sa=t&ct=res&cd=1&url=http%3A%2F%2Fwww.adobe.com%2Fdevnet%2Flivecycle%2F&ei=JzZkRvzMEZqUgwPkmNmKBw&usg=AFQjCNGx7iKEUn38Kcfk8woBnWtcNueL9g&sig2=ope-x2wZBZhBXtNlk_fj0w
A case study is “Architectural Patterns Metamodel”. Matt Mackenzie and I wrote a variation of the Gang of Four's template for architectural patterns using UML 2.0 and linking known uses. It is now ranked #1 since it is the most used template among software architects. See http://www.google.com/search?hl=en&q=architectural+Patterns+meta+model&btnG=Search
Note that the click is referenced by the unique IP address bound from the incoming HTTPRequest header, so you cannot spoof it without additional tricks. Since you first have to receive the callback code from Google to build the new outgoing request in order for the click to register, there is almost no way to spoof it ;-)
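You can unpack such a pass-through link yourself; the real destination sits percent-encoded in the `url` query parameter. A small sketch using the standard library (the other parameters, like `ei` and `usg`, are opaque tracking values):

```python
from urllib.parse import urlparse, parse_qs

def redirect_target(google_url: str) -> str:
    """Pull the real destination out of a Google /url? tracking link
    like the one quoted above. parse_qs also decodes the
    percent-encoding for us."""
    query = parse_qs(urlparse(google_url).query)
    return query["url"][0]

link = ("http://www.google.com/url?sa=t&ct=res&cd=1"
        "&url=http%3A%2F%2Fwww.adobe.com%2Fdevnet%2Flivecycle%2F"
        "&ei=JzZkRvzMEZqUgwPkmNmKBw")
redirect_target(link)  # "http://www.adobe.com/devnet/livecycle/"
```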
4. The meta tags are indexed and useful, but only up to a certain point. Many people who have no clue how the code works try in vain to do things like META Content=”mountain, bike, mountain bike, mountain bike clothing”, etc. The truth is that the meta keywords are parsed and normalized, stripping out the commas and spaces except for a single delimiter to separate the array. All the indexbot sees from the above example is “mountain:bike:mountain:bike:mountain:bike:clothing”. Any repeated word is generally disallowed completely and interpreted as spamdexing the bot.
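A sketch of that parse-and-normalize step, assuming the simple split-on-delimiters behavior described above:

```python
import re

def normalize_keywords(meta_content: str) -> list[str]:
    """Sketch of the normalization described above: commas and runs of
    whitespace both act as delimiters, leaving one word per entry."""
    return [w.lower() for w in re.split(r"[,\s]+", meta_content) if w]

def looks_like_spamdexing(words: list[str]) -> bool:
    """A repeated word in the normalized keyword array is treated as a
    spamdexing signal."""
    return len(words) != len(set(words))

words = normalize_keywords("mountain, bike, mountain bike, mountain bike clothing")
# words == ["mountain", "bike", "mountain", "bike", "mountain", "bike", "clothing"]
looks_like_spamdexing(words)  # True: "mountain" and "bike" repeat
```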
5. Any keywords that do not appear at least once in plain text in the body are heavily discounted. The exception is when the core content of the page has no visible words; then the indexbot defaults to whatever it has to work with to establish the baseline weights.
6. Any keyword that accounts for more than approximately 7% of the total word count of the body is discounted as spam. (Note: I have not been able to verify this lately, but it was true in the early part of the decade.)
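The density check is easy to run against your own copy. A minimal sketch, using the approximately-7% figure quoted above as the threshold:

```python
def keyword_density(body_text: str, keyword: str) -> float:
    """Fraction of the body's words that match the keyword."""
    words = body_text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

SPAM_THRESHOLD = 0.07  # the ~7% figure quoted above

# 28 words total, 8 of them "bike": density is about 28.6%,
# well over the threshold.
body = "bike " * 8 + "trail ride gear shop map " * 4
keyword_density(body, "bike") > SPAM_THRESHOLD  # True
```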
8. Google overlays the search matrix with an ontology classified by first-order logic that separates all results into a modal array. The ontological nodes are also ranked at the meta level based on the preceding metrics and mixed into the pages dynamically, within the constraints defined by their librarians. That is why a term like “washington” will have results for the president, the state, the university, the actor, etc., all in the top ten. One way to trick this is to find the least common context and build a site that ranks #1 for it. Once you have done that, replace the words with the context of your choice and you will usually stay in the top ten, since the visibility draws
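One way to read the "modal array" behavior is that the engine keeps only the top-ranked result per context so that the top ten mixes contexts. This is my own hypothetical interpretation, not a published algorithm; a minimal sketch:

```python
def diversify(results):
    """Hypothetical sketch of the behavior described above: walk the
    ranked list and keep only the first (top-ranked) result for each
    context, so different contexts share the top ten.
    `results` is a ranked list of (title, context) pairs."""
    seen, mixed = set(), []
    for title, context in results:
        if context not in seen:
            seen.add(context)
            mixed.append(title)
    return mixed

ranked = [
    ("George Washington", "president"),
    ("Washington State", "state"),
    ("Washington Nationals", "sports"),
    ("First U.S. President", "president"),   # duplicate context, dropped
    ("University of Washington", "university"),
]
diversify(ranked)
# ["George Washington", "Washington State",
#  "Washington Nationals", "University of Washington"]
```

The "least common context" trick then amounts to picking a context bucket with little competition, since each bucket contributes a slot to the mixed page.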
I have other tricks, and using them I have never failed to get a site into the top three for its terms, including “mountain bike”, aromatherapy, whistler rentals, enterprise developer resources, and many others...
Oh yeah - these are the tricks I am willing to share. I am still keeping some others as closely guarded secrets. It's not hard to figure out since it is all based on simple logic. Enjoy and good optimizing. Post your success back here if you find this helped.
Some good information on SEO. I have a long way to go, obviously. A query that you may be able to address: how do the search engines treat redirect links? I am writing articles and would like to have the signature go to an affiliate link via my website. At the moment it just goes to a simple file on my site with a redirect link to the affiliate. An example of the script is at http://www.ronskruzny.com/dg.html
location = "http://www.marketingtips.com/backstage-pass/t/898269";
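One relevant difference, not spelled out in the thread: a JavaScript redirect like the one above is invisible to crawlers that do not execute scripts, while a server-side 301 (permanent) redirect announces the destination in the HTTP response itself. As an illustration only (not a recommendation from the blog author), a minimal Python sketch of the 301 approach for the same affiliate destination:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# The affiliate destination from the quoted script above.
AFFILIATE_URL = "http://www.marketingtips.com/backstage-pass/t/898269"

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # A 301 puts the destination in the HTTP headers, where a
        # crawler can see it without running any JavaScript.
        self.send_response(301)
        self.send_header("Location", AFFILIATE_URL)
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the example quiet

# To serve it: HTTPServer(("", 8000), RedirectHandler).serve_forever()
```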
I printed out the blog and then had another read of it. I was particularly intrigued by #8. I read this as meaning there is a matrix of results that are then presented together; each column in the matrix would correspond to a separate theme that uses the same keyword. How would you apply this to gain a top listing? In the "Washington" case, does this mean titling your page "George Washington" and then having all your copy reference "Washington Real Estate", if that was your intention?
Technoracle, fantastic blog mate.
I have a question mate.
You spoke about getting a site to number 1, then changing the content with the site maintaining a high search ranking. How does this work, mate?
Online Marketing Australia
It is because of Google's dynamic page-ranking algorithm that your content will stay near the top, even if you change it. It does depend on how often the Google bots re-index, however: pages reindexed more frequently will be subject to re-ranking sooner than those that get reindexed once a year.
It stays on top as long as the content is relevant and searchers do not bounce from the page. If you change the content so that it is no longer relevant to the searchers, Google detects that they bounce off the page and your page gets ranked lower. There is really no logical reason to keep an irrelevant page at the top of the search results though, is there?
LOL - I see you are already using the linked signatures on this blog. Well done! Let me know if it works well.
AFAIK, Google maintains a log of not only the link (URL) but also the unique cookie values used by the user agents and the IP addresses. This is what makes it hard, if not impossible, to spoof on a large scale. I would presume that while you can still manipulate it manually, the engineers at Google probably figured that anyone smart enough to do that would also notice that putting up content irrelevant to searches does not benefit them, so it would not be something they would do.
You cannot do this with a script unless you have the script running on machines with different IP addresses. The search values returned by Google have keys in them which are generated based on those IP addresses. The same IP address presumably will not count more than once, and you cannot spoof the IP addresses: the search results would be returned to the wrong IP address, denying you the ability to use them. By doing the same search on different machines using Google's personalized search page, you can probably ascertain which keys change, but unless you know the hashing or other algorithm they use, you cannot beat the system.
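As a purely hypothetical illustration of such an IP-bound key (the secret, the input format, and the HMAC construction here are all my assumptions, not Google's actual scheme), a token like this cannot be replayed from a different IP address:

```python
import hashlib
import hmac

# Hypothetical server-side secret; in the real system the keying
# material and algorithm are unknown to outsiders.
SECRET = b"hypothetical-server-side-secret"

def click_token(ip: str, query: str) -> str:
    """Bind the tracking key handed out with each result to the
    searcher's IP address."""
    return hmac.new(SECRET, f"{ip}|{query}".encode(), hashlib.sha256).hexdigest()

def click_is_valid(ip: str, query: str, token: str) -> bool:
    """A click only counts if the token matches the IP that received it."""
    return hmac.compare_digest(click_token(ip, query), token)

token = click_token("203.0.113.7", "mountain bike")
click_is_valid("203.0.113.7", "mountain bike", token)   # True
click_is_valid("198.51.100.9", "mountain bike", token)  # False: different IP
```

Without knowing the secret, observing how the tokens change across machines tells you nothing useful, which is the point made above.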
For more details on how this works, go to http://tv.adobe.com and view Duane's World Episode 3.
I am aware that this page is now ironically ranked in the top 5 for "Search Engine Optimization Tricks" in Google and want to discourage people from posting empty comments just to put a link to their page here. If you want to post something relevant and keep a link to your site, that is fine as I would be a hypocrite if I preached this and disallowed it. Please keep your comments focused on active discussion however and if this blog can help you, make use of it. It's here to enjoy.
Also - Many of you have emailed me privately for more information on SEO tips and tricks. The deluge of incoming email is staggering now and I want to post the links to additional information here to share. Here are some great reading links on SEO:
1. 6S's blog on SEO - http://www.6smarketing.com/blog/. These guys offer a lot of free information on this topic (Disclosure: the owners are personal friends of mine but I really think their work is great).
2. Duane's World Episode 3 (an online video with some additional tutorials on this subject)
3. Searchable Flash - a blog post I made in response to some email asking specifically what I knew about Flash and SEO (I am an Adobe employee for people who don't know).
4. SMX - Danny Sullivan and Vanessa Fox's company who offer several conferences on SEO and have tons of great speakers, tutorials etc. Most of what I learned on this subject between 1995-1999 was gleaned from Danny's website.
I took a look at your site and recommend it to my visitors. I agree with you on the importance of becoming valuable in many different areas. I believe that it sustains any entrepreneur during challenges that inevitably occur.
Thanks for the useful info
Here is an old rule! If you want to be really successful in affiliate marketing, you ought to drive traffic to your website. The more visitors to the website, the higher the probability of click-through. Many affiliate guides forget to mention that it is always prudent to build traffic first and then consider affiliate marketing. There is no magic potion: if there is no traffic, there are no profits. Don't worry if you haven't got hordes of visitors; even a few visitors will do initially. Once these visitors start trickling down the web drain, you can place banners and advertising in appropriate places to get the results. A good affiliate marketer doesn't care about the number of clicks but about the average number of clicks per visitor.
Such techniques slowly but surely bring success, and with them comes the potential for much higher rewards.
I am very new to your blog and have watched your video through Adobe on SEO for Flash websites. I have a silkscreen business in Los Angeles, CA. My website is completely Flash, and I was warned about creating it in Flash due to low ranking through search engines. I am convinced there has to be a way to get the job done and keep my Flash site. I will not mention my company due to your advertising fee, which would bankrupt me since I am a fairly new business owner. If there is any guidance, I would appreciate it. Thanks again for all your resources and the advice.
@jbear32 During the Adobe MAX conference process, I was asked to do some deep research to find out exactly how Google uses Ichabod, the headless Flash Player, for indexing SWF content. During the course of this research, we discovered that the actual content on the webpage is almost completely irrelevant to Google. Google primarily uses links to a site and dynamic search engine ranking algorithms that track user interactions to "vote" for sites. Many sites written completely in Flash come up before HTML sites.
The research was compiled into this presentation. You may want to watch this movie.
BTW - also, do not hire any of the people who post comments on this page trying to get link equity from my blog to their SEO pages. If they knew the first thing about SEO ranking, they would know that Google does not count links from Blogger as link equity. They are just demonstrating a complete lack of knowledge on the topic.
Hi! I want to share some thing with you all. i have recently worked with weblarge company. They helped me expanding my market internationally with lowest possible cost. I am great full to them.
NOTE: Because I get a constant stream of SPAM on this post from companies who are advertising SEO services, I have decided to allow the spammers to post on this topic. WHY? Because each of them is demonstrating that they DO NOT UNDERSTAND SEO!!! They think that Google will index and follow their links and their own rankings will rise. Nothing is further from the truth. In fact, Google states they will not follow these links.
THEREFORE, the below listed companies should be avoided at all costs as they are oblivious to how SEO really works.
Let the spam begin!!!
That is really good logic for dealing with people leaving links in their comments, but it will not really stop the spammers.
You are so right. Your comment is in context so it stays. Interestingly enough, since I posted the previous comment not one single SEO firm posted a comment. I dare say they may have learned from this blog. OMG!!!
Peace and love to you