Website trust and authority are completely different things, though they are commonly lumped together and confused. Website trust is essentially a measure showing search engines that a website provides real value and does not engage in unethical techniques (aka black hat seo). Where your backlinks come from is an important factor in domain / website trust. If you have backlinks from highly trusted editorial websites (newspapers, universities, government sites, professional associations, etc.), then you are more likely to be trusted than a website whose only backlinks come from spammy sites or bad neighborhoods.
The other side of determining trust is who you link out to. If you link to non-reputable and spammy resources, then your website will lose trust. One last factor may play into the trust assigned to a certain domain or website: registration details. Google is a registrar of domains; thus they can see who owns which domains, and they already know what these websites link to. If the majority of your portfolio consists of spammy or link-farm-like websites, one could assume that, on the whole, domains with your contact details could be flagged as having a greater chance of not being trustworthy. These are the factors that play into determining the trust a domain or website has.
Website authority is a bit simpler to understand; authority is a measure of how many websites link to your domain and how these websites link to you. Different links hold different amounts of authority; the sum of these links and the authority associated with them determines your authority. Google PageRank was once a good measure of authority, but it has recently been tainted by manipulative webmasters. Google likely accounts for this tainting by adjusting the trust of these websites. The diversity of a website’s link profile is another factor. If your website has 100 links from 1 website, this will not be as effective at raising authority as having 100 links from 25 different websites.
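To make the diversity point concrete, here is a purely illustrative toy scoring function. The sub-linear (square-root) damping of repeated links from the same domain is my own assumption for the sketch, not Google’s actual formula; it simply encodes the idea that 100 links from 1 website count for less than 100 links spread across 25 websites.

```python
from collections import Counter

def toy_authority(linking_pages, diversity_exponent=0.5):
    """Toy model: count links per referring domain, then damp repeated
    links from the same domain so that link diversity matters."""
    per_domain = Counter(linking_pages)
    # Each domain contributes sub-linearly: extra links from a domain
    # you already have links from add less and less authority.
    return sum(count ** diversity_exponent for count in per_domain.values())

# 100 links from 1 domain vs. 100 links from 25 domains (4 each):
concentrated = toy_authority(["site-a.com"] * 100)
diverse = toy_authority([f"site-{i}.com" for i in range(25)] * 4)
# concentrated -> 10.0, diverse -> 50.0
```

Under this (assumed) damping, the diverse profile scores five times higher even though both contain exactly 100 links.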
Authority can also be considered on a webpage-by-webpage basis, whereas trust is generally viewed as a global website factor. A single webpage of a website may have authority, while the rest of the website might be filled with non-authoritative content. This touches upon the concept of deep linking and the importance of building deep links. Domain / website authority and trust can be broken down into these concepts, but their value to a website in seo is clearly significant.
There is clearly a bias in search engines giving the webpage that is indexed first a boost in the rankings. Another way to look at this ranking factor is as the age of a webpage: an older webpage is given more authority by search engines like Google. By being the first, or one of the first, to post about a new or upcoming topic or keyword phrase, that webpage gains an advantage over a webpage posted a week or a month later. When a new keyword phrase first comes about, there are only a few webpages in existence referencing or mentioning it. As word starts to get around, there is a huge influx of new webpages created to target this keyword phrase; then the number of new webpages drops and stabilizes at a constant level, until a new fad or surge in popularity repeats the same process.
If you are one of the lucky few who first post and provide information regarding a keyword phrase, your webpage will have a distinct advantage over all of the webpages that later target that phrase. I have seen instances where a moderately aged review post, with no links pointing to it and little on-page optimization, on a website with little authority or age itself, outranks a well-optimized and well-linked post on a more authoritative domain.
The first review post had little going for it aside from being one of the first reviews written prior to the huge spike of posts that followed. The second website posted their review after this spike and likely ranked for the keyword phrase because of their quality on-page optimization, links, and the authority of the website. In addition to the benefits provided by search engines, being the first to post also gives other advantages. When that huge spike occurs, and even afterward as the topic dies down, people who write about it will be more likely to cite and/or link to your webpage because of its age and its visibility in search engines.
Better visibility in search engines essentially means that the higher you rank for a keyword phrase, the more likely people are to link to you simply because you rank for that phrase. Granted, having quality content helps increase the chances of people linking to you, but even if your content is subpar compared to other websites in the rankings, people will still link to the higher-ranked website more often than not. The benefits of being the first to post are important with regard to search engine rankings and linking. Instead of trying to catch up with all of the runaway popular keyword phrases, try to be one of the first to post about a new up-and-coming topic.
In order to compare search engine ranking factors, one must choose an seo testing ground environment. These testing grounds vary greatly, and the ability to attribute change to a ranking factor’s level of significance varies in reliability along with them. This post was inspired by choosing your seo testing grounds, and it is my variation on what the search engine testing ground models currently are. These seo testing ground models can briefly be summarized as the following:
- The “real” testing ground: This is the serps for a key phrase or key phrases that people actually search for; webmasters are constantly doing promotion work on the websites that rank for these search terms. There is a lot of noise and competition within this testing ground, and attributing change to an seo ranking factor has a low level of reliability.
- The “semi-real” testing ground: This is the serps for search terms that contain real words with meaning, but whose words are rarely, if ever, combined (ex. red coconut technology). This seo testing ground provides an environment of real websites to rank against. These real websites do provide the link addition and search history parameters that replicate the “real” testing ground to a point.
- The “semi-virgin” testing ground: This is the serps for nonsense key phrases that are no longer optimized for. Old seo contests are perfect candidates for this realm, such as seraphim proudleduck or nigritude ultramarine. These testing grounds are quite useful, but they do not provide the semantic relevance or other algorithm parameters that define real serps.
- The “virgin” testing ground: This is the serps of a keyword that has no results in any search engine prior to the test. New or existing websites are used in these tests, and different factors can easily be compared in this environment. The seo expert performing the test completely controls this testing ground and the parameters of each website involved. This is as far from the “real” testing ground as one can get, since link addition, semantic relevance, and search history do not exist.
These are the varieties of seo testing grounds that exist. Each of these testing environments has its benefits and disadvantages. The level of noise and competition within each testing ground clearly has an effect upon the reliability of the results. Before engaging in search engine optimization testing, pick your testing ground carefully and acknowledge the live and controlled factors within each environment.
According to Matt Cutts, PageRank sculpting is now dead and has been for some time; people simply didn’t notice the change in PageRank designation that Google quietly made sometime in 2008.
At first, we figured that site owners or people running tests would notice, but they didn’t.
Classic PageRank (ignoring the decay flow factor of 10-15%) worked like this: a site with “10 PageRank points” and 10 outgoing links (5 of which are nofollow) would give each link without nofollow 2 PageRank points. Now, however, a site with the same structure will only give each of the links without nofollow 1 PageRank point. Links with the nofollow attribute are still allocated their respective share of PageRank, but when the time comes to assign it, this PageRank simply disappears.
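The arithmetic above can be sketched as a toy calculation. This is only an illustration of the before/after split described in the paragraph (still ignoring the 10-15% decay factor), not Google’s actual algorithm:

```python
def pagerank_per_followed_link(points, total_links, nofollow_links, old_model):
    """Toy model of the sculpting change: how many PageRank points
    each followed (non-nofollow) link receives."""
    followed = total_links - nofollow_links
    if old_model:
        # Old behavior: nofollow links were excluded from the split,
        # so all 10 points divided among only the 5 followed links.
        return points / followed
    # New behavior: every link gets an equal share, but the shares
    # assigned to nofollow links simply evaporate.
    return points / total_links

# The example from the text: 10 points, 10 links, 5 of them nofollow.
pagerank_per_followed_link(10, 10, 5, old_model=True)   # -> 2.0
pagerank_per_followed_link(10, 10, 5, old_model=False)  # -> 1.0
```

The total PageRank passed on drops from 10 points to 5; the other 5 points vanish with the nofollow links.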
What does this mean for webmasters? Matt Cutts suggests that you let PageRank flow freely throughout your website; I agree with him to an extent, but I think that people will still try to PageRank sculpt in a new way. This new PageRank sculpting will encourage webmasters to reduce the number of links on a webpage, or to hide links from PageRank flow by using iframes or other techniques, to ensure that the remaining links on the webpage get the maximum flow of PageRank. Some webmasters might even consider removing comments from their blogs because all of the links from comments are diluting the PageRank flow throughout their websites.
PageRank is clearly a much more complex topic than this. For instance, link placement on a webpage affects the authority it is rewarded; a navigational sidebar link is going to be designated less PageRank than an in-context link. Time will tell if people start this new PageRank sculpting, but I doubt it will be worth the time it takes to implement, and its implementation will likely make a website awkward. If it does become a new practice for optimizing websites, one thing I believe is that Google will pick up on it and change their algorithms accordingly.
It is still quite early, but Microsoft’s Bing has had a second good week according to ComScore:
Bing has gained 3% of the search engine market share for Microsoft compared to where they were pre-Bing, in terms of number of searchers and total number of queries. “It appears that Microsoft Bing has continued to generate interest from the market for the second consecutive week,” said Mike Hurt, ComScore senior vice president. “These early data reflect a continued positive market reaction to Bing in the initial stages of its launch.” Bing has held strong for 2 weeks, but the true question is: will this gain in market share stick, or will it fade away?
Recently, I created another blog with the theme of controversial issues. I bought the domain name controversialissues.net and later purchased the .org version. The .net held a stable rank of #14 in Google for the search phrase “controversial issues”. After buying the .org version, I proceeded to 301 redirect the .net to the .org. The blog did the Google dance, bouncing between #3 and #19 for the keyword phrase “controversial issues”, before finally stabilizing at #9 in Google for that phrase. The blog also gained search engine rankings for other search phrases.

During the time that the blog changed domain extension, no changes were made to the content, nor were any links added (to my knowledge). This means that all factors affecting the search engine rankings were equal except for the domain extension. From this experiment, one might conclude that the .org extension ranks higher than the .net extension. In general, whether a website has a .org or .net extension should not matter for seo purposes, as search engines treat gTLDs (generic top-level domains) the same. It is possible that in this particular niche of medical, government, social, etc. topics, the .org extension carries more trust than its .net counterpart, which would account for the boost in search engine rankings.

Outside the realm of seo, the .org may also be preferable to the .net in certain situations. Many people still view .org websites as more trustworthy than a .com or .net, because the majority of people believe an organization is in charge of the website. This is not actually true; there are no restrictions on who can own a .org domain name. For this reason, many webmasters choose the .org over the .net in certain niches, such as health, medicine, and law. This experiment and these facts are some of the reasons that the .org can be seen as superior to the .net.
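For reference, the kind of 301 redirect described above is commonly done with Apache’s mod_rewrite. This is a minimal sketch of a hypothetical .htaccess file on the .net host (assuming an Apache server with mod_rewrite enabled), not the exact configuration I used:

```apache
# .htaccess on controversialissues.net — permanently (301) redirect
# every URL to the same path on controversialissues.org
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?controversialissues\.net$ [NC]
RewriteRule ^(.*)$ http://controversialissues.org/$1 [R=301,L]
```

The R=301 flag is what signals to search engines that the move is permanent, so the old domain’s rankings and links are consolidated onto the new one rather than split across both.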