Monday, October 20, 2008

So Google can NOW Index Flash? Really....

Every so often I hear that Google can now index Flash files, sometimes presented as if it's a great gift to the world: we can all finally shed the archaic HTML system and build a beautiful new Flash-based Internet that we'll all be better off for.

This time it started again with some big official announcements back in July.

And there's a lot of discussion about it again, mainly around the same topics that have come up before.

So is it real this time?
Yes, it's real, but there is still a big difference between being indexed and being found on the first page of results.

Even now that Google can index Flash files, it is still difficult, and will likely always be difficult, for Google to understand the hierarchy of information in a Flash file, so don't think Flash is going to be at the top of the search results any time soon.

Here are a few examples of the problems with Flash, even now that Google can index the content:

  1. What constitutes a page in a Flash file? A Flash file appears as a single page, and there is no way for Google to externally reference a specific section of the Flash. Google likes to send visitors to the most appropriate page that matches their search, and visitors like going to the most relevant content. But you can only send someone to the home page of a Flash file. This is not in Google's interest, the site owner's interest, or the visitor's interest, so this factor alone makes it very unlikely that Flash files are going to start competing for top rankings.
  2. Where are the titles, headlines, subheadlines and most important content? Since Flash designers aren't constrained by the HTML conventions for defining title tags and headline tags, they may all do it differently, making the context and importance of content difficult to determine.
  3. Images: Many Flash files make extensive use of images and video. Google is still unable to parse words out of images, so if your navigation or content is rendered as images, it will not be indexed.
  4. Meta data: HTML has standards for defining information about the information (meta tags, robots.txt files, noindex and nofollow directives, XML sitemaps) that help Google understand the content on a site. None of these exist in Flash.
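To make the contrast concrete, here's a sketch of the HTML metadata machinery listed in point 4. The site name and content are hypothetical, for illustration only:

```html
<!-- A sketch of standard HTML metadata (hypothetical page) -->
<html>
<head>
  <title>Acme Widgets - Blue Widgets for Sale</title>  <!-- page title: a strong relevancy signal -->
  <meta name="description" content="Blue widgets, shipped worldwide.">
  <meta name="robots" content="noindex, nofollow">     <!-- per-page crawler directives -->
</head>
<body>
  <h1>Blue Widgets</h1>  <!-- headline tags signal the hierarchy of content -->
  <p>Our most popular widget, available in three sizes.</p>
</body>
</html>
```

A robots.txt file and an XML sitemap play similar roles at the site level. None of this structure has an equivalent inside a .swf file.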
These are a few of the reasons why it will always be difficult to optimize, and get good search engine rankings for, a 100% Flash website. Add to this the fact that it's too easy for Flash designers to invent non-standard navigational schemes that make a site difficult to navigate, and it still makes no sense to me why anyone would build a 100% Flash website (except in cases where the site is intended to provide a unique "experience" and search engine visibility is completely unimportant).

This does not mean that you shouldn't use Flash. A marketer can still take advantage of all the benefits of Flash by chopping it into elements that can be included in an HTML layout. The site can even appear as if it were 100% Flash. The difference is that the site would be structured in a standard way that provides context for the content, and the content would remain in a form that can be externally referenced (you can send someone to a specific page). When building a site like this, it's important to consult an SEO professional on how the Flash content is built to maximize visibility.
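As a sketch of that approach, a Flash element can be embedded in a standard HTML page with crawlable fallback content inside the object tag. The file names and content here are hypothetical:

```html
<!-- A Flash banner inside an otherwise ordinary HTML page -->
<div id="banner">
  <object type="application/x-shockwave-flash" data="banner.swf"
          width="600" height="120">
    <param name="movie" value="banner.swf">
    <!-- Fallback: shown when Flash is unavailable, and readable by crawlers -->
    <h1>Acme Widgets</h1>
    <p>Hand-made widgets since 1998.</p>
  </object>
</div>
```

Because each page of the site keeps its own URL, title tag and headings, Google can still send visitors to a specific page even though the visible design is Flash.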

It turns out I wrote a post about the difficulty of indexing flash and ranking in the search results with a flash site back in 2006. I wonder if I'll write about it again in 2010? My guess is yes.

Thursday, October 16, 2008

Joe the "Plumbers" Win the Google Lottery

Move over Joe Sixpack, meet Joe the Plumber. Joe was mentioned 25 times during last night’s Presidential Debate, primarily by John McCain in a last desperate attempt to get something to stick to Obama. Joe has become an overnight star: The Guardian reports everyone on Facebook wants to be his friend; people are “tweeting” about him like crazy; he’s rocketing to number one on YouTube.

Anybody need a plumber?

It occurred to me to check where Joe is on Google this morning. No surprise, he had already made it to the top on Google Canada, right behind the guy who might just be the luckiest plumber in Canada this morning: Whitby’s Joe the Plumber. In fact, across the world there are more than a few ordinary Joes that might have a right to feel like they’ve won the lottery. I’m talking about Joe Lara (joelaratheplumber.com), Joe from Seattle (joetheplumberseattle.com) and the grand prize winner, Joe from Amarillo – joetheplumber.com. They should probably all invest in larger servers – and hire a few more plumbers. After all, from now on, what’s the key phrase that will come to mind next time anyone needs a plumber?

And while these average Joes may get pushed aside by Joe the Plumber on Google for a little while, it won't be long before his 15 minutes of fame go down the drain. Unless he wants to pay a plumber in Amarillo a sink full of cash for joetheplumber.com. The only downside for Amarillo Joe? There goes his tax cut.

Tuesday, October 07, 2008

October is a Good Month for Linking Info

October 2008 is a good month for anyone interested in Link Building for SEO. Three things have caught my attention:

1) On October 6th, SEOmoz unveiled their flagship SEO product, a "public secret" for almost a year (previously referred to by the codename "carhole"*). And it's good. In fact, I'd call it a milestone event in the field of SEO. It's called Linkscape.

2) Google announced "Link Week" on their Webmaster Central Blog. Basic, but interesting reading for SEO professionals, considering the source.

3) All three search engines confirmed at an SMX East panel discussion that links are the #1 ranking signal.

Here are the details:

SEOmoz's Linkscape:
What is it? Basically, these guys have built an application to save the entire web on some really big hard drives and then analyze the entire linking structure of the Internet. No kidding! They've essentially tried to recreate the system that Google (and the other search engines) use to index and rank sites but keep so close to their chests, so that SEOs like us can figure out how to manipulate site rankings better.

They haven't saved the entire Internet (yet), but they do have a database of over 30 billion of the most linked pages on the web. The index is already very similar to the ones used by the major search engines. They're updating the whole thing every three weeks or so, and they say they're working to make it bigger and refresh faster. It's a really useful tool for us because it exposes greater granularity and information about links and link graph based metrics.

Here are a few interesting pieces of Internet linking trivia they've pulled from the data so far:

  • Across the web, 58% of all links are to internal pages on the same domain, 42% point to pages off the linking site.

  • 1.83% of all links on the web are nofollowed and of these, 61% are external-pointing, while 39% link to pages on their own site. While those percentages may seem small, that's a massive number (~2 billion links) that are leveraging nofollow for link juice "sculpting."

  • While 0.08% of pages on the web use the 301 redirect, 0.12% (half again as many) employ 302 redirects. Another 0.005% use the meta refresh.

  • About 1.5% of all pages use the meta noindex tag (which is a lot of content the engines don't get to see) and 0.87% of all pages use the meta nofollow tag.

  • From our entire index of pages, the median page received about 77 links (both internal and external), while the average page gets 32. If your pages have more than 32 links, congratulations! You're above average :-)
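For reference, the mechanisms behind those statistics look like this in markup. These are generic illustrations, not examples taken from the Linkscape data:

```html
<!-- A nofollowed link: tells engines not to pass "link juice" through it -->
<a href="http://example.com/" rel="nofollow">Example</a>

<!-- Page-level directives from the stats above -->
<meta name="robots" content="noindex">   <!-- keep this page out of the index -->
<meta name="robots" content="nofollow">  <!-- don't follow links on this page -->

<!-- Meta refresh (the 0.005% case) -->
<meta http-equiv="refresh" content="0; url=http://example.com/new-page">
```

The 301 and 302 redirects, by contrast, are HTTP status codes sent by the server, so they don't appear in the page markup at all.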

Now that data is interesting, but it's nothing compared to what you get when you analyze your own site against your competitors'! We're already using it to our clients' advantage by:
  • Finding the juiciest links of our competitors that we can get, too
  • Finding out which of the links we've purchased, rented or acquired through any other means have the most value so we can focus on the most effective link building techniques
  • Learning which of the links that we have direct control over have the most value so we can channel the Google juice more effectively
Congratulations to SEOmoz on building a great tool.

Google's Link Week
The Google Webmaster Tools team decided to declare "Link Week" and spill some beans on what they think about links. Here are links to all of the related posts:

Top Three Search Engines Agree: Links Are Number One
During the search engine panel at SMX East, all three search engines concurred that links are the number one indicator of a website's importance when ranking it (another quote from Rand Fishkin's blog for this one):

When asked if links are the primary signal for search engine rankings, the engineers generally agreed that, yes, it probably is. Aaron noted that links are a far less noisy signal than many others, including some forms of on-page keyword use and clicks in the SERPs. Sean from Yahoo! said that while it may not be the "most important signal" by itself, it's more important than, for example, title tags (which SEOs generally agree are critical to the SEO process). There was no mention that links would be fading away anytime soon - or that any competing signals had yet entered the marketplace as a potential usurper.

So Until Next Time, Keep on Linking....

* Carhole is a reference to The Simpsons - it's what Moe called Homer's garage in the episode where Homer was running a counterfeit jeans operation out of his.

Friday, October 03, 2008

Important Google Adwords Changes

It's been two weeks since the implementation of the new Quality Score changes at Google. What have the effects been?

(If you don't know, Quality Score is a factor Google uses to determine how much you have to pay for any position in the sponsored search results for a key phrase. A low Quality Score will force you to pay more than others for clicks in the same position.)

Quality Score is getting more important - and more manageable
Since Quality Score focuses heavily on click-through rate, it does put pressure on advertisers to get more in line with Google's interests, but those aren't entirely against advertisers' interests either.

Now that Google shows us a score on a scale of one to ten, versus the old display of a few words like "poor", "good" or "excellent", we can see with greater accuracy where relevancy problems may exist and address them.

How do you improve your campaign?
Tightening the focus of campaigns and ad groups, matching ads more closely to keywords, building better landing pages - all of these changes will improve your Quality Score, and therefore decrease your average cost per click for the same traffic.

Below is my shorthand version of the factors that affect the Google Quality Score:

  • Click-through rate (based on the keyword and your whole account)
  • Relevancy of ad text to keywords (in the matched ad and the ad group)
  • Quality of your landing page
  • "Other relevancy factors" (no kidding)
The bottom line is we need to keep optimizing
I just heard an interesting tip listening to the PPC Rockstars podcast, with David Szetela interviewing Brad Geddes:

Consider how the display URL is a factor. In display URLs, the root domain has to match your destination, but everything else is OK to change. This is important to understand: giving searchers information about where they are going to land, in the form of the display URL, helps.

Tweaks like this influence click-through rate, which of course influences Quality Score, which in turn helps us pay less for the same traffic. Or, more likely, keeps us from paying more.