SEO vs SEM – Part One

September 13th, 2008 | by Paul Rushing | Published in SEO

  • Is search engine marketing (SEM or PPC) really a wise investment of marketing capital?
  • Is a dedicated search engine optimization (SEO) investment worth the money and effort?

The obvious answer to both is a resounding yes. But you need to define your goals before starting on either one. They both have their place, but they serve two very distinct purposes.

Too often, though, SEM is used as a crutch to make up for deficiencies in SEO.

You should not have to buy your money terms with pay-per-click advertising. Money terms meaning your store's name, and your brand and location combinations. Ranking for these keywords is child's play in the grand scheme of things, if your site is properly optimized and you obtain a few relevant back links.

Only 25% of your website's search engine presence is determined by your on-site efforts. The other 75% is determined by the number of votes it gets from other sites: back links. However, the 25% factor may make a huge difference in how effective the other 75% is and how long it takes to see results.

Key elements to be concerned with on site:

Meta Data

Meta tags tell the search engines what users will find when they arrive at your site. Think of it as how your site is cataloged in their index.

  1. Page title: This is the most important element. The title describes what your page is all about, very much like a book title. It is also what people see in the search engine results pages and in the browser bar. It gets more weight than anything else from the search engines when they index your site.
  2. Page description: This is a summary of your page, and it is what some of the search engines show as a description of your site under your title in the results. It is very important to include your main “money terms” in your page description.
  3. Keywords: This was once the most heavily weighted of all meta data, but for that very reason it became the most abused: people would stuff in as many keywords as they could imagine to describe their site to the search engines, many times even if they were not relevant. The search engines no longer give keywords much weight, but most SEO experts agree that the spiders still scan them, and if your content matches they do provide some weighting. Having more than eight keywords is a waste of energy; more than six is probably overkill and may even get your site sandboxed.
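As a hypothetical illustration of the three elements above (the dealer name, location and terms are invented, not prescribed):

```html
<head>
  <!-- 1. Page title: the most heavily weighted element -->
  <title>Used Trucks in Springfield | Smith Ford</title>

  <!-- 2. Page description: the summary some engines show under your title -->
  <meta name="description"
        content="Browse Smith Ford's used truck inventory in Springfield.">

  <!-- 3. Keywords: keep the list short and relevant -->
  <meta name="keywords" content="used trucks, Springfield, Smith Ford">
</head>
```

Note that each page would carry its own title and description, matching that page's content.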

Website vendors should give you access to manipulate these items to help you build the relevance of your site. No one can describe your business better than you, so it should not be left up to them. I laugh when I see vendor-built websites that have basically boilerplate meta descriptions with just minor changes for dealer name and location. It is like they just fill in the blanks. Each page on your site should have unique meta data. Each page delivers a different message; let the search engines know what you are showing your visitors.

Heading Summaries:

Heading summaries are the subheadings on your site. These tags help people read your web page, but they also help the spiders understand what your page is targeted at and which parts are most important. For example, you will see different sections of this post show different headlines throughout. This is done through the use of heading tags: <h2> is a second-level title and <h3> a third-level title. “Heading Summaries” above is an <h2> element.
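In HTML, that hierarchy looks like this (the heading text here is hypothetical):

```html
<h1>SEO vs SEM</h1>          <!-- one top-level title for the page -->
  <h2>Heading Summaries</h2> <!-- second-level section title -->
    <h3>Why spiders care</h3><!-- third-level subsection title -->
```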

Image summary:

The search engine spiders cannot see images the way your website visitors can, so you have to tell them what the image is, or what it says, especially if the image is mostly comprised of text.
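In practice this means giving every meaningful image an alt attribute that carries the image's text. A hypothetical sketch (the filename and wording are invented):

```html
<!-- The alt text tells the spider what the banner actually says. -->
<img src="summer-sale-banner.jpg"
     alt="Summer clearance sale: 0% financing on all new models">
```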


The Worst SEO Mistake You Make

By Brandt Dainow

Learn where most companies misstep when it comes to this crucial component of their online marketing strategies.

Search engine optimization, or SEO, is the process of tuning the content and coding of a website in order to maximize its listings in search engines. SEO should be part of every well-rounded online marketing program. Pay-per-click advertising is all very well, but it means you have to pay for every visitor. SEO is about getting free traffic from the search engines. Over the course of two years or more, nothing has a better return on investment than SEO. Thus, if you plan on having a website that runs for more than two years, search engine optimization should be a key part of your online marketing strategy.

I started doing search engine optimization in 1996, when Web Position (the world’s first SEO tool) was in beta. I remember receiving an e-mail from the company that pointed out that its tool would make it possible to sell SEO services to clients. At the time, nobody was doing search engine optimization, but it was instantly obvious to me that such a service would be essential if people wanted to be found on the web. I have now been doing search engine optimization for 12 years — and in some areas I “own” Google.

The most common mistake that organizations make with regard to SEO is bringing their SEO consultants into the process too late. Many companies fail to give SEO its due consideration during a website’s design phase. In fact, many companies don’t give it any thought at all until after a site’s design has been finalized. However, it is during the planning and design processes that SEO considerations are most important and will provide the greatest advantage.

Coding for success

The coding of a site affects search engine optimization in many ways. In fact, coding has a greater impact on a site’s listings in the search engines than the site’s content. Many sites — including those of some top brands — simply cannot be read by search engines at all. If you want to see for yourself, install the Google Toolbar in your browser and start looking at the page ranks that appear when you visit various sites. Page rank is Google’s assessment of the global importance of a site. It will not take you long to find major sites that have no page rank. Unless the site is very new, a lack of page rank means Google cannot read it.

The technology used to build a site has a direct bearing on search engine optimization. For example, most search engines will not read pages if a URL contains a question mark. A question mark indicates that the content is the result of some dynamic process, such as a content management system or PHP. In other words, it tells a search engine that the content is being generated automatically.

When a search engine perceives that content is automatically generated, it has no way of knowing if the content is generated every hour or only once a year. There is typically a delay of six to eight weeks between the time that a site is read by a search engine and the time at which it appears in the listings. Thus, the search engine has no way of knowing whether what it has just read will still be there when it sends a user to the page in a month or two. In short, any page with a question mark in its URL is potentially untrustworthy. It was precisely for this reason that the mod_rewrite module was produced for Apache. (Microsoft has a similar module for IIS.) Mod_rewrite enables you to lay static URLs over dynamic ones. Adding mod_rewrite to a system before you start coding it is a small job. Adding it to a large dynamic shopping site after it is running is a major headache, and may simply be impossible.
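A minimal sketch of what mod_rewrite does, assuming a hypothetical PHP catalog script (the paths and pattern are invented for illustration, not from the article):

```apache
# .htaccess sketch: present a static-looking URL for a dynamic page,
# so the crawler never sees a question mark.
RewriteEngine On

# A request for /products/blue-widget is silently served by
# /catalog.php?item=blue-widget
RewriteRule ^products/([a-z0-9-]+)/?$ /catalog.php?item=$1 [L]
```

The visitor and the search engine both see only the clean `/products/blue-widget` URL; the dynamic script runs behind it.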

If you read Larry Page’s and Sergey Brin’s Stanford University dissertation, describing the algorithms they wanted to use in a search engine, you will find that a great deal of space is devoted to the analysis of the importance of pages according to their position inside the navigation structure of a website. Therefore, how you arrange the pages and how they link to each other has a direct bearing on the search engine optimization of those pages. I have used this information to look at potential site designs and, in some cases, have found that the core content would actually rank as less important than the site’s privacy policy, simply because of the way links were built to the respective sections.

There are many ways of coding the same page, and not all ways are equal to a search engine. Dynamic menus are a case in point. At present, search engines cannot run JavaScript or Flash. The only hyperlinks that they can follow are standard HTML <A> tags. You want search engines to follow your links because that is how they find the pages inside your site. It is therefore important that you create navigation structures that they can follow. Some dynamic menus can be followed by search engines and some cannot. It depends on how they are coded. Generally speaking, menus that are dynamic because of changes to CSS properties are fine. However, those in which the target page is called via programming are not. Once again, it is best to lay considerations like this down during the design brief because changing every link in the site later is expensive.
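The distinction above can be sketched in a short example (class names and URLs are hypothetical). The first menu hides and shows items purely with CSS, so its links remain ordinary <A> tags a spider can follow; the second reaches the target page only through script, so a spider cannot follow it:

```html
<!-- Crawlable: the dropdown is hidden and shown via CSS :hover,
     but the links themselves are plain <a> tags. -->
<style>
  .menu li ul { display: none; }
  .menu li:hover ul { display: block; }
</style>
<ul class="menu">
  <li>Products
    <ul>
      <li><a href="/widgets.html">Widgets</a></li>
    </ul>
  </li>
</ul>

<!-- Not crawlable: the target page is called via programming. -->
<a href="#" onclick="window.location='/widgets.html'; return false;">Widgets</a>
```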

This becomes more important if you plan on having a content management system (CMS). If software is going to be writing your copy, or code, you need to ensure that what it produces is as search engine-friendly as possible. Many content management systems generate horrific code from a search engine point of view. Once again, changing a CMS after it has been deployed is a major nightmare — and often impossible.

Early communication for optimal results

  You often won’t hear complaints from SEO consultants unless search engine activity is absolutely impossible (and sometimes not even then). SEOs are used to dealing with (from their perspective) sub-standard sites, sites that are barely readable by search engines, and sites that contain many problematic elements. SEOs have learned to accept such sites, and they often have no choice but to do the best they can with the garbage they are given by customers. Many SEOs have learned that pointing out problems may result in a client’s deciding to go to a yes-man who will not make waves and is happy to take the client’s money for a year or two while achieving nothing.

If you want to get the most out of search engine optimization, your SEO consultant should be the first person you talk to when developing a site — before you even write a brief and start searching for potential designers. The sites that have had the most success when I’ve worked with them are the ones that asked me to modify their briefs to cover the requirements of SEO. The last time I did this, three of the five design agencies that had been asked to bid withdrew because they could not meet the standards required to make a search engine-friendly site. Throughout the design and construction process, I worked closely with the coders. Most new sites don’t get listed by Google at all for months. Our site was No. 1 in Google within two weeks of launch.

Bring SEO experts into the discussions of what will be built at the earliest possible moment. Don’t let the design agency or your own designers get their feet under the table until you have spoken to the SEO expert.

There are many elements that need to be considered during the SEO process, and these discussions often result in the SEO expert becoming the most unpopular person at the table. Such conversations often degrade into a litany of “no, you can’t do that because the search engines don’t like it,” followed by “no, you can’t do that because the search engines don’t like it.” Companies have to watch their favorite design features drop like flies. Sometimes designers have even gone so far as to accuse me of trying to cripple their designs. But ultimately, it is not the fault of the SEOs; they are just the messengers. They are simply telling you the way things are. When it comes down to it, if you want your site to get listed in the search engines, you have to give the search engines what they want.

Remember: Search engines do not have to list every site on the Web. In fact, despite what they may claim, they don’t even try. All a search engine has to do is provide people with a list of 10 reasonably valid results from which to choose. The lesson: You need the search engines. They don’t need you. Therefore, it is incumbent upon you to understand what they require and give it to them.

Bringing an SEO in after a site is finished is like deciding to do the electrical wiring on a house after you have moved in. By bringing an SEO into the site design process, you can save time and money later. In addition, your site is likely to achieve listings that it could never achieve if SEO were undertaken after the site was already finished.

Design a site for the search engines, and the viewers will follow. Design a site the search engines can’t read, and nobody will ever know it exists.


How to win the search position game

By Chris Lien 

Sometimes being in the third or fourth paid search position is actually more effective — and a lot cheaper — than winning the top spot. Find out when it pays and when it doesn’t.
When marketers buy keywords, they often get caught up in the idea that their ads have to come first on the page — and they pay a premium for that placement. But first position isn’t always the best. Sometimes, position three or four will actually convert at the same or higher rate than position one, and at a fraction of the cost.

First position keywords can cost two or more times what a third position one does, and in some cases, it makes sense to pay that premium (for example, when you’re more interested in brand building than conversion, you’ll want to make sure your brand comes out on top). For campaigns for which conversion and profitability are also factors, position three or four can be better.

But how do you know which keywords should be in position three or four, and which are worth the splurge for the top position? How can you measure and test campaigns to find out which should be top-tier and which should be third-tier? How can you maneuver within Google, MSN and Yahoo to get the positions you want? Here are some tips that should help you win the position game.

Focus on what each click is worth, not on what position it should be in
In general, if a purchase conversion is worth $10, and one out of 10 people purchases, you should pay about $1 per click. You should offer that maximum price to Google (or another search engine) for the specified keyword.

After you launch campaigns, continue to test them for conversion metrics and adjust your top bid accordingly. Many marketers think that if the clickthrough rate is higher, the keyword should be more expensive. But you should determine the value of a keyword based on conversion rate, not clickthrough rate, because you only pay by the click.

Heads or tails?
Head keywords are generic terms that people search while browsing or doing product research, such as “mp3 player.” Head keywords often benefit from being in first position, because they capture a lot of “browsers” who just click on the first link and may be exposed to your site for the first time. These people may not buy now, but they’ll connect with your brand.

Tail keywords are often best in third or fourth position. These keywords are specific and appeal to committed buyers, such as “black ipod nano 8gb.” People searching for these keywords are usually more ready to buy, so they’ll look at — and even click through — several ads to find the best deal, even if that deal appears in a link halfway down the page.

The upshot? Head terms get much more volume and are often more expensive to boot, so to justify your investment you may need to measure carefully which visitors return to your website.

Set a top position
This is a tool on Google you can use to hold your keywords down in the rankings, even if you are bidding enough to be #1. It’s always better to figure out first how much your keywords are worth to your bottom line, and then find out where that places you. But this tool can be useful if you find that position #1 gets a lot of poor quality traffic that never converts.

Focus on the dirty dozen
Most marketers spend the majority of their budgets on a few top keywords, usually about a dozen, which are high volume and have a strong conversion rate. Focus on fixing the position of these keywords first, because correctly placing these top keywords will have the biggest impact on total revenues. Let the others fall where they will according to their conversion rates as described above.

Turn off Google Search and Content Networks
If you don’t opt out of Google’s search partners, such as AOL, your position numbers will reflect a blend of your positions across all of those properties. To get an accurate picture of where your keywords are positioned on Google itself, turn off the additional distributions. You can always turn them back on after you finish your measurement.

Turn off the Google Content Network. Ditto as above
To figure out what your keywords’ true positions are, focus on Google itself, not your position across all its content partners, such as the New York Times and MySpace.

Work weekends
Some keywords perform stronger on the weekend, such as “gardening” or “beach wear,” for example. Set up automatic bid increases for these terms to boost your position solely on the weekends. (Google supports this at the campaign level; MSN supports this at the Ad Group level; and Yahoo doesn’t support it right now.) Remember: These boosts should be based on changes in conversion rates, not click volume. Look for the pattern before you set the boosts.

Pony up for brand and “executive” keywords
If you’re Coca-Cola, you just have to pay whatever it costs to have “Coca-Cola” be in the top position — that’s crucial for your brand. Plus you can use your company name in those brand-term ads, and other advertisers cannot (call the support team at the search engine if you see any violations of this). Likewise, if your CMO tells you the company needs to be in top position for certain keywords, like “digital camera” or “PC” to build your brand in those categories, then just pay what it costs to be in the top spot (and pull the cost from the branding budget!).


SEM, SEO, PPC, CPC... please define

by Jeff Kershner 

I had a nice conversation with someone the other day and we were talking about how it’s so easy to get all these acronyms mixed up. You read an article in one magazine that talks about SEM and the next article you read refers to what seems to be the same thing as SEO. I myself have even been guilty of using different acronyms but not necessarily clarifying with the right terms.

So, I thought I would put together a few of the common acronyms to help clarify. I know these might seem elementary and obvious to many of us but it’s easy to get confused or just forget sometimes.

So let's review:

Search Engine Optimization (SEO):

The term used to describe the technique of preparing your dealership's website to enhance its chances of being ranked in the top results of a search engine when a relevant search is undertaken. A number of factors are important when optimizing a website, including the content and structure of the website's copy and page layout, the HTML meta tags and the submission process. This can also be referred to as Search Engine Positioning (SEP). Some companies commonly include SEO under the same umbrella as search engine marketing (SEM).

Search Engine Marketing (SEM):

The act of marketing your dealership's website via search engines, whether that means improving rank in organic listings (search engine optimization), purchasing paid listings (PPC management), or a combination of these and other search engine-related activities (i.e., local listings, the new Google local coupon, or link development). SEM does not always include SEO, so be sure to clarify this when you are speaking with an SEM vendor.

Cost-per-Click (CPC) or Pay-per-Click (PPC):

This is where an advertiser (or you, the dealer) pays an agreed amount for each click a consumer makes on a link leading to your dealership's website. This is also known as "Paid Placement."

Organic/Natural Listings:

Listings that search engines do not sell (unlike paid listings, CPC and PPC). Instead, your dealership's website appears solely because the search engine has deemed it editorially important for your website to be included. This is where your dealership's SEO comes into play.

There you have it. If anyone would like to share some thoughts or comments, please do so.


How will Flash alter the SEO landscape?

By Michael Estrin
News of Adobe’s decision to work with Google and Yahoo to make Flash searchable spread like wildfire. But so far, agencies aren’t sure what this change really means.

When John Romano, a senior web developer for marketing firm Capstrat, sits down to build a website for a client, he worries about a lot of things. But one concern foremost in his mind is whether anyone will see the cutting-edge work his team is tasked with creating. While Romano’s work is the kind clients pay handily for and users love, it’s not the sort of content that is search engine friendly. But that will soon change, as the two leading search engines and Adobe, which makes the tools Romano uses, have joined forces to help make his work more accessible by indexing the web for rich media files. 

For Romano, and many like him, the problem can be summed up in a word: Flash. Adobe’s powerful multimedia tool has become the instrument of choice for interactive agencies eager to deliver fully immersive online experiences that do more than simply hurl text at today’s fickle users.

But while 98 percent of internet-connected desktops have Flash Player installed, few users are likely to find a Flash-rich website through a search engine.

“Getting Google [and other search engines] to connect users with specific Flash content has been a real problem,” Romano confesses, “and it’s been something the industry has been struggling with for years.”

Since the beginning, search engines have been fixated on text, rather than images or other forms of rich media. The result has been that pages heavy in images and rich media don’t rise to the top of the natural search results, even when they are more relevant than their text-based counterparts. To counteract this problem, digital agencies have employed an array of cumbersome solutions to help users find the more dazzling sites built for major brand clients.

But the solutions — a patchwork of proprietary fixes designed to boost SEO efforts for Flash-heavy sites — have been far from ideal. Often developers find themselves duplicating efforts in both Flash and HTML, which can be both expensive and time consuming. The announcement earlier this month from Adobe, Google and Yahoo could change all that. At least, that’s the plan. But as is often the case, a barrage of questions followed from the agencies charged with leveraging the latest technology development on behalf of their clients. 

So how much does Flash weigh?
Mention the words “SEO” and “change” and you’re bound to get the attention of a lot of people working in interactive. Little wonder. Being found is the name of the game for anyone working on the web. But the decision to begin indexing Flash has raised the web’s constant question: what does this mean for my business?

According to Google and Adobe, developers using Flash won’t need to make any retroactive changes, and they won’t need to do any special work to make their files accessible to the search engine spiders. But finding the Flash content is only the beginning, according to Ivan Todorov, CEO of BLITZ, an interactive agency that has worked with clients ranging from FX Networks to Lincoln.

“In the long-term, we think this will have a huge impact for the future of interactive,” Todorov says. “But right now, the primary concern is how Flash will be weighed by the search engines.”

Unfortunately for Todorov, that question isn’t one Google or Yahoo is likely to answer because it would mean sharing proprietary information related to their algorithms. While Todorov and others say they would like to be part of that conversation — presumably to argue for giving Flash maximum value — agencies are likely to be kept in the dark where SEO is concerned.

But according to Tom Barclay, senior manager, Flash Player at Adobe, all parties fully expect the Flash developer community as well as SEO experts to develop best practices for optimizing rich media content under the umbrella of an Adobe/Google/Yahoo collaboration.

“Existing Shockwave Flash (SWF) content is now searchable using Google search and, in the future, Yahoo search, dramatically improving the relevance of rich internet applications and rich media experiences that run in Adobe Flash Player,” Barclay explains. “As with HTML content, best practices will emerge over time for creating SWF content that is more optimized for search engine rankings.”

But in the meantime, Andrew Lovasz, director of search marketing at Moxie Interactive, says the change is likely to reorder natural search results where smaller operations were benefiting because their competitors were relying almost exclusively on Flash.

“This is definitely going to raise the barrier to entry,” Lovasz says, pointing out that big brands that are more likely to have Flash-heavy sites can expect to see a rise in their natural search results.


The devil in the details
While searchable Flash raises the immediate and obvious question of “weighting” rich media as a content category, the truth of the matter is that the search engine ranking debate will always rage, whether the topic relates to text, Flash, video, audio or any other format. But behind the question of how all this newly ranked content will be integrated into natural search results, agencies will still have to grapple with the mechanics of developing for Flash.

“The headline was really nice to hear,” says Cheryl Haas, VP Fleishman-Hillard. “Hearing that Google, Yahoo and Adobe are all working together is a great start, but I think we’re still a long way off.”

What looks like the proverbial flip of the switch — Adobe’s decision to partner with the two leading search engines — in reality raises a slew of technical questions.

According to Lovasz, and many others, Yahoo, Google and Adobe have been long on excitement, but short on actionable details.

As a simple administrative matter, Google has said that it will take several weeks to index the vast amounts of Flash strewn across the web. Yahoo will begin indexing the web for Flash at an undetermined point in the near future. But while the indexing process is underway, Haas says her team has concerns that neither Google nor Yahoo will be able to crawl JavaScript, which is used to execute Flash content. That’s true, according to Google, but the search giant says it’s working on remedying that, and officials at Adobe say they’re attacking that problem as well.

But Haas’ concerns may highlight a larger problem for Adobe and its search engine partners. While agencies have uniformly praised the news, many have expressed concern that the Flash developer community remains largely in the dark regarding the establishment of best practices for building the Flash sites of tomorrow.

For its part, Google admits that there is no established best practices guide that is endorsed by all three companies. However, Google has its own online resource for developers, as does Adobe.

But a lack of communication — perceived or real — could slow the development of a Flash-friendly web, Romano says, and points out that it will be up to the armies of disconnected developers to figure out the mechanics of this latest tool.

“Our technical people have punched a lot of holes in this, and that’s not surprising given the fact that matching Google’s technology with Adobe isn’t easy,” Romano explains. “This is only the beginning of the solution, and it is likely going to take years to solve because it will require developers to ultimately build Flash sites differently.”

But that doesn’t mean that Adobe is operating independently of all developers. Stephen Jackson, CEO of Smashing Ideas, the largest independent developer of Flash in the U.S., says Adobe works hard to communicate changes with a core group of companies that use its products.

“I think a lot of the disconnect here is that there are millions of Flash users out there,” Jackson says. “So working with all of them makes it rather hard to conduct business.”

What will this mean for interactive?
Across the board, agencies do seem to agree that the decision by Yahoo, Google and Adobe to work together will be a good thing for the interactive advertising business. But just how good is hard to say.

What seems unlikely to some is the idea that improved search optimization for Flash will lead to more Flash development. As Haas put it: “You won’t see people building in Flash just for the sake of having Flash; there has to be a reason.”

But improvements in Flash should have an indirectly positive effect on the overall industry, according to Jackson, who says that getting cutting edge content in front of more users — especially from a Google or Yahoo query — should help drive impressions and clickthroughs.

“It all depends on impressions and clickthroughs,” Jackson says. “If this makes that happen, then you’ll see more advertisers increasing their online budgets.”


No Indexing Guarantee From Google Flash Crawls

By David A. Utter

Crawlers may miss things inside SWF

Google's claim that it pries the text content out of Flash files and makes it searchable may mean less than webmasters think.

Flash represented a surefire way to keep content out of search engines. The algorithms that could chew through massive text files without a hiccup hit a brick wall when it came to the rich media content of a Flash file.

Google became the first engine to enable its crawler to peek inside Flash files and pull out indexable content; Yahoo should offer this at some future date. SEOmoz maven Rand Fishkin said it’s too soon to get excited about Google indexing Flash.

“Flash content is fundamentally different from HTML on webpage URLs, and being able to parse links in the Flash code and text snippets does not make Flash search-engine friendly,” said Fishkin. “But I don’t believe web developers should be any less wary than they’ve been in the past about Flash-based websites or Flash-embedded content.”

Out of several reasons Fishkin listed for keeping a wary eye on Flash indexing, one stood out. “There’s no ‘test my site’s Flash file crawlability’ feature that I’m aware of, leaving us very much in the dark about exactly how the engine’s going to parse your material,” he said.

One should hope that Google will work on such a feature, and add it to their Webmaster Central toolset. If Google demonstrates to webmasters what the crawler sees, that would be a boon for Flash’s owner Adobe.

Webmasters who avoid Flash for SEO purposes now may rethink its use with a reliable method of finding out how well Google indexes a Flash file. That would lead to more sales of Adobe’s developer tools, something we think they want to see.

10 SEO Myths Debunked

Confused about SEO? You’re not alone. We reached out to leading SEO gurus, including Google’s search evangelist and Danny Sullivan from Search Engine Land, to uncover the truth behind the most common myths.

At an ad:tech San Francisco panel on search engine optimization, one fact became apparent almost immediately: There is a ton of misinformation out there when it comes to SEO.

While iMediaConnection covers developments in search faithfully, we often leave it up to our readers to update their understanding of the field. But for this particular story, we elected to take a slightly different approach.

To address some of the more common misperceptions about SEO, we asked several SEO experts to tell us about the most common myths they hear from their clients.

Here’s what we found.

Myth #1: SEO is all about secret tactics

I talk to a lot of people about SEO, plenty of whom are new to it. I’d say the most common myth is that SEO involves all “secret” tactics requiring you to buy links or trick the search engines, and that no one in the industry can be trusted. In reality, there are a lot of simple but effective techniques that even the search engines will tell you to do that can increase traffic. And there are plenty of people who are not snake oil salespeople who can provide this useful service.

A good place to start the process is to look at your analytics. There are a variety of tools, including some from Google, that spotlight if you have problems being accessed by search engines. I also like a top-down approach. You start from the homepage and ensure that it is search engine friendly, then work your way back through the site going down the paths that are most important to your business.
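One of the simplest checks along these lines can be scripted yourself: parsing a site’s robots.txt to confirm that search engine crawlers are allowed to reach your most important pages. The sketch below, in Python, uses a made-up robots.txt and hypothetical paths; in practice you would fetch your own site’s robots.txt and test the pages that matter most to your business.

```python
import urllib.robotparser

# Hypothetical robots.txt content; in practice, fetch
# https://yoursite.com/robots.txt and feed it in the same way.
robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /search
Allow: /
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# Check your most important pages first (paths are examples).
for path in ["/", "/inventory/new-cars", "/admin/reports"]:
    allowed = parser.can_fetch("Googlebot", path)
    print(f"{path}: {'crawlable' if allowed else 'blocked'}")
```

A page that your own navigation depends on but that robots.txt blocks is exactly the kind of problem the panelists mean by “simple but effective.”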

Myth #2: SEO means optimizing only for Google

True, Google is the dominant search engine in many parts of the world, accounting for 60 to 90 percent of all search traffic. But if you think all search engine optimization is for Google, you have missed the online marketing love boat and should return to work at your mimeograph machine.

Yahoo, MSN and hundreds of special interest sites, along with vertical or category-specific search engines, are crawling and indexing your content. The art and science of SEO includes optimizing for vertical information sites, news and social groups as well.

So, what’s the best SEO strategy? While being aware of technological pitfalls and linking advantages is important, stop optimizing for Google and start optimizing for your intended audience. Building search-friendly sites in a content-friendly environment is the best way to win.

Kevin Ryan is vice president, global content director at Search Engine Strategies and Search Engine Watch

Myth #3: Submitting your site to thousands of directories helps

I get countless spam emails promising to get me the top listings in Google by submitting my site to thousands of web directories. It’s easy for anyone to start a web directory these days. Just buy some web directory software, and you’re good to go. That’s the danger! There is a proliferation of web directories from all the web entrepreneurs using web directory software, or some kind of PHP directory script.

Many web directories are brand new “out-of-the-box,” and they don’t have authority, an aged domain, or a strong inbound link profile. So submitting to these directories will not provide the substantial SEO lift you might hope for. The reality is that some of those submissions may actually put your site in a “bad neighborhood” and hurt your SEO efforts.

Here are some factors to look for in a quality web directory:
1) Quantity of inbound links
2) Quality of inbound links
3) Age of domain
4) Topical relevancy to your site
5) Human-edited is better than automated because editorial control tends to lend itself to quality
6) How frequently the directory gets crawled (check the Google cache)
7) The directory itself ranks in the search engines — this can be a sign of authority and can drive clickthrough traffic
8) Whether its links are direct, static links or redirects to your site

Bottom line: Web directory submissions do help. However, it’s better to cherry-pick a handful of the most reputable, authoritative web directories than to take the easy way out and shoot yourself in the foot by using an automated process to submit your site to thousands of directories.
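The last factor on the list, direct versus redirected links, is easy to audit by script. The rough Python sketch below flags links that go through a redirect script or carry rel="nofollow"; the HTML sample and the redirect-URL patterns are illustrative, not an exhaustive list of what real directories use.

```python
from html.parser import HTMLParser

# Common signs that a directory link is redirected or devalued
# rather than a direct, static link (patterns are illustrative).
REDIRECT_HINTS = ("out.php", "redirect", "go.php", "track")

class LinkAuditor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.direct, self.indirect = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href", "")
        nofollow = "nofollow" in attrs.get("rel", "")
        if nofollow or any(hint in href for hint in REDIRECT_HINTS):
            self.indirect.append(href)
        else:
            self.direct.append(href)

# Hypothetical directory listing page.
sample = """
<a href="http://example-dealer.com/">Example Dealer</a>
<a href="/out.php?id=42">Another Site</a>
<a rel="nofollow" href="http://third-site.com/">Third Site</a>
"""

auditor = LinkAuditor()
auditor.feed(sample)
print("direct:", auditor.direct)
print("indirect:", auditor.indirect)
```

Only the direct, static links pass link equity; a directory whose listings all route through a tracking script offers little SEO value.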

Myth #4: SEO is free

Just because it’s not “paid search” (SEM), doesn’t mean it’s free.

The costs associated with SEO are:
1) SEO consultant
2) Programmer/graphic designer
3) Link development
4) Do-it-yourselfer’s time (based on hourly rates)

Depending on the website and campaign objectives, an SEO campaign can cost anywhere from a few thousand dollars per month to tens of thousands.

Metrics to measure SEO success are:

1) Keyword ranking
2) Website traffic
3) ROI
4) Brand awareness/brand engagement

Sandler’s practice appears as the top result (behind a directory) on Google for the combined terms “SEO Consultant.”

Myth #5: Keywords need to appear everywhere

A popular myth (brought on by people reading old SEO information that is not relevant to the current marketplace and optimization software that was programmed many years ago) is that you should put your keywords everywhere to rank as best you can. The truth is that Google’s current relevancy algorithms favor more natural writing that includes a more diverse and realistic set of text with more variation in it. Some common variation strategies include using both the plural and singular versions of a keyword, changing the order of words in a phrase and adding relevant modifiers to page titles and headings.

Four or five years ago, if you wanted to rank for “credit cards,” you would put that phrase in your page title, in an H1 tag on the page, and in most of your inbound anchor text to that page. If you wanted to rank for the same phrase today, you might put a modifier word or two in the page title, opting for something like “Compare Credit Cards Online.” Within the page copy, the heading might be something more like “Apply for a Credit Card Today.” Rather than focusing on the core phrase, this strategy still gets you decent coverage for it, but it also helps the page rank for a much wider net of related keywords and makes the page much less likely to get filtered. You should also mix up your anchor text where possible: if every link to a site has the exact same anchor text, it doesn’t look natural.
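The variation strategies described here are easy to illustrate in a few lines. In the toy Python sketch below, the core phrase, the naive singular/plural handling and the modifier list are all hypothetical examples, not a fixed recipe.

```python
# Toy illustration of keyword variation: singular/plural forms
# plus relevant modifiers around a core phrase.

core = "credit cards"

def variants(phrase):
    singular = phrase.rstrip("s")  # naive singular form; real stemming is messier
    forms = {phrase, singular}
    for modifier in ("compare", "apply for", "best"):  # hypothetical modifiers
        forms.add(f"{modifier} {phrase}")
        forms.add(f"{modifier} {singular} online")
    return sorted(forms)

for v in variants(core):
    print(v)
```

Spreading these variants across titles, headings and anchor text gives the page the kind of natural diversity the current algorithms favor.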

In addition to his site, Wall has also launched the SEO Training program to help interactive marketers better understand SEO.

Myth #6: SEO is a one-time event for a website

It’s logical that a dynamically changing database of information (a search engine) requires recurring and systematic website optimization strategies and tactics.

SEO must be anchored by multi-disciplinary teams of interactive specialists who focus on website development, usability and search engine friendliness. For SEO, we investigate how a search engine works to discover the requirements for acquiring natural search traffic; our methodologies are best described in Google’s webmaster guidelines. Following the principles of that document, and drawing on 12 years of client observations, we have modeled an SEO methodology built from both one-time and recurring modules. These are the factors known to contribute to SEO success, and our team keeps them constantly in mind when serving client needs.

Usually, the first three modules are one-time events: keyword research, a diagnostic audit and diagnostic audit modifications. The remaining three modules are recurring by nature: website and competitive analysis, page editing and optimization, and link-building strategy implementation. The recurring components work in sync with the way search engines work, and they come into play when creating new websites, dealing with competitive pressures, adding new or dynamic pages, changing content and doing ongoing link profiling.

Myth #7: SEO will take years to return results

A professional SEO process begins with a “needs assessment” documenting past, current and future activities related to natural search (SEO). When we are allowed to apply our full process and methodology, even complex websites have returned excellent natural search results within 30 to 90 days.

A critical path to quick wins is having proper measurement metrics in place. Benchmarking natural search status prior to SEO implementation is also important for setting up your SEM scorecard. Measuring lift is easily accomplished by measuring non-brand keyword traffic and/or revenue using web analytics and/or interactive marketing analytics.
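Measuring lift this way can be sketched in a few lines: sum keyword-level visits while excluding brand terms, then compare the before and after totals. The keyword data and brand terms below are made up for illustration.

```python
# Sketch of measuring non-brand keyword lift from web analytics data.
# The brand terms and visit counts are hypothetical.

BRAND_TERMS = ("acme", "acme motors")  # made-up brand keywords

def non_brand_visits(keyword_visits):
    """Sum visits for keywords that contain no brand term."""
    return sum(v for k, v in keyword_visits.items()
               if not any(b in k.lower() for b in BRAND_TERMS))

before = {"acme motors": 900, "used trucks": 120, "suv deals": 80}
after  = {"acme motors": 950, "used trucks": 310, "suv deals": 240}

baseline = non_brand_visits(before)   # 200 non-brand visits
current  = non_brand_visits(after)    # 550 non-brand visits
lift = (current - baseline) / baseline * 100
print(f"non-brand lift: {lift:.0f}%")
```

Excluding brand terms matters because brand traffic would arrive with or without SEO; only the non-brand movement shows what the optimization work earned.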

The “SEO assessment and measurement process” is distributed to provide stakeholders with critical data about SEO expectations and ROI. Clearly, statements about SEO results and expectations have long been misunderstood or even abused within the search community, primarily due to a lack of professional guidelines and/or industry standards.

Companies seeking SEO services should look for SEM qualifications. SEO best practices are now available to mitigate the abuses that create false expectations, and no one has to wait years to see results.

Bruemmer is a regular contributor on search for iMediaConnection.

Myth #8: PageRank is the critical measure of a site’s success

PageRank was a rather defining aspect of early Google search. Today, however — while PageRank still plays a role — we use more than 200 signals in ranking search results. This means that webmasters who focus primarily on PageRank are missing the bigger picture and overlooking aspects of their website that they have more control over. Of particular note, PageRank is focused on the issue of a page’s importance, whereas a larger component in determining search results is relevance. We aim to deliver results that are relevant to the query typed into the search box, the area where the person is searching from and, in many cases, even each person’s own demonstrated interests, based upon search history.

At the core, though, what generally makes a site successful is original and compelling content and tools. For a given set of pages, PageRank may fluctuate, and rankings do shift as the internet evolves. But in the end, what’s most important is consistently happy users: people who bookmark and share your site, who understand and respect your brand and who can confidently and seamlessly make that purchase.

Myth #9: Accessibility doesn’t really matter

Too many webmasters have thought of accessibility as an afterthought, as a “nice to add” feature for the blind or for a hypothetically small number of people on dial-up or super old computers. However, folks browsing the web on an iPhone can’t do anything on a site that has all its content and navigation in Flash. Business folks wanting to make purchases on the go using a low-bandwidth connection may find many of today’s multimedia-heavy sites simply unusable. And, especially relevant to your page’s ranking in search results, Googlebot cannot understand the meaning of photos or videos.

Site accessibility — by users on a wide variety of browsers and connections and by search engine bots — should be one of the first things webmasters focus on. If users can’t effectively use your site, you lose business. And if Googlebot can’t access or understand your site, you lose traffic.

Here are a couple of best practices: Make the bulk of your content and navigation text-based, optionally adding multimedia to spice things up. Next, test your site using mobile phone browsers and, ideally, even a text-based browser such as Lynx. We have more details on our official Webmaster Central blog.
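A rough approximation of that text-only view can also be scripted: extract just the text content from a page, much as a text browser, or a crawler that ignores images and Flash, would see it. The sample page below is made up; note that the .swf binary itself contributes no indexable text, only the HTML fallback does.

```python
from html.parser import HTMLParser

class TextOnly(HTMLParser):
    """Collects visible text, skipping script and style content."""
    SKIPPED = {"script", "style"}

    def __init__(self):
        super().__init__()
        self.depth = 0       # nesting level inside skipped elements
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIPPED:
            self.depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIPPED and self.depth:
            self.depth -= 1

    def handle_data(self, data):
        if self.depth == 0 and data.strip():
            self.chunks.append(data.strip())

# Hypothetical page: the Flash movie is invisible to text-only
# readers; only the HTML fallback text survives.
page = """
<h1>Spring Sale</h1>
<script>trackVisit();</script>
<object data="promo.swf">Flash promo (fallback text)</object>
<p>All sedans 10% off this week.</p>
"""

parser = TextOnly()
parser.feed(page)
print(" ".join(parser.chunks))
```

If the text this produces is empty or missing your key content, a user on Lynx, and to a large degree Googlebot, is missing it too.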

Myth #10: Google has an adversarial relationship with webmasters and publishers

We view webmasters as our allies, and that’s not just pie-in-the-sky idealism. Helping webmasters get great content into Google benefits everyone — the webmasters, Google and our millions of users. That’s why we created Webmaster Central, which features a collection of powerful webmaster tools, our official webmaster blog, a forum featuring Googler and non-Googler search experts and help documentation in more than two dozen languages.

We are, of course, a bit constrained in what we can disclose about the subtleties of our ranking algorithms and such, largely to protect against unscrupulous folks who attempt to deceive both Google and our users. I was a webmaster myself for many years, so believe me, I know that can be frustrating. However, we’ve been sharing an increasing amount of information with site owners over the last few years, providing insights into how Googlebot sees a site’s pages, what keywords these pages most commonly show up for in our search results and so on.

Of greater importance, though, we’ve been supporting more two-way communication. We have a message center in our Webmaster Tools where we can, for instance, let webmasters know that their site has been hacked. And we have dozens of experienced Googlers from our Search Quality team who spend a lot of time reading and posting in our Webmaster Help groups and attending conferences around the world, answering questions and building up communities of search experts.


Location, Location, Location

If you were going to build a new physical location for your dealership, the “where” would be as important as, if not more important than, the “what.” After all, if potential customers couldn’t find you, buying a vehicle from you would become all but impossible.

Not coincidentally, the same principle applies online. If your customers and prospects can’t find you, you don’t exist. Having a memorable and intuitive web address will help, but the number one thing you need to do is ensure that your site plays nice with the world’s biggest search engines (Google, Yahoo!, MSN, etc.). It is those sites that determine your “location” online. Being near the top of the first page on Google is like having your dealership at the intersection of the town’s two main drags — it’s so easy to find that your customers can’t help but come across it from time to time.

Search engines are the maps of the digital landscape. And products like Dealer Impact’s Rank King can use a variety of strategies and tactics to ensure that you grab a piece of prime digital real estate. That’s the “where” of the digital marketing game… and it’s at least as important as the “what.”

D. Jones
Marketing Strategist/Creative Consultant
SmackDabble, LLC

SEO lessons Nike and Tiffany’s didn’t learn

by Lisa Wehr

Published October 17, 2007 in iMedia Connection

A newly released Oneupweb study of retailers reveals some startling facts about the power of optimizing for search.

Quick, who’s the largest online retailer of shoes? Nike? Footlocker? Payless? Timberland? Not even close. The winner is, an eight-year-old company that, until recently, had little or no brand recognition. In 2006, sold more than the online sales of all the well-known brands listed above, combined.

Recently, Oneupweb looked at the top 100 online retailers, including some of the world’s most recognizable brands, to see how well they optimized their websites. What we found surprised even us.

Many of the world’s leading brands ignore SEO and maintain poorly optimized websites. In fact, 60 percent of the leading online retailers had little or no optimization on their websites. As the success of and other savvy internet marketers illustrates, extraordinary customer service combined with sound SEO can help a company overcome the obvious competitive advantage of branding alone.

Nike just didn’t

Nike and brand marketing are synonymous. So we were surprised to discover little or no sign of optimization on the company website. Someone searching for “athletic shoes” will not find Nike’s site in the first three pages of Google results. In fact, the site barely shows up on page one of Google even for the branded search term “Nike Athletic Shoes.”

Nike does have an online visibility strategy: the company supplements its well-known branding efforts with paid online advertising for important keywords. Research indicates that PPC campaigns are much more effective when combined with natural search. They aren’t in Nike’s case, which leads us to speculate about how much more effective Nike’s online and offline marketing efforts could be if they were integrated with a well-executed SEO program.

Size doesn’t matter online

The beautiful thing about online retail is the way the medium levels the playing field. Huge warehouses and 500 worldwide locations mean nothing. Visitors don’t have to drive to a brick-and-mortar location; they are driven online to the retailer’s website. Retailers need only attract enough interested visitors to their sites and provide an excellent shopping experience after their guests arrive.

Searchers look for brands they know. However, Oneupweb’s recent research showed repeated examples of a well-optimized, savvy marketer successfully competing with a better known brand. Well-optimized websites position the challenger higher on non-branded keyword searches. The higher the position on search engines, the more traffic, conversions and sales.

Online, web-only jeweler Blue Nile outsells its much larger and better-known competitor, Tiffany & Company. The Tiffany brand has been around for 170 years; Blue Nile, for eight. Both sites are optimized, although the clear edge goes to Blue Nile in the degree of optimization and the overall online customer service experience.

Well-optimized for a changing landscape

Our study did not include the use of new media as a criterion for the degree of optimization on a website. Nevertheless, we found that top online retailers who have well-optimized websites are 60 percent more likely to have corporate blogs or podcasts. This reflects a growing sensitivity to Google’s new Universal Search model specifically, and the growing popularity and viral power of blogs and podcasts overall., the leading online retailer for all three studies Oneupweb has conducted since 2003, uses blogs and podcasts in addition to many other sound SEO and SEM practices. Furthermore, the company constantly solicits user feedback and reviews to generate loyalty, links and social support for its products and services. The results speak for themselves.

Consider the opportunities

Our study should be good news to most online retailers. For those who do optimize well, it means an existing competitive edge that will allow them to compete successfully with some larger, more established brands.

And for those large brands that do not optimize well, there is a great opportunity for growth in the best or worst of years. Either way, there is much work to be done; work that can result in greater traffic and revenue.