Friday, December 30, 2011

Google Panda Update: Say Goodbye to Low-Quality Link Building

A while back, I wrote about how to get the best high volume links. Fast forward eight months and Google has made two major changes to its algorithm -- first to target spammy/scraper sites, followed by the larger Panda update that targeted "low quality" sites. Plus, Google penalized JCPenney, Forbes, and Overstock.com for "shady" linking practices.

What's it all mean for link builders? Well, it's time we say goodbye to low quality link building altogether.

'But The Competitors Are Doing It' Isn't an Excuse

This may be tough for some link builders to digest, especially if you're coming from a research standpoint and you see that competitors for a particular keyword are dominating because of their thousands upon thousands of pure spam links.

But here are two things you must consider about finding low quality, high volume links in your analysis:
1. Maybe it isn't the links that got the competitor where they are today. Maybe they are a big enough brand with a good enough reputation to be where they are for that particular keyword.
2. If the above doesn't apply, then maybe it's just a matter of time before Google cracks down even further, giving no weight to those spammy backlinks.

Because, let's face it: you don't want to be the SEO company behind the next Overstock or JCPenney link-building-gone-wrong story!

How to Determine a Valuable Backlink Opportunity

How can you determine whether a site you're trying to gain a link from is valuable? Here are some warning signs that Google may have already deemed, or may eventually deem, a site low quality.

>> Lots of ads. If the site is covered with five blocks of AdSense, Kontera text links, or other advertising chunks, you might want to steer clear of it.

>> Lack of quality content. If you can get your article approved immediately, chances are this isn't the right article network for your needs. If the article network is approving spun or poorly written content, it will be hard for the algorithm to see your "diamond in the rough." Of course, when a site like Suite101.com, which has one hell of an editorial process, gets dinged, then extreme moderation may not necessarily be a sign of a safe site either (in their case, ads were the more likely issue).

>> Lots of content, low traffic. A blog with a Google PageRank of 6 probably looks like a great place to spam a comment. But if that blog doesn't have good authority in terms of traffic and social sharing, it may end up on the list of sites to be devalued in the future. PageRank didn't save some of the sites hit by the Panda update: several sites with PageRank 7 and above (including a PR 9) were affected.

>> Lack of moderation. Kind of goes with the above, except in this case I mean blog comments and directories. If you see a ton of spammy links on a page, you don't want yours to go next to it. Unless you consider it a spammy link, and then more power to you to join the rest of them.

What Should You Be Doing?

Where should you focus your energy? Content, of course!

Nine in 10 organizations leverage blogs, whitepapers, webinars, infographics, and other high-quality content for link building and to attract natural, organic links. Not only can you use your content to build links, but you can use it to build leads as well, by proving the business knows its stuff when it comes to its industry.

Have You Changed Your Link Building Strategy?

With the recent news, penalties, and algorithm changes, have you begun to change your link building strategies? Please share your thoughts in the comments!

Thursday, December 29, 2011

Web Analytics Year in Review 2011

Around the same time last year, we discussed how businesses were finally investing heavily in the tools, people, and processes required when operating data-driven organizations.

This year, an eConsultancy report estimates the UK web analytics technology and services sector alone to be worth more than £100 million annually. If we assume this figure scales with GDP, that would put the web analytics technology and services sector well above $4 billion globally.
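For the back-of-the-envelope math (the GDP figures are my assumptions, not the report's): the UK accounted for roughly 3.5 percent of global GDP in 2011, and £1 bought about $1.55, so:

£100 million / 0.035 ≈ £2.9 billion worldwide
£2.9 billion × 1.55 ≈ $4.4 billion

Hence "well above $4 billion."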

But as with anything web analytics related, concentrating on the numbers is sometimes not as important as the trend! The trend for total spend on internal staff, third-party agencies, and total vendor revenues appears to have grown by 12 percent year over year, certainly in the realm of "significant".

These were the top stories and trends of 2011.

Online & Offline Data Integration

What good is online intelligence without offline context? The integration of online and offline data was a focus for many organizations in 2011 because without this connection, it’s hard to understand the online contribution of marketing, channel of preference for task-level customer and prospect interaction, and customer satisfaction across channels. Without making this connection, it is nearly impossible to optimize online experience for lifetime value.

Social Media Analytics

Social media analytics diversified in 2011, with an emphasis on business requirements: many vendors and agencies started broadening their service portfolios to cater to varied business and social media goals.

The industry gained a little clarity this year when several vendors started clearly categorizing their social media analytics into several use cases such as:

>> Monitoring and trend analysis.
>> Sentiment analysis and reputation management.
>> Workflow management.
>> Integrated social insights.

Although this sub-sector of analytics is far from mature, several large-scale companies are taking major steps to bridge the gap between social media analytics and cross-channel product offerings. Look for significant moves in this area for 2012.

Omniture SiteCatalyst 15 Launches

Adobe announced the launch of Omniture SiteCatalyst 15 at the Omniture Summit in March this year. For those of us fortunate enough to be in attendance, it felt as if we were strapped into a fighter jet and just engaged afterburners. Adobe has done a great job integrating Omniture into their product portfolio, and the wow-factor for their presentation was nothing short of awe-inspiring.

I’ve always had a healthy love-hate relationship with Omniture, so luckily for them the hype associated with V15 was warranted! Some of my favorite features include real-time segmentation, a new bounce rate metric, ad-hoc unique visitor counts, and a new processing rules feature that makes server-side implementation tweaks very easy.

Salesforce.com Buys Radian6

Salesforce.com bought Radian6 for $326 million and brought cloud computing to a whole new level. What I like most about this deal is how naturally this acquisition can be folded into Salesforce’s CRM product.

‘Super Cookies’

Unfortunately it’s not all good news this year, as several companies (most notably Kissmetrics) were the recipients of some serious bad press and legal action for use of so-called “Super cookies” in July. These Flash-based cookies were blamed for a number of privacy concerns including cross-domain and cross-client visitor identification and re-spawning traditional cookies after being cleared from user browsers.

Mobile Analytics

This year marked the dawn of mobile analytics, especially after Apple rewrote their third-party tracking policies towards the end of 2010. As the mobile market continues to mature with increased pressure from the almost limitless supply of new Android handsets and operating systems, look for mobile analytics to take a larger share of attention in 2012.

Google Analytics Real-Time

Google Analytics Real-Time debuted in the fall of this year, enabling millions of site owners across the globe to watch user interaction as it happens, which is an exciting prospect for many. Although this feature set has been available for a while from vendors such as Woopra, it's remarkable that Google would offer such a robust feature at no cost.

Google Encrypts Search Data

Almost immediately after the positive sentiment from the introduction of real-time analytics had tapered off, Google decided to test the waters with a carefully measured negative announcement: it would be removing search query parameters for users of its secure (SSL) search results. The news didn't go over well in the online marketing community, and to this day the analytics community is still relatively sore on the subject, so don't bring it up with your web analyst at the holiday party.

Google Chrome Passes Mozilla Firefox

More good news for Google surfaced in November, when Google Chrome surpassed Mozilla Firefox in global browser share for the first time in history. Although it is too soon to tell what the effect will be on the analytics industry, one thing is certain: make sure your quality assurance and browser compatibility testing covers all three major browsers.

Here's to safe and happy holidays and a prosperous New Year!

Friday, December 23, 2011

How Advanced Marketers Will Use Facebook in 2012

As digital marketers, we’re frequently reminded magic formulas don’t really exist. Still, our experimentation and experiences often lead to insights about “what’s next.” Hopefully, the following insights and sample tools mentioned in this article will inspire your consideration (and actions) for 2012.

What Happened in 2011

For most brands, the predominant focus of Facebook marketing in 2011 was growing the fan base. We saw a variety of custom Facebook applications (tabs) paired with Facebook ad buys – where requiring a Like (becoming a fan of the page) was the first or even final call to action.

As a result, some of the most common questions emerging were:
>> What’s the value of a Facebook fan?
>> How many Facebook fans should we have?
>> Now that we have these fans, what should we do with them?
>> What can we be doing with Facebook outside of Facebook?

And honestly, many have even asked, “why are we doing this again?”

It’s The Data, Stupid

If you’re saying, “oh no, not another discussion on analytics or the latest changes in Facebook Insights,” fear not. This discussion goes beyond tracking simple key performance indicators (KPIs) within some marketing dashboard that spits out monthly reporting with +/- percentages.

On the contrary, it goes straight to the core of how companies can use a new breed of tools leveraging Facebook data to dramatically improve advertising results, content creation and overall business strategies. For the sake of brevity, we’ll take a quick look at two tools in particular: CalmSea and InfiniGraph.

CalmSea

CalmSea is a technology platform that enables you to create a conversion-based offer that can be accessed via a website, email, tweet, mobile device or Facebook page. As an example, let’s consider a coupon.

Normally, the basic data you would expect to collect with an online coupon might consist of clicks, shares and redemptions. Of course, you may also collect some demographics – or even additional data, depending on form-related entries required of the user in order to get the coupon.

The trick with CalmSea lies within an extra click that prompts your Facebook authorization in exchange for access to the coupon (or other offer). This authorization includes access to 3-4 of your Facebook permissions, which provides the CalmSea platform with multiple data points specific to your social graph (likes, interests, demographics, friends, etc.).

All of this activity can take place on any web page, including sharing the coupon with others on Facebook without ever actually visiting your Facebook page.
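CalmSea's implementation is proprietary, but the general pattern is easy to sketch with Facebook's JavaScript SDK. The following is a hypothetical illustration only – the app ID, permission names, and coupon code are placeholders, not CalmSea's actual flow:

<div id="fb-root"></div>
<script src="//connect.facebook.net/en_US/all.js"></script>
<a href="#" onclick="unlockCoupon(); return false;">Get the coupon</a>
<div id="coupon" style="display:none">Coupon code: SAVE20</div>
<script>
  FB.init({ appId: 'YOUR_APP_ID' }); // placeholder app ID

  function unlockCoupon() {
    // The "extra click": ask for a few social-graph permissions,
    // and reveal the coupon only after the user authorizes the app.
    FB.login(function (response) {
      if (response.authResponse) {
        // Likes, interests, and demographics are now readable via
        // the Graph API for segmentation, as described above.
        document.getElementById('coupon').style.display = 'block';
      }
    }, { scope: 'email,user_likes,user_interests' });
  }
</script>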

When I spoke to Vivek Subramanian, VP of Products for CalmSea, he said they are seeing upwards of a 70 percent acceptance rate on the permissions authorization for branded apps (which could include coupons, sweepstakes, private sales, group buys and more).

The Power of The Data

CalmSea takes the Facebook user interactions and news feeds around the given offer – then combines that data with purchase/conversion analytics (could be Google Analytics) to aggregate and display insights on segments of users/customers with the highest levels of:

>> Engagement
>> Profitability
>> Influence

This kind of data goes beyond Facebook Insights, in that it enables you to build predictive models based on distinct attributes that best describe current and potential customers with respect to the three items listed above.



In the figure above, you can get a slight feel for CalmSea’s dashboard, which demonstrates, among other items, the ability to view social insights compared to purchase data insights on users who have authorized the offer.

Depending on your role in the company (media buyer, content creator, channel partner/affiliate manager, etc.) this kind of data ideally improves how and where you spend your time and money.

The initial offer you develop with a platform like CalmSea will likely have a consistent conversion rate with similar offers you may have conducted in the past. It’s the offers that follow, leveraging the data collected from your first use of the platform, that stand to produce significantly improved results.

InfiniGraph

The InfiniGraph platform aggregates Facebook and Twitter data for the purpose of identifying relevant (real-time) affinities, content and interests that are trending around a particular brand, product or industry. There are two key considerations with respect to how this platform’s output produces actionable value:

Improved performance on your Facebook ads: Gives you insight into new interests/keywords you should be targeting as part of your selection process within Facebook's ad platform.

Insights to assist with content creation and curation: Gives you a clear picture and delivery mechanism for content that is trending, via a content "Trend Score" that algorithmically combines likes, comments, clicks, retweets, and shares (an illustrative sketch follows below).
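InfiniGraph hasn't published its formula, so treat this as a purely illustrative sketch of how such a score could work: a weighted sum of engagement signals, discounted by the age of the post (all weights invented):

TrendScore = (w1 × likes + w2 × comments + w3 × clicks + w4 × retweets + w5 × shares) × decay(post age)

The differentiator is less the arithmetic than the aggregation: computing a score like this across many pages with overlapping fan bases is what surfaces content trending for your audience, not just on your own page.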

InfiniGraph's approach to identifying content that's trending on Facebook, in particular, provides a level of opportunity that is certainly missed by many brands wishing to dive deeper into content strategy (check out the Digital Path to Social Media Success to view the four kinds of content you could be addressing).

To describe how this works, imagine a series of Facebook status updates that are posted about subject matter relevant to your fans (on your Facebook page or another Facebook page your fans follow).



In the sample from InfiniGraph above, you can see the dates these status updates were posted, in addition to the enormous amount of engagement they received. Here's the problem: think of how many fans of this page would also be interested in this content, but simply didn't see it. Now think of how quickly those status updates will slide down the page and disappear.

As Chase McMichael, President of InfiniGraph, told me, "Humans can’t keep up with trending content, nor can they see how content trends across multiple Facebook pages containing fans with similar interest."

McMichael alludes to "crowdsourcing" of the human voice around collective interests and actions. Not only can this aid in the repurposing of content otherwise lost, but as McMichael so eloquently puts it: "you can know where to double-down from a media buying perspective. Who needs comScore when you have a resource that is guiding you where to advertise based on what a large audience is in essence telling you?"

Wrap-up

Although the summaries on these platforms don’t do them justice, my hope is you’ll be inspired to dig deeper regarding the possibilities they offer. It will be interesting to see how Facebook will continue enabling access to data, but I think it’s a safe prediction that advanced marketers will leverage it to the hilt.

On a final note, I’ll bid you a farewell to 2011 with my favorite quote of the year:

"Data will become the new soil in which our ideas will grow, and data whisperers will become the new messiahs." – Jonathan Mildenhall, VP of Global Advertising Strategy at Coca-Cola

Thursday, December 8, 2011

New Tagging Suggests Google Sees Translated Content As Duplicates

Last year Google launched meta tags for sites where a multilingual "template (i.e., side navigation, footer) is machine-translated into various languages" but the "main content remains unchanged, creating largely duplicate pages." This week they have gone a step further, adding the ability to differentiate between regions that speak essentially the same language with slight differences.

Like the canonical tag, implementation falls to website owners in order to get "support for multilingual content with improved handling for these two scenarios:
1. Multiregional websites using substantially the same content. Example: English webpages for Australia, Canada and USA, differing only in price.
2. Multiregional websites using fully translated content, or substantially different monolingual content targeting different regions. Example: a product webpage in German, English and French."

This tagging is interesting and suggests Google knows when the content on a site is duplicate despite being in a different language. Can Google's systems actually translate the stored content, or do they just recognize words that mean the same thing but differ by region? If I use "biscuit" on my UK or Australian sites in place of "cookies", does Google know they are the same word?

"If you specify a regional subtag, we'll assume that you want to target that region," Google tells us.

Is duplicate content now being measured across similar terms? Or are the tags a way to have website owners limit the pages Google indexes for regional areas? We add the tags, and Google thins the pages we have showing in the SERPs for different regions?

Google shared some example URLs:
>> http://www.example.com/ - contains the general homepage of a website, in Spanish
>> http://es-es.example.com/ - the version for users in Spain, in Spanish
>> http://es-mx.example.com/ - the version for users in Mexico, in Spanish
>> http://en.example.com/ - the generic English language version

On these pages, you can use this markup to specify language and region (optional):
>> <link rel="alternate" hreflang="es" href="http://www.example.com/" />
>> <link rel="alternate" hreflang="es-ES" href="http://es-es.example.com/" />
>> <link rel="alternate" hreflang="es-MX" href="http://es-mx.example.com/" />
>> <link rel="alternate" hreflang="en" href="http://en.example.com/" />
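Pulled together, the head of the Spain-targeted page might look like this minimal sketch (per Google's guidance, each regional version carries the full set of alternates so the versions can be mapped to one another):

<head>
  <title>Página de ejemplo para España</title>
  <link rel="alternate" hreflang="es" href="http://www.example.com/" />
  <link rel="alternate" hreflang="es-ES" href="http://es-es.example.com/" />
  <link rel="alternate" hreflang="es-MX" href="http://es-mx.example.com/" />
  <link rel="alternate" hreflang="en" href="http://en.example.com/" />
</head>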

It seems like many wouldn't bother installing the tags unless Google were to start dropping pages, or unless the implementation helps improve regional rankings for the pages where publishers have gone that extra step and customized their content for specific regions and subtle language differences.

The hreflang attribute has been around for quite some time; the W3C discussed it back in 2006 and includes it in its list of link types for HTML documents. Its addition to the head tag information seems to be the new twist. How Google uses the information for ranking will really determine whether people use it.

Tuesday, December 6, 2011

SEO is Both Science and Art

For many people who aren’t involved in search engine optimization (SEO) on a regular basis, it’s easy (or so they think). You simply create a website, write some content, and then get links from as many sources as you can.

Perhaps that works. Sometimes.

More often than not, the craft of SEO is truly a unique practice. It's often misunderstood and can be painfully difficult to staff for. Here's why.

SEO is Science

By definition, “Science” is:

1. a branch of knowledge or study dealing with a body of facts or truths systematically arranged and showing the operation of general laws: the mathematical sciences.
2. systematic knowledge of the physical or material world gained through observation and experimentation.
3. any of the branches of natural or physical science.
4. systematized knowledge in general.
5. knowledge, as of facts or principles; knowledge gained by systematic study.

Anyone who has performed professional SEO services for any length of time will tell you that at any given time we have definitely practiced each of the above. In some cases, changes in our industry are so rapid that we crowdsource the science experiments among peers (via WebmasterWorld forums or Search Engine Watch forums).

Unfortunately, Google doesn’t provide step-by-step instruction for optimization of every single website. Every website is unique. Every optimization process/project is unique.

Every website represents new and interesting optimization challenges. All require at least some experimentation. Most SEOs follow strict methods of testing/monitoring/measuring so that we know what works and what doesn’t.

We have a few guidelines along the way:

1. Our “branch of knowledge” is well formed in what Google provides in their Webmaster Guidelines and SEO Starter Guide.
2. Our unique experience. Just like you might "learn" marketing by getting your bachelor's degree in marketing, you really aren't very good at it until you've worked in your field and gained real-world experience. There are so many things that you can read in the blogosphere regarding SEO that are complete crap. But, if you didn't know any better, you'd buy off on it because "it sounds reasonable, so it must be true!" So, be careful not to claim something is 100 percent "true" unless you have enough "scientific" evidence to back up the claim. Otherwise, it's called a "hypothesis":
a. A supposition or proposed explanation made on the basis of limited evidence as a starting point for further investigation.
b. A proposition made as a basis for reasoning, without any assumption of its truth.

SEO is Also Art

By definition, art is:

the conscious use of skill and creative imagination especially in the production of aesthetic objects

I've worked with and befriended many incredibly bright SEOs in my years in this business. Those who manage to blend scientific skills with creative thinking about how to experiment and improve programs are the gems.

Getting creative with SEO is thinking of how a marketing program can encompass social, graphic design, link building, content generation, and PR to drive toward a common goal.

Getting creative with SEO is also about reworking a website’s design/code so that usability and accessibility improve, while maintaining brand guidelines and keeping with “look and feel” requirements, yet improving SEO.

Every day, we must get creative in determining how to best target keywords by determining which method of content generation gives us the best chance at gaining a presence in the search engines and – most importantly – engaging our audience.

Should we write a blog post? Should this be best handled in a press release? How about a video? Infographic? New “corporate” page on the site? There are a multitude of ways that we might determine to target a keyword via content.

The Perfect SEO

Today’s SEO is so much more involved than SEO of years past. When I hear people saying that they’re trying to determine if they should hire an in-house SEO or an agency, I will give them the pros and cons of each (and there sincerely are pros and cons of each).

But one factor that I believe leans toward the strength of an agency is that there's typically going to be a team of individuals, each with a unique skill set. These individuals can share examples of what works and what doesn't (scientific experiments often occur), bounce creative thoughts off one another, and collectively provide more value than any one person might.

Our industry needs more of these highly skilled "freaks of nature" who blend both the scientific skills and artistic creativity of SEO.

Friday, November 11, 2011

Google May Penalize Your Site for Having Too Many Ads

Google is looking at penalizing ad heavy sites that make it difficult for people to find good content on web pages, Matt Cutts, head of Google's web spam team, said yesterday at PubCon during his keynote session.

"What are the things that really matter? How much content is above the fold," Cutts said. "If you have ads obscuring your content, you might want to think about it," implying that if a user is having a hard time viewing content, the site may be flagged as spam.

Google has been updating its algorithms over the past couple months in their different Panda updates. After looking at the various sites Panda penalized during the initial rollout, one of the working theories became that Google was dropping the rankings of sites with too many ads "above the fold."

This is an odd stance, considering Google AdSense Help essentially tells website publishers to place ads above the fold by noting, "All other things being equal, ads located above the fold tend to perform better than those below the fold."

Cutts also encouraged all websites that have been marked as spam and feel they should not have been marked as spam to report their sites to Cutts and his team. Cutts stated that he has a team of web spam experts looking into problem sites and that the Google algorithm still misses a site or two in its changes.

SEO is Not Dead, Is Always Evolving


Leo Laporte took the stage Tuesday as the keynote speaker at PubCon. Laporte talked about video and how getting your audience involved with you is the next step to online media.

Later, Laporte said he believes SEO will be dead in the next six months. As you'd expect, the crowd responded negatively to this assertion, even causing many of them to walk out.

Yesterday, Cutts responded during his keynote talk. Cutts started by setting the record straight, letting everyone in the audience know that SEO will still be here for the next six months, let alone the next six years.

Cutts joked by mentioning a tweet about him "spitting out his morning coffee" in reaction to Laporte's statement the morning before. He took it as more of a joke and laughed about the whole thing.

SEO will always be evolving, Cutts told the audience. Search will always be getting better, getting more personalized for each one of us. Google will always be striving to help people to get the best results possible while getting fresh real time results.

Later he talked about how, if Google and every other search engine were to die, Internet marketing and SEO would still be alive because of social. Looks like SEO is here to stay!

Monday, October 24, 2011

SEO Pricing Models: How Much Should You Charge?


For six years now, I've been the founder, CEO, and president of a search engine optimization company, and I've maintained one additional role that I soon hope to relinquish.

Salesperson.

My reluctance in hiring a salesperson is that finding the right person for the job can be quite a challenge. “Selling” search engine optimization (SEO) is a very consultative venture and one that I’m loath to turn over to someone mostly motivated by a commission check.

Selling SEO requires that you know the space well, and that you’re able to balance the need to bring revenue into the company while ensuring that you’re bringing in business which won’t lead to future headaches, because:

>> Expectations weren't set appropriately.
>> The prospect lacks an understanding of the process.
>> There aren't enough resources on the client's side (or not enough money to outsource) for generating content, addressing issues, and implementing recommendations.
>> The work isn't properly scoped out, preventing a solid estimate of the work involved (so the engagement ends up being unprofitable).

It's that last piece that I'm going to address today.

Scoping Out and Pricing an SEO Effort

In order to survive as a well-run business, we must make sure that we operate profitably. Just as I might look at the probability of generating a positive ROI for prospects that come seeking SEO services, so too must I focus on ensuring that my company also runs profitably.

To that end, I made a decision, years ago, to hire a head of operations who also happened to have a background in process improvement. It also doesn’t hurt that she’s a CPA. Her name is Kim Patterson, and she also happens to be my wife’s cousin (so you headhunters can just keep on looking).

Kim came to the company with no background in SEO. Over the years, she has learned the process and has put in place measurements of effectiveness, efficiency, and accountability that I believe had been sorely lacking in our industry.

Another major component of bringing Kim on was to make sure that we were charging a fair rate (one within industry standards, one that helped us reach our profitability goals, and one that would be reasonable based upon the value we bring to our clients).

I asked Kim what formula she uses to price our SEO initiatives. Her response:

First, you need to determine your profit goals (15 percent? 20 percent? 30 percent? More?). You calculate your fixed overhead costs (costs that you pay no matter what your sales are), variable costs (costs that go up as sales go up – and what their relationship is to the sale dollar; not all relationships are the same for each sector of your biz), and direct costs (actual direct labor, direct services, etc.) for the actual job, and make sure that when it all is put together you end up with your targeted profit.
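To see how the pieces interact, here's a worked example with invented numbers (mine, not Kim's). Say a job's direct costs are $4,000, its share of fixed overhead is $800, variable costs run 5 percent of the sale, and the profit target is 20 percent of the price P:

P - ($4,000 + $800 + 0.05P) = 0.20P
P × (1 - 0.05 - 0.20) = $4,800
P = $4,800 / 0.75 = $6,400

Note how any underestimate of the direct costs comes straight out of that 20 percent.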

The real tricky part here is determining the direct costs. SEO projects can’t be precisely scoped out for everything that you might do for the next 12 months. Part of SEO is analysis and recommendations, based upon what we see in the SERPs, analytics, competitive analysis, and industry changes/opportunities, among other things.

Pricing SEO is one of the great challenges, because most prospects still have only a basic understanding of search engine optimization and the time and work involved in the efforts. The best that anyone can do is get a sense of how much time will be required to adequately address specific goals.

Rand Fishkin provided a tremendous value when, in 2007, he wrote a quality post on pricing SEO. Granted, this was written over four years ago.

Can you think of much that is cheaper today than it was four years ago? If supply and demand are the drivers for pricing, I would suggest that demand far exceeds (good) supply. There are many people who lay claim to being good at SEO, and a scarce few who truly do it well.

Today, I hope to peel back the onion just a bit, to help you to understand how search engine optimization is priced and how you, too, might want to consider your pricing models for SEO.

SEO Pricing Structure

Many firms still offer package rates for SEO. In fact, there’s one firm that is driving me absolutely crazy because they’re spamming a client of mine with emails. Here’s an example of one such email:

I just got the details from XXX management on our July SEO Special, and here they are…… “Summer Gold Rush” Special

Our National Gold Plan - Regularly Priced at $2750 per month will be greatly discounted for ONLY 30 new clients starting July 11

The 2011 Summer SPECIAL:

Only $1999/mo for the life of the account
That's a $751 monthly savings
That's approx 27% off each month for 30 lucky clients
Equal to saving over $9,000 per year
Clients can purchase multiple specials

*Starts as a 3-month agreement, then runs month-to-month

To me, an SEO effort isn't a package of tactical deliverables. Unless you’re simply hiring a company to merely provide tactical work (keyword research, site structure analysis, competitive analysis, SEO audit, analytics review, usability consulting), SEO is something that is unique to every website, competitive environment, client goal(s), and assets (news, blog, product search, local, video, image, etc.) and is ongoing with review and “optimization” (there is a reason that “O” exists in “SEO”).

Just as you’d expect to scope out a website design and development project, so too must you scope out a search engine optimization effort.

>> Who's going to write the content?
>> Who is responsible for PR efforts?
>> Who is handling social marketing?
>> Who's doing link building?
>> Who's restructuring the website, as necessary?

Today’s SEO is about bringing together many facets of your marketing, web design/development and PR/social efforts so that they work well together.

All of these things go to “scope” and each requires time (either the agency's time or the company’s time). And, yes – time is money.

Setting Pricing Structure

When we go through the process of determining how much a search engine optimization effort might be, it goes directly to how much time we have to spend on the initiative.

Yes, we have some basic templates to follow, in terms of what is generally included in most SEO efforts, but then we need to dig deeper into the time that might be needed to address items which the prospect has mentioned are “goals” of the effort, and address any human resource allocation that we may need to provide because the prospect isn’t adequately staffed.

Cost of Talent

We’ve all seen the ads: “$400/Month for SEO.” How can one firm be charging $400 per month while another is proposing a monthly cost of $20,000 per month? What’s the difference?

In most cases, it comes down to people. In my past, I was the president of an SEO firm that had over 30 “employees” (contractors) offshore. We paid those folks $300-$400 per month for full-time employment.

I won’t speak for all offshore firms/contractors, but our experience with these contractors wasn’t very positive. A couple guys were good, but more often than not, the work would need to be redone by our staff in the U.S., or otherwise we sold the services so cheaply that the client’s expectations were low and so “it worked.”

It depends on your expectations.

If you think that the $400 a month guys are doing “the same thing” as the $20,000 a month guys, you’re probably going to be wrong. Chances are the $20,000 a month guys are hiring people that cost more than $400 a month. While the deliverables may seem the same (anyone can get their hands on the $20,000 a month company’s proposal, and find/replace with their company name and say “we do the same thing”), at the end of the day, you’re paying for the people/process/software/experience of the firm(s).

Cost of Software

There are plenty of great free SEO tools. Due to the length of this post, I can’t get into them all.

Then, there are plenty of great tools that cost significant amounts of money. The firms charging $20,000 per month may very well be using software for management of the efforts that cost $10,000 a month. It’s something to consider. Some of these pieces of software are slick, and I can see that larger companies would want this type of information/dashboard/reporting, because they need to work with a firm that is very polished and professional.

Amount of Talent Needed

Small effort? Perhaps we only need one person.

Small-ish effort, but the client has no webmaster, copywriter, PR folks, link builder, etc.? We may need to put a lot of resources into a project that would cost quite a bit of money and the ROI may simply not be there, for this small business.

If a project is properly scoped, you should be able to get a general sense as to how much time/work/effort may need to come from the client, and how much will need to come from the agency. Some efforts can become quite complex, with web designers, developers, copywriters, PR staff, social marketing, video optimization, SEO analysts, link builders, analytics specialists, usability consultants, etc.

If you need more resources, you’ll be paying more (whether it’s to your in-house team or to your agency).

ROI Estimation

As I mentioned above, the effort needed for SEO, and the time needed to realize results, can often prevent companies from affordably investing in search engine optimization. And there are many times when companies are so infatuated with "free" search engine traffic that they don't realize there are simply not that many searches for keywords relevant to their business.

If there are no fish in the pond, no amount of bait in the world is going to catch you a fish. Comprendé?

This is why one of my favorite tools is SEMRush. You can check out the estimated value of the organic traffic for your competitors, and compare that with where your website is currently, and get a sense as to whether there will be value available to you in your efforts.

If you see that all of your identified competitors are getting $200 a month “worth” of organic search traffic each month, it’s going to be hard to justify spending any amount of money on organic search. You might be best off to invest in a pay-per-click effort. At least with paid search, you’re guaranteed to pay only when you actually get a click.

Amortization of Efforts

The real value of SEO efforts is, generally, not realized in the first month(s) of the effort. Folks ask me all the time, "How soon until we see results?" The honest answer is "hard to say, but our guess is ____."

We don’t own the search engines, so we can’t make guarantees. But, if a search engine optimization firm has dug into the competitive landscape and done a fair amount of research, they should be able to determine whether an opportunity to have a profitable SEO effort is possible, even if it may take some time. This is where you should be considering the lifetime value of the effort.

You didn’t get into business thinking that you’d start and be profitable, right away (at least, not most of you). In fact, when I started my company, I didn’t take a salary for over a year and I expected that there was a price to pay to be in business.

With search engine optimization, it’s kind of the same way. There will be a period of time (perhaps as small as a couple of months, and perhaps as long as many years) when money and time is going out, and there’s not an equal amount of value coming in. But, if you’ve done your homework, and know that the opportunity for great value is there, and you “plan your work and work your plan,” you can realize gains that can far exceed the value in “pay per click” models.

One such company in an extremely competitive industry for SEO was being charged $6,000 per month and was getting OK value for a couple of years; now – five years later – it realizes an approximate value of $319,000 "worth" of organic search traffic each month. Even if they had gotten "no" value whatsoever for the first two years ($144,000 invested in total) to one day realize a monthly value of $319,000 (each and every month), you'd say that this is a pretty good investment, right?

It doesn't always come to these types of valuations. Just like the TV ads say, "results will vary." This particular client happens to be in a space where people search for these keywords very often.

For the local business owner, that’s one of the critical pieces in determining whether SEO is for you. What is the “search universe”? How many times are people searching for your keywords?

So, How Do You Price Your SEO Efforts?

Do you know if the efforts are profitable? How do you measure profitability?

I’d love to hear your thoughts, in the comments section below.

Friday, October 21, 2011

Optimizing Your Website for Mobile and Tablet Devices



Is your site optimized for the mobile and tablet generation? There’s no such thing as being lost in the woods anymore thanks to mobile and tablet technology, which gives consumers the power to directly connect from the palm of their hands. This means your site has to visually evolve as technology shrinks.

Here’s how you minimize your site while maximizing your searchability.

Smarterphone, Smarterconsumer, Smartersite

Having a mobile version of your site gives you the advantage of reaching the instant consumer. In 2011, we’re no longer sitting at our desks waiting for a page to load; we can be anywhere when pages load. That’s why it’s crucial you make sure your site is available in the mobile world, and optimized properly for optimal searching and customer-site connection. Here’s how you do it:

>> Google AdWords now lets you research keywords that are specific to mobile devices, helping you narrow down exactly which words you need to use to reach the instant consumer.
>> Take the same SEO formula used for classic site optimization and apply it to your mobile site – use your mobile keywords to create your meta titles, title tags, and headers.
>> Don't overload your reader. Keep your content short and simple, and make sure images are smaller, to give the instant consumer the best mobile experience possible.
>> Make sure fonts are nice and clear, and that important direction buttons are bigger. Remember, this is all being viewed on a small screen.
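Pulling those points together, a stripped-down mobile page might look something like the sketch below ("mobile widgets" stands in for whatever keyword your AdWords mobile research surfaces):

<html>
  <head>
    <title>Mobile Widgets | Acme Widget Co.</title>
    <meta name="viewport" content="width=device-width, initial-scale=1" />
    <meta name="description" content="Short, keyword-led description for mobile searchers." />
  </head>
  <body>
    <h1>Mobile Widgets</h1>
    <p>Short, scannable copy. One small image. Big, obvious buttons.</p>
    <img src="widget-small.jpg" width="120" height="90" alt="mobile widget" />
    <p><a href="/buy" style="display:block; padding:12px; font-size:1.4em;">Buy now</a></p>
  </body>
</html>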

To help make sure the instant consumer lands on the mobile version of your site when they’re strolling down the street reading their smartphone, you have to do some quick website revamping to turn your site from computer-friendly to mobile-friendly.

Most sites have two URLs: one for mobile and one for computer. The SEO advantage to having a mobile version of your site is that Google now has a designated bot that crawls around looking for mobile versions of classic sites to index.

This means you have two pages of the same site indexed as one. This is a huge advantage, because if your mobile and classic sites are optimized properly, you're even more searchable.

Mobile To-Do Guide

Here are three key things you need to do right now to make your site mobile-friendly:

1. Optimize for Mobile: Know The Basics

>> Make content visually appealing for limited screen viewing by using CSS in your coding. Reduce image sizes and ensure fonts and content are simple enough to quickly scan and understand.
>> Optimize your content and images by including the keywords found in your Google AdWords mobile search, and strategically sprinkle them throughout your pages. Best practice is to avoid using any ads – it's hard enough to see your content.

Quick tip: If your site is e-commerce based, get an app developed. An app gives you the competitive edge you need to keep your customer from browsing the web and looking at your competition; instead it places them directly in your virtual store via your app.

2. Consider Your Design Options: Google Transcode and Mobile Subdomain

>> Google Transcode: Using Google's configuration tool to transcode your site from classic HTML to mobile HTML won't give your user a unified experience. When you get Google to do the work, you risk having images and content resized in unattractive ways, duplicate content/error pages, and an overall bad user experience. How do you avoid this? Make a mobile subdomain.
>> Mobile Subdomain: Make a subdomain specifically for your mobile site by setting up a dedicated mobile URL. This is a key factor for search direction and indexing. Having one distinct mobile URL keeps your mobile optimization from interfering with your classic optimization (keeping the same experience on the small screen), and allows Googlebot-Mobile to visit and index the mobile version for mobile searches.

Quick tip: Avoid using Flash, Java, Ajax and Frames. Instead try XHTML (WAP 2.0), cHTML (iMode) or WML (WAP 1.2).

3. Mobile Preview: Don’t Be Fooled

Run your site through the W3C mobileOK checker to ensure it is mobile-friendly, and test, test, test it on multiple browsers and devices.

Quick tip: Not everyone has made the switch to a smartphone; there are still a number of users using classic phones. You’ve got to make sure your site looks good no matter what screen it’s on. Rethink your coding and design options, and look into apps.

The Rise of the Tablet

Now that we’ve covered why you need a mobile version of your site, let’s switch our minds over to tablets. Tablets are quickly replacing laptops and televisions – they’re lighter and easier to pack, making them more attractive for people on-the-go.

The top four ways consumers use tablets are:

>> Organizing recreational activities
>> Online shopping
>> Reading news or blogs
>> Social networking

Unlike the majority of mobiles, tablets rely mostly on Wi-Fi and 3G for connectivity. Making your site tablet-friendly means you’ve got to make it fast. Optimizing your site for tablet means you’ve got to make it familiar and simple.

Here’s what you need to think about:

>> Offering your site in classic and mobile versions gives you the upper hand if the user is on an iPad. Remember, iPads don't use Flash. If you have Flash content on your site, simply redirect the user to the mobile version, and everyone's happy. Also consider building your site in HTML5.
>> If your site is primarily e-commerce based, have an application developed that directly connects your products to the instant consumer in one tap. Apps easily store all your information in one place and remove the annoyance of waiting for a page to load. They also make the paying experience more enjoyable. The only disadvantage to an app: it isn't picked up in organic search results.
>> Having a content-heavy site is great, but it can be time-consuming and irritating to scroll down for what seems like forever. Avoid this consumer pet peeve by including "Previous" and "Next" buttons for easy content navigation.

Tablet To-Do

Here’s a top 10 list of what you need to do right now to make your site tablet friendly:

1. Limit duplicate content by changing CSS so it's optimized for tablet viewing.
2. No Flash? You're good. Flash? Make sure there's a redirect to your mobile site from the tablet.
3. Don't overwhelm visitors with scrolling fever - include "Previous" and "Next" buttons to help guide them through your content and web pages.
4. Consider creating an app for e-commerce pages and information storage to help avoid irritation and slow loading times.
5. Optimize the check-out process for visitors by using cookies, postal codes, and PayPal.
6. Make it touchable: use HTML5 and CSS3 to create scrolling and horizontal navigation that puts all content on one page, not giving the visitor a tab-attack.
7. Add CSS to increase the size of buttons on your page to make your site more attractive and user-friendly from the tablet perspective (see the sketch after this list). No one really wants to zoom in that close.
8. Offer downloadable content by creating PDF versions of your site's content or important information. This also gives visitors the option to store your content for future reading.
9. Remember the 5 viewing angles: vertical and landscape on both mobile and tablet, and the straight desktop view. Take advantage of the viewing options available.
10. Test, test, test - test to make sure it's good to go and makes an impression that brings visitors back.
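For items 6 and 7, the heavy lifting is a CSS3 media query. Here's an illustrative sketch (the 768-1024 pixel breakpoint matches common 2011 tablets; the class names and sizes are made up):

<style>
  @media screen and (min-width: 768px) and (max-width: 1024px) {
    .nav a {
      display: inline-block;   /* horizontal, touch-friendly navigation */
      padding: 14px 22px;      /* bigger tap targets, no zooming required */
      font-size: 1.2em;
    }
    .previous, .next {
      min-width: 120px;        /* easy-to-hit paging buttons (item 3) */
      min-height: 44px;
    }
  }
</style>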

Devices might be getting smaller, but search opportunities and consumer-to-site connections are expanding. You now have indexing for mobile sites, instant purchasing and sharing, and individually developed apps that remove users from the browser and place them directly into a niche online store, keeping their attention focused on the most important goal: conversion.

Thursday, October 20, 2011

Getting Social Traffic on the SERPs Without SEO

It’s coming up on a year since Matt Cutts announced that social signals are absolutely used as ranking factors for your content, but that’s not what I’m going to talk about here.

I’m going to talk about other ways that social can drive organic traffic to your site outside of links being retweeted or publicly shared by influencers.

Personalization

If you perform a search on Google for SEO, you'll see something similar to the image below (I stripped out the paid ads). Note the position of the various results, especially the third result, which is for a completely different definition of SEO.



Now what if I’m logged into Google? What will I see? Well, for me, it looked like this (again with the paid ads removed).



It looks a bit different. Suddenly, people I'm connected to through my Google account are showing up under results they've indicated they like, through some form of sharing.

The first two results are the same, but Aaron Wall's site SEOBook.com, which was sitting in fifth when I wasn't logged in, is now third for me. In fourth I now see the SEOmoz.org home page. When I wasn't logged in, there were articles from SEOmoz in 9th and 14th, but their home page wasn't showing until 21st position, way down the SERPs. Thanks to Keri, there's a much higher chance that I'll click on it, with it sitting 17 places higher than it normally ranks.

Everything else not shared by my network remains in the same order, although pushed down one or two spots based on their original position. So here social sharing within my group of contacts has changed my view of the SERPs and potentially my click behavior.

The same is also true of Bing. I did the search below for "NFL Picks"; the screenshot shows the results from the third text link down.



Then I logged into Facebook and did the same search, and you can see the results aren't too different, apart from the "Liked by your Facebook Friends" section, which suddenly surfaces a new video, a video that, when logged out, I don't see surfacing in the top 100 results. Once again, my social network has personalized my version of the SERPs to give some content a greater chance of being clicked on than it generally has.



Even where the search results haven't changed, I'm seeing the smiling faces of my Facebook friends sitting next to results that they've 'liked'. If these people in my social circle are people I trust, then the likelihood that I'll click on that content, rather than on the 'unliked' content ranking above it, increases.

Real Time Results

When Google had their Twitter deal (prior to discontinuing it in July of this year) they would show a scrolling list of results in the SERPs. If users saw something they liked, they could click on a link within a particular tweet and open up a new page.

While Google doesn’t have real time search in their results, they have said that it will be returning at some point soon, with Google Plus as an integral part.

Bing currently displays their social results on their social page, but it would be hard to say that they're "real time."



That said, around the time of the Emmys, I saw Bing conducting a test of a Twitter widget. This widget showed up on the right side of the SERPs, with a list of results scrolling upwards.



This widget was good because it was less obtrusive than breaking the SERPs up with a scrolling list in the middle, and because it also displayed the domain behind the shortened links. So no worrying about where that t.co link was going to take you. All in all, I found it to be a good solution for displaying real-time Twitter results, and therefore a good potential source of clicks on a SERP where your content may not be ranking organically.

Now, I’ve not seen this widget again since that weekend, and I don’t know whether it’s something they’re still testing, but it shows that both major search engines are trying to come to terms with the best ways to show social results in search results.

Wednesday, October 19, 2011

8 Durable SEO Elements


Recent changes to Google's search ranking algorithms and highly publicized search penalization of well-known brands have caused much angst in the SEO community. However, Google's changes to its algorithms aren't new.

Seasoned SEOs have experienced major algorithm changes over the last decade. In fact, Google has confirmed making on average at least one change per day to its search algorithms. Clearly change is here to stay.

How do you minimize the impact of these inevitable changes? Focus your efforts on the durable elements of SEO. These are the cardinal elements of SEO that will stay true regardless of Google’s ongoing improvements to its algorithms.

These eight durable elements will withstand the test of time because they are aligned with Google’s explicit statements and implicit motivations. Here’s a rundown of these elements and practical steps you can take to align your website with them.

1. Basic On-Page SEO Rules

The basic keyword SEO rules per the Google Webmaster’s SEO Guide are easy to communicate and easy to follow. Google wants you to follow these rules, because it makes crawling easier for their bots, which leads to better results for users, and in turn, helps Google with market share.

Make sure your pages are optimized for these basic SEO rules. Include target keywords in your URLs, titles, meta tags, H1s, etc. Determine which pages rank for which keywords. Then use the right keywords to double down your efforts on these pages.
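As a quick illustration (the keyword and URLs are invented), a page targeting "blue widgets" would carry the keyword in each of those slots:

<!-- URL: http://www.example.com/blue-widgets/ -->
<head>
  <title>Blue Widgets - Example Co.</title>
  <meta name="description" content="Compare blue widgets by size, spec, and price." />
</head>
<body>
  <h1>Blue Widgets</h1>
  ...
</body>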

2. Inbound Quality Links

An inbound link from an external website to your website signals to Google that the content it’s pointing at is relevant to the subject surrounding that link. Quality inbound links aren’t easy to amass and they require your content to be good enough for someone else to reference. For these reasons, quality inbound links are believed to have the biggest impact on ranking.

Continue building your inbound links through white hat content syndication and promotions. Create unique content that others will want to reference and link to. Build relationships with other web writers, so they are aware of your content.

3. Authority

Google’s Internet is a meritocracy and authority is essentially your assigned status within this order. To date, Google has confirmed using only Google PageRank and social media authority as the two types of authority signals in their ranking algorithm.

A website’s authority is factored into rankings. There will always be those who attempt to game the system rather than advance on merit, so diminished authority is one way Google punishes fraud.

Build your authority by creating unique content and follow white hat practices. Build relationships with authoritative sites and users. Pursue links from them to benefit from their authority too.

4. User Experience

Google wants their users to be satisfied. If Google gives high ranks to bad websites, with terrible designs, slow page load speed, and difficult-to-navigate pages, users will look for alternative search solutions.

A fast page load speed and a low bounce rate are two measurable indications of a good user experience. Additional qualitative measures include a nice design, easy navigation, and a good UI.

Basically, anything that makes a user remain on your site and read more. Keep your site updated, measure your bounce rate and page load speed, and create clear alignment between the keywords you target, your content and the experience you’re creating for your visitors.

5. Freshness

If relevancy is the name of the game, then freshness is its nickname. The world is evolving and so is the web, and freshness of content will always be an indication of a site keeping up with the world. Freshness is a supporting factor to a site’s authority and plays a huge part in news ranking.

Keep creating new content as well as updating existing content. Link to it internally from indexed pages, and syndicate it on social media. Register your site and submit a sitemap to Google. If you have a news site, register to be on the Google News Index.
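A sitemap is only a few lines of XML (the URL and date below are placeholders), and its lastmod field doubles as a way to signal that existing content has been refreshed:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/fresh-article/</loc>
    <lastmod>2011-12-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>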

6. Diversity

To borrow from the capital investment world, diversification minimizes your systematic risk, allowing you a higher overall return on investment, and it would appear Google follows this rule. Google prefers diversity in the format and content of its results.

Use multiple formats to deliver your content on the same page – video, presentations, images – and publish unique pages for diverse content to optimize for long tail terms. Submit a unique sitemap to Google for formats other than text.

7. Feedback

In order to gauge its effectiveness, Google must collect feedback on its work. From machine input, like click-through rate and bounce rate, to human input, like human evaluators, Google's "manual intervention," and Google +1, Google will keep collecting feedback to improve its results.

Use only white hat tactics, be relevant and optimize your pages to the actual content on them. In addition, syndicate, promote and give people an easy way to share their feedback.

8. Compatibility with Updated Technology

Being compatible with new technologies, such as new browsers, is important to the overall user experience. Staying fresh is always good practice when it comes to staying on top.

As an example, consider mobile devices. With the emergence and rapid growth of the mobile market, we can see how and why, in the future, Google will discount sites that aren't mobile device-compatible.

Make sure your site is up-to-date with the latest mainstream technology. For example, verify that your mobile site is indexed by Google and submit a mobile sitemap.