Friday, December 30, 2011

Google Panda Update: Say Goodbye to Low-Quality Link Building

A while back, I wrote about how to get the best high-volume links. Fast-forward eight months, and Google has made two major changes to its algorithm -- first to target spammy/scraper sites, then the larger Panda update aimed at "low quality" sites. On top of that, Google penalized JCPenney, Forbes, and Overstock.com for "shady" linking practices.

What does it all mean for link builders? Well, it's time to say goodbye to low-quality link building altogether.

'But The Competitors Are Doing It' Isn't an Excuse

This may be tough for some link builders to digest, especially if you're coming from a research standpoint and you see that competitors for a particular keyword are dominating because of their thousands upon thousands of pure spam links.

But here are two things you must consider about finding low quality, high volume links in your analysis:
1. Maybe it isn't the links that got the competitor where they are today. Maybe they are a big enough brand with a good enough reputation to be where they are for that particular keyword.
2. If the above doesn't apply, then maybe it's just a matter of time before Google cracks down even further, giving no weight to those spammy backlinks.

Because, let's face it: you don't want to be the SEO company behind the next Overstock or JCPenney link-building-gone-wrong story!

How to Determine a Valuable Backlink Opportunity

How can you determine whether a site you're trying to gain a link from is valuable? Here are some warning signs that Google may already have deemed a site low quality, or may eventually do so.

>> Lots of ads. If the site is covered with five blocks of AdSense, Kontera text links, or other advertising chunks, you might want to steer clear of it.

>> Lack of quality content. If you can get your article approved immediately, chances are this isn't the right article network for your needs. If the article network is approving spun or poorly written content, it will be hard for the algorithm to see your "diamond in the rough." Of course, when a site like Suite101.com, which has one hell of an editorial process, gets dinged, then extreme moderation may not necessarily be a sign of a safe site either (in their case, ads were the more likely issue).

>> Lots of content, low traffic. A blog with a Google PageRank of 6 probably looks like a great place to spam a comment. But if that blog doesn't have good authority in terms of traffic and social sharing, it may be put on the list of sites to be devalued in the future. PageRank didn't save some of the sites hit by the Panda update; several of the affected sites had a PageRank of 7 or above (including one PR 9).

>> Lack of moderation. Kind of goes with the above, except in this case I mean blog comments and directories. If you see a ton of spammy links on a page, you don't want yours to go next to it. Unless you consider it a spammy link, and then more power to you to join the rest of them.

What Should You Be Doing?

Where should you focus your energy? Content, of course!

Nine in 10 organizations use blogs, whitepapers, webinars, infographics, and other high-quality content for link building and to attract natural, organic links. Not only can you use your content to build links, but you can also use it to generate leads by proving the business knows its industry inside and out.

Have You Changed Your Link Building Strategy?

With the recent news, penalties, and algorithm changes, have you begun to change your link building strategies? Please share your thoughts in the comments!

Thursday, December 29, 2011

Web Analytics Year in Review 2011

Around the same time last year, we discussed how businesses were finally investing heavily in the tools, people, and processes required when operating data-driven organizations.

This year, an Econsultancy report estimates the UK web analytics technology and services sector alone to be worth more than £100 million annually. If we assume that figure scales with GDP, the global web analytics technology and services sector would be worth well above $4 billion.
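If you want to sanity-check that extrapolation, here's a quick back-of-envelope version in Python. The GDP figures and exchange rate are rough 2011 approximations I'm assuming for illustration; they are not from the Econsultancy report.

# Rough extrapolation of the UK web analytics market to a global figure
# by GDP share. GDP values and the exchange rate are approximate 2011
# assumptions, not figures from the Econsultancy report.
uk_market_gbp = 100e6      # "more than £100 million" per year
gbp_to_usd = 1.6           # rough 2011 average exchange rate
uk_gdp_usd = 2.4e12        # approximate UK GDP, 2011
world_gdp_usd = 70e12      # approximate world GDP, 2011

uk_share_of_world_gdp = uk_gdp_usd / world_gdp_usd            # roughly 3.4%
global_market_usd = (uk_market_gbp * gbp_to_usd) / uk_share_of_world_gdp
print(f"Implied global market: ${global_market_usd / 1e9:.1f} billion")  # about $4.7 billion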

But as with anything web analytics related, concentrating on the numbers is sometimes less important than the trend! Total spend on internal staff, third-party agencies, and vendors appears to have grown by 12 percent year over year, which is certainly in the realm of "significant".

These were the top stories and trends of 2011.

Online & Offline Data Integration

What good is online intelligence without offline context? The integration of online and offline data was a focus for many organizations in 2011 because without this connection, it’s hard to understand the online contribution of marketing, channel of preference for task-level customer and prospect interaction, and customer satisfaction across channels. Without making this connection, it is nearly impossible to optimize online experience for lifetime value.

Social Media Analytics

Social media analytics diversified in 2011, with a new emphasis on business requirements. Many vendors and agencies broadened their service portfolios to cater to varied business and social media goals.

The industry gained a little clarity this year when several vendors started clearly categorizing their social media analytics into several use cases such as:

>> Monitoring and trend analysis.
>> Sentiment analysis and reputation management.
>> Workflow management.
>> Integrated social insights.

Although this sub-sector of analytics is far from mature, several large-scale companies are taking major steps to bridge the gap between social media analytics and cross-channel product offerings. Look for significant moves in this area for 2012.

Omniture SiteCatalyst 15 Launches

Adobe announced the launch of Omniture SiteCatalyst 15 at the Omniture Summit in March this year. For those of us fortunate enough to be in attendance, it felt as if we were strapped into a fighter jet and just engaged afterburners. Adobe has done a great job integrating Omniture into their product portfolio, and the wow-factor for their presentation was nothing short of awe-inspiring.

I’ve always had a healthy love-hate relationship with Omniture, so luckily for them the hype associated with V15 was warranted! Some of my favorite features include real-time segmentation, a new bounce rate metric, ad-hoc unique visitor counts, and a new processing rules feature that makes server-side implementation tweaks very easy.

Salesforce.com Buys Radian6

Salesforce.com bought Radian6 for $326 million and brought cloud computing to a whole new level. What I like most about this deal is how naturally this acquisition can be folded into Salesforce’s CRM product.

‘Super Cookies’

Unfortunately it’s not all good news this year, as several companies (most notably Kissmetrics) were the recipients of some serious bad press and legal action for use of so-called “Super cookies” in July. These Flash-based cookies were blamed for a number of privacy concerns including cross-domain and cross-client visitor identification and re-spawning traditional cookies after being cleared from user browsers.

Mobile Analytics

This year marked the dawn of mobile analytics, especially after Apple rewrote their third-party tracking policies towards the end of 2010. As the mobile market continues to mature with increased pressure from the almost limitless supply of new Android handsets and operating systems, look for mobile analytics to take a larger share of attention in 2012.

Google Analytics Real-Time

Google Analytics Real-Time debuted in the fall of this year, enabling millions of site owners across the globe to watch user interaction as it happens, which is an exciting prospect for many. Although this feature set has been available for a while from vendors such as Woopra, it's remarkable that Google would offer such a robust feature at no cost.

Google Encrypts Search Data

Almost immediately after the positive sentiment from the introduction of real-time analytics had tapered off, Google decided to test the waters with a carefully measured negative announcement: it would be removing search query parameters for users of its secure (SSL) search results. The news didn't go over well with the online marketing community, and to this day the analytics community is still relatively sore on the subject, so don't bring it up with your web analyst at the holiday party.

Google Chrome Passes Mozilla Firefox

More good news for Google surfaced in November when Google Chrome surpassed Mozilla Firefox in global browser share for the first time in history. Although it is too soon to tell what the effect will be on the analytics industry, one thing is certain: make sure your quality assurance and browser compatibility testing covers all three major browsers.

Here’s to safe and happy holidays and a prosperous New Year!

Friday, December 23, 2011

How Advanced Marketers Will Use Facebook in 2012

As digital marketers, we’re frequently reminded magic formulas don’t really exist. Still, our experimentation and experiences often lead to insights about “what’s next.” Hopefully, the following insights and sample tools mentioned in this article will inspire your consideration (and actions) for 2012.

What Happened in 2011

For most brands, perhaps the predominant focus of Facebook marketing in 2011 was growing the fan base. We saw a variety of custom Facebook applications (tabs) paired with Facebook ad buys, where requiring a Like (becoming a fan of the page) was the first or even the final call to action.

As a result, some of the most common questions emerging were:
>> What’s the value of a Facebook fan?
>> How many Facebook fans should we have?
>> Now that we have these fans, what should we do with them?
>> What can we be doing with Facebook outside of Facebook?

And honestly, many have even asked, “why are we doing this again?”

It’s The Data, Stupid

If you’re saying, “oh no, not another discussion on analytics or the latest changes in Facebook Insights,” fear not. This discussion goes beyond tracking simple key performance indicators (KPIs) within some marketing dashboard that spits out monthly reporting with +/- percentages.

On the contrary, it goes straight to the core of how companies can use a new breed of tools leveraging Facebook data to dramatically improve advertising results, content creation and overall business strategies. For the sake of brevity, we’ll take a quick look at two tools in particular: CalmSea and InfiniGraph.

CalmSea

CalmSea is a technology platform that enables you to create a conversion-based offer that can be accessed via a website, email, tweet, mobile device or Facebook page. As an example, let’s consider a coupon.

Normally, the basic data you would expect to collect with an online coupon might consist of clicks, shares and redemptions. Of course, you may also collect some demographics – or even additional data, depending on form-related entries required of the user in order to get the coupon.

The trick with CalmSea lies in an extra click that prompts you to grant Facebook authorization in exchange for access to the coupon (or other offer). This authorization requests three or four Facebook permissions, which provide the CalmSea platform with multiple data points specific to your social graph (likes, interests, demographics, friends, etc.).

All of this activity can take place on any web page, and you can even share the coupon with others on Facebook without ever actually visiting your Facebook page.
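To make the data exchange a little more concrete, here's a minimal sketch of the kind of Graph API calls an offer platform could make once a user has granted those permissions. This is my own illustration of the general mechanism, using Python and the public Graph API endpoints of the time; it is not CalmSea's actual implementation, and the access token would come from Facebook's OAuth dialog, which is omitted here.

import requests

GRAPH = "https://graph.facebook.com"

def fetch_social_profile(access_token):
    # Pull the kinds of social-graph data points an offer platform could
    # aggregate once a user authorizes it: basic profile, likes, and friends.
    # Illustrative only; assumes a token obtained via Facebook's OAuth dialog
    # with permissions such as user_likes.
    params = {"access_token": access_token}
    profile = requests.get(f"{GRAPH}/me", params=params).json()
    likes = requests.get(f"{GRAPH}/me/likes", params=params).json().get("data", [])
    friends = requests.get(f"{GRAPH}/me/friends", params=params).json().get("data", [])
    return {
        "name": profile.get("name"),
        "gender": profile.get("gender"),
        "liked_pages": [page.get("name") for page in likes],
        "friend_count": len(friends),
    }

Combine attributes like these with conversion tracking on the offer itself, and you have the raw material for the segment-level reporting described in the next section.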

When I spoke to Vivek Subramanian, VP of Products for CalmSea, he said they are seeing upwards of a 70 percent acceptance rate on the permissions authorization for branded apps (which could include coupons, sweepstakes, private sales, group buys and more).

The Power of The Data

CalmSea takes the Facebook user interactions and news feeds around the given offer – then combines that data with purchase/conversion analytics (could be Google Analytics) to aggregate and display insights on segments of users/customers with the highest levels of:

>> Engagement
>> Profitability
>> Influence

This kind of data goes beyond Facebook Insights, in that it enables you to build predictive models based on distinct attributes that best describe current and potential customers with respect to the three items listed above.
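As a toy illustration of what building models on "distinct attributes" might look like, the sketch below simply computes conversion rate and average revenue per social attribute, so you can see which likes or interests best describe your most engaged and profitable segments. This is my own simplified example, not CalmSea's methodology.

from collections import defaultdict

def segment_stats(users):
    # users: list of dicts such as
    #   {"likes": ["Running", "Yoga"], "converted": True, "revenue": 40.0}
    # Returns conversion rate and average revenue per social attribute, a
    # crude proxy for the engagement/profitability segments described above.
    stats = defaultdict(lambda: {"users": 0, "conversions": 0, "revenue": 0.0})
    for user in users:
        for attribute in user["likes"]:
            bucket = stats[attribute]
            bucket["users"] += 1
            bucket["conversions"] += int(user["converted"])
            bucket["revenue"] += user["revenue"]
    return {
        attribute: {
            "conversion_rate": s["conversions"] / s["users"],
            "avg_revenue": s["revenue"] / s["users"],
        }
        for attribute, s in stats.items()
    }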

[Figure: CalmSea dashboard showing social insights alongside purchase data for users who authorized the offer]
In the figure above, you can get a slight feel for CalmSea’s dashboard, which demonstrates, among other items, the ability to view social insights compared to purchase data insights on users who have authorized the offer.

Depending on your role in the company (media buyer, content creator, channel partner/affiliate manager, etc.) this kind of data ideally improves how and where you spend your time and money.

The initial offer you develop with a platform like CalmSea will likely have a consistent conversion rate with similar offers you may have conducted in the past. It’s the offers that follow, leveraging the data collected from your first use of the platform, that stand to produce significantly improved results.

InfiniGraph

The InfiniGraph platform aggregates Facebook and Twitter data for the purpose of identifying relevant (real-time) affinities, content and interests that are trending around a particular brand, product or industry. There are two key considerations with respect to how this platform’s output produces actionable value:

Improved performance on your Facebook ads: Gives you insights into new interests/keywords you should be targeting as part of your selection process within Facebook’s ad platform.

Insights to assist with content creation and curation: Gives you a clear picture and delivery mechanism for content that is trending via a content “Trend Score” that algorithmically combines likes, comments, clicks, retweets, and shares (a rough sketch of such a score appears below).
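InfiniGraph hasn't published its formula, but to give you a feel for the idea, a trend score of this kind could be a recency-weighted combination of those engagement signals. The weights and decay below are made-up illustrative values, not InfiniGraph's actual algorithm.

import math

# Illustrative engagement weights; InfiniGraph's real weighting is not public.
WEIGHTS = {"likes": 1.0, "comments": 2.0, "clicks": 1.5, "retweets": 2.5, "shares": 3.0}
HALF_LIFE_HOURS = 24.0  # assumed decay so fresher posts trend higher

def trend_score(post, hours_since_posted):
    # post: dict of engagement counts, e.g.
    #   {"likes": 120, "comments": 14, "clicks": 300, "retweets": 8, "shares": 22}
    raw = sum(WEIGHTS[signal] * post.get(signal, 0) for signal in WEIGHTS)
    decay = math.exp(-math.log(2) * hours_since_posted / HALF_LIFE_HOURS)
    return raw * decay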

InfiniGraph’s approach to identifying content that’s trending on Facebook, in particular, provides a level of opportunity that is certainly missed by many brands wishing to dive deeper into content strategy (check out the Digital Path to Social Media Success to view the four kinds of content you could be addressing).

To describe how this works, imagine a series of Facebook status updates that are posted about subject matter relevant to your fans (on your Facebook page or another Facebook page your fans follow).

[Figure: InfiniGraph sample showing trending status updates, their post dates, and the engagement each received]
In the sample from InfiniGraph above, you can see the dates these status updates were posted, in addition to the enormous amount of engagement they received. Here’s the problem: think of how many fans of this page would also be interested in this content but simply didn’t see it. Now think of how quickly those status updates will slide down the page and disappear.

As Chase McMichael, President of InfiniGraph, told me, "Humans can’t keep up with trending content, nor can they see how content trends across multiple Facebook pages containing fans with similar interest."

McMichael alludes to "crowdsourcing" of the human voice around collective interests and actions. Not only can this aid in the repurposing of content otherwise lost, but as McMichael so eloquently puts it: "you can know where to double-down from a media buying perspective. Who needs comScore when you have a resource that is guiding you where to advertise based on what a large audience is in essence telling you?"

Wrap-up

Although the summaries on these platforms don’t do them justice, my hope is you’ll be inspired to dig deeper regarding the possibilities they offer. It will be interesting to see how Facebook will continue enabling access to data, but I think it’s a safe prediction that advanced marketers will leverage it to the hilt.

On a final note, I’ll bid farewell to 2011 with my favorite quote of the year:

"Data will become the new soil in which our ideas will grow, and data whisperers will become the new messiahs." – Jonathan Mildenhall, VP of Global Advertising Strategy at Coca-Cola

Thursday, December 8, 2011

New Tagging Suggests Google Sees Translated Content As Duplicates

Last year, Google launched meta tags for sites where a multilingual "template (i.e., side navigation, footer) is machine-translated into various languages" but the "main content remains unchanged, creating largely duplicate pages." This week, they have gone a step further, adding the ability to differentiate between regions that speak basically the same language with slight differences.

As with the canonical tag, implementation falls to website owners, in order to get "support for multilingual content with improved handling for these two scenarios:
1. Multiregional websites using substantially the same content. Example: English webpages for Australia, Canada and USA, differing only in price.
2. Multiregional websites using fully translated content, or substantially different monolingual content targeting different regions. Example: a product webpage in German, English and French."

This tagging is interesting and suggests Google knows when the content on a site is duplicate despite it being in a different language. Does Google's system have the ability to translate stored content, or just to recognize words that are used in the same language but differ by region? If I use "biscuit" on my UK or Australian sites in place of "cookie", does Google know they are the same word?

"If you specify a regional subtag, we'll assume that you want to target that region," Google tells us.

Is duplicate content now being measured across similar terms? Or are the tags a way to have website owners limit the pages Google indexes for regional areas? Do we add the tags, and Google thins out the pages we have showing in the SERPs for different regions?

Google shared some example URLs:
>> http://www.example.com/ - contains the general homepage of a website, in Spanish
>> http://es-es.example.com/ - the version for users in Spain, in Spanish
>> http://es-mx.example.com/ - the version for users in Mexico, in Spanish
>> http://en.example.com/ - the generic English language version

On these pages, you can use this markup to specify language and region (optional):
>> <link rel="alternate" hreflang="es" href="http://www.example.com/" />
>> <link rel="alternate" hreflang="es-ES" href="http://es-es.example.com/" />
>> <link rel="alternate" hreflang="es-MX" href="http://es-mx.example.com/" />
>> <link rel="alternate" hreflang="en" href="http://en.example.com/" />

It seems many wouldn't bother installing the tags unless Google were to start dropping pages, or unless the implementation helps improve regional rankings for pages where publishers have gone that extra step and customized their content for specific regions and subtle language differences.

The hreflang attribute has been around for quite some time; the W3C discussed it in 2006 and includes it in its "Links in HTML documents" list. Adding this information to the head tag seems to be a new twist. How Google uses the information for ranking will really determine whether people adopt it.

Tuesday, December 6, 2011

SEO is Both Science and Art

For many people who aren’t involved in search engine optimization (SEO) on a regular basis, it’s easy (or so they think). You simply create a website, write some content, and then get links from as many sources as you can.

Perhaps that works. Sometimes.

More often than not, the craft of SEO is truly a unique practice. It is often misunderstood and can be painfully difficult to staff for. Here’s why.

SEO is Science

By definition, “Science” is:

1. a branch of knowledge or study dealing with a body of facts or truths systematically arranged and showing the operation of general laws: the mathematical sciences.
2. systematic knowledge of the physical or material world gained through observation and experimentation.
3. any of the branches of natural or physical science.
4. systematized knowledge in general.
5. knowledge, as of facts or principles; knowledge gained by systematic study.

Anyone who has performed professional SEO services for any length of time will tell you that we have practiced each of the above at one point or another. In some cases, changes in our industry are so rapid that we crowdsource the science experiments among peers (via the WebmasterWorld or Search Engine Watch forums).

Unfortunately, Google doesn’t provide step-by-step instruction for optimization of every single website. Every website is unique. Every optimization process/project is unique.

Every website represents new and interesting optimization challenges. All require at least some experimentation. Most SEOs follow strict methods of testing/monitoring/measuring so that we know what works and what doesn’t.

We have a few guidelines along the way:

1. Our “branch of knowledge” is well formed in what Google provides in their Webmaster Guidelines and SEO Starter Guide.
2. Our unique experience. Just like you might “learn” marketing by getting your bachelor’s degree in marketing, you really aren’t very good at it until you’ve worked in your field and gained real-world experience. There are so many things you can read in the blogosphere regarding SEO that are complete crap. But if you didn’t know any better, you’d buy into them because “it sounds reasonable, so it must be true!” So, be careful not to claim something is 100 percent “true” unless you have enough “scientific” evidence to back up the claim. Otherwise, it’s called a “hypothesis”:
a. A supposition or proposed explanation made on the basis of limited evidence as a starting point for further investigation.
b. A proposition made as a basis for reasoning, without any assumption of its truth.

SEO is Also Art

By definition, art is:

the conscious use of skill and creative imagination especially in the production of aesthetic objects

I've worked with and befriended many incredibly bright SEOs in my years in this business. It is those who manage to blend scientific skill with creative thinking about how to experiment with and improve programs who are the gems.

Getting creative with SEO is thinking of how a marketing program can encompass social, graphic design, link building, content generation, and PR to drive toward a common goal.

Getting creative with SEO is also about reworking a website’s design/code so that usability and accessibility improve, while maintaining brand guidelines and keeping with “look and feel” requirements, yet improving SEO.

Every day, we must get creative in determining how to best target keywords by determining which method of content generation gives us the best chance at gaining a presence in the search engines and – most importantly – engaging our audience.

Should we write a blog post? Should this be best handled in a press release? How about a video? Infographic? New “corporate” page on the site? There are a multitude of ways that we might determine to target a keyword via content.

The Perfect SEO

Today’s SEO is so much more involved than SEO of years past. When I hear people saying that they’re trying to determine if they should hire an in-house SEO or an agency, I will give them the pros and cons of each (and there sincerely are pros and cons of each).

But one factor that I believe leans toward the strength of an agency is that there's typically going to be a team of individuals, each with a unique skill set. These individuals can share examples of what works and what doesn't (scientific experiments often occur), bounce creative thoughts off one another, and collectively provide more value than any one person might.

Our industry needs more of these highly skilled "freaks of nature" who blend both the scientific skill and artistic creativity of SEO.