What is the Google Hummingbird Update?

No, Google hasn't started breeding hummingbirds; we're talking about the name given to its recent update to its search engine algorithm. And although the hummingbird is among the smallest of birds, this one has certainly created a huge amount of discussion and debate within the SEO and tech communities. So much so that, since the update rolled out around the end of September, it has even been featured in a Telegraph newspaper article.

So what’s all the fuss about and how big an impact is Hummingbird likely to have on SEO strategy?

What exactly is Hummingbird?

Essentially, Hummingbird is built around a technology called the Knowledge Graph. In a nutshell, the Knowledge Graph is Google's attempt at creating a more intelligent search engine, one that tries to get inside the mind of the person carrying out a search to better understand what type of information they're really after. Because of the huge volume of digital content that now exists on the web, Google's users are having to be more specific in the way they search: they're typing longer phrases into Google and, in some cases, completely rephrasing their query when Google was way off the mark first time round. All of this is frustrating and time consuming for Google's users, which isn't great for a brand that now has shareholders to keep happy!

So what Google is now capable of doing, much like a human mind, is making connections between items and answering complex questions. It does this through its semantic search capabilities; in layman's terms, that means Google analyses all of the words used in a search query to better understand the true meaning of the phrase we've typed in. This is a subtle move away from simply providing a set of search results and web pages towards delivering meaningful answers. For example, Hummingbird gives greater weight to question words like "how", "why", "where" and "when" in search phrases.

Another feature built into the Hummingbird update is conversational search, or 'hot wording' as Google calls it.

I tried it out myself and asked Google "Where is my nearest GP?", and to my amazement Google served the 'Find Services' page on the NHS website at the top of its search results: exactly the type of page I'd need if I were indeed looking for my nearest doctor's surgery. So it does look like Google is able to intelligently connect the words and provide users with highly relevant content. Eventually, this technology might reach the point of understanding text on a more nuanced and human level, a scary and yet thrilling thought.

How does this impact SEO?

However, Google’s saying there’s nothing new or different SEOs or publishers need to worry about. Its best practice guidance remains the same: create original, high-quality content, but I do believe healthcare marketers now need to think beyond simple therapy/disease or product and service based keywords and think about the types of sentences they’ll be typing in. To do this well you really do need to understand your target audiences’ information needs and the different stages they go through as part of their decision making or purchasing process.

Marketers need to better understand the types of problems certain groups of patients and HCPs are looking to solve when they go online, and the common types of questions they seek credible answers to. Once they've done this, they need to think about the content itself and make sure their digital content is relevant to the search queries being 'Googled'. It's also important to determine which format is most likely to appeal to and engage those audiences when they first come into contact with that content (static web pages, videos, infographics, images or a combination of them all), as this will help to reduce bounce rates, which in turn helps secure high rankings.

One beneficial result of Hummingbird should be a more level playing field for smaller healthcare organisations that specialise in a particular aspect of healthcare or focus on a specific therapy/disease area. The high-volume, generic paid search keywords are often dominated by large multinational organisations with diverse brand portfolios and deep pockets that enable them to win the Google AdWords bidding war. But because their web content tends to be more generic, they are at a disadvantage when it comes to the less predictable nature of semantic search results. The Hummingbird update should enable smaller niche companies that are able to produce unique, informative and fresh content specifically relating to their niche to gain a higher ranking in the search results when a precise and complex search phrase is used.

However, one negative linked to Hummingbird is that as Google accelerates its move from keyword search to semantic search, it is encrypting all searches, which means it will no longer provide any data whatsoever within web analytics packages on organic keyword referrals. For us marketers this means we are going to be completely in the dark when it comes to knowing which keywords are sending people to our websites and, more importantly, driving and assisting conversions from the organic search results. There's more on this topic in my other post titled "Google Analytics no longer providing organic keyword data".

Need help?

If your organisation has been hit by Hummingbird and needs expert help to better understand what changes you should be making to your healthcare SEO strategy to get your site performing well in the organic search results, then get in touch to request a Google Hummingbird Impact Assessment.

Google Analytics no longer providing organic keyword data

What does (not provided) mean in Google Analytics?

Many of you may have noticed that for some time now Google Analytics has been showing the "(not provided)" message in the keyword data section. And while the (not provided) tag was annoying, we still had enough organic keyword data to help us assess the effectiveness of our SEO strategy and measure visits and conversion rates from organic search.

But on September 23rd this all changed quite significantly as Google moved to encrypted search, which means that all Google organic searches are now 100% secure. When a user goes to Google to search, they are automatically redirected to the https:// version of their Google domain of choice. This encryption means that Google no longer shares any keyword data with website owners, regardless of whether a user is logged into their Google account when conducting a search.

This is great from a privacy perspective, and Google has been making major steps to help protect everyone's privacy, but I remain highly suspicious as to why Google has done this for organic searches only. It is still happy to provide website owners with keyword data if they are using Google AdWords to attract visitors to their site. So Google, the money-making machine with shareholders wanting increasing profits and a quick return on their investment, fully understands the need for website owners and SEOs to be able to track the effectiveness and commercial value of individual keywords. And now the only way we can do that is to test the value of key phrases using paid ads in the Google sponsored listings. Pure genius: another quick way Google can increase its revenues and profits. I can't help but question whether Google's informal corporate motto, "Don't be evil", still rings true amongst the Board, given these recent changes.

So what can we do to still understand the value certain keywords have in driving good-quality visitors and conversions? Here are some of the tactics we've been adopting to get around the issue:

  • Looking at non-Google keywords. OK, a fairly obvious one I know, and although Bing and Yahoo collectively have less than a 7% share of the search market, you can make the assumption that visitors coming from Bing or Yahoo are, on average, using pretty similar search terms to those of Google users. The problem is that if your site does not get a great deal of traffic in the first place, you might struggle to obtain enough good-quality data to make an informed decision.
  • Analysing your Webmaster Tools data. At present Webmaster Tools does include search data from encrypted searches, but only for the last 90 days, so start exporting and saving that data for analysis.
  • Analysing on-site searches. If your site has its own search facility then you can use Google Analytics to capture and analyse data from it. You'll then be able to see which terms users are typing into your site search tool, which will give you an insight into how they search and the various words they use in their phrases.
  • Setting up test campaigns on Google AdWords. Using Google AdWords to test the effectiveness of certain key phrases is actually something we recommend our clients do before they embark on an SEO campaign if they have no existing keyword data to analyse. This approach can help to reduce the risk of targeting the wrong phrases from day one. Because SEO does not deliver quick and immediate results, you don't want to commit resource and budget to a keyword strategy only to find several months later that, despite improved rankings for the keywords you focused on, you're still no better off from a customer acquisition perspective. Using the Google Keyword Planner you can easily determine search volume and estimated clicks (on ads) for particular keywords, which will help you formulate your initial test keyword list.
  • Looking at historical data. Our search behaviour has not changed that much, so there's still a lot of value in the pre-encryption search data that still resides within Google Analytics. You can check the data for any seasonal differences worth noting, and also review bounce rates, conversions and assisted conversions for various keywords.
  • Using Google Trends. Google Trends is a favourite tool of mine and one I use quite regularly, as it gives me a better insight into which keywords are trending right now. So if you notice a huge spike in traffic and you suspect it could be something newsworthy or trending, but the majority of it is "(not provided)", head over to Google Trends and it might give you an idea of what exactly is trending and bringing you the extra traffic.
  • Setting up filters in Google Analytics. Most marketers are not aware of the full range of features available to them in Google Analytics, and setting up custom filters is one great way to really understand how your website and marketing campaigns are performing. You can set up a filter for all your "(not provided)" traffic so that it shows you the landing page for each of those "(not provided)" referrals. So even though you might not know the exact keyword that's bringing in the visits, you can at least see which page each visitor landed on. You can then look at the keywords you've used in your title tag and in the on-page content, and that should give you some idea of which search phrase might have brought them to that page (see the sketch below this list).
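
As an illustration of that last tactic, here's a rough sketch of the kind of advanced profile filter that's been widely shared for this purpose. Treat it as a starting point rather than a definitive recipe: the exact field labels can differ between Google Analytics versions, so double-check them in your own account and always test on a duplicate profile first.

Filter Type: Custom filter > Advanced
Field A -> Extract A: Campaign Term = (not provided)
Field B -> Extract B: Request URI = (.*)
Output To -> Constructor: Campaign Term = np - $B1

With something like this in place, the organic keyword report shows entries such as "np - /services/physiotherapy" instead of one anonymous "(not provided)" row, so you can at least see which landing pages the hidden keywords are driving traffic to.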

Need help?

If you’re struggling to get to grips with the encrypted search issues in Google Analytics and need expert help to better understand what changes you should be making to your web analytics strategy to help you measure your digital performance more effectively then get in touch to request a Google Analytics Assessment.

How Google’s ‘Penguin’ is changing the face of SEO

Google’s latest update to its algorithm, dubbed ‘Penguin’, was only released on 24th April, but has already had a huge impact on hundreds of companies over the last few months and the way in which they strategise their search engine optimisation efforts.

The update is designed to counter web spam and over optimisation, by penalising sites that employ these underhand techniques with a significantly lower ranking in the Search Engine Results Pages (SERPs).

Google’s Head of Webspam, Matt Cutts, announced Penguin in a blog post, which explained the principles on which the update is based. In particular, he emphasised the differentiation between ‘white hat’ SEO and ‘black hat’ web spam, highlighting the benefits of the former and the pitfalls of the latter.

The blog post stressed the importance of maintaining a focus on high-quality content and user-focused pages under the principles of 'white hat' SEO, and avoiding any 'black hat' techniques such as keyword stuffing, link schemes or unoriginal, duplicated content.

The online pharmaceutical sector is likely to be as affected as any other area of business, and so it is important for digital marketers in the industry to fully understand the Penguin update, and be aware of what they need to do to prevent their company from being affected – by Penguin or any of its successors. Cutts estimates the update will only affect around 3% of queries negatively, so as long as digital marketers follow and maintain the ‘white hat’ SEO guidelines, it is unlikely that there will be a detriment to their company’s online presence.

If your healthcare, medical or pharma website has been hit by the Google Penguin update then get in touch to learn more about how we can help you fix the SEO problems you might be facing.

Google’s Venice update – will your healthcare website sink or swim?

Google is in the process of rolling out its new Venice update, part of its "40 changes for February" project. So what is the Venice update and how will it impact your healthcare SEO strategy?

In a nutshell ‘Venice’ is all about Google serving up more locally relevant websites in its search results, well that is what Google is trying to achieve, but from my initial experience the accuracy of the results are pretty poor and I had to physically alter some of the search settings before I started to see more relevant results.

Here’s an example…

Yesterday I ‘Googled’ “osteopath” and straight away I noticed that Google was serving up quite a few listings from its Google Places entries along with a map of where those osteopaths are based. Below those results there were still some generic, non-local standard organic results too. Now you might think this is great, because you’re only seeing results showing osteopaths located near to you, as you are not going to be interested in learning more about an osteopath located some 100 miles away from you. However, when I carried out the search, Google seemed to think I was based in Liverpool, which I’m not, I actually live in East Sussex. Google was therefore serving up a list of osteopaths situated in the wrong area! Totally useless for me and frustrating also, so why did this happen? Basically Google uses a number of methods to detect where a user is based – most notably, the user can set their default location in their search preferences, but if that is not set then Google will also look at GPS, Wifi information and IP address and to some degree past search history to try and determine your location.

For me the IP route failed, so I had to alter the Google search preferences and change my location (I suspect most searchers won't automatically think of doing this), and finally I started to get a list of more relevant local results. Next, I carried out the same Google search on my iPhone, and this time Google thought I was based in London, so again I had to update my location in the search preferences.

Local SEO Strategies for Healthcare Websites

So, what impact will this new update have on your SEO strategy? It means it's more important than ever for both smaller local healthcare businesses and national businesses with multiple branches dotted across the country to ensure that, firstly, they have created and optimised a Google Places listing. Secondly, I would recommend that you identify the types of searches relevant to your business and carry out a series of searches to see if Google is triggering the Venice algorithm for those phrases. Then you need to check that the on-page elements of your web pages are optimised for those target key phrases but also include a location, e.g. "Osteopaths in High Wycombe, Bucks". For a smaller healthcare business based in just one area, you can easily do this by optimising top-level pages such as the home page or product and service pages. However, if you are a national business then you are going to need to think about how you can incorporate multiple location pages into your site so that you can optimise a page for each specific location. You need to approach this tactic with care though, as you don't want to end up creating pages with duplicated content, as this could trigger a penalty.
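
To make that concrete, the on-page elements of a location-optimised page for the example phrase above might look something like the sketch below. The clinic name, URL and wording are purely hypothetical placeholders; the point is simply that the target phrase and location appear in the title tag, meta description and main heading.

<title>Osteopaths in High Wycombe, Bucks | Example Osteopathy Clinic</title>
<meta name="description" content="Registered osteopaths in High Wycombe, Buckinghamshire, treating back pain, neck pain and sports injuries." />
...
<h1>Osteopaths in High Wycombe, Bucks</h1>

If you are a multi-branch business repeating this pattern for each location, make sure the body copy on each page is genuinely unique to that branch, for exactly the duplicate content reason mentioned above.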

If you’re healthcare website has been impacted by the Venice update and you want some help in developing a Google compliant local SEO strategy then drop us a line.

New Google search update raises privacy concerns

Two weeks ago Google changed the nature of search with its Search Plus Your World update which adds social media results to its search returns.

This evolution in the way search engines work is an acknowledgement of the fact that consumers continue to move their internet activity onto social media networks. While the update was highly controversial in the tech world, it is becoming increasingly clear that the modification could have major implications for pharma companies that have embraced Google's own social network, Google Plus, as a way to interact with the public.

On the positive side, early pharma adopters of Google Plus business pages will benefit from their content on the network gaining more traction in search returns, which should drive more traffic to their online portals. However, the update has also raised privacy concerns which could derail some companies' carefully conceived Google Plus marketing strategies.

The big appeal of Google Plus for pharma marketing was the network's Circles feature, which allows a separate communication network to be allocated to specific groups. This let companies set up a distinct network for each of their target markets, for example one for cancer, another for circulation, and so on.

Members of the public who joined the individual Circles were able to share experiences with others dealing with similar issues within a limited and empathetic environment. The problem with the Search Plus Your World update is that snippets of those conversations could now be subject to much wider distribution by appearing in search returns.

Pharma companies on Google Plus should be aware of the privacy issue and the effect of the new update.

Maximise your budget with effective campaign tracking

It can be hard to understand why something you can't see can be one of the most important elements of your campaign. But that's exactly the case with analytics: they're never visible to the user, yet the insights they provide can have a significant impact on what users do and how your website performs. Used properly, they give a detailed view of how your campaigns perform, where you should spend more, where you should spend less and how you can maximise your ROI. If you already know what they are and you've already got them, you can skip the first four headings below and dive straight into our top tips.

What do we mean by web analytics?

Web Analytics is the measurement, collection, analysis and reporting of Internet data for the purposes of understanding and optimizing web usage [1]. It basically means including a mechanism on your site that keeps track of where visitors come from, what they do on your site, where they go to next and then analysing the information to constantly improve your site.

Why are analytics important?

Simply put, if you’re not keeping track of how people are interacting with your site, you’re not making the most of it. In other words, you’re wasting your money. With the level of information available, you can discover valuable visitor trends that can dramatically influence the design, navigation, functionality and content of your site.

How do you know if you are using web analytics?

There are many different types and providers of analytics software but the most popular has quickly become Google Analytics (GA) [2]. It’s free so, even if you’re already using other analytics software, you should use GA too so you have access to its reporting suite.

To find out if you’re using GA, open a web page on your site and right click on an area of empty space (off to one side of the page for example). No matter which web browser you’re using, in the menu that pops up, select the ‘View page source’ or ‘View source’ option. You’ll then be presented with a new tab or window that contains the code behind the page – it’s in here we’ll find the code for GA if it exists. Press the Ctrl and F keys on your keyboard at the same time (Command + F if you’re on a Mac) to open the ‘find’ dialogue (we might be teaching you to suck eggs here) and search for ‘google-analytics.com‘. If you don’t find it, you’re not using GA so skip to the bit below that explains how to get it. The GA code has changed a lot over the years but if you do find the code, it should be contained within a line that looks like either of the lines below. If it is, you’re using GA:

Example 1:
document.write(unescape("%3Cscript src='" + gaJsHost + "google-analytics.com/ga.js' type='text/javascript'%3E%3C/script%3E"));

Example 2:
ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';

Tip: If it looks like example 1, your tracking code is out of date, so it's time to update it!
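
For reference, the newer asynchronous snippet generally looks something like the sketch below, placed just before the closing </head> tag. UA-XXXXX-X is a placeholder for your own web property ID, so always copy the exact code from your own Google Analytics account rather than from here:

<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXX-X']); // your own property ID goes here
  _gaq.push(['_trackPageview']);            // records the standard pageview
  (function() {
    // Load ga.js asynchronously so it doesn't block the rest of the page
    var ga = document.createElement('script');
    ga.type = 'text/javascript';
    ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
</script>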

How can you get web analytics?

As discussed above, Google Analytics is the most popular analytics software, so we recommend you install it before any others. It's free and easy, but you'll need to be able to edit your website templates to get it working. If you can't, talk to your webmaster or web agency, who will be able to do it for you.

To get the required code, visit http://www.google.com/analytics/ and either sign up or login using an existing Google account. Once logged in to the analytics interface, use the help centre search box and search for ‘tracking code’. Click the result for ‘Set Up the Tracking Code’ and follow the instructions. We’d explain how to do it here but the technique changes so it’s best to follow the instructions on the site.

Top Analytics Tips

The default statistics offered by Google are pretty thorough but with some small additions you can add extra valuable information to help you maximise the potential of your campaigns.

404 Tracking
404 is the error code returned by a web server when it can't find the page it's been asked for. Why is this valuable? There are plenty of reasons why people might be looking for a page that doesn't exist. It could be as simple as a mistyped address, or as serious as a typo in a print ad. If you don't know people are trying to find these pages then you can't fix the problem. And Google doesn't (can't) report these by default. How do you fix it? Get your webmaster to follow this guide:

http://www.google.com/support/analytics/bin/answer.py?answer=86927
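
In essence, the guide has you adjust the tracking call on your 404 error page so that each missing URL is recorded as a virtual pageview. A rough sketch of the idea, using the asynchronous ga.js syntax (the '/404.html?page=' path is just an arbitrary label chosen for illustration, not something Google mandates):

<script type="text/javascript">
  // On the 404 template, record a virtual pageview that captures
  // both the URL the visitor asked for and the page that sent them here.
  _gaq.push(['_trackPageview',
    '/404.html?page=' + document.location.pathname + document.location.search +
    '&from=' + document.referrer]);
</script>

You can then run a content report on pages beginning '/404.html' to see exactly which broken or mistyped URLs are being requested and where those visitors came from.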

Site Search Tracking
Most websites have a built-in search function to aid users' navigation of the site. If yours doesn't, you should seriously consider implementing one, as they're highly valued by visitors. Search is also highly valuable to you once you know the phrases people are searching for. These key phrases can provide excellent insight into visitor needs. You'll often discover recurring phrases that can, for example, reveal an area of the site you didn't realise was popular with visitors and should be promoted. You may discover a need for information that doesn't exist at the moment, or realise people are struggling to find particular pages. Users have become so familiar with search that it is often used as a first point of entry, so its value should not be underestimated. Follow these instructions to get up and running:

http://www.google.com/support/analytics/bin/answer.py?answer=75817
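
In practice, the setup mostly involves switching Site Search on in your profile settings and telling Google Analytics which query parameter your search results pages use. As a hypothetical example (the domain and parameter name are placeholders, so check your own results URLs), if searching your site for "aspirin" produces a results page like this:

http://www.example.com/search-results?q=aspirin

then the query parameter you'd enter in the Site Search settings is simply "q".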

Goal Tracking (aka conversion tracking)
This should be an obvious one but often gets overlooked. Goal (or conversion) tracking is the act of measuring how effectively your site drives visitors to complete a particular action. As an example, this could be determining how often visitors arrive at a 'thank you' page. 'Thank you' pages are regularly displayed after a visitor successfully completes a form; it could be a contact form, an information request form, a registration form, etc. When the user lands on the 'thank you' page it means they have successfully completed a task. By setting up goal tracking, you can see an immediate snapshot of how well your campaigns are performing. You can also analyse the goal statistics further to determine how your campaigns can be improved and, if appropriate, assign a monetary value to each conversion. Learn how here:

http://www.google.com/support/analytics/bin/answer.py?answer=55515
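
As a rough illustration of the 'thank you' page scenario described above, a destination-style goal might be configured along these lines (the URL and value are placeholders, and the exact field names vary between Google Analytics versions):

Goal Type: URL Destination
Match Type: Head Match
Goal URL: /contact/thank-you
Goal Value (optional): 25

For forms that submit without redirecting to a separate page, a common workaround is to fire a virtual pageview when the form is successfully submitted, for example _gaq.push(['_trackPageview', '/virtual/contact-thank-you']);, and then use that virtual URL as the goal instead.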

Campaign Tracking
Tracking campaigns is the most effective way of ensuring your marketing budget is being well spent. You can use specific URLs for individual campaigns and monitor traffic to those URLs to measure effectiveness, or you can use tracking codes which you can then monitor in your analytics. Some systems such as AdWords and MailChimp insert code that allows you to automatically monitor their campaign performance in Google Analytics; for other campaigns you'll need to do it manually. To learn how to effectively track your campaigns read more here:

http://www.google.com/support/googleanalytics/bin/answer.py?hl=en&answer=55540
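
For manual tagging, you append Google's campaign parameters to the destination URL of every link you want to track. A hypothetical email newsletter link might look something like this (the domain and values are placeholders; the URL Builder covered in the guide above will generate these for you):

http://www.example.com/flu-vaccination?utm_source=newsletter&utm_medium=email&utm_campaign=winter-flu-2013

utm_source, utm_medium and utm_campaign are the core parameters; utm_term and utm_content are optional extras for recording the keyword and the ad or link variation.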

Filtering
Your analytics data can be skewed or less valuable if you're not filtering it. Data can be filtered in a number of ways. The primary way is using the built-in filters that Google provides. You should use these to exclude traffic that you don't want included in your reports, for example traffic from your own offices. Most businesses have a fixed/static internet address that you can add to your filter list so company employees don't add unnecessary data to your reports. Another useful filter type is available under the advanced link next to the search box in most Google Analytics reports. From here you can use key phrases to filter the report results. By doing so, you narrow results to more pertinent data. For example, under the Search Queries report (where you can view which terms people used in the search engines to find your site) you can filter out brand phrases to better understand which generic terms people are finding you for. Generic terms are often more valuable than brand terms, but brand terms can be so popular they obscure the generic results. Filtering fixes that.
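
Two concrete sketches of what this can look like, with placeholder values you'd obviously swap for your own (the option wording varies a little between Google Analytics versions):

Profile filter to exclude office traffic:
  Filter Type: Predefined filter > Exclude > traffic from the IP addresses > that are equal to
  IP address: 203.0.113.10   (your office's static IP address)

Report filter to strip out brand searches:
  Advanced filter on the Search Queries report: Exclude > Keyword > Containing > acme
  (where "acme" stands in for your brand name; add common misspellings too)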

Google are constantly improving their Analytics package with new features such as real-time tracking (see who’s live on your site and what they’re up to right now), social engagement (see where and how your site is performing on the various social sites) and flow visualisation (useful visual representations of visitor paths through your site). To keep up with what’s new and how it could be useful to you follow the Google Analytics team here:

http://www.youtube.com/user/googleanalytics

http://analytics.blogspot.com/

 

[1] The Official WAA Definition of Web Analytics

[2] Analysis of the top 1m websites from http://trends.builtwith.com/analytics

Kung Fu Panda – "This Time It's Personal" – Google Panda Survival Guide

So it's now almost six months since Google rolled out its Panda update, and boy has it caused some furore. This Panda has kicked some butt alright: the thread over on Webmaster World has grown to over 250 messages, and plenty of website owners are not happy with this particular algo update. Anyway, I want to try and cut through all the hype and angst surrounding Panda, take a look at where Google is at with this significant update, and consider how we as search engine marketers can avoid getting beaten up by Panda.

Question one – Is Panda a rolling update?

Barry Schwartz over at Search Engine Roundtable did seem to think that Panda is a rolling update and that Google will continue to tweak its algorithm daily. However, a Google spokesperson commented: "We're continuing to iterate on our Panda algorithm as part of our commitment to returning high-quality sites to Google users. This most recent update is one of the roughly 500 changes we make to our ranking algorithms each year."
The team over at Search Engine Land have kept a timeline of the various Panda updates that have taken place and have seen a clear update pattern evolve. Since February, Google has updated Panda every 4 to 7 weeks, which seems to suggest that there are more changes to come.

Question two – Why have some sites that have been using so-called ethical SEO techniques fallen foul of the Panda update?

So let's try and understand why some sites have fallen into the tumbleweed search results pages. The Panda update has impacted around 12% of search queries, which means a lot of websites have seen their search traffic affected.
The common theme I keep picking up on as to why some sites have dropped down the SERPs is poor-quality content, i.e. content that adds little value to the site and that visitors aren't engaging with. Panda is a filter that Google has developed to flag up what it believes to be low-quality content on web pages. Basically, if you have too many low-quality pages with little original content then Google penalises those pages. It doesn't mean that your entire site is out of Google, but it does mean that pages within your site carry a penalty designed to help ensure only the better ones make it into Google's top results.

So the types of pages that might get affected are those you might have introduced into your site specifically to target certain keywords and gain a higher ranking for them. Doorway pages, gateway pages, SEO articles: these are pages that have been specifically created to appeal to search engine spiders. The content is keyword-rich and the HTML has been formatted so that the page ranks well for the key phrase being targeted, with little thought given to usability and how the end user will react to that page. These pages are often deliberately buried deep within the site's hierarchy so that users who are already on your site can't easily come into contact with them, for obvious reasons.

A response from a Google employee on their Webmaster Forum commented:

“Bear in mind that people searching on Google typically don’t want to see shallow or poorly written content, content that’s copied from other websites, or information that are just not that useful. In addition, it’s important for webmasters to know that low quality content on part of a site can impact a site’s ranking as a whole. For this reason, if you believe you’ve been impacted by this change you should evaluate all the content on your site and do your best to improve the overall quality of the pages on your domain. Removing low quality pages or moving them to a different domain could help your rankings for the higher quality content.”

If you think you’ve been negatively affected by Panda and wrongly so, then you can try and use the Google webmaster forum to see if someone from Google will manually review your site.

Question three – How does Panda impact the future of SEO?

SEO is all about adapting. I have been developing SEO strategies since 1996 and the techniques that I used back then would get my clients into serious trouble today. The point I am making is that as Google evolves so does the way we do SEO. 

Here are my 5 tips on how to avoid the “Panda Punch”

1 – Make the content on your site engaging
Yes, getting an SEO copywriter to produce a series of 300-500 word articles for your site used to be an effective SEO technique, and it still can be. But the content on those pages needs to add value and be of a high enough quality to capture the interest of your site visitors. If it doesn't, it won't serve a purpose any more because Google will ignore it. Spend more time creating fewer, higher-quality pieces of content as opposed to quickly bashing out hundreds of 'doorway pages' that don't contain any valuable information to help your site visitors achieve what they came to your site to do.

2 – Fix usability issues
Do ugly and poorly designed pages inspire confidence? Of course they don't. Get back to basics and make sure that the web pages you put up on your site are all well styled and that the content is easy to read, so that it draws your visitors in and they stay to read it.

3 – Reduce the bounce rate
OK, so this is closely linked to usability and writing engaging content. The last thing Google wants to see is a visitor who selects a page from the top of its search results and returns almost immediately without even viewing another page on that site. To Google, that is a clear signal that it has failed to do its job properly: it has not served up a relevant, high-quality page for its user's search query. Use your analytics reports to identify all the pages on your site that have high bounce rates, especially the ones that are ranking well or used to be. Then think about how you can improve the quality of those pages so the bounce rate comes down.

4 – Get people linking to and tagging those pages
Getting other sites to link to internal pages (other than your home page) within your site has always been an effective way of improving the credibility and authority of those pages. However, it's easier said than done, unless you have something on the page that is really going to add value and give the owner of the linking website a good reason to point their own visitors in the direction of your web page. It can be a well-written article, white paper, video or infographic; it doesn't matter, as long as what you are promoting is unique and engaging. On the topic of linking, it's also important that you have plenty of internal links pointing to those pages. That's another clear indicator of quality: if you're prepared to link to those pages and send your own visitors to them, then Google will conclude that they must serve a valuable purpose.
There's also no harm in making use of social bookmarking icons on your pages to make it easier for your visitors to flag the page up to friends and colleagues. The ones I'd focus on are Facebook Like and Google +1. I'll do a separate post on these later on.

5 – Don’t view SEO as a one-off project and don’t isolate SEO
SEO is not a project in the sense that it has a definitive start and end date. It's an ongoing process that should be a key part of your marketing mix. And as it's part of the marketing mix, don't isolate it: SEO can and needs to be integrated with all your other marketing communications activity. Don't just think about optimising web pages. Google now serves up a wide range of digital media in its search results, so you need to be asking how to optimise that video you created so that it appears in Google. A wide range of digital assets can be optimised for search engines, including video, images, clinical papers and product databases, not just your web pages.
If your business benefits from good search engine visibility then invest in that area full-time. SEO is an activity that can deliver good long-term results as long as you get the strategy right and put in the required effort. It's not an activity that will deliver short-term results, unlike pay-per-click advertising. I like to compare SEO and PPC to buying a house versus renting one. Yes, PPC, like renting, provides an immediate short-term solution. However, buying a home, like investing in SEO, will deliver a greater return on your investment in the medium to long term.