Clapham Local SEO

Basic SEO Techniques for Clapham:

A few basic SEO techniques:

  • View Google’s cache to see your page the way a search engine sees it. If you can’t see an element of the page in the cache, the search engines can’t see it either! This means that content isn’t gaining you any search traffic.
  • Be sure your page is reachable. Crawlers don’t perform searches; they travel by links. This means your page has to be reachable by clicking a link on another page, or crawlers won’t see it and it won’t be searchable.
  • Be sure you don’t have robots meta tags preventing your page from being crawled. Many Clapham webmasters use robots meta tags to keep rogue bots off their pages and don’t realize that these same tags will also prevent search engine bots from crawling them!
  • Don’t let your links get lost in a sea of links. Pages with hundreds or thousands of links may not get crawled thoroughly in order to prevent spam and skewed rankings.
  • Use specific keywords. General keywords will have lots of competition. By using more specific keywords, a webmaster can reduce competition and increase rankings.
  • Don’t abuse keywords. Use your keywords in natural speech. Optimize your page for one or two specific keywords that searchers might use when looking for information available on your page. Be sure your keywords are relevant to your content.
  • Don’t use keywords in link anchor text pointing to other pages on your site. (This is known as committing keyword cannibalization.)
  • Be sure your keywords are plain-text HTML. Add ALT text for images, transcripts for videos and audio clips, and captions for Java or Flash plugins. Although crawlers are improving, many are unable to process anything besides plain text.
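To illustrate the last point, an image tag with ALT text might look like this (the filename and wording are placeholders, not from any real site):

```html
<!-- The ALT attribute gives crawlers a plain-text description of the image -->
<img src="clapham-high-street.jpg"
     alt="Shops along Clapham High Street at dusk">
```

Without the `alt` attribute, a crawler that can't process images gets no information from this element at all.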

Be Aware of User Experience

Increasing the usability, accuracy, and visual design of the website will also increase your search metrics. Search engines use search metrics and backlinks to determine the popularity and user-friendliness of your page. These factors play an important role in SEO. Always seek to create content that is pleasing for your reader.

Search engines use engagement metrics to gauge user satisfaction. Time spent on a search result signals that the user found it helpful; a user who immediately hits the back button to look for another result signals the opposite.

The Panda update allows Google to use machine learning to rank websites on quality and user-friendliness. In 2011, human evaluators rated thousands of websites for quality, and Google then trained machine-learning models to mimic those evaluators. This update changed more than 20% of search results.

Selecting Keywords:

Things to consider when selecting keywords:

  1. The Keyword’s relevance to your page. Will the people using that keyword in searches be satisfied with the content on your page?
  2. The Keyword’s specificity. Would a narrower keyword or keyphrase attract an audience that is more interested in your content than the audience of the broader term? On the other hand, is your keyword so specific it won’t be searched?
  3. The Competition on that keyword. Can you compete with the current top ranked websites for your keyword?

Where to Put Keywords:


  1. In the Title: as close to the beginning as you can put it
  2. At the Top of the Page
  3. Several times throughout the body of the page
  4. At least once in the ALT text
  5. In the URL
  6. In the Summary or Meta description tag
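The six placements above can be sketched in a single page. This is a hypothetical example optimized for the keyword “Clapham plumber” (the business name, URL, and wording are all placeholders):

```html
<!-- Hypothetical page targeting the keyword "Clapham plumber" -->
<!-- 5. In the URL: https://example.com/clapham-plumber -->
<head>
  <title>Clapham Plumber | Fast Local Repairs</title>              <!-- 1. in the title -->
  <meta name="description"
        content="A Clapham plumber for repairs and installations."> <!-- 6. meta description -->
</head>
<body>
  <h1>A Clapham Plumber You Can Trust</h1>                         <!-- 2. top of the page -->
  <p>Looking for a Clapham plumber? ...</p>                        <!-- 3. in the body -->
  <img src="van.jpg" alt="Clapham plumber's van">                  <!-- 4. in the ALT text -->
</body>
```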

Optimizing Title Tags:

  1. Titles should be between 45 and 55 characters long. Titles that exceed 55 characters in length may not show properly in Google searches.
  2. Keywords should be placed near the beginning of the title
  3. Consider putting your brand name at the end of the title
  4. Don’t sacrifice readability and emotional impact for keyword optimization
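Putting those four points together, a title tag might look like this (the keyword and brand name are invented for illustration):

```html
<!-- Keyword first, brand name last; 52 characters, inside the 45-55 guideline -->
<title>Clapham Plumber | 24-Hour Emergency Repairs | AcmeCo</title>
```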

Visit https://moz.com/learn/seo/title-tag to see how the title and summary of your page would show up in a Google search.

Constructing URLS:

  1. Keep them short
  2. Be sure your readers can tell what they will find on the page just by reading the URL
  3. Use plain text links
  4. Use hyphens to separate words
  5. Use keywords
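A URL following all five rules, placed in a plain-text link, might look like this (the domain and keyword are placeholders):

```html
<!-- Short, hyphen-separated, keyword-bearing URL in a plain-text anchor -->
<a href="https://example.com/clapham-plumber-prices">
  Clapham plumber prices
</a>
```

A reader seeing only `/clapham-plumber-prices` already knows what the page is about, which satisfies rule 2.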

How Links to your Page Affect your SEO

  1. Are a wide variety of websites created by a wide variety of webmasters on a wide variety of topics linking to your site?
  2. Are many sites on similar topics linking to your site?
  3. Are the ranking sites for your keywords linking to your site?
  4. Are sites linking to your page using your keywords in anchor text to link to your site? (Anchor text is the blue text that shows on a webpage instead of the URL to the link)
  5. Are you getting links from trusted sources?
  6. Have you refrained from linking to spammy or poor-quality websites, and from acquiring links from them?
  7. Have you been accruing new links?
  8. Is your page being shared on social media?

If you can answer yes to the eight questions above, links to your content are doing their job and helping increase your rankings. If you can’t answer yes to each question, devise strategies to remedy your answer.

Creating good, quality content that related pages will naturally link to is one of the best ways to get links. Another great way is through manual outreach. Don’t underestimate the traffic that a well-placed link can generate by itself.

Target specific websites that cater to similar audiences, preferably websites with few outbound links, high credibility, and high rankings for your keywords, and invite them to link to your page. Be sure to explain how linking to your page benefits them, and isn’t just an act of charity toward you. You may also use a backlink-analysis tool to see your competitors’ backlinks. Finding sites that have linked to the top-ranked sites for your keyword gives you a list of sites whose links would be of value to you.

Avoid self-created links on lower-quality websites. While they may help rankings a little bit, they may also hurt your rankings. If you pursue this option, proceed with caution.


Search Engine Webmaster SEO advice

SEO advice from the major search engine webmasters includes:

  • Be sure all pages can be reached by static, plain-text links. (Standard HTML anchor links, as opposed to links buried in images, Java, or Flash.) These links are easier for crawlers to find.
  • Don’t “cloak” your work, or try to present different content to the search engine than you present to the user. Make sure your page is optimized to be found by a search engine when a user searches a relevant query. Don’t try to cheat the crawlers and their indexes.
  • Use accurate, descriptive, clear language with appropriate keywords in your titles, subtitles, and ALT text. (ALT text is text that appears when an image cannot be displayed.) On HubPages, photo captions are ALT text and are therefore an extremely important part of search engine optimization.
  • Create content filled with relevant keywords.

Check for Errors

To check for crawl errors see Google’s Webmaster Tools.

Search Engine Optimization: Part 2 Specific SEO Suggestions

Website Search Optimization

Is your business trying to reach potential customers who search for information online? Of course it is.

But be sure not to neglect those who use mobile phones -- used by more than 90 percent of U.S. households today, according to CTIA, the wireless industry association. That compares to home Internet usage estimated at about 74 percent, according to Nielsen Online.

The CTIA also says data usage on mobile phones surpassed voice usage in the U.S. for the first time last year. Along with using mobile versions of Web browsers, on-the-go Internet users are increasingly turning to social media and specialized apps to help them find what they're looking for.

In the era of mobile Internet commerce, businesses need to re-evaluate their search engine optimization strategies. Here are some tips on taking advantage of this shifting trend from computer to smartphone from experts in the field.

Drop .mobi, but limit Flash

"The recent and continued advancement in smartphone technology has brought mobile browsing and search engine optimization (SEO) much closer to standard web SEO practices," says Dustin Ruge with the SEO Consultant Firm, based in New York City. "Previously, companies would pursue the creation of mobile sites (.mobi), with much lighter content and faster load times to support first generation mobile browsers, but today, mobile browsers are becoming much more 'normalized' in nature and tend to perform similar in results to standard Web browsers."

That said, Ruge still suggests utilizing XHTML formats, limiting excessive load times (e.g., from Adobe Flash), and making sure critical information -- such as phone numbers and addresses -- is prominently displayed and readable in mobile applications.

Test is best, click to call

Amber MacArthur, a new media strategist and author of Power Friending: Demystifying Social Media to Grow Your Business (2010 Portfolio), agrees with Ruge's last point.

"To ensure that consumers get what they want when searching on a mobile phone, companies need to ensure they have mobile-friendly websites," says MacArthur. "Businesses don't simply have to check their sites on one device, they should test across multiple smartphone platforms, such as the BlackBerry, iPhone, and Android."

Smartphones with these three dominant operating systems allow users to call phone numbers listed in their Web browsers with a single tap or click, which then launches the phone function of the device. You'll know if a phone number or email address can be used as such if they're underlined in the browser.

App attack

The most significant change to how consumers are using smartphones to find companies is the widespread popularity of mobile apps. "To put this into context, Steve Jobs recently said that there are now more than 200,000 Apple mobile apps," says MacArthur. "In other words, individuals are no longer going through a browser to get information, such as restaurant reviews and product recommendations -- this means that traditional SEO placement tactics are less effective."

Ruge acknowledges mobile apps are "exploding in use," but he feels it might be too early to develop any concrete conclusions about their effectiveness in user search. "Based on learned user behavior, I suspect that standard browsing practices through the traditional search engine interfaces will not be threatened anytime soon," says Ruge.

Social networks, too

Customers are also relying on their social networks to find what they're looking for, reminds MacArthur, who says she uses Google less and Twitter and Facebook more. "The tipping point for me was a couple of years ago when I went online to Twitter to ask my network where I should stay in the D.C. area. Within minutes, I had dozens of recommendations and links, which was a lot easier than sifting through pages and pages of random search results on Google."

According to MacArthur, about three quarters of cellphone users are using mobile phones to frequent social networking sites. "With such a high penetration of users on Facebook, Twitter and in other online hangouts, it's key that small and mid-sized businesses put time and effort into social networking strategies."

MacArthur suggests businesses consider a free Web service called HootSuite. "Not only will this tool allow companies to post to multiple accounts at the same time within an easy-to-use dashboard, it makes networking with the people on these sites easy and it also makes it a cinch to monitor your brand's reputation (and respond when necessary)," adds MacArthur.

As for the future

Both Ruge and MacArthur were asked about location-based services.

"This is a very difficult issue to address at this point since there are ongoing privacy related issues dealing with mobile browsing and GPS," says Ruge. "Recent privacy issues dealing with Facebook should be a shot across the bow to any unauthorized future use of personal online browsing coordinated with GPS data; the technology is certainly there for some amazing capabilities but Americans are very particular when it comes to privacy issues," he adds.

MacArthur is more optimistic about its immediate future. "Location-based services are exploding as a key marketing platform for many businesses. For starters, setting your company up on a site like Foursquare won't cost you a cent, giving businesses an opportunity to bring their online relationships with customers offline."

MacArthur also says "augmented reality" tools that add informational layers on top of what you see through your smartphone's camera, "is about to change the way most of us get information." "For example, imagine walking down the street, pointing your phone's lens at a restaurant, and then seeing live links to menu item reviews online."

Although augmented reality is a hot trend, "it's somewhat more complicated to develop, compared to location-based apps and GPS tools, so companies are slow to jump on board," says MacArthur.


The use of metadata by search engines, including meta keywords, has changed extensively throughout the years.

While many of the rules regarding metadata remain the same, it is now an area of lesser importance when it comes to SEO. That said, meta tag optimization still plays a role in search engine optimization, so it is worth employing many of these so-called "deprecated" techniques where they still help.

Following these rules pertaining to metadata can help ensure a site's high ranking in search results. While Google does not use meta keywords for site rankings, there still are search engines that do. A variety of websites and syndication services also rely on metadata. Furthermore, Google even pulls your site's description from your metadata for use in the SERPs.

While meta tag optimization is still useful, it's important to note that there is no reason to stress out over metadata. This article is meant to serve as an informational piece on how metadata is used today, as well as a note on its much greater historical importance. Feel free to comment, however, with your thoughts (particularly on the modern usage of meta tags and metadata in SEO).

What is Metadata? What are Meta Tags?

Metadata provides information about a site. This information gives search engines clues regarding what a site is about. Since metadata is hidden away in a site's markup, visitors can't see it, but search engines can. There are several types of metadata, but we're going to talk about the three most important parts that make up meta tag optimization: meta keywords, meta descriptions, and the robot tag.

Meta descriptions are actually one of the few things that visitors will see, but they won't see it on your site. Here is an example:
Bob has a site about sports cars and in his meta description, he has written, "The ultimate guide to European and American sports cars."
When someone searches on Google for the term "sports cars," Bob's site may show up in the results. If it does, the listing will show his meta description. In short, meta descriptions tell people what a site is about before they even visit it.
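Bob's description from the example above would be written into his page's markup as a meta tag like this:

```html
<!-- Shown as the snippet under Bob's listing in search results -->
<meta name="description"
      content="The ultimate guide to European and American sports cars.">
```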

What Do Meta Keywords Do?

Meta keywords work very similarly to meta descriptions, but instead of telling Internet users what a site is about, they tell the search engine. Here is an example:
Lisa's site is about Georgia peaches. In her meta keywords, she has written, "Georgia peaches, peaches from Georgia, peach orchards" By writing these keywords, Lisa is telling search engines that her site is about Georgia peaches, peaches from Georgia, and peach orchards. Annette searches in a search engine for the term "peaches from Georgia." Since Lisa's site has this phrase in its meta keywords, the search engine may show Lisa's website to Annette in the search listings.

Why You Need Metadata

Meta information is very important to a site's well-being when it comes to SEO. The part of SEO that deals with metadata is known as meta tag optimization. Many people say that you don't need meta keywords because "Google doesn't use metadata in their algorithm, and since Google is the most popular search engine, there's no need to use them." That conclusion is mistaken. There is no argument that Google is the most popular search engine in the world, and Google reportedly does not use meta keywords as a way to weigh sites. Still, it's a bad idea to ignore meta tag optimization.

Here's why:

  • Other search engines use meta keywords in their algorithms, and although they won't make up the largest portion of a site's traffic, it is traffic the site would not be getting if it didn't have meta keywords.
  • Google doesn't care about meta keywords at all, so no sites will be penalized by Google for having meta keywords.
  • Meta tags are easy to add. It's as simple as adding a tiny bit of HTML right after your <head> tag. Many content management systems even do all the work for you. Those using something like WordPress are blessed with the number of plug-ins available that help out with metadata.
  • Google uses meta descriptions to give searchers more information about your site. Without a meta description, Google will hand-pick something from the site it feels is relevant to what the searcher is looking for. While this can be helpful, a clean and well-written meta description can convince searchers that YOUR site is the one they are looking for. While it has no impact on a site's ranking, it can help click-through rates.

How to Add Metadata

Each site has a main page or an index page. In the file for this page, a web developer or site admin should look for the opening head tag: <head>. Right after this tag, the site admin should enter their metadata. The following example shows what metadata might look like:
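Here is a sketch of a page head, reusing Lisa's peach-stand example from earlier (the title and description wording are placeholders):

```html
<head>
  <title>Lisa's Georgia Peaches</title>
  <meta name="description"
        content="Fresh Georgia peaches shipped straight from the orchard.">
  <meta name="keywords"
        content="Georgia peaches, peaches from Georgia, peach orchards">
</head>
```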

Metadata Mistakes

There are a few easily fixable problems that people run into when they are optimizing their metadata. For example, sometimes metadata is set up to disallow search engines from indexing a site. There are a few reasons webmasters set their sites up to block search engines, but those creating websites for SEO purposes should make sure search engines are NOT blocked.

If a website isn't being indexed in the search engines, generally the first thing a webmaster will do is make sure their robots tag is not blocking search engines. By default, if a site does NOT have a robot tag, search engines will index it. Many webmasters write a robot tag that essentially says "allow search engines to index this", but this is unnecessary as search engines will only avoid indexing a site if there is a robot tag that disallows them from doing so. (Or if it's been blacklisted, but that's a whole different issue.)

Another problem that people run into is giving search engines too much information. A common guideline is a maximum of about 10 keywords in a site's metadata. Here's an example:

Lisa's site has three meta keywords which are "Georgia peaches, peaches from Georgia, peach orchards." Gina, who is Lisa's competition has 12 keywords which are "peaches from Georgia, Georgia peaches, peach orchards, peach orchard, peaches, Georgia, Georgia peaches, Gina's peach stand, juicy peaches, peaches from the south, ripe peaches, fresh peaches"
Because Gina has more than 10 keywords and Lisa has only 3, Lisa stands a better chance of doing well in the search engine. The same rule applies to meta descriptions, except web developers are allowed a max of 150 characters for meta descriptions. Remember that it's 150 characters and not 150 words. While this isn't one of the biggest SEO mistakes a web developer can make, it still can negatively affect search engine rankings.
