Basic SEO Techniques (Billericay):
A few basic SEO techniques:
- View Google’s cache to see your page the way a search engine sees it. If you can’t see an element of the page in the cache, the search engines can’t see it either! This means that content isn’t gaining you any search traffic.
- Be sure your page is reachable. Crawlers don’t perform searches, they travel by links. This means your page has to be viewable by clicking on a link on another page or crawlers won’t see it and it won’t be searchable.
- Be sure you don’t have robots meta tags or similar directives preventing your page from being crawled. Many Billericay webmasters use these directives to keep rogue bots off their pages and don’t realize that they will also prevent search bots from crawling their pages!
- Don’t let your links get lost in a sea of links. Pages with hundreds or thousands of links may not get crawled thoroughly in order to prevent spam and skewed rankings.
- Use specific keywords. General keywords will have lots of competition. By using more specific keywords, a webmaster can reduce competition and increase rankings.
- Don’t abuse keywords. Use your keywords in natural speech. Optimize your page for one or two specific keywords that searchers might use when looking for information available on your page. Be sure your keywords are relevant to your content.
- Don’t use keywords in link anchor text pointing to other pages on your site. (This is known as committing keyword cannibalization.)
- Be sure your keywords are plain-text HTML. Add ALT text for images, transcripts for videos and audio clips, and captions for Java or Flash plugins and images. Although crawlers are improving, many are unable to process anything besides plain text.
Be Aware of User Experience
Improving the usability, accuracy, and visual design of your website will also improve your search metrics. Search engines use search metrics and backlinks to determine the popularity and user-friendliness of your page. These factors play an important role in SEO. Always seek to create content that is pleasing for your reader.
Search engines use engagement metrics to determine user satisfaction. A user who spends time on a search result likely found it helpful; a user who immediately hits the back button to look for another result did not.
The Panda update allows Google to use machine learning to rank websites on quality and user-friendliness. In 2011, human evaluators rated thousands of websites on quality, and Google then implemented machine learning that mimicked those evaluators. This update changed more than 20% of search results.
Things to consider when selecting keywords:
- The Keyword’s relevance to your page. Will the people using that keyword in searches be satisfied with the content on your page?
- The Keyword’s specificity. Would a narrower keyword or keyphrase attract an audience that is more interested in your content than the audience of the broader term? On the other hand, is your keyword so specific it won’t be searched?
- The Competition on that keyword. Can you compete with the current top ranked websites for your keyword?
Where to Put Keywords:
- In the Title: as close to the beginning as you can put it
- At the Top of the Page
- Several times throughout the body of the page
- At least once in the ALT text
- In the URL
- In the Summary or Meta description tag
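As a sketch, the placements above might look like this for a hypothetical page targeting the keyword "Billericay plumber" (the URL, titles, and wording here are invented for illustration):

```html
<!-- Hypothetical URL: example.com/billericay-plumber (keyword in the URL) -->
<head>
  <!-- Keyword near the beginning of the title -->
  <title>Billericay Plumber | Fast Local Repairs</title>
  <!-- Keyword in the meta description -->
  <meta name="description" content="Need a Billericay plumber? Fast, friendly local repairs.">
</head>
<body>
  <!-- Keyword at the top of the page -->
  <h1>Your Local Billericay Plumber</h1>
  <p>Use the keyword naturally a few more times in the body text.</p>
  <!-- Keyword at least once in the ALT text -->
  <img src="van.jpg" alt="Billericay plumber's van outside a local home">
</body>
```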
Optimizing Title Tags:
- Titles should be between 45 and 55 characters long. Titles that exceed 55 characters may not display properly in Google searches.
- Keywords should be placed near the beginning of the title
- Consider putting your brand name at the end of the title
- Don’t sacrifice readability and emotional impact for keyword optimization
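For instance, a title that follows all four rules might look like this (the brand name is invented; the character count includes spaces):

```html
<!-- Keyword first, brand name last, 54 characters: within the 45-55 range -->
<title>Basic SEO Techniques for Beginners | Billericay Web Co</title>
```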
Visit https://moz.com/learn/seo/title-tag to see how the title and summary of your page would show up in a google search.
Optimizing URLs:
- Keep them short
- Be sure your reader has an idea of what he will find on the page when he reads the URL
- Use plain text links
- Use hyphens to separate words
- Use keywords
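As a quick illustration of the guidelines above (the domain and paths are made up):

```html
<!-- Short, hyphen-separated, keyword-bearing URL in a plain-text link -->
<a href="https://example.com/georgia-peach-orchards">Georgia Peach Orchards</a>

<!-- Harder for readers and crawlers: no words, no hyphens, no keywords -->
<a href="https://example.com/p?id=8127&amp;cat=4">Click here</a>
```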
How Links to Your Page Affect Your SEO
- Are a wide variety of websites created by a wide variety of webmasters on a wide variety of topics linking to your site?
- Are many sites on similar topics linking to your site?
- Are the ranking sites for your keywords linking to your site?
- Are sites linking to your page using your keywords in anchor text to link to your site? (Anchor text is the blue text that shows on a webpage instead of the URL to the link)
- Are you getting links from trusted sources?
- Have you refrained from linking to spammy or poor-quality websites, and from accepting links from such websites?
- Have you been accruing new links?
- Is your page being shared on social media?
If you can answer yes to the eight questions above, links to your content are doing their job and helping increase your rankings. If you can't answer yes to each question, devise strategies to change that answer.
Creating good, quality content that related pages will naturally link to is one of the best ways to get links. Another great way is through manual outreach. Don't underestimate the traffic that a well-placed link can generate by itself. Target specific websites that cater to similar audiences, preferably websites with few links, high credibility, and high rankings for your keywords, and invite them to link to your page. Be sure to explain how linking to your page benefits them, and is not just an act of charity for you. You can also use a backlink-analysis tool to see your competitors' backlinks: finding the sites that have linked to the top-ranked sites for your keyword gives you a list of sites whose links would be of value to you. Avoid self-created links on lower-quality websites. While they may help rankings a little, they may also hurt your rankings. If you pursue this option, proceed with caution.
Search Engine Webmaster SEO advice
SEO advice from the major search engine webmasters includes:
- Be sure all pages can be reached by static, plain-text links (links of the form <a href="https://example.com/page">anchor text</a>, as opposed to links buried in images, Java, or Flash). These links are easier for crawlers to find.
- Don’t “cloak” your work, or try to present different content to the search engine than you present to the user. Make sure your page is optimized to be found by a search engine when a user searches a relevant query. Don’t try to cheat the crawlers and their index.
- Use accurate, descriptive, clear language with appropriate keywords in your titles, subtitles, and ALT text. (ALT text is text that appears when an image cannot be displayed.) On HubPages, photo captions are ALT text and are therefore an extremely important part of search engine optimization.
- Create content filled with relevant keywords.
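The ALT-text advice above can be sketched like this (the file name and wording are illustrative):

```html
<!-- Descriptive ALT text gives crawlers a plain-text version of the image -->
<img src="peach-orchard.jpg" alt="Rows of peach trees in a Georgia orchard at sunrise">
```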
Check for Errors
To check for crawl errors see Google’s Webmaster Tools.
Will the New Top Level Domains Matter to Marketers?
In 2014, the new Top Level Domains (TLDs) were introduced to much fanfare from the press and tech bloggers. New web address endings were touted as a land rush on the internet and a game changer for marketing strategies. Despite such pronouncements, new TLDs were largely ignored in 2014, leaving some to expect an explosion in activity in 2015. However, before business owners run off to build new sites with fancy new names, it's important to separate fact from fiction regarding TLDs and to ask the question, "Will the new TLDs matter to marketers and consumers in 2015?"
As a brief primer, Top Level Domains are the endings of website addresses such as .com, .edu, .gov, etc. In the past, these were all handled by ICANN, but in 2014 the door was opened for entrepreneurs to create their own TLDs that they control themselves. So now there is essentially an endless number of TLDs. Business owners can pay to have their site end with things like .xyz, .toys, .soy, .wed, and more. Nearly 4 million websites around the world use one of the newly created TLDs.
There were many different Top Level Domains for website owners to choose from before the introduction of the new TLDs, and research has shown that they work in a general sense. People know that the various country TLDs can be used to find information from a certain region of the world. Consumers generally know that .fr is for pages in France and that .ca is for Canada. However, it's not perfect: a study from Moz suggests that nearly 25 percent of Americans can be tricked into thinking that .ca is for California; they knew the TLD was for a region, but guessed the wrong region.
Similarly, people know that a .tv site will be about a television show, that .edu is for schools, and that .org pages tend to be for non-profits. The .edu and .org endings are the two TLDs that carry the most meaning for consumers. Searchers know that .edu resources will be more reliable since they come from schools and not from businesses, and people associate .org with organizations, groups, or non-profits with goals other than profit. Many people don't realize that .com itself is short for "commercial," which was chosen in the early days of the internet to identify the sites that weren't the traditional school- or government-based web pages that first populated the nascent world wide web.
The challenge for these new TLDs is that though people can use them to quickly understand the purpose of a site, consumers don't inherently trust sites with unusual TLDs more than ones with more traditional endings. In fact, having a vanity TLD immediately indicates that this is a new site, which puts the site at a disadvantage when compared to sites that have been serving customers from a .com web address for years. This is why older alternative TLDs like .biz or .info never really took off.
When NPR followed up with some of the creators of the new TLDs at the end of 2014, they found that adoption had been incredibly slow. The TLD that is doing the best so far is .xyz, and even that is based on a large buy from a third party that gave a free year of .xyz registration to its clients, or on people who bought .xyz domains to squat on them. A better indicator will come at the end of 2015, when we see how many people who squatted on or received their new TLD domain for free decide it's worth it to continue paying for it.
The international study from Moz asked users if, based solely on the domain name, they were more likely to trust an insurance quote from a website ending in .insurance. 62 percent of Americans, 53 percent of Australians, and 67 percent of marketers said they were unlikely to trust the quote based on the domain alone.
And despite what people who are trying to sell TLDs may say, when it comes to TLDs and SEO, TLDs offer no intrinsic value to improve SEO. The algorithms for search engines don't include these new TLDs as a ranking factor. These domains will show up in a generic search for a keyword and people can search by TLD extension if they want to. If TLDs become more popular, they may become ranking factors in the future (though Google says they doubt it), but for now, they are treated no differently.
Google's John Mueller recently reposted comments the company made earlier in 2014 to reiterate its position on TLDs and search.
"It feels like it's time to reshare this again. There still is no inherent ranking advantage to using the new TLDs," Mueller wrote on Google+ before sharing a post from Matt Cutts on the subject. "They can perform well in search, just like any other TLD can perform well in search. They give you an opportunity to pick a name that better matches your web-presence. If you see posts claiming that early data suggests they're doing well, keep in mind that this is not due to any artificial advantage in search: you can make a fantastic website that performs well in search on any TLD."
The sheer number of TLDs available also undermines one of the reasons some thought they would be so popular. When there are so many TLDs to choose from, it's not as effective for squatters to try to buy up domains with the intent of selling them later. It's not the same as in the early days of the internet. Back then, if someone had the .com you wanted, there was nothing you could do but pay them or pick a different name. Now, marketers can just move to a different TLD. The introduction of these new TLDs has created so much internet real estate that it's impractical for one person to try to lock up domains they don't intend to use.
There are a lot of good reasons why business owners may want to create a site using one of the new TLDs, but it's important to be clear about what the benefits are and what is just hype. It's undeniable that businesses can get domain names using the new TLDs that are unavailable for older extensions. Some states are introducing TLDs for businesses in their state. So BillsGarage.com may be taken, but BillsGarage.NYC may be up for grabs.
However, other than the benefit of giving marketers more options when deciding on domain names, TLDs don't offer any intrinsic benefit to business. As the research from Moz and comments from experts at Google have shown, TLDs have no advantage over .com when it comes to customer trust and SEO visibility.
Given the challenges facing TLD adoption, it's unlikely that TLDs will make a huge marketing impact in 2015 unless there is some sort of game changing development. If you want to use one of the new TLDs to build a site with an easy-to-remember name, you won't be disappointed, but don't expect new domain endings to perform some kind of marketing magic in the coming year.
What Is the Future of Search?
The use of metadata by search engines, including meta keywords, has changed extensively throughout the years.
While many of the rules regarding metadata remain the same, it is now an area of lesser importance when it comes to SEO. That said, meta tag optimization is still a worthwhile part of search engine optimization, so it is worth employing many of these so-called "deprecated" techniques to support high SERP rankings.
Following these rules pertaining to metadata can help ensure a site's high ranking in search results. While Google does not use metadata for site rankings, there still are search engines that do. A variety of websites and syndication services also rely on metadata. Furthermore, Google even pulls your site's description from your metadata for use in the SERPs.
While meta tag optimization is still useful, it's important to note that there is no reason to stress out over metadata. This article is meant to serve as an informational piece on how metadata is used today as well as noting its much greater historical importance. Feel free to comment, however, on your thoughts and feelings (particularly on the modern usage of meta tags and meta data in SEO.)
What is Metadata? What are Meta Tags?
Metadata provides information about a site. This information gives search engines clues regarding what a site is about. Since metadata is hidden away in a site's markup, visitors can't see it, but search engines can. There are several types of metadata, but we're going to talk about the three most important parts that make up meta tag optimization: meta keywords, meta descriptions, and the robot tag.
Meta descriptions are actually one of the few things that visitors will see, but they won't see it on your site. Here is an example:
Bob has a site about sports cars and in his meta description, he has written, "The ultimate guide to European and American sports cars."
When someone searches on Google for the term "sports cars," Bob's site may show up in the results. If it does, the listing will show his meta description. In short, meta descriptions tell people what a site is about before they even visit it.
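In markup, Bob's meta description would sit inside his page's <head> tag; a minimal sketch (the title text is invented for the example):

```html
<head>
  <title>Bob's Sports Cars</title>
  <!-- This is the text a search engine may show under Bob's listing -->
  <meta name="description" content="The ultimate guide to European and American sports cars.">
</head>
```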
What Do Meta Keywords Do?
Meta keywords work very similarly to meta descriptions, but instead of telling Internet users what a site is about, they tell the search engine. Here is an example:
Lisa's site is about Georgia peaches. In her meta keywords, she has written, "Georgia peaches, peaches from Georgia, peach orchards" By writing these keywords, Lisa is telling search engines that her site is about Georgia peaches, peaches from Georgia, and peach orchards. Annette searches in a search engine for the term "peaches from Georgia." Since Lisa's site has this phrase in its meta keywords, the search engine may show Lisa's website to Annette in the search listings.
Why You Need Metadata
Meta information is very important to a site's well-being when it comes to SEO. The part of SEO that deals with metadata is known as meta tag optimization. Many people say that you don't need meta keywords because "Google doesn't use metadata in their algorithm, and since Google is the most popular search engine, there's no need to use them." That reasoning is flawed. While there is no argument that Google is the most popular search engine in the world, and Google reportedly does not use metadata as a way to weigh sites, it's still a bad idea to ignore meta tag optimization.
- Other search engines use meta keywords in their algorithms, and although they won't make up the larger portion of a site's traffic, it is traffic the site wouldn't be getting if it didn't have meta keywords.
- Google doesn't care about meta keywords at all, so no sites will be penalized by Google for having meta keywords.
- Meta tags are easy to add. It's as simple as adding a tiny bit of HTML right after your <head> tag. Many content management systems even do all the work for you. Those using something like WordPress are blessed with the number of plug-ins available that help out with metadata.
- Google uses meta descriptions to give searchers more information about your site. Without a meta description, Google will hand-pick something from the site that it feels is relevant to what the searcher is looking for. While this can be helpful, having a clean and well-written meta description can really convince searchers that YOUR site is the one they are looking for. While it has no impact on a site's ranking, it can help click-through rates.
How to Add Metadata
Each site has a main page or an index page. In the file for this page, a web developer or site admin should look for the open head tag: <head>. Right after this tag, the site admin should enter their metadata. The following example shows what metadata should look like:
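A minimal sketch of that metadata block, reusing Lisa's site from the earlier examples (the title and description wording are invented):

```html
<head>
  <title>Lisa's Georgia Peaches</title>
  <meta name="description" content="Fresh Georgia peaches and peach orchard tours.">
  <meta name="keywords" content="Georgia peaches, peaches from Georgia, peach orchards">
</head>
```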
There are a few easily fixable problems that people run into when they are optimizing their metadata. For example, sometimes metadata is set up to disallow search engines from indexing a site. There are a few reasons webmasters set their sites up to block search engines, but those creating websites for SEO purposes should make sure search engines are NOT blocked.
If a website isn't being indexed in the search engines, generally the first thing a webmaster will do is make sure their robots tag is not blocking search engines. By default, if a site does NOT have a robot tag, search engines will index it. Many webmasters write a robot tag that essentially says "allow search engines to index this", but this is unnecessary as search engines will only avoid indexing a site if there is a robot tag that disallows them from doing so. (Or if it's been blacklisted, but that's a whole different issue.)
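In markup, the two robots-tag situations described above look like this:

```html
<!-- Blocks indexing: remove this if you WANT the page in search results -->
<meta name="robots" content="noindex, nofollow">

<!-- Explicitly allows indexing: harmless but unnecessary, since indexing is the default -->
<meta name="robots" content="index, follow">
```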
Another problem that people run into is giving search engines too much information. Sites are allowed a max of 10 keywords in their metadata. Here's an example:
Lisa's site has three meta keywords which are "Georgia peaches, peaches from Georgia, peach orchards." Gina, who is Lisa's competition has 12 keywords which are "peaches from Georgia, Georgia peaches, peach orchards, peach orchard, peaches, Georgia, Georgia peaches, Gina's peach stand, juicy peaches, peaches from the south, ripe peaches, fresh peaches"
Because Gina has more than 10 keywords and Lisa has only 3, Lisa stands a better chance of doing well in the search engines. The same rule applies to meta descriptions, except web developers are allowed a max of 150 characters for a meta description. Remember that it's 150 characters, not 150 words. While this isn't one of the biggest SEO mistakes a web developer can make, it can still negatively affect search engine rankings.