What You Need to Know About Local SEO

When it comes to getting your website ranked online, there’s no question that search engine optimization (SEO) must be part of the process. Many factors go into ranking a website, but for a local business that wants to compete, the critical piece is getting the help it needs to rank locally so customers can find it. Whether the solution is a capable freelancer or a full-service agency, the key is finding professionals who will not only work to get your website ranked, but who also have the knowledge and experience to get the job done.

Why Do Local Businesses Need to Rank?
Over half of all people start their search for a product or service by looking online. The numbers are even higher among younger generations, and that trend is unlikely to reverse anytime soon. On top of this, Google’s strong emphasis on local search means small businesses with a decent website and even moderately good SEO can often rank near the top of the search results. Considering that 50-90% of all searches end with a click on one of the top five results, that is an important place for a business to be listed.

Localized SEO isn’t something that can just be picked up on a whim, either; there is an entire industry built around ranking websites for good reason. If you want to see your website climb to the top, it is important to get the right help putting your business up there. So, if you own a business in Worcestershire, for example, you would want to target clients within the county.

What Should You Look for in SEO Professionals?
The answer involves learning just enough about SEO to ask a couple of intelligent questions and to gauge whether an agency you’re working with is on the level, acting suspiciously, or simply not living up to what you would expect. A few hours of basic research is enough to become familiar with the major Google updates (Panda, Penguin, Hummingbird, etc.) and to ask pointed questions based on them.

Ask any potential hires how they handled those updates for their clients, which past clients would give them a glowing recommendation, and absolutely ask to review a real-life campaign they have run, as well as a follow-up mock-up of their strategy for getting your website ranked. They should be willing to share this information, and it is a huge red flag if they are not.

In Conclusion
The goal is finding a high-quality SEO company that can actually walk the walk in addition to talking the talk. They need to understand claiming local listings, on-page tags, social media footprints, and authority. These are critical to ranking any small local business website, and getting them right is key to the final results you are looking for. That way, you know you’re getting the best value for money when sales shoot through the roof.

We’ve all had those moments of absolute terror where we just want to crawl into the fetal position, cry and pretend the problem doesn’t exist. Unfortunately, as SEOs, we can’t stay this way for long. Instead, we have to suck it up and quickly resolve whatever went terribly wrong.

There are moments you know you messed up, and there are times a problem can linger for far too long without your knowledge. Either way, the situation is scary — and you have to work hard and fast to fix whatever happened.

Things Google tells you not to do

There are many things Google warns about in their Webmaster Guidelines:

  • Automatically generated content
  • Participating in link schemes
  • Creating pages with little or no original content
  • Cloaking
  • Sneaky redirects
  • Hidden text or links
  • Doorway pages
  • Scraped content
  • Participating in affiliate programs without adding sufficient value
  • Loading pages with irrelevant keywords
  • Creating pages with malicious behavior, such as phishing or installing viruses, trojans or other badware
  • Abusing rich snippets markup
  • Sending automated queries to Google

Unfortunately, people can convince themselves that many of these things are okay. They think spinning text to avoid a duplicate content penalty (which doesn’t actually exist) is their best option. They hear that “links are good,” and suddenly they’re trying to trade links with others. They see review stars and fake them with markup so that they, too, stand out in the SERPs.

None of the above are good ideas, but that won’t stop people from trying to get away with something or simply misunderstanding what others have said.

Crawl and indexation issues

Two simple lines in a robots.txt file are all it takes to completely block crawlers from your website. Usually it’s a mistake carried over from a dev environment, but when you see it, you’ll feel the horror in the pit of your stomach. Along with this, if your website was already indexed, you’ll typically see listings in the SERPs with a note that no description is available because of the site’s robots.txt.
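
For reference, a disallow-everything robots.txt is just this (a minimal sketch; the original article’s screenshot of the file isn’t reproduced here):

    User-agent: *
    Disallow: /

Those two directives tell every crawler that nothing on the site may be fetched.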

Then there’s the noindex meta tag, which can prevent a page you specify from being indexed. Unfortunately, it can often be enabled for your entire website with the tick of a single checkbox. It’s an easy enough mistake to make and a painful one to overlook.
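
For context, the tag itself is one line in a page’s <head> (this is standard HTML, not an excerpt from the original article):

    <meta name="robots" content="noindex">

CMS settings (WordPress’s “Discourage search engines” checkbox, for example) can emit a tag like this on every page at once, which is how a single click takes a whole site out of the index.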

Even more fun is a UTF-8 BOM. Glenn Gabe had a great article on this where he explained it this way:

BOM stands for byte order mark and it’s used to indicate the byte order for a text stream. It’s an invisible character that’s located at the start of a file (and it’s essentially meaningless from an SEO perspective). Some programs will add the BOM to a text file, which … can remain invisible to the person creating the text file. And the BOM can cause serious problems when Google tries to read the file. …

[W]hen your robots.txt file contains the UTF-8 BOM, Google can choke on the file. And that means the first line (often user-agent), will be ignored. And when there’s no user-agent, all the other lines will return as errors (all of your directives). And when they are seen as errors, Google will ignore them. And if you’re trying to disallow key areas of your site, then that could end up as a huge SEO problem.
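
If you suspect a BOM in your robots.txt, it’s easy to check for and strip. A minimal Python sketch (the file path is a hypothetical local copy):

    # Check a robots.txt file for a UTF-8 BOM and strip it if present.
    path = "robots.txt"  # hypothetical local copy of the file

    with open(path, "rb") as f:
        data = f.read()

    if data.startswith(b"\xef\xbb\xbf"):  # the three UTF-8 BOM bytes
        with open(path, "wb") as f:
            f.write(data[3:])             # rewrite the file without the BOM
        print("UTF-8 BOM found and stripped.")
    else:
        print("No BOM found.")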

Also of note: Just because a large portion of your traffic comes from the same IP addresses doesn’t mean it’s a bad thing. A friend of mine found this out the hard way after he ended up blocking some of the IP addresses Googlebot uses while being convinced those IPs were up to no good.
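
Before blocking an IP you think is misbehaving, verify whether it actually belongs to Googlebot. Google’s documented check is a reverse DNS lookup followed by a confirming forward lookup; here is a minimal Python sketch (the sample IP is just an illustration):

    import socket

    def is_googlebot(ip: str) -> bool:
        """Reverse-DNS the IP, check the domain, then forward-confirm."""
        try:
            host, _, _ = socket.gethostbyaddr(ip)  # reverse DNS lookup
        except socket.herror:
            return False
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        try:
            return socket.gethostbyname(host) == ip  # forward DNS must match
        except socket.gaierror:
            return False

    print(is_googlebot("66.249.66.1"))  # sample IP from a Googlebot range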

Another horrific situation I’ve run into was when someone had the bright idea to block crawlers to get pages out of the index after a subdomain migration. This is never a good idea: crawlers need to be able to access the old versions and follow the redirects to the new versions. It was made worse by the fact that the robots.txt file was actually shared between both subdomains, so crawlers couldn’t see either the old or the new pages because of this block.
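
A quick sanity check during any migration is to fetch the old URLs without following redirects and confirm they return a 301 pointing at the new location. A minimal sketch using Python’s requests library (the URL is a placeholder):

    import requests

    old_url = "https://old.example.com/some-page"  # hypothetical old URL

    # Don't follow the redirect; we want to inspect it directly.
    resp = requests.get(old_url, allow_redirects=False)
    print(resp.status_code)               # expect 301
    print(resp.headers.get("Location"))   # expect the new subdomain's URL

Of course, if crawlers are blocked by robots.txt, they never get far enough to see even a correct redirect like this one.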

Manual penalties

Just hearing the word “penalty” is scary. It means you or someone associated with the website did something wrong — very wrong! Google maintains a list of common manual actions:

  • Hacked site
  • User-generated spam
  • Spammy freehosts
  • Spammy structured markup
  • Unnatural links to your site
  • Thin content with little or no added value
  • Cloaking and/or sneaky redirects
  • Cloaking: First Click Free violation
  • Unnatural links from your site
  • Pure spam
  • Cloaked images
  • Hidden text and/or keyword stuffing

Many of these penalties are well-deserved, handed out because someone tried to take a shortcut to benefit themselves. With Penguin now operating in real time, I expect a wave of manual penalties very soon.

A recent scary situation was a new one to me. A company had decided to rebrand and migrate to a new website, but it turned out the new website had a pure spam penalty.

Unfortunately, because Google Search Console wasn’t set up in advance of the move, the penalty was only discovered after the migration had happened.

Oops, I broke the website!

One character is all it takes to break a website. One bad piece of code, one bad setting in the configuration, one bad redirect or plugin.

I know I’ve broken many websites over the years, which is why it’s important to have a backup before you make any changes. Or better yet, set up a staging environment for testing and deployment.

Rebuilding a website

With any new website, there are many ways for things to go horribly wrong. I’m always scared when someone tells me they just got a new website, especially when they tell me after it’s already launched. I get this feeling in the pit of my stomach that something terrible just happened, and usually I’m right.

Read more: http://searchengineland.com/scary-many-ways-seo-can-go-wrong-261625