Is Google’s BERT Update the Death of SEO Writing?

As someone who has done a lifetime of so-called SEO writing over the years, I gotta say, I have never liked that term. To me, SEO writing is sort of awful writing.

SEO makes it sound like the focus is solely on the search engines, ignoring human users. It sort of brings to mind the terrible and keyword-stuffed blogs and web pages we saw around the turn of the century.

So, I was very pleased to see Google's Danny Sullivan say this when explaining how to optimize your content for Google's massive new BERT update.

[Tweet screenshots: Danny Sullivan's advice on optimizing for BERT]

What sweet, sweet music to our ears.

For years, there has been a growing movement in SEO agencies to stop writing for Google and start writing for human users. The BERT update will hopefully give us all the green light to really double down on this idea.

What is the Google BERT Update?

Of course, you need to be aware of the BERT update. But you also need to be aware of the fact that there is nothing that you or your SEO firm can do to optimize for it, besides writing good content. But more on that later.

BERT stands for Bidirectional Encoder Representations from Transformers. It's basically an open-sourced, neural-network-based technique for natural language processing (NLP) pre-training.

Ok, now in English. BERT is AI that helps the Google Search algorithm better understand and adapt to the subtle intricacies of our language, in order to give everyone better search results.

How does that work? Take a look at this before and after example provided by Google: 

[Image: Google's before-and-after search results for a query about parking on a hill with no curb]

Google said that: 

“In the past, a query like this would confuse our systems–we placed too much importance on the word “curb” and ignored the word ‘no’, not understanding how critical that word was to appropriately responding to this query. So we’d return results for parking on a hill with a curb!”

Or take a look at this example.

[Image: Google's before-and-after search results for a query about picking up medicine for someone else]

The algo before BERT didn't quite grasp the usage of "for someone" in that query. The searcher wanted to know if they could pick up medication on behalf of someone else. The old first result would likely have sent the searcher back to try some other combination of words, perhaps including "on behalf of," to get the results they wanted.

However, in the after-BERT example on the right, the algo now recognizes how the words "for someone" completely alter the meaning of the query and provides a better result on the first search.

This is expected to have a major impact on snippets in the SERPs.
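If you want a rough, hands-on feel for what "bidirectional" means under the hood, here is a minimal sketch using the open-source Hugging Face transformers library (our choice purely for illustration; Google has not published its production setup). BERT's pre-training task is filling in a masked word by looking at the context on both sides of the blank:

```python
# A rough illustration of BERT's masked-word pre-training, using the
# open-source Hugging Face `transformers` library (an assumption for this
# sketch; it is not Google's production search stack).
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# The model predicts the hidden word using the words on BOTH sides of it.
for prediction in fill_mask("can you get medicine [MASK] someone at the pharmacy"):
    print(prediction["token_str"], round(prediction["score"], 3))
```

The exact predictions don't matter here. The point is that the model weighs the words on both sides of the blank, which is what lets the live algorithm treat "for someone" and "no" as meaningful words instead of throwaways.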

What Should I Do if I See an Increase After the BERT Update?

Does this mean you have the best SEO agency in the world?

This is certainly a sign that you're doing the right things. If you see an uptick in rankings and traffic after BERT, odds are pretty good that you were one of the previously under-rewarded sites Google talks about (more on that below). The Google algo didn't quite see how good your content was, but now it does. Again, there is nothing to do but maintain your commitment to creating good content. Don't slow down now!

If you’re tempted to take your foot off the gas now that you’ve achieved the #1 ranking (or at least higher than your competition), please know that your competition is almost certainly going to notice that they have dropped in the rankings and double their efforts to regain it. You’re going to have to fight to keep these results.

What Should I Do if I See a Drop After the BERT Update?

No, you don’t need to fire your SEO agency just yet. There are literally thousands of things that can result in a sudden ranking drop. However, if you saw a sudden and significant drop last week, odds are very good it was because of the update. Blame BERT.

Does a drop in traffic and ranking mean you’re not writing for users and focusing too much on search engines? Maybe. But there is another possibility.

It could be a similar situation to Google's so-called Medic update back in 2018. When asked what to do about a drop in rankings following that update, Google tweeted the response:

“As with any update, some sites may note drops or gains. There’s nothing wrong with pages that may now perform less well. Instead, it’s that changes to our systems are benefiting pages that were previously under-rewarded.”

So, it may not be a matter of you doing the wrong things. It could be a matter of your competition previously being under-rewarded for doing the right things, which gives them a little bump over you.

However, if you choose to use this as motivation to go all-in on creating better and more user-focused content, we support this 100%. This is absolutely never a bad decision to make and we can pretty much guarantee that the next Google update will also reward high-quality content. And so will the one after that.

Google is Begging for Quality, So Give it to Them

As we said, focusing on humans (and not robots) in your content writing has been a mantra in digital marketing for some time. And it’s one that we firmly, firmly believe in.

But, some of the best SEO firms still aren’t embracing this concept, even though Google is often the one trumpeting it.

Whenever Google takes to the Twitter-verse to explain what to do and how to prepare for their big update du jour, their messaging is often something like, "Just keep creating quality content."

Google's reps are always huge proponents of quality. In fact, John Mueller famously handed out some tough love in a help forum in response to a question about why someone's site wasn't being indexed.

“However, looking at the content, it seems a bit questionable to me, and I feel there’s still a lot of work to be done with regards to its quality,” said Mueller, bluntly.

“My recommendation would be to take the site down, and start over fresh with unique, compelling, and high-quality content that you spend time working on… Make something awesome, don’t just make a website.”

We pointed out at the top of this blog that Google's Danny Sullivan responded to the BERT update by telling everyone to focus on quality and write for the user. He said essentially the same thing back in August after a broad core update.

“As explained, pages that drop after a core update don’t have anything wrong to fix… We suggest focusing on ensuring you’re offering the best content you can. That’s what our algorithms seek to reward.”

Writing for Human Users and Search Engines

If you click on that Danny Sullivan link at the end of the previous section, you can see he also provides a checklist of sorts to ensure you’re publishing quality content. There’s a lot there, but that’s a good thing.

Following his advice means you’re committing to taking content seriously.

To be honest, most SEO agencies have not taken content seriously for the last few years. In fact, most feel inconvenienced that their blogs have to contain 975 other words to go along with their 25 keywords… and it shows.

Their checklist before publishing a blog would be:

  • Is the keyword in the title and the lead? (Yes/No)
  • Is there a keyword density of 2.5%? (Yes/No)
  • Is the blog free of spelling/grammar errors? (Yes/No)

At no point would they ask:

  • Does this blog suck? (Yes/No)

"Does this blog suck?" seems like such a simple question to answer, but SEO agencies have had a tough time answering it. To them, it's abstract or subjective. It doesn't fit on a spreadsheet.

We could fill 1,000 blogs on how to write better blogs. However, here are just a few things SEO agencies need to stop doing ASAP.

STOP Writing Boring Headlines

Boring leads and headlines will kill the blog and the reader’s interest before it even starts. They’re not going to read this just because you wrote it. You have to earn their click and their read.

It may help to write the lead last, after you have a better understanding of the story and know the most interesting part.

STOP Burying the Lead

If the headline makes a promise, you had better fulfill it right away.

For example, if you write a blog about how to install a showerhead, you had better not write 1,000 words about the best showerheads you should buy in 2019 before you write about how to install one. That’s called burying the lead and human readers absolutely hate it.

STOP Using One-off or Random Writers

If you use a content ordering system, you could get 4 writers working on 4 different blogs for the same client in the same month. This creates terrible content because each of those writers has no previous experience with this client and may not even have any experience in that industry.

Build content plans with the intent of finding a writer who can work on the same client, month after month. This way they can learn the client’s audience, offering, and industry.

STOP Using Passive Sentences

Some writers do it because they don’t know any better. Some do it because they’re trying to bump up the word count.

Yes, it has a time and place. But too much passive voice puts your reader to sleep. It just reads like filler. So, write "I opened the box" instead of "The box was opened by me."

If you're not sure of the difference between passive and active voice, more examples can be found here. Hint: that last sentence was passive.

Final Thoughts

We know, we know. Being told that you can’t optimize for an update this big can leave you feeling a bit uneasy.

However, there is a difference between not being able to optimize for BERT, and not being able to increase your odds of ranking in a post-BERT world.

Write great content that focuses on what the user would like, not what the search engines are probably looking for. Of course, get your keywords in there, but work them in organically. Don’t write a blog that reads like its only goal is to rank. Write blogs with the goal of entertaining or educating the reader and the rankings will come.

If you have any questions about BERT or any other Google updates, please feel free to reach out to us any time.

Google Made 3,200 Changes Last Year. Did They All Matter?

Google recently said that they made over 3,200 changes to their search algorithm last year.

To give you a little bit of perspective, James Harden led the NBA in scoring with 2,818 points. So, Google changed more often than Harden scored.

Could you imagine having to change your digital marketing strategy every time James Harden scored? Well, changing it after each Google update doesn’t make much more sense. You can drive yourself mad and sabotage your results by constantly chasing Google’s algo. At our SEO agency, we’re firm believers in what John Mueller has advised us all to do.

[Image: John Mueller's advice to not chase the algo]

Everyone who works at our SEO firm is plugged into the world of SEO news and updates all day long. It’s what our employees read about on the way home and talk to their friends about, because we’re simply unabashed SEO nerds.

We stay on top of algo updates, but we don’t change the playbook every time something happens.

If you're doing the right things (the white hat things, basically), you're creating good content and organically earning links. That is the perfect utopian internet that Google wants to see. Every single update they do is another step toward their algo rewarding companies that do the right things, while weeding out the rest.

As I’ve said before, right now, the algo makes more sense than it ever has.

If you're publishing quality content, optimizing it the right way, and earning organic links to it, you are basically future-proofing your SEO success. You're algo-update-proofing your rankings.

An update is not going to arrive and pull the rug out from under all of the keywords you’re ranking well for. However, if you’ve taken some shortcuts over the years, you may be holding your breath with each update.

With that in mind, here are some of the biggest updates we’ve seen to Google over the last 12 months, and what they have meant to businesses and marketers.

Google’s Mobile-First Indexing

This update was probably the source of the most misinformation out there.

There was a July deadline that was a bit misunderstood by some people. First of all, that July 1st deadline was NOT the official launch of mobile-first indexing by Google. Mobile-first has been slowly rolled out by Google since back in 2016.

The July 1st deadline was actually the date when mobile-first indexing became the default for all new sites (meaning sites not previously crawled or indexed by Google).

This meant that if you were building a new site that was scheduled to go live after July, you had better make sure the mobile site was optimized. It’s 2019, so hopefully you were already going to do that. If you weren’t, we need to have a serious talk. Call me, like now.

If you had an older site that didn't have a mobile-friendly version, odds are good that Google would have reached out to you via Search Console to let you know that you needed to change X, Y or Z for mobile-first indexing.

What This Update ‘Really’ Meant

Was mobile-first indexing the biggest update Google has rolled out in years? Yes. But it was rolled out slowly and systematically over the last few years. When they announced it in 2016, SEO people like us said, “Ok, $#!% is going to get real.”

But it’s been getting real for about 3 years now.

The big announcement 3 years ago served as an internet-wide notice that your mobile site is now just as important as your desktop site. Mobile-first indexing meant we all had to shift to mobile-first planning.

It’s good they gave us so much time to adapt because this was a significant shift in thinking for many. In 2016, if you had an online store or e-commerce page, you probably designed your mobile site concurrently with your desktop, because you expected a lot of purchases from mobile devices.

You wanted a frictionless buying experience that was “Tap, tap, ‘Buy,’ confirmation page.” There was no room for “Tap, tap, buy, whoops I hit ‘See More’ by mistake because the buttons are too close together on a mobile screen.”

That would lead to a lot of shopping cart abandonment, so your mobile site was crucial for your SEO and your CRO (conversion rate optimization).

But if you didn’t sell anything on your site in 2016, your mobile experience was probably an after-thought. That can’t be the case anymore.

Key Takeaway: Even if your users don’t look at your mobile site first, Google will.

Google Mobile Page Speed Update

Not only does your mobile site have to be optimized, it also has to be fast.

In July of 2018, Google rolled out the mobile page speed update. Most SEOs assumed page load speed was a factor in mobile ranking (and desktop too, for that matter), but this made it official.

Google stated in their blog, “The ‘Speed Update,’ as we’re calling it, will only affect pages that deliver the slowest experience to users and will only affect a small percentage of queries.”

They did also add that a slower page may still rank well, if the content is found to be highly relevant.

They also provided a mobile experience tester for webmasters and business owners to see how their mobile site measures up.
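If you would rather check speed from a script than a browser, Google also exposes a public PageSpeed Insights API. Here is a rough sketch (the endpoint and response fields below reflect the v5 API as we understand it; treat the exact field names as assumptions and confirm them against Google's documentation):

```python
# Rough sketch: pull a mobile performance score from the public
# PageSpeed Insights API (v5). Endpoint and response structure are our
# reading of Google's docs; verify before building reporting on top of it.
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def mobile_performance_score(url):
    response = requests.get(API, params={"url": url, "strategy": "mobile"})
    response.raise_for_status()
    data = response.json()
    # Lighthouse reports the performance category as a score between 0 and 1.
    return data["lighthouseResult"]["categories"]["performance"]["score"]

print(mobile_performance_score("https://example.com"))
```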

What This Update ‘Really’ Meant

We’ve always known that faster is better. Quick load speeds help your CRO and your SEO, on both your mobile and desktop experience.

But, with this update, Google confirmed that speed is a major factor in your mobile’s SEO, and showed us how speed impacts your bounce rate:

[Image: Google's data on how mobile page load time impacts bounce rate]

Key Takeaway: Make your mobile site fast and make it good.

Google’s Broad Core Algorithm Update

In August of 2018, Google announced a “broad core algorithm update.” But what did it involve?

Nobody really knew, and Google wasn't going to tell us. Google is notoriously tight-lipped about the finer details of their algo, and they only give us the information that they want us to have without ever being too specific.

Imagine KFC announcing that they were changing one of the herbs in their secret 11 herbs and spices recipe.

Reporter: “Can you tell us which one?”
KFC: “Well, doing that would give you clues into what the rest of the secret is. So, no.”

What This Update ‘Really’ Meant

This was a tough one to lock down. We didn’t notice any significant changes.

However, the update was dubbed the "medic update" by some in the SEO community. Search Engine Roundtable posted a survey asking about the update's impact, and 42% of the respondents who said they saw some sort of impact were in the medical, health, fitness, and healthy lifestyle space. Hence the informal title of "medic update."

Of course, Google gonna Google.

They responded by saying “remain focused on building great content. Over time, it may be that your content may rise relative to other pages.”

[Tweet screenshot: Google stating there is no "fix" for pages affected by the update]

While this was a frustrating non-answer to many people who just saw a drop in ranking and traffic, it does line up with what we believe: Keep creating good content and earn organic links and you will update-proof your web presence.

Key Takeaway: Business as usual.

Google’s Fixes to Indexing and Search Console

Google is not completely shrouded in mystery. Recently they were quite transparent and open about issues and bugs in their indexing and Search Console.

1. The Indexing Issue

What happened? Google admitted that they had “temporarily lost part of the Search index.”

In an attempt to push an update live, a malfunction removed a "small number of documents" from the index. This means some sites disappeared from the index, which means you could have gone from page one of the SERPs to nowhere to be seen.

Google’s techs quickly found the issue and reverted to their most recently saved files. However, this took just shy of a week to detect and fix, which means some companies were missing from the SERPs and lost a week’s worth of traffic to their competition.

2. The Search Console Issue

The above indexing issue bled into Search Console, which began reporting and displaying inaccurate results.

The Search Console database paused reporting for April 15th-30th while Google fixed things. This left a number of marketers without the data they needed for month-end reporting and other marketing activities.

What This Update ‘Really’ Meant

It meant that Google is not perfect and doesn’t pretend to be.

This was a slightly uncharacteristic and borderline refreshingly open response to the issues they had, which is a great sign, because everyone who works in SEO craves data and transparency.

Google also provided the best ways to report a bug or issue:

  • Check our Webmaster Community, sometimes other webmasters have highlighted an issue that also impacts your site.
  • In person! We love contact, come and talk to us at events. Calendar.
  • Within our products! The Search Console feedback tool is very useful to our teams.
  • Twitter and YouTube!

Key Takeaway: If you have an issue with Google, don’t waste your time trying to get someone on the phone.

The Google Maverick Update

This was another quizzical update, with a lot of chatter in the SEO community and almost nothing said from Google.

SEO pros and pundits saw their screens light up with unexplained fluctuations across the board in July of this year. These fluctuations remained just as unexplained when they were brought to John Mueller’s attention.

He stated that, “I don’t have any update news. I saw a lot of blogging and tweeting on updates, so I don’t know what is specifically happening there. I don’t know. We will see. I haven’t chatted with Danny (Sullivan) about that. So, not quite sure.”

Ok, no answers there.

With no official word from Google, the update was unofficially dubbed "Maverick," mainly because the trailer for Tom Cruise's Top Gun: Maverick dropped online that same week. And to be honest, the trailer does look amazing.

SEO companies had nothing to draw from besides the proprietary data they were seeing across their own sites. For example, Barry Schwartz of Search Engine Land wrote that, "the general consensus was that this was a weird update and hard to find patterns with. Even when comparing it to previous core updates, this one seemed different."

What This Update ‘Really’ Meant

To be honest, this is a really good cautionary tale and case study for why you never chase the algo. If you lived and died by reacting to fluctuations, this update would have literally killed you.

There was almost no confirmation or explanation from Google as to what happened, nor any consensus in the SEO community about what really happened or whom it really happened to.

If your business took a hit, don’t overreact. If your business saw a boost, don’t overreact. Don’t drive yourself nuts trying to make heads or tails of this one.

Key Takeaway: Don’t Go Chasing Algos

Parting Thoughts

Again, these are only a few of the updates Google made over the last 12 months.

You probably don’t have time to keep track of all of these changes, nor take them apart and determine what they really mean. You’re better off hiring SEO nerds like us to stay on top of them for you.

As you read above, Google is not exactly overloading us with too many details about each upcoming or recent update.

They have to play their cards close to their chest to protect the integrity of the entire internet. If they ever came out and said, "You need to do exactly X, Y and Z to rank #1," it would create total chaos. Every marketer and business owner would do it at the exact same time and it would be impossible to actually rank. Google would immediately have to issue another update to fix the mass damage done by the last one.

This is why they give us vague, and often cryptic, details about any changes that have been made, and it’s up to the SEO community to test out what works and what doesn’t to draw our own conclusions.

Of course, we have our own theories and methodologies for what works in today’s world of SEO. If you want to ask me about them, feel free to click here.

Did Google Just Reclassify NoFollow Links as Sorta-Follow Links?

Any time Google changes anything from their end, there is an expected amount of subsequent freaking out and overreacting from the SEO community. We’re an excitable bunch.

So, when Google announced that they’re making changes to nofollow attribution (something they introduced in 2005 and have not changed since), the reaction from SEO agencies and pundits was pretty predictable. A lot of people were not pleased and let their freak out flags fly.

How big were these changes? Substantial. Google clearly had a lot of meetings and a lot of Hangouts to discuss this one, as this represents a massive shift from their side. You don’t simply change something after 15 years on a whim.

But, how big is this change to the average small-to-medium-sized business (SMB) or typical webmaster? Not very, actually.

Allow us to explain.

How Did Google Change NoFollow Attribution?

For the last 15 or so years, you would add a nofollow to a link that you wanted to include, but you didn't want to:

  • Endorse it
  • Send it any SEO link juice
  • Have Google crawl it. Maybe it’s a link in a comment or a forum

Your nofollow was the Swiss Army Knife attribute that you could use for all 3 of these situations. But last week, Google announced that nofollow attribution was evolving. Now, you can choose from 3 different options, depending on your situation.

As of this week, we now have:

  1. Nofollow: For links that you don’t want to provide any sort of rank boost to, or endorse in any way.
  2. Sponsored: For links that were created as part of advertisements, sponsorships or other compensation agreements.
  3. UGC: For user-generated content, such as comments or forum posts.

Three instead of one. It’s an interesting move on their part.

“The web has evolved since nofollow was introduced in 2005 and it’s time for nofollow to evolve as well,” said Google in their blog.

“Today, we’re announcing two new link attributes that provide webmasters with additional ways to identify to Google Search the nature of particular links.”
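In markup terms, the change is tiny. Here is a minimal sketch (plain Python string-building, purely for illustration, with made-up example URLs) of what the three flavours of link look like in your HTML:

```python
# Minimal sketch of the three link attributes in plain HTML.
# The rel values ("sponsored", "ugc", "nofollow") come straight from
# Google's announcement; the URLs and helper function are made up.

def build_link(url, text, rel=None):
    """Return an HTML anchor tag, optionally marked with a rel attribute."""
    rel_attr = f' rel="{rel}"' if rel else ""
    return f'<a href="{url}"{rel_attr}>{text}</a>'

print(build_link("https://example.com/ad-partner", "Our sponsor", rel="sponsored"))
print(build_link("https://example.com/forum-post", "A commenter's link", rel="ugc"))
print(build_link("https://example.com/unvetted", "A link we don't vouch for", rel="nofollow"))
# <a href="https://example.com/ad-partner" rel="sponsored">Our sponsor</a>
# <a href="https://example.com/forum-post" rel="ugc">A commenter's link</a>
# <a href="https://example.com/unvetted" rel="nofollow">A link we don't vouch for</a>
```

Google's announcement also noted that the values can be combined (for example, rel="nofollow ugc") if a link falls into more than one bucket.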

Why Did Google Change NoFollow?

Google made this change in the name of making it easier for them to crawl our websites, which benefits both the search engine and the search engine optimizers.

“Using the new attributes allows us to better process links for analysis of the web. That can include your own content, if people who link to you make use of these attributes,” said Google.

Of course, there was the typical blowback of jaded SEO pros who questioned the real reason why Google did this. But that’s to be expected and more on that later.

And Now the Part Where Everyone Loses Their Mind

Those were not the only changes that Google announced. They also stated that: 

“When nofollow was introduced, Google would not count any link marked this way as a signal to use within our search algorithms. This has now changed. All the link attributes — sponsored, UGC and nofollow — are treated as hints about which links to consider or exclude within Search.”

Wait… what? Did they just say that nofollow links could now be followed as a "hint"? And what is a hint, exactly?

Cue the explosion from the SEO masses.

The Fallout/Freakout 

Google’s representatives then had to take to Twitter for the always wildly unenviable task of explaining and defending these changes to a Twitterverse full of SEO pros carrying torches and pitchforks.

There were questions about "What's in it for us?"

There were the usual complaints about how much more work this will be for SEO firms and professionals.

And the ever-present concerns about forced adoption.

When Will These Changes Take Place?

The three new link attributes (sponsored, UGC and nofollow) all work right now, and they are currently being taken as hints by Google for ranking purposes. Google has also announced that nofollow will become a hint as of March 1, 2020, for crawling and indexing purposes as well.

For a visual aid on all of the changes and the timelines, check out this helpful graph from the people at Moz.

Who Will These Changes Impact the Most?

In our somewhat-humble-but-pretty-darn-sure-of-ourselves opinion, this is going to have almost no impact on the average SMB, or on the SEO companies that represent a number of SMBs.

This will most likely impact large companies, publishers or higher-end sites with:

  • Multiple digital properties to manage
  • Blogs, videos, or other content with a lot of comments (UGC)
  • Sites that host community forums (also UGC)
  • Companies that have a large number of affiliate links generating income

For everyone else, this is likely something that is not going to really impact you. It’s pretty much business as usual. Google has openly said that you don’t have to do anything at all because of this change. It’s great to be aware of this, but it’s not really actionable.

What Did Google Mean By a ‘Hint’?

The most second-guessed part of the announcement was clearly the word hint, and what that means exactly.

Our interpretation: The link will not be crawled… Unless it’s earning a lot of clicks and attention, which is a “hint” to Google there could be something noteworthy on the other side of this link and perhaps they should crawl it after all.

Google has said that:

“Why not completely ignore such links, as had been the case with nofollow? Links contain valuable information that can help us improve search, such as how the words within links describe content they point at. Looking at all the links we encounter can also help us better understand unnatural linking patterns.”

“By shifting to a hint model, we no longer lose this important information, while still allowing site owners to indicate that some links shouldn’t be given the weight of a first-party endorsement.”

As a writer, I personally like to picture the people at Google on a Hangout call obsessing over the verbiage to use and finally landing on the word “hint.”

“Signal? No, the word signal is far too strong. How about tip? Too obtuse or mischievous? How about hint?”

Do I Need to Audit My Links?

Well, you should always regularly audit your links. But should you audit your links just for the purposes of these changes? You can.

Again, you should be mindful of this distinction for future links. But, Google has stated that you don’t really need to change your links retroactively. If you leave user-generated content as simply nofollow, you’re fine. There should be no penalty or change whatsoever.

However, Google said the one exception could be, “If you flag a UGC link or a non-ad link as “sponsored,” we’ll see that hint but the impact — if any at all — would be at most that we might not count the link as a credit for another page.” 

They added, “In this regard, it’s no different than the status quo of many UGC and non-ad links already marked as nofollow.”

However, if you’ve not taken a look at your links in some time and want to use this as an excuse to perform a much-overdue audit, by all means, have at it. One of the good passive effects of any major update is that it forces us to stop and look at what we’re doing and revisit the big picture.

SEOs can get so caught up in the day-to-day granular minutiae of creating mountains of content that it helps if we can regroup once in a while.

How Do I Audit My Links?

If this is your first time auditing your links, don’t worry, automated software can do most of the heavy lifting for you. You don’t need to comb your site to test all of your links, or Google your company and your name and your phone number to try to find all of the links pointing back to your site.

There are a number of software solutions out there to help you audit your link profile, but we highly recommend Ahrefs.

You can get a free (or cheap) trial of Ahrefs and perform a full link audit in under 30 minutes. They walk you through how to do it here. This can help you:

  • See your entire link profile
  • See your competition’s link profile (which is amazingly helpful)
  • See any bad links that may be hurting your SEO profile
  • Find new opportunities
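Ahrefs will do this at scale across your whole backlink profile. If you just want a quick spot-check of the outbound links on one of your own pages, say to see which ones already carry nofollow, sponsored or UGC attributes, a short script can do it. Here is a rough sketch, assuming the third-party requests and beautifulsoup4 packages:

```python
# Rough sketch: list the outbound links on a single page and how each one
# is attributed. Assumes the third-party `requests` and `beautifulsoup4`
# packages; the example URL is a placeholder.
import requests
from bs4 import BeautifulSoup

def audit_outbound_links(page_url):
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for anchor in soup.find_all("a", href=True):
        # BeautifulSoup returns rel as a list (e.g. ["nofollow", "ugc"]) or None.
        rel = " ".join(anchor.get("rel") or []) or "follow (no rel attribute)"
        print(f'{anchor["href"]}  ->  {rel}')

audit_outbound_links("https://example.com/blog/some-post")
```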

Why Would I Want to Audit My Links?

Your link profile is the essence of your off-page strategy, which is worth about half of your SEO clout.

Your site's ranking could be held back by a spammy link as we speak. Google's Penguin update really cracked down on bad links, so you need to know if you're currently being penalized. The good news is that if you are, Penguin is now part of Google's core algo, so you should see results fairly quickly after you clean up your backlinks.

The other upside that this audit presents is to find new opportunities by looking at your competition’s links. If a reputable site linked to them, there could be an opportunity to approach them to link to you. If someone ran a guest post from your competition, maybe they will run one for you too.

This can lead to some huge SEO wins and help you fill your content calendar for the next year.

Link-Building and NoFollows

You would never actually build or work towards a nofollow link, right? What would be the point in that? That nofollow won’t be crawled or pass on any SEO link juice to your site, so why bother?

Well, we do it. Other SEO companies have straight-up laughed at us for doing it. But we still do it. Here’s why.

You need a certain level of moderation and restraint in your link building campaigns and strategies. If you just go full bore 100% of the time with hardcore aggressive link building tactics, Google is going to notice for all of the wrong reasons.

You’re going to set off warning bells that tell Google, “These guys are too aggressive, they’re trying to manipulate.” Now, all of your effort has been wasted, while both your links and your content are now meaningless.

We see nofollow links as sort of a pH balancer to level out link building tactics. We add in just a bit to keep things level. That way, our web trail has more variety.

Doing too much of anything can put your link building efforts in jeopardy, if you’re not careful and mindful of the trail you’re leaving. If you have given all of your writers a checklist that says you absolutely need to put a keyword in the opening paragraph, that’s going to leave the wrong type of trail.

So, in this regard, nofollows absolutely have some value. It's not exactly exciting, big-win territory if you hire an SEO agency. You're not anxiously awaiting your monthly report so you can see how many nofollow links they built you. But it should still be done. Try to see it as unglamorous maintenance that keeps the machine running.

At the same time, there is also this new talk of hints. Major publications are somewhat famous for nofollows in the middle of their stories. So, if you happen to earn a nofollow on a big site, Google may see it as a hint if it’s garnering some attention. This could be a major win.

To be honest, I would have counted a nofollow from a big website as a win, even before these updates and hints became a thing.

Final Thoughts

To recap, this is a big move for Google, but probably not a huge move for you, unless you have a lot of:

  • Web properties to manage
  • User-generated content 
  • Affiliate/ paid links

You don't really have to do anything right now if you're an SMB. Perhaps be mindful of the three link attributes for the future.

As we said, if you did want to use this as an excuse to revisit or audit your links, this is almost never a waste of time. You can identify anything that is hurting your SEO right now, while finding new ways to beat your competition.

Of course, if you want to hand off your link-building and off-page strategy to an SEO agency full of really smart SEO nerds, we would love to help! Click here to contact us any time!

Are User Reviews Overrated?

Is there a disconnect between how important restaurant owners think their online reviews are, compared to how important customers feel those reviews are?

A restaurant point of sale and management system called Toast recently released a whitepaper that stated:

  • Almost half (49%) of restaurant owners say online reviews are influential
  • But, only a third (35%) of customers say online reviews influence them

Huh. Ok, processing… From the perspective of an SEO company in Toronto, we must say this does not compute.

Based on the data on browsing and searching habits we've seen from customers over the years, Toast's findings don't quite jibe with us. We can tell you that restaurants are among the most Googled businesses across the globe, and data has always pointed to search engine results driving consumer behaviour.

Here are some other stats to consider:

  • BrightLocal has reported that 90% of consumers read online reviews before visiting a business
  • They also added that 84% trust online reviews as much as a personal recommendation
  • The Harvard Business School reported every one-star increase in a Yelp rating means a 5 to 9% increase in revenue
  • Reviewtrackers has reported that 94% of customers surveyed said a bad online review has convinced them to avoid a business
  • Customers are willing to pay 31% more for a well-reviewed business

We were not a part of the Toast survey data gathering, but it is entirely possible the customers they surveyed may not realize how much online reviews actually influence their purchase decisions.

You could ask people something like:

Question: How important would you say online reviews are in choosing a restaurant?
Respondent Answer: Only Somewhat

But if we dig a little deeper:

Question: Would you go to a restaurant with 2.0 stars?
Respondent Answer: No 

Question: Would you go to a restaurant after reading that it’s way too expensive?
Respondent Answer: No 

Question: Do you check menu items online before visiting a restaurant?
Respondent Answer: Yes

When you break it into individual questions about their buying behavior, the big-picture-answer is, yes, online reviews are important.

So, if you were a restaurant owner who read the Toast statistics and was about to dial back your spending/ efforts on online reviews, think again.

Online reviews are more than just a foot-traffic driver to get people into your business. They are an absolutely crucial part of any successful local SEO campaign.

Are online reviews overrated from an SEO perspective? Hard, hard no. In fact, they may be dangerously underrated by many.

The Impact of Online Reviews on SEO

If you’re a small-to-medium-sized business in a competitive market, your online reviews are a huge part of your local SEO strategy and they are the pillars of your off-page strategy.

Your off-page strategy needs to involve real effort going into:

  • Your Google My Business listing (Moz has ranked it as most important)
  • Your online reviews
  • Link-building & brand mentions

How much weight is given to your reviews in relation to other on and off-page factors? Good question! Sadly, this is not like your credit score where FICO tells you that your payment history is worth 35% of your credit score. Google doesn’t tell you how much your reviews are worth. And it’s a good thing they don’t because that would create chaos in the world of local SEO.

Here are some universal truths that data has proven:

  • High-ranking sites have more good reviews
  • They also typically have a larger number of reviews, in general
  • Good reviews help your SEO more than bad reviews, but…
  • Bad reviews aren’t always that bad (More on that later)
  • It is almost impossible to rank in a high-competition local market without them

There is also no magic ratio that expresses how much a positive review helps you, compared to how much a bad one hurts you. I can tell you that if I knew that number, I would not share it for free in a blog. I would be living on my own island right now, ruling the world of local SEO.

Good Online Reviews Help David Fight Goliath 

Let’s say you run an amazing (but small) steak house.

You want to earn enough SEO clout to steer people away from international steak giant, The Keg. You may wonder how the hell you’re supposed to do that. The Keg probably spends the equivalent of your restaurant’s entire yearly revenue on social media ads alone, right?

You can’t take them down, but you can stand next to them!

Your reviews are your greatest SEO weapon in this battle. The Keg has (way) more money than you, but they can’t buy good reviews (more on that later). This levels the playing field.

But you may say, “They get more customers, they’re a bigger name, and they have been open longer. They already have over 2,000 reviews and an average of 4.5. How are we supposed to get that?”

You don't need to get that. With 200 reviews and an average of 4.6, you could very well earn a listing right next to The Keg, and now you're a part of a lot more conversations about people's dinner plans.

People Like to Help the “Little Guy”

It may actually be easier for small businesses to gather online reviews, compared to a major chain.

I don't know about you, but if I get a slice from Pizza Pizza and the cashier asks me to review them online, I'm like, "Nah, you're good without me." But, if I get an amazing slice from a new Mom and Pop place and they ask me for a review, I'm like, "Yes! That pizza was amazing. I must tell the world! I want these guys to do well and stay open forever."

So, if a Keg waitress gives customers the bill and says “Don’t forget to review us online,” a lot of customers won’t give it a second thought. But, when they’re in your steak house, they’re more likely to think, “Yeah, I think I will. This place is great and I want to help them.”

They may even leave a review on the Uber ride home. 

It’s an easier ask for a small business, but you still need to ask the right way.

How to Get More Good Online Reviews

Good reviews do not simply appear because you're amazing. You may think the secret to great reviews is delivering the best possible offering and customer service. But that's far from the whole story. It certainly helps, but good reviews are not built on good intentions alone.

The businesses with most good reviews don’t simply have a great offering, they have a great system to encourage reviews. Like anything else in SEO, you need a strong system in place. You need a goal, a plan, and means of measuring success.

One of the first things you need to do is make sure you’re properly registered as a business with Google so you can actually get reviews. You can learn all about that by clicking here.

The rest blends elements of marketing and psychology.

Know What You’re Up Against

You’re probably already aware of this: People are more likely to leave a bad review for a bad experience.

The old adage used to be that a happy customer will tell 2-3 people about their experience but a dissatisfied consumer will share their gripe with as many as 20. The numbers always seemed to fluctuate, depending on who you talked to. However, the message is always clear. Bad experiences resonate more, and people are more likely to share them.

It’s not that we’re all terrible and whiny people who would rather complain than praise. It’s just a matter of what resonates more. 

Think about it. Which experience is more memorable?

  • That time you waited in line forever and they still screwed up your order, or…
  • That time you ordered your coffee with no problems whatsoever

Which story would you be more likely to tell to coworkers in the lunchroom later? The bad one, clearly, and it’s not your fault. It’s just simply a more memorable, and sadly, shareable experience.

This means that if bad experiences stand out more in people’s minds, we have to give them a reason to review good ones. 

Know the Rules

Some people will tell you the secret to getting more online reviews is simple: Ask for them.

Well… that’s simply not true. The biggest problem with that is that Yelp does not want anyone asking for Yelp reviews in any form at any time. Ever. They want all of their reviews to be completely unsolicited and totally organic. 

Yelp takes this seriously and is getting better at catching the people who violate their policies every single day. 

However, many other major review sites like Google and Tripadvisor do allow you to ask for reviews. They just don't want you to:

  1. Pay for them in any way
  2. Misrepresent the truth or lie in them

Assuming you’re not targeting Yelp, there are a number of organic ways you can encourage customers to review your company.

Train Your Staff

Get your sales or customer service staff to remind customers to review you, right after an interaction where the customer is satisfied and their opinion of you is high.

This is why servers should do this after a meal and not before. But again, if you’re a restaurant, don’t let your servers use the Y-word that rhymes with “help.”

Swag/Print Materials/ Take-aways
You can also put reminders to review you online on your take-out menus, fridge magnets, business cards, or flyers. 

Emails and SMS Reminders
Once you have your customer’s email address or phone number, you can send them a gentle reminder/ nudge that their feedback is valuable. But, be careful as Google says you’re not allowed to “solicit reviews from customers in bulk.”

Your Website and Social Channels
It’s also good to have “Review Us” buttons and sections on all of your web properties.

Never Ever Do These Things to Get More Reviews

Now that we've reviewed the organic and above-board ways that you should earn your reviews, we will cover the black-hat shortcuts that will get you banned from review sites in a hurry.

Or you may get a humiliating banner on your Yelp page for 90 days, letting everyone know you cheated.

[Image: Yelp consumer alert banner]

Never, ever do any of these things.

Review Your Own Company:
This includes getting any members of your staff to do it. The one exception may be asking your employees for good Glassdoor reviews. This is legal, but a lot of employees still hate doing it.

Create Fake Accounts to Write Good Reviews
We can’t believe people still do this. You WILL get caught and you WILL pay for it. In fact, you might even go to jail. Seriously!

Hire a Third Party Person or Company to Write Fake Reviews
Just don’t do it, for all of the reasons we listed above.

Offer Incentives For Leaving a Review
No matter what you come up with, it is very likely going to tread too closely to paying for good reviews.

It doesn’t matter if it’s a discount, or a promotion, or free sample. You’re best to err on the side of caution and just avoid incentivized reviews.

Write Bad Reviews On Your Competitors' Review Pages
This can get you banned, and it also just simply reflects very poorly on your brand.

You come off as unprofessional and petty.

Can a Bad Review Be a Good Thing?

There is an old saying in Hollywood that all press is good press. Does this apply to your online reviews? Is any review a good review that can help your SEO ranking?

We wouldn’t go that far, but there are definitely a few upsides to a bad review.

1. Volume is a Good Thing

Google's ranking algo is incredibly complex and constantly changing. As such, nobody can truthfully say exactly how much a good review will help your ranking, how much a bad one will hurt, or what the totality of all your reviews may do. You just can't.

There are some who can say, “We increased our reviews from X to Y, and saw our ranking jump to Z. This caused our traffic to blah blah blah.” That’s anecdotal and isolated. It’s also very tied to a local market. A bad review in Toronto could have a completely different effect than one in Manhattan.

Two things we know for sure:

  1. Google will showcase the brand with the best reviews
  2. They will also showcase the ones with the most reviews

In this regard, yes, adding to the number of reviews you have is likely a good thing, provided you can offset the bad review with lots of good ones.

2. Legitimacy and Truthfulness are Good

You also need bad reviews to be taken seriously.

You need a mixed bag of both good/ bad reviews to be respected and accepted by both the review site and your would-be customers. Ironically, you need a few bad reviews for people to believe that your good reviews are all legit.

Nothing but glowing 5-star reviews is a red flag to Yelp or Google that you are faking your reviews. And it sends the exact same signal to the people searching for your business. You’re simply too good to be true.

So, in this case, a 4.8 likely looks better than a 5.0. People look at a 5.0 and say, “Pfft. Fake.” But they look at 4.8 and say, “Oh damn.”

Research has shown that customers are more likely to respond to moderately positive reviews than to overwhelmingly positive ones.

Everyone knows that nobody is perfect. There were critics who didn't like the Beatles' White Album or the film The Shawshank Redemption. Nothing can garner 100% good reviews.

Even the online mattress mega-seller Endy brags about a slightly imperfect 4.8 rating on their site, front and center. They understand the legitimacy of the number, and that buyers will know that 4.8 is actually freaking awesome. 

3. You Can Turn a Bad Review Into a Success Story

This is a very legitimate opportunity to turn a detractor into an advocate, or a hater into a fan. 

The individual attention that you spend addressing a negative review (or comment on social media) can go a long way and make a lasting impression with this once-angry customer.

Google actively encourages you to interact with your bad reviews, and points to it as a way of gaining local SEO success.

Of course, there are also services that will help you intercept a bad review from a customer before it's posted publicly.

But we will take a deeper dive into them later.

4. You Can Turn Bad Reviews into Great Blogs

Sometimes a hurtful review can be the best thing to ever happen to you.

They say that when you lose, you should never lose the lesson. So take your bad reviews and make the most of them.

The root of their complaint is clearly that customer’s pain point. Someone else will have the same pain point. This is all invaluable and unfiltered market research. Put your pride aside and learn from this.  

Take the bad review, in its entirety (swear words and all), and ask yourself:

  1. What is this customer’s overarching complaint? Was this about quality? Wait times? Pricing?
  2. What are the words they’re using to describe how they feel? Annoyed? Hurt? Disappointed?
  3. Was this just one unhappy customer or is this pointing to a trend?

Take the verbiage and the pain points and turn them into targeted blog content and whitepapers. If there is a common complaint, this is your forum to address it.

For example, let’s say you’re a keto-friendly bakery. Someone gives you a bad review saying they drove all the way across town and you were out of the keto pizza crust that they really wanted. Maybe a few people have expressed a similar complaint.

Write a blog about how your pizza crust is made by hand and you can’t get this level of keto-friendliness and tastiness in a mass-produced crust. It has to be made with love and care by hand, which is how you do it. It sells out so quickly, because it’s so dang good, but so dang hard to make.

Do I Need Customer Review Software?

There are a lot of legitimate tools that can help you mine, cultivate and manage your customer reviews across all platforms. You can use a single solution to manage your Google Reviews, Yelp, Booking.com, Tripadvisor or whatever you need.

However, these solutions are not cheap and can often be a big investment for a small business.

Is it worth your money to invest? If you’re in a very competitive market and can see a clear return-on-investment from better online reviews, it is definitely worth considering. Or, if you’re finding that you’re spending too much time managing them manually, you should also consider it.

Keep in mind that these programs are not autopilot mode for your reviews. They can do a lot to help you manage and organize, but they still require someone to use the program and “own” it. If you don’t want to tie up any of your own time or staff worrying about this, your best bet is to likely work with an online reputation management firm.

At the same time, if you’re considering a software solution because you’re dealing with a plague of bad reviews that you want to remove/move down, you should definitely work with a reputation management firm, because this requires an entire strategy.

Parting Thoughts

So, are online reviews overrated? In a nutshell, no, not even close.

There is a reason these reviews are the new currency in the world of local SEO. Google has always rewarded organic tactics to boost your brand’s web presence, and this definitely includes a strong trail of legitimate online reviews.

Like everything else in SEO, you can’t really buy success here. You need to put in the work to earn it. It’s not easy and it doesn’t just happen simply because you did a really good job. You need to cultivate online reviews, but not too aggressively, particularly when it comes to Yelp… Yelp needs to be handled with care.

If you have any questions about online reviews or reputation management, please feel free to click here to contact us at any time.

 

What Did Google's July Deadline For Mobile-First Indexing Really Mean?

We’re still hearing a bit of confusion about what exactly happened to Google’s ranking algo on July 1st, so let us clear the air.

Here's what it wasn't: July 1st was not the start of mobile-first indexing. It was also not the deadline to make sure your site was up to Google's mobile-first standards.

Here’s what it really was: All new websites (not previously crawled by Google) would be indexed mobile-first by default.

The short answer is that if you launch a site after July 1st, expect the mobile site to be looked at first. Which should not be a big deal, because most new sites should be built with mobile in mind.

The July date should have absolutely no bearing on current sites, and your site should have already been optimized. If it's not, your homework is very late and the teacher would like you to stay after class to discuss it.

[Image: Search Console mobile-first indexing notification]

Why All the Confusion?

To be blunt, Google has been promising mobile-first indexing since November of 2016 and has rolled it out in various stages ever since. It may be hard to follow it all, if you haven’t been plugged into the SEO scene every day like we are.

At a glance, here is how it played out.

Friday, November 4, 2016
In the same year that mobile traffic surpassed desktop traffic on a global scale, everyone wondered when Google would start to put more weight on mobile sites when assessing your web presence.

Google makes it official and announces mobile-first indexing as a way of adapting to how we’re all searching for things. No hard dates or specifics are announced, just an industry-wide feeling that, “This is huge.”

Monday, December 18, 2017
A bit more than a year after the Winter-is-Coming-like warning, Google states that:

“We continue to be cautious with rolling out mobile-first indexing. We believe taking this slowly will help webmasters get their sites ready for mobile users, and because of that, we currently don’t have a timeline for when it’s going to be completed.”

This confirms two things:

1. Google is serious about doing this the right way and wants us all to understand the scale of what is happening

2. If you haven’t optimized your mobile site, you really need to get on that.

A lot of people make their mobile site their New Year’s Resolution. 

Monday, March 26, 2018

Google announces they are now switching over websites that appear to be following their best practices guide.

Wednesday, December 19, 2018

So this is Christmas. What have you done (to your mobile site)?

A year after the warning that mobile-first would be rolled out slowly, Google reports that mobile-first indexing is now used for over 50% of their search results. They add that websites being indexed mobile-first will be notified via Search Console.

If you have not acted on the previous warnings, you are very much behind at this point.

Tuesday, May 28, 2019

This is the first time Google puts a hard date on anything.

As we mentioned earlier, Google announced that, as of July 1st, all new sites that have not previously been indexed will now be indexed mobile-first by default, because, “Most new sites seem to work fine on mobile,” according to Google’s John Mueller.

As for all sites that were launched or created before July 1st, Google says they will continue to contact webmasters about their readiness.

Thursday, June 27, 2019

Google introduces more mobile-first reporting in Search Console.

This shows you that you have been switched and the date you have been switched, so you can properly analyze how the switchover has impacted your traffic and other metrics.

[Image: Search Console mobile-first indexing data]

How to Succeed in a Mobile-First World

If the entire Google algo shifting wasn’t enough incentive to make you think mobile-first, here are some stats to consider.

  • 88% of local business searches on a mobile device lead to either a call or a visit to the business within 24 hours. This means mobile customers are buy-ready customers.
  • 60% of the people who search for local businesses use smartphones
  • 65% of people use their mobile phone in their “buying moments,” meaning they want to buy this thing right now and they will be frustrated if your site won’t let them do that

How do you set your mobile site up for SEO success? You need to focus on the mobile experience.

Mobile User Experience (UX) and SEO

If you have a user who is in the midst of a micro-moment or buying moment, don’t stand in their way. Google can tell when someone abandons your site or the shopping cart without buying anything.

There are a number of things that can ruin this experience:

You Have Intrusive Pop-ups, Interstitials or Overlays

Is it pretty much impossible to do anything on your site without clicking your pop-up newsletter or whitepaper CTA? This is a big no-no for UX and Google has warned webmasters about this.

If you’re suffering from this, we recommend referring to Google’s guidelines.

Your Videos Don’t Work on Mobile

This is immediately off-putting for a user and they will likely leave right away.

We recommend using Google Web Designer to create mobile-friendly animations in HTML5.

Your Users Have to Pinch and Zoom to Read or Do Anything

Are your users clicking one button by accident when they’re trying to click another? Do they have to pinch and zoom to read your text? These are also UX and SEO killers.

Once again, we strongly recommend you read Google’s Guidelines on how to properly space your elements.

They take a nice, deep dive into how everything should look and function.

You Have Too Little SEO Content

There is less space to work with, but the exact same need for SEO content. Google still needs good old fashioned text to read to know who you are and what you’re all about. With less screen space to work with, you simply have to be more strategic in how you place it.

Picture your mobile screen, divided into thirds:

The Top Third:
This is where your logo, banner, and call-to-action (CTA) go. The CTA is the whole point of the page, so you want it clear and accessible.

The Middle Third:
This is where you briefly describe your offering as concisely as possible. You will also put your social proof here, which could be your awards badges, industry certification, big-ticket client logos, or a great testimonial.

The Bottom Third:
This is where things can get a bit more text-heavy and you can add most of your SEO-rich keywords. This is also a great place for a dropdown menu that opens up more space for more content.

Need Help With Your Mobile SEO?

The real secret to success in a mobile-first world is mobile-first planning. Our creative and strategic thinkers can help with your entire web presence.

Want to start now? You can click here to get a free consultation.

Sidewalk Labs’ Vision of Toronto: Dream City or Sci-Fi Nightmare?

Sidewalk Labs was asked to create a vision for a 12-ish-acre piece of criminally underused land on Toronto’s waterfront. What they came back with is truly amazing or absolutely terrifying, depending on how you see it.

The people of Toronto currently have two polar views of this plan. Some feel it will be a utopian future like The Jetsons, where robots clean our homes and dress our kids, while others feel it will be a dystopian future like Terminator, where the machines rise and enslave us all.

At first glance, the plan from Sidewalk Labs (Google’s sister company and Alphabet Inc.’s urban innovation organization) looks incredible. It would make Silicon Valley look like an expensive and overrated dump compared to Toronto’s new shining beacon of the future.

The plan promises affordable and sustainable loft housing made of (gasp) wood. This new area would also be home to innovative and lucrative jobs, while we’re all whisked around the area on a new transit system that’s enough to make today’s GO Train or TTC commuter drool. We could walk anywhere, spend more time outdoors, and get to know our friendly neighbours.

However, a closer look at their 1,500-page opus of a plan reveals a lot of high-level questions that still need to be answered, along with some pretty massive concerns about what they’ve proposed.

So far, the plan has been very divisive and polarizing. Torontonians are either ready to sign a petition to keep Sidewalk Labs out, or sign up for the waiting list for one of those lofts.

We love Toronto. We’re proud to live in the city and operate a business here. We’re also an SEO firm and web design agency that talks about Google all-day-every-day. So, you could say that we have a vested interest in how this all plays out, and we’ve been watching the headlines carefully.

Here is a look at both sides of the issue.

About Sidewalk Labs’ Proposal

Earlier this summer, Sidewalk Labs unveiled their massive proposal, called Toronto Tomorrow: A New Approach for Inclusive Growth. We call it massive because it is 4 volumes and over 1,500 pages long, and because it represents one of the most ambitious plans for any city in history.

They were basically asked to come up with a plan to repurpose and revitalize a 12-acre parcel of Toronto’s lakefront. However, that is just Phase 1 of their proposed plan.

Their vision would see a 20-acre Villiers West site become a new, expanded Google Canada headquarters, with some new residential and commercial properties as well. Next, they would set their sights on a 190-acre area of Toronto’s waterfront that they would turn into the Innovative Development and Economic Acceleration (IDEA) district.

In keeping with Google’s “Don’t Be Evil” motto, here is some of the good that the plan sets out to do:

More Jobs

This plan promises to create an impressive 93,000 total jobs, which includes 44,000 direct jobs. They also say it will pump $14.2 billion into the annual GDP output by the year 2040.

Approximately 2,500 of those new positions would be manufacturing jobs “catalyzing the mass timber industry through a new Ontario factory.”

More Affordable Housing

At the same time, affordable housing is a benchmark of the plan, with 40% of the proposed housing being rented at below-market rates. Another 20% would meet the traditional definition of affordable, which is defined as being offered at or below 100% of the average market rent for the given city.

They have also allocated 5% of their units to meet the definition of “deep affordability,” which means they are offered at no more than 60% of average market rent.

However, it’s important to know that these units will be… “cozy.” The proposal has touted that the Quayside condo of the future would come in both efficient and ultra-efficient units. Today’s average 1-bedroom apartment/condo in downtown Toronto hovers around the 450-500 square feet mark, including a balcony. It will be interesting to see how these units are sized.

They say smaller and more efficient units will, “Enable affordability while remaining livable through thoughtful design features, such as space-saving furniture, shared building amenities, and access to off-site storage space with on-demand delivery.”

Affordable housing is a huge issue for people who want to live and work in Toronto. The most recent numbers point to a 1-bedroom apartment in Toronto now costing an average of well over $2,200, while a 2 bedroom would cost you much closer to $2,800.

That’s a 14% year-over-year increase, and these numbers have been going up for the last few years. One wonders what the market rates will be by the time this plan is approved and the new apartments are actually built a few years from now.

More Walkability and Mobility

Sidewalk Labs have also made walkability and mobility a huge part of their vision of the future.

They have proposed to build a neighbourhood where more than 3/4 (77%) of all trips are made by public transit, cycling, or walking. They added that this would save households up to $4,000 a year.

They want to do things like build “people first streets” that increase pedestrian street space by 91% and offer new mobility services such as:

  • Ride-hail
  • Bike-share
  • Electric vehicle car-share
  • E-scooters

The area would also have adaptive transit lights that give priority to pedestrians.

More… Good Weather?

Sidewalk Labs also envisions an area where we can all walk outside and enjoy the weather up to 35% more. They propose to do this by building things that keep crappy weather out, with an “outdoor-comfort system.”

Their proposed outdoor-comfort system would have: 

  • Raincoats to shelter sidewalks
  • Fanshells to cover open spaces
  • Lanterns to block wind

We will be the first to say that the wind comes right off the lake during Toronto’s winters. The damp and cold winter winds seem to go right through you, even if you’re wearing Gore-Tex, Canada Goose or The North Face. Toronto’s winters are no joke, and possibly why we lost Kawhi Leonard to Los Angeles.

But, their weather-blocking measures remind us of that time Mr. Burns wanted to block out the sun.

 That could just be us, though.

The Reaction to the Sidewalk Labs’ Proposal For Toronto

Something this massive is always guaranteed to garner a mixed bag of reactions from anyone in the GTA with a Twitter account.

The supporters are applauding the ambitious effort to turn Toronto into the world-class city of the future we all see it as. The proposed plan would pump some much-needed jobs and money into an underused part of the city, potentially turning it into a waterfront crown jewel that all other cities across the globe are insanely jealous of.

However, the detractors have 3 main concerns:

  1. Data Security: There is worrisome talk of “urban data collection.”
  2. Land grabbing: This proposal is aggressively larger in scale than anticipated
  3. Private interests: A private company deprioritizing public interests

Here is a closer look at each.

Concern #1: Data Security 

We live in an age where the average person freaks out when they see a MEC ad in their Instagram feed the same day they visit the MEC website. People are hypersensitive towards data privacy these days, and Sidewalk Labs’ talk of ‘collecting urban data’ has been met with a visceral reaction from many.

There has been a predictable amount of freaking out over this so far. After a public meeting, one attendee said, “In an ideal world I’d be in favour of data collection in Quayside if there were enough protections… But Google has a history of breaking public trust.”

However, that man asked not to be named in the story, and added he doesn’t have a smartphone or a bank account. So… you know. He might be more concerned with privacy than you or me.

In response to these (and many other) concerns raised by the people of Toronto, Daniel L. Doctoroff, Chief Executive Officer of Sidewalk Labs, penned a piece in the Toronto Star called, “Sidewalk Labs plan sets a new standard for inclusive urban growth.”

He addressed the privacy issue by writing that:

“That’s why we’ve proposed a government-sanctioned, independent urban data trust. And it’s why Sidewalk Labs has committed to de-identification and privacy-by-design principles; to not selling personal information or using personal information for advertising purposes.”

Of course, the words ‘government-sanctioned’ did not exactly put the critics’ fears to rest. Many of these people wouldn’t feel comfortable with Google or our government safeguarding their sensitive data.

The words ‘de-identification and privacy-by-design principles’ also didn’t do much to quell any fears. There are many who believe that AI and quantum computing technology will effectively make de-identification impossible in the next few years.

Google’s Questionable History of Data Privacy 

There is also a question of who exactly Toronto would be getting into bed with, so to speak. Google is the world’s biggest data collector, as well as the world’s biggest data collection controversy collector.

It’s understandable, as Google has had a long history of questionable data collection and sharing practices, and widespread data security criticism for every product that they offer, from Maps to Chrome. 

Despite that, Sidewalk Labs insists that protecting private data will be a top priority for this development. They have stated that the feedback from the public has inspired them to set: 

“A new standard for data privacy and governance in cities, and scaling back the role of Sidewalk Labs so local third parties can lead most of the real estate and technology development.”

Sidewalk Labs would even like to create an independent, government-sanctioned Urban Data Trust. Again, this was not met with an overall sense of relief, as right now there is no legal definition of what urban data even is. And many question whether the government has the ability (or the trustworthiness) to operate such a trust.

However, it will be hard to garner trust as the company keeps making headlines for all of the wrong reasons.

Concern #2: Land Grabbing

It’s hard not to be taken aback by the sheer size of what’s being proposed. It’s also worth noting that something that started out as a project to revitalize fewer than 13 acres of land quickly escalated into a plan to transform most of Toronto’s eastern lakefront property.


This project would have raised questions of gentrification no matter what, even if they had just stuck with the original land parcel. After all, we’re talking about the biggest brand name in the world coming in to “improve” a historic and under-developed part of town. This always brings up questions like:

  • Will this suck the soul out of the area? 
  • Will it push the less fortunate people out?  
  • What’s to stop this from spilling into other neighbourhoods?

However, Doctoroff has stated that, “This plan proposes a limited role for Sidewalk Labs with government in the lead. Working with local partners, Sidewalk Labs would develop less than 7 percent of the eastern waterfront.”

Concern #3: Private interests

There is also a concern that this section of Toronto will quickly become ‘Googleville,’ pushing public interests aside and keeping all of the money for itself.

Doctoroff also addressed this in his Star piece, saying that, “Sidewalk Labs and partners would provide up to $1.3 billion in funding and financing.”

“We propose to make money on real estate development, fees on advisory services, charges on any optional financing provided, and a performance payment for hitting agreed-upon targets. Much of this would come only after public goals are achieved.” 

They have also proposed 10% profit sharing with governments for 10 years for some technologies developed for and used in the IDEA District.

Closing Thoughts

Like many people in Toronto, we’re adopting a ‘wait and see’ mentality when it comes to this development. We are huge fans of Google and see what an amazing opportunity this is for Toronto, particularly those of us who work in the technology sector.

There are a lot of things in the proposal that make us want to pinch ourselves because they sound so amazing. However, there are still a lot of unanswered questions that will need to be addressed as the proposed plan moves forward.

Will the IDEA District be the next head office for SEO Toronto? Only time will tell.

Add TF-IDF to Your SEO Research Before Your Competition Discovers it

No, we’re not about to tell you to completely scrap your keyword research. But, we are going to advise you to add a few columns to the spreadsheet: crucial columns that could help you leapfrog the competition.

Today, we’re going to discuss TF-IDF, which we find insanely exciting. We pride ourselves on staying on top of SEO trends, without jumping on every little hack or trick that may or may not still be a thing in 12 months.

With that in mind, we can confidently say that TF-IDF is “for reals,” as the kids say… Do kids still say that? We have no idea. SEO trends are the only ones we follow.

In any case, TF-IDF is not an SEO fad. It’s a legit game-changer that very few people are taking advantage of. This is why you need to jump on this right now before “The Other Guys” do.

We’re going to take a deep dive into TF-IDF and explore what it is and what it is not. We will also show you exactly how it can help your site, and frame it all by putting it in the context of keyword research changes in recent years.

The Keyword Research That You are Probably Doing Now 

You’re probably using some variation of this content model right now:

  1. You use a keyword tool like Google Keyword Planner, ahrefs (our go-to), or SEMRush
  2. You find the keywords to match your customers’ problems and your company’s offering
  3. Weigh search volume against the competition score to find the keywords you want (see the quick sketch after this list)
  4. Pop all of this data into a big ol’ spreadsheet or content planner
  5. Create each respective blog with a primary and secondary keyword in mind
  6. Add them to your blog as naturally as possible, without stuffing them in
  7. Repeat for each respective blog
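
To make step 3 a little more concrete, here is a quick sketch in Python. The keyword numbers and the scoring formula are made up for illustration (this is not an ahrefs or SEMRush export, and there is no official “opportunity score”), but the idea is the same: reward volume, penalize competition.

# Illustrative only: invented keyword data and a simple volume-vs-competition heuristic.
keywords = [
    {"keyword": "how to paint stripes on a wall", "volume": 1900, "difficulty": 10},
    {"keyword": "wall paint ideas", "volume": 12000, "difficulty": 55},
    {"keyword": "painter's tape tips", "volume": 700, "difficulty": 5},
]

def opportunity_score(kw):
    # More search volume is better; more competition (difficulty) is worse.
    return kw["volume"] / (kw["difficulty"] + 1)

for kw in sorted(keywords, key=opportunity_score, reverse=True):
    print(f'{kw["keyword"]}: score {opportunity_score(kw):.0f}')

However you choose to weight it, the output is the same thing your spreadsheet gives you: a shortlist of keywords worth going after.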

To be clear, this system works! This is some rock-solid SEOing. If you’re doing this now, you’re ahead of most businesses out there today. For god’s sake, more than a third of surveyed businesses are still keyword stuffing.

If you’re using the method above and your competition is also doing it, it’s a battle to see who can do it better.

Allow us to show you how to do it better.


What is TF-IDF?

TF-IDF stands for “Term frequency–inverse document frequency.” 

How does it impact your keyword strategy? Simply put, the tried-and-trusted keyword tools we mentioned are fantastic at telling you the keywords you need to optimize to earn Google’s attention. TF-IDF takes this a step further by showing you the other words that you will need to legitimize this piece as authoritative and complete.

Performing a TF-IDF analysis reveals the relevant words used in the top 10 results for whatever keyword you’re going after. It can show you words that could be viewed as conspicuous by their absence, in the eyes of Google.

Let’s say you’re writing a blog on the Avengers. You would probably use keywords like Iron Man, Hulk, Thor, and Black Widow. However, Google’s algo may not see your piece as truly authoritative if you’re not using words like superhero, comic book, hammer, or Marvel. But, doing some TF-IDF research would show you that 10/10 of the highest ranking blogs on The Avengers all have those other words. The data also shows you exactly how frequently each word is used in each blog.
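
If you’re curious what’s under the hood, here is a minimal sketch of the classic calculation in Python. The three toy “documents” stand in for the top-ranking pages (they are invented for illustration), and real tools tweak the formula with different log bases and smoothing, but the core idea holds: a term scores higher when it shows up a lot in one page and lower when it shows up in every page.

import math

# Toy corpus standing in for the top-ranking pages (invented for illustration).
docs = [
    "iron man and thor are avengers superheroes from marvel comic books",
    "the hulk is a marvel superhero who smashes things in comic books",
    "black widow is an avenger in the marvel comic book universe",
]

def tf_idf(term, doc, corpus):
    words = doc.split()
    tf = words.count(term) / len(words)               # how often the term appears in this page
    df = sum(1 for d in corpus if term in d.split())  # how many pages contain the term
    idf = math.log(len(corpus) / (1 + df)) + 1        # one common smoothed variant of IDF
    return tf * idf

for term in ["marvel", "hulk", "superhero"]:
    print(term, [round(tf_idf(term, d, docs), 3) for d in docs])

The tools we use below do this across the actual top-ranking pages for your keyword, so you don’t have to.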

If you want an insanely detailed and complex look at how this is calculated, we invite you to read this incredible piece.

For the purposes of this article, we’re not going to get into the granular minutiae of the algo. We’ll focus on its impact on SEO and the world of keywording as we know it.

How to Use TF-IDF With Your Keywords

Let’s say you wanted to rank for the term, ‘How to paint stripes on a wall.’ This is a pretty good mix of search volume and low competition. If you own a local paint or hardware store, this could be a good one to go after.

So, you would punch ‘How to paint stripes on a wall’ into your keyword tool (we used ahrefs, as usual) and get the following:

From here, you would look at the keyword ideas by volume and make ‘How to paint stripes on a wall’ your primary keyword and use secondary/tertiary keywords such as:

  • How to paint horizontal stripes on a wall
  • How to paint vertical stripes on a wall
  • How to paint stripes on a wall without tape

That’s a great start. Now, let’s take this to the next level by doing some TF-IDF analysis to see the other words that we should be using.

Today, we’re using Surfer’s keyword analyzer. There are other options out there, but we like Surfer’s simple layout and deep-dive insights.

First of all, let’s enter our primary keyword, ‘How to paint stripes on a wall’ to see the other words that we should be using in our blog.

From the Popular Words tab, we can see what words frequently showed up in the current top-ranking sites for this query. We can see: 

  • The #1 ranked result used the word ‘Tape’ 35 times (1.94% density)
  • The #2 used it 52 times (0.73% density)
  • The #3 used it 20 times. 

You can also see the breakdown for other words like ‘Paint’ and ‘Wall.’

Would you have written this article without the word ‘Tape?’ Probably not. But this gives you an idea of how often the top performers are using it.
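
A quick aside on those density numbers: assuming the tool defines density as occurrences divided by total word count, you can also work backwards to estimate how long those top pages are. For example, 35 uses at 1.94% implies a page of roughly 1,800 words, while 52 uses at 0.73% implies a much longer page of roughly 7,100 words, which is why a raw count on its own can be misleading. A tiny sketch:

def density_pct(term_count, total_words):
    # Term density as a percentage of the total word count.
    return 100 * term_count / total_words

def implied_length(term_count, pct):
    # Rough page length implied by a term count and a reported density percentage.
    return round(term_count / (pct / 100))

print(implied_length(35, 1.94))  # roughly 1,800 words for the #1 result
print(implied_length(52, 0.73))  # roughly 7,100 words for the #2 result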

Let’s see what phrases the highest-ranked articles have in common. We click the Popular Phrases tab.

Now, we see the multi-word phrases that appear in the top results the most frequently.

Some of these phrases won’t apply to your blog. ‘Reply Beth’ was used in the #2 blog three times, but likely makes no sense in yours. But, you can see other terms that you should be using, like ‘Base Coat’ or ‘Paint Stripes.’

Let’s drill down a bit further. 

The Common Words tab will show you which words showed up in all 10 of the top 10 results. In this case, you can see they all used the words: 

2019, wall, colors, color, paint, tape, time, stripes, painting, walls, painter, measure, base, stripe

It’s interesting to see that you may not have included ‘2019’ in your blog, but every page ranking in the Top 10 did.

Now you can do a similar drill-down with the Common Phrases tab.


You can see that 9 out of the Top 10 used the terms ‘Paint stripes’ and ‘Painter’s tape.’ Your blog probably would have too, but it’s good to know. It’s pretty cool stuff and shows you a more complete view of what you’re trying to rank against.

From here, you can see that if you and your competition each write an article about how to paint a stripe on a wall, adding the word “2019” could possibly give you the edge you need to outrank them… even though it may not have shown up in your initial keyword research.

How to Use and Implement TF-IDF Data

As you can see, TF-IDF data doesn’t replace your old keyword data, it gives it a nitrous oxide boost.

So, now what? What do you do with all of these new insights? You can see that the word ‘Tape’ makes up 1.94% of the copy in the top-ranking blog. Does that mean you’re going to make sure that you use it 19 times in 1,000 words? Or even bump it up to 20?

No, please do not do that! Unless you want to frustrate your reader and publish a truly terrible blog. 

Think of this data as more of a checklist to ensure you’re telling a complete story, and creating something Google’s algo will see as comprehensive and authoritative.

Do not look at this data and say:

“We’re only using the word ‘Tape’ 6 times, we need to bump that up or we won’t rank.”

Instead, look at it and say:

“We’re already using the word ‘Tape,’ which is good. We’ll try to add it more where it makes sense. It looks like we should also pepper in the terms ‘Base Coat’ and ‘2019’ where we can.” 

Like traditional keyword stuffing, trying to awkwardly shoehorn these words in where they don’t belong will do 3 things:

  1. Kill the quality of the writing
  2. Frustrate the human reader
  3. Tip Google off to the fact that this is a not-so-good piece 

✖ The Wrong Way To Use This Information 

Create a finite list of words you need to use in your content, and use them according to the density your research has revealed.

✅ The Right Way to Use This Information

Use this data to identify any gaps in the story you may be telling (i.e. words you’re not using) to create the best odds for your content to rank.
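
To make the checklist idea concrete, here is a minimal gap-check sketch in Python. In practice the suggested terms would come out of your TF-IDF research; the term list and draft text here are stand-ins for illustration.

# Stand-in list of terms surfaced by TF-IDF research (illustrative only).
suggested_terms = ["tape", "base coat", "paint stripes", "2019", "painter's tape"]

draft = """Start by taping off your wall and applying a base coat.
Once it dries, paint stripes between the tape lines and peel carefully."""

def term_gaps(text, terms):
    # Return which suggested terms never appear in the draft (case-insensitive).
    lowered = text.lower()
    return [t for t in terms if t.lower() not in lowered]

print(term_gaps(draft, suggested_terms))
# ['2019', "painter's tape"] -- topics to consider covering, not words to stuff in

Anything the check flags is a prompt to ask, “Did we skip a sub-topic?”, not a quota to hit.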

Why You Want This Data Before Your Competition

If you’re currently embroiled in a competitive battle over spots in the SERPs, this is the leg-up you were looking for.

TF-IDF Data can help you make more informed SEO decisions and help you create better content. Or, you can use it to identify why some of your content creation work hasn’t paid off and you’re not seeing the traffic or ranking wins you had hoped for.

We sunk a good amount of time and effort into a few long-form pieces of content, only to see some disappointing results. We were left scratching our heads a bit. These were (in our humble view) high-quality pieces, with valuable and original content. We meticulously researched and used primary and secondary keywords.

However, after a quick TF-IDF analysis, we discovered a few gaps. We saw that the top-ranking posts that we were trying to overtake were more complete, and they had several terms that we simply did not.

This research uncovered sub-topics that we did not cover. It was a real lightbulb moment.

Simply put, these are the fresh and deep insights you need to claim SEO space from your competition. You can use it for a boost to leapfrog them in the rankings and send more traffic to your site. 

Vicious SEO From Outta Nowhere

If you didn’t know about TF-IDF’s value in SEO, you could be left staring at your reports wondering how your competition came from out of nowhere to overtake you. 

That’s because the changes you make with this data are small and subtle. They’re almost imperceptible. People can see that you suddenly started adding (or retroactively adding) keywords to your title tags or headings. That’s an obvious change and it’s a clear sign of why you/your competition could be gaining ground in the rankings.

But simply adding a few (seemingly benign) words to your copy? Even a hummingbird couldn’t catch that work.

The Evolution of Keyword Research Over the Years

We truly believe that TF-IDF is not just a fad, but a legit turning point in the journey to a more evolved internet. For the past few years, Google has sought to reward the people and companies who are offering the most complete user experience.

To give you a better idea of where we are going, it may be helpful to take a look at where we’ve been.

Here is a brief overview of the history of SEO and keywording.

1991-2000: The Wild West

The early internet seems laughable to us now. Dial-up modems, AOL CDs, and webpages that looked like they were made in a Word Document.

There was no real law in this wild, wild west. The entire world was trying to figure out what this newfangled internet thing was and businesses tried to figure out how to ride this wave.

Many of the spammy techniques we still (sigh) see being used today were born in the early days … because they actually worked back then. Keyword stuffing and getting meaningless backlinks could actually help you!

It’s important to remember that we didn’t have to worry about Google updates, because Google hadn’t taken over the world yet. Google wasn’t even founded until 1998. With no de facto choice, search engines were a matter of personal preference and you could choose from:

  • HotBot
  • Altavista
  • Excite
  • WebCrawler
  • Ask Jeeves
  • Ask.com 
  • Yahoo

The content was pretty much all text-based. Our dial-up speeds were incredibly tedious, so easily consumable videos and pictures were not even close to on the radar yet.

Keyword research was very primitive, as was optimization. There was an overall feeling of, “If it works, keep doing it,” with no real playbook besides the one you wrote.

Spammers, keyword stuffers and other black hats were able to thrive in the 90s because there was no central adjudicator to make them stop. Again, there were half a dozen prominent search engines and their algorithm updates were all very slow to roll out, so you could get away with a lot for a long period of time.

Even the biggest brands in the world were guilty of keyword stuffing and using link schemes. But more on that later…

2000-2007: One Search Engine to Rule Them All

Somewhere around the turn of the century (it’s hard to lock down an exact date), Google started to grow from a start-up company to a household name. You didn’t look things up anymore, you Googled them. This forced the other search engines out of the picture. Excite declared bankruptcy in 2001, and pretty soon, the rest fell off of the map. Google’s algorithm became the only one marketers needed to care about.

As Google grew in size and reach, it also became more sophisticated. This is when we first started to hear “content is king” as the old tactics of simply stuffing and spamming were now being shunned in a more evolved internet.

It wasn’t just small businesses that had to adapt. Major international brands were being called out for shady tactics to gain online traction. For example, BMW was completely removed from Google’s index for massive keyword stuffing and using doorway pages.

There was now a rulebook, and we all had to follow it. Keyword research now had to be much more complete and methodical, as did the way you used keywords in your content.

2008 – 2011: The Age of Enlightenment 

Google was evolving. Google’s Universal Search now blended results from different verticals (news, images, video, and more) into one streamlined experience, and users were finding what they wanted in less time than ever.

Marketers had to do a lot more to earn a piece of that traffic. However, they now had the tools to do it. Google’s algorithm updates were now a two-way conversation, as their Webmaster Central Blog would give us fair warning with transparent (yet still secretive) updates about what changes were on the horizon.

At the same time, marketers now had Google Analytics and other tools to tap into the keywords they needed to zero in on, and the playbook to do so.

Organic SEO started to take shape as both an art and a science.

2011-2014: Content is Truly King

Marketing teams no longer simply had to worry about “SEO.” There were now different aspects that you needed to blend to build an entire web presence.

Your keyword research would now have to include:

  • On-page SEO (Your website and blog)
  • Off-page SEO (Link building, guest posts and influencer marketing)
  • Social media (Facebook, Twitter, and YouTube)
  • Paid search (Google AdWords and PPC)

Google’s guidelines were now more than the law; they were The Commandments. Nobody was above them, as JCPenney was exposed for building their web presence via link schemes in 2011. The same year, the once-prominent Overstock.com disappeared from the SERPs they once dominated after they were discovered exchanging discounts for links.

2014 – Present: Living in a Mobile-first World

The widespread use of smartphones and voice searches has placed more importance on longtail keywords. Would-be customers are now Googling complete questions and the SEO wins go to the companies who provide the complete answers.

Smarter keyword research tools like ahrefs and SEMrush now help companies do more granular research, allowing them to pick their SEO battles more methodically. You can now shift focus from high-competition search terms (e.g., “iPhone”) to more specific, lower-hanging fruit (“cracked screen on an iPhone 6”).

The name of the game is organic. The Google algo is now rewarding people and businesses who earn traffic through organic content. At the same time, you’re now able to use your keywords in your content more organically, without having to use exact match.

We’re also now living in a mobile-first world, where your mobile site has to be better than your desktop site. This is placing a newfound focus on tapping into the exact search terms your target audience is searching for while they’re on the go. You also have to maximize every pixel of space on a tiny mobile site with SEO-rich (but not stuffed) copy and keyword optimized images.

Parting Thoughts

As we like to say around here, “The Google algo has never made more sense than it does today.” 

The TF-IDF algo also makes a lot of sense. However, to content creators, it’s less of an exact match formula and more of a checklist to ensure a given piece of content is covering all of the ideas your user will expect.

Looking at the density of TF-IDF data to find ‘supporting words’ is massively helpful. However, if you try to replicate those exact words, in those exact ratios, you will ruin your content. It’s just like how trying to stick to a strict 2.5% keyword ratio would ruin your content. Also, neither will lead to any SEO wins. 

Use these new terms as organically as possible. Like your actual keywords, it is probably best to know your supporting words in advance of even starting to write a piece. That way, you’re not scrambling to retroactively add them, which can lead to awkward sentences, choppy content and perceived ‘stuffing.’

As always, it helps to work with an experienced SEO firm who can guide you through thorough TF-IDF research and help you make it a part of a comprehensive SEO strategy. If you want to use TF-IDF to supercharge your keywords, click here to contact us at any time.

Backlinks Can Be Your Propellor or Your Anchor

Picture your website as a boat floating in the ocean. Now, picture your site’s backlinks as either a propellor or an anchor.

Good links are a propellor that can lead you to SEO success. They push the boat where you want it to go. They can boost your brand to the top of the Google rankings, sending a huge wave of traffic to your site and leads into your funnel.

Building the right type of links and a healthy backlink profile will get your boat into the harbour, where all the people (traffic) are.

However, bad links are an anchor that holds your SEO success in place and keeps you from moving. Even though you’re still trying to go forward, you’re stuck in place and wasting your fuel trying to move with this weighing you down.

In extreme cases, a whole lot of bad links or ‘black hat’ tactics can be a hole in your hull. They’re doing more than just holding you where you are, they’re sinking your ship.

That said, you can also pursue good links too aggressively. That type of link profile gets noticed by Google for the wrong reasons, which is sort of like burning your engine out while trying to go too fast for too long.

Navigating The Uncertain Waters of Link Building

Here is the trickiest part of it: You can’t simply chase the Google algorithm to find your SEO and link building success. Doing that is the fastest way to sabotage your results and drive yourself insane. Even Google says that you shouldn’t chase the algorithm.

5 years ago Google essentially said, “If you’re using guest blogging as a way to gain links in 2014, you should probably stop.” However, here we are in 2019 and if you look at the sites with the most traffic and highest SEO ranking, they all have guest posts. 

So, the interpretation of the message becomes, “You can’t do guest posts a certain way anymore.”

If we were to sum up our view of what link building entails in 2019, it would be this: You need to create quality content that earns links.

To help you wrap your head around all of this madness, today we’re going to take two very simple ideas (what is a great link and what is a bad link) and show you all of the factors that could impact their value.

The summations are based on our interpretation of the stone tablets that Google sends down from above (their Webmaster Central Blog and guidelines), as well as our day-to-day experiences as pretty damn good SEOs.

How to Earn Links That Propel You

Link building in 2019 is an umbrella term that encompasses a lot of different tactics. However, at the core of it all is one very basic principle: Not all links are created equal. 

There are great links and there are bad links, plain and simple. Google has a sophisticated algorithm to analyze these links and determine their value. Knowing the difference can be the key to making (or breaking) your SEO campaign.

Any given link is weighed based on literally hundreds of factors that are constantly changing. Google gives us a ballpark idea of what their algorithm updates will mean, without ever truly telling us what’s in the secret sauce. 

Figuring out the difference is how we spend most of our day, because it’s incredibly complex. Link building is mysterious and subjective. SEO nerds like us make our decisions based on our interpretation of Google’s updates and guidelines, combined with analysing the data.                                                       

There are several different software options and tools out there to help you assess the value of a given link. However, even their algorithms are not plugged into Google’s algorithm. They’re based on the interpretations and inferences of SEO nerds like us. That’s why two links that are both scored as a DA 50 by Moz could impact your site in two very different ways.   

At the end of the day, it’s the organic, authentic, and truthful method of earning links that the Google algorithm deems appropriate.

With that in mind, here are a few of the things that we look at when determining whether or not a given link can help a given site.

Links with a Strong Domain Authority Score

We are huge fans of Moz and their patented DA score. To us, this is simply one of the most predictive metrics of success with Google.

It’s often where we start, but far from the only thing we look at. It’s a nice and simple number, but it is still a subjective number. As we mentioned, you could be looking at two sites with a DA score of 50 (which is high), but these two links can each impact your site in two very different ways.


Think of it as house shopping. You could be looking at two houses for $500,000, but only one of them is right for your lifestyle. One is right next to your kids’ school and the other is on the other side of town.

The DA score is often where our research begins, but never where it ends.

Relevant Links

Quite simply, when Google’s algo sees that this site is linking to yours, will it be seen as a logical fit? Does it make sense that these two sites are connected? The relevancy of a link certainly appears to be a major factor in determining its value.

Let’s stick with the two sites with a DA 50 example we used above. Let’s say your company makes organic dog food. If one of those links comes from a dog breeder blog and the other comes from a dentist, the breeder link is clearly more logical and relevant, even though both sites have the same DA.

Well-Placed Links

There is a lot of data to support the theory that links appearing early in the copy (e.g., higher on the page) are given more weight than ones buried at the bottom.

At the same time, links placed in the actual body copy of a blog or page will almost always be given more weight than ones in the footer or boilerplate of a given page.

Links From New Sites 

Let’s say you’ve got two new links to your site, and they are:

  1. From a site that linked to you a month ago
  2. From a site that has never linked to you before

There is a strong chance that more weight will be given to the “new” link, as it sends signals to the Google algorithm that your site is gaining new respect from new sources. As a result, that “new” link may give you a bit more of a boost than the “old” link.

However, there are always lots of factors in play there. If you have a choice between a new link from a local business and a second link from the Huffington Post, you should likely go with the bigger fish.

Links From Trusted Sites

This is sort of a quality-versus-quantity thing. It’s trust versus traffic.

How reputable, respected and trusted is this site? The concept of TrustRank playing a major factor in any given site or link’s value certainly has a lot of traction.

A link from a site with a ton of traffic (with low trust) may not be as valuable as a link from a trusted site (with lower traffic).

There are a number of metrics and tools you can use to gauge the trustworthiness of a given site. ahrefs and SEMrush each have their own proprietary formula and score for a given link or site. However, it’s important to remember that these formulas (like anything else in link building or SEO) are a very, very educated guess.

Google obviously has their own formula to measure the trustworthiness of a given site, and we’re all just doing our very best to estimate and ballpark what it is.

However, trust is very clearly a factor.

Links From Fresh Sites

Regularly updating your site with fresh and original content is almost always a good thing. It’s showing the Google algo that you’re actively publishing good content, which conditions them to check back more frequently to crawl your site.

It’s also good to earn links from those types of sites. A link from a site that pumps out fresh and relevant content every single day is likely going to help you more than a link from a site that hasn’t published anything new in a long time.

Again, no single one of the above factors will make or break a link. It’s always a mixed bag, so to speak, and you have to look at all of them to see the big picture.

For example, a link from a site with a strong DA score but no relevance to yours can do more harm than good.

How to Avoid Links That Weigh You Down

Nearly everything Google has done to weed out bad links has been in the name of “providing value” for the searcher.

Unlike the factors we went over in the previous section, any single one of the infractions we’re about to cover could be enough to instantly devalue a link. If a potential link has any of these red flags, you do not want it anywhere near your site.

With that in mind, here are the specific tactics that Google has said will get your links devalued.

Automatically Generated Content

Have you ever clicked on a link and landed on a blog that appears to be pure gibberish?

That’s because it wasn’t written by a person, it was auto-generated by a bot. Or it was translated from another language by a bot, without human editing that would fix the syntax or other language nuances.

This is a shortcut that ‘black hats’ sometimes take to avoid going through the trouble of actually creating unique content.

Link Schemes

This can mean so many things. Link schemes may include:

  1. Directory or bookmark site links, AKA link farms
  2. Buying or selling meaningless links, or exchanging goods or services for them
  3. “I’ll link to you and you link to me” partnerships that add nothing
  4. Automated programs or services to generate links for your site
  5. Hidden links embedded in widgets
  6. Hiding links in the footer or template of a site
  7. Hiding links in the comment section of a blog or page

These tactics do not work and have not worked for some time. 

Pages With Little Content or Scraped Content

Google wants to see that the pages linking to your site are full of original content.

A bare-bones site, or a site with content lifted from other sources, is a red flag. This can also take several forms. Most of it is republishing copy, images or videos without adding new insights or value. 

One particularly offensive way to do it is an automated program that simply finds a synonym for every word in the sentence to create a “new sentence.” So, that last sentence would now be: 

“A mainly unpleasant method toward complete it is a robotic sequencer that only bargains a substitute on behalf of each expression in the judgment toward produce a “newfangled decree”

It reads like pure gibberish and Google’s algo notices.

Cloaking

Simply put, are you trying to send your users to a completely different page than Google would see when it crawls this site?

The classic example (which Google’s Matt Cutts uses in the video below) is making Google think you’re sending your browsers to a site about Disney movies, but you’re actually sending people to porn. 

Cutts stresses that, contrary to what you may have heard, there is no such thing as white hat cloaking.

This does not apply to looking at your user’s IP address and sending them to a French site based on their country of origin. This also does not apply to seeing that your visitor is coming from a mobile device and sending them to your mobile site.

Both of those examples are kosher, but trying to deceive Google’s bots by giving their IP address a different experience in any way is immediately cloaking.   

Sneaky Redirects

Are you doing something sneaky(ish) with your redirect links? Are you doing a bit of a bait and switch with your users?

This could include:

  • Google seeing one page, but users being redirected to something totally different, similar to cloaking
  • Desktop browsers going to a normal page, but mobile browsers getting redirected to a completely different domain

Hidden Text or Links

To be honest, this is pretty bush league and we’re amazed that people still do this.

It’s 2019, yet we still see people:

  • Hiding white text on a white background
  • Hiding text behind an image
  • Using CSS to position text off-screen
  • Shrinking a font size down to an imperceptible 0
  • Hyperlinking one small character, like a hyphen or dash

You’re better than that!

Doorway Pages

This is when you use a bunch of domains or pages designed to funnel users to a single page. This could mean creating multiple domains like:

  • BestPitasToronto.com
  • BestPitasScarborough.com
  • BestPitasBarrie.com

You can create and own all those domains. However, you need original content on each page and you can’t just send them all to the same domain. That’s bad for SEO.

Affiliate Programs (Without Adding Sufficient Value)

We get asked about this one a lot, and we invite you to read Google’s guidelines if you’re considering adding affiliate links to your site.

Adding affiliate links can be a great way to 100% legitimately monetize your site. However, doing it the wrong way could put you in Google’s bad books.

Here is the simplest version. Let’s say you have a golf blog. You want to earn a bit of income, so you add some affiliate links to the new Callaway driver at Amazon.

These links are good, as long as you’re bringing something new to the table. You need to provide an original review or description of this driver. You can’t simply copy and paste someone else’s content or provide very little of your own.

This will get you classified as a “Thin affiliate” in the eyes of Google because your content is too thin.

If you’re adding affiliate links, make sure you’re adding value. 

Irrelevant Keywords

This is good old fashioned keyword stuffing.

Google can detect words that don’t match anything else on the page and are out of context. They also ignore you if you use a keyword too many times on a page. You may also get dinged for adding meaningless city names that you want to rank for.

Malicious Behavior

This is straightforward. You simply never want your site to be associated with hosting or spreading malware such as phishing, viruses, trojans, or other badware.

Abusing Rich Snippets Markup

If you’re looking to rank for a snippet related to a keyword of yours, test your structured data using Google’s Structured Data Testing tool during development. You can also view the Rich result status reports afterward.

The Google Penguin and its Impact on Links

These waters have penguins!

No other Google update has had more of an impact in determining the value of a given link than the Penguin.

The Penguin was first introduced in 2012 as a way of cracking down on “black hat” tactics that people were using to give their sites an SEO boost. Some of these tactics included things like buying irrelevant links in directories or link farms, or hiding/ stuffing links on a given page.

Over the last 7 years, the Penguin has been rolled out in various stages and grown more sophisticated. Here is a look at how it has evolved.

April 2012: Penguin 1

Google announced that they are taking “another step to reward high-quality sites.”

They called out black hats, keyword stuffers and link schemers by name and said they’re taking big steps to weed them out. Of course, they were predictably vague about the details saying:

“While we can’t divulge specific signals because we don’t want to give people a way to game our search results and worsen the experience for users, our advice for webmasters is to focus on creating high quality sites that create a good user experience…”

However, they did confirm that this change will noticeably impact about 3.1% of queries in English, with queries in other languages being impacted slightly more. 

May 2012: Penguin 2

Google’s Matt Cutts tweeted that Google just pushed a big algo refresh that impacted another 0.1% of English queries.

October 2012: Penguin 3

Cutts tweeted that another Penguin refresh has impacted another 0.3% of English queries.

May 2013: Penguin 4 (AKA Penguin 2.0) 

Another big change. This time it was an algorithm update and not a data refresh, which is why it was dubbed Penguin 2.0. The scope of this one was felt more broadly, and Cutts announced that 2.3% of English queries would see a noticeable impact.

In an attempt to help webmasters keep up with what is changing, Google released a blog and video about what to expect.

October 2013: Penguin 5 (AKA Penguin 2.1)

Google announced an algo update + data refresh that impacted about 1% of English queries. 

October 2014: Penguin 6 (AKA Penguin 3.0) 

The Penguin 3.0 was actually a data refresh and not an algo update. However, it was rolled out worldwide over a few weeks and impacted about 1% of English queries.

This update was a big help to sites that were dinged by the previous Penguin updates, but did the right things to fix the problem. They would now start to see a recovery.

September 2016: Penguin 7 (AKA Penguin 4.0)

This was the big one.

Google announced that there would be no more updates, because the Penguin is now part of their core search algorithm and will update organically. There would be no more layered releases that incrementally impact sites; it was now real-time.

The good news was that these real-time updates meant that if your site was previously penalized by the Penguin, you could now fix the issues and see your results much, much faster.

This is where we sit today.

“CLEAN UP YOUR LINK PROFILE!”

Parting Thoughts

Don’t get stranded at sea! Earn the right type of links through high-quality content and organic links.

Good and organic link-building is future-proofing your success. Again, nearly every Google algo update rewards people who are doing the hard work of creating high-value content and earning great organic links, while penalizing the people who take shortcuts and have bad links.

Building links the right way is like being the kid who studied for the test. A new rule saying you can’t bring your phone into the exam room won’t affect you. A no-talking rule won’t affect you, nor will a rule about sitting too close to your neighbour. Meanwhile, kids that planned to cheat to pass are now screwed.

If you want to be on the right side of link-building or have any questions about how to do it, please click here to contact us.

Why SEO Experts and UX Experts Need to Be BFFs

In theory, the relationship between a UX expert and an SEO expert should be a contentious tug-of-war.

You might imagine the two of them looking at a whiteboard together, saying:

“We need to take this block of text away, it’s hurting the user experience.”
“Well, you can’t take it away. I need that block for SEO.”

They stare for a few tense seconds and then draw their lightsabers and duel to the death.

But it doesn’t have to be that way. In fact, it’s never been more important that they work together.

UX is absolutely one of the most important factors in predicting how likely your site is to rank. Good UX makes an SEO agency’s work easier to do. The Google bot’s crawling behavior now mirrors a human searcher’s behavior and it’s getting more human-like every single day. So, improving the user experience improves the Google bot’s experience.

At the same time, giving your users a good experience means they’re going to stay, click around, probably read some stuff, and maybe buy some stuff. All of these things are good for your SEO. At the same time, a bad user experience means your user is going to quickly get frustrated and leave you forever. This kills your SEO.

Here is a deep dive into why UX  people and SEO people need to get along.

The History of UX and SEO Working Together

In the earliest days of the internet, you could get away with a lot. You could achieve SEO success through keyword stuffing and spammy tactics. Simply having a web presence meant you were cutting-edge. However, those early websites were primitive and the user experiences across the board were pretty much garbage. Even the best sites for Fortune 500 companies relied heavily on the user having to do a lot of work to find what they wanted. The phrase “intuitive design” wasn’t really part of any conversation about building websites yet… the word “design” barely was.

The Wall Street Journal, circa 1997

However, web design started to evolve and mature. Webpages got more user-friendly as marketers looked at how people were interacting with their site. Providing the best possible experience became a competitive advantage. And thus, the art/science of UX was born.

Companies started to hire UX experts to audit and fix their websites or their E-commerce pages. The company would say, “We need you to tell us why people are abandoning their shopping carts before buying from us.”

The UX expert would then audit the entire site and find any of the barriers that could stand between the user and the desired action. They would find the problem(s) and tell the company something like “Your instructions are unclear here” or “There is too much lag here.” And, huzzah! The problem would be solved. The sales funnel would unclog, the leads would start pouring in, and the UX expert would be hailed as a savior.

After seeing the difference that a good UX can have on a site’s CRO, the best SEO companies and marketing agencies realized that it can also help a site’s SEO.

We started to realize that if we structure our site in a way that’s easy for users to navigate, we’re also making it easier for Google’s bots to crawl. And, if we provide a strong UX, users will be more likely to link to our content, which also helps our SEO.

Today, we know the value of having the SEO expert and the UX expert sitting right next to each other while a website’s design (or redesign) is being strategized. They need to work together in perfect harmony.

With that in mind, here are 4 ways that a strong UX can boost your site’s SEO.

1. Page Speed is Crucial to Both UX and SEO Success

It all starts here. Users have no time for a slow site and neither does Google’s algorithm.

Google has been saying that page speed has been a major factor in their search algo since 2010, meaning you will be rewarded for good speeds and punished for bad speeds.

They have stated that you’re likely going to lose half of your visitors if your site takes over 3 seconds to load.

They also published this infographic to illustrate how your bounce rate will increase as your load time increases.

Basically, slow page load speeds will kill the user’s experience. They won’t even stick around to see how the rest of your site is. They don’t care, they’re gone. And they are most likely going to leave your site and go right to your competitor.

Mobile Speed

In recent years, Google has announced two major updates to its search algo:

  1. They’re moving to mobile-first indexing. So your site had better be good
  2. The Speed Update factors in your mobile site’s speed. So it had better be fast too

When announcing the Speed Update, they said it “will only affect pages that deliver the slowest experience to users and will only affect a small percentage of queries.”

They added, “We encourage developers to think broadly about how performance affects a user’s experience of their page and to consider a variety of user experience metrics.”


The 2 Secret Reasons So Many Websites are Slow

As an SEO agency, we do quite a few site audits. One of the first things we do is take a look at the website’s speed to see if that is currently hampering their SEO. And we can confidently say that site speed is hurting at least 70% of the would-be clients we meet with. We actually sort of assume it’s a problem until we see otherwise.

Why is this the case? Two main reasons:

1. Most test it and forget it

Most companies will test their site once and assume it’s OK. However, they will then proceed to add a ton of new blogs, images, videos and plugins without testing it again. All of those new elements and assets will slow you down.

2. Most only test the home page

We were guilty of this ourselves, for a long time.

“But, I don’t get it. I ran a speed test on the home page and we’re fine.”

You can’t simply test your home page and assume that’s a site-wide score. You could easily have a fast home page with much slower interior pages. This means your blog pages and/or landing pages are running slow and your users are bouncing, while Google notices their slow speed and ranks you accordingly.

Run your speed tests on the entire site, particularly on your most important blog pages and landing pages.
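
If you’d rather script that than paste URLs into a web tool one at a time, here is a rough Python sketch that loops your key pages through Google’s PageSpeed Insights API. The page list is a placeholder, you’ll want an API key for more than a handful of requests, and you should verify the endpoint and response fields against Google’s current PSI documentation before leaning on it.

import requests  # third-party: pip install requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

# Placeholder URLs: swap in your own key landing pages and blog posts.
pages = [
    "https://www.example.com/",
    "https://www.example.com/blog/how-to-paint-stripes-on-a-wall",
    "https://www.example.com/landing/free-consultation",
]

for url in pages:
    resp = requests.get(PSI_ENDPOINT, params={"url": url, "strategy": "mobile"})
    resp.raise_for_status()
    data = resp.json()
    # Field path based on the v5 response at the time of writing; double-check the docs.
    score = data["lighthouseResult"]["categories"]["performance"]["score"]
    print(f"{url}: mobile performance score {score * 100:.0f}/100")

The point is less about the exact numbers and more about catching the slow interior pages that a home-page-only test never sees.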

2. User Engagement Signals to Google That Your Page is Good

This is a tad bit controversial. Google says that user interactions do not have a direct impact on how they rank your site. However, there is significant data to show that these things do matter.

For example:

Click-Thru Rates

UX experts can help you boost your click-thru rates. They create a seamless journey and make it easy for the user to see more and click deeper into your site.

This helps your CRO immensely, as a strong UX guides your lead from click to conversion in as few steps as possible, whether your end-goal is a free quote or an actual sale.

But… does click-thru rate matter when it comes to SEO? That’s very much up for debate, and we think bar fights have broken out over this one. Google has gone on the record and said it doesn’t impact how they assess and rank your site.

However, there are a lot of well-respected SEO minds that would beg to differ and have the data to prove it.


Take Moz’s Rand Fishkin, for example: he’s put together a compelling video that definitely points to more clicks contributing to a higher ranking.

Time on Page/ Dwell Time

Before we go any further, these two metrics are often used interchangeably. They’re both very important, but they’re not the same thing:

  • Dwell Time: The amount of time the user spends on a page after the click, before they return to the SERPs.
  • Time on Page: The amount of time a user spends on your page before going anywhere else.

In either case, good UX ensures that your visitors are there to stay for a while, and not clicking away because they’re confused or put-off for any reason.

This is very big in CRO and a major predictor of pay-per-click success… But does it really matter for SEO? Again, officially, Google says no. And once again, there are a lot of SEO agencies and individuals that will tell you otherwise.

For example, Ahrefs has described dwell time as a signal that the user found what they were looking for on your site, one that search engines are bound to notice.


And while Google hasn’t come right out and said these times matter, they have alluded to it, particularly when speaking about how RankBrain uses machine learning.

Nick Frost, the head of Google Brain, said: “Google is now integrating machine learning into that process. So then training models on when someone clicks on a page and stays on that page, when they go back or when they and trying to figure out exactly on that relationship.”

AI learning how human searchers interact with pages certainly points to Google factoring in our clicks and dwells. These are strong signals of whether or not we stuck around on a page long enough to read a blog or a service page, or clicked deeper into the site because we were engaged.

3. Good Site Navigation Means More Clicks and Crawls

Good UX design makes a site easier to navigate and encourages users to go deeper into it. Your UX expert can help you identify any issues with your information hierarchy and show you how to fix them. This may mean removing unnecessary elements that confuse or distract the user, or adding clearer cues on how to navigate.

This is absolutely crucial when creating a usable and navigable site. But, does it help SEO? The impact of good site navigation may be a bit more tangible than the other points we’ve already explored.

Simply put, if a human being has trouble figuring out what your site is all about, so will Google. A bad user journey will be mirrored by a bad journey for Google’s bots.

The same navigation elements that we use to make a site more navigable for human beings will also help create a roadmap for a search engine’s crawlers.

These elements include:

Links in the Top Navigation, Body Content, and Footer

Are the links in your top nav well named and sorted into logically related groups? Do your internal links encourage your user (or a robot) to go from one page to another? Or are your pages dead ends with nowhere to go?

Also, remember, this journey will not always happen from your home page. In fact, if you do a very good job owning various keywords, more of your users’ journeys will start with a dedicated landing page.

Let’s say you’ve built a page for Toronto’s CNE. Maybe you’ve done a great job of optimizing “How to Get Around Toronto During the CNE” and you’ve earned a high ranking and lots of traffic for that search term.

Your user travels from the SERPs to your landing page or blog. They can now see lots of useful information about streetcars and road closures near the CNE. But, where do they go from here?

You want to make it as easy as possible for them to click around some more, so you provide easily seen and accessible information about ride-share options and accommodations near the event. You will put those links in:

  • Your header
  • Your body content
  • Your footer

Google’s crawlers also need those links to get from one page to another.

There is no official word from Google on which links carry more weight, but it is widely believed that:

  1. Body content links hold the most SEO value. And the higher up in the content, the better
  2. Top nav links are second
  3. Footer links hold little-to-no weight or value

The same hierarchy is generally believed to apply to the weight of both internal and external links.
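For a rough, do-it-yourself look at where a page’s internal links actually live, here is a minimal Python sketch. It assumes the template uses semantic <nav> and <footer> elements (many themes don’t), and the URL is a placeholder, so adjust it to your own markup before reading anything into the numbers.

```python
# A rough audit of where a page's internal links live (top nav, footer, or body).
# Assumes the template uses semantic <nav> and <footer> elements; adjust the
# selectors if your theme uses something else. The URL is a placeholder.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

PAGE = "https://www.example.com/blog/sample-post/"
soup = BeautifulSoup(requests.get(PAGE, timeout=10).text, "html.parser")
domain = urlparse(PAGE).netloc

def bucket(a_tag):
    """Label a link by the region of the template it sits in."""
    if a_tag.find_parent("nav"):
        return "top nav"
    if a_tag.find_parent("footer"):
        return "footer"
    return "body content"

counts = {"top nav": 0, "body content": 0, "footer": 0}
for a in soup.find_all("a", href=True):
    href = urljoin(PAGE, a["href"])
    if urlparse(href).netloc == domain:      # internal links only
        counts[bucket(a)] += 1

print(counts)
if counts["body content"] == 0:
    print("No internal links in the body copy: this page may be a dead end.")
```

Pages where every internal link sits in the nav or footer are the dead ends we warned about above.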

Headings

Headings are important to both human users and robots for essentially the same reason: They show you exactly what this page is all about.

The use of keywords in headings and titles is obviously important for SEO. This is the first signal you send to your readers and bots to indicate what this page is about.

At the same time, good UX involves adding headings to your landing pages and blogs to make the content easily digestible and to break up any big blocks of text for users. Humans will click away from any page that appears to be nothing but a big block of text, because it looks boring and intimidating. You can take that exact same block of text and simply add some headings and paragraph breaks to make it more digestible. You will also see far more engagement.
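If you want a rough way to spot pages that are still one big block of text, the small sketch below pulls a page’s H1s and H2s and counts its words. It’s a heuristic only (the URL and the 600-word threshold are arbitrary placeholders), and it can’t tell you whether the headings actually describe the content.

```python
# A quick look at a page's heading structure: one H1, and enough H2s to break
# up the copy. A heuristic only; it cannot judge whether the headings actually
# describe the content. The URL and the 600-word threshold are placeholders.
import requests
from bs4 import BeautifulSoup

PAGE = "https://www.example.com/blog/sample-post/"
soup = BeautifulSoup(requests.get(PAGE, timeout=10).text, "html.parser")

h1s = [h.get_text(strip=True) for h in soup.find_all("h1")]
h2s = [h.get_text(strip=True) for h in soup.find_all("h2")]
word_count = len(soup.get_text(" ", strip=True).split())

print("H1:", h1s)
print("H2:", h2s)
if len(h1s) != 1:
    print("Expected exactly one H1 on the page.")
if word_count > 600 and not h2s:
    print(f"Roughly {word_count} words with no H2s: likely a wall of text.")
```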

4. The Mobile Experience is More Important Than Ever

As we stated earlier, Google is now all-in on mobile-first indexing. Even if you think most of your target users will view your site from a laptop or desktop, it doesn’t matter. Your mobile site still needs to be amazing. Google will go to your mobile site first, even if your target audience doesn’t.

The small screen on a mobile device is where both the UX and SEO gurus really earn their money. You have less space and almost no margin for error. This small space and mobile-first indexing now mean that a bad mobile UX is going to hurt your SEO more than ever.

Google has provided a number of resources to help you make sure your mobile UX is where it needs to be. For example, they have provided these guidelines, which give you a pretty deep dive into how to make sure your site is mobile-friendly.

They give you detailed instructions on how to fix some of the most common mobile UX issues, most of which also happen to be SEO barriers. These include:

  • Blocked JavaScript, CSS, and image files
  • Unplayable content
  • Faulty redirects
  • Mobile-only 404s
  • Bulky interstitials
  • Irrelevant cross links
  • Slow mobile pages
  • Small font sizes
  • Touch elements placed too close together


However, keep in mind that making sure all of these things check out or “pass” does not ensure success. Consider them the bare minimum you need to get a 65% on the exam. Just because Google’s guidelines don’t flag any problems with your mobile pages doesn’t mean your users won’t find any.

How to Get SEO Content onto a Mobile Page Without Ruining the UX

This is the classic game of tug-of-war we mentioned in the lead. The UX pro wants the pages to be lean with minimal copy. Meanwhile, the SEO pro says we need enough copy on the page for Google to crawl us.

Getting enough optimized SEO copy on a blog page is rarely a problem. It’s supposed to be text-heavy. People are there to read. But, things get a little more dicey when it comes to home pages and landing pages. The tendency is always to make them as light and lean as possible, with a minimalist approach to copy. However, you still need these pages to rank, so you still need SEO copy.

Can these two sides meet in the middle? Absolutely.

We have had great success with this formula. Imagine your mobile page divided into thirds:

The Top Third:
This is where you will put the most important parts: your logo, banner, and call-to-action (CTA). If users only see a few things, make sure it’s those three.

The Middle Third:
Use some copy to describe your organization as concisely as possible. You can also put your social proof here. These may be awards, badges, industry certification, big-ticket client logos, or a great testimonial. These are things that cement your trustworthiness and legitimacy to users.

The Bottom Third:
This is where you put your more SEO-focused copy. Avoid the bulky look by using collapsible drop-downs that reveal the rest of your text on tap. That way, Google can still crawl your copy, but users won’t see it unless they want to.
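The one thing to verify with this approach is that the collapsed copy still ships in the initial HTML rather than being injected later by JavaScript, since crawlers read the server-rendered source. Here is a minimal sketch of that check; the URL and the phrase are hypothetical placeholders for a sentence that only appears in your bottom-third copy.

```python
# Sanity check that the "hidden" SEO copy in the bottom-third drop-downs still
# ships in the server-rendered HTML, where crawlers can read it, rather than
# being injected later by JavaScript. URL and phrase are hypothetical.
import requests

PAGE = "https://www.example.com/"
PHRASE = "family-run HVAC company serving Toronto"  # a sentence unique to the collapsed copy

raw_html = requests.get(PAGE, timeout=10).text
if PHRASE.lower() in raw_html.lower():
    print("Collapsed copy is present in the initial HTML.")
else:
    print("Copy not found in the raw HTML; it may be injected client-side.")
```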

Final Thoughts

UX and SEO definitely have one thing in common: If you’re using 2016’s playbook for either, you’re way behind and your site will suffer.

Both areas are evolving at an incredible pace. Don’t get too attached to what is working today, because it may be very different a year from now. The shelf life for a trend in UX or SEO is measured in months, not years.

However, the secret to success is finding an SEO agency or a UX professional that can tell which of these trends are simply fads, and which ones will become the new best practices moving forward.

You can’t have one without the other. In a perfect world, you would work with both professionals at the same time while you’re in the planning stage of your website. That way, they can look at the whiteboard together and plan to do things the right way together. It can be much harder for either to work their magic retroactively on a site that has already been built.

Ideally, your dream team should include:

– An SEO expert
– A UX expert
– A designer
– A WordPress expert
– A copywriter/ content writer
– A PR expert/ link building expert for off-page SEO

SEO people should love working with UX people. They are the ones who set the entire site’s architecture up for success and make sure it’s appealing to both users and search engines. And UX people should love working with SEO nerds, because we’re the ones who make sure their great work isn’t being wasted on a beautiful site that nobody can find.

If you’re looking to build a new site and you have any SEO questions, or you’re looking for an SEO agency to hire, click here to contact us any time.

9 Website Redesign Mistakes That Kill Your Website’s SEO

A web redesign can either boost your SEO… or erase all the traction you’ve earned so far. The choice is yours.

You may opt to give your site a “facelift” to keep up with changing user preferences, web design trends, or Google’s evolving algorithm. Or, you may decide to blow the whole thing up and build it again.

In either case, the last thing you want is a very expensive and beautifully designed website that just sits there. It looks great, but nobody sees it. More importantly, it’s being ignored by search engines and not producing any leads.

“What went wrong? We put all this time and effort into this and we’re no further ahead. In fact, we’ve lost ground!” This is the cry of too many business owners after a website rebuild. Now, they’re looking for a new SEO agency in Toronto.

So what went wrong? In a lot of cases, it’s because a design was prioritized over SEO. This can’t happen. You and your team need to stay focused on SEO before, during and after the redesign process, or all that work you’ve invested in SEO over the years may be undone.

Let’s make sure your SEO rankings stay intact and your new site is set up to move the needle forward. Here are a few common mistakes that companies make.

Mistake #1: Removing or Renaming Pages That Rank on Google

You will probably choose to get rid of (or rename) some of your existing web pages. Maybe you’re moving everything over to a new domain name or you’re trying to simplify your existing website layout.

Whatever the case may be, remember that Google uses the URL addresses of these pages when ranking your website. If you suddenly change the URL, you’ll have to start from scratch when it comes to SEO. Each page goes back to square one.

Your best bet is to use a 301 redirect when deleting or moving web pages. This tells Google and other search engines that the page has permanently moved to a new address, so you won’t lose your search rankings.

When moving pages to a new domain, this should be fairly straightforward. But, if you’re deleting pages that rank well on Google (more on that later), redirect them to the most relevant page of your new website.

If you’re simply cutting certain pages without offering a replacement, you’ll have to say goodbye to those search rankings.

And let’s not forget about backlinks, still one of the most important factors for SEO. They account for a huge share of your SEO clout. But if you change or delete the pages they point to, that equity goes with them.

Users that click on these backlinks will be greeted by a 404 page, which renders your link building efforts null and void.

Again, you’ll need to use a 301 redirect to salvage your existing backlinks. Users that click on those links are redirected to your new website instead of hitting a dead end.
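A simple way to make sure those redirects actually behave is to request each old URL without following redirects and confirm you get a single 301 hop to the intended new page. The sketch below does exactly that; the redirect map is hypothetical, and in practice you would build it from your old sitemap and your best-ranking, most-linked pages.

```python
# Verify that each old URL returns a single 301 hop to the intended new page.
# The redirect map is hypothetical; build yours from the old sitemap and your
# best-ranking, most-linked pages.
import requests

REDIRECT_MAP = {
    "https://www.old-domain.com/services/seo-toronto/":
        "https://www.new-domain.com/seo/",
    "https://www.old-domain.com/blog/bert-update/":
        "https://www.new-domain.com/blog/google-bert-update/",
}

for old_url, expected in REDIRECT_MAP.items():
    resp = requests.get(old_url, allow_redirects=False, timeout=10)
    status = resp.status_code
    location = resp.headers.get("Location", "")
    ok = status == 301 and location.rstrip("/") == expected.rstrip("/")
    print(f"{'OK ' if ok else 'FIX'}  {old_url} -> {status} {location}")
```

Anything flagged here is either missing a redirect, using a temporary 302 instead of a 301, or pointing somewhere you didn’t intend.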

Mistake #2: Not Checking Internal and External Links

Your new website needs to be navigable. But when you’re working in the development environment, things get lost.

The development environment is a separate workspace where web developers and SEO professionals build the new website before it’s live and accessible to users. Things are bound to get lost in the shuffle as you move your new website from the development environment back to the live server.

For starters, you’ll probably use separate URL addresses in the dev site (development site) from those that eventually get published on the live server.

For instance, one of your web pages might be listed as “domain.com/client/products” on the dev site. Yet, on the live server, the web page shows up as “domain.com/products.”

This means you could be looking at a complicated puzzle of missing or broken links when you finally go live. Some internal links may appear as external links. They’re still pointing to the domain used in the development environment. It’s messy.

That’s why it’s so important to check all internal and external links before you launch your new website. Don’t worry! You don’t have to do it manually. Use a web crawl tool like Screaming Frog or SEMrush.

These programs quickly crawl and test every link on your website to make sure they’re all working. When you’re done, spend some time navigating the website to make sure everything looks, and works, the way it’s supposed to.
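As a lightweight complement to a full crawl, the sketch below scans a few key pages for links that still point at the development environment or that no longer resolve. The dev hostname and page list are hypothetical placeholders.

```python
# Scan a few key pages for links that still point at the dev environment or
# that no longer resolve. A complement to (not a replacement for) a full crawl
# in Screaming Frog or SEMrush. The dev hostname and page list are placeholders.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

DEV_HOST = "staging.example.com"
PAGES = ["https://www.example.com/", "https://www.example.com/products/"]

for page in PAGES:
    soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")
    for a in soup.find_all("a", href=True):
        href = urljoin(page, a["href"])
        if urlparse(href).netloc == DEV_HOST:
            print(f"DEV LINK     {page} -> {href}")
        elif href.startswith("http"):
            status = requests.head(href, allow_redirects=True, timeout=10).status_code
            if status >= 400:
                print(f"BROKEN ({status})  {page} -> {href}")
```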

Mistake #3: Not Testing the Website’s Overall Functionality

Just like all those links, don’t forget to test the website’s overall functionality.

This means testing all:

  • Input/lead forms
  • Interactive programs
  • Videos
  • Slideshows
  • Any other features on your new website

Users don’t like broken things. If something isn’t working properly, your users will abandon your website in seconds. This ruins your SEO efforts. If they land on your site from a search engine results page and leave because something is broken, Google will notice.

We tend to build entire web pages around individual features so users don’t feel overwhelmed; the user focuses on the task at hand and tunes everything else out. But if that one feature isn’t responding, there will be nothing else keeping users on the page. Make sure all your features are working properly, so your users can interact with your website with ease.

Mistake #4: Overlooking the Little Things: Renaming Images, Title Tags and Meta Descriptions

Don’t forget to pay attention to all those tiny details. They can make or break your search rankings.

On-page optimization is the core of a successful SEO strategy. If you’re not familiar with the dos and don’ts of on-page optimization, take a look at Google’s official SEO Starter Guide.

You may have already optimized the individual pages of your old website, including:

  • Title tags
  • Meta descriptions
  • Alt text for images

But if you change this information when launching your new website, your search rankings could take a hit. If some of your pages were ranking well, leave this information as it is.

One of your web designers might speed past this step and rename one of your images “new image,” instead of using the alt text from the old website that earned you some SEO wins. If you decide to change some of your metadata, make sure it’s up to Google’s standards.
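One way to catch this before launch is to compare the title tag and meta description of each old page against its replacement. The sketch below checks a single hypothetical URL pair; in practice you would loop it over your full old-to-new page mapping and extend it to image alt text.

```python
# Compare the title tag and meta description of an old page against its
# replacement so nothing that was ranking gets quietly overwritten.
# The URL pair is hypothetical; loop this over your full old-to-new mapping.
import requests
from bs4 import BeautifulSoup

def metadata(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    desc = soup.find("meta", attrs={"name": "description"})
    description = desc.get("content", "") if desc else ""
    return title, description

old = metadata("https://www.old-domain.com/products/widgets/")
new = metadata("https://www.new-domain.com/products/widgets/")

for label, before, after in zip(("Title", "Meta description"), old, new):
    verdict = "unchanged" if before == after else "CHANGED: review before launch"
    print(f"{label}: {verdict}\n  old: {before}\n  new: {after}")
```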

Mistake #5: Letting Search Engines Index the New Site Before It’s Live

Like an artist painting a masterpiece, you don’t want anyone to see the work in progress.

And you don’t want Google seeing your new website before it’s ready for its big debut. But Google might have other plans.

Unless you insulate your new website from Google’s robots, they might crawl it while it’s still under construction.

This means Google may index two separate versions of your website, which will frustrate even the most patient web designer. Your links will all be out of whack, with some pointing toward the old website and some pointing toward the new one. Again, it’s a mess, and untangling it yourself is awful.

To avoid this nightmare scenario, you can:

  1. Build your new website using a test domain
  2. Hide your website from Google’s robots
  3. Combine the two for even more peace of mind

If you want to build your new website on a test domain, choose a domain name that’s never been used before. Something like “www.skljgkllk.com” is very unlikely to be found, since nobody will be linking to it.

Once you have a test domain, disallow Google’s robots with your robots.txt file (most platforms, including WordPress, have a setting that discourages search engines from indexing the site). Set up an empty index page so your test website isn’t connected to your old website. Finally, you may even want to password-protect your test website while it’s still in development, so Google can’t access it without your permission. A quick way to double-check all of this is sketched below.
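Here is a minimal sketch of that double-check, assuming the obscure test domain above: it confirms robots.txt disallows crawling and looks for a login requirement or a noindex signal. The checks are deliberately crude, so treat any unexpected result as a prompt to investigate manually rather than proof of a problem.

```python
# Confirm the staging site is actually hidden: robots.txt disallows crawling,
# and the pages either require a login or carry a noindex signal.
# The checks are deliberately crude; treat any unexpected result as a prompt
# to investigate manually. The staging URL is hypothetical.
import requests

STAGING = "https://www.skljgkllk.com"

robots = requests.get(f"{STAGING}/robots.txt", timeout=10)
rules = [line.strip().lower() for line in robots.text.splitlines()]
print("robots.txt blocks all crawling:", robots.ok and "disallow: /" in rules)

home = requests.get(STAGING, timeout=10)
print("Requires a login (401/403):", home.status_code in (401, 403))
print("Sends a noindex header:", "noindex" in home.headers.get("X-Robots-Tag", "").lower())
print("Mentions noindex in the HTML:", "noindex" in home.text.lower())
```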

Mistake #6: Forgetting to Let Google Index Your Website After It’s Live

We absolutely did not want Google crawling the site before it was ready. But now it’s ready, and we desperately want Google to crawl it.

Whether you’re working in WordPress or another website-building platform, run through this checklist (a quick verification sketch follows the list):

  • Reconfigure your settings so Google can crawl and index your new website
  • Update your robots.txt file to open the door for Google’s robots
  • Swap out your test domain for your real domain
  • Disable any passwords you might have used to hide your test website in the development environment
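And here is the mirror image of the pre-launch sketch: a quick post-launch check that nothing from the staging setup is still telling Google to stay out. The live URL is a placeholder, and the noindex test is a crude substring match, so use it as a prompt for manual review.

```python
# The mirror image of the staging check: after launch, make sure nothing is
# still telling Google to stay out. The live URL is a placeholder and the
# noindex test is a crude substring match.
import requests

LIVE = "https://www.example.com"

rules = [line.strip().lower()
         for line in requests.get(f"{LIVE}/robots.txt", timeout=10).text.splitlines()]
home = requests.get(LIVE, timeout=10)

print("robots.txt still blocks the whole site:", "disallow: /" in rules)
print("Stray noindex header:", "noindex" in home.headers.get("X-Robots-Tag", "").lower())
print("Stray noindex in the HTML:", "noindex" in home.text.lower())
print("Homepage loads without a password:", home.status_code == 200)
```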

Mistake #7: Not Optimizing for Mobile

Mobile traffic is now ahead of desktop traffic and this trend shows no signs of slowing down. Your new website needs to be responsive and mobile-friendly.

You may say, “Yes, but most of our target market won’t be using our site from a mobile device.” Fair, but don’t forget that Google has already started rolling out its mobile-first index policy, which means it will crawl the mobile version of your website when composing its search results.

Even if you don’t think users will care about your mobile site, Google still will!

Make sure the web browser automatically resizes your content for the specific device. Otherwise, your users will have to scroll left and right just to see the heading of the page. You don’t want your users to have to pinch the screen to zoom in on a specific piece of content. This means everything needs to be visible from the get-go.
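The usual mechanism for that automatic resizing is a responsive viewport meta tag. The sketch below checks a few pages for one; the page list is a placeholder, and passing this check is no substitute for actually loading the pages on a phone.

```python
# Check a handful of pages for the responsive viewport meta tag that tells
# mobile browsers to size content to the device. The page list is a
# placeholder, and passing this is no substitute for testing on a real phone.
import requests
from bs4 import BeautifulSoup

PAGES = ["https://www.example.com/", "https://www.example.com/contact/"]

for url in PAGES:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("meta", attrs={"name": "viewport"})
    content = tag.get("content", "") if tag else ""
    ok = "width=device-width" in content
    print(f"{'OK ' if ok else 'FIX'}  {url}  viewport='{content}'")
```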

Text should be large enough to read on smaller devices without zooming. You also don’t want two buttons sitting too close together, or your users might tap the wrong one, which is insanely annoying.

To help users see your content more clearly, use image expansion tools that blow up an image when a user clicks on it. This is especially important for e-commerce websites where users will want to see a larger image of the product before making a purchase.

And don’t forget to avoid full-screen pop-ups on mobile devices. While it might work for desktop users, clicking out of pop-ups can be a major pain on mobile devices. So the user leaves.

With all that in mind, make sure you test the mobile version of your website before you launch, using a tool like Google’s mobile-friendly test.

Mistake #8: Forgetting to Minify Your Code

Start a project with the right mindset so you don’t waste time fixing mistakes down the line.

Minifying your code improves the speed (and, by extension, the usability) of your new website. If you’re unfamiliar with the term, it essentially means shrinking your website’s code by stripping out redundant characters and processes, such as whitespace, comments, and unused code.

If you have a large website with hundreds of indexed pages, this is especially important! If you wait until the end of the redesign process to minify your code, it will cost you time and money, and it may even delay the launch of your new website.

Be practical when building your new website and start minifying from the start.
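To make the idea concrete, here is a tiny illustration of what minification does, applied to a CSS snippet with throwaway regexes. It is only a teaching sketch; real projects should minify with a proper build-step tool (or their CMS’s optimization plugin) rather than anything like this.

```python
# A tiny illustration of what minification means, applied to a CSS snippet:
# strip comments and collapse whitespace. Real projects should use a proper
# build-step minifier instead of throwaway regexes like these.
import re

css = """
/* Primary call-to-action button */
.cta-button {
    background-color: #ff6600;
    padding: 12px 24px;
    border-radius: 4px;
}
"""

minified = re.sub(r"/\*.*?\*/", "", css, flags=re.DOTALL)    # drop comments
minified = re.sub(r"\s+", " ", minified)                     # collapse whitespace
minified = re.sub(r"\s*([{}:;,])\s*", r"\1", minified).strip()

print(minified)   # .cta-button{background-color:#ff6600;padding:12px 24px;border-radius:4px;}
print(f"{len(css)} bytes -> {len(minified)} bytes")
```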

Mistake #9: Sacrificing Speed for Aesthetics

Your old site was ugly or outdated, so you upgraded and updated the look. But never sacrifice speed for aesthetics!

Sure, all those fancy graphics and background videos might look great, but they could dramatically slow down your new website.
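A quick way to spot the worst offenders before launch is to check the reported file size of every image on a page. The sketch below does that via Content-Length headers; the page URL and the 300 KB budget are arbitrary placeholders, and images whose servers don’t report a size are simply skipped.

```python
# Flag heavyweight images on a page by their reported file sizes.
# Assumes the server sends a Content-Length header; images without one are
# skipped. The page URL and the 300 KB budget are arbitrary placeholders.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

PAGE = "https://www.example.com/"
LIMIT_KB = 300

soup = BeautifulSoup(requests.get(PAGE, timeout=10).text, "html.parser")
for img in soup.find_all("img", src=True):
    src = urljoin(PAGE, img["src"])
    head = requests.head(src, allow_redirects=True, timeout=10)
    size_kb = int(head.headers.get("Content-Length", 0)) / 1024
    if size_kb > LIMIT_KB:
        print(f"{size_kb:.0f} KB  {src}")
```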

Why? Because a single one-second delay in load time can result in a 7% drop in conversion rates, and 40% of web users will abandon a website that takes longer than 3 seconds to load.

If you want your new website to be successful, you can’t afford to overlook the merits of speed. Google is driving users towards fast, responsive websites. Unless you speed things up, the competition will beat you to the punch.

Use these tips to speed up your website as much as possible before you launch.

Build it Once. Build it Right

When you’re building a new site, you basically have 3 choices:

  1. “We’ll worry about SEO after it’s live.”
    You don’t think you have the time or resources right now. But the new site goes live with no SEO value. You actually drop in the rankings as you undo what you’d previously earned. Now your site isn’t ranking or producing leads, and you scramble to retroactively optimize it… which may take months.
  2. “We’ll worry about SEO after we finalize the design.”
    Using the minifying example from above, you now have to fix hundreds of pages that were built without SEO in mind. That could be massive rework, and rework destroys budgets and delays deadlines.
  3. “We’ll worry about SEO right from the start.”
    Now you’re thinking about SEO right from the whiteboard stage. SEO and design work hand-in-hand to create something that is built the right way from the very beginning.

The site works out of the box, your previous SEO wins come with you to the new site, and you can start earning new clout right away.

Are you facing a site rebuild and want to make sure you do it the right way? We can help! Get in touch with us today for a free consultation.
