News

SEO News

2017/03/22

How important is machine learning within SEO?


Some agencies still only cater to the conventional ranking signals, that's links, content marketing and so on. However, there's a serious ranking factor now that cannot be ignored.

That ranking factor comes in the form of machine learning, more commonly referred to as Google's new masterpiece, "RankBrain".

We have discussed this emerging ranking factor at great length within our own article section, because most Cardiff businesses, regardless of company size, stand to be affected by it.

That's to say, if your chosen online marketing agency is still just churning out content marketing at a rate of knots, you could be missing a trick.

What is RankBrain and why is it so important?

In early 2015 Google began its slow rollout of RankBrain, which, simply put, is an artificial intelligence system that's changing ranking positions as we speak, and helping Google to increase the quality and relevancy of its search results as well.

It's been widely reported on many respected SEO websites, such as Search Engine Land, that as of June 2016 RankBrain is being used on all Google queries. So if you are an SEO consultant, or you run an agency, it's not something you should be ignoring.

How does machine learning change the search results?

Agencies up and down the UK, and around the world for that matter, have always looked for quick ways to bring about results for their clients. Sometimes this means an agency uses what are referred to as black hat methods, which is totally the wrong way to go about things.

This could range from placing slightly too many keywords within a blog post or a main page, right through to mass link building and schemes to artificially increase the authority of a website. This is the wrong way to carry out SEO.
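To make "slightly too many keywords" concrete, here is a minimal sketch in Python (our own illustration; Google publishes no density threshold) of measuring how often a phrase appears in a page's text:

```python
# Illustrative only: a crude keyword-density check for a block of text.
# Google publishes no density threshold; this simply helps spot obvious stuffing.

def keyword_density(text, phrase):
    """Return the percentage of words in `text` taken up by occurrences of `phrase`."""
    words = text.lower().split()
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == phrase_words)
    return 100.0 * hits * n / max(len(words), 1)

sample = "cheap seo cardiff offers cheap seo for cardiff firms needing cheap seo"
print(f"{keyword_density(sample, 'cheap seo'):.1f}% of the text")  # 50.0% - clearly stuffed
```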

Machine learning brings another dimension to SEO: the work has to keep the end user of the information interested if the page or website is to rank well.

This means a website could follow every white hat SEO method that has ever been written, but if the information does not hold the user's interest, and Google's RankBrain detects this en masse, then the website is likely to start being replaced by results that better answer a searcher's query. The sites that ultimately answer the question best will win.

To many agencies, including our own here in Cardiff, this seems like a logical move by Google. After all, it's like releasing a piece of auditing software that constantly evaluates how happy users are with the results.

If a result that was once at the top of the search results has a very high bounce rate and low user engagement, then it's obvious that the results below it that do a better job of answering the query should be placed higher up the page.
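To illustrate the kind of engagement signal being described, here is a minimal sketch of a bounce-rate calculation (our own toy example; Google does not disclose how, or whether, it uses this exact metric):

```python
# Illustrative only: a toy bounce-rate calculation, not Google's actual method.
# A "bounce" here is a session that viewed exactly one page.

def bounce_rate(sessions):
    """Return the bounce rate as a percentage, given a list of sessions,
    where each session is a list of pages viewed."""
    if not sessions:
        return 0.0
    bounces = sum(1 for pages in sessions if len(pages) == 1)
    return 100.0 * bounces / len(sessions)

# Example: three sessions, one of which bounced after a single page.
sessions = [
    ["/home"],                       # bounced
    ["/home", "/services"],          # engaged
    ["/blog/rankbrain", "/contact"], # engaged
]
print(f"Bounce rate: {bounce_rate(sessions):.1f}%")  # Bounce rate: 33.3%
```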

Do conventional signals still matter?

Conventional ranking signals like links, content and on-site SEO methods still matter; a website still needs to prove it's important and worth listening to if it is to rank well. However, it's RankBrain that then picks up the task of monitoring how users interact with the information, and whether it actually helps them or is just a mere piece of marketing literature.

How does this change things?

We are an established digital marketing agency, and for us RankBrain changed very little in terms of how we carry out our SEO tasks, especially content marketing. That's because we recognised a long time ago that quality matters: if the work is to a high standard, the user of the information will notice, and in turn clients will receive better rankings. White hat methods still matter, it's just that they are now a minimum requirement. Instead, the work has to impress, show in-depth knowledge of the subject area and, most importantly, answer the user's questions clearly and concisely. This is the future of SEO, and we are at the forefront.

If you are a Cardiff or Newport based business and you need help increasing the visibility of your website in front of the right audience, why not give us a call on 029 21 760777 and talk to Ryan, Emil or Dan.

2017/03/21

Why a hacked website is bad news for SEO


Ways to help prevent your website from being hacked

Some business owners believe that a hacked website will only damage the website's appearance, and that a hacked site will therefore not see its performance impacted within the search engines.

However, a hacked site can result in a website being taken out of search altogether; a website can also be sent down the rankings because it is carrying spam content or links.

This is obviously bad news for any SEO consultant or agency working on the project, so we have put together a brief article explaining some of the measures you may wish to take to help prevent your website from being hacked.

It’s on the rise

Some business owners believe that hacking is undertaken by a person sat behind a computer with the intention of damaging a business's web presence. This can sometimes be true; however, a lot of hacks are carried out by automated scripts, with no human being directly behind the attack on your website.

This means that intricate and complex automated code is often used to find vulnerabilities within websites and exploit a site's weak areas.

Google has recently mentioned that hacks are getting even "more aggressive". What this can mean is that the code involved is getting even harder to detect and to fully remove.

Keep your website up to date

Make sure you keep your CMS up to date. If you have a WordPress website, this means updating the WordPress core as well as all the plugins you have installed.

Failure to update a plugin could mean that a hacker, or a piece of automated code roaming the internet, detects that you are using an old version of a plugin, one it knows how to bypass to gain control of all your website's settings.
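As a rough illustration, the sketch below compares a list of installed plugin versions against the latest releases reported by the public WordPress.org plugins API (the plugin slugs and installed versions are made-up examples; query your own site for the real values):

```python
# A sketch of checking installed plugin versions against the latest releases
# listed by the public WordPress.org plugins API. The installed-version data
# here is a hypothetical example.
import json
import urllib.request

installed = {"contact-form-7": "4.6", "wordpress-seo": "4.4"}  # hypothetical

for slug, current in installed.items():
    url = f"https://api.wordpress.org/plugins/info/1.0/{slug}.json"
    with urllib.request.urlopen(url) as resp:
        latest = json.load(resp)["version"]
    status = "up to date" if latest == current else f"UPDATE NEEDED (latest {latest})"
    print(f"{slug} {current}: {status}")
```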

Early detection system

Let's face it, we all lead pretty busy lives, and for many business owners a web presence can sometimes be taken for granted.

What we mean by this is that some business models rely 100% on new business coming through their website; if the website goes down, it could severely impact the business.

This means that a business's website should never be taken for granted: it should be checked at regular intervals for updates, and regular backups should be made as well.
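By way of example, here is a minimal sketch of a timestamped file backup in Python (the directory paths are hypothetical, and a complete backup would also need to cover the site's database):

```python
# A minimal sketch of a timestamped site backup. The paths are hypothetical
# examples; a real backup routine should also include the database.
import shutil
from datetime import datetime
from pathlib import Path

site_dir = Path("/var/www/example-site")        # hypothetical site root
backup_dir = Path("/var/backups/example-site")  # hypothetical backup location
backup_dir.mkdir(parents=True, exist_ok=True)

stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
archive = shutil.make_archive(str(backup_dir / f"site-{stamp}"), "gztar", site_dir)
print(f"Backup written to {archive}")
```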

Plus, the website should also be linked to a Google Search Console account, as Google is sometimes able to flag up malicious code and tell the webmaster that the website has been hacked.

Sometimes everything looks normal

Sometimes someone might give you the nod that your website is not acting as it should, yet when you type in your website's address, you think everything looks pretty normal.

However, it's only when you run a quick search within Google that you find loads of pages have been added, or that clicking your business listing redirects you to another website.

You should take action quickly to rectify such issues, so that the damage to your search engine optimisation is minimised.
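One quick check along these lines is to fetch your own pages and inspect any redirects they perform. Here is a minimal sketch using the third-party requests library ("example.com" is a placeholder for your own domain):

```python
# A minimal sketch: fetch a page and report any redirects it performs.
# An unexpected redirect to an unknown domain can be a sign of a hack.
# "example.com" is a placeholder for your own site.
import requests

response = requests.get("https://example.com/", timeout=10)

for hop in response.history:
    print(f"Redirected: {hop.url} -> {hop.headers.get('Location')}")

print(f"Final URL: {response.url} (status {response.status_code})")
if "example.com" not in response.url:
    print("Warning: page resolved to an unexpected domain")
```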

If you need further help and advice why not contact us today?

2017/03/20

Has your website been hit by the recent Google Fred update?


Google's Fred update: another strike at low quality websites?

It's been widely reported that a new update released by Google has further demoted many low quality websites. This update has been named the "Google Fred update", and we will explain how it has decreased the organic traffic of the many websites that have been affected.

What kind of websites got impacted by this new update?

This new update massively impacted certain websites' organic traffic, with some sites reporting a 90% loss since the update sprang into action.

The sites that have been impacted are reported to be low quality content sites; so yes, you guessed correctly, the sort of sites that don't have the user in mind.

What we mean by this is that people often visit a website with a clear objective, normally to purchase something or to find out information. So let's give a rather simplistic example: say somebody was searching for how to change a light bulb on a certain type of car.

Now, the person searching Google for such information most probably wants a clear and to-the-point article explaining how to install their car's light bulb easily. They don't want to be bombarded with advertising or directed somewhere that advertises another product or service.

So if the article or website they land on doesn't answer the question, and instead bombards the user with a product or service, it doesn't generate a fantastic user experience; it's more likely to annoy the user. It's this type of website that Google hates, and the search giant will want to promote high quality, relevant sites instead.

Is this the first update by Google to target low quality sites in this way?

Some website owners impacted by the Fred update may well be wondering whether penalising low quality websites is a new phenomenon.

The fact of the matter is, as most SEO agencies will tell you, that Google has been penalising low quality websites for a long time now; in fact, it's become somewhat the norm.

Google, and most other search engines for that matter, want to offer search results that are high in quality. That's to say, they want the user to go away having found the best quality websites, the sites that really answer the question they have to hand.

Google has released a ton of updates in recent years that target low quality sites; take, for example, the Google Hummingbird and Panda updates.

The Hummingbird updates are important to keep in mind because they increased the relevancy and accuracy of the search results. They also put a much greater emphasis on changing the SERPs so that the user was more likely to be happy with the information. This meant that many digital marketing agencies started to shift away from conventional SEO and towards offering high quality information that really added value to its users, so SEO played a secondary role to offering first class information. Add machine learning into the mix, and the only sites that really reach the first page now are the ones that offer great information or service.

Gone, therefore, are the days of just placing keywords within 400 words of text!

What kinds of websites did Fred impact the most?

Google's Fred update was said to most heavily impact websites where it was difficult to differentiate between the content and the ads, along with websites built for revenue generation using certain low quality methods. So if a website was low in quality and looking to make revenue from its information, it was the type of website affected the most.

It's important to remember that every website is unique (unless the work is duplicated, of course). This means that if you believe your website has been impacted, you should consult a quality-focused agency as soon as you can to diagnose the extent of the problem.

We have SEOs with tons of experience, so if you need quality advice why not contact us today? We offer the following services within Cardiff:

· Content marketing services
· Link building services
· Low cost monthly SEO packages
· An established agency that you can trust

2017/02/21

Google further clamps down on duplicated content


Duplicated content and how it can harm your SEO ambitions

Those of us who work within digital marketing will be well aware that Google simply hates duplicated content.

That's to say, whether your website has a small amount of duplication or mountains of it, it's well worth fixing the problem, so that your on-site work becomes unique, insightful and, of course, high in quality.

Within this article we are going to explain why your website shouldn't have any duplicated content, and we will also go on to explain Google's new clampdown on the issue.

Why is duplicated content such an issue?

Google's algorithm works to surface the highest quality results from the most authoritative sources.

As you can imagine, websites that have just exercised the copy and paste function won't rank highly within Google's results.

The fact of the matter is that duplicated content can trigger what's known as a Google Panda penalty, and that means bad news for any website owner.

This can also mean a website gets demoted within Google or even removed completely from the index. So make sure you pick your next agency wisely!

What’s Google’s latest clampdown all about?

Google has been widely reported to have entered into a new deal to further help demote pirated content.

This new agreement has been dubbed “The Voluntary Code of Practice”.

Google has already aimed to reduce duplicated work appearing at the top of its results through its extensive Google Panda algorithm changes; this new code of practice appears to be a further, supplementary step towards tackling pirated content.

It's not just Google that has stated it will be further tightening its grip on the issue, either; Microsoft, which operates the Bing search engine, has also been stated to be participating in the code of practice.

With these two search engine powerhouses stating their intentions to further demote pirated work, this should grab every website owner's interest.

How could this change how agencies or website owners optimise websites in the future?

As always, website owners should be looking to write content that is unique and high in quality. Not being duplicated is a minimum requirement; the content should also answer questions and provide information in the best way possible.

This will help you stay on the right side of Google's Panda and Hummingbird updates.

However, we thought we would throw in some extra checks you may wish to carry out to spot duplication issues on your own site.

How can I discover if my website has duplication issues?

Discovering duplication issues can sometimes be rather simple: you can either use a premium duplication checker like Copyscape.com, or you can use Google itself as a method of checking for duplicated work.

A simple way to check is to copy a line of text from your website, for example a product description, and paste it into Google. Obviously this has to be carried out for every page where you suspect duplication to be present.
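If you want to script that manual step, here is a minimal sketch that builds an exact-match (quoted) Google search URL for a snippet of your text, ready to open in a browser (the snippet is a made-up example):

```python
# Build an exact-match (quoted) Google search URL for a text snippet, so you
# can open it in a browser and look for other sites carrying the same words.
from urllib.parse import quote_plus

snippet = "our hand-finished oak table comfortably seats six"  # made-up example
url = f"https://www.google.com/search?q=%22{quote_plus(snippet)}%22"  # %22 is a quote mark
print(url)
```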

Google's search bots are so clever that they will have already crawled and indexed your website's page (provided it has been online long enough for a first index of that particular page to occur).

Once you have pasted your snippet of text into Google, you may discover that a number of websites carry the same chunk of text within the search results. In this case you need to fix the problem, and pretty urgently.

Finding duplication issues can sometimes be rather complex; for example, some product descriptions may contain duplicated content taken from multiple sources, so that the writing is a jumble of stolen content.
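For a rough automated comparison between two specific pages, here is a minimal sketch using Python's standard difflib module and the third-party requests library (the URLs are placeholders; a proper audit needs dedicated tooling):

```python
# A minimal sketch: compare the visible text of two pages and report how
# similar they are. The URLs are placeholders; real audits need more care
# (proper HTML parsing, checking many pages, and so on).
import re
from difflib import SequenceMatcher

import requests

def page_text(url):
    """Fetch a page and crudely strip HTML tags, leaving rough visible text."""
    html = requests.get(url, timeout=10).text
    return re.sub(r"<[^>]+>", " ", html)

a = page_text("https://example.com/product")         # placeholder URL
b = page_text("https://competitor.example/product")  # placeholder URL

ratio = SequenceMatcher(None, a, b).ratio()
print(f"Text similarity: {ratio:.0%}")  # a very high figure warrants a closer look
```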

To be sure your website hasn’t got any duplication issues you should contact a reputable search engine optimisation agency for help, like ours.

How can Ryan C Walsh Online Marketing help?

Duplicated content is a big problem for all SEO consultants to contend with; most agencies encounter a site with this sort of problem on a regular basis. Some web designers, in a rush to make a website look finished, may steal content from your competitor without realising how this can harm search engine optimisation efforts.

Therefore you may require an experienced set of consultants to help your business's online presence flourish. For more help and advice, why not contact us today?

Our digital marketing prices for online marketing services are some of the most reasonable around.

2017/02/06

Did your rankings change on the 1st of February? – Why certain methods of SEO are not worth using


Introduction

Every now and again it's reported that there has been a major ranking flux of some sort; this just means that Google has chosen to reshuffle the positions of certain websites.

Each time this happens, and there is a significant change in the rankings, everyone wonders whether Google has changed its algorithm.

Is it Google Penguin, Google Panda or an unknown update?

On the 1st of February 2017 there was a lot of talk about Google possibly refreshing its Penguin update.

Now, for most digital marketing agencies this would be a hot topic of discussion, but not at ours, and we will explain why that is.

We don’t follow the herd

There's always going to be a new way to carry out search engine optimisation; that's to say, somebody will always feel they have found a way to bring about quicker results than another agency can.

And to some clients this may sound like music to their ears, especially when a bunch of agencies have said something can't be done within a certain timeframe, and then just one person says they can do it quicker.

However, that person may be using poor practice, and it's these Google updates that catch out poor practice. So if the majority of agencies promise they can build three quality backlinks a month, and another states it can build one hundred a month, this should set the alarm bells ringing!

Quick gains are simply not worth it…

So February 1st 2017 marks just another entry in a long list of occasions where Google has punished a load of sites that used poor link building practices.

This should serve as another reminder to business owners who may be tempted by search engine optimisation services that are below par.

What was this latest update all about?

Many suspect that the latest update further penalises websites that have acquired links from spammy sites, or that are actively using dodgy link building methods.

Google’s Penguin updates are a reason why links should be built naturally and correctly.

One of the ways this can be achieved is by offering high quality and focused content marketing.

By also offering a first class experience on your website, people are more likely to find your website useful; and if they find it useful, you are more likely to obtain backlinks to your site naturally.

My website has dropped since 1st February 2017 – what should I do?

A number of sites will have dropped around this date as Google further refines its Penguin update.

You should contact a respected SEO agency like ours, or an independent consultant, who can then look at your backlink profile and decipher which links look poor in quality.

High quality websites are easy to spot with a trained eye, and we have highly experienced consultants who can help you.

SEO advice that is second to none…

We have experienced consultants with in excess of seven years' SEO experience; for further help and advice within Cardiff, do contact us.