Copywriting Blog

Content creation isn’t an easy task. With technology advancing in huge strides each day, it’s getting more and more challenging for companies to create innovative, unique content for others to consume. There has been a shift of emphasis towards quality of content over quantity: you could have all the content distribution in the world, but it won’t matter one bit unless the content is good. There’s a difference between information and content – the latter has to be useful. Luckily, there is a range of content creation tools at your disposal to help you create engaging content that works. These are some of our favourites.

Meme Generator – Memes took the internet by storm a few years ago and remain a very effective way to make an idea go viral with a comical spin.

Visual.ly – This handy tool lets you create impressive infographics and other...

When trying to write perfect HTML, it is easy to make mistakes, which means the code may not validate correctly every time it is used. Google’s Webmaster Tools give you the ability to check your HTML code for errors, which has raised the question of how important validated code really is. Obviously, having valid HTML will benefit your site’s performance, but in practice not all code written can be perfect. Looking at this from an SEO perspective, does having invalid code affect your site’s rankings? This was the topic of the latest Google Webmaster Help video. Matt Cutts has once again stepped up and explained that it is best to validate. He says that valid code makes it easier when you want to upgrade, and also makes it easier to hand the code over to someone else. Validating code goes without...
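Full validation is best left to a dedicated validator, but you can script a rough well-formedness check yourself. The sketch below is a minimal illustration of the idea (not how Google’s or the W3C’s tools actually work); the `TagBalanceChecker` class and `check_html` helper are names of our own invention, built on Python’s standard `html.parser`.

```python
from html.parser import HTMLParser

# Void elements never take a closing tag, so they are excluded from the check.
VOID = {"area", "base", "br", "col", "embed", "hr", "img", "input",
        "link", "meta", "param", "source", "track", "wbr"}

class TagBalanceChecker(HTMLParser):
    """Collects mismatched or unclosed tags -- a rough well-formedness
    check, not a substitute for a real validator."""
    def __init__(self):
        super().__init__()
        self.stack = []   # currently open tags
        self.errors = []  # human-readable problem descriptions

    def handle_starttag(self, tag, attrs):
        if tag not in VOID:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()
        else:
            self.errors.append(f"unexpected </{tag}>")

    def close(self):
        super().close()
        # Anything left open at the end of the document was never closed.
        self.errors.extend(f"unclosed <{t}>" for t in self.stack)

def check_html(markup):
    """Return a list of balance problems found in the markup (empty if none)."""
    checker = TagBalanceChecker()
    checker.feed(markup)
    checker.close()
    return checker.errors
```

For example, `check_html("<div><p>hi</p></div>")` returns an empty list, while `check_html("<div><p>hi</div>")` reports both the mismatched `</div>` and the unclosed tags.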

A study conducted by Stone Temple Consulting has found that Google +1s don’t actually cause higher search-engine rankings. Their findings contradict what most people in the industry previously believed. This train of thought was initiated by Searchmetrics, Moz and others, who found correlations between Google +1s and higher search rankings. However, Matt Cutts (head of Google’s search spam team) claimed in a statement that he was looking for the “politest way to debunk the idea that more Google +1s lead to higher Google web rankings.” This conflict led Stone Temple Consulting to conduct the study by carrying out an actual measurement of causation between Google +1s and search rankings. Their key finding was that "Google Plus Shares did not drive any material rankings changes that we could detect." The president of Stone Temple Consulting, Eric Enge, details the findings of the study in an article on his website, Direct Measurement of Google Plus Impact on Search Rankings. Enge...

Google’s search spam team and its illustrious leader Matt Cutts have addressed concerns about using nofollow links on websites. In the video, Mr. Cutts explains that, typically, links that are nofollowed cannot actually hurt your website’s Google search rankings. There are exceptions, however. While it is true in most cases that nofollow links won’t negatively affect a website’s rankings, Google may take ‘manual action’ on nofollow links if you are using them for large-scale spamming. For example, if you are using such links to spam loads of websites in an attempt to ‘piggyback’ on their traffic, Google will take manual action to deter you. How do you tell if someone is using nofollow links in a negative way? In the video, Mr. Cutts explained that “if you are doing it so much that people know you, and they’re really annoyed by...
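For readers less familiar with the markup itself: a link is nofollowed by adding `rel="nofollow"` to the anchor tag, e.g. `<a href="/ad" rel="nofollow">`. The short sketch below is our own illustration (the `audit_links` name is invented, not a Google tool); it uses Python’s standard `html.parser` to list each link on a page and whether it carries the attribute.

```python
from html.parser import HTMLParser

class NofollowAudit(HTMLParser):
    """Collects (href, is_nofollow) pairs for every <a href=...> tag seen."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        d = dict(attrs)
        href = d.get("href")
        if href:
            # rel can hold several space-separated tokens, e.g. "nofollow noopener"
            rel_tokens = (d.get("rel") or "").split()
            self.links.append((href, "nofollow" in rel_tokens))

def audit_links(markup):
    """Return [(href, is_nofollow), ...] for all anchors in the markup."""
    parser = NofollowAudit()
    parser.feed(markup)
    return parser.links
```

Running `audit_links('<a href="/ad" rel="nofollow">ad</a><a href="/about">about</a>')` returns `[("/ad", True), ("/about", False)]`, making it easy to see at a glance which of a page’s links pass equity and which do not.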

As Google continues its crackdown on low-quality content, you may notice a warning in your Google Webmaster Tools account stating that your site has ‘thin’ content and that you’ll have to correct this if you want success in Google’s search results. ‘Thin content’ refers to content that is non-original: product descriptions taken directly from the manufacturer, content that’s found all over the internet, or even a near-empty page with very little on it. Matt Cutts, head of Google’s search spam team, has stated that affiliate sites are an example of how content placement can go wrong. While there’s nothing wrong with affiliate sites as such, they still need to justify their existence by adding value. Cutts also criticised article syndication for the proliferation of thin content. This involves taking articles from free article sites, like Wikipedia, which...
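One common way to spot near-duplicate (and therefore potentially ‘thin’) copy on your own site is word-shingle comparison: break each page into overlapping word windows and measure how much two pages’ window sets overlap. The sketch below is a generic illustration of that technique, not anything Google has published; the function names are ours.

```python
def shingles(text, k=5):
    """Return the set of k-word shingles (overlapping word windows) in text."""
    words = text.lower().split()
    if len(words) <= k:
        return {" ".join(words)} if words else set()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b, k=5):
    """Jaccard similarity of the two texts' shingle sets (1.0 = identical)."""
    sa, sb = shingles(a, k), shingles(b, k)
    union = sa | sb
    return len(sa & sb) / len(union) if union else 0.0
```

A product page that simply pastes in the manufacturer’s description scores 1.0 against it; a genuinely rewritten description scores much lower, which is one crude way to prioritise which pages to rework first.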

While using Google is possibly the best way to get answers to quick questions, some search results will leave you looking for more information. Google have responded by implementing a new search result feature: in-depth articles. Google’s research suggests that 10 per cent of all search queries are related to in-depth information, hence the decision to include an in-depth section in the middle of the search results page. These results will be ranked algorithmically, displaying extended, informative and authoritative content. In-depth content won’t be featured for all search results, and exactly how this is decided is not yet known. It does raise the question of how Google will handle sensitive topics such as abortion or conflicts. Will they display extended articles for or against? Or a combination of both? Google staff member Pandu Nayak wrote in a blog post that the in-depth results will hopefully encourage people to invest in creating useful content that...

Since January 2013, Google has been asked to remove over 100 million links to pages deemed to be in breach of copyright. In 2012 there were approximately 50 million link removal requests, which indicates that publishers are trying harder than ever to crack down on piracy in 2013. While copyright holders are trying to reduce piracy, it is widely seen as a futile exercise: when links are taken down, the material is immediately uploaded elsewhere, creating a never-ending game of cat and mouse. The problem is primarily down to the fact that there is no central server that can be shut down to stop piracy. The main websites that pose problems for copyright owners are person-to-person sharing websites such as filestube, zippyshare, mediafire and rapidshare. The takedown notices are mostly made by third parties acting on behalf of the copyright owners. Last month alone saw 14 million link removal requests to...

Bing is the first search engine to show pop-up child abuse warnings to users searching for illegal content. The move follows advice from the UK government, which is also considering an opt-in scheme for legal adult content; so far Bing is the only search engine to take up the recommendation. Searchers who are about to see such content will now be greeted with a message that says: ‘Warning! Child abuse is illegal.’ The warning also features a link to help and advice as well as a reporting tool. The Child Exploitation and Online Protection Centre (CEOP) came up with a list of key terms that Microsoft has integrated into its Bing Notification Platform. If a search user tries to access the terms on the list, the Notification Platform will be activated and the user notified that the child abuse content they are attempting to access is illegal. They will...

Long-tail keywords are specific, topic-related phrases that cater to niche markets. They can be broken up by a full stop or a comma and can even spread across several sentences. They are definitely not a new concept: long-tail keywords have been around for a while, but it’s only recently that SEO marketers have begun paying attention to their utility. The reason for their resurgence in SEO marketing is mostly Google’s Panda and Penguin updates. Google is now scrutinising the quality of every website’s link profile, and webmasters are accordingly going back to long-tail keywords. Long-tail keywords work better than short ones for a variety of reasons: they account for a very large proportion of search traffic and, more importantly, there is far less competition for them. It’s true that the search volumes are lower, but if a more specific set of keywords is targeted, they will be more successful than...

ForeSee Results has just released the latest survey scores for search engines, compiled for the American Customer Satisfaction Index (ACSI). Search engines have received their lowest consumer satisfaction ratings in 10 years, with social sites scoring even worse. Google received its worst score in the entire history of the survey - 77 out of 100, down five points from last year. Yahoo and Bing also lost points compared to last year, and AOL scored the worst, with 71. ForeSee said the overriding reason for the drop was the view that “advertising is diminishing the customer experience, especially among search engines.” Given that there hasn’t been a massive decline in the relevance of search ads or a sharp increase in the number of ads on the page, this doesn't really make sense in terms of search marketing. The survey claims that...