This phrase refers to the traditional way in which search engine results were displayed. Once a query was entered, the search engine would return 10 blue links as the results. This method is extremely basic but ultimately laid the groundwork for the way search engine results are presented today.
10 blue links, as a phrase, is generally used today to refer to outdated search engine results pages & a basic layout. SERPs have been upgraded & improved a lot since the 10 blue links days, with Google offering the best example of that. Google now offers a myriad of results when you conduct a search, expanding on a simple list of relevant websites. Typically, a Google search will include elements such as relevant shopping options, a Google Maps result, a Google Business page & even image results.
If you're a business or organisation that relies on local customers to buy your products & services, then you'll want to know about the 3 Pack. The 3 Pack refers to a type of SEO that focuses on driving local, nearby traffic to your business. It is the listing of three businesses you see in the search results when you search for a locally relevant keyword such as "near me" or "near [location]".
These searches appear with a map above them that highlights where the businesses in the 3 Pack are located. Google interprets your search query & offers up the three Google My Business listings that may be most suitable, based on what it is you're looking for.
For example, if I lived in Manchester & wanted some sushi, I would Google "sushi in Manchester". In seconds, the search engine brings up a 3 Pack presenting three different sushi restaurants in Manchester that I might be interested in.
Sends users to a different URL from the one they clicked on. Unlike a 302 redirect, which is temporary, a 301 redirect is a permanent change. The term '301 redirect' is taken from the HTTP status code for this action.
Commonly, 301 redirects are used when a company has a new website under a different domain name & needs to ensure users who visit the old URL can still find it. Once a 301 redirect has been placed on a URL, that webpage is no longer accessible, as it will automatically send users to the new page.
Often, when a URL has garnered a high value in terms of its linking & ranking on Google, the owner won't want to lose that quality by simply removing the page. Instead, a 301 redirect can transfer the value of the original URL to the new URL to which users are being directed.
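As an illustration, a permanent redirect is often configured at the server level. A minimal sketch for an Apache .htaccess file (the paths & domain here are hypothetical):

```apache
# Permanently redirect the old page to its replacement,
# passing its link value on to the new URL.
Redirect 301 /old-page https://www.example.com/new-page
```

Nginx, or a CMS plugin, can achieve the same result - what matters is that the server responds with the 301 status code & a Location header pointing at the new URL.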
A 302 Redirect tells search engines that a page, or an entire website, has been moved somewhere else temporarily. This type of redirect is ideal if you want to briefly direct people to a temporary page that they can use, be it to get contact details, business locations, or to purchase products & services, while you work on building a new site or updating the current one.
Crucially, you should only use a 302 Redirect if you fully intend to restore your original website. Another handy use for a 302 is if you want to test a new page & glean customer feedback, without impacting the ranking & general SEO value of the existing page.
The difference between a 302 Redirect & a 301 Redirect is that the latter is permanent. You'd only use a 301 if you were closing or moving your website, or web page, for good, or at least for an extended period, say 12 months or more.
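The two redirects differ only in the status code the server sends back; the Location header pointing at the new URL works the same way in both cases. The codes can be inspected with Python's standard library:

```python
from http import HTTPStatus

# 301: permanent - search engines transfer ranking value to the new URL.
permanent = HTTPStatus.MOVED_PERMANENTLY
# 302: temporary - search engines keep the original URL indexed.
temporary = HTTPStatus.FOUND

print(permanent.value, permanent.phrase)  # 301 Moved Permanently
print(temporary.value, temporary.phrase)  # 302 Found
```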
The error code received when the page you've tried to reach doesn't exist. Broken links can occur when the webpage no longer exists or has been moved to another URL. This can happen if a 301 redirect hasn't been applied to the old URL, or the redirect hasn't been applied properly.
404 errors are quite common, as sites are moved all the time without the owners of pages linking to them ever being notified. When a user attempts to view the webpage via the broken link, the server will return the 404 error, notifying the user that the page no longer exists. Website owners can create custom 404 error pages that tell users what to do once they receive the message. Just like the 301 redirect, the 404 error got its name from the HTTP status code.
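On Apache, for example, a custom 404 page can be wired up with a single directive (the filename is hypothetical):

```apache
# Serve a friendly custom page whenever a requested URL can't be found.
ErrorDocument 404 /custom-404.html
```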
AEO stands for Answer Engine Optimisation & is a form of SEO that has gained greater popularity in recent years thanks to the rise in voice searches, & devices such as Alexa, Google Home, & Apple's HomePod. As more & more people use voice-assisted devices, the need for industries & sectors to adapt their marketing & SEO to accommodate them has grown. AEO focuses far more on one singular answer. This is because you're not viewing a screen; you're listening to the answer, so there can only be one response, not a list of six or seven.
AEO isn't going to replace SEO - billions of people are still going to search for things the old-fashioned way (if you can call it that) - but the prevalence of AEO is certainly going to increase. It may even come to match or surpass the number of searches made by typing out queries, especially as voice technology & AI get more sophisticated.
Artificial intelligence is intelligence displayed by machines, as opposed to the natural intelligence that humans & animals demonstrate. AI is a form of intelligence that doesn't involve emotions or consciousness. The term can also refer to any machine or piece of technology that displays particular problem-solving traits & has been shown to learn as it is fed new information.
The goal of AI is to allow machines to receive information & make rational decisions based on the data, rather than what we have now, where machines are just facilitators for our decisions & play no part in the process other than storing & displaying the information we've created. Machine learning is an associated term, & refers to the idea of computers learning & adapting to new data on their own.
Agile Content Development (ACD) is a methodology that looks to continuously improve & optimise content. Rather than just writing content based on data, publishing it, & seeing how the chips fall, ACD aims to tweak & change content based on requirements & search behaviour.
By continuously improving it, the content has a far greater chance of ranking higher, for longer, because it is being tweaked & kept current. ACD is a customer-centric methodology & aims to meet customer demands, queries, & intentions at different times.
Agile Content Development is split into four phases: Discovery, Briefing, Optimisation, & Measurement. By adopting this method, copywriters can enjoy real-time recommendations on keywords & topics that inform their content creations & ensure it is always optimised. ACD removes the guesswork & replaces it with knowledge.
ACD is something all copywriters & website owners should do to avoid their work becoming stale & outdated, & ranking for keywords that are no longer relevant or no longer get the search traffic they once had.
Ahrefs is a tool used by marketing agencies & businesses for thorough SEO analysis & to monitor backlinks. Ahrefs is made up of a range of different tools that can help people looking to rank for keywords, & monitor the performance of pages that have already been indexed by search engines.
Split into six parts covering everything from backlink monitoring to keyword research, Ahrefs is one of the most comprehensive SEO analysis tools out there.
Alexa Rank is a global ranking system that lists millions & millions of websites in order of popularity. The way this system works is that the lower the ranking, the better. Amazon calculates this ranking by examining the average daily unique visitors & the number of page views over the most recent three-month period. Alexa Rank should be thought of in the same way as Google Analytics & is Amazon's attempt to compete in this market. Ironically, the website with the best Alexa Ranking - 1 - is Google, which just showcases the breadth & power of this internet behemoth.
It is popular but isn't without its sceptics. While the ranking system may allow businesses to charge more for advertising & attract better quality guest writers, the data is limited to users who have the programme installed, so websites with extremely high traffic may be ranked poorly, despite having great results.
An algorithm is defined as a process or set of rules that are carried out by calculations & similar problem-solving operations. Algorithms are often carried out by computers because they are extremely complex & hard to understand. In SEO terms, an algorithm is the complex system Google uses to determine the ranking & return of the billions of pages it indexes every single day. The algorithm at Google is quite mysterious & relatively unknown by people who don't work there. However, things such as long-form content, ontology & long-tail keywords are favoured by it & are often rewarded with a high ranking.
Alt tags, otherwise known as alt text or alt attributes, are image descriptions written in HTML that inform search engines about the images you are displaying on your web page. This is important because search engine bots aren't very good at reading actual images, so by specifying alternative text & including a brief but accurate description of the image, you are giving web crawlers a better, clearer & more comprehensive description of your web page.
Often overlooked, alt tags can be optimised with proper keywords & descriptions to improve visibility on Google's image search while also improving indexing accuracy & improving content relevance.
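A well-written alt attribute is short, accurate & uses keywords naturally. A minimal sketch (the filename & wording here are hypothetical):

```html
<!-- Descriptive alt text a crawler (or screen reader) can use
     in place of the image itself. -->
<img src="blue-suede-shoes.jpg"
     alt="Pair of blue suede brogue shoes on a wooden table">
```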
Anchor text is the clickable text of any link, often denoted as blue underlined text. Every time you see a link & click on it, you're reading & clicking on the anchor text. Anchor text is used to provide information - both to users & to search engines - about what the web page being linked to is about. For example, when we link here to a blog we wrote about anchor text earlier this year, the text you click on to be directed to the blog is the anchor text.
Anchor text is more important than a lot of people give it credit for, because it helps navigability & allows crawlers & users to better understand & move around your website. However, if you try to influence this with spammy keyword-stuffing tactics, you'll find yourself penalised for it.
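In HTML, the anchor text is simply the clickable text between a link's opening & closing tags (the URL here is hypothetical):

```html
<!-- "guide to anchor text" is the anchor text - the words users click
     & search engines read for context about the target page. -->
<a href="https://www.example.com/blog/anchor-text">guide to anchor text</a>
```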
Answer The Public (ATP) is a handy keyword research tool that visualises search engine queries & questions, auto-complete terms, & suggested keywords in something called a "search cloud". ATP breaks down a search term into question categories: the 5 Ws ('who', 'what', 'when', 'where', & 'why') as well as 'how', 'can', 'are', 'which', & 'will'. It presents these in the form of reports that can be saved, stored, & shared by multiple users (a feature that is only available on pro accounts).
ATP is perfect for businesses looking to examine search intent & glean insight into what their potential customers are searching for. By using ATP, businesses can plan out content & create documents that directly answer these questions. It is a good place to start, but businesses should be aware that ATP doesn't come with search volumes. Even so, it helps give them greater insight & a better understanding of their target market.
"Nofollow" refers to the value of the same name that is found in the rel attribute. A rel attribute provides context about the relation of the linking page to the link target. The "nofollow" value is used to signal to search engines that they should essentially ignore a link & not put any authority on it.

The concept behind the term is an old one & dates all the way back to 2005. Google introduced this feature to try & prevent spammy links giving undue authority to sites & blogs. The "nofollow" link attribute allows Google to learn about the context of the link & use that information to make rankings fairer.

There are four main reasons why you'd use this attribute. The main one is in cases where you want to link but not be associated with the link target. The other reasons include when you link to widgets, certification badges, & press releases. Note that Google no longer treats "nofollow" as a directive & instead takes it as a hint that it shouldn't put any SEO weight on those links.
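In practice, the value is simply added to a link's rel attribute (the URL here is hypothetical):

```html
<!-- Hints to search engines that no authority should be passed
     to the linked page. -->
<a href="https://www.example.com/sponsored-page" rel="nofollow">sponsored link</a>
```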
Standing for 'business to business', B2B is one of the most commonly used terms in the business world. B2B involves one business selling its products & services to another business. For example, a company that designs lighting solutions could sell its products to an electrician, who then uses them as part of their service. B2B tends to happen when a business is looking for raw materials. B2B can also loosely refer to the way a website is designed, who a business advertises to, & the type of language that is used.
Standing for business-to-consumer, B2C is very similar to B2B. In this instance, the second business is replaced with consumers. B2C refers to the selling of products to consumers directly, without any supply chains or third parties. A great example of a B2C business is Amazon, which sells goods directly to customers. The term became very popular during the dot com boom of the 1990s, when online B2C businesses such as Amazon arrived & became a threat to traditional high street retailers.
Standing for Bidirectional Encoder Representations from Transformers, BERT is quite simply the biggest update Google has released since RankBrain. Google states that BERT will impact 1 in 10 search queries & is - to put it simply - their neural network-based technique for natural language processing.
To put it even more simply, Google is trying to improve the ways its machines interpret our searches to provide us with better results. It's the biggest attempt since the release of RankBrain to take the onus off the user to type the ideal phrase & shift it to the search engine to provide the right results.
As BERT rolls out more & more users will start noticing their results are more accurate & that Google will be able to better understand nuances & contexts of search queries. Impacting 10% of all searches may not sound like a big deal but it is, because, in future, it will only grow.
Pretty soon BERT will more than likely affect all searches made on Google and, unfortunately for people in SEO, there is very little they can do about it in terms of optimisation. In a way, that's a good thing, websites & those looking to rank well can now focus on providing real value to their target audience, instead of overly worrying about keywords.
A link from one webpage to another, also known as an inbound link. Within Google's algorithm, backlinks essentially count as votes for a page, & web pages with a high number of backlinks often have a high organic ranking. These backlinks notify Google that the content being linked to is relevant & useful to users. Webpages with few or no backlinks at all will be recognised by Google as irrelevant, & it will be more difficult for those pages to achieve a high organic ranking.
Not all backlinks are the same, however, which is why it's important to ensure your website attracts quality backlinks. Poor-quality backlinks count for very little, & even a thousand of them wouldn't return the same value as a single high-quality backlink. Sites with good domain authority can provide the most valuable backlinks, as Google will read this as the site passing on its authority to yours.
This term refers to a form of advertising that a user would usually see in the form of a banner on a separate website to the one that is selling the product. Banner advertising follows you around as you browse the internet, in an attempt to get you to buy the product you didn't purchase, or to keep you aware of that particular brand in case you want to purchase something from them in the future.

For example, let's say you go to a candle store, put candles in your basket & then either get distracted or decide at the last minute that you don't want to buy them, so you leave the site. There's a good chance that, on the next couple of websites you visit, you'll see banner ads displaying the products you were looking at a moment ago. Clever, right?

Banner ads have a long & storied history. They first appeared back in 1994 & were the first form of advertising specific to the internet. That history has brought great success with it: overall, the internet advertising business is worth around $124 billion. These days, banner ads are underpinned by something called programmatic marketing, which allows marketers - harnessing AI - to bid in real-time for ad space in the time it takes for the ad to load.
Black hat SEO is essentially any SEO tactic aimed at improving page ranking that violates Google's quality guidelines. Characteristically, these tactics revolve around content created specifically to manipulate the search engine algorithms instead of creating rich, audience-focused, quality content.
Now, black hat SEO has changed over the years, & many SEO practices that were commonplace 15-20 years ago might now be considered black hat. Google's Webmaster Guidelines change, & with them so does what should be considered black hat SEO. However, you can usually identify black hat SEO tactics such as keyword stuffing, hidden text, cloaking & doorway pages in the way that they ignore the user experience in favour of algorithm manipulation.
A blogger is someone online who hosts a blog on any given topic. Quite simply, they are an online content creator who has their own website or blog on which they upload their thoughts & grow their business.
The reason that bloggers matter to SEO is primarily one of link building. Bloggers grow their own audiences & can make for excellent business partners, so SEOs will often seek relevant & authoritative bloggers for their link-building outreach. Compared with working with another business or website, working with bloggers can be more personal, & they can be an excellent resource for authoritative & relevant links.
Blogger Outreach is a process that businesses undertake when they want to leverage the influence of bloggers, influencers, & prominent users of social media, to help boost their brand awareness & keyword reach. The process begins by reaching out to a pre-selected group of influencers in a particular industry, one that the company wants to become more prominent in, usually. Often, the blogger or influencer is given access to products & services for free, or for a fee, in exchange for them promoting it on their social media channels, reviewing them, & generally using their influence to market the business on their behalf.
Done correctly, this can be a very cost-effective way of growing your business & your brand, because the 'cost' of this type of outreach is simply letting someone use the product or service you want to promote. This is far cheaper than other methods such as PPC or Digital PR.
Bounce rate is an important metric that shows you how many visitors came to your website & then left without engaging with your content or visiting another page. Generally speaking, a lower bounce rate is better & shows that your content & user experience is doing something right, encouraging interaction with your website.
Bounce rate will vary from industry to industry & is affected by a variety of factors. But SEOs working on a campaign will always try to improve the bounce rate of a website. While Google claims that bounce rate is not a ranking factor when it comes to Google search, it can indicate & highlight important site or content issues.
Bounce rate could be affected by slow site speed, bad content, high ad density, poor relevancy, & more. It might not be a direct ranking signal, but the bounce rate is something that every good SEO will take into consideration & try to improve.
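As a rough sketch, bounce rate is simply single-page sessions as a share of all sessions (the figures below are made up for illustration):

```python
def bounce_rate(single_page_sessions: int, total_sessions: int) -> float:
    """Percentage of sessions that left without viewing a second page."""
    if total_sessions == 0:
        return 0.0
    return 100 * single_page_sessions / total_sessions

# 380 of 1,000 visitors viewed one page & left:
print(bounce_rate(380, 1_000))  # 38.0
```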
A branded keyword is a specific type of keyword that includes your brand name in it. For example, if you're looking for headphones & search for 'Apple headphones' the branded keyword in there would be 'Apple'. Keywords that don't include a company's name are classed as 'Non-branded keywords'. For a successful SEO campaign, it's important to have a mixture of both branded & non-branded keywords. Too many branded keywords & you could be at risk of negative search results if you receive bad PR. On the other hand, too few branded keywords will mean your branding is limited.
Breadcrumbs are navigation tools - a small text path that lets the user know where they are on your site. They also help Google to establish the structure of your site, & breadcrumbs that appear in search results give users an overview of where the webpage is situated on your site. Breadcrumb trails are usually visible at the top of a webpage. To add breadcrumbs to your CMS, such as WordPress, you can download various plug-ins.
There are several advantages of using breadcrumbs, but the main one is that Google appreciates them, which is always beneficial for SEO. Google sees breadcrumbs - especially those that appear in the search results - as valuable to users, as they do enhance user experience. To keep your site visitors satisfied, & to ensure that they enjoy browsing your site, you should use breadcrumbs so that they always know where they are. Breadcrumbs are also great for lowering bounce rates - if the page that the visitor is on doesn't provide the solution they are looking for, a breadcrumb trail can direct them to another part of the site. After all, it's better to redirect them to another part of your site than back to the SERPs.
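For breadcrumbs to appear in search results, they are commonly marked up with schema.org structured data. A minimal JSON-LD sketch, embedded in the page's HTML (the names & URLs here are hypothetical):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {"@type": "ListItem", "position": 1, "name": "Home",
     "item": "https://www.example.com/"},
    {"@type": "ListItem", "position": 2, "name": "Shoes",
     "item": "https://www.example.com/shoes/"}
  ]
}
</script>
```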
CMS stands for Content Management System: software through which multiple users can manage, control, edit & maintain the content & structure of a website. A CMS is typically easy to learn & set up, providing greater accessibility for people to create & run their own website. There are a lot of different options out there when it comes to choosing a content management system - the most popular example of which is WordPress.
Different versions of a CMS will offer a varying amount of control over the code of the website, affecting your ability to implement thorough technical SEO. This allows people who don't want to deal with code, & who just want to set up a simple website with a template, to get started quickly, while SEOs can use a different version to more comprehensively optimise the website they're working on. There are also plenty of SEO plugins that you can install & use to improve your SEO while using a CMS.
CSS stands for Cascading Style Sheets, & it is a style sheet language that, used alongside a markup language like HTML, describes to the browser how a web page's HTML elements should appear to the user. CSS is all about the presentation of a web page - for example, at its most basic, the colours & fonts used. CSS can also be used to ensure that web pages adapt & appear differently when viewed on different devices.
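At its most basic, CSS pairs a selector with a set of presentation rules, & media queries handle the different-devices case (the values below are arbitrary examples):

```css
/* Style every paragraph on the page. */
p {
  color: #333333;
  font-family: Georgia, serif;
}

/* Adapt the layout on narrow screens, such as mobiles. */
@media (max-width: 600px) {
  p { font-size: 14px; }
}
```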
Not to be confused with the key ingredient in coffee, Caffeine is Google's indexing system that crawls the web in search of relevant web pages. These pages are then indexed, assuming they comply with Google's guidelines, & can then appear as part of Google's SERP. Previously, Google's old indexing method consisted of a web-wide crawl every few weeks - a layered approach that often produced outdated results.
The Caffeine approach, however, is a continuous process that can provide the most relevant, up to date results available. This means web pages are being added to the index all the time & therefore ensuring the SERP isn't producing outdated results. The program is a huge advancement in technology when compared with the old method, & is also being continuously improved, making Google the number one search engine to focus on & optimise your website for.
This theory - created by our very own James Welch - relates to his belief that Google's aim is to try & determine how big a company's canteen is. The general thought is that the larger the canteen, the larger the company, and, therefore, the more likely it is to be trusted. & Google wants the most trustworthy companies at the top of its listings, because trustworthy companies are more likely to give a good customer experience to Google's visitors - & the more that this happens, the more that visitors will return to Google.
But why a canteen? Imagine the canteen of a large company full of people eating lunch. Each of those people is likely to have at least one social media account, with most having at least two or three. Each of these accounts is somewhere that each employee could share that they work for the company in some way. Maybe it is in their LinkedIn bio, maybe they have tweeted about their job. Maybe they have a link on their Facebook profile. Some of these people may have a personal blog that mentions where they work.
All of these are signals that can be made only by a large company. A small company - which will have a small canteen - cannot make the same signals, because it doesn't have the same number of people able to make them.
But the signals are not limited to the people within the canteen. A larger company is more likely to have signals created by customers of the company - & non-customers, too. For example, a retailer with tens of stores is likely to accrue more tweets, posts, blogs, & news stories about it than a company with just four people in an office.
In Google's aim to produce the best search results, it has to use factors that are hard for smaller companies to replicate. This fits into James' mantra that 'the harder something is to do, the more impact it has on Google'.
A call to action (CTA) is a term that refers to a prompt or an invitation for a user to take a specific desired action. These are often phrases that are incorporated into webpage copy, advertising messages or specific buttons that help the user to complete the action e.g. to visit a contact page, to get in touch with someone. A well-written & successful CTA will be clear, easy-to-understand & will result in conversion after prompting the audience to take a specific action. Whilst there can be multiple CTAs on a webpage or within a piece of content, they should not confuse or overwhelm the audience - the next step, & the desired action, should be extremely clear.
CTAs will often include strong action verbs to prompt the reader, such as 'call' or 'buy', & some may use a sense of urgency in the tone of voice to prompt the reader to take action immediately. This is often done by using specific timeframes, such as 'buy this now, available for a short time only'. An effective CTA can be a powerful tool in growing your audience & increasing your sales.
A canonical, or 'preferred', URL is the URL that Google believes is most representative from a set of duplicate pages on your website. In other words, it is Google's preferred version of a webpage. A canonical link element, or tag, is found in the webpage's HTML header to inform search engines if there is a more important version of the webpage. These elements prevent issues with duplicate content in the context of search engine optimisation. The canonical can even be situated on a different domain to the duplicate.
You should choose a canonical URL from a set of similar pages for numerous reasons. First, it helps to specify which URL you want to appear in the search results. Secondly, it can help to reduce the time that a crawler might spend crawling on duplicate pages. A canonical link will help a crawler to get the most from your site - it can spend time crawling new pages of your website as opposed to crawling similar versions of pages, such as desktop & mobile versions. Other advantages include the management of syndicated content, that it helps search engines to consolidate the information that they have for URLs, & it can make tracking metrics for products & topics much easier, as this is usually more challenging with a variety of URLs.
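The tag itself is a single line in the page's HTML head (the URLs here are hypothetical):

```html
<!-- On https://www.example.com/shoes?sort=price, this points search
     engines at the preferred version of the page. -->
<link rel="canonical" href="https://www.example.com/shoes">
```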
Clickbait is a term that refers to a piece of text, or a picture, that is sensationalised & designed to entice people to click. The defining features of clickbait are being over the top & often misleading. Headlines are often dishonest & draw people in to read the content, which doesn't necessarily reflect the sensationalised headline that made them click in the first place. An example of clickbait could be: "Leading Surgeon Reveals The Worst Food That You Eat Every Day!". Clickbait is a form of fraud but is not punishable by law. It's a practice, however, that is frowned upon by the online community.
The practice of presenting misinformation, or different information from what the user expected, once they click through to a site. This is considered a breach of Google's guidelines & will result in the site being penalised once it's flagged. Cloaking can be done by coding a page in a certain way so that a search engine crawling the site reads only the HTML, whereas a human user is shown images or other content.
The main goal of cloaking is to boost a page's ranking for certain keywords & when a page is clicked on, send the user to a different place than it would the search engine. Cloaking is considered a black hat SEO practice & something which you should not actively participate in. If your site is found to be cloaking by Google, it will receive large penalties & potentially be de-indexed.
When a searcher wants to compare one product against another to determine which is best. These types of searches are often conducted with the intent of research or purchase. The reason these different types of query exist is that Google needs to read & understand the different types of searches to provide the most accurate results.
For example, searching for a distributor for your product would count as a commercial investigation query. These types of search queries can provide valuable information in terms of keywords & information about your competitors. These queries are very important during keyword research as information on the searcher rather than the keyword can be far more beneficial in some cases.
In SEO terms, competition refers to two different things: direct competitors & SEO competitors. The former are competitors who sell similar products & services, or operate in the same area - direct competition could refer to online companies, or bricks & mortar rivals. SEO competitors are businesses & rivals competing for the same keywords on page one of Google. For example, a dozen or so businesses could be writing content & optimising their websites for the keyword "LED lights". These will all be in competition with each other, & the business that satisfies the algorithm most will appear top.
A broad term but one that is very, very important, & a key phase for any business looking to disrupt the market. In SEO terms, Competitor Research involves spending time looking at businesses & their websites, sitemaps, & the way they go about creating content & displaying information about the products & services that are similar to what you're selling. Beyond that, competitor research also involves examining what keywords, both short & long, & questions they rank for. By doing this, you can identify what kind of keywords you need to target to surpass your competitors.
Competitor Research is vital; without it, you're not making informed decisions, & you're missing out on learning about potential topics that could help improve the performance of your website from an SEO & keyword perspective. Having an incredibly clear picture of who you're up against will allow you to find spaces & areas in your industry that haven't been looked at - these opportunities, however small, could be the difference between success & failure.
Content, specifically web content, is the textual, visual & aural content published on a website online. While SEOs & content marketers most often use the term content to refer to written text on a website, content can be anything from videos & images to written blogs & podcasts. Content is essentially any creative element on a website, whether it consists of audio files, embedded video, applications & tools, or text.
YouTube, for example, is a website that consists almost entirely of video content, whereas a website for podcasts consists almost entirely of audio content. The quality of content that a website produces is a key SEO factor.
Web content is so important to SEO because it is a driving factor behind traffic generation. A website with high-quality content that is thoughtful & insightful will generate very different results to low-quality content that uses keyword stuffing techniques & is not valuable to users. The creation of engaging & quality content - like our biography of Elon Musk - & organising this for easy navigation is a key facet of SEO & web design.
It is in the content of a website where keyword optimisation is performed - overall, content marketing is a very powerful SEO tool. While video & audio content is very popular & widely used, search engines still favour text-based content when crawling & indexing a website - which is why many SEOs still focus on producing written content for a website.
A content delivery network (often known simply as a CDN) is a distributed network of servers that deliver content to users. From on-page text & image content to applications & downloadable content, a CDN serves HTML or static resources based on geographic location.
The servers that make up a content delivery network are positioned around geographic groups of users, speeding up content delivery massively. CDNs were first created in the late 1990s as a way of dealing with the massively expanding demand for fast & reliable internet across the world.
Contextuality is absolutely crucial for SEO. After all, SEO involves making each webpage on your site as contextual as possible. Contextuality has been an important concept since search engines were created, but over time, algorithms have got better at analysing it. As Google's algorithms evolve & continue to improve, they now understand webpages & entire websites better than they ever have before.
You can improve & maximise the contextuality of a page in many ways. Much of this involves the written content, by including keywords & relevant ontological phrases. However, it doesn't just mean adding as many words as possible. It's all about context. If you're writing about a product, discuss what it is, who it's for, the benefits of the product & the problems it solves. Make sure that you don't just write solely about that one product, but other topics that branch off in relation to it. Contextuality isn't just about words though - using breadcrumbs on your website is another way to improve contextuality for search engine optimisation. Breadcrumbs allow the crawler, or bot, to see where it is on a website, thus increasing contextuality.
The conversion rate is a calculated percentage of users that have completed a specific targeted goal. A conversion can take different shapes, whether it be a sale, a form fill-in, or anything else. When goal tracking, a conversion is a term used to describe a user meeting your desired goal.
This can be used for ads, website engagements, emails, & anything else - the conversion rate is the percentage that shows exactly how successful your campaign or ad is. Conversion rate is the ratio of visitors who completed the desired goal against the total number of visitors.
Conversion rate is an important metric in analysing campaign & advertising performance, & it can help advertisers to optimise, tweak & improve their strategy.
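The ratio described above can be sketched as a small calculation (the figures used here are purely illustrative):

```python
def conversion_rate(conversions, visitors):
    """Conversion rate: completed goals as a percentage of total visitors."""
    if visitors == 0:
        return 0.0
    return conversions / visitors * 100

# e.g. 50 form fill-ins from 2,000 visitors to a landing page
print(conversion_rate(50, 2000))  # → 2.5
```

The same calculation applies whether the "visitors" are ad clicks, email opens, or page views - only the inputs change.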
It's a term we've all heard, & we've all blindly clicked 'accept all' when landing on a new website - but what are cookies? Essentially, cookies are small pieces of data that identify your computer to a network & help improve your browsing & web experience by tailoring content, mainly ads, to things that are relevant to your recent internet travels. There are two types of cookies - magic cookies & HTTP cookies. The former is a slightly outdated concept that refers to the transfer of information sent to & from computers & databases. HTTP cookies, the more common form, are the type you'll be most familiar with. They are designed to track, personalise, & save info for each user's sessions. Let's say, for example, you go on a website that sells shoes; once you've accepted the site's cookies, had a browse, & left, you'll more than likely see ads for shoes when you go on other websites, be it to browse content or watch videos. HTTP cookies were first used by Lou Montulli in 1994, when he recreated the concept while helping an e-commerce company fix overloaded servers.
A core algorithm is essentially an algorithm that is a fundamental part of Google's ranking functionality. In all honesty, there's been a lot of confusion amongst SEOs as to what exactly defines a core algorithm, & Google hasn't been particularly clear on the matter either.
After the Panda update, Andrey Lipattsev, a senior search quality strategist at Google, said that a core algorithm is one that Google essentially no longer has to worry about or work on. In his words, a core algorithm (PageRank being a good example) is past the experimental stage & now functions on its own & will be functioning, unchanged, for the foreseeable future.
Site crawlability can be improved & influenced in several ways, including optimising & improving site speed, optimising images & videos, using the proper redirects, efficient internal linking, & creating a proper sitemap. These tactics allow the search engine's crawlers to navigate your website more easily, improving their ability to find your content & accurately index your website.
This is a term that describes the level to which a search engine will index pages within a given website. Take a look at any half-decent site & you'll see that they contain main pages & subpages which go deeper & deeper, similar to files on a computer. Crawl depth is calculated by starting at the homepage (which has a depth of 0), then, any page that is linked from there has a depth of 1. This number increases the further from the homepage you go. As a general rule, you want to be able to find the most important pages within three clicks.
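The depth calculation described above can be sketched as a breadth-first walk over a site's internal links, starting from the homepage. The site structure below is hypothetical:

```python
from collections import deque

def crawl_depths(links, home="/"):
    """Breadth-first walk from the homepage, recording each page's click depth."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:  # first (shortest) path to a page wins
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical site: homepage links to two sections, one with nested subpages
site = {
    "/": ["/tiles", "/contact"],
    "/tiles": ["/tiles/wall-tiles"],
    "/tiles/wall-tiles": ["/tiles/wall-tiles/white"],
}
print(crawl_depths(site))
# → {'/': 0, '/tiles': 1, '/contact': 1, '/tiles/wall-tiles': 2, '/tiles/wall-tiles/white': 3}
```

In this sketch, the "white wall tiles" page sits at depth 3 - right at the edge of the three-click rule of thumb.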
A crawler is a program that search engines use to crawl the web, including your website. Alternatively known as a bot, spider, web crawler or Googlebot, crawlers allow search engines to scan & analyse websites across the internet to accurately rank & index them. Crawlers will visit your website to collect information on website navigation, performance & content, adding the information they find to the search engine's index & keeping it updated.
Crawling is the process by which search engines discover & index your website & web pages. Using a crawler, Google & other search engines collect information on the internet's billions of public web pages to keep their indexes accurately updated.
During the crawling process, the search engine will analyse the content & code of your website & follow internal & external links to build a picture of your website's position on the internet, its relationship to other web pages, & the quality of its content. There are also tactics that SEOs will employ to influence search engine crawling through improving crawlability.
To cross-link is to link two pages within a website to increase the relevance of both of them. For instance, if you had a service page about the black shoes you sold, you could link that page to a blog that discusses all the types of shoes available for women. Cross-linking in this way is an ideal way of showcasing your authority on a topic to Google because, as it's scanning a page on your site, it's constantly being directed towards other relevant content that relates to that page. What better way to showcase authority? If you can prove authority by effectively cross-linking, you're setting yourself up for SERP success. Good cross-linking comes down to a couple of things - the web page or content it is linking to & the anchor text. Both of these have to be extremely relevant for you to achieve the authority that's going to see that page fly up the rankings. You can't just link to a page that isn't relevant to the content you're linking from, & the anchor text should be relevant to the page it points to. Done right, you can easily see how dozens of cross-links help to increase link juice & overall domain authority.
Crowd Marketing is a fairly new term, which isn't uncommon in the marketing world, & it's used to describe a new type of marketing that is all about appealing to the masses. Its newness means that it gets misdefined by a lot of marketers. In essence, Crowd Marketing goes beyond Influencer Marketing - which is what it gets confused with - & incorporates content creation, SEO, & social media marketing, all of which results in verified lead generation. It's different also because it focuses on targeting an audience within a market, rather than just masses. Crowd Marketing helps to build the authority of a business within an industry. There are five types of Crowd Marketing - Classic, Backlink Generation, Content Distribution, Reputation Management, & Crowd Influencer Marketing. Classic Crowd Marketing simply involves publishing high-quality content within your industry. Backlink Generation is about getting attention from other users in your industry & ultimately getting them to link back to your site. This can help boost domain authority which improves ranking drastically. Content Distribution involves publishing vast amounts of content across various channels. Reputation Management involves creating profiles on every relevant platform so that customers can find you quickly. Finally, Crowd Influencer Marketing involves reaching out to macro & micro-influencers who will push your products/services on your behalf (if they believe in it, of course).
Customer journey is an all-encompassing term that looks at the process by which a customer purchases a business's product or service. Depending on the sector or industry, the customer journey could take minutes or months. The journey depends on the products & services being sold, their price, & the effect they have on the customer. For example, a typical journey for purchasing life insurance may start with someone searching for more information, thinking about the options available, & consulting with family members before deciding on where to purchase this product. This could take a long, long time. Businesses must ensure they plan content & advertisements that keep them at the forefront of a customer's mind during their decision-making.
Often shortened to CLV, Customer Lifetime Value is an important metric that measures the total worth of a customer to a business throughout their relationship. It's a metric that uses the following formula: customer value x average customer lifespan. CLV is very important because keeping existing customers, & extracting value from them, is much cheaper than trying to acquire new customers. For example, if a customer has repeatedly bought a product over five years, their CLV to the business will be very high, & it will be much cheaper for the business to keep them because they don't have to spend money on advertising & promotions to attract them. It's been labelled by some as "the most important metric that companies ignore". To back this up, one study indicated that only 34% of people fully understood what the concept of Customer Lifetime Value includes. CLV can help you segment the value of your customers, allow you to focus on long-term company-wide growth, & accurately measure how much you should spend on customer acquisition. It's a powerful metric to build your strategy on.
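The formula above (customer value x average customer lifespan) can be sketched as a small calculation, where customer value is itself average purchase value multiplied by purchase frequency. The figures are hypothetical:

```python
def customer_lifetime_value(avg_purchase_value, purchases_per_year, avg_lifespan_years):
    """CLV = customer value (spend per year) x average customer lifespan."""
    customer_value = avg_purchase_value * purchases_per_year
    return customer_value * avg_lifespan_years

# Hypothetical customer: £40 per order, 6 orders a year, stays for 5 years
print(customer_lifetime_value(40, 6, 5))  # → 1200
```

A £1,200 lifetime value then gives a concrete ceiling for how much it is worth spending to acquire & retain that customer.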
A broad term, data is loosely defined as a collection of facts, figures, & empirical evidence. Data can be collated & used for reference, analysis, or to make decisions. Data has been described as the most valuable commodity & resource on planet Earth, reportedly surpassing oil. Data is incredibly important for a business. The right data can help them make more informed decisions & allows them to use their budget in the most efficient & effective way. An email address is one of the most basic forms of data, while spreadsheets, keyword research, & analytics are more complex forms.
This kind of page is one to be avoided. A dead-end page can disrupt user flow & encourage people to leave your website - which you don't want. These dead-end pages have no internal or external links, & they also don't have a call to action or real point to them. All pages should be designed & written so that there is an end goal - be that a phone call, email, or to click through to another page.
Deep linking is the practice of linking to a specific page on your website, or someone else's. This isn't just linking to a homepage or service page, it involves linking to a very specific piece of content such as a news article or blog. Deep linking can be tricky to do but, like all things that are difficult in the world of SEO, search engines will reward it if it is done well. An example of this would be if you wrote: "In the past few months we have seen how employees enjoy the benefits of remote working." & then in that sentence, you linked to a specific news article that discussed the increase in remote working.
This is when Google takes an action on your site to deliberately remove it from the Google index, & it generally carries a negative connotation. This is due to Google crawling your website & finding something it has deemed in breach of its quality guidelines. To ensure your site remains indexed & avoids being de-indexed, you will need to follow these guidelines extremely closely to ensure complete compliance.
De-indexing can often be attributed to a few factors, & it generally relates to potential spam content. For example, if your website has acquired a large number of backlinks in a very short period, Google may flag this as suspicious behaviour & penalise you for it. Participating in link farms or spamming comments will also be flagged as manipulative behaviour & can result in your site being de-indexed.
To prevent your site from being de-indexed or to get it re-indexed if this has occurred already, you will need to cleanse your site of spam links. This will require you to audit your site & disavow any links which are deemed spam. Once spam links have been removed & you're happy your site only contains natural links, you can submit a reconsideration request to have Google re-index your site.
Digital Public Relations (PR) is a strategy that involves creating high-quality content to increase brand awareness. This content is then pitched to online publishers who will share the content, citing your brand as the information source. For PR to be successful, strong relationships between writers & publishers are crucial. Most content published for PR purposes is emotional content - content that the audience can resonate with, generating interest in the content & subsequently the brand & any relevant services or products.
The relationship between SEO & digital PR is often overlooked, but it is highly important. Promoting your brand via different online publications will help you to become more credible & trustworthy, both in the eyes of potential clients & customers & from Google's point of view. Digital PR tactics & link-building can go hand-in-hand to do this. By linking press releases back to your website, you'll allow potential clients & customers to click through with ease, whilst Google will recognise your site as a trustworthy & knowledgeable source within your sector.
The takeaway message here is that by getting authoritative, trustworthy domains - such as digital news publications - to link to your site, Google will trust your site, see it as a credible source & this will only work in your favour when it comes to those all-important search rankings. A higher search ranking provides the opportunity to attract more traffic, subsequently resulting in more conversion.
You would disavow a link if you thought that the link was a threat to your site's SEO performance. In the same way that good links from sites with authority benefit your site, bad links from spammy sites can significantly damage your reputation in the eyes of Google. By disavowing a link, you're telling Google that you do not want it counted toward your site. Disavowing a link is an absolute last resort & should be approached with caution, because done incorrectly it can hurt your own SEO performance. Before opting for this, try to manually request that the link be removed.
A tool that is used to discount the value of an inbound link. This is used to prevent any penalties for link spam which could harm the ranking of your website. As Google's ranking algorithm has been improved & adapted, it's now able to recognise if there are too many links to a particular domain, registering this as spam linking & penalising the website as a result. This is done in an attempt to provide its users with the most relevant sources of information & prevent them from viewing spam content.
For those looking to achieve the optimum ranking for their website through organic SEO, this is an extremely important tool as it can essentially tell Google not to count the link when it crawls your site. The bad link which you may have unknowingly created could lead to an unsavoury site or another site that Google has recognised as not relevant, hurting your ranking as a result of your website being associated with the other.
A display network (specifically, the Google display network) is a group of over 2 million websites, apps & videos on which your advertising can appear. Advertising through the Google display network means that your Google ads can be seen on YouTube & through Gmail, as well as appearing on millions of other websites online. Through advertising on the Google Display Network, you can target your ads to appear to particular audiences, locations & in particular contexts. Display network sites reach over 90% of internet users across the world.
DA, or domain authority, is a search engine ranking score & metric used in part to predict how well a website will rank within its sector, industry or niche. It is used to give businesses & marketers a picture of how authoritative a website is - & it's no secret that authority is a big part of how Google indexes & ranks websites.
Domain authority is essentially a score for a website's overall strength, building up over time as more links are acquired & more content is produced. The score can be increased by improving your website's authority. This can be done in several different ways, but perhaps the best is to earn top quality links & backlinks from high ranking, trusted & reputable sites.
The term domain authority has an awful lot to do with link equity (otherwise known by SEOs as 'link juice'). Earning strong backlinks that pass a lot of link equity to your website will earn you a higher domain authority - which, in turn, will improve your visibility & ranking in the SERPs.
A domain name is essentially the address of your website online. It is the text that a user enters into the browser's address bar to get to your website (for example, l33roy.com).
A domain name is a known ranking factor in Google's algorithms, & the difference between a good domain name & a poor one can have a significant impact on your SEO & performance. Having a relevant, strong & SEO focused domain name is certainly something you should consider.
This is a term that isn't commonly used but refers to a page on a website that is made specifically to rank for particular keywords. These pages act as a doorway to other areas of the site, usually product pages, & are quite unpopular online because they have often been used for nefarious purposes. In principle, they should be unique, content-rich pages that provide genuine value without pushing a hard sell. In practice, however, they have often consisted of mass-produced content in various versions that barely deviate from each other. Doorway pages clutter up search engines & present a challenge to the Googles & Bings of the world.
In short, duplicate content is when a significant amount of content on one web page matches content that exists elsewhere - either on the same website or a different website entirely. If two substantial pieces of content across different web pages are either identical or closely match one another, Google may deem it to be duplicate content.
The issue arises when a search engine crawler finds & indexes the content in two different places & isn't able to tell whether the content has been copied, ending in a potential penalisation. Duplicate content is generally considered black hat & has become something that many SEOs fear due to penalisation, but there are also some common misconceptions. The main issue is when Google struggles to decide which version of the content is most relevant to any given search term, potentially impacting search rankings.
You might have duplicate content that needs to appear across many different URLs on your website, but this isn't actually a problem. Good SEOs will know how to canonicalise content for search engines.
The amount of time a user spends on a page once clicking through from the search engine results. The dwell time officially ends when the user leaves the website. Although similar, this is different to bounce rate as this is the rate at which users view & leave a page in a certain period. Dwell time is specifically the amount of time the user spends on a webpage, either reading it or initially deciding whether it's what they were looking for before leaving.
This stat can prove extremely useful for website owners as it gives them a clear indication of what users think of their site's first impression. If the dwell time is high, their website is eye-catching, informative & overall useful to the user leading them to stick around. If the dwell time is low, this will be reflected in the bounce rate as users are generally taking one look at your webpage & deciding to leave very quickly after clicking through.
This is a specific URL featuring content that depends on variable parameters provided by the server that delivers it. Characters such as '&', '$', '+', '=', '?', '%' & 'cgi' indicate a dynamic URL. Various search engines won't index dynamic URLs; however, Google does, as long as the information in that URL is specific to the industry & the page is packed with content. It's vital that you have at least one static URL that doesn't change (a homepage URL is a good example).
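The character-based rule of thumb above can be sketched as a simple check (a rough heuristic, not a definitive classifier):

```python
def looks_dynamic(url):
    """Flag URLs containing characters commonly found in dynamic URLs."""
    markers = ["&", "$", "+", "=", "?", "%", "cgi"]
    return any(marker in url for marker in markers)

print(looks_dynamic("www.domain.co.uk/page?id=132"))        # → True
print(looks_dynamic("www.uktiles.co.uk/tiles/wall-tiles"))  # → False
```

A real audit tool would parse the URL properly rather than scan for substrings, but the heuristic illustrates the distinction being drawn.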
These links are very good to have. An editorial link is an organic inbound link that is given naturally by a website with high authority - for example, a university's website linking to your site because it had content they wanted to reference. An editorial link is a great indication that your content is thorough, well-written, & useful. Editorial links are also a sign of a strong link profile & are different to acquired links, which are usually requested or paid for.
The amount of interaction a user has with content. For example, if a user clicks a link or likes a photo on social media, this is classed as engagement. Often, the success of an advertising campaign is measured by engagement. High engagement generally means that the campaign has been a success & poor engagement helps you identify areas for improvement.
Measuring the engagement of your posts & content allows you to get an understanding of whether your target audience thinks it's relevant or not. Engagement rates may fluctuate as there is no definitive way to guarantee engagement. This is why analytics & data are so important in the world of SEO as it allows you to predict the types of content which will achieve high engagement rates.
Evergreen Content is a term used to describe website content that has been written to be relevant for years to come. Evergreen Content can help establish a business as a thought-leader in its industry, & from an SEO POV, it allows search engines to understand the business & over time award them with consistently high rankings because they have put in the work to create content that is thorough, insightful, & in-depth.
An example of Evergreen Content could be an in-depth guide, that would end up being around 4000, 5000, 6000 words or more (there is no real limit, the more words the better), about a topic in an industry that will remain relevant for years to come. For instance, if you're a business that operates in the telecoms industry, an 8000 word 'Guide to VoIP' is a great example of a piece of Evergreen Content, as VoIP is a topic that isn't going away anytime soon.
Evergreen Content should be combined with blog content to create a potent SEO strategy that ensures long-term success as well as short-term gains, which can be achieved by creating regular blog posts.
This is a type of PPC option that only allows your ad to be shown when people search for that specific phrase. This can reduce cost & prevent you from showing up for keywords that you don't want to be associated with. For example, if you were selling black women's shoes, you would set up an ad that would only appear when people type in the keyword "black women's shoes"; it wouldn't show for similar keywords such as "black shoes for men" or "children's black shoes".
An external link, otherwise known as an outbound link or outgoing link, is a link that leads away from your website. Essentially, if you have a link on one of your web pages that leads to a web page on a different website to your own, that is considered an external - or outbound - link. For example, if LJ digital was to link to your website from our own, that would be considered an external link on our side, & an inbound link on your side.
FTP stands for file transfer protocol, & this is the system used to deliver & transfer computer files between system & server. As an example, if a website has been built without a CMS (content management system), then to publish a web page you will need to use FTP to transfer the web page file from your computer to the server's file.
A favicon is a small icon that can serve as helpful branding for your company. It's a 16×16 pixel icon that can be found on tabs or drop-down menus. Favicons are tiny, which means they should only contain your company logo or one or two letters. Favicons are becoming an increasingly important part of company branding & act as a handy visual marker for people scanning their tabs or reading list. Favicons are not exactly SEO critical, but they are one of many things that form part of your overall web strategy. The key to a good favicon is simplicity: using space wisely, ensuring your brand identity is clearly displayed, & leveraging abbreviation & colour coordination (not easy, right?). The best favicons are the simplest - think YouTube, Whatsapp, & Twitter - so don't overcomplicate things when creating yours. It could be your logo or the first letter of your company name in your branding.
For some search terms Google will display a box with answer summaries above the term's organic search results - these are known as 'featured snippets'. Usually used for question-based search queries, the featured snippet box contains a summary answer to the query with a link to the web page from which the answer came.
Often known as 'position zero' on the search engine results page, featured snippets are often sought after by SEOs because they provide a stamp of authority. When Google determines that your answer to a search query should be displayed as a featured snippet, it can essentially be seen as Google having deemed your answer to be the 'best' on the internet. This might not always be the case, but 'position zero' is, nonetheless, a great thing to earn for your website.
For further reading on featured snippets & earning position zero through great content, we wrote a blog at the beginning of 2020 detailing how & why position zero is the SEO aim for the year - & how, while you can't influence Google's programming, there's still plenty you can do to try & earn your spot above organic search results.
A follow link (or do-follow link) is a link that passes authority - or 'link juice' - from one site to another. All links are 'follow links' by default, meaning that they can be simply categorised as any link that has not had the 'nofollow' attribute applied to it - there is no specific 'do-follow' attribute.
Follow links pass on authority because they allow crawlers to follow the link & more accurately place a page's ranking in the SERPs. Authority is a known ranking signal, so follow links from high-authority pages that pass on PageRank are very valuable for SEO performance.
Frase is an AI content creation tool that helps business owners & content writers create content on any topic in a way that is ontologically relevant, as well as targeting keywords. Users write their content within Frase, after entering a target keyword. Before a user begins writing, the platform scans the internet for that target keyword & then pulls together a report, based on the top 10-20 search results. In this report are the headers, questions, & titles that are used by websites in those search results.
However, where Frase comes into its own is by compiling an exhaustive list of 'Topics' which aren't just standard keywords, they are phrases & words that are regularly used by competitor websites that are targeting this keyword also. By incorporating these 'Topics' into your content you can ensure that you're talking the same language as those websites which are already successful. This deep level of ontology is a trend that is going to become more prevalent in SEO as it becomes more nuanced & intelligent.
A 'friendly URL' is a type of Uniform Resource Locator (URL) that is easy to read for both users & search engine spiders. On large, dynamic websites the URLs can be just a series of words, symbols, & numbers. For instance, it could be something like www.domain.co.uk/page?id=132. In today's SEO world though, it's important that your URLs can be understood & have relevant contextual information in them - it's all part of building up that picture that your website is worthy of good rankings. So, how does one go about building a friendly URL? Let's say you're a website that sells tiles. For the friendly URL, you'd have something such as www.uktiles.co.uk/tiles/wall-tiles. In this URL, not only do you know who the company is, but you can clearly see that they have a 'tiles' page & within that a 'wall tiles' page. It makes sense that the latter page would sit under the former. This, to Google, is a clear indicator that the website is well thought out & structured in a user-friendly manner, & the URL reflects this.
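Generating the readable path segments used in friendly URLs is usually done with a 'slugify' step. A minimal sketch (real CMSs handle accents, stop words & collisions too):

```python
import re

def friendly_slug(title):
    """Turn a page title into a lowercase, hyphen-separated URL segment."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # runs of non-alphanumerics become one hyphen
    return slug.strip("-")                   # drop any leading/trailing hyphens

print(friendly_slug("Wall Tiles"))            # → wall-tiles
print(friendly_slug("Guide to VoIP (2024)"))  # → guide-to-voip-2024
```

Joined onto a sensibly nested path (e.g. /tiles/wall-tiles), slugs like these give both users & crawlers the contextual signal described above.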
A vast amount of searches have a local angle to them: 'dry cleaners near me', 'restaurants in London', & 'trains to Birmingham'. Therefore, search engines increasingly serve different search results depending on people's geography. For instance, if you're based in London & type 'sushi restaurants near me', you'll get an entirely different set of results than if someone in Manchester typed the exact same term. It may sound obvious, but geo-dependent requests are a huge part of Google's service, & it all boils down to providing users - you & I - with the most relevant answer to their query. Brands & local businesses can leverage geo-dependent requests by bidding for paid ad positions on the search engine results pages that are based on geography. For instance, if you're a digital marketing agency in Manchester you could potentially bid on keywords such as 'SEO agencies near me', knowing that people who search this term in your location will see your business.
Often referred to as URL parameters or query strings, this term refers to a URL structure that can be used to gather specific data or adjust how a page's content is viewed. GET parameters come in two forms:
Active parameters - these adjust the visibility of content on a page. The URL can be modified to either filter out content or order it in a systematic way. Example: l33roy.com/index/?type=getparameters (the '?' is always present in this URL & is followed by the command you wish the page contents to follow).
Passive parameters - a passive GET parameter doesn't alter the visibility or order of content, but instead enables website hosts to collect user data. This data can be incredibly insightful for evaluating marketing campaigns, as the tracking feature of this URL provides information about how a user landed on the page. Example: l33roy.com/index/?utm_source=google - these UTM parameters work in collaboration with tools such as Google Analytics to gain data about page visits.
It is important to consider, however, that too many GET parameters used across a website's subpages can have an adverse effect on rankings. An experienced SEO professional would utilise this handy feature in an SEO-friendly way, eliminating unnecessary parameter URLs to mitigate any risk of duplicated content damaging rankings.
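Query strings like these can be pulled apart programmatically. A short sketch using Python's standard library (the URL & parameter values are hypothetical):

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical tracking URL with UTM (passive) & one active parameter
url = "https://l33roy.com/index/?utm_source=google&utm_medium=cpc&type=getparameters"

params = parse_qs(urlparse(url).query)  # dict mapping each key to a list of values
print(params["utm_source"])  # → ['google']
print(params["type"])        # → ['getparameters']
```

`parse_qs` returns lists because a key may legitimately appear more than once in a query string.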
AdWords (now known as Google Ads) is a platform run by Google which allows advertisers to pay to display their ads at the top of relevant search engine results pages. AdWords is the opposite of organic SEO because businesses don't need to spend time writing keyword-rich content & waiting 3-6 months to watch it rise through the rankings. AdWords works on an auction basis: a user submits their ads & agrees to pay a certain amount per click of their advert. When someone types in a keyword that is relevant to your ad, the search engine determines, in milliseconds, which adverts should appear at the top of the results, above the organic listings. The word 'Ad' is displayed prominently next to the content.
Google Alerts is a notification service that you can set up to track activity related to your target keywords & search terms. You can set it up so that whenever there is a change in content indexed by Google for use in the search results, you receive an email notification.
For example, if you wanted to track your company name, or a particular service or product, you could set up Google Alerts to notify you when changes are indexed. Google Alerts is commonly used for reputation management & link building because it's a great tool for outreach & for identifying potential opportunities. Google Alerts is also often used to monitor the competition & to track any changes they make to their relevant content.
Google launched Google Alerts in 2003. While the service has had its problems & has come under criticism in the past, it has become a widely used SEO tool.
A Google algorithm (or any search engine algorithm) is a complex computer program with a process & set of rules, which Google uses to retrieve its indexed data & deliver ranked search results.
You'll often hear people refer to 'the Google algorithm' in the singular sense when talking about how Google's search engine functions. But the truth is that Google is made up of many individual algorithms all working simultaneously. Google uses a combination of many algorithms when it delivers ranked web pages to users via its SERPs (search engine results pages), all based on a large number of different ranking factors.
Analytics is a service offered by Google that allows users to track & analyse the performance of their website. It was launched in 2005 & has since become the most widely used analytics service on the internet. Users can track session duration, bounce rate, pages per session, & demographic information, & can use that information to create targeted campaigns to improve conversions & leads.
Google Bombing, or Googlewashing as it's sometimes called, is the practice of getting a website to rank for terms & phrases that are in no way relevant to the product or service being sold there. Google Bombing is done by linking to irrelevant web pages using misleading anchor text. It would be like writing about car insurance & linking to a website that sells LED lights. It is definitely a black hat SEO practice that looks to exploit Google's algorithm.
The mass building of unnatural links to a competitor's website. This is considered a black hat SEO practice & began when Google started penalising sites for attempting to achieve higher rankings through linking on forums & other spam blog sites.
Google is extremely clever when it comes to distinguishing between natural & unnatural links & deciding how to deal with the sites which use unnatural links. To create the most relevant experience for its users, Google reduces the ranking of sites associated with spam links to reduce the chance that users will view the site. While it may seem that building a large number of links is a good idea, carrying it out in this way can actually have serious negative consequences.
To help your SEO & PPC campaign get off to a good start, the Keyword Planner could potentially be a handy tool. It is a free-to-use service, found within Google Ads, that can help you generate keyword ideas & inform you of bid estimations, both of which are key aspects of a thorough, & successful, marketing campaign.
The Keyword Planner allows you to search for keywords & groups of ad ideas to determine how they may perform. It can also help you identify areas that you may want to target, that are within your budget so you don't have to worry about overspending to be successful.
As mentioned above, a key feature of this service is that it is completely free for people to use. The downside to this is that it doesn't give you the most accurate data; it simply gives you estimations & broad ranges. However, something it does do that other services can't is suggest unique keywords that you won't find elsewhere. This can potentially give you an edge over other businesses who've neglected to use the Google Keyword Planner.
A fully virtual mapping of 98% of the places in the world in which people live. Location details, street view & 3D recreations of locations are all available through Google Maps, including a directions function which has led to it being used in place of a sat nav. With live data feedback, Google Maps can provide real-time updates on traffic, road accidents & even the position of speed cameras.
Integrated into other features on Google, such as business pages, Google Maps can provide the location of business premises. This is tied in with other important details the business may want to share, such as contact details & a link to their website. It can also be used in organic & paid adverts on the search engine to define an audience by distance or location.
A tool that allows businesses to manage their online information across different Google features including Search & Maps. Google offers businesses the option to increase their online presence by entering relevant details into their Google My Business page. This may include details such as business contact information, opening times or even location. By entering this information, Google can understand more about the business & produce more relevant results for searchers.
By entering the location of a business, Google will be able to present your business as a result of a location-based search, meaning if someone is searching for a service you offer in the local area, your business will likely be provided as a result. Including as much information as possible about your business not only helps Google, but it also helps the searcher & prompts them to take an action on your site.
A news feed of collated articles based on the user's preferences. Google's algorithm reads a user's viewing habits & uses them to create a personalised news feed based on topics & news sources that are viewed regularly. Designed to provide quick hits of up-to-date information, the feature (which can be downloaded as a separate app) works best when embraced, as you're able to select specific topics of news to follow.
In-depth features allow the user to view the weather in their area, local news or news which has been verified by a fact check. Acting as a hub for all of the news relevant to a specific user, you can also search for topics, locations & sources which you wish to explore. The 'For You' page is a collection of articles that Google's algorithm has read & understood to be relevant to you.
Google Tag Manager is a free application for managing & deploying marketing tags on your website (or on your mobile app) without modifying the code. A very simple example of how GTM works is as follows: information from one data source (your website) is shared by Google Tag Manager with another data source (Google Analytics). If you have plenty of tags to handle, GTM is very helpful because all the code is stored in one location.
Grey Hat SEO is a risky practice; the term refers to techniques that aren't clearly defined as 'bad' by Google & the material it publishes about SEO.
Grey Hat SEO is a murky area & a topic of SEO that is contested by many. There are some clever pieces of innovation that businesses could undertake which could boost their site, or could see them lose thousands of pounds in lost traffic. Further, what is considered Grey Hat SEO one year may well be classified as White Hat or Black Hat SEO the following year by Google. There is a great deal of risk involved in Grey Hat SEO & if search engines decide that these tactics violate their terms of service, you could suffer greatly. Therefore, it is best to stick to White Hat SEO practices, have some patience, & be confident that your long-term approach, which follows the rulebook, is best for your business.
Guest posting is a straightforward concept - you post on another blog or website as a guest. By doing so, you'll earn greater exposure because of the external backlink to your own blog from this site. To successfully guest post, or blog, on somebody else's platform, you should establish strong relationships with others, as growing your network can open up opportunities to guest post. By guest posting, you may increase your influence on your own site, external sites & even social media platforms, as it gives you exposure to another, wider audience.
Guest posting offers potential benefits to the host site too, making it beneficial for both the guest blogger & the host. By hosting guest bloggers, the host site keeps generating new & interesting content regularly. This is great for SEO & makes you look like a reputable, reliable source of information. If you're posting on other blogs, you should therefore offer to allow the host to post on your site too. That way, you'll both reap the benefits of guest posting. When guest blogging, it's important to pay attention to the links that you include in your content. You should ensure that your anchor text is relevant & useful in the context of the URL that you are linking to. As with all SEO strategies that involve linking, your links should be useful, legitimate & trustworthy.
Standing for Hypertext Markup Language, HTML is the standard code used to create web pages & applications, & it is the code in which search engines read your website. HTML is used to create heading tags, meta tags & overall page structure, & the HTML source code is the foundation of any web page. Want to read this page's HTML? Just right-click & select Inspect, & you'll be given the full HTML source for this page.
HTML is a core part of web development & is often the first coding language people learn to build a website. It's also a crucial part of SEO, as the vast majority of technical SEO is done within the HTML source code. When done well, technical SEO uses an understanding of HTML to keep the source code clean & optimised, making it easier for search engines to crawl & read your website.
The World Wide Web is based on the Hypertext Transfer Protocol (HTTP), which is used for loading web pages via hypertext links. HTTP is an application layer protocol for transferring data between networked devices, & it runs on top of the other layers of the network protocol stack. A standard HTTP flow involves a client sending a request to a server, which returns a response message.
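The request/response flow can be demonstrated end-to-end with Python's standard library alone. This is a toy sketch, not how a production web server works: it starts a minimal HTTP server on a free local port in a background thread, then acts as the client & fetches a page from it:

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class Handler(BaseHTTPRequestHandler):
    """A toy server: every GET request receives a 200 response with a body."""
    def do_GET(self):
        self.send_response(200)                      # status line
        self.send_header("Content-Type", "text/plain")  # response headers
        self.end_headers()
        self.wfile.write(b"Hello, HTTP")             # response body
    def log_message(self, *args):
        pass  # silence per-request console logging

server = HTTPServer(("127.0.0.1", 0), Handler)  # port 0 = pick any free port
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# The client side of the flow: send a request, receive a response
response = urlopen(f"http://127.0.0.1:{port}/")
status = response.status        # 200
body = response.read().decode() # "Hello, HTTP"
server.shutdown()
```

The `200` status code here is the same family of codes as the 301 & 302 redirects discussed earlier in this glossary.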
The head section of a web page refers to the part at the top of an HTML document that doesn't display in the web browser when the page is loaded. This is a sort of behind-the-scenes part of the web page, containing things such as metadata, links to CSS, & more.
The <head> tag is placed between the <html> & <body> tags, which allows its contents to not be displayed in the browser. The metadata that the head section can contain includes the document title, character set, styles, scripts, author information, & more. It's an important part of on-page & technical SEO because it gives an opportunity to include important keywords that describe the page to Google.
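To make the split between head metadata & visible body content concrete, here is a small sketch using Python's built-in `html.parser` to pull the title & meta description out of a hypothetical page (the page markup below is invented for illustration):

```python
from html.parser import HTMLParser

# A minimal, hypothetical page: everything inside <head> is metadata
# that the browser does not render as page content.
page = """<html><head>
<title>Example Page</title>
<meta name="description" content="A short page summary for search engines.">
</head><body><p>Visible content.</p></body></html>"""

class HeadReader(HTMLParser):
    """Collect the <title> text & the meta description from the head."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""
    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name") == "description":
                self.description = attrs.get("content", "")
    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False
    def handle_data(self, data):
        if self.in_title:
            self.title += data

reader = HeadReader()
reader.feed(page)
print(reader.title)        # Example Page
print(reader.description)  # A short page summary for search engines.
```

Search engine crawlers extract exactly this kind of head metadata when deciding how to title & describe a page in the SERPs.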
A homepage is the web page on a website that operates as the primary starting point of that website. It is the default web page that you load when you visit a web address domain name. For example, visiting l33roy.com will take you to my homepage.
The home page of a website often includes a contents section & a navigation bar with links to the other pages of the website - it is essentially a website's hub. There isn't a standard home page layout, but many include navigation tools such as a search bar & informational content about the website & business. The home page is located in the root directory of a website.
A homepage should, very quickly, explain what the company is, what it sells, & how people can get in touch with the company to enquire, ask questions or request a quote. The homepage sets the tone for the rest of the website & should be regularly updated to ensure company information is always correct.
A hyperlink is a clickable element that takes the user to another location, whether that's a page on the same website, a different website or a section within a document. It can be in the form of an icon, graphic or piece of text and, once clicked, will take you to the destination it points to. Text hyperlinks are usually blue & underlined, & when you hover over one your cursor will change from an arrow to a hand. Hyperlinks are found all over websites but can also be used in PDF documents & other similar pieces of content to allow people to jump to different places quickly & easily. Essentially, hyperlinks allow people to move across the web at super-fast speed.
An IP address (or internet protocol address) is a series of numbers that identifies the location of a server or device, such as the one hosting a website, on the internet. IP addresses often have a domain name assigned to them, as a name is a much easier address for humans to remember than a string of numbers. But the IP address is still the main way in which the internet & your browser locate a website.
IP addresses can be either dedicated, where a website has its own unique address, or shared. Shared IP addresses are used when several websites share an address on a server.
While your internet protocol address is not known to be a ranking factor, it can affect site performance. For example, a dedicated IP address can bring an increase in site speed which, in turn, is a Google ranking signal.
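Python's standard `ipaddress` module can be used to inspect the "string of numbers" itself. The addresses below are purely illustrative: one public-range IPv4 address & one private (LAN-only) address of the kind that would never host a public website:

```python
import ipaddress

# Hypothetical addresses for illustration only
public = ipaddress.ip_address("93.184.216.34")  # a public-range IPv4 address
local = ipaddress.ip_address("192.168.0.1")     # a private LAN address

print(public.version)     # 4 (IPv4, as opposed to the longer IPv6 format)
print(public.is_private)  # False - reachable on the public internet
print(local.is_private)   # True - only valid on a local network
```

DNS is the system that maps the human-friendly domain name onto an address like these.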
A feature that can usually show up to five images & allows site creators to display a series of featured images. A similar feature is used on social media ads where 'tiles' can be used as a swipe-to-view advert. These are also in use quite regularly on websites as they can display a series of images without covering large sections of the webpage.
These can be extremely helpful when a site creator would like to display images on a webpage whilst still having plenty of room for text. However, their effectiveness has been challenged by some, as claims have suggested that they produce low conversion rates & fail to engage customers in the same way other content does.
An inbound link - or a backlink - is a link to a web page from another website. It's a term used to differentiate between links coming from other websites & the internal linking on your own website. Every such link has two sides: it is an outbound (external) link for the site giving it, & an inbound link for the site receiving it. For example, if LJ Digital were to link to your website, that would be considered an external link for us, but an inbound link on your side.
Commonly known as backlinks, inbound link building is a common & powerful SEO strategy that can have a great impact on your search rankings. Building a strong link network & securing high authority inbound links is an important search engine ranking factor & is a big part of most SEO strategies.
A report of the URLs you own & the status they currently occupy in Google. All of your URLs will be listed & grouped by the status & the reason for that status. For example, if there is an error status, the reason for the issues will explain why the URL has been flagged as an error. The index coverage report is the best way for a site owner to see the collective status of how their site is performing.
Although having 100% coverage may sound like a positive thing, it means that every single page on your site is being indexed. While this may be okay for sites with a small number of pages, for others it may mean that pages such as order confirmations are indexed, which can damage your site's ranking as they likely won't be optimised for the relevant keywords.
Pages of a website that have been crawled by a search engine & indexed as part of its database. Pages can be indexed upon request, when the website owner asks the search engine to crawl the site, or naturally, as the search engine finds the page through high-quality & relevant links.
By having its pages indexed, a website can improve its domain authority & be officially recognised by a search engine. This increases the chances of your web pages showing up in a Google search & as a result, drives a higher amount of organic traffic to your website. If your website is not indexed, it's likely due to it being new as Google has to make its way through millions of domains. Alternatively, if the pages still aren't indexed after a long period, the issue could lie with the structure of your sitemap.
Indexing in the context of SEO is the process whereby Google's bots crawl your new website, page, or blog, & add it to the search engine's index so it can appear in results for your chosen keywords. The indexing process involves Google understanding what your page is about (which is why linking & ontology are both so important) & rewarding it with a ranking. As you add more content & information to your page, your rankings can improve as Google rewards you for providing users with more information & context about a topic than your competitors.
Infographics are ways to visually represent data - such as statistics - or any other information or knowledge that needs to be seen & understood clearly & quickly. Several types of infographics can be created for various purposes, such as statistical, geographic, process, or informational infographics. As humans, we can visually digest patterns & trends, & infographics make use of this skill.
Infographics are a great way to improve your SEO. They can be used as an effective digital marketing tool & can play an influential role in increasing brand awareness, especially if the purpose of your infographic is to share information or data about your business, products or services. By increasing brand awareness & generating interest amongst your audience, they can increase your web traffic.
Because infographics are easy to understand & engaging when designed correctly, they are also a great piece of shareable content. By getting your infographic shared & published across different platforms or publications, & linking back to your site, you'll increase your reach, your exposure & your credibility. Ultimately, when designed & shared effectively, infographics have the potential to be a key marketing tool that, when used alongside other web elements, can increase your ranking in search results.
Information architecture (IA) in SEO is the site structure & overall hierarchy of a website's various web pages. The information architecture is a focus on organising & labelling a site's web pages & content into an efficient, coherent, clear & effective structure.
Information architecture is important because not only does it make your website more navigable & user friendly, but it also enables search engines to more easily understand, index & rank your content by improving crawlability. The goal is to help both users & search bots find the information they need more easily.
Setting up a strong site structure & information architecture requires understanding the hierarchy & importance of different web pages & pieces of content, allowing them to become part of a larger, more coherent picture. The main components of information architecture are its organisation, labelling, navigation & search systems.
A search conducted purely to gain information on a topic. A user enters a keyword or phrase into a search engine expecting informational results. By categorising queries into these different categories, search engines can determine which results to prioritise for each searcher.
Each different word in the query can be interpreted as a different driving point for results. This allows Google or whichever search engine you're using to provide multiple results which are all relevant. It is also able to prioritise certain pages or sites which it deems the most relevant to you.
When it comes to SEO, intent is how we refer to what users want when they enter a query into a search engine. It's the 'why?' behind a search term that helps us to understand what users need and, by extension, how best to give it to them.
Search intent, often categorised by SEOs as either informational, navigational, or transactional, gives us an insight into who is searching for what, & why. This, in turn, allows SEOs to adapt their offering to suit the audience, aligning content & optimising for specific search intent.
An internal link is a link that connects web pages on the same website. If the destination of a link is on the same website on which that link can be found, then it is considered an internal link.
Internal linking is used mainly for navigational purposes because it helps both visitors & crawlers (search bots) to more easily move around & understand your website & its page hierarchy. Internal linking keeps traffic on the same website & improves user experience, while also making your website easier to crawl & index. SEOs employ internal linking strategies to build a proper website structure & hierarchy.
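The internal/external distinction comes down to whether a link resolves to the same host as the page it sits on. As a minimal sketch (using l33roy.com from earlier in this glossary as a hypothetical base site):

```python
from urllib.parse import urljoin, urlparse

SITE = "https://l33roy.com"  # the hypothetical site whose links we classify

def is_internal(link, base=SITE):
    """A link is internal if it resolves to the same host as the base site."""
    resolved = urljoin(base + "/", link)  # resolves relative links like /about
    return urlparse(resolved).netloc == urlparse(base).netloc

print(is_internal("/blog/seo-glossary"))          # True  (relative link)
print(is_internal("https://l33roy.com/contact"))  # True  (same host)
print(is_internal("https://www.google.com"))      # False (external)
```

This is essentially the check a crawler performs to decide whether following a link keeps it on the same website.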
An ISP is a company that offers a range of services that allow users to access & use the internet. As such, they are commonly thought of as a gateway to everything that is on the internet. These companies can operate in many forms, such as a privately owned company or a non-profit organisation. ISPs provide internet access to businesses & consumers & can also offer other services such as domain registration & web hosting. ISPs have evolved significantly since the internet was founded. Access used to be gained using a dial-up connection, before moving to satellite, copper wire, fibre optics & other high-speed broadband technology.
This stands for key performance indicator & is generally a way of measuring the success of either an activity or employees within a workforce. It's not limited to this, however, & is commonly used in other areas of business, including websites. Setting a KPI for a website may require the site to achieve a certain number of click-throughs or sales within a set period. As a result, if this is not achieved, it indicates to the website owner that changes need to be made to improve the effectiveness of the website.
An essential part of any SEO strategy, keyword analysis is the process of searching for & analysing keywords that are relevant to your industry. It's so important because it gives you the jumping-off point to write content that targets these keywords, gets people to your site and, ultimately, increases sales. Finding relevant keywords that are of good quality, with high search volumes, is essential & is the start of any search marketing campaign. Broadly, there are three types of keywords - short, mid, & long tail. It's important to target a mix of these keywords as they are all searched by different people, with different intentions.
This rather striking term refers to an issue that is quite common across many sites. Keyword cannibalisation refers to what happens when a website has several pages that target the same keyword. When more than one page ranks for the same keyword, they diminish each other's authority & reduce click-through rates (CTR) & conversion rates because they are competing with each other; hence the cannibalisation part of the term.
To avoid this, businesses must create a clear sitemap to ensure that each page targets a specific keyword. A good example: if a company sold shoes & had a page which targeted white shoes, cannibalisation would occur if they then created a new page called 'More White Shoes'. As both pages are targeting the same term, 'white shoes', they would split the CTR & conversion rates the company receives for the traffic driven to those pages. Splitting traffic across two rankings may not sound bad, but it is much better to have one page with a 3rd place ranking than two pages ranking 6th & 7th.
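A simple way to audit for cannibalisation is to map each page to the keyword it targets & flag any keyword claimed by more than one page. A hypothetical sketch, using the shoe-shop example from this entry:

```python
from collections import defaultdict

# Hypothetical mapping of site pages to the keyword each one targets
pages = {
    "/white-shoes": "white shoes",
    "/more-white-shoes": "white shoes",  # cannibalises the page above
    "/black-shoes": "black shoes",
}

def find_cannibalisation(page_keywords):
    """Group pages by target keyword & return keywords with multiple pages."""
    by_keyword = defaultdict(list)
    for page, keyword in page_keywords.items():
        by_keyword[keyword].append(page)
    return {kw: urls for kw, urls in by_keyword.items() if len(urls) > 1}

print(find_cannibalisation(pages))
# {'white shoes': ['/white-shoes', '/more-white-shoes']}
```

In a real audit the mapping would come from a crawl of the site's title tags & target keywords rather than a hand-written dict.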
The term keyword density refers to how often a given search term or keyword appears in the content of a page. It is a metric calculated as a percentage - for example, if you have a 500-word blog & a keyword appears five times then the keyword density would be 1%. Sometimes known as keyword frequency, keyword density is a foundational aspect of SEO.
Keyword density as a metric is interesting because of the way it has changed over the years. Search engines & their algorithms have grown far more sophisticated - 'keyword stuffing', for example, used to be a viable SEO tactic, whereby some SEOs would force a high keyword density to influence SERP performance. However, some SEO experts believe that search engines now use high keyword density to identify spammy content, which can potentially lead to lower rankings. Some SEOs believe that keyword density hasn't been a meaningful factor for a number of years.
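The calculation behind the metric is simple: keyword occurrences divided by total word count, expressed as a percentage. A minimal sketch reproducing the 500-word, five-occurrence example from above:

```python
def keyword_density(text, keyword):
    """Occurrences of the keyword as a percentage of total words."""
    words = text.lower().split()
    total = len(words)
    kw = keyword.lower().split()
    n = len(kw)
    # Slide over the text so multi-word keywords are counted correctly
    count = sum(1 for i in range(total - n + 1) if words[i:i + n] == kw)
    return 100 * count / total if total else 0.0

# A synthetic 500-word text in which the keyword appears 5 times -> 1%
text = " ".join(["filler"] * 495 + ["keyword"] * 5)
print(keyword_density(text, "keyword"))  # 1.0
```

Real tools typically also count close variants of the keyword, which this toy version does not.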
This is a metric used to define the competition for a certain keyword. When keyword research is being carried out, there is software that will return this data & inform the user how many other sites & pieces of content are being optimised for the same keyword. The more sites that target a keyword, the higher its difficulty rating, & the harder it will be to beat the rankings of those existing sites.
Keyword difficulty is extremely valuable as it can help users determine which keywords they should optimise for. If the keyword someone is looking to target has an extremely high difficulty, they may opt for a similar keyword or keyword phrase with a lower difficulty, which is more likely to rank well than an attempt at the difficult keyword. Finding the right balance is key, though: these keywords usually have a high difficulty for a reason, which is that they have a high search volume & are the terms people most commonly search for.
This refers to the number of times a keyword is mentioned on a webpage or in a piece of content. The more times a keyword is mentioned, the higher the frequency. Frequency is closely linked to keyword density, which describes keyword occurrences as a proportion of a page's total word count. Getting the frequency right is key to a successful SEO strategy. A low frequency means your page won't rank for the keyword you're looking to target; too high & you run the risk of over-optimising your content. Try to use variations & synonyms of the keyword to keep things on track.
Keyword proximity is the term used to describe how close keywords are in a body of text. For example, let's say you sold black shoes; you'd naturally want to target the keyword "black shoes". The closer these two words are together in the text, the better the keyword proximity. "We sell black shoes" is better than "We sell shoes that are black". Good keyword proximity allows search engines to better understand the context of the page, & it is one of the easiest things people can do to support a successful SEO strategy.
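Proximity can be measured as the smallest number of words separating the two keyword terms (zero means they are adjacent). A toy sketch applied to the two example sentences above:

```python
def keyword_proximity(text, word_a, word_b):
    """Smallest number of words separating two keywords (0 = adjacent)."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    pos_a = [i for i, w in enumerate(words) if w == word_a]
    pos_b = [i for i, w in enumerate(words) if w == word_b]
    if not pos_a or not pos_b:
        return None  # one of the keywords is missing from the text
    return min(abs(a - b) - 1 for a in pos_a for b in pos_b)

print(keyword_proximity("We sell black shoes", "black", "shoes"))           # 0
print(keyword_proximity("We sell shoes that are black", "black", "shoes"))  # 2
```

The lower score for the first sentence reflects the better proximity described in the entry.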
One of the core & fundamental skills of any successful SEO is that of keyword research. Search engine optimisation can't live without keywords, & keyword research is the process by which SEOs determine what relevant words & phrases users might search for, & which of these are the best to be optimised for.
Keyword research isn't just about simply identifying what your customers are looking for, it's about finding out which terms have the most search volume & competition, which keywords will either be easy or difficult to rank for, & researching which keywords are going to drive the most traffic to your site, increase brand exposure, & provide your business with the most profit possible.
Keyword research is one of the foundations of any SEO campaign.
A website or software which uses an algorithm to determine which are the most popular keywords used on Google. The tool will be able to identify exact match keywords based on how many times they appear on relevant web pages & the density of the keywords. It will also be able to identify long-tail keywords, which include the keyword you suggested, but in a longer format such as a question. Reports based on these keywords will be able to identify how competitive they are & how much value they will have to your ranking position.
These tools are used to identify which keywords a piece of content should be optimised for to achieve the best ranking on Google.
Stemming refers to a search engine's ability to understand the different forms a word can take. Google in particular has used stemming in its algorithm for years. Stemming means you can use the word 'buy' in a keyword, & then use 'buying' & 'bought' in other contexts; search engines will understand that they all relate to the same root word. Stemming matters because it makes your content look more natural & reduces the need to crowbar in keywords exactly as they are written. You can afford to be more relaxed with this & still receive high rankings.
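The idea can be illustrated with a deliberately simple suffix-stripping stemmer. This is a toy for illustration only: real search engines use far more sophisticated stemming & lemmatisation, & an irregular form like 'bought' needs a dictionary lookup rather than a suffix rule, so the lookup table below is a hypothetical stand-in:

```python
SUFFIXES = ("ing", "ed", "es", "s")
IRREGULAR = {"bought": "buy"}  # hypothetical lookup for irregular forms

def stem(word):
    """Reduce a word to a crude root by stripping common suffixes."""
    word = word.lower()
    if word in IRREGULAR:
        return IRREGULAR[word]
    for suffix in SUFFIXES:
        # Only strip when a reasonable root (3+ letters) remains
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[:-len(suffix)]
    return word

print(stem("buying"))  # buy
print(stem("bought"))  # buy
print(stem("buy"))     # buy
```

Because 'buying', 'bought' & 'buy' all reduce to the same root, a query for one form can match content using any of them.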
You want to avoid this. Keyword stuffing refers to the overuse of a keyword in a way that doesn't read like natural conversation. Keyword stuffing used to be popular when algorithms rewarded it but, while it is still used by some in an attempt to gain an advantage, it is now largely punished. An example of this would be: "We absolutely love skips & skip hire so if you want to hire a skip in the UK contact us & hire your skip today". Natural content that flows nicely & includes one mention of your specific keyword will rank much higher than the above example, simply because it is nicer content for the user to read, which is what search engines care about.
Google's knowledge graph is a useful concept that benefits so many of us, yet not many people stop to understand how it actually works. Ever typed a quick query into Google & received a fast, concise & enlarged response? Go on, try it now. Search, for example, 'Toy Story release date'. What follows is a very specific, short answer that eliminates the need to continue researching for answers within the page results. A knowledge graph is a hub of information derived from different entities & the connections that are easily identified between them. This can include tangibles such as locations, organisations & individuals, as well as intangibles such as colours & emotions. The data stored on the web around certain topics helps Google to recognise the most common answers to search queries, allowing the search engine to collate the most relevant information for users. By implementing a strong combination of SEO practices such as link building & regularly adding substantial, informative, authoritative content, Google will begin to validate your content & connect it to other sources.
In short, a landing page is nothing other than the page on your website on which a visitor arrives after clicking a certain link or ad. It's called this because when people visit your website, they have to 'land' somewhere. Wherever they first land, that's technically a landing page. But in general, the term has come to mean something more specific.
The term landing page is perhaps more commonly used to refer to a specific standalone web page. This is a page designed with the sole purpose of capturing leads & generating conversions on your website. While Google Analytics uses the term landing page to refer to any page on which a user first visited a site, you'll almost always hear the term used for specifically made, standalone pages.
Most websites build landing pages to complement specific campaigns, whether it be for a special offer, new product, or anything else. Paid advertising & search results will often send users to specially made landing pages to encourage specific conversions.
Another key feature of a landing page is that, very often, users receive something in exchange for their information. This could be a guide to an industry-related topic or a piece of content that offers insights that can't be found for 'free' on the internet.
Landing pages go beyond standard content: while standard content aims to educate & entice potential customers, a landing page looks to secure the sale. Clarity, short copy & easy-to-fill-in forms are all key aspects of a successful landing page.
Link building is the process of increasing both the quality & quantity of backlinks - or inbound links - to your website to improve your SERP visibility & search ranking. Link building is a major tactic in most SEO campaigns and, alongside great content & web optimisation, has a great influence on your rankings.
The aim is to get other trusted, relevant & high-authority websites to link to your website, sending traffic in your direction & signalling to Google the relationship & trust between your websites. SEOs use a wide range of techniques for link building, from guest posting & digital PR to creating content that naturally earns links.
No SEO campaign is complete without a properly targeted link building strategy in place - it's one of the tenets of great SEO.
Also called 'Link Rot', 'Link Death' or 'Reference Rot', Link Decay is a term that describes the process by which a hyperlink no longer points to the original file, content, page or server that it was created to reference.
Link Decay can occur when the resource it links to is changed to a new address or has been made permanently unavailable by the domain owner. Link Decay can damage website authority because the content it is citing is no longer as authoritative as it once was.
This topic is researched & studied by people around the world, & because measuring the internet's ability to remember & preserve information is so difficult, estimates of decay rates differ dramatically. A good rule of thumb, though, is to ensure that the links you do use aren't to a personal website or more frivolous blog content. Another tip is to use WebCite to permanently archive information.
Diversity of links is a term that describes how varied the links that feature in your content are. The more naturally diverse your links, the more likely search engines are to reward you with higher rankings. Diversity can include the type of content you link to, such as blogs, videos & articles. It can also include the type of URL, such as '.co.uk', '.edu' & '.org'. Be diverse in your choice of anchor text too; this will ensure that links are more naturally placed, while still being relevant to the content they are linked from.
It goes without saying that building domain authority is what boosts a website's ranking. But, how can domain authority be built effectively? Earning links from other notable websites is one of the best ways to establish credibility. The higher the quality of links you have connecting from another site to your own, the greater the impact on increasing domain authority. Donation links are a particularly straightforward type of backlinking. By researching & identifying sites that happily accept links for donations, you can expect a link to appear shortly after enquiring. However, it is essential that you look at the authority of the donation site to evaluate how impactful the backlink will be for your website. Obtaining link donors can be pricey, so it's always important to explore all of your options to locate the right sites!
Often more commonly known as 'link juice' by SEOs, link equity is a term that describes the way in which a link can pass authority from one page to another. It's a known search engine ranking signal, which is why it's an important part of any SEO or link builder's job.
The value of link equity is determined by a few different factors, including the topical relevance of the link, the linking page's authority value, its HTTP status & much more. Essentially, if a page with high authority & good SEO links to another website, that link will pass high link equity.
Link equity is a known ranking factor that Google uses to determine a page's ranking, but it doesn't only count for external & backlinks. Internal links can also pass link equity, allowing SEOs to better control the flow & structure of authority through a site's structure.
Back in the good ol' days of the internet - around 1997 to the early 2000s - link exchange emails were at their highest in terms of volume. After a real surge of businesses finding themselves with a website, & reading blogs about how to promote themselves in search engines such as Excite, Yahoo & Google, they invariably would send out emails to other website owners, having learned that links were 'everything' when it came to online marketing success. Essentially, an email would be sent from one website manager to another, asking for a link to & from each other, so that they could both benefit. Over the years, Google would push out many messages saying that this tactic was not as effective as people thought it to be, & over time, these emails dropped to almost nil.
Another remnant from the very experimental & 'Wild West'-like late 90s-early 2000s were link farms. In many cases, these were single-page websites - or single pages of links on an otherwise normal website - that housed anywhere from tens to thousands of links to various websites, with zero thought of theming or rationale. Over time, these pages would become more palatable as Google changed its algorithms. Theming was an option for many such sites, while others just soldiered on until their impact was negligible or even detrimental. Thankfully, they don't exist anywhere near as much as they used to. However, they do live on in the form of PBNs (private blog networks), which look a lot more palatable to the human & robot eye - although great work is being done by Google to find these sites & rid them of their positive SEO impact. At the time of writing this entry, at the end of 2021, link farms are very few & far between, & PBNs have nowhere near the impact that they used to have.
A link graph is a visual representation of the surrounding network of a website or specific URL. In other words, this graph is a map that pinpoints every backlink that connects to the central webpage in question. A link graph is used to collect useful data about how the authority of a domain may be increasing or decreasing - as the map demonstrates the quantity & quality of the domains that link to the focal URL. By analysing the relationships within the dataset, a business can accurately audit its website & keep track of the backlinks obtained.
Link hoarding is when a website solely builds inbound links & neglects outbound links. Google & its algorithms read websites from a neutral perspective, meaning that they understand the higher value in links that are interconnected. While some think that building only inbound links will protect their link value - as outbound links can reduce it - this isn't the case.
If your content is referenced as equally as it references other content & websites, Google will read this as a much more relevant source. Link hoarding can lead to Google flagging your site as spam & penalising it as a result. Understanding that link building is a two-way street & ensuring your site isn't built solely on inbound links is important to maintaining a good ranking on Google.
This is a term that refers to linking to a page from multiple pages, but with different link text each time. This link text needs to be loosely related to the page it is linking to, & it's a great way of helping Google understand the target page more, because you're showing that other words are relevant to the page you're linking to. Getting, for example, nine or so relevant keywords across just four links will help Google understand the site more.
Spamming links involves posting out-of-context links on comment boards, forums, websites & blogs that aren't relevant to where the link goes. The goal of link spamming is to increase the number of external links that a page has which, in theory, will boost its authority. Link spam is frowned upon in the SEO world because it's not providing any value & is a bit of a cheap trick. Also, because search engines are smarter than ever, the rewards for this practice aren't as great as they once were. A long-term approach that is genuine, & looks to provide value will organically boost your authority over time.
The speed at which your website is being linked to by other sites. This accounts for both other users building these links & you pointing links back to your own site. Some think that achieving a large number of links in a short period - & therefore a high link velocity - will lead to them being penalised by Google, but this isn't the case. Google won't penalise you for having a high link velocity, as it only counts the quality of your links.
That being said, the chances are if your website is receiving a large number of links in quick succession, they're likely to not be natural links & this will harm the ranking of your website. Low link velocity, i.e. building links over a long period of time, generally constitutes a better quality of link & ensures they are natural which is much healthier from an SEO perspective.
For many small & medium businesses, targeting the correct audience is paramount for gaining revenue & custom. One of the main targets that businesses tend to tackle is their local area, as this is where the majority of their customers will travel from. More often than not, a person looking for a service within close proximity will add their location to the search term. Let's say, for example, Julie in Manchester wants to find a local hairdresser. Julie will then go on to search for 'hairdressers in Manchester'. This is where SEO plays a major role in meeting the search criteria with relevant onsite content, helping to guide this user to the right webpage for the services they wish to receive. Simple? Yes. Effective? Absolutely! Successful local SEO marketing combines locally relevant onsite content with consistent business details (name, address & phone number) & an up-to-date Google Business listing.
Whilst long-form content can seem self-explanatory, there is a lot of debate around the minimum length of long-form content. An effective piece of long-form content would usually be around 1500-2000 words at a minimum. Whilst longer in length, this type of content should still serve a clear purpose & remain engaging for the target audience. The upper limit to the amount of content on a page depends on numerous factors such as the purpose of the content, the target audience & the topic.
Pages that rank well on Google for certain keywords & ontological phrases tend to be long-form pieces of content that are written well & have plenty of links within them. To write this type of content, you must think critically & conduct thorough research into the subject you are writing about. Long-form content also gives you plenty of room to write a piece that is well-written, with keywords & ontological phrases subtly woven through - keywords should not be jammed into the piece simply to fulfil SEO goals, as this tactic will not create long-term success.
The best content for SEO & a good ranking on Google will be well-researched - to show the search engine & the user the knowledge that you have on the subject - & will use best practices when it comes to incorporating keywords & any relevant ontological phrases. Using research tools, such as Frase, AnswerThePublic & Ahrefs, to investigate the search volume of keywords, high-ranking sites in the relevant SERPs & frequently asked questions can help to inform your long-form content. Above all, long-form content will establish your brand as authoritative & credible, which will be rewarded in the SERPs & in leads & conversions.
A keyword that spans more than a single word. When used in written content, long-tail keywords are generally more specific & can be highly valuable if used correctly. They're used as a more targeted approach to customers, including extra words to make it tailored to a more specific audience. For example, if the keyword is "shoes" & you want to turn this into a long-tail keyword to target a more specific audience, it could become "men's smart shoes". Just by adding those extra words, it's already targeting a much smaller group of people, therefore making the chance of purchase higher.
Long-tail keywords are best used as part of a question or search phrase as the longer nature of the keyword will account for more of the question. They're also more effective when used in voice search terms.
If a human reviewer working for Google thinks that a web page or website doesn't adhere to Google's quality guidelines, then they may invoke a 'manual action', which would mean that a web page or a site as a whole will see lower rankings - or even be removed from Google's index altogether. In many cases, this will happen without any prior warning. Google does not like its index to be manipulated, & a manual action is the most fierce way for them to show this.
If you are unfortunate enough to have been given a penalty as the result of a manual action, you will most likely find a notification about it - though not necessarily how to resolve it - in Google Search Console.
A mega menu is an expandable menu on a website. These are most commonly displayed as a dropdown, showing lower-level site pages to the user. When a mega menu is made effectively, it can improve the contextuality of your website, which is beneficial for SEO. With mega menus, an effective tactic is to move the menu HTML to the bottom of the HTML document. This makes your web pages more contextual - the H1 of every page, along with the first paragraph, sits closer to the top of the HTML document. However, the user will notice no difference, & their user experience will not be impacted.
Whilst having an effective mega menu is important, it should not be your only focus regarding links. You should also ensure that you interlink your pages with hyperlinks, situated within paragraphs in your webpage. This helps Google to understand your site better - with greater contextuality - which is beneficial for SEO.
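As a rough sketch of the tactic described above (all class names & URLs here are invented for illustration), the menu markup sits last in the source while the page's unique content comes first:

```html
<body>
  <!-- The main content comes first in the source, so the H1 & opening
       paragraph sit near the top of the HTML document -->
  <main>
    <h1>Men's Smart Shoes</h1>
    <p>Opening paragraph of contextual content...</p>
  </main>

  <!-- The mega menu is placed last in the source; CSS (not shown) would
       position it visually at the top of the page, so users notice
       no difference -->
  <nav class="mega-menu">
    <ul>
      <li><a href="/mens-shoes/">Men's Shoes</a>
        <ul>
          <li><a href="/mens-shoes/smart/">Smart</a></li>
          <li><a href="/mens-shoes/casual/">Casual</a></li>
        </ul>
      </li>
    </ul>
  </nav>
</body>
```

A rule along the lines of `.mega-menu { position: absolute; top: 0; }` would keep the rendered layout identical while crawlers encounter the unique page content first.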
A meta description is a short snippet of content that is found under the title of a piece of content that is indexed on Google. It aims to summarise the content that is available on that link, to allow the reader to decide whether or not they want to click on that link.
Meta descriptions are a perfect opportunity to entice potential readers to your site, & they give you a chance to incorporate your target keywords into another aspect of your content. This further boosts the SEO performance of the content & gives you a greater chance of achieving a higher ranking.
Think of meta descriptions as an advertisement for your content, & write this short piece of copy as such: make it compelling, exciting & readable - remember, you only have seconds to attract someone who is scrolling through page one. If you don't write one, Google & other search engines will automatically pull through a sentence of text from the content, though this is nowhere near as impactful as writing your own.
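For illustration (the store name & copy below are invented), a meta description sits in the page's head alongside the title tag:

```html
<head>
  <title>Men's Smart Shoes | Example Store</title>
  <!-- Commonly kept to roughly 150-160 characters so it isn't
       truncated in the SERPs -->
  <meta name="description" content="Browse our range of men's smart shoes, with free UK delivery & easy returns. Find your perfect pair today.">
</head>
```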
Meta tags provide information that can be read by search engines & web crawlers. The information pertains to the webpage, & is found in the HTML of the document. Search engines will use meta tags to retrieve & understand information about the webpage, which they can then use for numerous things - for example, to determine rankings, & to display search results snippets. Whilst it should be standard practice to have certain meta tags on every page, some don't always have to be used. Examples of meta tags that don't always need to be used include social meta tags, robots, language, geo, refresh, & site verification tags. On the other hand, for good SEO practice, you should always include meta content type, title, meta description, & viewport. There are so many different types of meta tags that some are now deemed unnecessary by SEOs far & wide. For example, many say that the following meta tags serve very little purpose & are a waste of space, even in the eyes of Google - author, rating, expiration, copyright, abstract, cache-control, distribution, generator, & resource type.
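A minimal sketch of the 'always include' tags mentioned above might look like this (all values are placeholders):

```html
<head>
  <!-- Meta content type: tells the browser how to decode the page -->
  <meta charset="UTF-8">
  <!-- Viewport: required for sensible rendering on mobile devices -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <!-- Title: shown in the SERPs & the browser tab -->
  <title>Example Page Title</title>
  <!-- Meta description: the snippet shown under the title in the SERPs -->
  <meta name="description" content="A short summary of what this page offers.">
  <!-- Robots: optional - only needed for non-default crawler behaviour -->
  <meta name="robots" content="index, follow">
</head>
```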
Put simply, metadata is the data that informs search engines what exactly your website is about. Metadata gives descriptive information about a website & its content. Important examples of metadata include title tags, meta descriptions, & robots, but metadata can be found all over a website.
Metadata optimization is an important part of technical SEO & improving the nuts & bolts of your website. Like a lot of SEO, it's all about making life easier & more clear for Google, making your website more navigable & quicker to crawl, index & understand.
A metric is the quantifiable measure of one piece of data. For example, 'page views' is a metric of how many people viewed a particular page on a website. Metrics are singular, specific & extremely clear. Businesses use metrics to measure success & will set predetermined benchmarks that define said success. There are hundreds of metrics in the world of SEO, & it can be hard to find the most relevant ones to use. It's important to note that your chosen metrics will all depend on your business, products/services, industry & sector. With that in mind, it's worth narrowing your reporting down to the handful of metrics that best reflect your own goals.
Microblogging involves writing concise, short content, often for platforms that are specifically designed to publish & share this type of content, such as Twitter & LinkedIn. Links, images & videos can also accompany the text on a microblog, to maximise audience engagement & interaction. The content & the file size of a microblog is typically much smaller than a standard blog, & of course long-form content.
Whilst people often enjoy consuming these shorter snippets of content as opposed to a lengthy blog post, long-form content is still the key to ranking high on Google & generating more web traffic to your blog post, & your site overall. That's why microblogging usually takes place on platforms that are designed for this type of content, such as Twitter, Facebook, Instagram & LinkedIn. In other words, microblogging should not replace long-form content. Instead, it should act as a different tool & type of content to use within your digital marketing strategy - it can be well implemented into a social media marketing strategy. Microblogging is particularly effective when accompanied by a link to a long-form piece of content that has had its core messages taken & used in a microblog to promote the piece.
The purpose of micromarking is to help search engines quickly find, & understand, the content on your website. Introducing micromarking to your website involves the use of tags & attributes to structure information - attributes such as itemscope & itemprop added to everyday HTML tags such as <div> & <span>, amongst others. Structured data is usually implemented on a website in two formats - Microdata & JSON-LD - using the Schema.org vocabulary. So, how exactly does micromarking affect SEO & the promotion of a website in the search engine results pages? Well ultimately, it can affect website promotion, although this doesn't happen directly. By using microdata, the site can become more attractive to the search engine, which is then reflected in the SERPs. Having a higher position in the SERPs can increase the click-through rate of the snippet.
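To make this concrete, a JSON-LD snippet using the Schema.org vocabulary might describe a local business like this (all business details are fictional):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Restaurant",
  "name": "Example Sushi Bar",
  "servesCuisine": "Japanese",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Manchester",
    "addressCountry": "GB"
  },
  "telephone": "+44 161 000 0000"
}
</script>
```

The script block sits anywhere in the page's HTML & is invisible to users; search engines read it to understand what the page describes.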
A Mobile Speed Update is an update performed by search engines that factors in mobile page speed when deciding on page rankings. This update punishes pages that are slow to load on a mobile device & promotes websites that perform quickly on a smartphone.
Search engines have a vested interest in providing users with a seamless experience, therefore, it makes sense that they prioritise websites that not only have optimised, well-written content, but pages that load quickly, thus reducing the amount of time that the customer is waiting.
Thankfully, there are plenty of tools out there that can test your site speed so that you can ensure your content will load in good time, & not affect your ranking. A good tip is to test your site on the weakest signal (i.e. 3G), because, by making sure your site runs well on the worst signal, you can be pretty confident that it'll be as good, if not better, on other signals such as 4G, 5G, & WiFi.
This stands for name, address & phone number. Using NAP on your web pages is important if you want to rank for locally-based searches. Regularly stating your name, address & phone number on your website will build your presence in the location in which you operate. Typically, your NAP can be added to the footer of your website, which means it will appear on every webpage & boost the number of times it is mentioned.
By ensuring this information is readily available & placed in important places on your website, you're reinforcing your website's local identity. Making sure this is kept consistent throughout your site will give you the best authority, which means if you have several business numbers or business names, it's best practice to use the same ones throughout. Setting up a Google business page & adding your location is vital if you want to rank for searches in your location.
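A simple, hypothetical footer implementing this might look like the following (the business details are invented):

```html
<footer>
  <!-- NAP block: because the footer renders on every page, the same
       name, address & phone number appear consistently site-wide -->
  <p>
    Example Sushi Bar<br>
    1 Example Street, Manchester, M1 1AA<br>
    Tel: <a href="tel:+441610000000">0161 000 0000</a>
  </p>
</footer>
```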
These occur when other websites, blogs & online content links back to your website because it's relevant to their topic. This assures you that your content is of good quality as it's being referenced by other sites & will help improve your ranking when Google crawls your site. Typically, this is the best & safest form of linking as it drives traffic directly to your website & any changes to Google's algorithm will leave your site unaffected.
The more natural links your website gets, the better the ranking will be & the higher your site will appear on Google. Once this happens, more people will see your content & likely link to it on their site, which further improves your site's ranking without you having to do any link building yourself. When other highly recognised blogs or websites link to your site, this massively improves your ranking.
This is a query which is searched with the intent of finding a specific page or website, unlike an informational query which is a more general search to find any information relating to the topic. For example, if you search for a brand name, the first result will likely be that brand's website & will be classed as a navigational query, whereas a question would be informational, as you're looking for an answer.
These types of queries are difficult to rank for, as Google understands exactly what type of search this is & will only bring up relevant results. If you're searching for a specific brand, Google won't provide you with similar results as the first choice, because it knows you're looking for one specific website. Intercepting the user's search journey with this type of search is near impossible.
The nesting level of a webpage refers to its location within the hierarchy of the website, in accordance with the main page. The nesting level is referred to as a number, such as 1 or 2. As an example, the main page of a website would have a nesting level of 1. Consider this main page to act as a trunk for the rest of the webpages to branch off. These branches might be documents with a nesting level of 2. Then, branches that extend from these branches have a nesting level of 3, & so on. When adjusting & optimising nesting levels, you should always remember that the furthest documents should be located a maximum of 2 clicks away from the main page of a website. By putting a resource further away, you are essentially hiding its information - the search engine will deem it too complicated, & this will be reflected in the search results.
By understanding the concept of network science, you can also gain a better understanding of how to create a successful website that is search engine optimised. Networks are all around us - there are computer networks, social networks, the Internet & even genetic networks. Network science, as an academic field, is the study of these networks & the connections between several elements.
When you have an understanding of network science, you will begin to understand why websites rank highly, & why others do not. You'll also know what it takes to make a site rank well & the steps you need to take, as your knowledge of network science will help you in your thought process of why things are the way they are.
Most links on a website are 'follow' links - this means that they are links that search engines are intended to follow when indexing your website. However, in instances where you do not want a search engine bot to follow a particular link, you can choose to add a 'nofollow' tag.
Nofollow links allow users to click through & use the link but prevent crawlers from following & indexing them. If you want to send users to a particular website but, for whatever reason, do not want to provide that website with any SEO buff - or link juice, as it is sometimes known - a 'nofollow' attribute will accomplish this.
This prevents the passing of domain authority to another website via a backlink if, for example, it is a paid link. The 'nofollow' meta tag is a great way of controlling your linking & the way that your website passes authority & link juice onto other websites without restricting your linking for user experience & value.
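In HTML, the difference is a single rel attribute (the URLs below are placeholders):

```html
<!-- A standard link: crawlers may follow it & pass link equity -->
<a href="https://example.com/partner">Our partner</a>

<!-- A nofollow link: users can still click through, but crawlers are
     asked not to follow it or pass authority -->
<a href="https://example.com/sponsor" rel="nofollow">Sponsored link</a>

<!-- Page-level version: asks crawlers not to follow any link on the page -->
<meta name="robots" content="nofollow">
```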
An HTML tag that requests search engines not to index a page & to remove it from search engine results pages. This can be used when pages need to exist because they are crucial to your website but, due to their nature, could harm your website's ranking if they were indexed by Google. Examples of these pages would be the login page to your website or pages which thank a user for subscribing.
If you don't have direct access to your server, including this tag in the HTML of your webpage is the best way to ensure Google doesn't index it. Inserting the following tag into the head of your webpage will prevent most search engines from indexing your page: <meta name="robots" content="noindex">. To prevent only Google from indexing your page, use <meta name="googlebot" content="noindex">.
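Putting that together, a 'thank you' page (hypothetical example) might carry the tag like this:

```html
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="UTF-8">
    <title>Thanks for subscribing</title>
    <!-- Ask search engines not to index this page -->
    <meta name="robots" content="noindex">
  </head>
  <body>
    <h1>Thanks for subscribing!</h1>
  </body>
</html>
```

Note that a crawler must still be able to fetch the page to see the tag, so don't also block the page in robots.txt.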
This is a collective term that describes all the work done to improve a website's SEO performance away from the website itself. At the heart of off-page SEO is backlinking, which is the process of placing links to your site on other people's websites. Getting your website URL on other people's websites will help boost your authority, especially if those websites themselves have high authority. For instance, your off-page SEO would improve if you got a link to your site from a university website or charity website. These types of '.org' & '.ac.uk' domains are valued very highly by search engines.
On-page SEO is the opposite of off-page SEO & is a term that characterises everything done on your website's individual pages to improve their ranking in the SERPs. On-page refers to the content of the page as well as the HTML source code. Ways to improve your on-page SEO include adding keywords to your content, adding in links & creating a good header structure. On-page SEO aims to improve your domain score, which is measured from 1 to 100, with 100 being the best & 1 being the worst.
Ontology for SEO purposes is an extremely important factor, as it can help to increase your ranking & the position of your site on search engine result pages (SERPs). To rank high on Google, a common process involves incorporating keywords into your copy, so that Google recognises these terms & considers your site to be worthy of a high ranking on the SERPs. However, as with most aspects of digital marketing, things are ever-changing when it comes to the way content is ranked. Now, ontology is more important than ever.
According to the Oxford English Dictionary, ontology is 'a set of concepts & categories in a subject area or domain that shows their properties & the relations between them'. For your content marketing, this means that you can produce content that shows your knowledge about topics & concepts outside of simply the basics. You can show how these link together & relate to each other, & the main topic of discussion.
To use ontology to your advantage, you need to change the way you think about your keywords. Gathering keywords with a high search volume to include in your content should now be considered a starting point, rather than the only step in the process. These keywords should be a foundation on which you build your content or your knowledge. From this foundation, you should then find related ontological phrases to use within your content too. Using ontological phrases effectively will show your in-depth understanding & knowledge of the subject you are writing about. This will then demonstrate, to both the search engine & your audience, that you are best placed to offer the products, services or advice.
Open Graph is a type of meta tag - it's a snippet of text that communicates the content on a page to social media platforms. Taking Facebook as an example - Open Graph allows integration of your site & Facebook, & communicates what content should show up when one of your pages is shared on Facebook. Whilst many people argue that Open Graph tags don't directly affect your on-page SEO, they're still worth using as they can influence the performance of the links used on social media. People also argue that, with Open Graph tags, users are more likely to see & click shared content when the tags are optimised. As a result, you gain more traffic from social media to your website. There are usually three reasons for this: the tags instantly tell users what the content is about, the content is considered more eye-catching on social media feeds, & they also help Facebook understand what your content is about. The latter can help to boost your brand visibility through search.
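A typical set of Open Graph tags sits in the page's head (all values below are placeholders):

```html
<head>
  <meta property="og:title" content="Example Article Title">
  <meta property="og:description" content="A one-line summary shown when the page is shared.">
  <meta property="og:image" content="https://example.com/images/preview.jpg">
  <meta property="og:url" content="https://example.com/articles/example">
  <meta property="og:type" content="article">
</head>
```

When the URL is shared, the platform reads these tags to build the preview card - the title, description & image shown in the feed.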
Organic search results are those that appear on the SERPs without having been paid for. These natural & organic results are the result of Google indexing & ranking your page based on its content quality & relevancy against any given search term. If, for example, you implement a long term & dedicated SEO campaign for your website, you may eventually organically rank at number one on the SERP for a targeted search term without having to pay for advertising.
In turn, the traffic that visits your website through these organic rankings is known as organic traffic - traffic that has found & visited your website through making a search & finding your website on the SERP. This is essentially free traffic, as opposed to the traffic you gain through paid advertising.
This is a term that refers to the practice of applying too many SEO techniques to your pages or site. Over-optimisation can take many forms; the most common is overusing a keyword. Back in the day, before Google got wise to this issue, companies could stuff their websites with tonnes of keywords & be rewarded for it.
Nowadays, Google is much smarter & severely punishes businesses that 'keyword stuff'; not only is it bad practice, but it also makes the user experience much worse. Another way businesses over-optimise is by pointing all their internal & external links at what would be considered 'obvious' navigation pages that are top-level & apparent. Google rewards hard work; therefore, businesses should look to link to pages found deep inside the website, as this shows you've worked hard to link to specific pages.
Linking to toxic sites, & trying to rank for keywords that aren't relevant to your business, will also be punished by Google.
A Portable Document Format (or PDF, as it's commonly called) is a type of file format that was developed by Adobe in 1993. It is used to capture & send electronic documents in an intended format. A key feature of PDFs is that they display a document in the exact way the user wishes it to appear, no matter what device it is being viewed on. For regular Word documents, a PDF may not be needed. What they are good for is larger documents such as articles, product brochures & academic papers. PDFs can be interacted with, & users may zoom in on specific parts of them if they wish to.
Pay Per Click (or more commonly PPC) marketing is a form of paid advertising used to show your ads across the search results & Google's display network. This is a marketing model in which adverts are optimised & paid for, most often bidding on specific keywords & search terms to target relevant traffic to be sent to a specific resource, whether a product page, landing page, or anything else.
PPC is a form of SEM (search engine marketing), often working alongside SEO as a paid and, arguably, more targeted alternative. The cost of the ads varies according to a number of factors, such as relevance, competition, & account history, & campaigns can be set for a number of different criteria. If you want ads to burn through your budget for as many results as possible, you can do that. & if you want your budget to carefully deliver a greater return on ad spend, you can do that too.
Combining SEO & PPC is a powerful marketing tool, providing greater clicks, conversions, & increasing your real estate in the search results. There are a few different platforms through which pay per click marketing can be done, including:
Google Ads - Google's PPC ad platform & the most commonly used on the web.
Microsoft Advertising - Microsoft's alternative PPC ad platform.
Yahoo!'s PPC ad platform.
Page Authority is a score developed by Moz that indicates how well a page is likely to rank once it has been indexed & placed in the search engine results page (SERP). Page authority is based on a 1-100 scale (with 100 being the highest & 1 the lowest). Page authority isn't linear, which means that getting from a score of 70 to 80 is much, much harder than getting from 30 to 40. This is because the processes & work needed to increase the score, when you're already in the top end, are more complex, take longer, & require specific knowledge.
A great way of boosting your score is by getting your website link on a well-trusted website (such as a Government website or a '.org' domain). This shows that you're well trusted & deserving of a good score because an established site deems you authoritative.
Google Panda was initially released in 2011 & was originally known as 'Farmer'. The algorithm update's purpose was to get rid of low-quality websites in the organic search engine results, & instead reward websites that could be considered 'high quality'. Whilst originally rolled out separately, Panda was later incorporated into Google's core algorithm. The problems that Panda was designed to address include duplicate content, thin content, sites that lack trustworthiness & authority, content farming (low-quality pages, often aggregated from other sites), content that did not match the search query, & a high ad-to-content ratio, amongst plenty of others.
Parsing is a form of automation that involves gathering & extracting information from online resources, such as a website. The information/content is in the form of HTML code, & the results are added to a database. Search bots are an example of parsers: the data they gather is analysed & stored in a database, & the relevant documents are then displayed when a user searches. Parsing happens in three phases - the content is retrieved in its original form, the data is then extracted & transformed, & the result is generated.
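As a minimal sketch of this idea, the Python snippet below uses the standard library's html.parser module to pull every link out of a fragment of HTML, much as a search bot would during the extraction phase (the fragment & its URLs are made-up examples):

```python
from html.parser import HTMLParser

class LinkParser(HTMLParser):
    """Collects the href of every <a> tag encountered while parsing."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Feed a (made-up) HTML fragment, as a bot might after fetching a page
parser = LinkParser()
parser.feed('<p>See <a href="/about">about</a> & <a href="/contact">contact</a>.</p>')
print(parser.links)  # ['/about', '/contact']
```

A real crawler would fetch pages over HTTP & store the extracted data, but the retrieve-extract-generate shape of the three phases is the same.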
A Google penalty is the search engine's way of punishing websites for errant behaviour. This penalty can take the form of being delisted for a particular keyword or having your ranking drop significantly to the point where audiences can't find you. A Google penalty can affect any website & can even be handed out as a result of well-intentioned efforts to improve a site's performance. The reasons why penalties are handed out are shrouded in mystery, much like the algorithm itself. Reasons aside, Google penalties are to be avoided at all costs as they are hard to shake off.
Shortly following Google Panda was Google Penguin - another algorithm from the search engine designed to reward high-quality websites & content & limit the number of low-quality websites, such as those engaging in keyword stuffing & using manipulative link schemes. After its release, Penguin went through ten updates before becoming part of Google's core algorithm in early 2017. As mentioned, the primary purpose of Penguin was to diminish the presence of websites using keyword stuffing & link schemes. Keyword stuffing involves adding large quantities of keywords to a webpage in an attempt to manipulate your ranking position. As for link schemes, these refer to the purchase, development or acquisition of backlinks from sites that could be deemed irrelevant/unrelated & low-quality. This ultimately paints a false picture of relevance & popularity to try & gain a higher ranking.
This is a modern feature that Google uses to provide the most relevant search results for its users. These boxes consist of questions that are relevant to your search query as Google attempts to predict your next move to save you time. The inclusion of this feature allows Google to provide you with more information on a particular topic than simply a list of web pages.
Another factor that makes this feature so popular is the inclusion of a snippet from the accompanying website. Once expanded, the box will provide a relevant snippet of information taken from the website to provide the user with the answer to the question without them having to click through.
The popularity of this feature has increased dramatically since its introduction, giving businesses the chance to rank on page 1 for search terms for which they may otherwise have been further down the pecking order.
This is when standard organic SEO results are overridden in favour of other results which Google has deemed more relevant to them based on recent searches. Google is continuously gathering data on us as we use the search engine & can use this to provide us with incredibly accurate results. This feature makes keyword optimisation & rank tracking more difficult & can create uncertainty as to where exactly your site ranks.
For example, if one user was to search for 'football boots' & follow that up with 'socks', there may be sock brands that would ordinarily rank first, but Google may pick a brand that specifically sells football socks & make that your first result. This isn't Google changing that result for everyone, though - the hard SEO work the sock brand has done to gain that top ranking won't be undone; this is simply a personalised result for you.
Used to build websites, apps & software, code is what tells the website how to operate & look. Everything on the internet essentially boils down to lines & lines of code. In terms of SEO, certain coding techniques can be used to help strengthen the ranking of a website. Writing or rewriting code with SEO in mind allows Google to read & index your content more easily. This is done through the use of key phrases in the markup, which Google will then read as relevant to the topics covered on your website.
In addition to links on your website, this can help give your website a boost in its current ranking. Though links can do a lot of the work in terms of achieving a good ranking from Google, coding which has been SEO optimised will put further emphasis on the relevance of your site.
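As a rough illustration of the kind of markup Google can read, the fragment below shows a page whose title, meta description & heading all carry the same key phrase (the store name, phrase & copy are entirely hypothetical):

```html
<!-- Hypothetical example: "children's football boots" is the target key phrase -->
<head>
  <title>Children's Football Boots | Example Store</title>
  <meta name="description" content="Browse our range of children's football boots in all sizes.">
</head>
<body>
  <h1>Children's Football Boots</h1>
  <p>Our range of football boots for children covers every size &amp; style...</p>
</body>
```

Consistent, descriptive markup like this helps crawlers connect the page to the topics it covers.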
Pop-ups are windows that appear, without prompt, when you land on a webpage. These pop-ups will often encourage people to sign up for a newsletter to receive a discount or remind them that there are items in their basket. Search engines hate old-school pop-ups (those that open in a new window) & will very often ban them & prevent them from appearing on the user's screen. The newer types of pop-ups that appear within a window don't tend to affect SEO performance, but they should be used sparingly as they can annoy users & discourage them from coming back to the site.
Also known simply as a PBN, a private blog network refers to a collection of websites designed to build links to a single site to manipulate the search engine rankings. As this involves several sites linking to each other or to one central site, PBNs are similar to link wheels & link pyramids. The sites used are typically built using expired domains that have some previous history. These expired domains are usually re-registered & have some content added featuring links to the target site. You may be able to identify PBNs due to a few different factors. For instance, the sites might all have the same IP, & similar site designs & themes. The site ownership might also be the same, & duplicate content may be present. PBNs are not a great way to build links & authority, & ultimately, you should steer clear. When it comes to link building & gaining greater authority on the web, best practices are always the favoured option!
A type of lead that has been deemed ready to be contacted by the sales team. Typically, a lead will come through a website & get in touch. If their enquiry matches the business's criteria, or the problems they are messaging about can be solved by the products/services sold, they'll become 'qualified'. A qualified lead will then often be approached by a member of the sales team who will take time to answer specific questions based on their enquiry, & provide one-on-one time. They do this because they know this qualified lead is interested & more likely to purchase the product than most.
The Google quality guidelines are written guidelines for webmasters & SEOs detailing which tactics are forbidden or discouraged. The quality guidelines highlight the actions Google deems to be malicious or an attempt to manipulate search results. In effect, the Google quality guidelines define what can be considered 'black hat' or 'white hat' SEO.
Black hat SEO & spammy tactics, for example, are actions that go against the quality guidelines. Meanwhile, those SEOs who abide by the guidelines can be considered white hat. The Google quality guidelines have changed a lot over the years & it's important to stay up to date - failing to do so could even lead to a manual penalty & a huge step back in your SEO efforts.
A Google Quality Update is an update that is undertaken by the search engine every so often with one goal - to demote poor quality content. These updates can have a serious impact on your website if your SEO isn't good enough, & your content is old, uninformed, & not following best practices.
However, if your website has recently been injected with fresh, optimised, keyword-friendly content, you may find yourself benefiting greatly from these updates, as Google rewards you with higher rankings. What does 'quality' look like to Google? Well, increasingly, 'quality' work is determined by how beneficial the content is for the reader. Google is putting the user at the heart of its algorithm, & as a result, so should your content. Yes, SEO should still absolutely feature, but so should well-written content that is genuinely informative, brings value, & is in-depth. By showcasing your knowledge you're helping the reader & thus, impressing Google.
The term query often refers to a search query, which is simply whichever phrase you've entered into the Google search engine. Google will provide a list of results it deems relevant to your search query - although this seems like a simple process, there is a lot of work that goes on behind the scenes. For example, your search query will contain a keyword or keyword phrase which certain websites will be optimised to rank for, & this is how Google chooses the most relevant results.
This further proves the importance of effective keyword optimisation, as it's crucial to ranking highly for certain keywords. This also creates a competitive environment, as websites are continuously competing with one another to achieve the best ranking. Whilst this competition is happening, Google benefits the most, as it's streamlining its results & providing the most relevant websites possible to its users.
Standing for return on investment, ROI is the term used to describe the amount of money a company receives from their initial investment. In SEO terms, this could relate to the amount of money spent on a new website, a PPC ad campaign, or the investment made to hire SEO content marketing experts. ROI is relative to every business & each company will have different definitions of good & bad return on investment. For instance, a new company may consider a good ROI to be to break even. Whereas a larger company may see that as a terrible ROI because their investment was much higher.
Marketing a product can be expensive & can be done through different channels & methods. Return on marketing investment (ROMI) is a metric used to measure the effectiveness of a marketing campaign. As such, it analyses the results of the campaign in relation to a specific marketing objective. ROMI is similar to return on investment (ROI) but is more specific as it pertains specifically to marketing. For ROMI to be effective, marketers should set measurable metrics for the campaign. Simply, ROMI is measured by calculating the total revenue generated against marketing investment, & it should only reflect the direct impact of a marketing campaign. In the context of SEO, ROMI calculates the return for your SEO campaigns - if the organic revenue generated by your SEO campaigns is higher than the cost to run them, then you will have a positive ROMI.
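As a rough sketch of the arithmetic (the figures are hypothetical), ROMI is commonly expressed as the revenue a campaign generates, minus its cost, divided by its cost:

```python
def romi(attributable_revenue, marketing_cost):
    """Return on marketing investment, as a ratio.

    A common formulation: (revenue generated by the campaign
    minus its cost) divided by its cost.
    """
    return (attributable_revenue - marketing_cost) / marketing_cost

# Hypothetical SEO campaign: £2,000 spent, £5,000 of organic revenue attributed
print(romi(5000, 2000))  # 1.5, i.e. £1.50 returned for every £1 spent
```

A ROMI above zero means the campaign more than paid for itself; below zero, it ran at a loss.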
A live feed of updates relating to a specific source. An RSS feed can be set up to provide results on a specific topic or news source & the feed will deliver updates on new pieces of news & articles which are published. Rather than being a list of new content, an RSS feed is used to notify users that this new content is now available.
Typically, they're used to notify users when blogs or podcasts have been published & in their most basic form will be text-only. RSS feeds with images & videos are available, but it's recognised as a text-only feature. RSS feeds often appear as a widget option to insert on web pages & blogs.
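For illustration, a bare-bones RSS 2.0 feed looks something like this (the blog name, URLs & date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <link>https://www.example.com/blog</link>
    <description>Updates from the Example blog</description>
    <item>
      <title>New article published</title>
      <link>https://www.example.com/blog/new-article</link>
      <pubDate>Mon, 06 Jan 2020 09:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>
```

Feed readers poll this file & surface each new `<item>` to subscribers as it appears.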
'Ranking' is the broad term used to describe the position of different web pages on search engines. The higher the ranking, the more likely you are to receive more web traffic, and, in theory, more sales because users don't want to scroll down too far, preferring to just click on the top result.
All businesses that feature on search engines are aiming for a high ranking. Getting on page one of Google for a search term is a good goal to start with; from there, efforts should be made to climb as high as possible on the first page. Getting on page one is so fundamental because, quite frankly, almost no one scrolls to page two - it is that stark.
Rankings can be improved by regularly updating & adding fresh content to pages, ensuring site speed & performance are good, & not over-optimising. Regular monitoring is key to ensuring rankings remain stable.
A ranking signal, or ranking factor, is the term for any one thing that is believed to contribute to how Google's complex series of search algorithms will analyse & rank your website, determining its organic search results & rankings. While Google traditionally keeps its cards close to its chest, it has for years claimed that its algorithms rely on hundreds of unique ranking factors to help provide its users with the highest quality & most relevant search results.
Reciprocal linking is a term that refers to when two web pages link to each other. For example, if LJ Digital had a link to the Manchester Evening News (M.E.N) homepage, it would be reciprocal linking if the M.E.N had a link on their page that linked back to LJ Digital's homepage. Reciprocal links are usually pre-agreed between two webmasters because the arrangement is seen as mutually beneficial, with both websites achieving a boost in authority & helping each other up the rankings ladder.
However, reciprocal links are quite controversial & many SEO experts see them as a bit of a scheme. This notion is held because there used to be instances of webmasters getting together & monopolising a keyword by reciprocally linking to each other's sites. This is less common now that Google is far more intelligent. As long as your reciprocal links occur naturally, there is no need to worry about them negatively affecting your rankings.
These are keywords that will overwrite a master keyword if relevant for a specific region. This is used in location-based searches when somebody searches for something, often a business, in a specific area. Once they enter the name of a place, if a website has been optimised for that keyword, Google will view that as the most relevant & offer it as a result because Google knows you're looking for a business in that specific area.
Regions cover a larger area than that of local-based searches & have the potential to achieve much higher traffic by doing so. This is done by using a series of geographically relevant keywords to ensure you rank when these terms are searched. If you search using the term "near me", Google can use your location to bring up relevant results of places near you as a result of these regional keywords being used.
When looking to improve your SEO, it's important to consider relevant queries - in other words, the queries that your target users will be using on search engines. One of the ways to uncover queries to target is by carrying out keyword research - this will help you to identify how popular these queries are, & how difficult it is to rank for them. Typically, there are considered to be three different types of search queries - navigational, informational & transactional. Navigational refers to a search query that's used with the purpose of finding a particular website, such as Facebook. Informational queries cover broad topics, where thousands of search results could be deemed relevant. As for transactional queries, these show an intent to complete a transaction, which is usually a purchase. For example, searching for a specific product would be considered a transactional query. Within these three groups are then queries that would be relevant to your target audience.
Response codes - also known as HTTP response status codes, or simply status codes - indicate to users whether a specific HTTP request has been successful. A response code is a three-digit code indicating the server's response to the request. For example, a 301 response code indicates that the page has moved permanently & the user will be redirected, while a 403 response code means that the user is not authorised to visit that web page. There are a lot of different response codes.
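Since the registered codes & their reason phrases are well standardised, they can be illustrated with Python's standard library alone:

```python
from http import HTTPStatus

# The standard library ships the registered codes & their reason phrases
for code in (200, 301, 403, 404):
    print(code, HTTPStatus(code).phrase)
# 200 OK
# 301 Moved Permanently
# 403 Forbidden
# 404 Not Found

# HTTPStatus values are ints, so a code's class can be checked by range:
# anything in the 3xx range is a redirection
assert 300 <= HTTPStatus.MOVED_PERMANENTLY < 400
```

The first digit gives the class of response: 2xx success, 3xx redirection, 4xx client error, 5xx server error.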
This term refers to the idea that web design & development should respond to the user's behaviour, interactions, & the environment they are in (i.e. desktop, mobile, & tablet). A key element of responsive web design (RWD) is the notion that elements of the page reshuffle & change orientation based on the device that is being used. RWD is very important in the modern world, where people view content on so many different devices, because it's important to offer the most optimised experience for the user. If you offer a poor experience that can't be viewed properly on both a phone & a laptop, people will become frustrated & leave your site. If you're an e-commerce company, this could be the difference between making a sale & not.
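As a small sketch of the idea (the class name & breakpoint are chosen arbitrarily), the CSS below lets a two-column layout collapse into a single column on narrow viewports such as phones:

```css
/* Two columns on wide screens... */
.content {
  display: grid;
  grid-template-columns: 2fr 1fr;
}

/* ...reshuffling into a single column on narrow viewports */
@media (max-width: 600px) {
  .content {
    grid-template-columns: 1fr;
  }
}
```

Media queries like this are one of the core building blocks of RWD, alongside fluid grids & flexible images.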
A retargeting campaign is a process by which a company will carry out certain advertising campaigns to target people who have recently left a website without purchasing anything. These campaigns can be specific to a certain product category or the site as a whole. Retargeting can take many forms such as email or social media advertisements. Sometimes they will include an offer to financially incentivise them to come back, other times it will just be more of a reminder.
Rich Snippets are more in-depth snippets of content that Google displays on the search engine results page (SERP). Rich, in this context, refers to the amount, & type, of information that is available in the snippet - this could involve pictures, reviews, reading time, & even nutritional information & pricing if it is relevant to the search term & web page.
Rich snippets can increase the chances of people clicking through to your content because there is more immediate information available to them, which can increase confidence that this site will have the information they need.
To benefit from Rich Snippets you must add something called structured data to your website. This is code, written in a specific format, that can be understood by search engines. Once read, search engines can use this to create Rich Snippets. Using plugins on your website, & reading up on the importance of structured data, can help ensure that your website will, over time, be featured as a Rich Snippet.
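As an illustration of what structured data looks like, the fragment below uses the common JSON-LD format with schema.org vocabulary to describe a recipe page; the recipe name, rating & timing values are entirely made up:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Example Sushi Rolls",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.7",
    "reviewCount": "132"
  },
  "totalTime": "PT45M"
}
</script>
```

Placed in the page's HTML, markup like this is what allows a search engine to show review stars or preparation time directly in the snippet.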
Robots.txt, or the Robots Exclusion Protocol, is a text file used by a website to communicate with web robots such as search engine spiders (or crawlers). Robots.txt is accessible at the root of the website & communicates important information to crawlers. For example, using robots.txt allows SEOs to tell search bots how to process each page of the website. You can set certain pages to be ignored by the crawler, ensuring that only the most useful & important content is crawled & indexed.
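A minimal robots.txt might look like the following (the disallowed paths & sitemap URL are hypothetical examples):

```
User-agent: *
Disallow: /admin/
Disallow: /checkout/

Sitemap: https://www.example.com/sitemap.xml
```

Here every crawler is asked to skip the admin & checkout areas, & is pointed at the site map; note that compliant bots honour these rules voluntarily, so robots.txt is not a security mechanism.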
The SERP, or search engine results page, is the page displayed by Google or another search engine after a user enters a search. The SERP will usually display around ten organic search results, ranked by order of relevance & quality, showing the URL, page title & a short description of each result. Your ranking on the SERP will determine your visibility to users & greatly affect the amount of organic traffic coming onto your website.
Raising your SERP ranking & visibility is one of the first things any SEO will aim to improve & is a key goal for most SEO campaigns. The SERP is the result of Google matching the intent of the user's search against its index of web pages & websites in an attempt to deliver the most relevant & useful content. Depending on the kind of search made, the SERP may also include other features, such as featured snippets, shopping results, map results, & image results.
Search engine optimisation is closely tied to improving your ranking on SERPs for the queries & search terms most important to your business.
A sales funnel refers to the journey that potential customers go through on the way to making a purchase. There are several steps within a sales funnel, & these are usually the following - awareness (when people first become aware of brand/product/service), interest (does it solve their problem, competitor analysis), decision (they might dig deeper into prices & packages), action (making a purchase). However, the stages might be different depending on the company's sales model. A sales funnel allows you to understand what potential customers are doing & thinking during each stage of their purchasing journey. With these insights, you can decide on the best way to invest in your marketing channels & activities to create relevant messaging at each stage, which will then ultimately turn potential customers into paying customers.
Sandbox is a filter used by Google that is suspected of preventing new websites from ranking highly in the search engine results pages (SERPs). As Google ultimately aims to prioritise good quality, up-to-date content, it's believed that the Sandbox filter helps the search engine to filter out new or 'flash-in-the-pan' websites from those that are more comprehensive & better managed. In essence, Google's aim is to be a reliable & useful search engine that people can use to quickly find what they're looking for. Because of this, relevance is of the utmost importance to the success of Google as the go-to search engine, making filters such as Sandbox highly desirable. Sandbox helps Google separate the wheat from the chaff when deciding which websites should appear highest in the SERPs. In reality, few people know for certain whether the Google Sandbox actually exists, but the filter is suspected of having been added to Google's algorithms sometime around March 2004.
A satellite domain or satellite website is a site that has been set up by a business or webmaster with the sole purpose of boosting the authority & presence of the main domain or website. For instance, if you owned a website that sold products in a competitive market, or one where a customer's own research was a key part of the sales funnel, you may well create a separate website filled with relevant content & topics which could then link back to the main domain. In theory, by doing this, you're strengthening the main domain through the constant links from another site, which, in the eyes of Google, is proof of an authoritative website that's worthy of high rankings. However, while satellite domains were a fruitful tactic a few years back, they are now considered more of a black hat SEO tactic than a grey one. Like any SEO practice, the ones that are genuine, honest, & take time to complete will win out in the long term.
Also commonly known as 'rich snippets', schema markup is a kind of microdata that, when added to a web page, creates an enhanced description of the page. Added to the HTML of a web page, the schema markup improves the way search engines read your page & allows them to include the rich snippets in the search results.
Now, while schema markup doesn't necessarily directly impact your organic search rankings, it's still a great thing to implement as part of your SEO campaign. This is because it gives you more space on the SERP, improving what we like to call search engine real estate. It's also known to improve the click-through rate from your organic rankings.
Google Search Console (GSC), formerly known as Google Webmaster Tools, is a free service provided by Google for website owners & webmasters. It is a series of free tools & resources for use in website optimisation, performance tracking & more. These tools are invaluable to any SEO, & verifying your website with GSC is generally considered an SEO campaign best practice.
Once you have access to Google Search Console, you'll find a wide range of resources. Through the Search Console, you can submit sitemaps, check for manual penalties on your site, access crawl reports to check the indexing of your web pages, monitor your site speed, & more. Google Search Console also provides valuable performance tracking information, such as your number of impressions in the search results, your ranking position for certain keywords & search terms, & the number of clicks you are receiving.
The search engine results, or simply search results or search engine rankings, are those pages, ads & links that the search engine delivers to a user based on their query. Appearing on the SERP (search engine results page), the search results are a ranked list based on relevancy & quality, matched to suit the needs of your search query. But the search results will also include relevant ads too. Essentially, Google's number one goal is to deliver users with the best, most useful & relevant content possible - the search results are Google's attempt to do this.
When you search for a topic on a web browser, it will document everything you've searched for & give you a list of your search terms & visited websites. Essentially it leaves a breadcrumb trail for you to retrace your steps, visit previous websites & keep track of your movements online. Google also reads this & uses it to understand your interests, habits & trends as a user to present you with relevant content. It can also tell whether you're using a browser on a desktop or an app on a mobile device & even keep track of the ads you click on.
While you can delete your search history locally from your computer, the data will remain on Google's servers. There are options for users who want to browse the web without their search history readily available to be viewed such as Incognito mode. However, this won't hide you from Google & your online movements can still be tracked.
Search robots are automated tools used by search engines, such as Google, Bing, & Yahoo!, to build their databases. These robots, also known simply as bots, wanderers, crawlers, & spiders, systematically crawl the web to discover new websites, as well as updates to existing ones, & create a record of the digital spaces they've crawled. They do this by following a series of links, scoping out connections between web pages & processing the data, such as content, sitemaps, links, & HTML code, to create an up-to-date index. As search robots are automated, they process the data they crawl far faster & more accurately than a human could. Their main use is by search engines, which use them to scan the content of the web; all content may be viewed by search bots unless the site's robots.txt file denies them access.
This is the number of times a keyword is searched for by users on a search engine. This can indicate the popularity of a keyword & help users determine which keywords to optimise for. If a keyword has a very low search volume, it is likely not worth ranking for. If a keyword has an extremely high search volume, you will want to rank for it, but the competition will be very high. Opting for keywords that have a good search volume but not too much competition is the best approach.
Large search volume keywords are often very broad in comparison to low search volume keywords which are likely more focused on a specific topic. It's often easier to rank for low search volume keywords but there is a chance it will be less relevant to your website due to the specificity.
Refers to the changes in search trends throughout the year due to the season. For example, there may be a rise in searches for chocolate around the last couple of months of the year as people search for Christmas presents. This also allows websites to focus on these key search terms at specific times of the year & drive conversions.
Targeting these keywords at popular times allows businesses to take advantage of these search trends & attempt to climb the SERP rankings as a result. At other times of the year, these specific keywords may be less popular & ranking for them at these times may prove fruitless. This is why it's important to understand the market & track changes in trends. Certain keywords will reach peak seasonal popularity, which you can find out through various tools such as Google Trends. This allows businesses to do research ahead of time & ride the trends at the right time.
A single word that acts as the starting point for a string of keywords. Seed keywords are short & don't feature any modifiers. Some long-tail keywords may include seed keywords alongside other modifiers. For example, if the seed keyword was 'boots', the long-tail keyword may be 'children's blue football boots'. This is just one variation of the keywords as they are the starting point & can be grown into something longer, hence the name.
Seed keywords can be very useful as they can help users understand the relevance of certain topics. They can also be used to generate longer keyword phrases that are associated. When used alone these seed keywords can be too broad to achieve any real results but when used in a longer keyword phrase, they hold much more value.
A semantic core is a cluster of keywords & phrases that encapsulate the types of goods or services that your business sells. It aims to cover the broad scope of phrases that a potential user may input into a search engine to find the answer to something that they need. Determining your semantic core, then, is a key aspect of a corporate marketing strategy, especially when developing your website's on-page SEO. This cluster of keywords should be used throughout your website so that your website's authority is optimised within this core field you have specified. An effective semantic core for your company should, therefore, address the search queries your target audience is likely to be asking. When reviewing your website's semantic core, it should accurately describe what your business does, so that the users it funnels through will be appropriate & interested in what your business has to offer.
A site map is pretty much what it sounds like - it's a map of all the pages on your website. It's important during SEO because it quickly & easily informs Google of your site's structure & content, making it quicker to crawl & index & improving navigability. There are typically two different types of site map: an XML sitemap, written for search engine crawlers, & an HTML sitemap, a navigable page written for human visitors.
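As a rough sketch, an XML sitemap (the machine-readable kind submitted to search engines, following the sitemaps.org protocol) can be generated with Python's standard library. The URLs below are placeholders.

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap per the sitemaps.org protocol."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page  # the page's full URL
    return ET.tostring(urlset, encoding="unicode")

# Placeholder URLs for illustration
print(build_sitemap(["https://example.com/", "https://example.com/about"]))
```

A real sitemap would usually also carry optional fields such as last-modified dates, & would be referenced from the site's robots.txt or submitted via Search Console.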
Site structure refers to how you structure your website & helps search engines understand which elements of your site are the most important. A solid, well-thought-out site structure is so, so important to the success of your website. By structuring things properly - ensuring you build a pyramid of pages that begins with the homepage, branches out to service pages, & then links down to smaller topic pages - you make it as easy as possible for search engines to figure out what your website's purpose is.
Other things that come under the umbrella of site structure are things such as categories, how blog content is parsed, taxonomies, internal links, navigation, & breadcrumbs. As your site gets bigger, develops, & has more content added to it, it's important to keep everything organised as this will make it easier for everyone to navigate around your website.
The simple definition of social media is that it is websites & programs online that facilitate the creation & sharing of content & online media by individuals. The most widely used & popular examples of social media include Facebook, TikTok, Twitter, YouTube, Instagram, LinkedIn & more. Social media websites allow users to socialise while creating & sharing their content.
Social media marketing is a form of digital marketing that is often performed alongside search engine optimisation. From paid advertising to organic posting, social media marketing is a powerful tool with the ability to tap into a massive & active audience.
Over the years, social media has become more important to SEO, with links from many social media websites now appearing in searches. Securing links within social media sites & encouraging web traffic from social media accounts onto a website are becoming increasingly important SEO tactics.
In the world of search, social signals refer to the likes, shares, comments, & interactions that businesses attract across their social media channels, which search engines take into account when calculating visibility. Building a relevant, cohesive social media strategy is no longer just 'a nice thing to have' but something that can directly impact your SEO. These days search engines want to rank businesses that are real & active - & what more active signal that your business is open & ready to provide products or solutions is there than a regularly updated set of social media channels? While not yet as valuable a currency as backlinks, social signals, in particular those that involve the sharing of content & web pages, are certainly an important factor to consider. Their importance is further emphasised by the fact that Google & Twitter have partnered to display tweets from businesses that are relevant to the search term. This kind of feature is very useful for businesses operating in fast-paced industries that need to leverage new industry news & topics.
This is a term most people will know from right-clicking a web page with a mouse. On the menu that pops up will be an option to 'view source'. Upon selecting this option, you will be given a view of the code used to render the page in your browser - the 'source code'. In this view, you will see various codes & tags, from <head> to <i> to <p> & many, many more.
See response codes. 'Status codes' is another way to refer to the same thing. HTTP status response codes are the three-digit codes that indicate whether a request made to a web server has been successful.
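For reference, the standard reason phrases behind these three-digit codes are available in Python's standard library, which makes a quick lookup easy:

```python
from http import HTTPStatus

# Look up the standard reason phrase for a few common status codes
for code in (200, 301, 404, 500):
    status = HTTPStatus(code)
    print(code, status.phrase)  # e.g. 301 Moved Permanently
```

This is the same mapping you see in practice: 2xx codes mean success, 3xx mean redirection (such as the 301 permanent redirect), 4xx mean a client error, & 5xx mean a server error.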
Otherwise known as schema, structured data markup can be added to a website's HTML to add more contextual information about a web page's content. This helps the search bots to identify the important & contextual elements, allowing them to more accurately crawl, index & rank a website's content.
Structured data is a form of microdata (a specialised kind of website code) that creates the enhanced descriptions known as rich snippets, allowing them to appear in the search results. These can take the form of FAQs, star ratings, maps, & more.
Structured data markup was standardised in 2011, when Google, Bing, Yahoo & Yandex came together behind Schema.org, agreeing on a common form of structured data that would be supported & displayed in the SERPs of each of these search engines.
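A minimal sketch of what structured data looks like in practice: the JSON-LD below uses the Schema.org vocabulary, with placeholder product & rating values, & would be embedded in a page's HTML inside a script tag.

```python
import json

# A minimal JSON-LD snippet using the Schema.org vocabulary;
# the product name and rating values are placeholders.
snippet = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "120",
    },
}

# JSON-LD is embedded in the page inside a script tag of this type
markup = '<script type="application/ld+json">%s</script>' % json.dumps(snippet)
print(markup)
```

Markup like this is what allows star ratings & similar rich snippets to appear alongside a result in the SERPs.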
Shortened, mercifully, to TF-IDF, this term is not exclusive to SEO but it is a phrase that's becoming increasingly important as search engine algorithms begin to understand the wider context of pieces of content. TF-IDF stands for term frequency-inverse document frequency. It's a statistical technique that search engines - Google, Yahoo, Bing et al - can use to measure the importance of a term, word, phrase, or keyword within a blog, web page, or site. From an SEO perspective, TF-IDF helps to go beyond ranking just keywords & looks at the relevant content that surrounds them. In essence, it rewards webmasters who don't keyword stuff and, instead, create algorithmically wonderful copy that includes keywords & relevant information. The formula works like so: TF = (number of times the term appears in a document) / (total number of terms in the document). IDF = log_e (total number of documents / number of documents containing the term). Once you have those numbers you multiply TF by IDF. The end figure will give you a good idea of how heavily you're using a particular phrase compared to your competitors & anyone else ranked for that term.
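The formula above can be sketched directly in Python; the toy corpus of tokenised 'documents' is purely illustrative.

```python
import math

def tf_idf(term, document, corpus):
    """Score a term as TF * IDF, using the standard formulas:
    TF  = occurrences of the term / total terms in the document
    IDF = log_e(total documents / documents containing the term)
    """
    tf = document.count(term) / len(document)
    docs_with_term = sum(1 for doc in corpus if term in doc)
    idf = math.log(len(corpus) / docs_with_term)
    return tf * idf

# A toy corpus of tokenised 'documents'
corpus = [
    ["sushi", "restaurant", "manchester"],
    ["sushi", "menu"],
    ["football", "boots"],
]
print(tf_idf("sushi", corpus[0], corpus))
```

Note how the IDF part dampens the score of terms that appear in every document, which is exactly why stuffing a common keyword everywhere doesn't raise its weight.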
A website or web page's target keywords are the words & phrases that it is intended to rank for. This is a central part of SEO & digital marketing. Any SEO campaign will identify the target keywords each web page should rank for, & on-page SEO will be performed to optimise a page to specifically rank for these target keywords.
A title tag specifies a web page's title, & is an HTML element. These tags are displayed on search engine results pages - they are the clickable title that you see for each result. The title tag should always be accurate & concise, summarising the purpose of the page & its content. Title tags are important for several reasons, including for sharing on social media networks & for SEO. As such, there are certain best-practice considerations when writing a title tag.
For example, you should give every webpage a unique title & avoid keyword stuffing. In other words, don't add lots of keywords into your title tag for the sake of it. This can create a bad user experience, which Google will recognise & penalise you for in the SERPs. There is no hard character limit on a title tag, but keep in mind that titles are truncated by pixel width, so how much of a title is displayed depends on the device & Google's display limits.
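As a rough sketch of that best practice, a simple Python audit can flag duplicate or overly long titles across a site. The 60-character threshold here is a common rule of thumb rather than an official limit, since Google truncates by pixel width.

```python
def audit_titles(titles, max_length=60):
    """Flag duplicate or overly long title tags.
    max_length is a rule of thumb, not a hard Google limit."""
    issues = []
    seen = set()
    for title in titles:
        if title in seen:
            issues.append(("duplicate", title))
        seen.add(title)
        if len(title) > max_length:
            issues.append(("too long", title))
    return issues

# Hypothetical titles - the second one would be flagged as a duplicate
print(audit_titles([
    "Sushi Restaurant in Manchester | Example Co",
    "Sushi Restaurant in Manchester | Example Co",
]))
```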
Simply, web traffic is the number of people that visit your website over a given period. Web traffic is measured in visits, which are sometimes called 'sessions'. Traffic is one of the most popular metrics used by businesses because it is such a clear way of displaying how popular your site is, & how effective your wider marketing method is at attracting audiences. When SEO analytics first came about, traffic was seen as the most important metric. However, it's now much more important to measure traffic alongside other metrics such as click-through rates & bounce rates. This will give you a much more rounded picture of your site's performance.
These are searches made with a clear & direct intent to buy a product or complete a transaction. The user will use the search engine to find their desired product & complete the transaction, likely with one of the top results. Businesses can anticipate this behaviour & compete for the top rankings on these particular queries.
The clear indicator that a query is transactional is if certain keywords are used such as 'buy' or 'order'. The search for specific products or brand names also signals the user's intent to buy. By picking up on these keywords, the search engine is able to offer relevant results & help aid the user in their search to buy.
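A naive Python sketch of that keyword-spotting idea is below; the marker list is an illustrative assumption, & real search engines use far more sophisticated intent models.

```python
# Illustrative markers of transactional intent - not an official list
TRANSACTIONAL_MARKERS = {"buy", "order", "purchase", "price", "deal"}

def looks_transactional(query):
    """Crudely classify a query as transactional if it contains a marker word."""
    return any(word in TRANSACTIONAL_MARKERS for word in query.lower().split())

print(looks_transactional("buy blue football boots"))    # True
print(looks_transactional("history of football boots"))  # False
```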
A Twitter Card is a couple of lines of code that let anyone who tweets a link to your content display a rich 'card' in their followers' feeds. It's a handy thing to have because it allows you to go beyond the 280-character limit & create content-rich tweets that stand out as people are aimlessly scrolling through their feeds. Twitter Cards let users view an image, watch a video, download an app, or even visit a landing page. It's easy to see how Twitter Cards allow you to create social media content with real intent & encourage people to convert - be that watching a video, buying a product, or signing up for something. And the best thing? The user NEVER has to leave Twitter to experience these, which, as everyone knows, is great for people who don't like tapping, scrolling, or swiping any more than they need to. The benefits of Twitter Cards are apparent - you can make sure you have a consistent look across all your platforms, as well as attribution that could drive more traffic to your site. You can also create custom titles & descriptions for your photo & URL. Oh! & you create an awesome mobile experience for people!
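As a sketch, the handful of meta tags that make up a basic 'summary' Twitter Card can be generated like so. The tag names (twitter:card, twitter:title, twitter:description, twitter:image) are the documented ones; the values are placeholders.

```python
def twitter_card_tags(title, description, image_url):
    """Render the meta tags for a basic 'summary' Twitter Card."""
    tags = {
        "twitter:card": "summary",
        "twitter:title": title,
        "twitter:description": description,
        "twitter:image": image_url,
    }
    return "\n".join(
        '<meta name="%s" content="%s">' % (name, value)
        for name, value in tags.items()
    )

# Placeholder values - these tags would go in the page's <head>
print(twitter_card_tags(
    "Example post", "A short summary.", "https://example.com/img.png"
))
```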
URL stands for uniform resource locator. Found at the top of your browser when viewing a web page, the URL is the identifying string of characters that leads your browser to access & display any given web page. The URL for the LJ Digital page, for example, is: https://l33roy.com/seo-manchester
Your URL is important because it shows the web page's location on a network or domain & gives both users & search engines important information about the nature & content of the web page. Your URLs should also be optimised & well thought out - they're not a throwaway detail to be overlooked during search engine optimisation.
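Python's standard library can split a URL into its component parts, which is handy when auditing URLs in bulk; the example below reuses the LJ Digital URL from above.

```python
from urllib.parse import urlparse

# Break a URL into its components using the standard library
parts = urlparse("https://l33roy.com/seo-manchester?ref=glossary")
print(parts.scheme)  # https
print(parts.netloc)  # l33roy.com
print(parts.path)    # /seo-manchester
print(parts.query)   # ref=glossary
```

The query string here (ref=glossary) is a made-up example of a tracking parameter, not part of the real page's address.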
An artificial link generated to manipulate a page's ranking. Unnatural links are generally a thinly veiled attempt by scrapers & spammers to piggyback off your website's value, or to attach your website to a 'bad' part of the internet to harm its ranking. Google can identify these as not being editorially placed, since irrelevant anchor text has likely been used & the topic of the linked page is also irrelevant.
If a website is flagged for unnatural links, its ranking can drop dramatically, & should a warning be issued its owner is required to send a reconsideration request to Google. Google made the decision to penalise unnatural links to prevent websites from climbing the rankings through link spam, & to stop irrelevant sources being recommended to users as a result.
User Experience (UX) is similar to User Journey but is a broader term, focusing on all aspects of a customer's relationship with your website, & your products & services. While UX does refer to the ease with which people can head to your website & buy your products & services, it is broader because it also covers the experience a customer has after they have bought the product or service - whether that's a physical product or, for a digital product, how easy the platform is to use.
Memorable UX is about going beyond what the customer thinks they want from a product or buying process & offering a seamless experience that keeps them coming back to your website or product. A good & often cited example of UX is on e-commerce sites. If an e-commerce website, app, or social media ad is well thought out then users will naturally gravitate back to that site to purchase more products because they don't have to spend time with a website that offers poor UX.
User Journey is a term that refers to the experience a person has when they visit your website. A typical user journey will more often than not start with a Google search, or clicking an ad, this will then take the user to the homepage, or a specific service page. From there, the journey should be about educating them, & making them respond to the call to action that's on the page - that could be calling, emailing, or inputting information.
A well-thought-out user journey makes it easier for the user to complete the goal & lowers the risk of them getting frustrated, & leaving the page to go to another website. Once a journey has been decided on, it's important to analyse the data to fine-tune it & make it as optimal as possible. Analysing areas where people leave your site, unnecessary interactions & time spent on the site are just three things that need to be looked at to refine the user journey.
The term vertical search relates to specific, smaller types of search engine that index only content relevant to a particular niche. While general-purpose search engines attempt to index all content on the internet, a vertical search engine just indexes what is relevant to its field. A good example of a vertical search engine is Yell, which only collates local business listings & crowd-sourced reviews of restaurants, attractions, shops, & other areas of note. Vertical search engines are beneficial in many ways. Firstly, because the scope is narrowed, these sites can be incredibly precise. Secondly, by being so precise they can include taxonomy & ontology that increases domain authority in the eyes of search engines. For for-profit organisations, attempting to become a vertical search engine is a great, long-term way of becoming an authority in your industry. For instance, if you worked in the fashion industry, your aim would be to create reams & reams of content that answers as many search queries as possible in as much detail as possible. By being so comprehensive on a topic, & touching on every possible topic relevant to what your target audience is searching for, the theory goes that you force Google & other search engines to put you at the top of the rankings because you've out-contented everyone.
Voice search is a form of user search that combines the latest in speech recognition technology with regular search engine queries, allowing users to verbally speak questions rather than needing to sit down at a keyboard & type them out. Voice search is becoming increasingly popular, and, as such, it is becoming more of a priority of SEO campaigns.
The given software will interpret the speech & translate it into search queries, which it then submits to one or more search engines to deliver relevant answers to the user. With tools such as Apple's Siri & Amazon's Echo being specifically designed to utilise voice search, it is a trend that looks set to continue growing & expanding in the future.
SEOs are beginning to adapt, with voice search optimisation becoming a more central skillset. Including short & concise answers in your content, optimising for featured snippets, & even creating specific voice search FAQ pages are great ways to optimise your site for voice search results.
Pronounced 'who is', WHOIS is a query-&-response protocol used to interrogate databases that store data such as IP addresses & domain names. The WHOIS database, which was drafted by the Internet Society, is displayed in a way that humans can read. It is an extremely useful tool that allows people to get in touch with those who own a domain. WHOIS took a bit of a negative turn a few years ago when GDPR was introduced. Essentially, because the WHOIS protocol publishes the names & addresses of those who own a given internet domain, it ran up against GDPR, as it did not ask for the express consent of these people before that data was shared.
Otherwise known as the Google SEO guidelines, or simply the Google guidelines, the webmaster guidelines are the guidelines set out by Google to provide website owners & webmasters with the information & guidance they need to optimise their website for search engine crawling & indexing. Alongside this, the webmaster guidelines also define what the search engine considers spam & the penalties that should be expected when violating the guidelines.
Because of this, the Google webmaster guidelines are often used to draw the line between what can be considered black hat SEO techniques & white hat SEO. The webmaster guidelines are often split between the quality guidelines, which lay out spammy tactics & what is considered good-quality SEO, & the general guidelines, which define the actions that can be taken to improve a website's indexability & crawlability.
Essentially, website navigation is just how a user moves throughout a website & travels from web page to web page. Navigation is often overlooked but is an incredibly important part of user experience. If navigation is unclear or confusing, you will lose the traffic that could have potentially engaged with your website & ultimately converted.
Your website navigation & structure have a big impact on conversions, bounce rate, sales, & more. If a user cannot find the content they are looking for, they will leave. A clear, strong & hierarchical website navigation structure guides visitors - especially vital for e-commerce websites with a clear sales funnel.
Essentially, white hat SEO is the opposite of 'black hat SEO', which refers to negative SEO practices that sabotage & cause harm to websites, & often to hackers. White hat SEO practises, by contrast, fall in line with Google's terms & conditions on how to correctly optimise a site.
White hat SEO practises are hugely beneficial to websites and, when carried out in the most effective way, can improve the ranking of a website on Google's SERP. Most SEO practices you come across will be white hat as they're done to improve a website. If your website uses techniques that fall outside white hat SEO, Google will flag them as black hat practices & penalise your site. These penalties are likely to include de-indexing & reduced rankings.
WordPress is an open-source content management system (CMS) that is used by millions of people across the world - it is one of the most popular CMS out there thanks to its user-friendly interface & ability to add plugins & add-ons. It was first published back in May of 2003 & is now home to 75 million websites. More than 409 million people view around 23.6 billion pages each month, according to WordPress themselves.
One of the many positive features of WordPress is that it can be used by beginners, who have no experience with development & web design, to create a website, & by experienced developers who are adept at coding. This is because WordPress has an incredible range of features that suit people of all skills & abilities. The open-source nature of WordPress means that it is free & can be used, adapted, & changed based on the web master's needs. Plugins can also be added, such as Yoast SEO, to track SEO performance & enhance it.
Yoast SEO is a plugin that can be installed on a WordPress website to enhance SEO performance & ensure content, & web pages, are optimised for a specific keyword. Rather than just guessing, Yoast SEO makes it very easy for your business to meet, & exceed, SEO standards that can help you in your efforts to boost keyword reach.
Yoast SEO is a very user-friendly platform that allows businesses to create titles, meta descriptions, & focus keywords so that every aspect of your content is optimised. Once installed, & once the important information has been filled in, the plugin rates each page on a score from 1-100 (with 100 being the highest score, & 1 the lowest). It bases this score on the targeted keyword & its prevalence across the headers, copy, title tags, alt-text, metadata, & any other areas of your content where keywords are important.
A good score is anything that is 80+; the score can be improved by adding content & refining the technical aspects of the content further. Yoast can also help you track the use of keywords (to ensure you're not over-optimising pages) & manage sitemaps to ensure everything is structured correctly & optimised.
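A crude keyword-density check, similar in spirit to what such tools track when watching for over-optimisation, can be sketched in Python. The example text is illustrative, & this is not Yoast's actual scoring logic.

```python
def keyword_density(keyword, text):
    """Return the keyword's share of all words in the text (0.0 to 1.0)."""
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# Illustrative copy: 2 occurrences of 'sushi' out of 8 words
copy = "sushi restaurant in manchester serving fresh sushi daily"
print(round(keyword_density("sushi", copy), 3))  # 0.25
```

A density this high would usually be a prompt to vary the copy rather than repeat the keyword further.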
© MMXXII Lee Johnson