301 Redirect
is an HTTP status code that tells the browser the page you are trying to access is no longer available at the URL you are using, and can now be found at a new address. The browser then automatically redirects to the new URL, and the desired, relocated content is displayed. Although this happens often, internet surfers rarely notice it.
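The behaviour above can be sketched as a tiny Python model. The URLs and the redirect map are invented for the example; a real client would read the new address from the `Location` header of the server's 301 response rather than from a dictionary.

```python
# Made-up example: old URLs mapped to their permanent new homes.
REDIRECTS = {
    "http://example.com/old-page": "http://example.com/new-page",
    "http://example.com/new-page": "https://example.com/new-page",
}

def resolve(url, redirects, max_hops=10):
    """Follow 301-style redirects until a final URL is reached."""
    hops = 0
    while url in redirects:
        url = redirects[url]      # in reality: the Location header of the 301
        hops += 1
        if hops > max_hops:       # guard against redirect loops
            raise RuntimeError("too many redirects")
    return url
```

Note the chain: one 301 can lead to another, and the browser keeps following until it reaches a URL that resolves directly.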
Ad Retargeting and Remarketing
refers to the advertising approach of showing ads to potential customers after their first encounter with a business's website. The website uses a small piece of code or data known as a "cookie", which assigns a unique ID and tracks a user's visit history. Subsequent ads are then selected based on that history.
Alt Attributes, Alt Text & Alt Tags
are HTML attributes used to specify a description for an image. They provide alternative information when the user is unable to view the image, and they give search engines a text alternative as well, so using these attributes properly can have a positive impact on search engine rankings.
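As an illustration of how alt text might be audited, here is a short Python sketch using the standard library's `html.parser`; the sample markup is made up for the example.

```python
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    """Collect <img> tags that lack a useful alt attribute."""

    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):               # absent or empty alt text
                self.missing.append(attrs.get("src", "?"))

# Invented sample page: one image described, one not.
audit = AltAudit()
audit.feed('<img src="logo.png" alt="Company logo"><img src="banner.jpg">')
```

After feeding the page, `audit.missing` lists the images that still need alt text.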
Google Analytics
is a free web-based platform that provides thorough statistics and research tools for search engine optimisation and marketing. Aside from tracking website visits, Analytics can track various data including user demographics, behaviour and interactions – page views, bounce rate, average time spent on a site or page, and conversion rates.
Anchor Text
is the descriptive, clickable text used to link one web page to another. It is readable by both visitors and search engines and is considered one of the three major factors that allow web pages to rank in search engines. For many years it has remained a vital part of SEO campaigns and content marketing strategies.
Bounce Rate
is the percentage of sessions in which the user left the website or page they entered through without any other interaction. A high bounce rate not only indicates that a business has lost the opportunity to convert the visitor into a customer; it can also be interpreted by a search engine as a sign that the site or page is not relevant or does not offer useful information, negatively affecting its search visibility.
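The definition above translates directly into a toy calculation. Here a "bounce" is simplified to a session with exactly one interaction; the session data is invented.

```python
def bounce_rate(sessions):
    """Percentage of sessions with a single interaction (a 'bounce')."""
    if not sessions:
        return 0.0
    bounces = sum(1 for interactions in sessions if interactions == 1)
    return 100.0 * bounces / len(sessions)

# Interactions per session: three visitors bounced, one browsed further.
rate = bounce_rate([1, 4, 1, 1])
```

With three single-interaction sessions out of four, the rate comes out at 75%.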
Branded Search
is a search using a branded keyword. The search term usually contains the name of a brand, business or company, e.g. "Business Cherub SEO Services", where 'Business Cherub' is the name of the business. You may want to rank first for your branded search, but that isn't always easy, even though it is your own brand.
refers to a small amount of temporary data stored by the browser so that a website loads more quickly the next time you visit it. Aside from allowing websites to load faster, caching also reduces bandwidth usage.
is considered the most important factor in search engine optimisation. Efficient content optimisation means creating information that is useful to both users and search engines.
Conversion Rate Optimisation
is the process SEO agencies perform to increase the conversion rate of a page or website. A conversion happens when a user takes an action that moves them further along the buying cycle, such as clicking a button on a web page to request a quote, enquiring about a product, or buying the product immediately. It is best to build websites with conversion rate in mind.
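The metric being optimised is simple arithmetic: conversions as a share of total sessions. A minimal sketch, with invented numbers:

```python
def conversion_rate(conversions, sessions):
    """Conversions as a percentage of sessions; 0.0 when there is no traffic."""
    return 0.0 if sessions == 0 else 100.0 * conversions / sessions

# e.g. 30 quote requests out of 1,500 sessions
rate = conversion_rate(30, 1500)
```

CRO work aims to raise this number by improving the page, not by buying more traffic.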
Crawler
is the name given to a program that search engines use to collect and index data across the internet. A crawler typically arrives at a website through a hyperlink, reads the site's content and embedded links, and then follows the outgoing links away from the page or site. The program continues working until it has crawled every website that links to other websites.
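The crawl loop described above can be sketched against an in-memory "web" instead of live HTTP requests; the pages and their links below are invented for the example.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

# A tiny three-page site standing in for the web.
PAGES = {
    "/": '<a href="/about">About</a> <a href="/blog">Blog</a>',
    "/about": '<a href="/">Home</a>',
    "/blog": '<a href="/about">About</a>',
}

def crawl(start, pages):
    seen, frontier = set(), [start]
    while frontier:
        url = frontier.pop()
        if url in seen or url not in pages:
            continue
        seen.add(url)                     # "index" the page
        extractor = LinkExtractor()
        extractor.feed(pages[url])        # read content and embedded links
        frontier.extend(extractor.links)  # follow the outgoing links
    return seen

crawled = crawl("/", PAGES)
```

Starting from the home page, the crawler discovers every page reachable by links, exactly as the definition describes.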
is the process of moving a website from one server to another. Whether you are doing this to rebrand or to rename your business or company, it is imperative that your onsite content be correctly redirected to its new home. Improper redirects cause 404 File Not Found errors, which hurt user experience and lead to increased bounce rates and decreased page rankings.
Event Tracking
refers to the process of monitoring the interactions happening on a website. It provides information on how visitors behave once they land on a site.
is a type of event tracking: the process of recording a user's action, either a transaction or a significant step towards a transaction.
Link Building
is an SEO process in which you obtain links from other websites that point towards your own. Links provide a clear pathway across the confusing maze of the internet, enabling search engines and internet users to find and visit your website and helping it appear on the search results page.
First Click Free
is a tool that allows Google's bots to crawl and index content hidden behind forms, mainly on subscription or registration-only websites. By allowing Google's bots access to behind-form content, these pages may appear in search engine results for related queries.
Accelerated Mobile Pages (AMP)
is an open-source project created for mobile users. The framework is designed to improve the loading speed and readability of websites on mobile devices – imagine it as an upgrade to mobile-friendly pages. According to Google, the goal behind AMP is to improve the mobile experience through near-instantaneous loading of rich content such as animations, graphics and videos.
Google Hummingbird
is a search algorithm used by Google. It was launched in August 2013 as a replacement for the Caffeine algorithm and has affected more than 90% of Google searches. Hummingbird allows the Google search engine to perform better through improved semantic search.
Google Panda
was launched in 2011. It is a search algorithm update that filters out websites with thin, low-quality content. Panda was the start of a series of major quality-control checks by Google: it removed poorly constructed and spammy content from the search engine results pages (SERPs), allowing higher-quality websites and pages to take the top positions.
Google Penguin
is an update introduced in April 2012. The algorithm was designed to reduce web spam and penalise websites not adhering to Google's Webmaster Guidelines – often those using black-hat techniques to get backlinks and manipulate search engine results. Those who abide by the guidelines – websites with high-quality links – were rewarded with top positions in the SERPs.
is a microformat standard used to create structured data, applied for the benefit of search engines rather than people. It helps Google gain a better understanding of the information on web pages. Over the years, however, it has been phased out in favour of Schema markup.
Heading Tags
are necessary both for a web page's ranking and for user experience. They change the appearance of headings and subheadings, and they give pages a hierarchy, which is very useful for search engines. H1 is the main heading, usually followed by H2, H3 and so on.
HTTPS
is an abbreviation for Hypertext Transfer Protocol Secure. It is the secure version of HTTP, whose primary function is to send data between the web browser and the website. HTTPS protects the privacy of user data, such as online banking details and online orders, in transit between a browser and a website. This is the main reason HTTPS is widely used by e-commerce websites and sites that handle sensitive material.
JavaScript
is a programming language originally developed by Netscape and commonly used in creating web pages. It is mostly used to create dynamic pages and to add special effects to pages.
Java
is a popular programming language developed by Sun Microsystems. It was first released in 1995, and many computer and web applications have since come to rely on it. Java can be used to create complete applications that run on a single computer or across servers and clients in a network, and also to build small application modules and applets for use as part of a web page.
AJAX
is not a programming language but a technique for creating fast, dynamic pages. It allows web pages to be updated asynchronously by exchanging small amounts of data with the server, so parts of a page can be updated without reloading the whole page.
Flash
is authoring software created by Macromedia. Its main use is creating vector-graphics-based animations with full-screen navigation interfaces, graphic illustrations and basic interactivity, in an anti-aliased, resizable file format small enough to stream across an ordinary modem connection.
Keyword
is a word or phrase entered by users into search engines such as Bing and Google, which return a results page listing matching websites. It is the search term that SEO professionals and website owners optimise a website for, with the aim of being included in the search engine results page.
Keyword Research
is the process of identifying popular words or search terms that people enter into search engines, as well as how many other companies or agencies are using those terms to get ranked in search engines.
Knowledge Graph
is a system created by Google for organising information about millions of notable entities – people, places and organisations – to build a map of how data is interconnected. Google uses this knowledge base to improve its search results through human-language technology and the semantic web.
Link Juice
in SEO terms, refers to the value or equity passed through hyperlinks from one webpage or site to another. Search engines like Google interpret these links as votes by other websites that your page or website is valuable and worth promoting.
refers to the criteria for ranking search results. This collective term reflects the significance of any given webpage on the internet and includes measures of domain authority, relevancy and trust.
PageRank
is Google's technology for rating the importance and quality of a webpage. It is one of the criteria Google uses to decide how high a website is placed in the Search Engine Results Pages (SERPs).
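The core idea, pages passing shares of their importance along their outgoing links, can be sketched as a short power iteration. The three-page link graph below is invented, and real PageRank involves far more signals than this toy version.

```python
DAMPING = 0.85  # Google's published damping factor in the original paper

def pagerank(links, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}        # start with equal rank
    for _ in range(iterations):
        new = {p: (1 - DAMPING) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:
                continue
            share = DAMPING * rank[page] / len(outgoing)
            for target in outgoing:
                new[target] += share           # each link passes on a share
        rank = new
    return rank

# Invented graph: a links to b, b to c, c back to both a and b.
ranks = pagerank({"a": ["b"], "b": ["c"], "c": ["a", "b"]})
```

Page "b" ends up with the highest rank here because it receives links from both "a" and "c", which matches the intuition of links as votes.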
Domain Authority (DA)
is a search engine ranking score that predicts how well a website will rank in the search engine results page. Developed by Moz, it can be used to compare websites or to trace the 'ranking strength' of a website over time.
Local Search
is the use of specialised search engines that enable users to submit geographically constrained search queries. Local search is a continuously evolving field, and ranking well in it is imperative for businesses that need to draw customers to a physical location.
Meta Descriptions
are snippets of text that provide a description of a webpage. Although they do not appear on the page itself – they can usually be seen in the page's source code – they help tell search engines what the page is all about.
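As a sketch, here is a helper that keeps a description within a commonly cited display length; roughly 155 characters is a rule of thumb, not an official limit, and the sample copy is invented.

```python
MAX_LEN = 155  # rough rule of thumb for what Google displays, not a hard limit

def meta_description(text, max_len=MAX_LEN):
    """Trim a description to max_len characters, ending with an ellipsis."""
    if len(text) <= max_len:
        return text
    return text[: max_len - 1].rstrip() + "…"

# The tag as it would appear in a page's <head>.
tag = '<meta name="description" content="%s">' % meta_description(
    "Business Cherub offers SEO services."
)
```

Descriptions longer than the limit get truncated in search results anyway, so trimming them deliberately keeps the cut-off under your control.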
NAP
is an abbreviation for Name, Address, Phone Number. Search engines use these details collectively to form a unique marker that separates one business from the rest of the businesses online. The more consistently a business presents its identity, the better its local search ranking will be; by listing your NAP details consistently, you validate your presence in a given area.
Nofollow
is an HTML attribute (rel="nofollow") that tells search engines like Google that a webpage is not endorsing the links it contains. The tag matters for search engine optimisation because it shows search engines that the page or site is not selling links or involved in black-hat schemes.
Online Directories
used to be an important factor in search engine optimisation. Viewed as the online version of the Yellow Pages, they hosted huge numbers of links, which passed link juice to the sites listed in them, thereby boosting those pages' rankings in Google's results.
Rel Canonical Tag
is a piece of HTML code used to mark web pages that are at risk of being interpreted as duplicate content. By adding it to pages with similar or identical content, webmasters can tell search engines which page is the original – the 'canonical' version – and which are subsequent copies.
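A canonical declaration is just a `<link>` tag in the page's `<head>`, so it can be pulled out with the standard library's parser; the sample markup and URL are invented.

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Record the href of a <link rel="canonical"> tag, if present."""

    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

finder = CanonicalFinder()
finder.feed('<head><link rel="canonical" href="https://example.com/page"></head>')
```

If `finder.canonical` is set, the page is declaring which URL search engines should treat as the original.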
Rel=Prev & Rel=Next
are attributes introduced by Google in September 2011 as a solution to the duplicate content issue. They are added to a website's HTML code to tell search engines that a certain group of successive pages should all be indexed together.
Rich Answers or Rich Results
are becoming more and more significant to SEO. Over the years, Google users have come to rely on rich results for quick answers to simple questions and for breakdowns of a broad range of topics. As Google continues to evolve and shift, its developers keep coming up with new and innovative ways to display answers to user queries.
Meta Robots Tags and Robots.txt
are used by webmasters and SEO agencies to give instructions to the search engine crawlers that crawl and index a website. They tell search engine spiders what to do with specific pages: for example, not to crawl a page at all, or to crawl it but not index it.
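Python's standard library ships a robots.txt parser, which makes the crawl rules easy to illustrate; the rules and URLs below are examples, not a real site's file.

```python
import urllib.robotparser

# Example robots.txt: block /private/ for every crawler, allow the rest.
rules = """\
User-agent: *
Disallow: /private/
Allow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

allowed = rp.can_fetch("*", "https://example.com/blog/post")
blocked = rp.can_fetch("*", "https://example.com/private/report")
```

The meta robots tag works at the page level instead, e.g. `<meta name="robots" content="noindex">` asks crawlers to visit but not index a page.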
Schema.org
was introduced in 2011 as a collaboration between the search giants Google, Bing, Yahoo and Yandex. It is a type of microdata created to provide a specific vocabulary of tags that lets webmasters communicate the meaning of web pages to the computer programs that read them – specifically search engine crawlers.
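Schema.org vocabulary is commonly embedded as JSON-LD. Below is a minimal sketch built with Python's json module; the business details are placeholders, and only the `@context`/`@type` structure reflects the Schema.org convention.

```python
import json

# Placeholder business details in Schema.org's LocalBusiness vocabulary.
markup = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Business Cherub",
    "address": {
        "@type": "PostalAddress",
        "addressLocality": "Sydney",
        "addressCountry": "AU",
    },
}

# Embedded in a page as: <script type="application/ld+json"> ... </script>
json_ld = json.dumps(markup, indent=2)
```

Crawlers that understand the vocabulary can read this block and know the page describes a local business, its name and its address.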
Co-Citation or SEO cocitation
is the symbiosis that occurs when websites discuss interconnected themes and concepts and mention each other. One of its most significant characteristics is the importance search engines like Google place on the words surrounding links. It was first discussed years ago, when Google updated its anti-spam protocols and prevented low-quality blog networks from influencing search engine rankings.
Sitemap
is basically a file in which you provide information about the pages, videos and other files on a website, and the relationships between them. It is created so that search engines like Google and Bing can discover a site's content, and it provides an effective way to position your website and pages in search results.
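A minimal XML sitemap can be generated with the standard library; the URLs below are placeholders, while the `urlset`/`url`/`loc` element names and the namespace follow the sitemaps.org protocol.

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a bare-bones sitemap.xml document as a string."""
    urlset = ET.Element("urlset", xmlns=NS)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url   # one <loc> per page
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap(["https://example.com/", "https://example.com/about"])
```

Real sitemaps often add optional fields such as `lastmod` per URL, but the skeleton above is enough for search engines to discover the listed pages.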
Spam
refers to the wide range of unwanted data, emails, pop-ups and links that we often encounter whenever we use the internet. It takes its name from the luncheon meat – unwanted most of the time, but always present. Aside from being unwanted, spam can be dangerous, misleading and problematic for a website in numerous ways.
Syndication Source Tag
was introduced back in 2010, alongside the original source tag. It is a meta tag intended for the news publishing industry to aid identification of original news stories. Websites used the syndication source tag on syndicated content to acknowledge to Google that their content is not the original, and to indicate where the original can be found.
URL Parameters
can be used to tell search engines how they should handle parts of your website based on their URLs. Parameters allow content to be filtered, organised and presented to the user, and handling them properly lets bots crawl your website more efficiently. Strictly speaking, URL parameters are the key-value pairs appended to a URL after a question mark, for example domain.com/products?colour=red&sort=price – distinct from folders within a URL path such as domain.com/folder-one/. Different parameter combinations can serve duplicate content, or content that should not show up in the search results page.
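Parsing parameters out of a URL is straightforward with the standard library; the URL below is an invented example.

```python
from urllib.parse import urlparse, parse_qs

url = "https://example.com/products?colour=red&sort=price"
parts = urlparse(url)            # splits scheme, host, path, query, ...
params = parse_qs(parts.query)   # query string -> dict of parameter lists
```

Seeing the same path with many different parameter combinations is exactly the situation where duplicate-content handling (such as a canonical tag) becomes important.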
User Reviews
are reviews written by consumers and published to a review website after testing a product or evaluating a service. Consumers commonly provide user reviews voluntarily, but some professionals are also paid to write reviews for products and/or services. User reviews are highly effective in SEO and digital marketing: aside from being an effective promotional tool, search engines like Google reward websites with positive user reviews with better search rankings. These days, more than 92% of consumers check user reviews before purchasing a product or service.
Vertical Search Engines
differ from regular search engines in that they focus on a specific slice of online content. They are also called topical search engines or specialty search engines. For example, Google's web search is the generic search engine model, while Google Image Search, Maps Search and News Search are vertical search engines.
Website Navigation
facilitates user movement from one web page to another. While it is often taken for granted, website navigation is an integral part of SEO. Not only does it help search engine bots crawl a website and find all of its content, it also helps organise pages and improves the website's user experience.
White Hat SEO and Black Hat SEO
are terms that have become quite common in search engine optimisation. Simply put, White Hat refers to SEO practices that adhere to search engine recommendations, while Black Hat refers to practices that break the rules search engines set. Although black-hat techniques may seem to provide quick and easy boosts in search engine rankings, sites that use them never stay on top for long.