Local Search & SEO Glossary

LocalClarity's glossary with nearly 600 terms and definitions will quickly get you up to speed with both historical and current industry jargon.

C

check-in

A digital announcement of a customer's presence at a specific physical location, often a business. Check-ins are the key component of many location-based services including Foursquare, Facebook, and Yelp. Check-ins can be used as a vehicle for both tracking customers and rewarding them with special offers.

C

category

One of a set of approximately 2,000 default business types with which the local search engines try to associate each business in their index. Although each search engine and data aggregator has its own taxonomy, many categories are based on the North American Industry Classification System, or NAICS. The current Google Places for Business dashboard allows business owners to choose up to five categories, all of which must stem from Google's pre-chosen category choices.

C

cache

A technology that temporarily stores web content, such as images, to reduce future page loading times.
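
How long content may be reused is typically controlled with HTTP caching headers. The sketch below is a minimal illustration, assuming a Node.js server; the file path, port, and max-age value are invented for the example.

```typescript
import { createServer } from "http";
import { readFileSync } from "fs";

// Hypothetical server that serves one image and asks clients to cache it for 24 hours.
createServer((req, res) => {
  if (req.url === "/logo.png") {
    res.setHeader("Content-Type", "image/png");
    // "public" allows shared caches (CDNs, proxies) to store it; max-age is in seconds.
    res.setHeader("Cache-Control", "public, max-age=86400");
    res.end(readFileSync("logo.png")); // illustrative file path
  } else {
    res.statusCode = 404;
    res.end("Not found");
  }
}).listen(3000);
```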

C

canonical URL

An HTML code element that specifies a preferred website URL, when multiple URLs have the same or similar content, to reduce duplicate content. Also known as canonicalization.
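
A concrete illustration, with hypothetical URLs: the canonical element is a link tag in the page's head section. The sketch below adds one with standard browser DOM APIs, which is equivalent to writing the tag directly into the HTML.

```typescript
// Equivalent HTML: <link rel="canonical" href="https://www.example.com/blue-widgets/">
// It tells search engines which URL is the preferred version of the page, even if the
// same content is also reachable at variants like /blue-widgets?sort=price.
const canonical = document.createElement("link");
canonical.rel = "canonical";
canonical.href = "https://www.example.com/blue-widgets/"; // hypothetical preferred URL
document.head.appendChild(canonical);
```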

C

cached page

A snapshot of a webpage as it appeared when a search engine last crawled it.

C

ccTLD

A country-code top-level domain. For instance, a company based in the United Kingdom would have a domain like this: www.example.co.uk, where uk is the ccTLD.

C

canonicalization

In mathematics, when the same data can be represented in multiple ways, it is best to standardize that representation by establishing the data's canonical form, the one primary form in which it will be used. In the computer science field, the act of defining the canonical form of data is called canonicalization.

C

call tracking number

A phone number used to measure the success of specific marketing efforts and to determine the source of leads. For local businesses, call tracking numbers are not recommended, as they can lead to multiple problems, including the clouding of clear NAP signals.

C

clickbait

Content or headlines designed to generate the maximum number of clicks rather than focus on providing the best quality or value for users.

C

city landing page

Most commonly refers to a page on a website providing information about a specific location of a business, most typically in the multi-location business scenario. Also called "location landing pages," city landing pages can be useful in helping a local business achieve search engine visibility in multiple cities, while also offering content that has been carefully customized to a specific geographic audience. City landing pages may also be used by service area businesses, like plumbers or house painters, to showcase their work in a variety of cities where they offer services, despite lacking a physical location there. See also: landing page

C

claim

The act of verifying one's business information with a local search engine and taking ownership of the business listing at that search engine. Reduces risk of hijacking by spammers or competitors. Often involves a PIN setup process with the search engine, platform, or app.

C

cluster

A search engine's collection of information about a particular business location from all of its data sources. In some cases, a search engine's attempt to create a cluster is too "aggressive," causing distinct business listings to merge in its index. In other cases, its attempts to create a cluster may not be strong enough, causing multiple listings to appear for the same business.

C

click through rate

The rate at which users click on an advertisement, link, or other search engine result. CTR is one metric used for measuring the success of online campaigns. In the case of local businesses, it's hypothesized that specific types of clicks on Google Business Profile listings can positively impact rank. These would include clicks-to-call, clicks-to-website, and clicks-for-driving-directions.

C

cloaking

Cloaking is a black hat SEO technique that serves search engines content that differs from what the end user actually sees. It was once used to tell search engines what information was available inside media containers like Adobe Flash or videos, but progressive enhancement now fills that role. Unless you are up to no good, there is no legitimate reason to use cloaking today.

C

centroid

A concept in the local search industry used to define a central point of geography or activity. Understanding of the centroid has evolved significantly over the years as Google's weighting of specific ranking factors has changed. The centroid was initially defined as the geographic center of a city, with ranking benefits being perceived for businesses physically located near that point on the map. The concept of the centroid then broadened to include the concept of "industry centroids" as a ranking factor, as it was perceived that there could be one centroid located in a city's auto dealer row and another centroid in an area hosting multiple medical centers. At present, the most common understanding of the centroid is that it has been transformed into a descriptor of human users. Wherever a user is physically located at the time they search for something local, Google's results will be customized to display the businesses nearest to the user's device. This may be referred to as "proximity to the point of search" or the "user-as-centroid phenomenon".

C

competition

Competition (keyword) is a measure of how difficult it will be to rank for a particular keyword. Competition for a keyword varies with how popular the keyword is and how competitive the industry is.

C

competitor

A competitor is any website or listing you are competing against for online visibility. Results appearing before and after your business for a given query are your competitors. Online competitors may be different than offline competitors.

C

clustering

Clustering is the act of organizing websites into groups and categories. Not only does this make it a lot easier for search engines to look through, but it offers readers diversity in the top results.

C

Content Delivery Network

A Content Delivery Network (CDN) is a global network of hosting servers used to load your site from locations closer to each user. This improves loading times for worldwide visitors, which is important if you are marketing to international audiences.

C

consistency

Publishing identical core business details across the web. In particular, the consistency with which local business NAP information is published influences search engines' trust in the validity and accuracy of this data. The publication of consistent business information also safeguards against consumer misdirection and customer loss. See also: NAP / NAP+W (Name Address Phone + Website)

C

citation campaign

The marketing practice of auditing, cleaning up, and building citations for a local business on a variety of local business data platforms. The fundamental impacts of proper citation management have led to the development of citation management software products that reduce manual work while minimizing error. See also: citation, service area business

C

click depth

Click depth is the number of clicks it takes to get from the home page, or an entrance page, to a destination page on a website. The more clicks it takes, the less likely Google is to crawl the page or rank it. Pages closest to the homepage are considered the most authoritative and the most likely to be crawled and indexed by Google. Click depth matters for efficient crawling and for the flow of link equity; it therefore influences ranking indirectly.
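
To make the idea concrete, click depth can be computed as the shortest click path from the homepage through internal links. The sketch below uses a simple breadth-first search; the site structure is invented for illustration.

```typescript
// Internal links per page of a hypothetical site.
const links: Record<string, string[]> = {
  "/": ["/services", "/blog"],
  "/services": ["/services/plumbing"],
  "/blog": ["/blog/post-1"],
  "/services/plumbing": [],
  "/blog/post-1": [],
};

// Breadth-first search from the homepage: depth equals the number of clicks needed.
function clickDepths(home: string): Map<string, number> {
  const depth = new Map<string, number>([[home, 0]]);
  const queue = [home];
  while (queue.length > 0) {
    const page = queue.shift()!;
    for (const target of links[page] ?? []) {
      if (!depth.has(target)) {
        depth.set(target, depth.get(page)! + 1);
        queue.push(target);
      }
    }
  }
  return depth;
}

console.log(clickDepths("/").get("/services/plumbing")); // 2 clicks from the homepage
```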

C

cocitation

How frequently two websites (or webpages) are mentioned together by a third-party website, even if those first two items don't link to (or reference) each other. This is a way search engines can establish subject similarity.

C

core update

When Google makes broad updates to its core algorithm. Google sometimes announces a specific theme to their updates, such as the Page Experience update, but core updates are non-specific and happen several times a year.

C

conversion rate optimization

The process of improving the number or quality of conversions that occur on a website. Some popular CRO tactics include testing changes to website design, copy, images, price, call-to-action, and messaging.

C

comment spam

Poorly written comments, often off-topic and self-promotional, posted by spambots in the hopes of getting a free (but ultimately worthless) link.

C

content

Words, images, videos, or sounds (or any combination thereof) that convey information that is meant to be distributed to and consumed by an audience. One of the two most important Google ranking factors (along with links). Search engines want to reward content that is useful, informative, valuable, credible, unique, and engaging with better traffic and visibility.

C

content management system

A complex platform of computer code that allows a website to be easily edited or managed by someone with no knowledge of computer code. Popular content management systems include WordPress, Joomla, and Drupal.

C

content marketing

This term refers to the use of fresh, engaging and professionally written text on your website, in your blog, on social media platforms and landing pages. The goal of effective content marketing is to engage your prospects with beneficial and relevant information, but also to help increase your search engine rankings.

C

conversion

The process of convincing a website visitor to call, email, or visit a business offline (i.e., convert to a customer).

C

correlation

The extent to which a relationship exists between two or more elements. Often used in SEO research to infer relationships of variables on search rankings due to the black box nature of algorithms. Always remember, however, that correlation does not equal causation.

C

content is king

A phrase often used by speakers at conferences and writers on popular SEO (and digital marketing) publications. In this context, "content is king" usually means that content is essential for you to have any SEO, digital marketing, or business success. This phrase actually dates back to a Bill Gates essay, "Content is King", published January 3, 1996. Recommended reading: "Content is King", http://web.archive.org/web/20010126005200/http://www.microsoft.com/billgates/columns/1996essay/essay960103.asp (Wayback Machine)

C

conversion rate

The rate (expressed in a percentage) at which website users complete a desired action. This is calculated by dividing the total number of conversions by traffic, then multiplying by 100.
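
The calculation itself is simple; here is a minimal sketch with made-up traffic and conversion figures.

```typescript
// Conversion rate = (conversions / traffic) * 100, expressed as a percentage.
function conversionRate(conversions: number, visitors: number): number {
  if (visitors === 0) return 0; // avoid dividing by zero on pages with no traffic
  return (conversions / visitors) * 100;
}

console.log(conversionRate(40, 2000)); // 2, i.e. 2% of 2,000 visitors completed the desired action
```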

C

core web vitals

A set of metrics that measure page performance as it relates to user experience. Core Web Vitals were introduced alongside the Page Experience update as the main signals that indicate a good user experience: Largest Contentful Paint (LCP) for loading performance, First Input Delay (FID) for interactivity, and Cumulative Layout Shift (CLS) for visual stability. Google did confirm Core Web Vitals as a ranking factor but said that relevance and other factors may be more important.
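
In the browser, LCP and CLS can be observed with the standard PerformanceObserver API. The sketch below logs both; Google's web-vitals library wraps the same observers with extra handling, so treat this as a simplified illustration rather than a production measurement.

```typescript
// Largest Contentful Paint: render time of the largest element seen so far.
new PerformanceObserver((list) => {
  const entries = list.getEntries();
  const latest = entries[entries.length - 1];
  console.log("LCP candidate (ms):", latest.startTime);
}).observe({ type: "largest-contentful-paint", buffered: true });

// Cumulative Layout Shift: running sum of layout-shift scores not caused by user input.
let cls = 0;
new PerformanceObserver((list) => {
  for (const entry of list.getEntries() as any[]) {
    if (!entry.hadRecentInput) cls += entry.value;
  }
  console.log("CLS so far:", cls);
}).observe({ type: "layout-shift", buffered: true });
```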

C

cookie

A small file sent by a website and stored in a user's web browser while browsing the website. It allows websites to remember information about a user and display custom information, such as advertisements, when the user returns.
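
For illustration, browsers expose cookies to scripts through document.cookie; the cookie name, value, and lifetime below are invented.

```typescript
// Set a cookie the site can read on the user's next visit.
// max-age is in seconds (here, 30 days); path=/ makes it available site-wide.
document.cookie = "preferred_location=portland; path=/; max-age=2592000";

// Read cookies back: document.cookie returns every cookie as one "name=value; ..." string.
const cookies = Object.fromEntries(
  document.cookie
    .split("; ")
    .filter(Boolean)
    .map((pair) => pair.split("=") as [string, string])
);
console.log(cookies["preferred_location"]); // "portland"
```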

C

code to text ratio

The amount of text displayed on a page compared to the amount of code used to construct the page is called the code to text ratio. A higher ratio of text to code is considered to provide a better user experience but is not a direct ranking factor.
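
One rough way to estimate the ratio is to strip the markup and compare the remaining text length to the full page source. The sketch below uses naive tag stripping, so treat the result as approximate.

```typescript
// Approximate code-to-text ratio: visible text length as a percentage of total source length.
function codeToTextRatio(html: string): number {
  const text = html
    .replace(/<script[\s\S]*?<\/script>/gi, "") // drop inline scripts entirely
    .replace(/<style[\s\S]*?<\/style>/gi, "")   // drop inline styles entirely
    .replace(/<[^>]+>/g, " ")                   // strip the remaining tags
    .replace(/\s+/g, " ")
    .trim();
  return html.length === 0 ? 0 : (text.length / html.length) * 100;
}

console.log(codeToTextRatio("<html><body><p>Hello, local search!</p></body></html>")); // ~38
```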

C

crawl errors

Crawl errors refer to a number of issues that prevent search bots or other types of crawlers from accessing or parsing web resources. Such errors can include DNS errors, server connectivity issues, code bugs, or issues with key files, such as your robots.txt file. Avoiding crawl errors is one of your most important SEO tasks.

C

crawl

The act of a search engine reading a page. See also: spider, algorithm

C

crawl budget

The total number of URLs search engines can and want to crawl on a website during a specific time period.

C

crawlers

A crawler is a program used by search engines to collect data from the internet. When a crawler visits a website, it picks over the entire website's content (i.e., the text) and stores it in a databank. It also stores all the external and internal links to the website. The crawler visits the stored links at a later point in time, which is how it moves from one website to the next. Through this process, the crawler captures and indexes every website that has links to at least one other website. There are hundreds of web crawlers and bots scouring the internet; ten popular ones are listed below.

1. Googlebot: Googlebot is one of the most popular web crawlers on the internet today, as it is used to index content for Google's search engine. One great thing about Google's web crawler is that Google gives us a lot of tools and control over the process.
2. Bingbot: Bingbot is a web crawler deployed by Microsoft in 2010 to supply information to its Bing search engine. It replaced what used to be the MSN bot.
3. Slurp Bot: Yahoo Search results come from the Yahoo web crawler Slurp and Bing's web crawler, as much of Yahoo is now powered by Bing. Sites should allow Yahoo Slurp access in order to appear in Yahoo Mobile Search results.
4. DuckDuckBot: DuckDuckBot is the web crawler for DuckDuckGo, a search engine that has become quite popular lately as it is known for privacy and for not tracking users. It now handles over 12 million queries per day. DuckDuckGo gets its results from over four hundred sources, including hundreds of vertical sources delivering niche Instant Answers, DuckDuckBot (its crawler), and crowd-sourced sites (Wikipedia). It also shows more traditional links in the search results, sourced from Yahoo!, Yandex, and Bing.
5. Baiduspider: Baiduspider is the official name of the Chinese Baidu search engine's web crawling spider. It crawls web pages and returns updates to the Baidu index. Baidu is the leading Chinese search engine, with roughly an 80% share of the overall search engine market in Mainland China.
6. Yandex Bot: YandexBot is the web crawler for one of the largest Russian search engines, Yandex. According to LiveInternet, for the three months ended December 31, 2015, Yandex generated 57.3% of all search traffic in Russia.
7. Sogou Spider: Sogou Spider is the web crawler for Sogou.com, a leading Chinese search engine launched in 2004. As of April 2016 it ranked 103 in Alexa's internet rankings. Note: the Sogou web spider does not respect the robots.txt internet standard and is therefore banned from many websites because of excessive crawling.
8. Exabot: Exabot is the web crawler for Exalead, a search engine based in France. It was founded in 2000 and now has more than 16 billion pages indexed.
9. Facebook External Hit: Facebook allows its users to send links to interesting web content to other Facebook users. Part of how this works involves the temporary display of certain images or details related to the web content, such as the title of the webpage or the embed tag of a video. The Facebook system retrieves this information only after a user provides a link.
10. Alexa Crawler: Ia_archiver is the web crawler for Amazon's Alexa internet rankings. It collects information to show rankings for both local and international sites.
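
The fetch-and-follow loop described above can be sketched in a few lines. The start URL, link extraction, and page limit below are illustrative only; a real crawler would also respect robots.txt rules and rate limits, and would deduplicate URLs far more carefully.

```typescript
// Minimal breadth-first crawler: fetch a page, store its HTML, queue the links it contains.
async function crawl(startUrl: string, maxPages = 10): Promise<Map<string, string>> {
  const store = new Map<string, string>(); // URL -> raw HTML (the "databank")
  const queue: string[] = [startUrl];
  while (queue.length > 0 && store.size < maxPages) {
    const url = queue.shift()!;
    if (store.has(url)) continue;
    const html = await (await fetch(url)).text();
    store.set(url, html);
    // Collect absolute links and visit them later; this is how a crawler hops between sites.
    for (const match of html.matchAll(/href="(https?:\/\/[^"]+)"/g)) {
      queue.push(match[1]);
    }
  }
  return store;
}

crawl("https://www.example.com/").then((pages) => console.log("Crawled:", [...pages.keys()]));
```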

C

crawling

When a bot discovers your site, a new page or any updated pages, its job is to crawl them. First, it needs to read the base code, parse it and then index the resource, if there's any impact on search ranking. One of your most important SEO tasks is to make your resources easy for bots to crawl and index.

C

cascading style sheet

A type of website code which allows for easier page editing by designers and faster processing of HTML by search engines.

C

custom category

On April 2, 2013, the Google Places for Business dashboard ceased to accept custom-written categories; business owners must select pre-set categories only. Other local business indexes, however, may still allow the business owner to custom-create categories that describe what their business is. As of 2023, Google has moved on from custom categories and now offers thousands of structured categories to choose from. You cannot put in a custom category for your business, but if none of the available categories suit a business's needs, you can select a general category that closely aligns with the business or choose multiple categories to provide more context. See also: category

C

coupon

A discount used to enhance consumer engagement and increase the SEO visibility of a specific web presence.

C

correlation

Refers to an apparent relationship between two or more conditions wherein the relationship may or may not be interdependent. For instance, "When I stepped outside, I realized I was thirsty." Thirst was not brought on by having stepped outside. It was correlated, but there was no causation in this instance.

C

country indexes

Globally operating search engines, such as Google, usually have a separate index for each market. This means that, for example, there is a Google index for the US (google.com), a Google index for Japan (google.co.jp), and so on. Having national indexes helps the search engine tailor results to the search behavior (including but not limited to language) of each market. This provides a more reliable information resource that is more closely related to what users in the country are looking for. An inferior alternative would be to base results on a single universal index including data from all markets, but this would make it impossible to meet the specific needs of users in each country.

C

cache

This is the storage of web content in memory in order to serve it more readily to a user. Caching commonly occurs on both servers and browsers.

C

custom field

A field in a local business listing set aside for adding information not covered by the standard fields, for example, brands carried, years in business, or the availability of on-site parking.

C

customer journey

All of the potential moments (or touchpoints) at which a prospect is exposed to or engages with a brand. All of these interactions are designed to eventually persuade, influence, and convert that prospect into a customer, client, or subscriber. Though customer journeys can vary greatly by business type and industry, the journey is typically made up of four main stages: Awareness > Consideration > Decision > Retention. Google's Avinash Kaushik offers an alternative framework: See > Think > Do > Care. Also known as: Buying Process, Consumer Decision Journey, the Customer Journey to Online Purchase, Marketing Funnel, Path to Purchase, Purchase Funnel

C

ChatGPT

A variant of the GPT (Generative Pre-trained Transformer) models developed by OpenAI, specifically designed to generate conversational text based on the input it receives. ChatGPT is trained on a diverse dataset to handle a wide range of topics in a conversational manner. The technology underpinning ChatGPT involves deep learning and natural language understanding, making it effective for tasks such as chatbots, customer service, and interactive applications.

D

data

All the hard numbers that represent real customers (the who, what, where, when, why, and how), all of which are needed to make informed decisions about SEO strategies and tactics.

D

defective links

A defective link leads to nothing, often a 404 error page, or has no object that it is connected to. Most defective links have a bad address as their destination; programming errors can also cause them. Defective links have a negative impact on a website and make a crawler's job more difficult, which leads to the website appearing lower on search results pages.

D

deep link ratio

When an internal link points directly to a page other than the homepage on a site, this is known as a deep link. The ratio of deep links compared to links to your homepage is known as the deep link ratio. Having links pointing directly to deep pages on a site is considered an indicator of quality content; the more deep links you have, the better the site. There is no evidence, however, that deep link ratio has any direct impact on ranking.

D

direct traffic

In Google Analytics, users that navigate directly to the site by typing the URL into the browser or by clicking on a bookmark are counted as direct traffic. Google will also attribute to direct traffic any traffic sources it doesn't recognize.

D

data provider

A company with an explicit contract to supply local search engines with underlying business information. In the U.S., the major data providers are Infogroup, Localeze, Acxiom, and Factual. See also: aggregator, IYP (Internet Yellow Pages)

D

direction requests

The number of unique individuals who request directions to your client's business. This metric originally tracked the number of unique customers who requested directions to your client's business. It has been updated also to show the location of customers making the direction request. Changes were also made to make this metric more accurate. It now accounts for things like multi-tapping, direction request cancellation, and spam. The new direction requests metric is in the Interactions section of the Performance page on Search, while the old metric is on the Insights page.

D

dead-end page

A webpage that links to no other webpages. So called because once a user or bot arrives on this page, there is no place to move forward.

D

data aggregator

A data aggregator is a company that collects data on local businesses such as their name, address, phone number, opening hours, etc. in order to present it elsewhere online. Data is verified then sold (leased) to other companies in need of local business data. Companies that typically buy this data are online directories (e.g. YP.com), local-mobile applications, and mapping and GPS companies (e.g. TomTom).

D

domain age

Domain age is the time from the date a domain was first registered to the current date. For example, Search Engine Journal was registered on 10 June 2003, so it has a significant domain age. It was once considered that a greater domain age gave a domain more authority, but the idea of domain age as an influence on ranking has since been dismissed.

D

Domain Authority (DA)

Domain Authority is a widely used metric created by Moz, not Google or any other search engine. It assigns a score out of 100 to websites, at the domain level, to represent its performance in search engines.

D

domain popularity

Domain popularity denotes the number of different domains from which backlinks point to a website. Domain popularity is one of the main criteria for the importance of a website in the eyes of search engines, so excellent domain popularity may be an important success factor for good positioning in the SERPs. The former ranking factor of "link popularity" has been largely supplanted by domain popularity, because link popularity could be manipulated too easily. Google checks the value and reputation of a website based on the number of backlinks, but over time this statistic turned out to be only partially suitable: if a website is recommended by one source over and over again, with hundreds or even thousands of backlinks set from just one website, it is still in effect only a single recommendation. Therefore, as part of domain popularity (domain pop for short), an additional verification is done of how many sites the backlinks originate from. Each domain from which one or more backlinks are received is counted only once; a website that provides 100 links to another has the same value with respect to domain popularity as one with only a single outgoing link. This reveals how many different people consider a website useful and recommendable.

D

do-follow link

A do-follow link is a standard link that doesn't use the Nofollow attribute. Do-follow links pass on PageRank, which potentially helps the page or site they link to. However, do-follow links can also have a negative impact if they link to/from lower quality sources or irrelevant pages.

D

DMOZ

The Open Directory Project. This human-edited directory of websites launched June 5, 1998 and closed March 17, 2017.

D

domain name

The web address or homepage of a particular business or organization. Examples: JoesPlumbing.com, PortlandDentists.com, etc. Domain names are reserved and purchased from domain name registrars. See also: WHOIS, URL (uniform resource locator)

D

deep link

A deep link simply directs one to another page within the same website. Well-developed websites typically have plenty of high-quality deep links. Deep links also make it significantly easier for the customer to navigate the website.

Domain Link: https://www.localclarity.com

Top Level Link: https://www.localclarity.com/solutions/

Deep Link: https://www.localclarity.com/solutions/full-service-review-management/

D

de-indexing

De-indexing is the process of a search engine removing a web resource from its results pages, either temporarily or permanently. In this case, the resource is no longer accessible from the search engine that de-indexed it.

D

disallow

Disallow is a rule used in robots.txt files to determine which parts of the website search engines' bots should not crawl. This is often used to maximize the crawl budget and prevent duplicate content issues. When using it, make sure you're not blocking any page-rendering resources, such as CSS or JavaScript. If you're unsure whether your rules will block those resources, use the robots.txt tester in Google Search Console.
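
The rule itself lives in the site's robots.txt file. Below is a sketch of such a file, written as a TypeScript string for consistency with the other examples in this glossary; the blocked paths and sitemap URL are hypothetical. It blocks internal search and cart pages for all bots while explicitly keeping CSS and JavaScript crawlable.

```typescript
// Hypothetical robots.txt body: keep bots out of low-value pages, but never block
// page-rendering resources such as CSS or JavaScript.
const robotsTxt = `
User-agent: *
Disallow: /search/
Disallow: /cart/
Allow: /assets/*.css
Allow: /assets/*.js

Sitemap: https://www.example.com/sitemap.xml
`.trim();

console.log(robotsTxt);
```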

D

directory

Any website which lists business names and contact information in an organized fashion, typically in alphabetical order or by business type. Directory information is frequently assimilated by the local search engines. For more information see: Local Directories and Citations

D

dwell time

The amount of time that elapses between when a user clicks on a search result and then returns to the SERP from a website. Short dwell time (e.g., less than 5 seconds) can be an indicator of low-quality content to search engines.

D

driving directions

It is speculated that requests for driving directions on applications like Google Maps count as user behavior, and may indicate the popularity of a local business and thus, have some effect on rankings. See also: user behavior

D

domain history

Any activity, including backlinks and any website built on a domain previously, is known as domain history. If a previous website on a domain received a penalty, this will remain attached to the domain and cause issues for the new owner. It's recommended to always check the domain history before you purchase a domain.

D

doorway page

A doorway page is designed to attract SEO traffic but usually includes very little or irrelevant content. Some contain a lot of ads, while others simply aim to lead users on to another page that has nothing to do with what the doorway page ranks for in search engines.

D

DuckDuckGo

A search engine that was founded September 28, 2008. It is often praised for its heavy focus on user privacy and a lack of filter bubbles (search personalization). DuckDuckGo relies on more than 400 sources to serve its search results, including vertical search engines, its own crawler, DuckDuckBot, Bing, and Yandex. In 2016, 4 billion searches were conducted on DuckDuckGo.

D

duplicate content

Duplicate content is simply repeated content that appears in different locations or even on the same page. There are instances where duplicate content is justified (e.g., translations, different versions of the same product, etc.), but it's important that you know how to deal with these issues.

D

duplicate listing

A problematic scenario in which more than one Google Business Profile (GBP) local listing exists for a single business. Google allows only one listing per location, and intentional or accidental violation of this policy can lead to penalties and ranking issues. Steps must be taken to resolve duplicate listing issues.

D

disavow

Disavow is an action taken when you want to disassociate yourself from a particular site. Because anyone on the internet can link to you, there might be sites you want Google to ignore. Links are a vital part of the ranking algorithm, and it is important that search engines find mostly high-quality, relevant links to your site. Whether you are doing a proactive backlink audit or cleaning up after a link penalty, you will need a disavow file to upload to Google Search Console.

E

engaging content

Engaging content is content that keeps end users connected to a website for longer periods of time. This is generally done with unique articles or slideshows that provide relevant and interesting information.

E

editorial link

A link that is given by one website to another without the recipient asking or paying for it. Also known as: Natural Link.

E

engagement metrics

Methods to measure how users interact with webpages and content. Examples of engagement metrics include: click-through rate, conversion rate, bounce rate, time on page/site, new vs. returning visitors, frequency and recency, and dwell time.

E

ecommerce

The buying and selling of products, all conducted online.

E

engagement

A broad term that refers to the amount of time users spend and/or the number of actions they take with any website, page, resource or application.

E

.edu links

Educational-focused institutions have a top-level domain (TLD) of .edu. For example, stanford.edu. A link from such a site is known as a .edu link. Links from .edu sites were considered "hard to get" and thought to have more value for link building. As a result, link builders targeted .edu links until many of the lesser-known .edu sites became devalued by Google and any benefit of the link was ignored.

E

Expertise, Authoritativeness, Trustworthiness (E-A-T)

This is one of Google's most important "quality" metrics that aims to measure the usefulness and credibility of your site and its pages. Google wants to deliver the most useful possible content to users from sources it can trust and this metric analyses a wide range of signals to determine these factors. The amount of expertise, authoritativeness, and trustworthiness (E-A-T) that a webpage/website has is very important. Main content quality and amount, website information, and website reputation all inform the E-A-T of a website. Keep in mind that there are "expert" websites of all types, even gossip websites, fashion websites, humor websites, forum and Q&A pages, etc. In fact, some types of information are found almost exclusively on forums and discussions, where a community of experts can provide valuable perspectives on specific topics. Some topics require less formal expertise. Many people write extremely detailed, helpful reviews of products or restaurants. Many people share tips and life experiences on forums, blogs, etc. These ordinary people may be considered experts in topics where they have life experience. If it seems as if the person creating the content has the type and amount of life experience to make him or her an "expert" on the topic, Google will value this "everyday expertise" and not penalize the person/webpage/website for not having formal education or training in the field.

E

entity

Entities are unique things which exist independently, such as people, places or things, so a company can also be an entity, as can a country or planet.

E

entry page

This is the first page that a person sees when they visit a website. For a large majority of websites, this is also called the home page. A website's choice of entry page can make all the difference when it comes to relevance and the user finding the site/page useful.

E

enterprise local search engine optimization

Enterprise Local Search Engine Optimization (SEO) is a specialized strategy and set of practices aimed at improving the online visibility and search engine rankings of businesses or organizations with multiple physical locations. The goal is to attract more local customers and drive foot traffic to each location by optimizing online assets for relevant local searches.

E

entities

People, places, organizations, websites, events, groups, facts, and other things.

E

external link

An external link is a hyperlink that leads to a page or resource outside a particular website. It is the opposite of an internal link, which links to URLs within the same domain. Backlinks or inbound links are sometimes called external incoming links.

E

Ethical AI

The principles and practices that seek to ensure AI systems are developed and operated in a way that is morally sound and socially responsible. This involves considering the impact of AI technologies on individuals and society, ensuring fairness, transparency, accountability, and privacy in AI systems.

E

Explainable AI

The methods and techniques in AI that make the outputs of AI models transparent and understandable to humans. This involves designing AI systems in such a way that their decisions, predictions, and actions can be easily interpreted by users, allowing for greater insight into the AI’s functioning and rationale.

F

Facebook Local Search

Facebook once had an app and search tool called "Nearby," which has now changed to Facebook Local Search. This is a mobile search application for local searches. The main goal is to allow users to discover local businesses based on their current location. This is especially useful when launching a new location, or for areas with heavy tourist traffic, as it can drive the large Facebook audience to find your business when they likely would not have done so otherwise.

F

featured snippet

For certain queries, usually questions (i.e., who/what/where/when/why/how), Google sometimes shows a special block above the organic search results. This box contains a summary (in the form of paragraph, list, table, or video), as well as the publication date, page title, link to the webpage from which the answer originated, and URL. Also known as: Position Zero.

F

Facebook Graph Search

Launched in 2013, Facebook's internal search providing natural language results. Includes the ability to search for local places.

F

feed

A structured, automated list of content or data produced by a website. Feeds were created in order to allow users to subscribe to website updates. See also: RSS (really simple syndication), XML (eXtensible Markup Language)

F

Facebook

A major social sharing platform. Local businesses can create a Facebook business page, complete with location and contact information, and utilize this profile to interact with customers and potential customers. See also: social media (SM), Twitter, Facebook Local Search

F

factual

One of four primary data sources of local business data for all major search engines. See also: Acxiom, Infogroup, Localeze

F

frame

Search engines support frames and iframes to the extent that they can. Frames can cause problems for search engines because they don't correspond to the conceptual model of the web, in which one page displays only one URL. Pages that use frames or iframes display several URLs (one for each frame) within a single page. Google and other search engines try to associate framed content with the page containing the frames, but can't guarantee that they will. If you're concerned with how your site appears in Google search results, read Google's "Search Engines and Frames" documentation, which describes the use of the "NoFrames" tag to provide alternate content. If you use wording such as "This site requires the use of frames" or "Upgrade your browser" instead of providing alternate content on your site, then you'll exclude both search engines and individuals who have disabled frames in their browsers. For example, audio web browsers, such as those used in automobiles and by the visually impaired, typically do not support frames.

F

fragment URL

Any URL that contains a # character is a fragment URL. The portion of the URL to the left of the # identifies a resource that can be downloaded by a browser and the portion on the right, known as the fragment identifier, specifies a location within the resource.
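
The split between the downloadable resource and the fragment is easy to see with the standard URL API; the address below is hypothetical.

```typescript
const url = new URL("https://www.example.com/guide/local-seo#citations");

console.log(url.origin + url.pathname); // "https://www.example.com/guide/local-seo" (what the browser requests)
console.log(url.hash);                  // "#citations" (the fragment identifier, resolved in the browser)
```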

F

Fetch as Google

The Fetch as Google tool enables you to test how Google crawls or renders a URL on your site. You can use Fetch as Google to see whether Googlebot can access a page on your site, how it renders the page, and whether any page resources (such as images or scripts) are blocked to Googlebot. This tool simulates a crawl and render execution as done in Google's normal crawling and rendering process, and is useful for debugging crawl issues on your site.

See: https://support.google.com/webmasters/answer/6066468?hl=en

F

footer link

Footer links are sitewide links placed at the bottom of your website in the footer of every page.
