
Canonical Tags Best Practice Guide


A definitive guide to using canonical tags. Supported by several practical examples, it explains what canonical tags are, how to implement them properly, and which common mistakes to avoid.

What are canonical tags?

In its purest form, a canonical tag is a signal within a web page that says, “this piece of content is a copy of another piece of content which you can find at location X.”

A canonical tag tells Google and the other search engines where the original, preferred version of a piece of content is located. Using the tag overcomes duplicate content issues with Google and other search engines.

What this means is that you can add third-party content to your website without the risk of duplication penalties; there are things to consider, of course, which we will get to.

How do Google and other search engines deal with duplicate content and similar pages?

In most cases, when a search engine crawls two or more web pages with very similar content, it will select only one of the pages to index. The selected page is generally the one that published the content first, although several other factors are considered.

In most cases, all other pages with similar content are ignored.

You may still find them indexed on the search engines, but they will have little to no positive effect on the website's overall authority, ranking, or scoring factors.

Search engines also look at a range of other factors when deciding which page to index, including which page was crawled first, how many backlinks each page has, and which one offers the most internal links.

Duplicate content is extremely bad for SEO. If you are developing a search engine optimisation strategy, make sure the content being produced (web pages, blog posts, etc.) is fresh, and avoid any duplication. Duplicate content can also harm a site's potential for increased conversion rates, as content will not be shown in its best possible form.

As websites grow, it is common for multiple web pages to contain very similar information or content that is nevertheless vital to the structure of the site.

Many web pages with similar information are particularly common on larger sites and eCommerce websites.

If a website has multiple pages with similar content, you can add a rel=canonical tag to the duplicate pages pointing at the preferred source page, which removes the risk of a duplicate content penalty.

Example of product page duplication for an eCommerce website.

In the example, the same product appears in three listings.

Look closely: each listing has a different URL, which you can see through the breadcrumbs. This means the product is shown three times with the same content and only very small variations between the listings.

These products are likely to be hit with a duplication penalty.

Now consider the implementation of the canonical tag.

We have signaled to the search engines that the original product is Bats > MRF > Classic, which corresponds to the preferred URL.

The other two URLs (products) are copies of the first product. Because the canonical tag tells the search engines that these two products are copies of the main product, we avoid a duplication penalty.
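
A minimal sketch of the implementation, assuming a hypothetical example.com address for the preferred product: each of the two duplicate product pages would carry a tag like this in its <head> section.

  <link rel="canonical" href="https://example.com/bats/mrf/classic" />

Both copies now declare the Bats > MRF > Classic URL as the original, and the search engines consolidate ranking signals onto that preferred page.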

Page duplication examples that require canonical links.

There will also be occasions where web page URLs have special characters added at the end of them. These special characters are called parameters and will always follow a question mark in the URL.

For example:
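
A hypothetical illustration, assuming an example.com store:

  https://example.com/bats
  https://example.com/bats?brand=mrf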

When a parameter is added to the URL, different page content can be shown even though the URL seemingly points at the same page.

Setting a parameter in a URL can have three different outcomes:

  1. The URL parameter will show completely different content.
  2. The URL parameter will filter out certain parts of a piece of content.
  3. The URL parameter will have no effect.

Setting up a canonical tag.

The first thing you need to do is decide which page is going to be set as the preferred URL. Usually this will be the version of the page you deem most important: you can look at the page with the most links feeding into it, or the page that gets the most visitors.

If you are using a CMS (Content Management System), such as WordPress or Magento, there are a lot of different Canonical Plugins that can be used to apply a canonical link. However, if you are going straight into the code, you will need to add a <link> to the <head> section of the additional pages but NOT the preferred page.
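
As a minimal sketch, assuming the preferred version of the page lives at a hypothetical https://example.com/bats, each additional (duplicate) page would include:

  <head>
    <link rel="canonical" href="https://example.com/bats" />
  </head>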

The tag shows the search engines the preferred URL for visitors wanting to access the bats page, and tells them that this is the page you would prefer visitors to reach over the other similar pages.

Yoast is a powerful SEO extension for WordPress sites. They have created a very useful guide, which helps further explain canonical considerations.

Canonical Tags: Common mistakes.

If canonical tags are wrongly used, there can be dire consequences. For example, let's say you set up your site so that your homepage was the preferred web page and each page within the site was marked as a copy of the homepage. In this instance, search engines would completely de-index (remove) all your web pages from search results.

There are some common mistakes which need to be avoided:

  • If you have a non-dynamic canonical tag on each page of your website, which then points to one preferred URL/web page, you will be committing SEO suicide.
  • Multiple canonical tags on a single web page are quite common. Remember, search engines will only count the first one, meaning any others will be discounted.
  • Always use complete URLs, which include the https:// section. A common mistake is forgetting this segment of the URL (see the sketch after this list).
  • Do not point product pages to category pages. Product pages always need to be indexed separately.
  • If using a canonical tag on a paginated page, you may run into problems.
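
As an illustration of the complete-URL point above, assuming a hypothetical example.com page:

  Wrong: <link rel="canonical" href="/bats" />
  Right: <link rel="canonical" href="https://example.com/bats" />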

Canonical tags and paginated pages.

Paginated URLs are simply a series of URLs that follow each other in sequence. Typical examples are the pages of a story, a series of products, or lists of blog posts.

If you were writing a story that had a number of chapters, and each chapter was a web page, you would likely only want to send users to the first page from the search engine results pages. The user could then access the second web page from the first, the third from the second, and so on.

In this case, a canonical tag would be damaging: pointing every web page/chapter at the first page would mean all content from page 2 onwards would be lost and have no search rank authority at all. This is something you should avoid, as users may wish to jump straight to page 2, 3, or 4.

Instead, your paginated pages should be treated as normal web pages, with search engines indexing them as individual pages rather than consolidating them into one piece of content.
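
A hedged sketch, assuming a hypothetical story paginated at example.com: each page references itself as the canonical, so every chapter stays independently indexable.

  <!-- On https://example.com/story?page=2 -->
  <link rel="canonical" href="https://example.com/story?page=2" />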

Self-referencing canonical URLs.

This is a fiercely debated topic when it comes to best practice SEO.

John Mueller from Google has created a best-practice guide, which is quite useful.

You can avoid the potential risk of a duplicate content penalty by implementing a self-referencing canonical, and many popular CMS platforms will allow URL parameters without changing the content.
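
A minimal sketch of a self-referencing canonical, assuming a hypothetical page at example.com: the page points at its own clean URL, so any parameterised variant (a tracking or session parameter, for instance) automatically declares the clean URL as the canonical.

  <!-- Served on /bats and on /bats?sessionid=123 alike -->
  <link rel="canonical" href="https://example.com/bats" />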

Cross-domain canonical URLs.

You can use canonical URLs that point to another domain. In this way, if you have a piece of content that was submitted to your website, but an external site also feels the content would be of benefit to their users — you can use the rel=canonical tag to ensure the URL is linked back to the original content on your site.
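
A hedged example, assuming your content lives at a hypothetical yoursite.com and a partner republishes it at partner.com: the republished copy carries a canonical that points back across domains.

  <!-- In the <head> of the republished copy on partner.com -->
  <link rel="canonical" href="https://www.yoursite.com/original-article" />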

Anthony Godley

Award Winning Digital Strategist & Marketeer

We are all about Digital Innovation, Digital Governance & Digital Strategy | helping our partners achieve brand success in competitive markets across Australia, New Zealand & Asia.
Fuell is one of Australia's go-to digital agencies, providing the level of digital creativity and digital leadership that companies require when competing in saturated markets and competitive industries.





Big Data in the Telecommunications Ecosystem


Big data analysis is the next innovative technique being deployed in the telecommunications (telecom) sector. Big data will tame the industry's abundance of data and enable harvesting information “gold” from existing data stores. Big data sources even include paper records, alongside innovative new ways of collecting data from numerous touch-points.

Telecoms have always created and analyzed vast quantities of data about their customers, covering financial and administrative transactions as well as operations. Telecom service providers have also been early adopters of data-related technologies, from old-school statistical analysis through data mining, knowledge management, and business intelligence applications.

According to a MindCommerce study:

“An average telecom operator generates billions of records per day, and data should be analyzed in real or near real-time to gain maximum benefit.”

For communications service providers (CSPs) to make use of insightful knowledge, much of the data must be processed in near real-time. Traditional systems would take days, weeks, even months to process the data, not to mention the complex variety of structured and unstructured data that would trip up legacy applications.

Big data analysis is used by telecommunications service providers (telecoms) in a variety of intriguing and practical ways that even a few years ago would not have been possible.

  • CSPs monitor network traffic to identify problems and make decisions that improve service and customer satisfaction. The operational diagnostic information also helps CSPs prioritize investment in the networks' physical and technical assets.
  • Telecoms analyze the metadata of call records to pinpoint fraudulent activity, protecting their customers and supporting criminal investigations.
  • Telecoms index mountains of documents, images, and manuals within minutes to help call center agents resolve customer issues. Agents can now quickly search for information previously locked away on paper. Faster resolution reduces call handling time, thereby reducing labor costs. These findable documents boost efficiency, which can increase both employee and customer satisfaction — and retention.
  • CSPs evaluate usage patterns to help create service plans that better suit their customers' needs. Taking care of customers cuts their costs and, more importantly to telecom companies, helps predict and reduce churn.
  • Telecom companies even use data pouring in from social media networks to optimize the content of, and investment in, marketing campaigns on the fly.

Telecom big data sources include the obvious, such as phone calls, emails, and multimedia messages. They also extend to geo-spatial information, transaction metadata, social media usage, log data, file downloads, sensor data, and more.

History of Big Data

Until recently, the variety and velocity of data were vexing. Disparate data being created at rapidly increasing rates presented insurmountable storage and processing dilemmas.

It may seem like big data is a recent scourge, but recording data and deciphering its value has been going on since 7000 B.C. The earliest modern 20th-century big data problems were US federal government projects.

If only Franklin Roosevelt had known that he was a pioneer in big data. Read more about these and other intriguing big data history facts.

But it would not be until the 21st century, in 2005, that Roger Mougalas from O'Reilly Media named the problem of managing and processing large data sets that cannot be conquered using traditional business intelligence tools.

The term “big data” came long after the recognition that a plethora of information existed. And each year, month, and day since, the amount of data has kept growing, making innovations critical to contain it all.

The Internet of Things (IoT) drives the unfathomable speeds and quantity of data from sensors that are used for calculations. These calculations are lightning-quick, often life-preserving. By their very nature, these calculations must result in instant decisions and actions.

Telecom’s Big Data Analytics Trends

Some of the hyped topics in telecom technology, such as virtual reality (VR), augmented reality (AR), and even the latest 5G, feel like fizzles to many consumers. But the misunderstanding stems mostly from the emerging innovations required to make them happen. There is a latency between what is possible and the human mind's ability to take it all in.

Not all technology can be understood by the layperson as the raw material for the next paradigm shift is presented. For sure, those technologies will deliver applications, both practical and entertaining, that will exponentially accelerate the creation of data. But asking everyone to “get” what is happening in telecommunications is like asking the community to understand a booster rocket: most people are not trained engineers and will not have a firm grasp of the technical aspects.

For example, Kevin Westcott, Deloitte's VP and leader of its US Telecom practice, predicted the popularity of e-sports would skyrocket. We all get that we want the popularity and reputation of e-sports to rise. But this prediction came before the world-altering COVID-19 pandemic.

When you send millions of individuals into isolation, with little to do but seek out streamed entertainment, something will change in our world and the world of commerce. The forced isolation of the pandemic has brought cancellations of entire seasons of the world's hottest sports organizations. E-sports, and the big data it generates, will likely heat up faster than the predicted trends.

Also, legalized sports betting is on the rise following a 2018 US Supreme Court decision lifting the federal ban on sports betting.

To follow suit — several states have already legalized sports betting. 5G’s low-latency, high-volume communications will enable real-time sports betting. With 5G already being deployed in stadiums and sports bars, betting from your seat or barstool is imminent. (Yes, we are all hoping the stadiums and sports bars will be opened soon.)

Telecoms are at the ready — as they should be — to analyze every wager. New telecom connectivity technologies like 5G fixed wireless and satellite internet will offer the backbones needed for disruptive, big-data birthing applications.

The aggressive growth of smart homes and cars, video on demand, streaming apps, gaming, and other entertainment and educational applications will continue producing even higher volumes of data. The data will then be analyzed to glean the insights for businesses and other operational decisions.

Big Data Analytics Solutions

Big data analytics technologies are evolving right along with the technologies and supporting infrastructure that created the formidable volumes of data in the first place. Companies must be able to collect data from different sources, analyze it, and distribute the resulting information to disparate databases, data centers, or data warehouses, depending on the specific needs of the organization.

Marketing Campaigns

The challenge with big data projects is finding skilled resources with experience to create cutting-edge architectures and supersonic data processing applications. These data processing applications must support data-driven business models and near-real-time hyper-targeted marketing campaigns.

Problem-Solving Teams

Teams for solving big data problems must have a wide variety of engineers, analysts, business experts, and system integrators. These specialized teams are rarely found in-house for most companies, even large telecom service providers.

Outsourcing

Outsourcing and staff augmentation are frequently used for big data ventures. For example, Vates, a leading big data analytics and systems integration company in Latin America, is at the center of some of the biggest telecom big data projects in the world.

Global Telecom Company

Vates was hired by a global telecom company for its engineering and agile project management prowess, joining the development of a system built by engineering teams located across the US, Chile, and Argentina.

IBM Streaming Analytics

The combined in-house and outsourced team utilized IBM Streaming Analytics to develop an architecture and big data analytics solutions that ingest and analyze structured and unstructured metadata from multiple sources in near-real-time.

One of the resulting systems was created using IBM Streams processes. The IBM system processes 35 million CSV files or 100 terabytes of data per month. You can read about these impressive big data analytics use cases for the Telecom company.

The Vates Expertise

In a follow-on project for the same telecom, Vates applied its expertise with data in different formats, originating from varied locations, to create solutions capable of checking network quality (NQI) in near-real-time.

Receiving XML files with measurements as detailed as antenna position, the system calculates deviations, enabling it to gauge signal quality at a particular location. With this microscopic big data, a technician can quickly make the necessary corrections.

Patterns

The insight-sifting solutions uncover hidden patterns, trends, and profound perceptions of customer behavior. Within this data is other useful business, operational, and marketing information — all bringing instantaneous business value.

As big data technologies evolve, solutions will likely be developed by blended teams. These blended teams will be using outsourcing and staff augmentation to overcome the challenges of staying on the leading edge.

Mario Barra

Mario Barra is the co-founder and CEO of Alaya Capital, a venture capital firm, and of Vates Software, a company focused on providing system integration, software development, and data science services. Vates Software is an up-and-coming tech company with over 500 employees and offices in the USA (California), Argentina, and Chile. Vates was founded in 1991, has achieved CMMI level status, uses agile methodologies, and offers certified teams.





samsung a31 bangla review | samsung a31 price in bangladesh | AFR Technology



samsung a31 bangla review | samsung a31 price in bangladesh | AFR Technology. Learn more about: samsung a31, samsung a31 price in bangladesh, samsung a31 bangla review, samsung a31 review, samsung a31 official video, samsung a31 camera test, samsung a31 unboxing, samsung a31 unboxing bangla, samsung a31 vs m31, samsung a31 atc, samsung a31 price, samsung a31 2020, samsung a31 vs a51, samsung galaxy a31, samsung galaxy a31 price in bangladesh, samsung galaxy a31 unboxing, samsung galaxy a31 2020, samsung galaxy a31 review, etc.

=======================================
Price: 21K-22K
=======================================

Watch the other videos on this channel:
Redmi Note 8 pro vs Realme X2 Bangla :
Samsung Note 10 Lite Bangla Review :
How to Clean Mobile Back Cover :
Oppo Reno 3 pro Review Bangla :
Redmi K30 vs Realme X2 Bangla :
Honor V30 Pro Bangla Review :
Samsung Galaxy A71 Bangla Review :
Best Camera Phone 2019 under 15000 TK :
Low Price Mobile in Bangladesh :
Vivo S5 Bangla Review :
Infinix S5 Bangla Review :


#SamsungA31
#SamsungGalaxyA31
#AFRtechnology



After coronavirus, AI could be central to our new normal


When we came out of the financial crisis of 2008, cloud computing kicked into high gear and started to become a pervasive, transformational technology. The current COVID-19 crisis could provide a similar inflection point for AI applications. While the implications of AI continue to be debated on the world stage, the rapid onset of a global health crisis and concomitant recession will accelerate its impact.

Times of crisis bring rapid change. Efforts to harness AI technologies to discover new drugs – either vaccine or treatment – have kicked into hyperdrive. Startups are racing to find solutions and established companies are forming partnerships with academia to find a cure. Other companies are researching existing drugs for their potential applicability. AI is proving a useful tool for dramatically reducing the time needed to identify potential drug candidates, possibly saving years of research. AI uses already put into action are screening for COVID-19 symptoms, decision support for CT scans, and automating hospital operations. A variety of healthcare functions have started to be performed by robots, from diagnosis to temperature monitoring.

Whatever the new normal becomes in the aftermath of the current crisis, it’s apparent that AI will be an even larger part of the technology landscape going forward — and not only for healthcare.

Growing automation

The Brookings Institution recently published its view that a recession is likely to bring about a spike in labor-replacing automation, with employers shedding less-skilled workers. It argues that automation surges during recessions and could bring long-term structural changes to the labor force. This echoes an article in which London School of Economics Professor Mirko Draca said a recession will bring with it a wave of AI and automation.

The recession will impact broad swaths of the economy. A recent story cites a CEO who had to close a factory after an employee became ill. Obviously, robots do not have this problem. As a result, the company has plans to speed its adoption of AI and machine learning over the next few years.


Call center operations have been similarly affected, leading to an increased interest in automation software. The Wall Street Journal reported that, in the midst of the current business disruption, companies are looking for temp workers and using automated bots to help filter callers who need a live person from those who can be helped digitally. Auto maker Hyundai may now move even faster towards automated production.

Warehouse, grocery, and delivery workers are striking in hopes of better wages and, especially, better working conditions. This comes at the same time as these positions are increasingly subject to automation, and AI products that improve automation continue to advance. Though the conditions prompting the work stoppages are understandable, it is equally possible these actions will only spur further efforts to embed automation. A completely automated retail supply chain, from warehouse to grocery or restaurant to home, is increasingly coming into view, though it will likely be several years before all the pieces are fully in place.

As technology advances, there is increasing acceptance of automation. “Americans are growing more comfortable shopping for food or electronics without the aid of another human,” according to the March 2020 Automated Retail Tracker. Mercer’s 2020 Global Talent Trends survey reveals 34% of employees expect their jobs to be replaced in three years. It is not only blue-collar work that will be affected. Gartner predicts emerging technologies such as virtual personal assistants and chatbots will replace close to 70% of managerial workload, leading to a complete overhaul of these roles.

Increased use of surveillance

In dire times, governments can assume broad powers. This was evident after 9/11 when the U.S. Congress quickly passed the USA PATRIOT Act that expanded surveillance. Many of the provisions were supposed to expire more than a decade ago. Yet, the program is still in place.

A very different crisis now is providing a similar impetus to increase surveillance. Governments worldwide are harnessing surveillance-camera footage, smartphone location data, and credit card purchase records to help trace the recent movements of people who may have contracted COVID-19 and establish virus transmission chains.

AI technologies are being deployed into augmented reality glasses, ostensibly to detect fevers with thermal imaging cameras. Similarly, facial recognition systems are being used as an alternative to technologies that rely on touch-based sensors. This information can be integrated with other data, such as phone location, to compile information on people and make determinations about permissible movement and behaviors. Yet facial recognition has had challenges with accuracy of identification and ethical applications, leading to calls for greater government regulation.

As with the search for effective drug candidates, the use of AI technologies on all fronts to battle a virus is needed in an all-hands-on-deck moment. World Health Organization executive director Dr. Michael Ryan said surveillance is part of what’s required for life to return to normal in a world without a vaccine. However, civil liberties experts warn that the public has little recourse to challenge these digital exercises of power once the immediate threat has passed. Human Rights Watch, Amnesty International, AI Now, and 104 other organizations urged governments to show leadership in tackling the pandemic by respecting human rights when using digital technologies to track and monitor people. In a joint statement, the organizations said that the virus must not be used as a cover to usher in a new era of greatly expanded systems of invasive digital surveillance.

The dilemma

As we face the current crisis as a society, we must resolve the competing values of protecting health while also ensuring privacy and liberty. And we must find a balance between business viability and protecting the ability of people to earn a reasonable living. There is no going back; we’re heading into a new normal. Immediately, the focus will have to be on managing the crisis with the best available tools. This period could be 12-24 months, until there is enough herd immunity, treatment therapies, and an effective vaccine.

During this time, governments will need to do everything possible to provide a social safety net, at least until business can resume and employment approaches pre-crisis levels. Concurrently, people should realize there will be new rules in the new normal, especially those who work in fields where automation is likely. They should use this period to learn new skills such as systems analysis and evaluation, problem solving, ideation, and leadership. Many companies, from Shell to Amazon, have announced plans to re-skill large segments of their workforce. More will need to do so.

Protecting privacy and liberty is perhaps even more challenging. Once surveillance technology is used in response to an immediate crisis, it is difficult to reverse. Surveillance does not need to be our manifest destiny. One proposal out of Europe would limit retention of collected data to only 14 days, the period of possible virus transmission. The only effective means to reasonably protect privacy is to require that surveillance powers assumed during a crisis expire when the crisis ends.

Gary Grossman is the Senior VP of Technology Practice at Edelman and Global Lead of the Edelman AI Center of Excellence.



