Month: November 2019

Abstracting WordPress Code To Reuse With Other CMSs: Concepts (Part 1)

Leonardo Losoviz

Making code that is agnostic of the CMS or framework has several benefits. For instance, through its new content editor Gutenberg, WordPress enables us to code components which can be used with other CMSs and frameworks too, such as Drupal and Laravel. However, Gutenberg’s emphasis on the re-utilization of code is focused on the client-side code of the component (JavaScript and CSS); concerning the component’s backend code (such as the provision of APIs that feed data to the component), there is no pre-established consideration.

Since these CMSs and frameworks (WordPress, Drupal, Laravel) all run on PHP, making their PHP code re-usable too will make it easier to run our components on all these different platforms. As another example, if we ever decide to replace our CMS with another one (as has recently happened, with many people decrying WordPress after its introduction of Gutenberg), having the application code be agnostic of the CMS simplifies matters: the more CMS-agnostic our application code is, the less effort will be required to port it to other platforms.

Starting with application code built for a specific CMS, the process of transforming it into CMS-agnostic code is what, in this article, I will call “abstracting code”. The more abstract the code is, the more easily it can be re-used with whatever CMS we choose.

Making the application completely CMS-agnostic is very tough though — even possibly impossible — since sooner or later it will need to depend on the specific CMS’s opinionatedness. Then, instead of attempting to achieve 100% code reusability, our goal must simply be to maximize the amount of code that is CMS-agnostic, making it reusable across different CMSs or frameworks (for the purpose of this article, these two terms will be used interchangeably). Migrating the application to a different framework will then not be without pain, but at least it will be as painless as possible.

The solution to this challenge concerns the architecture of our application: We must keep the core of the application cleanly decoupled from the specifics of the underlying framework, by coding against interfaces instead of implementations. Doing so will grant additional benefits to our codebase: We can then focus our attention almost exclusively on the business logic (which is the real essence and purpose of the application), causing the code to become more understandable and less muddled with the limitations imposed by the particular CMS.

This article is composed of two parts: in this first part we will conceptualize and design the solution for abstracting the code from a WordPress site, and in the second part we will implement it. The objective shall be to keep the code ready to be used with Symfony components, the Laravel framework, and October CMS.

Code Against Interfaces, Rely On Composer, Benefit From Dependency Injection

The design of our architecture will be based on the following pillars:

  1. Code against interfaces, not implementations.
  2. Create packages, distribute them through Composer.
  3. Dependency Injection to glue all parts together.

Let’s analyze them one by one.

Code Against Interfaces, Not Implementations

Coding against interfaces is the practice of interacting with a certain piece of code through a contract. A contract, which is set up through an interface from our programming language (PHP in our case since we are dealing with WordPress), establishes the intent of certain functionality, by explicitly stating what functions are available, what inputs are expected for each function, and what each function will return, and it is not concerned with how the functionality must be implemented. Then, our application can be cleanly decoupled from a specific implementation, not needing to know how its internals work, and being able to change to another implementation at any time without having to drastically change code. For instance, our application can store data by interacting with an interface called DataStoreInterface instead of any of its implementations, such as instances of classes DatabaseDataStore or FilesystemDataStore.
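
As a quick sketch of this idea (all names below are illustrative, not taken from any existing package), the contract and one of its implementations could look like this:

interface DataStoreInterface
{
  public function save(string $key, string $value): void;
  public function get(string $key): ?string;
}

class FilesystemDataStore implements DataStoreInterface
{
  public function save(string $key, string $value): void
  {
    // Persist the value to a file named after the key
    file_put_contents(sys_get_temp_dir() . '/datastore-' . md5($key), $value);
  }

  public function get(string $key): ?string
  {
    $file = sys_get_temp_dir() . '/datastore-' . md5($key);
    return file_exists($file) ? file_get_contents($file) : null;
  }
}

Since the application only type-hints DataStoreInterface, swapping FilesystemDataStore for DatabaseDataStore requires no change to the application code.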

In the context of WordPress, this implies that — by the end of the abstraction — no WordPress code will be referenced directly, and WordPress itself will simply be a service provider for all the functions that our application needs. As a consequence, we must consider WordPress as a dependency of the application, and not as the application itself.

Contracts and their implementations can be added to packages distributed through Composer and glued together into the application through dependency injection, which are the items we will analyze next.

Create Packages, Distribute Them Through Composer

Remember this: Composer is your friend! This tool, a package manager for PHP, allows any PHP application to easily retrieve packages (i.e. code) from any repository and install them as dependencies.

Note: I have already described how we can use Composer together with WordPress in a previous article I wrote earlier this year.

Composer is itself CMS-agnostic, so it can be used for building any PHP application. Packages distributed through Composer, though, may be CMS-agnostic or not. Therefore, our application should depend on CMS-agnostic packages (which will work for any CMS) as much as possible, and when not possible, depend on the corresponding package that works for our specific CMS.

This strategy can be used to code against contracts, as explained earlier on. The packages for our application can be divided into two types: CMS-agnostic and CMS-specific ones. The CMS-agnostic packages will contain all the contracts and all generic code, and the application will exclusively interact with these packages. For each CMS-agnostic package containing contracts, we must also create a CMS-specific package containing the implementation of the contracts for the required CMS, which is set into the application by means of dependency injection (which we’ll analyze below).

For example, to implement an API to retrieve posts, we create a CMS-agnostic package called “Posts”, with contract PostAPIInterface containing function getPosts, like this:

interface PostAPIInterface
{
  public function getPosts($args);
}

This function can be resolved for WordPress through a package called “Posts for WordPress”, which resolves the contract through a class WPPostAPI, implementing function getPosts to simply execute WordPress function get_posts, like this:

class WPPostAPI implements PostAPIInterface
{
  public function getPosts($args)
  {
    return get_posts($args);
  }
}

If we ever need to port our application from WordPress to another CMS, we must only implement the corresponding CMS-specific package for the new CMS (e.g. “Posts for October CMS”) and update the dependency injection configuration matching contracts to implementations, and that’s it!
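
To make the package split concrete, the application’s composer.json could then require the CMS-agnostic contracts package plus the WordPress-specific implementation (these package names are made up for illustration; they are not published packages):

{
  "require": {
    "php": ">=7.1",
    "me/posts-contracts": "^1.0",
    "me/posts-wordpress": "^1.0"
  }
}

Porting to October CMS would then mean replacing "me/posts-wordpress" with the corresponding "me/posts-octobercms" package, while "me/posts-contracts" and the application code stay untouched.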

Note: It is a good practice to create packages that only define contracts and nothing else. This way, it is easy for implementers to know exactly what must be implemented.

Dependency Injection To Glue All Parts Together

Dependency injection is a technique that allows declaring which object from the CMS-specific package (aka the “service provider”) is implementing which interface from the CMS-agnostic package (aka the “contract”), thus gluing all parts of the application together in a loosely-coupled manner.

Different CMSs or frameworks may already ship with their own implementation of a dependency injection component. For instance, whereas WordPress doesn’t have any, both Symfony and Laravel have their own solutions: DependencyInjection component and Service Container respectively.

Ideally, we should keep our application free from choosing a specific dependency injection solution, and leave it to the CMS to provide for this. However, dependency injection must be used also to bind together generic contracts and services, and not only those depending on the CMS (for instance, a contract DataStoreInterface, resolved through service provider FilesystemDataStore, may be completely unrelated to the underlying CMS). In addition, a very simple application that does not require an underlying CMS will still benefit from dependency injection. Hence, we are compelled to choose a specific solution for dependency injection.

Note: When choosing a tool or library, prioritize those which implement the corresponding PHP Standards Recommendation (in our case, we are interested in PSR-11), so they can be replaced without affecting the application code as much as possible (in practice, each solution will most likely have a custom initialization, so some re-writing of application code may be unavoidable).
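
For instance, PSR-11 standardizes the Psr\Container\ContainerInterface contract, with only two methods: get() and has(). As a rough sketch (the service identifier and surrounding helper function are assumptions of mine), application code can retrieve its services through that interface without knowing which container library sits behind it:

use Psr\Container\ContainerInterface;

function getPostAPI(ContainerInterface $container): PostAPIInterface
{
  // Any PSR-11 compliant container will do
  if (!$container->has(PostAPIInterface::class)) {
    throw new \RuntimeException('No implementation was registered for PostAPIInterface');
  }
  return $container->get(PostAPIInterface::class);
}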

Choosing The Dependency Injection Component

For my application, I have decided to use Symfony’s DependencyInjection component which, among other great features, can be set up through YAML and XML configuration files, and supports autowiring, which automatically resolves how different services are injected into one another, greatly reducing the amount of configuration needed.

For instance, a service Cache implementing a contract CacheInterface, like this one:

namespace MyPackage\MyProject;

class Cache implements CacheInterface
{
  private $cacheItemPool;
  private $hooksAPI;

  public function __construct(
    CacheItemPoolInterface $cacheItemPool,
    HooksAPIInterface $hooksAPI
  ) {
    $this->cacheItemPool = $cacheItemPool;
    $this->hooksAPI = $hooksAPI;
  }

  // ...
}

… can be set as the default service provider through the following services.yaml configuration file:

services:
  _defaults:
    bind:
      MyPackage\MyProject\HooksAPIInterface: '@hooks_api'
  hooks_api:
    class: \MyPackage\MyProject\ContractImplementations\HooksAPI
  cache:
    class: \MyPackage\MyProject\Cache
    public: true
    arguments:
      $cacheItemPool: '@cache_item_pool'
  cache_item_pool:
    class: \Symfony\Component\Cache\Adapter\FilesystemAdapter

As can be observed, class Cache requires two parameters in its constructor, and these are resolved and provided by the dependency injection component based on the configuration. In this case, while parameter $cacheItemPool is manually set, parameter $hooksAPI is automatically resolved through type-hinting (i.e. matching the expected parameter’s type with the service that resolves that type). Autowiring thus helps reduce the amount of configuration required to glue the services and their implementations together.
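
For completeness, here is a minimal sketch of how that services.yaml file could be loaded and compiled into a container (the file location and surrounding bootstrap code are assumptions; the actual initialization will depend on the application):

use Symfony\Component\Config\FileLocator;
use Symfony\Component\DependencyInjection\ContainerBuilder;
use Symfony\Component\DependencyInjection\Loader\YamlFileLoader;

$containerBuilder = new ContainerBuilder();
$loader = new YamlFileLoader($containerBuilder, new FileLocator(__DIR__));
$loader->load('services.yaml');

// Resolve all bindings and autowired arguments
$containerBuilder->compile();

// The 'cache' service was declared public, so it can be fetched directly
$cache = $containerBuilder->get('cache');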

Make Your Packages As Granular As Possible

Each package must be as granular as possible, dealing with a specific objective, and containing no more and no less code than is needed. This is by itself a good practice, in order to avoid creating bloated packages and to establish a modular architecture; however, it becomes mandatory when we do not know which CMS the application will run on. This is because different CMSs are based on different models, and it is not guaranteed that every objective can be satisfied by the CMS, or under what conditions. Keeping packages small and focused then enables us to fulfill the required conditions progressively, or to discard a package only when its corresponding functionality can’t be satisfied by the CMS.

Let’s take an example: If we come from a WordPress mindset, we could initially assume that entities “posts” and “comments” will always be a part of the Content Management System, and we may include them under a package called “CMS core”. However, October CMS doesn’t ship with either posts or comments in its core functionality, and these are implemented through plugins. For the next iteration, we may decide to create a package to provide for these two entities, called “Posts and Comments”, or even “Posts” under the assumption that comments are dependent on posts and bundled with them. However, once again, the plugins in October CMS don’t implement these two together: There is a plugin implementing posts and another plugin implementing comments (which has a dependency on the posts plugin). Finally, our only option is to implement two separate packages: “Posts” and “Comments”, and assign a dependency from the latter to the former one.

Likewise, a post in WordPress contains post meta attributes (i.e. additional attributes to those defined in the database model) and we may assume that every CMS will support the same concept. However, we can’t guarantee that another CMS will provide this functionality and, even if it did, its implementation may be so different from WordPress’s that the same operations could not be applied to the meta attributes.

For example, both WordPress and October CMS have support for post meta attributes. However, whereas WordPress stores each post meta value as a row on a different database table than where the post is stored, October CMS stores all post meta values in a single entry as a serialized JSON object in a column from the post table. As a consequence, WordPress can fetch posts filtering data based on the meta value, but October CMS cannot. Hence, the package “Posts” must not include the functionality for post meta, which must then be implemented on its own package “Post Meta” (satisfiable by both WordPress and October CMS), and this package must not include functionality for querying the meta attributes when fetching posts, which must then be implemented on its own package “Post Meta Query” (satisfiable only by WordPress).
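
As a sketch of how that granularity could look in code (the interface and class names are mine, chosen for illustration), the “Post Meta” package would define just the contract, and the WordPress-specific package would resolve it through get_post_meta:

interface PostMetaAPIInterface
{
  public function getMetaValue($post_id, string $key);
}

class WPPostMetaAPI implements PostMetaAPIInterface
{
  public function getMetaValue($post_id, string $key)
  {
    // Passing true as the third argument makes get_post_meta return a single value
    return get_post_meta($post_id, $key, true);
  }
}

The WordPress-only “Post Meta Query” package would then add its functionality on top, without burdening other CMSs with a contract they cannot satisfy.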

Identifying Elements That Need To Be Abstracted

We must now identify all the pieces of code and concepts from a WordPress application that need to be abstracted for it to run with any other CMS. Digging into an application of mine, I identified the following items:

  • accessing functions
  • function names
  • function parameters
  • states (and other constant values)
  • CMS helper functions
  • user permissions
  • application options
  • database column names
  • errors
  • hooks
  • routing
  • object properties
  • global state
  • entity models (meta, post types, pages being posts, and taxonomies, i.e. tags and categories)
  • translation
  • media

Long as it is, this list is not yet complete. There are many other items that need abstraction, which I will not presently cover. Such items include dealing with the location of assets (some frameworks may require placing image/font/JavaScript/CSS/etc. files in a specific directory) and CLI commands (WordPress has WP-CLI, Symfony has the Console component, and Laravel has Artisan, and there are commands for each of these which could be unified).

In the next (and final) part of this series of articles, we will proceed to implement the abstraction for all the items identified above.

Evaluating When It Makes Sense To Abstract The Application

Abstracting an application is not difficult, but, as shall be observed in the next article, it involves plenty of work, so we must consider carefully if we really need it or not. Let’s consider the advantages and disadvantages of abstracting the application’s code:

Advantages

  • The effort required to port our application to other platforms is greatly reduced.
  • Because the code reflects our business logic and not the opinionatedness of the CMS, it is more understandable.
  • The application is naturally organized through packages which provide progressive enhancement of functionalities.

Disadvantages

  • Extra ongoing work.
  • Code becomes more verbose.
  • Longer execution time from added layers of code.

There is no magic way to determine if we’ll be better off by abstracting our application code. However, as a rule of thumb, I’ll propose the following approach:

Concerning a new project, it makes sense to establish an agnostic architecture, because the required extra effort is manageable, and the advantages make it well worth it; concerning an existing project, though, the one-time effort to abstract it could be very taxing, so we should analyze what is more expensive (in terms of time and energy): the one-time abstraction, or maintaining several codebases.

Conclusion

Setting up a CMS-agnostic architecture for our application allows us to port it to a different platform with minimal effort. The key ingredients of this architecture are to code against interfaces, distribute these through granular packages, implement them for a specific CMS in a separate package, and tie all parts together through dependency injection.

Other than a few exceptions (such as deciding to choose Symfony’s solution for dependency injection), this architecture attempts to impose no opinionatedness. The application code can then directly mirror the business logic, and not the limitations imposed by the CMS.

In the next part of this series, we will implement the code abstraction for a WordPress application.


Mixcloud data breach exposes over 20 million user records

A data breach at Mixcloud, a U.K.-based audio streaming platform, has left more than 20 million user accounts exposed after the data was put on sale on the dark web.

The data breach happened earlier in November, according to a dark web seller who supplied a portion of the data to TechCrunch, allowing us to examine and verify the authenticity of the data.

The data contained usernames, email addresses, and passwords that appear to be scrambled with the SHA-2 algorithm, making the passwords near impossible to unscramble. The data also contained account sign-up dates and the last-login date. It also included the country from which the user signed up, their internet (IP) address, and links to profile photos.

We verified a portion of the data by validating emails against the site’s sign-up feature, though Mixcloud does not require users to verify their email addresses.

The exact amount of data stolen isn’t known. The seller said there were 20 million records, but listed 21 million records on the dark web. But the data we sampled suggested there may have been as many as 22 million records, based on unique values in the data set we were given.

The data was listed for sale for $4,000, or about 0.5 bitcoin. We’re not linking to the dark web listing.

Mixcloud last year secured an $11.5 million cash injection from media investment firm WndrCo, led by Hollywood media proprietor Jeffrey Katzenberg.

It’s the latest in a string of high profile data breaches in recent months. The breached data came from the same dark web seller who also alerted TechCrunch to the StockX breach earlier this year. The apparel trading company initially claimed its customer-wide password reset was for “system updates,” but later came clean, admitting it was hacked, exposing more than four million records, after TechCrunch obtained a portion of the breached data.

When reached, Mixcloud spokesperson Lisa Roolant did not comment beyond a boilerplate corporate statement, nor did the spokesperson answer any of our questions — including if the company planned to inform regulators under U.S. state and EU data breach notification laws.

As a London-based company, Mixcloud falls under U.K. and European data protection rules. Companies can be fined up to 4% of their annual turnover for violations of European GDPR rules.

Corrected the fourth paragraph to clarify that emails were validated against the site’s sign-up feature, and not the password reset feature. Updated to include comment from the company.


Here’s How to Make Wireframing Work for SEO Success

Posted by atxutexas05


(image credit: Flickr)

Search optimization and engaging design always seem to be at odds. Things like banners and graphics that designers love often conflict with SEOs’ need for crawlable content. However, this doesn’t always need to be the case. In fact, if you include various SEO requirements within the initial design mockups before a site’s launch, you can comfortably fit the needs of SEO and design together just like puzzle pieces.

It all begins with the initial wireframe.

What are wireframes?


(Image credit: Flickr)

Wireframing can be anything from a low fidelity sketch to a fully designed (but non-functional) user interface. These mockups allow designers to share their ideas without committing too much time and effort, which allows designers to iterate and reiterate quickly and easily so the needs of various members of a team can be established and addressed before launch.

This, of course, includes the SEO professionals.

SEO is a complex field with a lot of emphasis on site architecture, so it’s no stretch to say it should be included as a primary concern in the initial design process. In a post from way back in 2008, Moz community member Amplified-media described the process of building a wireframe specifically for SEO purposes. He noted that he created SEO-specific wireframes in order to describe to clients the ways in which internal navigation, metatags, and content could be optimized. Certainly, we’ve come a long way in both the design and SEO worlds in the last 8 years, but this is still a solid strategy.

Wireframes allow designers to plan user flows and overall aesthetics, while they allow developers to concentrate on the functionality of a website. For SEOs, wireframing can help you plan optimized on-page elements as well as opportunities for generating leads, conversions, and interlinking. Wireframing can even help you prepare your keyword analysis for each page.

Let’s take a deeper look at how you can begin implementing your SEO strategy during the wireframing phase.

Content first

(Image credit: Pixabay)

Web design is defined by the content it’s presenting. Oftentimes, a designer is asked to produce a mockup without any clear notion of the ideas it’s supposed to convey. There have been times when our product team has gone forward without bringing every key team member to the table. Once, my team went through the entire development process but realized near the end that the SEO needs weren’t accounted for: Keyword length exceeded character limits; there wasn’t enough content; and we were forced to include SEO haphazardly where we could fit it.

Obviously, this wasn’t an ideal strategy, and the launch had to be pushed back.

To get ahead of this problem, we’ve altered our approach. Now SEO professionals are always at the table during the ideation phase, and we also collaborate during the wireframing phase. We write our text first: we add headlines, taglines, and body text, and determine keywords before a wireframe is ever presented. Only then do we hand it over to the designer to see what they come up with.

If you’re following this model, keep in mind that it’s an iterative process. Don’t be surprised if you’re asked for rewrites or to compromise on length. Or, alternatively, you can go low fidelity and present the designer with a content outline. Give them the bare bones of your ideas for the text and give them suggestions on how to present it.

The point is to collaborate: let designers know ahead of time that SEO isn’t going to be a roadblock to what they want to accomplish aesthetically, provided they’re involved in the conceptual discussions at the beginning of the process.

The same goes for UX designers. Let them know that usability is important to SEO as well.

Potential problems and quick fixes

(Image credit: Pixabay)

In a recent article, Justin Taylor describes some of the so-called discrepancies involved when designing a website that’s optimized for both search and users. Mainly the fight is between the need for text (SEOs) versus imagery (designers). He mentions some great fixes for this problem such as:

  • Webfonts rather than graphics
  • Expandable content blocks or “divs,” which reveal hidden text in response to a mouse click
  • Mouse-overs which are animated content blocks that reveal text in response to a mouse hovering over the block

These are all fantastic workarounds that address the need for aesthetics and allow for crawlable text to be included on page. I highly recommend you take a moment to check out Justin’s article because it’s extremely thorough.

During the wireframing process, all an SEO needs to do in order to have workarounds like the ones mentioned above included in the design is be present at the planning table, and then to let the designers/developers know that these strategies could meet everyone’s needs. As I mentioned before, collaboration is the key.

The nonexistent conflict between design and SEO

(Image credit: Pixabay)

SEO is too often treated as an afterthought, subsidiary to design, development, and usability. If you can frame SEO as something that can be most useful when implemented concurrently with the other concerns in the planning phase, then you can minimize any potential conflicts between team members.

“Conflict” is a keyword here because there is a persistent myth that SEO always runs counter to the needs of both visual and UX designers. This is patently false. Design and SEO can work very well together in most circumstances.

I’ve had specific experiences explaining to project managers that search optimization is only going to enhance the usability of a page. My go-to example in this situation is that of headline creation: I point out that the whole purpose of search spiders parsing each page is to emulate the way users evaluate content. That means headlines need to clearly and quickly communicate exactly what the content is about. This serves both ends and usually establishes the point. There are, of course, numerous other examples that can be used to disprove the supposed UX/SEO conflict.

Flat design, for instance, is an immensely popular trend that meshes perfectly with SEO efforts, mainly because of the minimal size of flat illustrations. Whereas high-definition photography can slow down a site’s loading time, flat illustrations are light and quick to load.

Navigation is another point of intersection for SEO and design. Interlinking from the homepage to subpages offers an SEO boost and when artfully implemented it can also be engaging to users. Visual hierarchy and headings are likewise close bedfellows. Using web fonts instead of graphics for banners, CTAs, and other text elements mentioned above can allow you to create linking and meta-tagging opportunities.

Conclusion

Success must be planned for, and it requires careful collaboration and planning from start to finish. Adding SEO considerations late in the design process can make your designs look off-kilter and your SEO seem patchworked. Moreover, if the site feels wonky to users, your bounce rate will go up and all your optimization efforts will be for naught.

That’s why implementing an SEO strategy during the wireframing phase is so important.

What are your thoughts on using wireframing to bridge the gap between SEO and UX considerations?


Solar Trees Powering The Park Of South Florida

Joining South Florida’s lush, green canopy of real trees are a new crop of solar trees. These “trees” have blue trunks and bear no fruit, but supply clean energy to whoever needs it.

If you’re at the beach and your phone starts to die, you can charge it right here using Solar Power.

Here’s how the solar trees work: Each solar tree comes with two solar panels up top. Some of the energy collected powers the community’s grid, and some goes to a nearby box that sends electricity to plugs where phones or computers can be charged.

The post Solar Trees Powering The Park Of South Florida appeared first on LatestSolarNews.

All About Fraggles (Fragment + Handle) – Whiteboard Friday

Posted by Suzzicks

What are “fraggles” in SEO and how do they relate to mobile-first indexing, entities, the Knowledge Graph, and your day-to-day work? In this glimpse into her 2019 MozCon talk, Cindy Krum explains everything you need to understand about fraggles in this edition of Whiteboard Friday.


Video Transcription

Hi, Moz fans. My name is Cindy Krum, and I’m the CEO of MobileMoxie, based in Denver, Colorado. We do mobile SEO and ASO consulting. I’m here in Seattle, speaking at MozCon, but also recording this Whiteboard Friday for you today, and we are talking about fraggles.

So fraggles are obviously a name that I’m borrowing from Jim Henson, who created “Fraggle Rock.” But it’s a combination of words. It’s a combination of fragment and handle. I talk about fraggles as a new way or a new element or thing that Google is indexing.

Fraggles and mobile-first indexing

Let’s start with the idea of mobile-first indexing, because you have to kind of understand that before you can go on to understand fraggles. So I believe mobile-first indexing is about a little bit more than what Google says. Google says that mobile-first indexing was just a change of the crawler.

They had a desktop crawler that was primarily crawling and indexing, and now they have a mobile crawler that’s doing the heavy lifting for crawling and indexing. While I think that’s true, I think there’s more going on behind the scenes that they’re not talking about, and we’ve seen a lot of evidence of this. So what I believe is that mobile-first indexing was also about indexing, hence the name.

Knowledge Graph and entities

So I think that Google has reorganized their index around entities or around specifically entities in the Knowledge Graph. So this is kind of my rough diagram of a very simplified Knowledge Graph. But Knowledge Graph is all about person, place, thing, or idea.

Nouns are entities. Knowledge Graph has nodes for all of the major person, place, thing, or idea entities out there. But it also indexes or it also organizes the relationships of this idea to this idea or this thing to this thing. What’s useful for that to Google is that these things, these concepts, these relationships stay true in all languages, and that’s how entities work, because entities happen before keywords.

This can be a hard concept for SEOs to wrap their brain around because we’re so used to dealing with keywords. But if you think about an entity as something that’s described by a keyword and can be language agnostic, that’s how Google thinks about entities, because entities in the Knowledge Graph are not written up per se, or rather their unique identifier isn’t a word, it’s a number, and numbers are language agnostic.

But if we think about an entity like mother, mother is a concept that exists in all languages, but we have different words to describe it. But regardless of what language you’re speaking, mother is related to father, is related to daughter, is related to grandfather, all in the same ways, even if we’re speaking different languages. So if Google can use what they call the “topic layer” and entities as a way to filter in information and understand the world, then they can do it in languages where they’re strong and say, “We know that this is true absolutely 100% all of the time.”

Then they can apply that understanding to languages that they have a harder time indexing or understanding, they’re just not as strong or the algorithm isn’t built to understand things like complexities of language, like German where they make really long words or other languages where they have lots of short words to mean different things or to modify different words.

Languages all work differently. But if they can use their translation API and their natural language APIs to build out the Knowledge Graph in places where they’re strong, then they can use it with machine learning to also build it and do a better job of answering questions in places or languages where they’re weak. So when you understand that, then it’s easy to think about mobile-first indexing as a massive Knowledge Graph build-out.

We’ve seen this happening statistically. There are more Knowledge Graph results and more other things that seem to be related to Knowledge Graph results, like people also ask, people also search for, related searches. Those are all describing different elements or different nodes on the Knowledge Graph. So when you see those things in the search, I want you to think, hey, this is the Knowledge Graph showing me how this topic is related to other topics.

So when Google launched mobile-first indexing, I think this is the reason it took two and a half years is because they were reindexing the entire web and organizing it around the Knowledge Graph. If you think back to the AMA that John Mueller did right about the time that Knowledge Graph was launching, he answered a lot of questions that were about JavaScript and href lang.

When you put this in that context, it makes more sense. He wants the entity understanding, or he knows that the entity understanding is really important, so the href lang is also really important. So that’s enough of that. Now let’s talk about fraggles.

Fraggles = fragment + handle

So fraggles, as I said, are a fragment plus a handle. It’s important to know that fraggles — let me go over here — fraggles and fragments, there are lots of things out there that have fragments. So you can think of native apps, databases, websites, podcasts, and videos. Those can all be fragmented.

Even though they don’t have a URL, they might be useful content, because Google says its goal is to organize the world’s information, not to organize the world’s websites. I think that, historically, Google has kind of been locked into this crawling and indexing of websites and that that’s bothered it, that it wants to be able to show other stuff, but it couldn’t do that because they all needed URLs.

But with fragments, potentially they don’t have to have a URL. So keep these things in mind — apps, databases and stuff like that — and then look at this. 

So this is a traditional page. If you think about a page, Google has kind of been forced, historically by their infrastructure, to surface pages and to rank pages. But pages sometimes struggle to rank if they have too many topics on them.

So for instance, what I’ve shown you here is a page about vegetables. This page may be the best page about vegetables, and it may have the best information about lettuce, celery, and radishes. But because it’s got those topics and maybe more topics on it, they all kind of dilute each other, and this great page may struggle to rank because it’s not focused on the one topic, on one thing at a time.

Google wants to rank the best things. But historically they’ve kind of pushed us to put the best things on one page at a time and to break them out. So what that’s created is this “content is king, I need more content, build more pages” mentality in SEO. The problem is everyone can be building more and more pages for every keyword that they want to rank for or every keyword group that they want to rank for, but only one is going to rank number one.

Google still has to crawl all of those pages that it told us to build, and that creates this character over here, I think, Marjory the Trash Heap, which if you remember the Fraggles, Marjory the Trash Heap was the all-knowing oracle. But when we’re all creating kind of low- to mid-quality content just to have a separate page for every topic, then that makes Google’s life harder, and that of course makes our life harder.

So why are we doing all of this work? The answer is because Google can only index pages, and if the page is too long or too many topics, Google gets confused. So we’ve been enabling Google to do this. But let’s pretend, go with me on this, because this is a theory, I can’t prove it. But if Google didn’t have to index a full page or wasn’t locked into that and could just index a piece of a page, then that makes it easier for Google to understand the relationships of different topics to one page, but also to organize the bits of the page to different pieces of the Knowledge Graph.

So this page about vegetables could be indexed and organized under the vegetable node of the Knowledge Graph. But that doesn’t mean that the lettuce part of the page couldn’t be indexed separately under the lettuce portion of the Knowledge Graph and so on, celery to celery and radish to radish. Now I know this is novel, and it’s hard to think about if you’ve been doing SEO for a long time.

But let’s think about why Google would want to do this. Google has been moving towards all of these new kinds of search experiences where we have voice search, we have the Google Home Hub kind of situation with a screen, or we have mobile searches. If you think about what Google has been doing, we’ve seen the increase in people also ask, and we’ve seen the increase in featured snippets.

They’ve actually been kind of, sort of making fragments for a long time or indexing fragments and showing them in featured snippets. The difference between that and fraggles is that when you click through on a fraggle, when it ranks in a search result, Google scrolls to that portion of the page automatically. That’s the handle portion.

So handles you may have heard of before. They’re kind of old-school web building. We call them bookmarks, anchor links, anchor jump links, stuff like that. It’s when it automatically scrolls to the right portion of the page. But what we’ve seen with fraggles is Google is lifting bits of text, and when you click on it, they’re scrolling directly to that piece of text on a page.

So we see this already happening in some results. What’s interesting is Google is overlaying the link. You don’t have to program the jump link in there. Google actually finds it and puts it there for you. So Google is already doing this, especially with AMP featured snippets. If you have an AMP featured snippet, so a featured snippet that’s lifted from an AMP page, when you click through, Google is actually scrolling and highlighting the featured snippet so that you could read it in context on the page.

But it’s also happening in other kind of more nuanced situations, especially with forums and conversations where they can pick a best answer. The difference between a fraggle and something like a jump link is that Google is overlaying the scrolling portion. The difference between a fraggle and a site link is site links link to other pages, and fraggles, they’re linking to multiple pieces of the same long page.

So we want to avoid continuing to build up low-quality or mid-quality pages that might go to Marjory the Trash Heap. We want to start thinking in terms of can Google find and identify the right portion of the page about a specific topic, and are these topics related enough that they’ll be understood when indexing them towards the Knowledge Graph.

Knowledge Graph build-out into different areas

So I personally think that we’re seeing the build-out of the Knowledge Graph in a lot of different things. I think featured snippets are kind of facts or ideas that are looking for a home or validation in the Knowledge Graph. People also ask seem to be the related nodes. People also search for, same thing. Related searches, same thing. Featured snippets, oh, they’re on there twice, two featured snippets. Found on the web, which is another way where Google is putting expanders by topic and then giving you a carousel of featured snippets to click through on.



 So we’re seeing all of those things, and some SEOs are getting kind of upset that Google is lifting so much content and putting it in the search results and that you’re not getting the click. We know that 61% of mobile searches don’t get a click anymore, and it’s because people are finding the information that they want directly in a SERP.

That’s tough for SEOs, but great for Google because it means Google is providing exactly what the user wants. So they’re probably going to continue to do this. I think that SEOs are going to change their minds and they’re going to want to be in those windowed content, in the lifted content, because when Google starts doing this kind of thing for the native apps, databases, and other content, websites, podcasts, stuff like that, then those are new competitors that you didn’t have to deal with when it was only websites ranking, but those are going to be more engaging kinds of content that Google will be showing or lifting and showing in a SERP even if they don’t have to have URLs, because Google can just window them and show them.

So you’d rather be lifted than not shown at all. So that’s it for me and featured snippets. I’d love to answer your questions in the comments, and thanks very much. I hope you like the theory about fraggles.

Video transcription by Speechpad.com


2019 Thanksgiving e-commerce sales show 14% rise on 2018, $470M spent so far

With popular social networks seeing some downtime, most shops closed, and many people off work today for Thanksgiving, bargain hunters are flocking online to start their holiday shopping.

Adobe says that as of 2pm Pacific time, $2.1 billion had been spent online, up 20.2% on the same period a year ago.

That shows that as the day went on, sales accelerated. Prior to that, at 10am, Adobe said $470 million had been spent online, a rise of 14.5% compared to sales figures from the same time last year. Overall, sales patterns are largely on track to hit Adobe’s prediction of $4.4 billion in sales today.

Over at Shopify, as of 11.30 Pacific time, the e-commerce backend provider noted that it was seeing around 4,500 transactions per minute, working out to just under $400,000 spent each minute.

Within that, some 66% of all sales were being made on mobile devices, with apparel and accessories the most popular category, and New York the top-selling city. Average cart price has been $78.66.

Adobe Analytics tracks sales in real-time for 80 of the top 100 US retailers, covering 55 million SKUs and some 1 trillion transactions during the holiday sales period. Shopify, meanwhile, uses data from across the range of online retailers that use Shopify APIs to run their sales.

Black Friday (the day after Thanksgiving) used to be seen as the traditional start to holiday sales, but consumers spending time at home on Thanksgiving itself are increasingly coming online — on a day when most brick-and-mortar stores are closed — to get the ball rolling.

Thanksgiving is coming a week later this year than in 2018 (when it fell on the 22nd of the month), which will make for a more compressed, and potentially more frenzied, selling period.

As Sarah pointed out yesterday, many retailers this year made an early jump on their Black Friday deals, and so far some $53 billion has been spent in the month of November up to today. This year’s holiday sales overall are predicted to hit nearly $144 billion.

We’ll be updating this post with more figures as they come in.

As a point of comparison, in 2018, online sales hit $3.7 billion, according to Adobe’s analysis.

Adobe notes that in the $53 billion spent so far this month, all 27 days in November have surpassed $1 billion in sales. Eight days passed $2 billion, and yesterday saw $2.9 billion in sales. That was up 22% on a year ago, which either points to increased sales overall, or simply that the strategy of extending “holiday” shopping to start earlier and earlier is paying off for retailers.

Another interesting insight is that some $18.2B in purchases have been made on smartphones this month, which is up 49.5% compared to last year.

“The strong online sales performance to-date suggests that holiday shopping starts much earlier than ever before. Steep discounts on popular items like computers on the day before Thanksgiving indicate that many of the season’s best deals are already up for grabs. This has led to significant growth in online sales (16.1% YoY increase) so far. What will be important for retailers to track is whether the early discounts will drive continued retail growth overall, or if they have induced consumers to spend their holiday budgets earlier,” noted Jason Woosley, vice president of commerce product & platform at Adobe.

SEO Expert In Bangladesh | How to Become an SEO Expert

SEO can be a pain in the ass; specifically for an entrepreneur or a digital marketer.

But there is a huge number of websites full of crappy data, misinformation, or outdated advice about how to become an SEO expert.

Here’s the great news: there are more than enough FREE training materials available, so you too can become an SEO expert.

The key, of course, is to know what’s important. After that you need to learn those things. That doesn’t mean that you need to be a genius in other sectors. You just need to handle the basic site optimization things.

Let’s hop into the main section to become an SEO expert in Bangladesh.

Technical SEO

Technical SEO refers to actions taken at the website or server level to improve search results. These actions help the crawling, indexing, and rendering of websites.

The main motive is to get as many pages indexed and ranking as possible. To be an SEO expert, you need to learn the technical SEO part.

Here are some courses and guides that will help you with your technical SEO skills:

Technical SEO Courses

Technical SEO Guides

Content Marketing

“Great content” on its own is no longer a big deal, but it is important for ensuring organic success.

Content is one of Google’s three main ranking factors, so you need to understand what kind of content helps both users and the Google algorithm. To be an SEO expert you need to learn these things.

These courses and guides will help:

Content Marketing Courses

  • The Strategy of Content Marketing: This Coursera course is a partnership between Copyblogger and UC Davis. The course teaches students how to analyze and measure the effectiveness of a content marketing campaign.

Content Marketing Guides

Schema

Schema markup is a form of microdata. It creates a rich description (a.k.a. a rich snippet), which appears in search results.

Schema provides added context to a page, which can be useful to both users and search engines. To be an SEO expert, you need to know about this.

No one is offering free courses on schema. But there are some guides on the topic:

Schema Guides

  • Understand how structured data works: This explains how to add different types of structured data to a simple HTML page. This includes where to place the microdata and how to use it.

User Intent

Google is working hard to understand the context of search queries. Their goal is to return search results aligned with searcher intent.

If your website doesn’t conform to Google’s understanding of intent for a keyword, there is nothing you can do to reach the top of the SERPs.

For that reason, it’s very important to convert any traffic you drive to your page. To become an SEO expert, you need to capture that too.

The following courses and guides will help you to understand:

User Intent Courses

User Intent Guides

  • The Complete A/B Testing Kit: Kissmetrics and HubSpot have put together a package with everything one needs to start running A/B tests right away.

Website Speed

Page speed can affect both traffic and conversions.

According to Google, “the average time it takes to load a mobile landing page is 22 seconds. 53 percent of mobile site visitors leave a page that takes longer than three seconds to load.”

In short, fast page speed equals a good user experience. Slow page speed equals a poor user experience.

For that reason, Google now uses page speed as a ranking factor. So, wanna be an SEO expert? Learn these things too.

If you need help with speed, check out these courses and guides:

Website Speed Courses

Website Speed Guides

Link Building

Thanks to Google’s Penguin algorithm and its machine learning algorithm (RankBrain), links that are both earned and relevant carry more weight than ever.

To be clear, I’m referring to the types of links that are supported by Google’s webmaster guidelines.

Check out these guides and courses if you want to be an SEO expert:

Link Building Courses

Link Building Guides

Mobile-First/Local SEO

Remember this: mobile search is local search. Fifty percent of all mobile searches have a local intent. To be an SEO expert you need some local SEO skills.

Improve your skills by checking out these courses and guides:

Local SEO Courses

Local SEO Guides

  • Where to Get Citations: Citations are the lifeblood of local search. This Moz guide gives a plan for identifying and building these references.

Guides: A Collection of UX Guides Created by Google for Select Industries

It will never be easier to become an SEO expert than with these courses. SEO is a must today. Take advantage of it. Push yourself with the knowledge to succeed. You won’t regret it.

Oh! I forgot to mention this: I’ve been writing SEO articles for a long time. This website has some cool SEO articles for becoming an SEO expert in Bangladesh. Just read these articles and you’ll understand why they will help you.

Related SEO Resources:

Loved the article? Do comment and share!

The post SEO Expert In Bangladesh | How to Become an SEO Expert appeared first on Muntasir Mahdi.

How to Become an SEO Consultant | SEO Consultant in Bangladesh

Do you want to become an SEO consultant?

Then you need to think more like Google and act like the customer. Because these are the two main factors that decide if you can be an SEO consultant in Bangladesh.

How to Become an SEO Consultant?

There is nothing exclusive about SEO. Anyone can become an SEO expert in Bangladesh.

SEO is a skill that can be learned, all by yourself.

Here is a list of simple and easy steps to follow to become an SEO consultant in 2019:

Step 1: Know How Search Engines Work

First of all, I recommend reading this awesome SEO guide from here. It will help you understand SEO better.

This guide talks about important information, for example:

  • Organizing your website
  • Optimizing your on-page content
  • Measuring your results
  • And more!

SEO, or Search Engine Optimization, relies on many different tools, so you need to understand how they work.

To rank a page in the search engines, you have to know more: there are a lot of algorithms, tools, and data signals that matter.

This is why it is important to understand some basic and complex concepts about how search engines work, as it will help you see the bigger picture more easily.

Step 2: Become an Expert SEO Content Writer

Content is king. It sits at the center of SEO. Not everyone can be a writer, so content writing is not for anyone and everyone.

SEO content writing needs creativity. An SEO content writer must also have technical knowledge of SEO.

Search results appear on the internet because of content, so you have to get familiar with content creation. You also have to know how you can compose LSI (Latent Semantic Indexing) content.

Invest your energy and time to create great quality content. It would help your SEO campaign.

Step 3: Learn to Code

To become an SEO consultant in Bangladesh, you should learn (the concepts) HTML and CSS.

Why is that?

Because these are the coding languages that search engines work with, even though learning them isn’t strictly required.

The site you are working on uses HTML and CSS, and someday it will run into issues.

Then what will you do?

They’ll hamper your SEO results. If you learn HTML and CSS, you can get those problems fixed quickly.

Step 4: Understand Keyword Research

SEO revolves around keywords, and not just any keywords: they have to be specific, and you need to research them.

You should ensure that the keywords are included in the content, meta tags, title, image alt tags, and so on. This will allow search crawlers to crawl and index your content easily.

Any good SEO consultant would tell you the importance of SEO keyword research tools. Some of the most popular SEO tools are:

  • SEMrush
  • Google Keyword Planner
  • Keyword Research Tool from Moz etc.

Step 5: Use Search Engine Marketing (SEM)

SEO is one small part of the giant digital marketing sector. A good SEO consultant needs to know the different parts of digital marketing campaigns.

SEM, or Search Engine Marketing, is one of the most important methods. It includes:

Step 6: Learn from the best SEO Training Resources

The internet is a sea of data, so finding the correct information is difficult. When you search for information online, you’ll find a huge number of sites, but you won’t have enough time to surf them all.

That is why you have to find correct, high-quality content and resources, and why it’s smarter to learn from the best in the business.
There are many good SEO training resources,

Step 7: Keep Practicing & Sharing

Knowledge is not enough if you don’t practice. After gathering information and data, use them on different projects. This will help you gain experience.

Share your insights, data and resources. Start writing a blog. Share your experience. Help other people too.

Now you know the basic roadmap of becoming an SEO consultant in Bangladesh. Or do you want to become an SEO expert? If you do, then read this, SEO Expert in Bangladesh.

Let me ask you a question!

Do you want to know how much do SEO consultants earn?

The total earning of an SEO consultant depends on different things, such as:

  • Experience
  • Success Rate
  • Company’s Budget
  • Customer’s spending limit
  • Types of Websites
  • Delivery time of results & many more.

Here I’m giving you an idea: a skilled SEO consultant’s median salary reaches up to $56,000 annually, and the top 10% of SEOs earn around $81,000, according to PayScale.

Note that SEO is very useful and can be learned by anyone. SEO consulting services include all of the things mentioned above.

So, did you like the article? Please do leave a comment and share it.

The post How to Become an SEO Consultant | SEO Consultant in Bangladesh appeared first on Muntasir Mahdi.

Case Study: Why it Makes Sense to Optimize Your Site for ‘Near Me’ Searches

Posted by colleenharriscdk

May 2015 was full of big change in the search world. First, Google announced that, “More Google searches take place on mobile devices than on computers in 10 countries, including the US and Japan.” Then Google followed that up with the news that, “Search interest in ‘near me’ has doubled since last year with 80% of those searches occurring on mobile devices.” In response to these trends, Google updated the local extensions for AdWords, allowing businesses to bid on keywords that deal with “near me” searches.

These announcements led us to ask a question: Can content that includes “near me” terms help gain impressions and clicks for those queries in organic search?

Our methods to study this question were simple:

  • We tested 82 websites (41 as the control group; 41 as the test group)
  • Within the test group, we updated the hours and directions page title, description, and H1 to utilize the phrases ‘franchise dealer near me’ and ‘nearest franchise dealer.’ These franchises included a wide range of auto manufacturers, with the physical locations dispersed throughout the United States.
  • We then spent five months looking at mobile impressions and click-through rates for both groups


Noteworthy changes after testing

After five months, we started to see a few trends across these websites, including an increase in mobile impressions and clicks for all the “near me” searches. In the test group’s first month, we saw a 27% increase in mobile impressions for “near me” phrases, and the clicks increased from 11 to 40. By comparison, the control group had just a 20% increase in mobile impressions, and click-throughs only increased from 13 to 23.

These trends continued every month we looked at the data. In month three, the test group’s “near me” impressions rose another 15%, compared to the 8% increase of the control group’s impressions. Similarly, the click-through rate for the test group almost doubled that of the control group, with 37 and 19 clicks, respectively.

By the last month, the test group’s websites saw their mobile impressions for “near me” more than double since the start to total 8,833 impressions and 46 clicks.

This is in contrast to the control group, whose “near me” impression share only rose 11% since the start and had just 21 clicks.

There were a few other observations we made in our research:

  • Locations in urban and metro areas saw more impressions and clicks compared to rural locations
  • ‘Near me’ impressions grew from franchise-related searches to include broader phrases, including ‘nearest oil change’

What this tells us

Overall, our results started to give us the answer that, yes, updating your website and content for the appropriate “near me” phrases can have a positive impact on the impressions and clicks for those phrases. This is just the start for a small business website, as mobile search and search intent will only continue to become more important.


Solar Powered Lights For The Yard

Reading the various sofa reviews here and choosing one to match your patio is not that difficult compared to doing the patio’s entire electrical work. When you are lighting your patio, the last thing you want is to be stumbling over electrical lines and trying to place lights close to an outlet to plug them into. If, however, you went with solar lights, you wouldn’t have this issue.

Solar lights take their power from the sun, store it in internal batteries, and then shine brightly all through the dark night. With so many choices, you can have solar lights for security and to decorate your home, all at the same time, all with lower power costs, and all while being environmentally friendly. It’s a win.

If you have steps in the yard, you might want to use lights that will illuminate them when it’s dark. This will eliminate hunting for the steps and reduce accidents. They are pretty accents for your steps as well, and of course they come in a variety of shapes and colors.

The same goes for a pathway. Lighting it at night takes the guesswork out of the equation and provides a beautiful glow besides. Path lights can be low to the ground or mounted on stakes to make a lovely accent along the sides of the path itself. They too come in a variety of shapes and colors to be the ideal addition to your landscaping.

Did you know that the flag code suggests that you light your flag at night? Never fear, there is a solar light just for that purpose, and now the flag will never look better against the dark sky.

You invest a great deal of energy and money in your landscaping, so why not show it off at night too? There is a wide range of solar spotlights available that will automatically shine on your prized landscaping or a beautiful tree at night. You can also work landscaping lights into the landscaping itself for accents at any time, day or night.

Decks look so cheerful and summery when they have lights on them. Now you can get solar string lights to give your deck that wonderful light and cozy feel, and they also come in all the popular shapes and colors.

The post Solar Powered Lights For The Yard appeared first on LatestSolarNews.