Regarding your comment about SEO:

The obvious shortcoming of this application is: you are not able to see the templates (HTML) with [CTRL] + [U], which is bad for SEO.

I did a search on the web for "angularJS SEO" and found quite a few results. The majority of posts on the topic suggest considering the use of pre-rendering services:

Search engines still need to see the content and elements of the page in the source code to guarantee that it will be indexed correctly. One current solution is to consider using a pre-rendering platform, such as Prerender.io. This is middleware that will crawl your web page, execute all JavaScript files, and host a cached version of your Angular pages on their content delivery network (CDN). When a request is received from a search engine bot, it will show them the cached version, while the non-bot visitor will be shown the Angular page.

This may sound like cloaking, but Google has confirmed that it is not. It has also been stated by Google that as long as your intent is to improve user experience, and the content that’s available to the visitors is the same as what’s presented to Googlebot, you will not be penalized.[1]
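
To make the mechanics concrete, here is a minimal sketch of how such middleware is typically wired into an Express server using the prerender-node package. The token placeholder is hypothetical; you would substitute the one issued by the service:

```js
const express = require('express');
const prerender = require('prerender-node');

const app = express();

// The middleware inspects the User-Agent (and the legacy _escaped_fragment_
// parameter). Requests from known crawlers are proxied to the prerender
// service, which returns fully rendered, cached HTML; all other requests
// fall through to the normal SPA below.
app.use(prerender.set('prerenderToken', 'YOUR_PRERENDER_TOKEN'));

// Regular visitors get the AngularJS single-page app as usual.
app.use(express.static('public'));

app.listen(3000, () => console.log('Listening on http://localhost:3000'));
```

The key point is that the bot-versus-visitor decision happens per request on the server, so the SPA itself needs no changes.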

Another post suggests the following:

There are three ways to do this.

The first one is to use a pre-rendering platform such as Prerender.io. Such a service would create cached versions of the content and render it in a form that Googlebot can crawl and index.

Unfortunately, it could prove a short-lived solution. Google can deprecate it easily, leaving the site without an indexable solution again.

The second solution is to modify SPA elements so they render as a hybrid between client-side and server-side. Developers refer to this method as Initial Static Rendering.

In this method, you can leave certain elements critical for SEO – title, meta description, headers, some body copy and so on – on the server side. As a result, those elements appear in the source code and Google can crawl them.

Unfortunately, once again, the solution might prove insufficient at times.

There is a viable option, however. It involves using the Angular Universal extension to create static versions of pages to be rendered server-side. Needless to say, these are fully indexable by Google, and there’s little chance that the technology will be deprecated any time soon.[2]
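
As a rough illustration of the "Initial Static Rendering" hybrid described above, the server can emit the SEO-critical head elements itself while leaving the rest of the page to AngularJS. This is only a sketch, assuming an Express server; the pageMeta table, the shopApp module, and the script paths are hypothetical:

```js
const express = require('express');
const app = express();

// Hypothetical per-route metadata; in a real app this might come from a
// database or a CMS.
const pageMeta = {
  '/': { title: 'Home', description: 'Welcome to the shop.' },
  '/products': { title: 'Products', description: 'Browse our catalogue.' },
};

app.get(['/', '/products'], (req, res) => {
  const meta = pageMeta[req.path];
  // Title and meta description are emitted server-side, so they appear in
  // the raw source that crawlers (and Ctrl+U) see. The page body is still
  // rendered client-side by AngularJS.
  res.send(`<!DOCTYPE html>
<html ng-app="shopApp">
<head>
  <title>${meta.title}</title>
  <meta name="description" content="${meta.description}">
</head>
<body>
  <div ng-view></div>
  <script src="/js/angular.min.js"></script>
  <script src="/js/app.js"></script>
</body>
</html>`);
});

app.listen(3000);
```

With this split, viewing the source now shows a meaningful title and description even though the page content is still assembled client-side, which addresses the [CTRL] + [U] complaint directly.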

So consider options like those.
