[https://en.wikipedia.org/wiki/Search_engine_optimization SEO] stands for search engine optimization. This page is dedicated to helping you improve your pages' ranking on search engines.
 
=== Domains ===
When selecting a domain name, choose a short one and omit filler words like "the". The longer the URL, the harder it is to remember, and this affects page rank.
 
http://www.yes.com is excellent
http://www.nooooooooooooooo.com is bad....
 
Domain registration duration also impacts SEO: if you intend to keep the site for a long time, register the domain for several years rather than a short period of time, as this will improve your ranking.
 
=== Canonical URLs ===
Your site's URLs should resolve to one hostname or the other, using URL rewriting and permanent 301 redirects, so that search engines do not see duplicate content. You are not penalized for URL length by adding a www. subdomain. You can compare the headers returned by each hostname:
 
{{console|body=###i## curl -I http://funtoo.org/Welcome}}
 
{{console|body=###i## curl -I http://www.funtoo.org/Welcome}}
 
Even with URL rewrites in place, the content is usually still served at the original URL as well. Example:
 
http://www.funtoo.org/index.php?title=Welcome
 
serves the same content as
 
http://www.funtoo.org/Welcome
 
We want a canonical URL declaration (an HTML link element with {{c|rel}} set to {{c|canonical}} in the page head) to tell search engines that http://www.funtoo.org/Welcome is the canonical URL, the one to index, so that all duplicates are ignored rather than penalized.
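
One quick way to check whether a page declares a canonical URL is to fetch it and count occurrences of the canonical link element. This is just a sketch using {{c|curl}} and {{c|grep}}; the {{c|/tmp/page.html}} path is arbitrary:

{{console|body=
###i## curl -s http://www.funtoo.org/Welcome -o /tmp/page.html
###i## grep -ci 'rel="canonical"' /tmp/page.html
}}

A count of 0 means the page is not declaring a canonical URL.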
 
=== php? ===
Question marks are bad: many search engines handle query strings poorly when indexing, so use URL rewrites to remove {{c|index.php?}} and leave just site.com/Main_Page.
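
To see how the server currently handles the query-string form of a URL, you can ask {{c|curl}} to report the response code and any redirect target (the exact URL here is just an example):

{{console|body=
###i## curl -s -o /dev/null -w '%{http_code} %{redirect_url}\n' "http://www.funtoo.org/index.php?title=Welcome"
}}

A {{c|301}} plus a clean URL in the second field means the rewrite and redirect are working; a plain {{c|200}} means the query-string URL is still being served directly.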
 
=== www ===
 
domain.org should have URL rewrites, and be rewritten to www.domain.org; this makes it easier to use wildcard SSL certificates and to serve cookie-free domains.
 
=== 301 ===
 
Some users will link to your web site as domain.org, others as www.domain.org. Using a 301 redirect on one or the other will have search engines treat both as one single domain; for example, 301 redirect domain.org to www.domain.org.
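
A simple way to confirm that both hostnames end up in the same place is to follow redirects and print the final URL; this sketch assumes {{c|curl}} is available:

{{console|body=
###i## curl -sL -o /dev/null -w '%{url_effective}\n' http://funtoo.org/
###i## curl -sL -o /dev/null -w '%{url_effective}\n' http://www.funtoo.org/
}}

Both commands should print the same (canonical) hostname.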
 
==== URL Spaces ====
Prefer {{c|-}} over {{c|_}} to represent spaces in URLs. MediaWiki's design is flawed here, preferring {{c|_}} over {{c|-}}. This is an older SEO problem, yet it can still affect your page ranking. [[web-server-stack]] is an example of an SEO-friendly URL with spaces.
 
=== Server Speed ===
How fast your pages load significantly impacts SEO. Installing a caching reverse proxy, and testing that it hits the cache close to 100% of the time, is a good idea. It prevents your server from rebuilding pages on every request, reduces load on the processor, and speeds up page delivery, moving the bottleneck from PHP page generation to the internet connection speed. See {{Package|www-servers/varnish}} and [[Web-server-stack#Benchmarking]].
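
As a rough check, {{c|curl}} can report how long a full page load takes, and the response headers will usually show whether a cache in front of the server answered. The {{c|Age}} and {{c|X-Varnish}} headers below are what Varnish typically adds; other caches use different headers:

{{console|body=
###i## curl -s -o /dev/null -w 'total: %{time_total}s\n' http://www.funtoo.org/Welcome
###i## curl -sI http://www.funtoo.org/Welcome -o /tmp/headers.txt
###i## grep -i -e '^age:' -e '^x-varnish' /tmp/headers.txt
}}

A non-zero {{c|Age}} value on repeated requests is a good sign that the cache is being hit.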
 
=== Cookie Free Domains ===
Cookies are sent with every request to the domain that set them, adding overhead to requests for static content. In most cases, static content can be hosted on an alternate, cookie-free subdomain. Example: static.google.com
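
To check whether a given host is setting cookies, count the {{c|Set-Cookie}} headers it returns (the hostname below is just an example):

{{console|body=
###i## curl -sI http://www.funtoo.org/Welcome -o /tmp/headers.txt
###i## grep -ci '^set-cookie' /tmp/headers.txt
}}

A count of 0 means no cookies were set on that response, so the host is a candidate for serving cookie-free static content.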
 
=== CDN ===
Using a CDN for static content will speed things up and distribute load to nodes that are geographically close to each client. Since a CDN is external to the website's domain, you also don't have to worry about cookies forcing a cache to reload the content.
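
If you are not sure whether a host is already fronted by a CDN, a DNS lookup will often reveal a CNAME pointing at the CDN provider; {{c|static.example.org}} below is a hypothetical hostname:

{{console|body=
###i## host static.example.org
}}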
 
=== CSS & JS Minification ===
Minified CSS and JavaScript is simply your file with all unnecessary whitespace, line breaks, and comments removed, so it is smaller and downloads faster.
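
A crude way to spot whether a stylesheet you serve is already minified is to compare its line count to its size; a large file with only a handful of lines is almost certainly minified. The stylesheet URL below is hypothetical:

{{console|body=
###i## curl -s http://www.funtoo.org/style.css -o /tmp/style.css
###i## wc -lc /tmp/style.css
}}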
 
=== CSS & JS Positioning ===
CSS should be included at the top of your web page's source, in the document head, while JavaScript should be loaded last, at the bottom of the source, so that it does not block the page from rendering.
 
=== Avoiding Inline Styles ===
CSS can be applied in several ways, but inline styles should be avoided. Prefer an external CSS file, preferably distributed by a CDN.
 
=== Meta Description/Keywords/Tags ===
Meta information is parsed directly by search engines. With a meta description in place, it will be displayed in search results rather than the initial text of your canonical landing page.
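
To see whether a page currently carries description or keyword meta tags, fetch it and search the HTML (the URL and paths here are just examples):

{{console|body=
###i## curl -s http://www.funtoo.org/Welcome -o /tmp/page.html
###i## grep -i -e 'name="description"' -e 'name="keywords"' /tmp/page.html
}}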
 
==== Dublin Core ====
Dublin Core is a metadata standard that exposes details such as authors and publication dates for search engines to discover. Many sites generate this meta information dynamically.
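
Dublin Core tags are conventionally named with a {{c|DC.}} prefix, so you can check for them in the page saved in the previous example:

{{console|body=
###i## grep -i 'name="DC\.' /tmp/page.html
}}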
 
=== Links ===
Posting links to your site around the web is fine so long as it is organic and not spammy. AddThis share widgets improve SEO because they produce many backlinks without a bot dumping them anywhere and everywhere. AddThis produces Twitter/Facebook/Google+/Reddit etc. share links, and also offers an analytics service.
 
Linking to high-quality, trusted sites helps, and having high-quality, trusted sites link back helps.
Linking to low-quality, untrusted sites hurts, and having low-quality, untrusted sites link back hurts.
 
=== SiteMaps & robots.txt ===
A sitemap is an XML page that tells crawlers about your site's content pages, and robots.txt tells crawlers which pages they are not allowed to index. In a future revision I'll look up how to make a robots.txt that allows everything, as if it's not even there, and an external sitemap generating service.
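
For reference, a robots.txt that allows everything consists of a wildcard user-agent and an empty {{c|Disallow}} line; you could create one in your document root like this (the destination path depends on your web server configuration):

{{console|body=
###i## printf 'User-agent: *\nDisallow:\n' > robots.txt
}}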
 
=== Analytics ===
Analytics tell you about your website's users. Some analytics services enhance SEO, while others are purely informative.
http://www.google.com/analytics/
https://www.quantcast.com
 
Piwik is a self-hosted, server-side analytics package.
 
=== Webmaster Tools ===
Sign up for the webmaster services provided by search engines. Bing and Yahoo webmaster tools are bundled together now.
 
*https://www.google.com/webmasters/
*https://webmaster.yandex.com/
*http://www.bing.com/toolbox/webmaster
 
=== Establishing Trust ===
Encourage users to tell the internet your website is trusted.
*https://www.mywot.com
 
=== Testing and Evaluating ===
==== Free Analysis ====
* http://www.webrankpage.com/
* http://www.seomastering.com/ (shows estimated page value in USD)
* http://seositecheckup.com/ (throttled to 1 check every 30 minutes)
* http://www.site-seo-analysis.com/
* http://www.seoworkers.com/tools/analyzer.html
 
==== Free with Required Registration ====
* http://www.site-analyzer.com/
*http://www.wpromote.com/seo/seo-audit-tool (shows which pages are ranked)
 
==== Pay with Free Trial ====
Free testing of one site per week:
* http://www.woorank.com/
 
==== Unsorted Goodies ====
*http://www.webpagetest.org/
*http://www.seocentro.com/tools/seo/seo-analyzer.html
*http://www.seoptimer.com/
*https://www.found.co.uk/seo-tool/
*https://zadroweb.com/seo-auditor/
*https://marketing.grader.com/
*http://www.alexa.com/
 
 
=== Malware & Bad Link Scanning Services ===
*http://www.quttera.com/home
*https://app.webinspector.com/
*http://scanurl.net/
*http://safeweb.norton.com/
*http://www.avgthreatlabs.com/website-safety-reports/
*http://www.penguinscan.com/
