[fr] WordPress update. Installation of Jetpack and Robots Meta. As always, give a shout if the site behaves unusually!
Sorry for the neologisms in the title. I’ve upgraded CTTS to WordPress 3.1 (you should do it too if you haven’t yet, lest you fall prey to hackers as I did earlier this year). While I was at it, I also installed Jetpack, the plugin that brings WordPress.com goodness to self-hosted WordPress.org blogs.
I use WordPress.com for almost all my projects, and for all my clients. For most of the people I work with, it’s just not worth the hassle of dealing with upgrades, technical issues, and potential hacks. For CTTS, however, I do depend on plugins like Basic Bilingual which are not (yet?!) part of the WordPress.com offering. Also, I admit the geek in me likes having her own installation and code to tinker with.
Finally, I installed the Robots Meta plugin. You know me, I’m always a bit wary of the fancy SEO stuff (especially as many people who write about it seem completely obsessed with it, rather than obsessing over doing and saying interesting things). I’m really unimpressed with all the panic over duplicate content, for example, especially as it didn’t sound like a huge issue for blogs when I heard Matt Cutts giving us SEO tips in 2007 — I happily cross-post a lot of my writing “elsewhere” back here and I don’t think I’ve suffered unreasonably from it.
Anyway: lately, I’ve read a few analytics/SEO articles that seemed sensible to me and I’m starting to take a tiny (tiny!) bit of interest in the subject.
I’ve been using the Google Sitemap Generator plugin for some time now, and hanging out in my Google Webmaster Central — particularly since my hacking incident.
Also, it was brought to my attention today that there are old articles lying around on CTTS which rank very highly for certain searches even though they really aren’t that relevant anymore. Though I’m loath to remove them altogether, I could very well remove them from search engine listings — and the Robots Meta plugin allows me to do just that.
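Under the hood, what that boils down to is printing a robots meta tag in the head of the pages concerned. Here’s a minimal sketch of the idea — the function name and post IDs below are mine, purely for illustration; the actual plugin manages this per post through its own settings screen:

    // Sketch: noindex a hand-picked list of old posts.
    // The IDs here are hypothetical; the real Robots Meta plugin
    // stores this choice per post via its settings.
    function ctts_noindex_old_posts() {
        if ( is_single( array( 123, 456 ) ) ) {
            // "noindex,follow": drop the page from listings,
            // but let crawlers still follow its links.
            echo '<meta name="robots" content="noindex,follow" />' . "\n";
        }
    }
    add_action( 'wp_head', 'ctts_noindex_old_posts' );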
So, I’ve taken the plunge and am now only allowing search engines to index my home page (of course) and single article pages, blocking them from date, category and tag archives as well as comment feeds.
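For reference, the equivalent of those settings in template terms would be a conditional check in the head of every page — something like this rough sketch using WordPress’s built-in conditional tags (the function name is mine; comment feeds are XML, so a meta tag won’t reach them and the plugin presumably handles those separately, e.g. via headers or robots.txt):

    // Rough equivalent of the settings described above: leave the
    // home page and single article pages indexable, and noindex
    // the date, category and tag archives.
    function ctts_archive_robots_meta() {
        if ( is_date() || is_category() || is_tag() ) {
            echo '<meta name="robots" content="noindex,follow" />' . "\n";
        }
    }
    add_action( 'wp_head', 'ctts_archive_robots_meta' );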
We’ll see what happens — I’m a bit worried I may have gone overboard, and I wonder what the consequences of those settings might be for other crawlers like BackType and IceRocket. If you have any intel on that topic, I’m happy to take it. I feel a bit like I’ve been giving orders to my robot blocker without really understanding all the consequences.
Your website is rather old and has a high TrustRank. On top of that, it’s so full of content that you’re one of the few people who really don’t need to fear duplicate content.
It has always seemed “too much” to me to block categories AND tags AND archives. Date archives in general are the most senseless way of organizing content (which means keeping those pages semantically meaningful can be difficult); categories and tags… well, it depends on the content. On my blogs, tags can have very few articles, so I block them, but categories always have a lot of articles, so I don’t block them.
One useful trick is to use the category “description” and print it at the top of the category archive page.
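WordPress has a built-in template tag for that, category_description() — a quick sketch of what it could look like in a theme’s category.php (the CSS class name is just an example):

    <?php
    // In the theme's category.php: print the category description
    // above the post listing, if one has been set in the admin.
    if ( category_description() ) {
        echo '<div class="archive-description">' . category_description() . '</div>';
    }
    ?>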
Finally, speaking of plugins, I happily replaced Google Sitemap AND AIOSEO with WordPress SEO by Yoast… you should have a look 🙂