Below you will find pages that utilize the taxonomy term “SEO”
I wanted to share this post with readers who are interested in increasing their ranking keywords with very little effort. I was able to nearly double my ranking keywords in as little as two weeks with a few optimization tweaks.
How it all started: it all started with some Facebook messages to my niece and nephew over in Berlin. COVID-19 had them pretty much shut in for a while, so they started working on and optimizing their travel blog.
I’m working on syncing up my post comments on Disqus again. Since I moved to Hugo, I had to update my post slugs to generate into different channels; the majority of posts now live under /blog/, /tutorials/, and /reviews/. There are a lot of great comments on many of my popular posts, but because Disqus has a weird way of mapping URLs, I turned Disqus off until I worked out a solution for syncing them from /my-old-slug to /blog/my-new-slug/.
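For reference, Disqus’s URL Mapper tool accepts a CSV of old-URL, new-URL pairs. A minimal sketch of what such a file might look like, assuming a hypothetical domain (the actual slugs would come from your own posts):

```csv
# disqus-url-map.csv — one "old URL, new URL" pair per line
https://example.com/my-old-slug, https://example.com/blog/my-new-slug/
https://example.com/another-old-slug, https://example.com/tutorials/another-old-slug/
```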
On the heels of my SEO Optimization for Hugo post, I spent quite a bit of time learning how to use Hugo’s built-in image processing to make responsive images. Responsive images are pretty powerful for website performance: people view content at different browser sizes, so serving the right image size at load time matters from both an SEO and a website performance standpoint.
A lot of CMSs (like WordPress) do this automatically when you upload an image, but Hugo doesn’t do it out of the box.
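A minimal sketch of a Hugo partial that does this with the built-in image processing, assuming the image lives in the page bundle under a hypothetical name `cover.jpg`:

```go-html-template
{{/* layouts/partials/responsive-image.html — resizes a page-bundle image into a srcset */}}
{{ $img := .Page.Resources.GetMatch "cover.jpg" }}
{{ if $img }}
  {{ $small := $img.Resize "480x" }}
  {{ $medium := $img.Resize "768x" }}
  {{ $large := $img.Resize "1200x" }}
  <img src="{{ $small.RelPermalink }}"
       srcset="{{ $small.RelPermalink }} 480w,
               {{ $medium.RelPermalink }} 768w,
               {{ $large.RelPermalink }} 1200w"
       sizes="(max-width: 768px) 100vw, 768px"
       alt="{{ .Page.Title }}">
{{ end }}
```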
I made the switch over to Hugo a short while ago and love the fast builds. You can read all about me ‘crushing’ on it in my What I Learned using Hugo post. There’s one main thing I want to highlight in this post, and it’s how to optimize Hugo for SEO. Hugo, just like any other CMS, is pretty good out of the box, but you still need to optimize it. Here are my Hugo SEO tips for optimization.
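One of the quicker wins here is that Hugo ships with internal templates for Open Graph, Twitter Cards, and Schema.org markup; a sketch of wiring them into a head partial (file path is an assumption, the template names are Hugo’s own):

```go-html-template
{{/* layouts/partials/head.html — pull in Hugo's built-in SEO templates */}}
{{ template "_internal/opengraph.html" . }}
{{ template "_internal/twitter_cards.html" . }}
{{ template "_internal/schema.html" . }}
```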
It’s no secret that I’ve switched CMSs so many times that it makes my head, and my readers’ heads, spin. Take a look at my CMS tag and you’ll see my struggles with WordPress, Expression Engine, Text Pattern, and Blot.
I made a big migration to Jekyll and have loved static CMSs ever since. Fast forward to today and I’ve pretty much settled on Pelican. Why? It’s a static CMS and it’s Python-powered.
Bless me Father, for I have sinned. Please forgive me for my content creation and management sins. Yes, I admit I have mismanaged good content for the sake of expediency. Often I have ignored how my readers consume my evergreen content. For that I beg for your absolution. I promise, on my CMS’s grave, not to stray from the path of good content creation and management again. Amen.
Today marks almost 10 years since I started this blog, and it has gone through many evolutions.
Recently I imported some custom reports into Google Analytics that I found online. They have been eye-opening indeed! My favorites are the Profit Index and Time of Day custom reports.
Google Analytics assigns a Page Value to every page you have, provided you use Goals. Without Goals, this won’t work! In my previous post, I wrote about how I started using Goals to see how readers interacted with my site.
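Roughly speaking, Google Analytics computes Page Value as ecommerce revenue plus total Goal value, divided by the page’s unique pageviews. A quick sketch in Python (the function name and numbers are mine, for illustration):

```python
def page_value(goal_value, ecommerce_revenue, unique_pageviews):
    """Approximate Google Analytics Page Value:
    (ecommerce revenue + total goal value) / unique pageviews."""
    if unique_pageviews == 0:
        return 0.0
    return (ecommerce_revenue + goal_value) / unique_pageviews

# e.g. $50 of goal value, no ecommerce revenue, 100 unique pageviews -> $0.50
```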
I’ve been looking at ways to hack the SEO and meta data with Blot.Im. I wanted to hack the meta data in all my posts so I can use Twitter Cards and Open Graph more effectively. Why? I get a two-for-one benefit right off the bat: better SEO, and removal of duplicate meta data that confuses Google’s crawlers.
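For context, the Twitter Card and Open Graph tags in question are just meta tags in the page head; a sketch with hypothetical values (the real ones would be filled per post):

```html
<!-- hypothetical head fragment; values come from each post's metadata -->
<meta property="og:title" content="My Post Title">
<meta property="og:type" content="article">
<meta property="og:url" content="https://example.com/blog/my-post/">
<meta property="og:image" content="https://example.com/images/my-post.jpg">
<meta name="twitter:card" content="summary_large_image">
<meta name="twitter:title" content="My Post Title">
```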
My changes are already having a huge effect.
This is my first stab at trying to figure out how to optimize the SEO here. A doodle/sketch just seems the fastest in this case.
Update: I have since given a lot of thought to SEO on this blog as I migrated it from one CMS to another. Each CMS handles SEO in a completely different manner, and over time the migrations have had, in some way, a damaging effect here.
So I finally got around to downloading some keyword data from Google Analytics for the period 2/17/11 through 3/17/11, just to see what's driving my site traffic. I did a simple text mining process in Rapidminer to build my keyword frequency list (it took only a few minutes) and generated keyword similarities. Of course I know the biggest draw to my site is my Rapidminer tutorials, BUT what I'm looking for are subtler patterns in the keywords relative to bounce rates and site visits.
I decided to track this blog's web income in January and posted my first income report this year as a starting metric. Below is February's income report, which shows a pattern of growth, mostly due to my new Rapidminer 5.0 Video Tutorials. All I'm doing now is collecting data to feed into a Rapidminer model later, while using typical white-hat SEO tricks.
Google Adsense: $19.88 ( +78%)
In January of this year, I decided to ramp up my posting frequency again for Neural Market Trends. Part of the reason is to data mine blog traffic data for the express purpose of learning more about SEO and Internet Marketing. I started doing this after I had a few beers with the Market Doctor and we discussed the long-tail statistics of Internet Marketing and how some blogs just make money hand over fist.
In an effort to help search engines index my site better, I installed a robots.txt file. The reason why will become clear in a minute. When you run a blog, search engines will parse your site and index a lot of duplicate data and posts. Duplicates can have a negative effect, because the spider might think that your blog is a spam blog (splog).
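A minimal sketch of such a robots.txt; the disallowed paths here are hypothetical examples of the duplicate-content sections a blog typically generates (tag and archive pages), not my actual configuration:

```txt
# robots.txt — keep spiders out of duplicate-content sections
User-agent: *
Disallow: /tag/
Disallow: /archives/
Sitemap: https://example.com/sitemap.xml
```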
Yes, you read correctly. You can data mine your blog’s traffic using a simple web statistics collector like Google Analytics. I’m doing it right now for Neural Market Trends and I’m finding out some very interesting information. I found out that:
Tuesdays are my busiest days.
The optimal number of posts per day is two.
My most popular post categories are Forex and YALE.