Nov 14, 2014

Amazon Web Services (AWS) hosting offers great flexibility in the configuration of a computing platform: each of the resources involved (CPU power, storage, bandwidth, etc.) can be configured independently.

Moreover, the cost of the service is a function of resource usage (pay as you go). A server is charged only for the hours it has been active, and the cost of storage depends on the number of I/O transactions performed.

These characteristics make AWS very different from the service offered by other hosting providers, where the user chooses from a limited set of server configurations at a fixed monthly cost.

But this also means that configuring the service is more complex on AWS. A monitoring service is also required to detect and correct unexpected peaks in resource consumption, to avoid incurring excessive costs.

Nevertheless, AWS is worth considering as an alternative to more traditional VPS platforms when setting up a platform for a web service that may need to scale as usage of the service grows.

This post is an overview of the main considerations and caveats to take into account when setting up a platform on AWS.


Posted at 5:54 pm
Jul 28, 2014

Munin is a full-featured open source server monitoring tool that provides a web frontend for visualizing the evolution of the most relevant performance parameters: CPU, main memory and disk usage, number of processes, etc.

This post goes through the steps involved in the setup and basic configuration of this useful tool.
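As a preview of the kind of configuration involved, here is a minimal sketch of a host definition, assuming a Debian-style package layout (the host name and address are hypothetical):

```
# /etc/munin/munin.conf on the Munin master:
# define one monitored host; graphs appear under this name in the web frontend
[myserver.example.com]
    address 127.0.0.1
    use_node_name yes

# /etc/munin/munin-node.conf on the monitored host:
# allow the master's IP to query this node (regular expression syntax)
allow ^127\.0\.0\.1$
```

With this in place, the master polls the node every few minutes and renders the collected data as graphs.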


Posted at 8:59 am
Jul 17, 2014

Increasing the volume of visits is one of the basic objectives of most websites, and in most cases the greater part of those visits comes from the main internet search engines, such as Google and Bing.

For the pages of our site to appear in the result pages of these search engines, they must first have been added to their indexes.

To index a site, search engines run specialized applications commonly referred to as “bots”, “crawlers” or “spiders”. These bots navigate websites, reading the content of the pages they find.

In principle, being actively crawled by different search engines is a good signal for a site. But the number of requests made by crawlers can become excessive, affecting the performance of the server and making it appear slow or unresponsive to users. In this post we review different ways to limit the crawl rate of the main internet search engines.
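One such mechanism is the `Crawl-delay` directive in `robots.txt`, which is honored by Bing (Google ignores it; Google's crawl rate is adjusted from Search Console instead). A minimal sketch, written to `/tmp` here for illustration — on a real server the file goes in the web root:

```shell
# Hypothetical robots.txt asking bingbot to wait 10 seconds between requests,
# while leaving all other crawlers unrestricted.
cat > /tmp/robots.txt <<'EOF'
User-agent: bingbot
Crawl-delay: 10

User-agent: *
Disallow:
EOF
cat /tmp/robots.txt
```

Note that `Crawl-delay` is advisory: each crawler decides whether to respect it.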

Posted at 6:55 am
May 14, 2014

If a system administrator suspects that one of the filesystems may be damaged, it should be analyzed and, if necessary, repaired. On a Linux system, the fsck (file system check) command is used to perform this analysis and repair.

But the filesystem to be repaired must be unmounted before fsck is run on it, and this is not possible when it is the root filesystem itself.
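The basic usage of fsck can be tried out safely on a filesystem image inside a regular file, which requires neither root privileges nor unmounting anything (file names here are arbitrary):

```shell
# Create a small ext4 filesystem inside a regular file (no root needed)
dd if=/dev/zero of=/tmp/demo.img bs=1M count=4 2>/dev/null
mkfs.ext4 -q -F /tmp/demo.img

# -n: check only, answering "no" to any repair prompt
fsck -n /tmp/demo.img && echo "filesystem is clean"
```

For the root filesystem, the usual workaround on many distributions is to schedule the check at the next boot instead, for example by creating an empty file named `/forcefsck`, or by passing `fsck.mode=force` on the kernel command line on systemd-based systems.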


Posted at 8:20 pm