A Brilliant App Optimization/Monitoring Tool – New Relic!

Almost 24 hours ago, one of my friends referred me to an interesting offer from Tuts+:

http://dev.tutsplus.com/articles/get-a-free-year-of-tuts-premium-by-trying-new-relic--cms-12

It seems Tuts+ is either affiliated with or owns a new app optimization tool named "New Relic". My primary objective was, of course, to get the free Tuts+ Premium for a year and the nerd T-shirt, and what's hard about deploying a PHP app monitoring tool on one of the servers! So I started.

Deploying the tool is fairly easy. I am not really into the mobile app thing, so I chose the PHP web app monitoring tool. The deployment is well documented; it's an RPM-based installer for RHEL-based releases, pretty clean and simple. Once the installation was done, it added a shared object to my PHP interpreter and started grabbing data. To my surprise, I started seeing details that are really cool. Features like "Errors" and "Stack Trace" are the finest parts of this tool. The stack trace gives you reports like "strace", my favorite Linux debugging tool, but the advantage in New Relic is that it saves the data and posts it to your New Relic dashboard. Now, isn't that brilliant? I have sorted out 23 major bugs in clients' accounts since I installed the monitor. The database monitoring also includes some exceptional features that are not usually available in the app monitoring/optimization tools I had used before.
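For reference, here is a rough sketch of what the deployment looks like on a RHEL-style box. This follows the yum-based procedure New Relic documented for EL systems at the time; the repo RPM URL and package name below come from that documentation, so verify them against the current docs before running anything:

# Add New Relic's yum repository, then install the PHP agent package
rpm -Uvh https://yum.newrelic.com/pub/newrelic/el5/x86_64/newrelic-repo-5-3.noarch.rpm
yum install newrelic-php5

# The interactive installer drops newrelic.so into PHP's extension
# directory and asks for your license key
newrelic-install install

# Restart Apache so PHP picks up the new shared object
service httpd restart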

Unfortunately, the tool is only free for two weeks. After that, the "Pro" version costs $150 a month per host. The price is certainly high, but the result is truly amazing given the features and performance of the tool.

At the end of it all, I had my Tuts+ Premium for one year free of charge and a nerd T-shirt on its way to my home 😀

If you haven't tried it, you can try it now. If you are an Android developer, you can add the agent to your app, monitor your app for 14 days for free, and get Tuts+ Premium free for a year.

Just for the record, I am affiliated with neither Tuts+ nor New Relic, and the link above does not contain any affiliate URL.

Happy troubleshooting!

How to track all outgoing mails in Exim

If you are a mail server administrator, possibly running one of the most widely used open source mail servers, Exim, you might need to monitor outgoing mail to track down a spammer. On shared web servers, you can run some regular expressions over the mail logs to trace a spammer. But sometimes you might fail to find the spammer if the server has a huge number of users and several of them are actually spamming. In most cases, users' accounts are compromised and intruders use them to send out spam.
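For example, one classic one-liner on cPanel servers counts which home directories outgoing mail is injected from, since Exim's main log records the working directory (cwd) of the script that called sendmail. This assumes the default log location, /var/log/exim_mainlog:

grep cwd= /var/log/exim_mainlog | grep -v /var/spool | awk -F"cwd=" '{print $2}' | awk '{print $1}' | sort | uniq -c | sort -n

A public_html path with a suspiciously high count at the bottom of that list is usually the compromised script.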

Sometimes a better way is to store a copy of each mail sent through Exim and run your regular expressions over the header details to track down the original spammer. Just for the record, storing email data may breach user privacy on a public server, and this should only be used to track down the original spammer.

Exim supports three levels of mail filtering. I have used a system filter to deliver a copy of each sent mail to a local mailbox; a system filter applies to all accounts and users under Exim. On a cPanel server, you can set the Exim filter from WHM >> Service Configuration >> Exim Configuration Manager >> Filter.

From the command line, open /etc/exim.conf and find the line that starts with "system_filter".
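You can locate it quickly like this:

grep -n '^system_filter' /etc/exim.conf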

By default, cPanel uses a system filter located at "/etc/cpanel_exim_system_filter". Just for the record, this file gets reverted to the default on every cPanel update, so we need to make a customized copy for our own use. I did the following:

cp /etc/cpanel_exim_system_filter /etc/exim_system_filter_mellowhost

This makes a copy of the original system filter as exim_system_filter_mellowhost. Now open the copy with your favorite text editor; mine is always nano.

Now you need to add a small rule to this custom filter using Exim's filtering commands, which are documented here:

http://www.exim.org/exim-html-3.30/doc/html/filter_29.html

Here is the short filter rule I have used:

if first_delivery
   and ("$h_from:" does not contain "tracker@localdelivery.com")
   and not ("$h_X-Spam-Checker-Version:" begins "SpamAssassin")
then
   unseen deliver "tracker@localdelivery.com"
endif

Just for the record, "localdelivery.com" is an account I have created on the same server; I neither own nor operate the domain. I have used it to create a local inbox that collects the mail for me. You just need to make sure the domain is listed in /etc/localdomains. That file tells Exim which domains are local, so it won't go for a DNS resolution check for any domain listed there, which serves our purpose. You also need to create an email account under localdelivery.com; in my case, I created an individual inbox, "tracker@localdelivery.com".
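A quick sketch of that check from the shell (assuming the stock cPanel layout; you can skip it if you created the domain through cPanel, which maintains this file for you):

# Ensure Exim treats the tracking domain as local
grep -qx 'localdelivery.com' /etc/localdomains || echo 'localdelivery.com' >> /etc/localdomains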

Now, here is the breakdown of the filter rule. "first_delivery" means this is the first delivery attempt of the message; it hasn't been queued or relayed before. "$h_from:" is the Exim variable that expands to the From address in the mail header. So the first line checks that the mail has just been dispatched by a mail user, and the second line uses "does not contain" (an Exim filter condition) to check that the From address is not our local delivery address. If that line isn't included, the filter will fall into an infinite loop, forwarding your own tracking mails to yourself.

The third condition applies if you have SpamAssassin installed to scan your mails for spam. SpamAssassin is a separate daemon that checks each mail on its first delivery, adds its spam score to the header, and hands the mail back, which makes Exim treat it as another "first delivery". So if a SpamAssassin header is already present, we safely skip the message, since we already received a copy of it in our local inbox.

The action of all these clauses is very simple: "unseen deliver" (an Exim filter command meaning this delivery is in addition to, not instead of, the normal one, so the original recipient still gets the mail) drops a copy into our localdelivery inbox.
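With the filter written, the last step is to point Exim at it and restart. On cPanel the safe route is the Filter option of the Exim Configuration Manager mentioned earlier, since cPanel can rewrite direct edits to /etc/exim.conf; the raw command-line equivalent looks roughly like this:

# Point the system filter at our custom copy and reload Exim
sed -i 's|^system_filter = .*|system_filter = /etc/exim_system_filter_mellowhost|' /etc/exim.conf
service exim restart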

How can you track down the spammer from an aggregated inbox?

That depends on how you would like to use regular expressions and tools like grep, awk, and cut. Let me give you some basic insight.

First of all, all these mails are actually stored as text files under the local mail directory. In my case, that is "/home/localdel/mail/localdelivery.com/tracker/".

Now move your shell prompt to the "cur" folder (the current mails in the maildir). If you list the files, you should see that each mail is stored as an individual text file.

In my case, I usually sort the subjects first and check whether a spammer is out there. You can do that with the following:

grep -i "Subject: " *

This lists every subject along with the name of the file it came from.

One of my favorite ways to track down a spammer is to check for duplicate subjects. You can do that as follows:

cat * | grep "Subject: " | cut -d":" -f2 | sort | uniq -c | sort -n

cut is a tool that splits each line on a delimiter and prints the field you want. In my case, I am splitting the Subject lines on ":" and printing the second field, which is the subject itself. Then we sort the results alphabetically with "sort", count the duplicates with "uniq -c", and sort them again from low to high with "sort -n". A subject repeated hundreds of times at the bottom of that list is a prime spam suspect.
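The same pattern works for any header. For instance, this variant counts repeated From addresses, which usually points straight at the compromised account:

grep -h "^From: " * | sort | uniq -c | sort -rn | head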

That was just the basics of parsing mail to trace out a spammer. The more you work with spam mails, the more you will understand; parsing skill comes with experience.

Happy Troubleshooting!

Why are we using Softlayer Nameservers?

I was reviewing the live chat transcripts earlier today. An interesting one, served by "Ronskit", a live chat operator at Mellowhost, caught my attention. One of our visitors wanted to know why we are using Softlayer nameservers for the domain "mellowhost.com" (http://intodns.com/mellowhost.com) instead of ns1.mellowhost.com and so on. The visitor was more interested in proving that Mellowhost is hosted on a shared server and that all of our clients are also using a server that is not really managed by Mellowhost. His argument drifted towards why we don't sell VPS or Master Resellers, or so-called "Alpha" Master Resellers, and instead only sell Reseller and Shared Hosting. It is hard to answer a management-level query through a sales representative and, as expected, he wasn't able to please the visitor 🙂 I quickly thought to write this down for future reference.


Continue reading “Why are we using Softlayer Nameservers?”

Experience with Varnish!

When Mellowhost first launched her servers, all of them were using the "DSO" module to serve PHP. I can remember that one of the most commonly used caching extensions was either eAccelerator or XCache. eAccelerator was preferable, as cPanel's EasyApache offers it as an option and compiles it automatically while rebuilding Apache. As time passed, we had to choose suPHP instead of DSO due to many factors; I hope to write down later why we had to move. That move, however, cut off the option of using a dynamic cacher on the server: suPHP kills the PHP process after serving each request, which renders opcode cachers worthless. After dropping the cacher, the server started showing noticeably higher load averages, although it was a fair trade of I/O and CPU usage for security. So I went searching for a cacher that would work with suPHP in the same way Litespeed (a paid web server) does.

Continue reading “Experience with Varnish!”

How much data does Mellowhost have in their Backup?

It should be pretty well known, if you are a Mellowhost customer, that we back up our servers on a daily basis. We are currently using R1Soft CDP for each of our servers. All the backup servers are offsite, which means they are not hosted on the same server you are using with Mellowhost, and not even on the Softlayer network. Continue reading “How much data does Mellowhost have in their Backup?”

2Checkout

As we do not have an offline credit card processing option, we had been planning to add 2Checkout for a long time. We had seen many requests from African and Middle Eastern countries asking to use 2Checkout. We have finally made 2Checkout available for all types of payments for Mellowhost services. You can now select 2Checkout from the payment options for new orders or for paying recurring bills. 2Checkout can be used to pay directly with a credit card.

Happy Hosting!

wp-supercache plugin for MH servers

I have written before about using a cache plugin with all WordPress blogs in order to reduce CPU usage. However, some of our clients were reporting issues with the most popular plugin, "wp-supercache", on a couple of our servers. We use some custom security protection which may block a couple of wp-supercache commands. I have therefore uploaded a working build of the latest wp-supercache, 0.9.9.9, that works perfectly with our servers. You can download it here:

http://mellowhost.com/downloads/wp-supercache.tar.gz

Wp-supercache is the property of its original author. More details about the plugin are available here:
http://wordpress.org/extend/plugins/wp-super-cache/

48 restless hours!

RAID is not a backup solution; that has been proved once again! I had been planning to write up my experience of the 48 hours from July 22 7:17 to July 24 7:23 GMT -5, but couldn't really find the time. All users on the Hemonto server should be aware of the recent issue we faced with our RAID. This post is just to explain how we handled the situation.

Continue reading “48 restless hours!”

Some good budget servers!

We use Softlayer and Liquidweb for all of our production servers. Neither of them is really a budget server provider. Softlayer does sell some budget servers, but they are not at all suitable as production servers for web hosting services due to their inability to be upgraded later (like the Xpress servers). Moreover, the price isn't really competitive with what other budget providers charge for the same hardware. We have been using budget servers for our backup servers, which usually have to hold many terabytes of data.

Continue reading “Some good budget servers!”

Form Spam

I have been monitoring this for a long time, and it is really becoming a headache now. It continuously consumes a lot of CPU and MySQL resources for no reason. Form spam, such as WordPress comment spam, directory registration/submission spam, and forum spam, consumes around 33% of a day's total CPU usage on one of Mellowhost's older servers, according to a calculation I made a couple of minutes ago. That consumption is huge, and it grows as the server grows.

Partly due to the rise of auto script installers like Softaculous and Fantastico, users tend to try each script and then leave it unattended. This leaves form exploits open to botnet attackers. A WordPress blog without Akismet is an easy target for form spam, and most phpBB forums contain no protection at all in the initial installation. These let auto bot spammers post their links in unattended forums and blogs to gain backlinks.

This is not only harmful to the server in real time; it also threatens the reputation of the shared IP. I have been working on developing server-wide protection to stop these spammers, but every attempt so far seems inadequate.

In many cases it is hard to control or check manually, as resellers add users and the users add many addon domains; it grows almost every day. Every user is advised not to keep an unattended blog, forum, or other script. It is always better to add some kind of captcha to every registration form. Nowadays spammers have broken many captchas as well, so some people have started using a different solution, random questions. In any case, there should be some verification at registration, and comments shouldn't be allowed without registration. You can also add the Akismet plugin, which is available for almost all blogs and forums; it drastically reduces the amount of spam and acts pretty quickly.

Protecting against form spam is not only good for the server, but also good for your site's reputation. If you are hosting an unattended blog script inside your main site, it may take serious SEO reputation damage if the unattended blog is regularly spammed by malicious users. So check now: if you have any unattended script inside a folder, double-check it and delete it if it is not essential, or protect it from the auto botnets.