The Fine Line Between Spam and Email Marketing

Posted by Nessa | Posted in uncategorized | Posted on 03-06-2010


If I had a Viagra pill for every time I confronted a spammer who pulled the “email marketing” excuse…

There’s no debate, no exceptions, and no justification. If you’re sending out massive amounts of unsolicited email, you’re a spammer. Period.  You’re not running a legitimate or respected marketing campaign, you’re not helping anyone with their emotional or erectile dysfunction problems, and in reality, no one cares to read what you have to say.

A lot of hosting providers are cracking down on spammers – and from experience, I can tell you why: spammers are an inconvenience to everyone, even themselves. They’re an inconvenience to me, the system administrator who has to sift through spam complaints and spend hours every week tracking them down. They’re an inconvenience to our customers, who find their email blacklisted because of a spammer on their network. And finally, they’re an inconvenience to you, the Internet user who has to deal with spam on a daily basis because some people have nothing better to do.

And here’s how you know if you’re one of those some people:

  • You purposely find ways to circumvent your ISP or host’s mailing limits instead of simply asking
  • You’re harvesting or buying email addresses from websites, other mailing lists, and third parties to build your recipient base
  • You’re sending those people [unsolicited] email advertising yourself, a product, or a website
  • You use spoofing or other tactics to hide your email address or server information
  • You don’t give your victims (er, recipients) a way to opt out of your torturous email campaigns
  • You hide behind the CAN-SPAM Act to justify your behavior
  • You get the slight inkling that people hate you for what you do

So what’s the fine line between spam and email marketing?

It’s all about honesty and consent. If I contact a fellow blog owner requesting a link exchange, I wouldn’t technically consider that spam. If I send the same email to 40 other people, I’m crossing the line. A legitimate, non-spammy email campaign consists of a database of opted-in users and email content consistent with what those users actually signed up for. If any of those users decide they don’t want to participate anymore, they’re given a quick and easy way to remove themselves from the list. The fine line between spamming and email marketing is the concept of opting in. Simply put, email marketers use opt-in lists; spammers don’t.

And no, I’m not talking about purchasing opt-in lists that other people have compiled.

Spammers have ruined the concept of email marketing to the point that even legitimate email marketers are being accused of spamming, and many hosts won’t work with them at all. That’s not all hosts are doing to fight back, either. Many larger hosting providers are so tired of dealing with spammers on their network that they impose mailing limitations that tend to inconvenience other users. Here are just a few:

  • Limiting the number of emails sent per minute, hour, or day
  • Limiting the number of recipients that can exist in a single email or BCC field
  • Blocking outbound SMTP connections so scripts can’t send email through remote servers (see the sketch after this list)
  • Blocking email sent from certain system users (like the Apache user), requiring the use of authenticated mail sessions
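That last one is simpler than it sounds. Here’s a minimal firewall-side sketch, assuming the MTA runs as the exim user (swap in postfix or whatever yours uses) – the mail server itself can still deliver mail, but a compromised script making direct connections to remote mail servers gets rejected:

# Let the MTA (running as the "exim" user here) make outbound SMTP connections
iptables -A OUTPUT -p tcp --dport 25 -m owner --uid-owner exim -j ACCEPT
# Let root through too, for maintenance and testing
iptables -A OUTPUT -p tcp --dport 25 -m owner --uid-owner root -j ACCEPT
# Everyone else gets rejected when they try to talk to port 25 directly
iptables -A OUTPUT -p tcp --dport 25 -j REJECT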

For everyone else, here are a few tips on dealing with spam:

  • Delete it – it takes two seconds
  • Learn what a spam filter is, and use it
  • Stop trying to play Internet police. Feel free to report spam to the ISP or host, but don’t start spouting off with legal threats. It’s not going to change the fact that millions of spam emails are sent every day, and no court is going to waste its time on you
  • Don’t assume that the ISP or host can read minds – do you think they would have intentionally allowed a spammer to sign up for their service?

And for ISPs and hosts, you have responsibilities as well:

  • Don’t be afraid to impose the aforementioned limitations on your servers. Your goal should be to look out for the best interests of your customers as a whole
  • Require justification from users who want to send to large mailing lists: ask how much email they’re sending, who they’re sending to, and whether they have an opt-in/opt-out method
  • Set up abuse@ and postmaster@ email addresses, which are usually where complaints get sent. This way you’re aware of users who may be abusing your network, even if their victims never contact you directly (there’s a quick setup sketch after this list)
  • Sign up for feedback loops, so automated spam reports from various email providers are sent to you to review
  • Deal with spammers ASAP. Not doing so can get your network blacklisted, or escalate complaints into cancellations from your customers – or even legal threats
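Setting up those role addresses takes about two minutes on most mail servers. A quick sketch using the standard sendmail-style aliases file – the addresses here are made up, and the file path varies by MTA:

# Route abuse@ and postmaster@ mail to the admin team (addresses are examples)
echo "abuse: admins@example.com" >> /etc/aliases
echo "postmaster: admins@example.com" >> /etc/aliases
# Rebuild the alias database so the MTA picks up the changes
newaliases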

Programming Tip: Assume Your Users are Idiots

Posted by Nessa | Posted in uncategorized | Posted on 30-05-2010


Any programmer knows the golden rule of programming – no matter how well you’ve coded an application, there’s always going to be something wrong with it. I’ve done enough development work to have a lasting suspicion that if there’s a bug or hole to be found, someone will stumble upon it and rub it in your face.

Here’s an interesting fact:

There’s no such thing as a bug-free application.

No amount of poking, prodding, testing, slurping, or caressing is going to find every possible fault that can exist in an application. Somewhere along the line, one of your users is going to trigger a problem and cause you to spend a few hours patching code. It’s like idiot-proofing a microwave – you can’t reasonably predict every possible thing that a user can do; you just do what you can and hope for the best.

The good thing about these idiots is that they make us better programmers.  To be a better programmer, you have to think like an idiot and apply some basic principles:

1. Validation

Check and maybe even double-check all types of input and assume the worst. Sure, maybe that user didn’t know that Sex referred to gender, but you should have thought of that.  Always take into account blank, malformed, incorrect, malicious, and duplicate data.

2. Default Actions

Any time you use conditionals, always combine validation with a default action in case something unexpected happens. Do you know what your application is going to do if a specified condition isn’t met? (There’s a quick sketch of both of these principles below.)
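To tie 1 and 2 together, here’s a dumbed-down shell sketch – a hypothetical signup handler, not code from any real app. Every expected value gets its own validated branch, and the catch-all default handles whatever you didn’t predict:

#!/bin/bash
# Hypothetical signup handler: validate the "sex" field instead of trusting it
input="$1"

case "$input" in
    male|female)
        echo "Saving: $input"                        # expected values
        ;;
    "")
        echo "Error: field left blank" >&2           # blank data
        exit 1
        ;;
    *)
        echo "Error: unexpected value '$input'" >&2  # default action
        exit 1
        ;;
esac

The default branch is the part most people skip – if you can’t say what your code does when none of the conditions match, an idiot will find out for you.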

3. User behavior

Some people do things you don’t want them to, but you have to be ready for it anyway. Does your application work correctly if people hit the “back” or “refresh” buttons? Is it going to cause a problem if someone bypasses your lightbox and opens a link in a new tab instead? Or bookmarks a page that was meant to be accessed from a login screen?

4. Acceptance

Accept the fact that no matter what advice I give, you’re still never going to make it perfect.

And don’t forget – testing, testing, testing. While some people I deal with like to believe that I don’t actually test anything, I do – I just also know the golden rule of programming and that there’s no way around it.  Testing is an ongoing process and requires both automated and manual work. Don’t knowingly leave a bug or security flaw in place and assume it will go unnoticed – trust me, it won’t.  If there’s one thing idiots are good at, it’s making you look like an idiot.

WordPress Thinks Network Solutions is Stupid

Posted by Nessa | Posted in uncategorized | Posted on 22-04-2010


Quick quiz: What does a hosting provider do when they know they’ve messed up and don’t want to deal with the fallout?

You apparently blame WordPress.

Don’t get me wrong here – being behind the scenes of server management for a webhosting company makes you the target of a lot of accusations. And yes, most of the time when a user’s site gets hacked it’s their own damn fault. But in this case, Network Solutions is apparently trying to push their issues off on WordPress because they don’t want to admit they f*cked up.

Well, WordPress is pissed.  I logged into my dashboard today and the first thing I see in my news feed is:

A web host had a crappy server configuration that allowed people on the same box to read each others’ configuration files, and some members of the “security” press have tried to turn this into a “WordPress vulnerability” story.

To highlight the best part:

I’m not even going to link any of the articles because they have so many inaccuracies you become stupider by reading them.

P.S. Network Solutions, it’s “WordPress” not “Word Press.”

Burned.

In short, Network Solutions acknowledged that most of the problem was due to users’ public_html and wp-config.php files being readable by other users on the server – something which could have easily been caused by the users setting the permissions of those files insecurely. But they took a shot in the dark and said that the problem was caused by WordPress putting cleartext database credentials in the wp-config.php file – something that just about every software developer does, as WordPress states:

WordPress, like all other web applications, must store database connection info in clear text. Encrypting credentials doesn’t matter because the keys have to be stored where the web server can read them in order to decrypt the data. If a malicious user has access to the file system — like they appeared to have in this case — it is trivial to obtain the keys and decrypt the information. When you leave the keys to the door in the lock, does it help to lock the door?

Good point. They also went on to say that a properly configured web server will not allow users to access the files of another user, regardless of file permissions. This is why most hosts have switched to suPHP or phpsuexec, technologies that Network Solutions was apparently left in the dark on. At least now they seem to be taking responsibility for the problem and are attempting to handle it.

I’m also going to state, based on comments in popular blogs from users who don’t know what the hell they’re talking about, that unless someone has access to view the source of a PHP file, they can’t see the database credentials. PHP files are executed server-side, and only their output is sent to the browser. Since the username and password are stored as variables and are not echoed out anywhere, someone simply calling wp-config.php from a browser can’t access your login data.
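And if your host does run something suPHP-style, where PHP executes as your own user, you can take the other accounts on the box out of the equation yourself. A quick sketch – the path assumes the usual cPanel-style layout, so adjust for your setup:

# Make wp-config.php readable by its owner only; PHP running as your user
# can still read it, but other accounts on the server can't
chmod 600 ~/public_html/wp-config.php
# Sanity check: the permissions should now show -rw-------
ls -l ~/public_html/wp-config.php

(Don’t do this on a host still running PHP as the webserver user – the site will stop being able to read its own config.)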

You’re probably going to find all kinds of fixes on the various sites covering this story, but I’m going to give the same advice I give all of my customers who have had sites hacked:

  • Change your FTP and MySQL user passwords
  • Replace all files on your site from a ‘clean’ backup
  • Make sure the software on your site is up to date
  • Scan your PC for viruses
  • Choose a secure host
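If you want to see how bad the damage is before you start restoring, here’s a one-liner I tend to reach for – it assumes the usual cPanel-style home directory, so adjust the path as needed. Injected code usually shows up as recently modified files:

# List every file changed in the past 7 days - a common way to spot injections
find ~/public_html -type f -mtime -7 -ls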

Remember that your site can get hacked regardless of who your host is or how secure they are, though your host has to take some level of responsibility for hacks that are caused by their own bad configuration, as in the case of Network Solutions.

Who Gives a Crap About “The Cloud”?

Posted by Nessa | Posted in uncategorized | Posted on 11-02-2010


That was my question all through HostingCon last year. Almost every pillar seminar managed to work in a mention of “the cloud,” usually in a context that amounted to nothing more than an excuse to talk about cloud hosting. But really, who cares about cloud hosting?

No really — I’d like to know who thinks cloud hosting is really worth its hype and would benefit a hosting provider offering shared hosting services.

You need special hardware and software to efficiently support a cloud hosting platform. It’s not like a cPanel server you can just turn on and set up – and so far I haven’t come across any [good] user-side control panels available for cloud hosting, which means you’re going to have to come up with your own. Since the hardware is also specialized, I’m sure the scope of vendors is limited, and those vendors probably take full advantage of that by cranking up their prices.

And…

The purpose of cloud hosting is expandability and reliability. You have multiple servers working in tandem serving sites, so if one server has a problem, the others pick up the slack. Then, if you plan on doing what other hosting providers do, you’ll charge your clients based on how much system resources they use instead of changing their hosting plan every time they have a burst of traffic. The part about the stability is great – but the same can be achieved by load balancing. And not limiting a user’s resources but charging them for what they actually use is great too – until they use too much, especially in conjunction with other users on the system who are coincidentally “overusing” resources as well. But you’re probably losing money, and fooling those customers into thinking that they can get away with running that junk on a shared server. Thank you, Mr. Over-Cloudy Shared Hosting Provider, for providing a false sense of need to your customers so they cause a problem for the rest of us when they decide to switch hosts.

I don’t know how they do it in the cloudy wonderland up there, but in the real world of hosting, if one of my customers is burning an excessive amount of CPU cycles, they’re not going to be on one of my shared servers – they’re moving to a dedicated server. If a site gets enough traffic to warrant VPS or dedicated hosting, why would you willingly keep it on a shared server? You’re stunting your revenue by 1) allowing high-resource customers to pay shared hosting prices, even if the cost fluctuates based on their usage, and 2) decreasing your shared server capacity, so you end up needing more servers to accommodate users who shouldn’t be on them to begin with. When a server runs out of resources, it runs out of resources – whether it’s one server or 10 servers “clouded” together.
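And spotting those customers doesn’t take a cloud, either. One command on a Linux box tells you who to start the dedicated-server conversation with:

# Show the 15 most CPU-hungry processes, heaviest first
ps aux --sort=-%cpu | head -15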

Cloud hosting tends to be beneficial only to the customer, who is certainly getting the better end of the deal by costing you money. I’m just going to put it out there that while customers probably like the concept of cloud hosting, most probably have no idea what it actually is, and wouldn’t notice any change in hosting quality from that of a standalone or clustered hosting solution. So I’m sure you could actually just run their site off a crap dedicated server with 100 other customers and randomly change their hosting bill every month to make it look like they’re getting cloud hosting, then laugh while they talk about how awesome it is to be on the cutting edge of technology. Heh.

That also brings up a customer service point about cost.  I checked a few pricing points for cloud hosting providers, most of which charge on a percentage of RAM and CPU cycles used per month.  To me, that just screams customer service problems. Most of the time when I try to tell a customer that they are using too much CPU on a shared server, the first thing they do is either deny it, or blame it on us.  You can imagine what would happen if a customer’s traffic quadrupled one month and they look at their bill, suddenly realizing that they were charged more.  A majority of your customers are likely non-technical and therefore not going to understand why their hosting charges changed.

Don’t get me wrong – I’m not against cloud hosting, I just don’t care for it, and I’m tired of hearing about it. If you’re shopping for hosting and your site is as massive as Google’s, you could benefit from a dedicated cloud hosting solution. But otherwise, just stick with the simple stuff. Standalone and clustered servers have been used for years, and tend to be very reliable if managed efficiently.

I mean, people thought the iPad was going to be the next best thing but it turned out to be a piece of shit.

v-nessa.net is de-Googled

Posted by Nessa | Posted in uncategorized | Posted on 21-07-2008


Um….what?

I thought something was up when my traffic rate went from 300,000 in May to only 24,000 in June, even more so when I found out my PageRank dropped from a 6 to a 4. I was de-Googled! Apparently a month or so ago a spam comment slipped past Akismet and got posted to my blog, which meant that my site no longer met the “quality” guidelines for Google. I found the post and removed it, but upon filling out their reconsideration form it appears that it takes 4-6 weeks to be reviewed and added back. Man, that sucks….

Anywho, I’d hate to do this mainly because I’m lazy, but I’ve set WordPress to send new comments to moderation again so I have to manually approve them. What I’m confused about is that if my site doesn’t meet quality guidelines for Google, where does this guy fit in?

Someone’s Got the Internet AIDS…

Posted by Nessa | Posted in uncategorized | Posted on 23-03-2008


I knew something was fishy when I got an IM from my ex whom I haven’t spoken to in over a year:

hey How are you???? this is ur pic rite?! http://www.msn-gallery.com/gallery.php?user=blue_butterfly21.jpg

Worst of all, I can’t believe I clicked on that shit. I thought maybe it was one of those pictures from amateur night at JB’s Gallery of Girls back in 2004 that caught up with me. But no, as soon as I clicked on it my PC (which unfortunately is the one that runs Windows XP) froze up for a good minute, during which time it was sending the same message to all 158 people on my MSN friends list.

Arrrggg… anywho, the virus – which is Backdoor.Generic3.SAT – is pretty harmless as far as your PC is concerned, but you’ll probably get kicked every time you open an MSN window. So close your MSN and go here and here to read about how you can cure the internet STDs you’ve probably just spread around to all your friends. It’s like the ’70s all over again, only the free clinic isn’t as crowded.

Google Wants to Be the Next God of the Universe

Posted by Nessa | Posted in uncategorized | Posted on 02-12-2007


I thought this was too shocking to be true, but the other system admins confirmed it – Google officially wants to be the next god of the universe. The datacenter that houses over a hundred of our servers also caters to some of Google’s servers, and apparently Google also owns part of the building or something like that. They decided that they don’t have enough power for their servers, so they are actually demanding that the entire datacenter be stripped of all power for about two hours while they install more power lines. No, not more power for the datacenter – just for their little cage. So basically, all hundred or so of our servers housing thousands of sites (as well as the hundreds of servers belonging to other occupants of the datacenter) are going to be powered down for two hours so Google can expand their empire and eventually take over the world. I hope Google is happy with the fact that we’re all going to lose customers and reputation over this – so happy that they all get gonorrhea and die.

Also, mozzy on over here and take a look at the mildly humorous pictures.

GRUB Errors on Windows Dual Boot

Posted by Nessa | Posted in Uncategorized | Posted on 30-11-2007


I don’t want to admit that I still have PCs that dual boot Windows XP and Vista, but given the occasional problems I have after Ubuntu and Fedora updates, I’m not ready to give them up yet. Some time in the middle of the night last night my laptop, which used to dual boot Ubuntu and Vista (before I deleted the Ubuntu partition), rebooted and left me with a ginormous GRUB loader error:

GRUB Loading stage 1.5

GRUB loading, please wait...
Error 15

The issue is that the boot loader probably went apeshit and doesn’t know what to do.  Since Windows is the MBR nazi, it’s best to use Windows to fix it.

Luckily, given all the luck I’ve had with Vista, I still had the install CD and was able to recover quickly. For those of you at home, if you don’t have the original install CD you’ll need to create a boot disk and slide it in <insert giggle here>. From the CD, when the menu comes up hit ‘R’ for the recovery console, which will drop you into the Windows command line. If you’re using the boot disk, you should already be there.

From the command line, type ‘fixmbr‘ (or ‘fdisk /mbr‘ for versions older than XP) and you should then be able to boot into Windows.
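One caveat: the ‘R’ recovery console is an XP thing. If your disc is actual Vista media, boot it, pick “Repair your computer,” open the Command Prompt from the recovery options, and the equivalent command is:

bootrec /fixmbr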

If you’re still running Linux on dual boot, another option is to run the recovery from the Ubuntu CD.  However, this can take a lot of time and if you don’t know what you’re doing you may end up deleting your OS.

How to Upgrade to a Non-Existent MySQL Version

Posted by Nessa | Posted in Uncategorized | Posted on 28-11-2007


Working in webhosting for a while now, I’ve had some people ask for really weird shit, and I’ve dealt with a lot of people who try to sound a lot smarter than they actually are (I’m one of them). The latest of the bunch is a guy who asked for MySQL 7.0, claiming that he’s a MySQL programmer and that he specially programmed his database to work with MySQL 7.0. He didn’t take it very well when I told him that there is no MySQL 7.0 and the most he can hope for is 6.0x alpha (FYI for future readers a year from now, read the damn date on this post). Beside the point, the guy apparently felt like I was talking down to him, so he went out of his way to mention that because he has a bachelor’s degree in computer science and is an avid Microsoft Word user, he definitely knows more than I do when it comes to doing my job. So, I gave in and agreed to upgrade him to MySQL 7.0.

The trick of the trade here is that you can essentially install any version of MySQL that you want, whether it exists or not! It’s a long-standing suck point in cPanel that the MySQL version shown in each user’s cPanel is read from a static file within the datastore directory:

/home/username/.cpanel/datastore/_usr_sbin_mysqld_--version

Within that file is the output of the ‘/usr/sbin/mysqld --version‘ command, which cPanel reads and outputs to each user’s cPanel. You can easily edit this file in one user’s account to make it read whatever MySQL version you want:
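The whole “upgrade” is one line. A sketch – the username is made up, and the version string just mimics the format ‘mysqld --version‘ normally prints:

# Overwrite the cached version string that cPanel displays for this user
echo '/usr/sbin/mysqld  Ver 7.0.0 for pc-linux-gnu on i686 (MySQL Community Server)' \
    > /home/username/.cpanel/datastore/_usr_sbin_mysqld_--version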

Needless to say, after I ‘upgraded’ his version to 7.0, he claims his scripts started working!

How to Commit Genocide on Annoying Processes

Posted by Nessa | Posted in Uncategorized | Posted on 21-11-2007


A few days ago I came across some processes on one of our servers that just wouldn’t die. Even after doing a kill -9 and all that good stuff, more would just keep spawning until there were dozens running on the machine. A head system admin of ours gave me this command, which will mass-kill all like-named processes so they don’t have a chance to re-spawn each other.

The processes running were all some form of “init_”, like init_1, init_13, etc. To kill these:

ps aux | grep init_ | awk '{print $2}' | awk '{print "kill -9 " $1}' | sh -v

The ‘grep init_’ should reflect the common name of all the processes.
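For what it’s worth, you can get the same genocide with a lot less plumbing if pkill is on the box – it matches against the full command line and signals everything at once:

# Kill every process whose command line matches "init_", in one shot
pkill -9 -f 'init_'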