I’ve started my own business, and have been working with a friend’s business to migrate his web hosting clients over to my servers. For the most part this transition has been smooth, except for one client. Due to how their directories were configured (and WP misconfigurations), instead of creating normal subdomains through ISPConfig, I had to create them as new domains. This was fine until they changed their name servers to point to mine…then in came the 500 and 503 errors. Luckily, I had documented what I did for similar issues for those who use Apache2 + PHP + ModFCGI. Continued »
While I’m not the biggest saint in the IT world when it comes to doing backups ([religious figure]-bless the fact OpenVZ has a simple container-backup function), when you do perform a backup one of the worst things that can possibly happen (besides a corrupted backup) is the backup not being created at all due to an error. Even though I wasn’t doing a backup at the time I ran into this issue, I thought it would be helpful as MySQL still has a pretty strong hold on the database market, especially on *nix systems. Continued »
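The actual fix is past the jump, but the defensive pattern behind it is worth sketching: check mysqldump’s exit status yourself, and never leave a half-written dump sitting where a good backup should be. Everything below (the `safe_dump` helper, database name, paths) is a hypothetical illustration, not the fix from the post.

```shell
#!/bin/sh
# Hypothetical sketch: wrap mysqldump so a failed dump never leaves a
# truncated backup file in place.
safe_dump() {
    db="$1"; out="$2"
    tmp="${out}.part"              # dump to a temporary file first
    if mysqldump --single-transaction "$db" > "$tmp"; then
        mv "$tmp" "$out"           # only move into place on success
        echo "backup of $db written to $out"
    else
        rc=$?
        rm -f "$tmp"               # discard the partial dump
        echo "mysqldump exited with status $rc; no backup written" >&2
        return "$rc"
    fi
}
```

Called from cron as, say, `safe_dump mydb /var/backups/mydb.sql`, a non-zero exit gets the error in front of you instead of silently storing a corrupt file.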
For the past few months I’ve been working hard at getting my own business started and going. It’s definitely been a ride, and I’ve learned a lot about business and I.T. that I didn’t think I’d ever touch on. One of those I.T. aspects involves OpenVZ and their Ubuntu template (I use 11.04 x86), and the disappearance of lo, the loopback adapter. Continued »
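For reference, the usual place lo is defined on a Debian/Ubuntu system is /etc/network/interfaces; if a template ships without the stanza, the stock form looks like this (shown as a general sketch, not the exact fix from the post):

```
# /etc/network/interfaces -- the standard loopback stanza
auto lo
iface lo inet loopback
```

With that in place, `ifup lo` (or restarting the container) should bring the adapter back without further fuss.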
There’s plenty of control panels out there, some free and some that cost a lot of money. A very popular paid one is cPanel, and that is arguably THE most popular control panel for servers out there. You can do a lot with it, probably more than most should be allowed to do (this coming from my experiences working with web hosting companies). However, over the past few years, when I’ve wanted a control panel for my own use I’ve stuck with the free side of things. That’s where this little write-up comes in. I’m talking about Webmin, and for those of you that have used it, you know just how powerful it can be.
Basically every guide you see online about how to set up an SMTP server says you need SMTP authentication enabled to be safe (and to avoid open-relay attacks). While yes, you do need this if you’re running an enterprise-level system that requires remote connections from smartphones, laptops/PCs at home, etc., what about those that do not run into this issue?
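To make that concrete: a box that only receives mail for its own domains can stay closed to relaying without SMTP AUTH at all. A minimal sketch of the relevant settings, assuming Postfix (adapt to your MTA):

```
# main.cf (sketch): accept mail addressed to domains we host,
# refuse to relay for anyone else -- no SMTP AUTH required.
mynetworks = 127.0.0.0/8
smtpd_recipient_restrictions =
    permit_mynetworks,
    reject_unauth_destination
```

`reject_unauth_destination` is what keeps you from being an open relay here; AUTH only becomes necessary once legitimate users need to send *outbound* mail through the box from outside `mynetworks`.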
Now, getting down to business. You always hear the infamous words that Linux has no viruses. Well, if that were true, then why are there virus scanners for Linux? Sure, some could exist to take money from unsuspecting Windows transitioners, but that doesn’t explain ones like ClamAV.
This is where the interesting part of this post comes in. Not because I’ve come up with some revolutionary new breakthrough, or because Linus is now the king of putt-putt, but because there’s documentation out there illustrating how to write a virus for Linux ELF programs.
Who, might you ask, released this information? Linux Journal, in their January 2012 issue. Before you go and read it hoping to copy/paste the code, the author states right off the bat that the harmful code is left as an exercise to the reader, as he doesn’t want to contribute to any damage done. Interested to see where this leads? Continued »
I started writing this article late last week or earlier this week, but some unknown issue happened (gotta love driver issues). The point of this article is to cover the benefits and costs of both passive and restrictive firewalls. When I first got into I.T. security I always thought restrictive firewalls were the most secure (which they are), and that passive firewalls were completely pointless. However, over the years (and with learning) I have found that they both serve real purposes; it just depends on what you want the firewall to protect. Continued »
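As a rough illustration of the two postures, here is an iptables sketch (not a drop-in ruleset; pick one posture or the other, and the subnet is a documentation-range placeholder):

```
# Restrictive: default-deny, then open only what you need.
iptables -P INPUT DROP
iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
iptables -A INPUT -p tcp --dport 22 -j ACCEPT    # e.g. SSH

# Passive: default-accept, then block known-bad traffic.
iptables -P INPUT ACCEPT
iptables -A INPUT -s 203.0.113.0/24 -j DROP      # example bad subnet
```

The restrictive ruleset fails safe (anything you forgot to allow is dropped), while the passive one fails open (anything you forgot to block gets through) but won’t cut off a service you didn’t anticipate.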
I didn’t intend on writing this post; in fact, I had a whole idea for a different post to write, but that will come tomorrow. I want to address some things. As this turned into a longer post than I originally intended, I’m placing the jump (“Continued…”) here. Continued »
It’s 12:40 AM here as I’m typing this, on a Saturday night, and what better thing to do than discuss some security? I’m all for healthy competitions (heck, I even partake in some wargames for the fun of it), but this is one that could really benefit you. Here are the main points: the GCHQ (a British government intelligence organization) is using the web to recruit new people for their security team. Now, this may not sound new at all to many because, hey, who hasn’t used Craigslist or Monster to fill a position? This is different, however. They are running a website called Can You Crack It? which, upon visiting, prompts you with a welcome screen asking you to crack a code. Once you get the key code, apparently (though I can’t vouch, as I haven’t cracked it myself) you’ll be presented with a true welcome screen.
This is pretty intuitive in my opinion. In fact, I’ll be documenting my progress here for the duration of this project (which ends in 7 days). Will I get it solved? Most likely not…I’m not a whiz when it comes to this kind of stuff, but why not try and have some fun with it? I’m not sure if you can have mulligans with this or not, but it does seem the code is static. I’ll be posting more about this, most likely tomorrow, after I drink some more hot chocolate and try to fuse every single brain cell I have to solve this.
Dave Taylor wrote an interesting editorial/tutorial in the most recent edition of Linux Journal where he decides to parse Twitter’s HTML to find out how many tweets and such a user has made. This got me wondering: is it worth it? I mean, Twitter has a pretty robust API where you can already get this information. Do they have a Bash library (which Dave’s article discusses)? No, unfortunately, although that would be pretty interesting. But, as most sysadmins use one language or another that does have an official library binding for Twitter, why not use that instead?
I know this sounds weird coming from me, especially since I tend to reinvent the wheel more than I should. Most of the time when I do that, though, it is to get a better understanding of what is happening in those libraries. Dave teaches us the use of regex, sed, grep, and cURL…none of which really benefit this process, and in Bash they could well make it slower.
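To make the contrast concrete, here’s a hypothetical sketch of the API route. At the time, Twitter’s REST API served user profiles as JSON (fetched with something as simple as curl), so pulling out a tweet count is one expression against structured data rather than a pile of HTML scraping. The sample response below is made up; `statuses_count` is the real field name.

```shell
#!/bin/sh
# In practice the JSON would come from the API, e.g.:
#   json=$(curl -s "http://api.twitter.com/1/users/show.json?screen_name=example")
# A canned sample stands in here so the parsing itself is visible:
json='{"screen_name":"example","statuses_count":1234}'

# One sed expression against structured JSON, instead of scraping HTML:
count=$(printf '%s' "$json" | sed 's/.*"statuses_count":\([0-9]*\).*/\1/')
echo "example has tweeted $count times"
```

Even this is brittle next to a proper library binding, which is rather the point of the question below.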
By now everyone should know I love Bash and its portability. However, I also feel that in these cases, especially when it’s causing problems that are not easy to debug, it might be best to just use a pre-made solution. Such was the case, for example, when I was trying to implement RSA in a PAM module I’m working on. I could do it myself, but I knew I would not make an efficient solution, so I decided to use a pre-made one.
My question to the readers, though, is: what do you think? Is a bare-bones API (e.g., Twitter’s) worth re-writing in a (let’s be honest here) outdated language? Or am I just going crazy and being attacked by holiday-cheerful penguins that want me to do nothing but work on benchmarking tests?