February 26, 2009 5:45 PM
Posted by: Xjlittle
credit card theft
, cyber criminals
, cyber theft
, super 8
, wyndham hotels
The Wyndham Hotel chain's computer systems security team discovered in mid-September 2008 that the company's central computer systems had been infiltrated. The intruder gained access through a franchisee's computer system and from there was able to reach Wyndham's central systems. Wyndham believes that as many as 41 properties may have been affected, including about 21,000 customers in Florida.
Wyndham immediately retained a qualified investigator to assess the problem, ensure that it was isolated and implement a stronger security system. The Secret Service, the credit card agencies and several state attorney general offices were also notified. Wyndham is making an effort to contact all of the affected customers by working through the credit card companies. It appears that only credit card information was stolen, without matching names and addresses. Wyndham says:
To ensure our customers’ card numbers were protected, we provided each of the payment card companies (American Express, Visa, Mastercard and Discover) with the actual card numbers that were accessed so that these payment card companies could take such action as they deemed appropriate to monitor the use of the cards.
Wyndham does not keep social security numbers or other confidential identifying information and does not believe any identity theft has occurred because of the breach. The criminals did manage to get magnetic stripe information, which contains the CVV code. Card numbers with this code bring a higher price on the black market because they are easier to use in fraudulent transactions.
When a stolen card is used in a transaction that includes the CVV code, the banks are responsible for the charges. When only a card number and an expiration date are used, as happens in many online sales, the retailer is responsible.
If you believe that you may have been affected by the theft, you can find more information here.
February 26, 2009 12:57 PM
Posted by: Xjlittle
, local hosts
Recently I installed OpenSuSE on my notebook. It's been a while since I used any SuSE products, but since my next contract is supporting SuSE servers I thought I should bone up on the distribution. Most things went as expected, except that I couldn't ping or resolve local hosts.
That’s correct. I couldn’t ping them nor could I access them via a web browser. Bummer. I access my management consoles and mp3 server through a browser.
The dig utility would resolve them OK, just not any user or administrative applications such as those mentioned above. After doing some digging around on the web and reading some man pages, I found that there are two files that need to be edited, or at least checked.
Now we need to edit the /etc/nsswitch.conf and /etc/host.conf files. Note that I said the /etc/host.conf file, not /etc/hosts as some people have assumed.
Change the /etc/nsswitch.conf hosts entry to read:
hosts: files dns
and the /etc/host.conf file to read:
order bind, hosts
You may need to reboot after doing this. With these changes in place I could ping and resolve the local hosts normally.
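A quick way to confirm the fix is to test resolution through the same path the applications use. getent follows the hosts: line in /etc/nsswitch.conf, the same lookup path that ping and the browser use, while dig talks straight to the DNS server and bypasses nsswitch entirely (which is why dig worked all along). localhost is used here as a stand-in for your own host names:

```shell
# getent resolves through nsswitch, just like ping and the browser do;
# if this returns an address, applications will resolve the name too
getent hosts localhost
```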
Be careful which posts you use to solve this or any other problem. I read one post that said that if the domain ends in .local, which mine does, then this is problematic with SuSE. One poster even recommended changing the domain name. That's right. I'm going to change the FQDN of all of my machines. Not.
The problem simply lies with the settings that SuSE ships with its products, specifically in the /etc/host.conf file. See the man page for more details. It specifically points out the use of .local domains and what to set the entries to in the file, the critical one being the mdns entry.
February 19, 2009 7:01 PM
Posted by: Xjlittle
Many ISPs are requiring SSL and a password to connect and send mail. This how-to shows how to set up your sendmail server to use SSL with a password when connecting and sending mail through your ISP.
I set this up on a CentOS 5.2 virtual machine. You should have the following packages installed:
First let's generate our self-signed certificate. Be sure to use the FQDN of your server for the machine name.
umask 77 ; \
PEM1=`/bin/mktemp /tmp/openssl.XXXXXX` ; \
PEM2=`/bin/mktemp /tmp/openssl.XXXXXX` ; \
/usr/bin/openssl req -utf8 -newkey rsa:1024 -keyout $PEM1 -nodes -x509 -days 365 -out $PEM2 -set_serial 0 ; \
cat $PEM1 > sendmail.pem ; \
echo "" >> sendmail.pem ; \
cat $PEM2 >> sendmail.pem ; \
rm -f $PEM1 $PEM2
Generating a 1024 bit RSA private key
writing new private key to '/tmp/openssl.wc3819'
You are about to be asked to enter information that will be incorporated
into your certificate request.
What you are about to enter is what is called a Distinguished Name or a DN.
There are quite a few fields but you can leave some blank
For some fields there will be a default value,
If you enter '.', the field will be left blank.
Country Name (2 letter code) [GB]:US
State or Province Name (full name) [Berkshire]:AZ
Locality Name (eg, city) [Newbury]:Tempe
Organization Name (eg, company) [My Company Ltd]:Self
Organizational Unit Name (eg, section) []:
Common Name (eg, your name or your server's hostname) []:mail.home.local
Email Address []:firstname.lastname@example.org
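If you would rather script this than answer the prompts, openssl can take the whole Distinguished Name on the command line with -subj. This is a minimal non-interactive sketch using the same example values as the transcript above; the file names key.pem, cert.pem and sendmail.pem are my choices for illustration, not anything sendmail mandates:

```shell
# Generate the key and self-signed certificate in one shot, no prompts
openssl req -utf8 -newkey rsa:1024 -keyout key.pem -nodes -x509 -days 365 \
    -subj "/C=US/ST=AZ/L=Tempe/O=Self/CN=mail.home.local" -out cert.pem
# Combine them into a single PEM file, as the interactive recipe does
cat key.pem cert.pem > sendmail.pem
# Verify the subject and validity dates of the result
openssl x509 -in sendmail.pem -noout -subject -dates
```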
Next we need to make some edits to the sendmail.mc file. cd to /etc/mail and open the file with your favourite editor. The following lines should be edited or added to match your configuration and/or connection information for your ISP. Note that dnl at the front of a line marks it as a comment; remove it from the beginning of any line that you edit.
define(`SMART_HOST', `smtp.att.yahoo.com')dnl <==put your ISP's smtp server here
TRUST_AUTH_MECH(`EXTERNAL DIGEST-MD5 CRAM-MD5 LOGIN PLAIN')dnl <==uncomment the next two lines and add the third line
define(`confAUTH_MECHANISMS', `EXTERNAL GSSAPI DIGEST-MD5 CRAM-MD5 LOGIN PLAIN')dnl
define(`confCACERT_PATH', `/etc/pki/tls/certs')dnl <==uncomment these 4 lines
DAEMON_OPTIONS(`Port=smtp, Name=MTA')dnl <==Remove the loopback address from this line
DAEMON_OPTIONS(`Port=smtps, Name=TLSMTA, M=s')dnl <==uncomment this line
Now we need to set up the login information for your ISP's smtp server. In the /etc/mail directory, create a protected auth directory to hold the credentials:
mkdir auth
chmod 700 auth
cd auth
Add the following line to the client-info file:
AuthInfo:your.isp.net "U:root" "I:user" "P:password"
Replace user with your ISP username and password with your smtp password. Save and close the file and perform the following:
makemap hash client-info < client-info
chmod 600 client-info*
Now issue the following command so that everything is compiled as sendmail likes it:
make -C /etc/mail
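To sanity-check that the rebuild picked up your smart host, you can look for the DS line in the generated sendmail.cf; the SMART_HOST macro expands into a line of that form when the .mc file is compiled:

```shell
# SMART_HOST ends up as a "DS" line in the compiled configuration;
# this should print DS followed by your ISP's smtp server name
grep '^DS' /etc/mail/sendmail.cf
```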
Last, edit sendmail's SASL configuration file and make sure that it contains the following two lines:
pwcheck_method:saslauthd <==make sure that these two lines are in the file
mech_list: plain login
If you are using tcpwrappers, as I have suggested in the past, add the following line to hosts.allow. Change the IP configuration to match your setup:
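The exact line isn't shown above. As a purely hypothetical example, assuming a 192.168.1.0/24 LAN and allowing local connections as well, a hosts.allow entry for sendmail might look like this (the daemon name and network here are assumptions; adjust to your setup):

```
sendmail : 127.0.0.1 192.168.1.
```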
Now it's time to test. Make sure that the correct services are running:
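On CentOS 5 that means the saslauthd and sendmail services. A sketch of starting them and enabling them at boot (run as root):

```shell
# Start the SASL authentication daemon and restart sendmail
service saslauthd start
service sendmail restart
# Make sure both come back after a reboot
chkconfig saslauthd on
chkconfig sendmail on
```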
After starting, check the log at /var/log/maillog. If you find any errors that contain `starttls' then either something is wrong with the sendmail.pem file that you created or the saslauthd daemon is not started. I once had a situation where something happened to the sendmail.pem file and recreating it solved the problem. Beyond that, check your firewall and the syntax in the sendmail.mc, hosts.allow and hosts.deny files.
Once everything is started cleanly, open up your mail client. I used evolution for testing. Edit the preferences and use the settings for your sendmail server. For mine I used the IP address of the sendmail server, checked "Server requires authentication", set "Use secure connection" to SSL encryption and entered the user name that I use to log in to the sendmail server. Note that this is not your ISP username.
Now you should be able to send a test message out through the internet and receive it back through your ISP’s pop server.
February 18, 2009 12:01 AM
Posted by: Xjlittle
, red hat
Red Hat and Microsoft have entered into a virtualization agreement. The agreement is designed so that Red Hat and Microsoft customers using virtualization from both companies can get support from either group.
Red Hat and Microsoft both emphasize that this agreement is not the same as the agreement that Microsoft has with Novell. That agreement covers such things as intellectual property, code indemnification and licensing.
Red Hat and Microsoft still very much remain competing platform vendors. Microsoft's GM of virtualization, Mike Neil, emphasizes that "these agreements do not include any patent or other IP licensing rights." The agreement with Novell is more of a partnership in which Microsoft gives its customers coupons to purchase SuSE Linux Enterprise Server from Novell.
Red Hat and Microsoft will enter into each other’s validation phases for their respective virtualization technologies. The results of these tests will be posted throughout the year on the Red Hat and Microsoft websites.
With the large base of heterogeneous hosts and virtual guests, I suspect this is not the only agreement like this that we will see.
February 17, 2009 10:40 PM
Posted by: Xjlittle
, red hat
I mention Linux security in the title but these best practices apply to any operating system.
There are many excellent 3rd party security tools out there for you to install on your system. Before installing them, though, you should review the tools that are already on your system. There is probably already a package included with the system that will accomplish what you need.
Why not use these tools? The major Linux distributions have gone to considerable expense to test them and make sure that they will not break anything on your system. When you consider the many 3rd party applications that are certified for a distribution, such as Lotus Domino and JBoss, this becomes even more critical. These applications are generally installed because they are mission critical. You don't want to install a non-certified security application only to find that it breaks, or creates a security flaw in, your certified mission critical application. Don't do this.
A pet peeve of mine has always been the "point and click and know not what I just did" approach that many administrators take. While this seems more prevalent in the Windows world, it exists in the *nix world as well. Generally text configuration files can overcome this, but not always. Take, for example, the website securecentos.com (not affiliated with CentOS). One of the things they want you to do is patch your kernel with a patch from http://www.grsecurity.com/. Something like this should raise a red flag immediately. Do you know what the patch is fixing and/or how it is making your machine more secure? If you can't answer yes, then don't apply it, or any other patch except one from your vendor.
Aside from that, when your vendor releases a kernel update you are going to have to redo the whole process. This can quickly become heavy with administrative costs. If your machines are duplicated across the network, you now have to install this on all of them, and again every time you run a kernel update. Don't do this.
You should never download a configuration file that affects the core of your machine without knowing exactly what it does. The same site has many configuration files that they want you to download and put into production on your machine(s). There is even a sysctl.conf file, which affects many core processes of your machine and how they operate. At the time of this post, comments in this file are nonexistent. This amounts to the "point and click and know not what I just did" notion mentioned above. Don't do this.
I don't mean to single out securecentos.com. It just happens to be the one I ran across today among the many out there asking administrators to do things they should think twice about. I'm sure they mean well. If I got out my sysctl manual I could find out what each of those changes would do to my machine. However, I'm not going to. If they want me to use their product or advice, then the changes should be clearly documented, either in the file or with a URL embedded in the file that leads to that information.
Be smart with your machines! Don't put configuration files into service, click buttons that affect the security or core services of your machine, or install 3rd party applications whose tested equivalent may already be on your machine, without knowing exactly what other files and applications they are going to affect.
February 17, 2009 9:27 PM
Posted by: Xjlittle
, red hat
Following their mandate to be binary compatible with Red Hat, CentOS is preparing to release version 5.3.
Red Hat released version 5.3 on January 21st of this year. The CentOS developers generally follow with a CentOS release about 3-5 weeks after Red Hat. This should put the release as generally available around March 1st.
We can expect to see some very nice feature enhancements in this release. NetworkManager and wpa_supplicant have a whole host of updates listed. This means improved wireless security and better driver support. For those of us using Broadcom wireless, the b43 driver from linuxwireless.org has been backported. Following the links on that page should lead you to the proper firmware as well.
The new ext4 filesystem is also included in the new release. Laptop users like myself will be glad to know that anaconda now supports encrypted block devices during installation. Red Hat continues its commitment to Xen and has released many updates for virtualization, including support for up to 126 CPUs in the x86_64 Xen-based hypervisor (up to 32 CPUs per virtual server) and support for up to 1TB of memory per host on x86_64 (up to 80GB per virtual server).
Other enhancements include 802.1q VLAN tagging support for kickstart, iSCSI installation and boot support, the ability to install Xen and KVM guests and, for fibre channel users, Emulex FCoE HBA support through the lpfc driver and QLogic FCoE HBA support through the qla2xxx driver. See a full list of new features here.
With all of these new enhancements desktop users and server administrators are sure to be pleased.
February 15, 2009 7:18 PM
Posted by: Xjlittle
Apple is preparing to release Tiger for its Mac platform this Friday, Feb. 20, 2009. The new release is set to overshadow Microsoft's release of Longhorn.
Tiger boasts over 200 enhancements over the previous release of the Mac software. Chief among those is the Spotlight search feature. Spotlight allows searching of all files on the system, including metadata inside files. To me that sounds a lot like the open source Beagle that SuSE uses.
Network administrators in the media and marketing industries say that Apple is making a comeback. The OS is seen as one of the most stable in the market because it is Unix based. Even though it is proprietary it still uses some open source applications such as Samba. Samba allows Unix like machines to connect to Windows networks.
Apple is selling the software at $129 per user or $199 for a family pack, considerably less expensive than Windows but much more expensive than Linux, FreeBSD or NetBSD.
February 13, 2009 2:00 PM
Posted by: Xjlittle
Microsoft has announced a $250,000 reward for the arrest and conviction of the authors of the Conficker worm, also known as Downadup.
Apparently Microsoft feels that not enough is being done by Windows administrators to stop the infestation and propagation of this worm. F-Secure, an anti-virus software vendor, reported in January of this year that almost 9 million PCs had been infected. The worm was released in the fall of 2008.
The worm exploits a buffer overflow in the Windows Server service. Once on a machine it attacks the Windows Automatic Update, Windows Security Center, Windows Defender and Windows Error Reporting services. It then connects to an external server from which it receives instructions to propagate further, and while connected it downloads more malware that affects other Windows processes including svchost.exe, explorer.exe and services.exe.
Microsoft released a patch (MS08-067) in the fall of 2008 to fix the vulnerability. Microsoft, Symantec and Kaspersky Labs also offer tools to repair infected systems, and McAfee offers an on-demand scan to remove the worm. The worm can also spread via any drive that uses autorun, including USB drives, so many vendors recommend disabling the AutoRun feature for external media by modifying the Windows Registry. Note that if you are running anything earlier than Windows XP Service Pack 2 or Windows 2000 SP4, a patch is not available. Sorry.
Linux and Mac computers are not affected by this worm. It is designed to exploit only computers running the Windows operating system.
Now that we have the background, two questions come to mind. Why are administrators not repairing these systems and, an even bigger question, how in the world are these infected machines able to provide the network services they were set up to perform?
I think I'll stick with my Linux and Solaris machines, where the chances of something like this happening are slim. And if it does happen, the patches generally aren't limited to a certain version of the operating system, especially if you are using enterprise grade software such as Red Hat, CentOS, Ubuntu, SuSE or Solaris. These vendors all offer 5 to 7 years of security patches on their enterprise versions.
February 12, 2009 10:26 PM
Posted by: Xjlittle
, windows 7
I've talked with users who say that they don't know which Linux distribution to use; with so many of them available, it's confusing.
If that's confusing, then how in the world do you figure out which Windows version to use? Windows Vista had six SKUs, and it looks like Windows 7 is going to have two. Or is it still six?
Sure, Microsoft is pushing the Home Premium edition and a Professional edition as its two primary versions. The third SKU that is supposedly not going to have an impact on all of this is Windows 7 Home Basic. Supposedly it will only be sold in emerging markets but, well, it still exists as a valid SKU.
Then there is the Starter edition. The what? The Starter edition. You know, for people who are just starting to use computers. It will run a whopping three applications simultaneously. That sounds suspiciously like Windows 7 Home Basic. Why anyone would want either one of those is beyond me.
By my count we are up to four. Enter Microsoft's Enterprise and Ultimate versions of Windows 7. Both contain applications that should be standard with any operating system, but not so for Microsoft when it can squeeze some more money from the faithful. That brings us to six.
Now go pick one of the three mainline distributions, any of them: Ubuntu, Fedora or OpenSuSE. No separate SKUs, no licensing hassles, and not one dime comes out of your wallet to install any of them. Install only the applications that you want, or better yet install everything. If you are a new user this is a great way to learn which applications you like to use for various tasks; usually two or three for any given task are included. You don't have to buy it, nor do you have to download a separate version to get a particular feature or application. Straightforward and simple.
Enjoy the freedom and lack of confusion!
But what about the fastest growing part of the PC market, the netbook? I don't believe netbooks will have enough power to run either of Microsoft's two primary editions, which is presumably where the Starter and Home Basic versions come in.