A few days ago, I wrote a blog post talking about how a recent hardware upgrade had maxed out my machine’s Windows Experience Index. Last night I received a very rude E-mail message in regard to that posting. I wish that I had kept the message, because I would have quoted it directly. Since I don’t have the message anymore, I will try to give you a sentence from the message, hopefully without misquoting the sender:
“Those who are not familiar with hardware might be impressed by your claims, but I seriously doubt your honesty in this matter.”
You can imagine why I was a bit irritated by the message. My first reaction was to simply take a screen capture of my Windows Experience Index and post it for the world to see. When I thought about it though, I began to realize that a screen capture wouldn’t prove anything anyway. It’s too easy to manipulate the Windows Experience Index.
If you would like to try it for yourself, open Windows Explorer and go to \Windows\Performance\WinSAT\DataStore. Open the most recent assessment file using WordPad. The assessment file is in XML format, and you can change the scores to anything that you want. If you look at the figure below, you can see that I was able to change the scores to 9.9, even though Vista doesn’t actually use any scores higher than 5.9. Incidentally, this screen capture did not come from the same machine that I performed my recent upgrade on.
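To give you an idea of just how easy this kind of tampering is, here is a minimal sketch in Python. The sample fragment below only mimics the general layout of a WinSAT assessment file (the real files contain far more data, and the exact element names may vary between builds), but it shows that nothing stops you from rewriting the scores:

```python
import xml.etree.ElementTree as ET

# A made-up fragment mimicking the score section of a WinSAT
# assessment file; real files are much larger.
sample = """<WinSAT>
  <WinSPR>
    <SystemScore>5.0</SystemScore>
    <MemoryScore>5.9</MemoryScore>
    <CpuScore>5.9</CpuScore>
    <GraphicsScore>5.0</GraphicsScore>
    <GamingScore>5.4</GamingScore>
    <DiskScore>5.6</DiskScore>
  </WinSPR>
</WinSAT>"""

root = ET.fromstring(sample)
spr = root.find("WinSPR")

# Rewrite every score to 9.9 -- nothing validates the values.
for score in spr:
    score.text = "9.9"

print(ET.tostring(root, encoding="unicode"))
```

The point is simply that the scores are stored as plain text with no signature or checksum, so a screen capture of the index proves nothing.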
OK, so back to the issue of the score that I got after my upgrade. For those of you who may be wondering, I was using fairly high end hardware, but certainly not top notch. It really isn’t that difficult to get a score of 5.9. The hardware that I was using included:
An ASUS M3A78 system board
An AMD Phenom II Black Edition, quad core CPU
4 GB of RAM
An NVIDIA GeForce GTX 285 graphics card (the super clocked edition with 2 GB of RAM).
It’s been 24 hours since my last post. At this point, it seems as though disabling the virtual memory on my DPM 2007 server was only a partial solution. My protected volumes have remained consistent, but my Exchange storage groups have become inconsistent, and I am unable to make them consistent.
I knew that disabling the virtual memory was a risky solution, but I wanted to try it anyway. I guess my next move will be to replace the server’s system board with something that can accommodate more RAM.
Even though I operate what is essentially a one man shop, I use System Center Data Protection Manager 2007 (DPM) to protect my data. Although my DPM server has been working relatively well for quite some time now, I have been getting a lot of errors lately due to inconsistent replicas.
I spoke to someone at Microsoft about the problem, and they asked about my server’s hardware. My server met all of the requirements, but they explained to me that the 2 GB of memory that is required is a minimum, and that in many cases more memory is actually required. From what I have been able to gather, if the server pages data to virtual memory during replica creation or synchronization, it will cause the replica to become inconsistent.
Unfortunately, I am operating DPM on an older server, and the server’s system board cannot accommodate more than 2 GB of RAM. Since I can’t add any more memory to the server, and replacing the server isn’t an option right now, I tried disabling the server’s virtual memory.
Only time will tell if this was a good idea or a bad idea, but as of right now all of my replicas are consistent. I’ll keep you updated on how this idea ends up working out.
When Microsoft first introduced Vista, one of the new features that was introduced along with it was the Windows Experience index. The basic idea was that the performance of various system components was rated and assigned a numerical score. The lowest score represented your overall Windows Experience Index.
At the time, Microsoft wanted to simplify things for home users by getting software publishers to list a minimum experience index rating for applications rather than firm hardware requirements. In other words, if an application required a minimum score of 2.0, and your computer was rated at 3.0, then you could be guaranteed that the application would work. Well, that was the theory anyway. Ultimately, I don’t know of any software publishers that actually began using the Windows Experience Index.
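The scoring model itself is simple enough to sketch in a few lines. The component names and values below are hypothetical (not pulled from an actual assessment), but they illustrate how the overall index is derived and how the publisher-rating idea was supposed to work:

```python
# Hypothetical subscores; Vista caps each one at 5.9.
scores = {
    "processor": 5.9,
    "memory": 5.4,
    "graphics": 5.0,
    "gaming_graphics": 5.2,
    "disk": 5.6,
}

# The overall Windows Experience Index is simply the lowest subscore.
base_score = min(scores.values())
print(base_score)  # 5.0

def meets_requirement(base_score, required):
    # The theory: an application listing a required index of 2.0
    # should run on any machine whose base score is at least 2.0.
    return base_score >= required

print(meets_requirement(base_score, 2.0))  # True
```

Because the weakest component sets the whole score, one slow part (a laptop hard disk, say) drags the entire index down no matter how fast everything else is.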
Although nobody really uses the Windows Experience Index to determine what applications they can run, I do use it to test the effectiveness of hardware upgrades on my primary workstation. Tonight something interesting happened though. My computer previously had scores ranging from 5.0 to 5.9 and had an overall score of 5.0. I ended up replacing the system board, the CPU, the power supply, and the graphics card. This time when I checked the Windows Experience Index, all of my scores were 5.9.
What I didn’t realize is that in Windows Vista, the scores are capped at 5.9. No matter how good your hardware is, you will never score higher than 5.9. This means that there will eventually come a time when the Windows Experience Index becomes meaningless as a benchmark because of improvements in hardware. Windows 7 also includes the Windows Experience Index, and the maximum score has been raised to 7.9. Even so, I have to question the effectiveness of the index if it is capped. Oh well, I guess that’s what the Performance Monitor is for.
It’s kind of funny how timing works out sometimes. About a week ago I mentioned in one of my blog posts that I hadn’t seen an automatic installation kit for Windows 7 yet. Today Microsoft sent an E-mail message to beta testers asking them to try out some of the new Windows 7 tools. Among these tools were:
- The Microsoft Assessment and Planning Toolkit
- The Application Compatibility Toolkit
- The User State Migration Tool
- The Microsoft Deployment Toolkit
None of these tools are really new. We had all of them in Vista, and some have been around for longer than that. I haven’t had a chance to download any of the new versions yet, but it will be interesting to see how much or how little these tools might have changed.
Lately I have been doing a lot of work with the Windows 7 Release Candidate. One strange issue that I keep running into is that I keep losing my wireless networking connection.
I have been using an HP laptop with 4 GB of RAM to evaluate Windows 7. For some reason, my wireless network connection will drop for no apparent reason. If I wait for half an hour or so, the connection will come back, but in fifteen minutes or so it will drop again. During periods of unavailability, the machine seems to be able to see my access point, but it cannot communicate with it no matter what I try.
I suspected that a hardware failure might be to blame, so I blew out Windows 7 and installed Vista on the machine, and the problem went away when I did. I put Windows 7 back, and my connectivity issues returned with it.
The problem does not seem to affect wired network connections as far as I can tell. I was just curious as to whether or not anyone else has run into this issue.
For quite a while now we have known that the next Windows release is going to be Windows 7. The interesting thing is though, that Windows 7 will use an internal version number of 6.1. Actually, the version number is going to be longer than that, but we won’t know the full version number until Windows 7 goes RTM. In case you are wondering why I am even mentioning this, it is because Windows 6 is Vista.
You might initially assume that the reason why Windows 7’s version number makes it look like a Vista update is because Windows 7 was designed around the Windows Vista kernel. That’s not it though. According to my sources, the version number mimics Vista in an effort to avoid compatibility issues with applications that were designed to run on Vista.
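The kind of compatibility problem my sources are describing is easy to illustrate. The check below is hypothetical, but it mirrors a common mistake: a developer tests on Vista and hard-codes the major version instead of checking for "6.0 or later":

```python
def app_supports_os(major, minor):
    # A hypothetical (but classic) naive compatibility check:
    # tested on Vista (6.0), so the developer pinned major == 6.
    return major == 6

# Vista reports 6.0 and Windows 7 reports 6.1, so both pass.
print(app_supports_os(6, 0))  # True
print(app_supports_os(6, 1))  # True

# Had Windows 7 reported 7.0 instead, the same application would
# wrongly refuse to run on a newer, fully compatible OS.
print(app_supports_os(7, 0))  # False
```

By keeping the major version at 6, Microsoft lets sloppy version checks like this one keep working on Windows 7.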
I’m in Redmond right now, and as you would probably expect everybody up here is excited about Windows 7. Although the level of excitement that seems to be present on Microsoft’s campus is about the same as what I have experienced near the time of any other major Windows release, something funny happened today that makes me think that there might be just a little bit more fanfare than usual.
As I’m sure you know, the latest Windows 7 release candidate was made available on TechNet early last week. On Friday I got an E-mail from someone at Microsoft saying that even though the latest Windows 7 build was available for download, they just wanted to make absolutely sure that I had a copy, so they were going to be sending me a DVD in the mail. They also told me that if I had already downloaded the build, I could just give the DVD to a friend.
I thought that was a nice gesture, but didn’t really give it a lot of thought beyond that. This evening when I got back to the hotel I got a phone call from my wife who told me that I had received a package in the mail from Microsoft. To be perfectly honest, I was tired from working all day and Windows 7 was the last thing on my mind, so I wasn’t sure what they had sent me. I asked her to open the package. When she did she told me that there were eight copies of Windows 7 inside! I guess someone in Redmond is a little bit excited about the new release (as they should be)!
I just installed the Windows 7 release candidate for the third time. For some reason this hadn’t registered with me until now, but the installer is actually designed to detect any wireless networks that happen to be in range. You can set up wireless networking as a part of the initial installation process. I absolutely love this feature, because it means that when Setup is complete, you are already connected to the network and are ready to go.
I haven’t seen the automated installation kit for Windows 7 yet (if there is one), but I wouldn’t be a bit surprised if you could specify a wireless network and an encryption key within the installation parameters.
I would like to apologize to everyone for the lack of blog posts lately. I have been hard at work on my latest book, and that has been consuming all of my time. Thankfully, I wrapped up the book this morning, and I hope to return to a more normal blog posting schedule.
The book that I just finished is not solely mine; I actually only wrote five chapters of it. Even so, writing this book was a learning experience, as it always is. This particular book was about Windows Essential Business Server.
Any time that I write a book, I always set up a lab so that I can test the techniques that I am writing about. This time I decided to use virtual machines rather than physical ones. According to Microsoft, Essential Business Server can be virtualized. Even so, I really struggled with getting Essential Business Server to work right in a virtual server environment.
If you have ever worked with Essential Business Server, then you know that there are three servers that make up the server suite: a management server, a messaging server, and a security server. Ultimately, I was able to virtualize the management and security servers, but I ended up having to run the messaging server on a separate box.
Another issue that I struggled with was network connectivity. For some reason, my host server (which had always worked fine in the past) periodically dropped my network connectivity. I have not had this problem with any other virtual machines, so I suspect that the problem may be linked to Essential Business Server. So far I have not been able to come up with definitive proof though, so I plan to continue researching this issue.