Every once in a while I like to check out what Microsoft admins are downloading these days. I sometimes do a search on Google, but to get a feel for the work being done in Microsoft shops, I always return to the Microsoft Download Center.
It lists the top five free Microsoft downloads in general, with some of the usual suspects: Windows XP updates and Office compatibility tools. This goes to show that shops are still holding onto the older Windows OS while moving some people to newer versions of Office.
Here are the top five free Microsoft downloads and the company’s descriptions for them:
Microsoft Office Compatibility Pack for Word, Excel and PowerPoint File Formats. Open, edit and save documents, workbooks and presentations in the Open XML file formats, which were introduced to Microsoft Office Word, Excel and PowerPoint beginning with Office 2007 and continuing with Office 2010.
DirectX End-User Runtime. Provides updates to 9.0c and previous versions of DirectX — the core Windows technology that drives high-speed multimedia and games on the PC.
Update for Windows XP (KB932823). Resolves an issue in which a user is unable to use Windows Internet Explorer 7 to download files on a computer that is running Windows XP with IME enabled.
.NET Framework Version 2.0 Redistributable Package (x86). Installs the .NET Framework runtime and associated files required to run applications developed to target the .NET Framework v2.0.
Microsoft .NET Framework 4 (Web Installer). Downloads and installs the .NET Framework components required to run on the target machine architecture and OS. An Internet connection is required during the installation. .NET Framework 4 is required to run and develop applications to target the .NET Framework 4.
Digging further, specifically looking at server management tools, what surprised me was that four of the top five weren’t tools in the sense that they fixed system problems. One is a case study on how a business benchmarked its PHP applications on Windows Server 2008, and there are a couple of how-tos, one on an Office Communications Server deployment, another on European data compliance.
Here are the free server management tools that have been downloaded the most out of 4,821 choices, and Microsoft’s descriptions:
Security Update for Windows Server 2008 RC0 for Itanium-based Systems (KB941644). A security issue has been identified in TCP/IP that could allow an attacker to compromise your Windows-based system and gain control over it.
Microsoft Windows Server 2000 Assessment Configuration Pack for European Union Data Protection Directive (EUDPD). This configuration pack contains configuration items intended to help you establish and validate a desired configuration for your Windows 2000 servers in order to support your European Union Data Protection Directive compliance efforts.
Customer Solution Case Study: Windows Server 2008 Charts a Secure and Flexible Roadmap for Virtual Map.
Optimization at work in Microsoft. A presentation at an executive breakfast seminar, The Business Impact of Infrastructure Optimization, held Feb. 6, 2007.
So do these downloads sync up with what’s going on in your shop, or do you have a set of your own free tools that you can’t live without? I’d like to hear about it. Email me at email@example.com.
Google CEO Eric Schmidt’s statement last week about the end of online anonymity recalls similar words by one of his former colleagues, Scott McNealy of Sun, more than a decade ago.
McNealy caused a stir back in 1999 when he said, “You have zero privacy. … Get over it.” McNealy was referring to the Intel Pentium III processor, which had a feature that could uniquely identify a user. The Internet had nothing to do with that particular privacy threat, but at a time when people were just getting used to using their credit cards online, everybody who heard the statement could make the leap to the perils of e-commerce.
Schmidt said that society is facing major disruptions due to the incredible amount of online data being generated (“5 exabytes [or 5 billion GBs] … every two days” he said) — mostly user-generated content via blogs, message boards, Twitter, Facebook, etc.
Online anonymity is a paradox. People hide behind anonymous email addresses or forum comments, but there are ways to track them down. In addition, Facebook users may forget that the pictures they posted from Friday night’s trip to the bar can be seen by everybody, even their bosses.
Predictive analysis of consumer behavior is inevitable, he said. “If I look at enough of your messaging and your location, and use artificial intelligence,” Schmidt said, “we can predict where you are going to go.”
That sounds a lot like the movie version of Philip K. Dick’s Minority Report, where Tom Cruise is pelted with personalized messages from every billboard he passes. But this is not science fiction. Electronics retailer Best Buy is partnering with Shopkick to enable your smartphone to communicate with the store for promotions and rewards points as you shop.
Online anonymity will be a casualty of the data-saturated world, Schmidt said. “The only way to manage this is true transparency and no anonymity. In a world of asynchronous threats, it is too dangerous for there not to be some way to identify you. We need a [verified] name service for people. Governments will demand it.”
Ironic, isn’t it? People yelled in protest at McNealy’s words in 1999. I doubt Schmidt’s will make much of a ripple, since nowadays the consumer is a willing accomplice in the end of anonymity.
Get over it? Now it’s more like, who cares?
In the past few years, Ross Pettit, client principal at Chicago-based ThoughtWorks Inc., has seen a shift in client requests. The agile software development consulting firm’s projects are still mainly grounded in custom application development but, more often, he said, organizations want to apply agile best practices to non-IT-related projects. Pettit suggests following these initial steps when adopting an agile project approach:
Develop a release planning stage in which aspects of the project are divided into smaller, more manageable chunks. This stage involves defining the problem in business, not technical, terms, choosing and pairing up team members from different aspects of the business and assigning project facilitators who can step in to remove obstacles.
Hold a weekly checkpoint to gauge progress, the understanding of the problem, what needs to be done next, and how the team is tracking against the agreed-upon solution path. By having the primary stakeholders involved, the business problem is laid out for all to see and obstacles can be removed. “Week after week, there is tremendous transparency and exposure to all the stakeholders …,” Pettit said. “This allows stakeholders to make the resources available for what needs to be done, immediately.”
Be retrospective during each checkpoint. Ask what worked well, what worked poorly, what was confusing, and what to change. “On a weekly basis, this builds in mechanisms that create continuous improvements … where you have continuous planning and tremendous visibility,” he said.
It’s OK to fail, but fail fast. “The nice thing about agile is that it is not only OK to fail, but it’s really good to fail,” he said. “The more often you try and fail, the more you learn about the problem in front of you.”
Start with the hardest nut to crack. Pettit has seen agile projects fall apart because project teams decide to go after the simpler tasks first. “Too often, I see projects fail because they were able to get two or three easy things accomplished, then they get to the more difficult ones they were putting off, and they say ‘I don’t know if we’re up to that one,’” he said.
And even though a project may be focused on a business problem, the CIO will be called on to act as an agile executive. The CIO is the one who can pay attention to all the data coming out of all the projects. “The CIO can make decisions outside of the context of the project, that others may not feel empowered to make, and will see problems that others may not by looking from the outside in,” he said.
Working on a new project and hitting a few bumps? Send me your project problems and I’ll post them on our blog — without your name, or company name, of course — to see if your peers have any advice. Email me at firstname.lastname@example.org.
For some firms that dole out smartphones to their employees, the mobile security policy might consist of remotely wiping the phone if it’s lost or stolen, or if the employee leaves the company. Other than that, users are left pretty much on their own.
That’s not enough, especially as smartphones and apps become smarter and more pervasive. It’s one thing for a company to provision its own smartphone. But many employees are bringing their own smartphones and doing company business on them — which some companies encourage, by the way, to cut down on cell, data plan and management costs.
Which makes last week’s news from the Black Hat conference all the more unnerving. Researchers announced that wallpaper apps on Android phones can collect and transmit data such as phone numbers and messages. In other cases, apps can be installed remotely, which could be used to drop malicious payloads onto a phone.
Given the amount of work being done out of the palm of one’s hand these days, and the fact that mobile health care app stores are on the way, the news is startling.
As with anything in security, enough is never enough. Checklists are a start, but which checklists are the right ones? For midmarket IT managers looking for some answers, SearchCIO-Midmarket.com’s sample mobile device management policies and templates are a good place to start.
I don’t want to spoil the fun or productivity that users are enjoying with iPhones, BlackBerrys and Androids, but if attention isn’t paid to them by IT security, then they will just become another access point into your networks.
How frustrating is it to be one of the little guys, without the resources to bring to bear on big projects? Well, those frustrations may actually be easier to deal with than the alternative: being a large company entrenched in legacy systems.
I came across a blog post recently by Scott Adams, describing a fantasy world where countries weren’t so paralyzed by legacy systems that they couldn’t move forward. Older systems have too many vested interests, making it difficult to change because “anything that has been around for a while is a complicated and inconvenient mess compared to what its ideal form could be,” Adams wrote.
This got me thinking about the real legacy systems holding back IT: The projects that never go anywhere, not because the project itself is bad, but because the change would overwhelm the systems, strategies, processes and workflows that may not integrate with the change. How, too often, the need for agility is acknowledged, but never addressed. And, more importantly here, how the bigger you are, the more difficult it is to change.
So here’s to all the small IT shops out there, where each member of the staff wears multiple hats and puts in the extra hours because, let’s face it, who else will? Being smaller means you’re closer to the top, closer to making a difference. Yes, your budget is probably smaller and you can’t get to all the things you know you need to tackle, but there are also far fewer hoops to jump through and approval signatures to get.
If you take all of that into consideration, maybe the grass is greener on this side: You can quickly adopt new technologies and there is less red tape. And you probably aren’t so buried in legacy systems that every infrastructure change requires duct tape and a sprinkle of IT magic to get it working.
You can’t talk about agile data modeling without mentioning Scott Ambler, whom many consider the authority on, if not the founder of, agile data modeling.
In this tip, Ambler takes readers through an iterative approach to requirements modeling. His definition of agile data modeling is straightforward: Evolutionary data modeling is data modeling performed in an iterative and incremental manner … done in a collaborative manner.
Taking an agile approach to data modeling can help resolve one of the most onerous data modeling problems: model development slowed down by overly complex notations.
As explained by Burton Group senior analyst Joe Maguire, building data models takes so long because the practitioners developing notations aren’t seeking simplicity. “People designing notations are trying to get tenure, a PhD, when they design notations,” he said. “They don’t get tenure for making things simpler.”
Which leads vendors to incorporate more complex notation designs into their tools, he said. Data modelers are stuck working with these tools and forcing users (the consumers of this data) to provide information that covers every aspect of the requirements in these notations.
So how can agile data modeling help? Agile recognizes that there will be change, whether you are developing software or a data model, but beyond that it allows you to build a data model with less rigorous notations. System requirements, or information requirements, that are especially subject to change and ones that are less susceptible to change can be identified and put in different notation buckets.
“Then you can alter modeling behavior to model stable requirements earlier in the process with confidence, and not model unstable requirements too early in the process. That helps accelerate the process of data modeling,” he said.
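Maguire’s idea of sorting requirements into stable and unstable buckets can be sketched in a few lines of code. This is a hypothetical illustration only — the `Requirement` type, the volatility scores and the threshold are invented for the example, not anything prescribed by Maguire or Burton Group:

```python
from dataclasses import dataclass

@dataclass
class Requirement:
    name: str
    volatility: float  # hypothetical score: 0.0 = stable, 1.0 = very likely to change

def bucket_requirements(reqs, threshold=0.5):
    """Split requirements into stable and unstable modeling buckets.

    Stable requirements can be modeled early in the process with
    confidence; unstable ones are deferred to later iterations.
    """
    stable = [r for r in reqs if r.volatility < threshold]
    unstable = [r for r in reqs if r.volatility >= threshold]
    return stable, unstable

reqs = [
    Requirement("customer identity", 0.1),  # well understood, unlikely to change
    Requirement("loyalty tiers", 0.8),      # business rules still in flux
]
stable, unstable = bucket_requirements(reqs)
```

The point is not the code itself but the discipline it encodes: the split is driven by the business significance and stability of each requirement, not by the technology used to model it.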
And this needs to be done on a business, not technology level. “If the taxonomy in my head as data modeler is motivated by the business significance of the requirements, that’s something I think agile techniques would really improve, if [data modelers] took that kind of taxonomy to heart.”
This is only a snippet of his thoughts on agile data modeling and a much larger issue Maguire is taking on at Burton Group’s Catalyst conference in San Diego this week. Here’s the overarching theme of his talk: Agile software development has an offshoot: Agile data modeling. The decision to use agile data modeling depends on local realities about information architecture, buy-or-build approaches to software, democratization of data access, and the company’s commitment to business analytics. For IT strategists, the decision to use agile data modeling is further complicated by the rancor that accompanies the debate. The session will touch on:
- Why are typical data-modeling initiatives so cumbersome, and what can be done about that?
- Why do data modelers resist agile data modeling? Is the resistance justified? Unjustified?
- How can IT leaders mediate the disputes between agile software developers and data management professionals? Are these disputes a healthy phenomenon, or a symptom?
- What kind of business environments and IT initiatives are amenable to agile data modeling?
- How can agile data modeling techniques be improved to become even more agile?
I’d like to hear from you if you are taking on agile data modeling, and learn what obstacles or benefits you’ve encountered as a result. Email me at email@example.com.
A lack of formal standards has made cloud computing nebulous so far. If a company wants to switch vendors, move platforms or build its own infrastructure, the confusion mounts, since most providers use a proprietary platform like Amazon’s S3 or Salesforce’s Force.com.
The OpenStack project could improve portability and reduce costs through more standard features, meaning users will be able to move among vendors and even move the cloud in-house down the road. While OpenStack provides storage and processing power on demand, serving the needs of larger companies looking to build complex cloud environments, the standards it is setting could also make it an attractive option for SMBs.
Other open source offerings are available from Eucalyptus (providing infrastructure software to support the creation of cloud computing environments) and Cloud.com, which offers packages in a free community edition as well as open source and proprietary editions.
While the OpenStack project itself was designed to serve scientific computing needs, it also aims to fill another gap in the cloud computing marketplace: As a cloud platform that doesn’t have a single commercial owner, it puts the business back in charge of its own data.
Could the standards and portability these open source cloud projects aim for open the door for widespread midmarket adoption?
I’ve been touching base with IT folks and analysts lately to get a feel for the projects people are working on these days, and the phrases that keep coming up are IT consolidation, modernization and innovation.
A senior network administrator with a national hospice care provider is working on a data center consolidation project. The C-levels are pretty keen on leveraging cloud services to outsource IT functions, but he’s not convinced it will happen.
“From what I’ve seen, the ROI for the cloud is not there yet,” he said. “When you factor in the actual costs of building it in-house vs. hosting, it’s a lot cheaper to do it in-house.”
So that is the back and forth under way at his company: IT consolidation — Do we do it ourselves, go with a cloud provider or take a hybrid approach?
The task ahead for IT is threefold, explains Burton Group analyst Joe Bugajski. IT needs to contract, modernize and innovate. SearchCIO.com senior writer Linda Tucci has written a series of stories with advice from CIOs on their approaches to IT innovation.
To get to the point at which IT can help the business innovate, it first has to solve an age-old problem: IT modernization, the subject of Bugajski’s talk at next week’s Burton Group Catalyst conference in San Diego.
And tied to modernization is consolidation. To get started, the CIO and IT will have to start talking to the business to figure out what technology, what applications and what projects are really needed — and what can go. That’s where innovation can begin, he explains. IT consolidation will help free up the capital that will allow you to start innovating and modernizing, vs. just maintaining.
The marching orders to modernize and contract need to come from the top — meaning the board, because C-levels can come and go, he said. So I’m wondering: What are your marching orders? Are you being told to consolidate, modernize or innovate, or all of the above? Email me at firstname.lastname@example.org.
You share, collaborate and inform via email, phone, instant message and more — all because face-to-face time can be difficult to pull off given our increasingly busy schedules. And while some organizations have video conferencing capabilities to bring remote workers and office dwellers together, smaller organizations may not opt for such services.
However, a rise in mobile video offerings, such as the iPhone 4’s FaceTime and the Cisco Cius, could bring video anytime, anywhere. So how important will mobile video conferencing be to the enterprise?
On the one hand, mobile video conferencing will be a way to include traveling executives or remote workers in important meetings. On the other hand, these outward-bound employees may not always be in a private, quiet space that will allow them to fully participate.
The results of a 2009 CIMI Corp. survey of collaborative behavior showed that fewer than 10% of mobile workers believed mobile video conferencing was helpful. The off-site workers did not feel they could actively participate in mobile video conferencing because of a lack of facilities, a lack of network capacity to support the connection and a lack of privacy.
Another consideration is the potential for impromptu video conferences. Organizations with traditional conference rooms may have to plan their conferences weeks in advance. Conference rooms with dedicated telepresence technology could book up quickly, but utilizing mobile technology on a tablet or similar device would allow users to pop into any room for a similar experience.
Plus, there is real value in being able to show someone explicitly how to do something and being able to read visual cues that could encourage communication and collaboration. Without these cues, it’s difficult to know when it’s appropriate to “jump in” on a voice conference.
The costs and benefits have to be weighed (and one may tip the scale in your organization), but remember: It’s not just about buying these devices — you have to consider changing work patterns and other business impacts.
In the past few weeks I’ve been writing about agile projects, Scrum vs. waterfall or using the two project approaches together, and a theme that keeps coming up when I talk to project managers is cutting the fat.
For a software development project to be agile, Alex Keenan, an ERP analyst and agile project team member with a large grocery chain, believes waste should be pinpointed as you plan and implement any project and, with agile, continue to be identified with each project iteration.
He points to a Forrester Research chart that compares lean methodologies used in manufacturing and their counterparts for software development.
Sources of waste in manufacturing include: overproduction, waiting (time on hand), unnecessary transport, overprocessing or incorrect processing, excess inventory, unnecessary movement, defects and unused employee creativity.
The application development equivalents, according to Forrester Research, are: too many superfluous artifacts, broken builds, too many tool transitions, rigid architectures, analysis paralysis, late discovery of defects, rising downstream labor costs, polluted supply chain management streams and measuring effort rather than results.
CIO Niel Nickolaisen also lays out how to translate lean methodologies to IT projects in a recent column.
From lean methodologies, I have learned about the seven forms of waste and how to map those to IT processes:
• Rework: How many times in IT do we ask for do-overs?
• Waiting: For example, we often have to wait for the input of a subject-matter expert before we can make a decision.
• Overprocessing: Research tells us that 64% of the software features and functions we develop are rarely, if ever, used.
• Excess inventory: A recent study found that about 40% of our software licenses are shelfware.
• Excess motion: How many of our status reporting mechanisms actually generate value rather than churn?
• Excess movement: Am I the only one whose projects get caught in multiple decision loops?
• Overproduction: How many of us buy licenses in large batches, rather than just in time?
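Nickolaisen’s mapping of the seven wastes to IT symptoms is essentially a lookup table. As a minimal sketch, it could be captured as a simple dictionary that an IT shop might use to tag process issues in a retrospective; the structure restates the list above, and any use beyond illustration is an assumption:

```python
# The seven lean wastes mapped to their IT-process symptoms,
# restated from Nickolaisen's list above.
LEAN_WASTE_IN_IT = {
    "rework": "do-overs requested on work already delivered",
    "waiting": "decisions blocked on subject-matter expert input",
    "overprocessing": "features built that are rarely, if ever, used",
    "excess inventory": "software licenses sitting as shelfware",
    "excess motion": "status reporting that generates churn, not value",
    "excess movement": "projects caught in multiple decision loops",
    "overproduction": "licenses bought in large batches, not just in time",
}

def tag_waste(issue_description, waste_form):
    """Label a retrospective issue with its lean waste form (hypothetical helper)."""
    if waste_form not in LEAN_WASTE_IN_IT:
        raise ValueError(f"unknown waste form: {waste_form}")
    return {"issue": issue_description, "waste": waste_form}
```

Encoding the taxonomy this way keeps retrospectives consistent: every issue gets tied to one of the seven named forms rather than an ad hoc label.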
These are a few takes on applying lean methodologies to IT projects and processes. What works for you? Email me at email@example.com.