If you read Linda Tucci’s recent story on the efforts of Tasty Baking Co. to find a workable solution for trade promotion management, you may have wondered, as I did, why TPM can be such a challenge.
After all, most of the biggest retailers in the world practice some sort of trade promotion management, either via vendor or home-grown solutions. Yet there seems to be a lack of effective software tools for optimizing retail partner relationships, as well as few standards to rally around.
TPM is not a new concept either, but even a top analyst covering the field, Gartner’s Dale Hagemeyer, has not seen significant movement since his most recent report, “Seven Key Considerations When Choosing a TPM Solution.”
Yet Tasty’s CIO, Chan Kang, is faced with real issues as he seeks to work TPM into his tightening budget. Though the company’s direct store delivery model produces quality data, “What we don’t do enough is measure the effectiveness of those promotions: how much lift, what is the baseline, the incremental profit — in other words, whether it was a good idea,” Kang said.
Kang is evaluating vendors, but even though industry groups like Trade Promotion Management Associates and the Vendor Compliance Federation are working to promote solutions for TPM, Tasty could still be confronted by vendor lock-in and integration issues with whatever solution it chooses.
Some observers are skeptical that TPM standards can be achieved, but this is one area that seems like a no-brainer for the Oracles and SAPs of the world to come together for the common good. Such cooperation could only help to increase the bottom line — for everybody.
I don’t know about you, but to me, ITIL (or IT Infrastructure Library) is a little overwhelming. I’m only looking at using the ITIL framework as an IT service catalog tool, and I get a little lost.
Under ITIL guidelines, an IT service catalog is a subset of service-level management, which is a subset of service delivery. Service delivery is the topic of only one of eight ITIL books on IT Service Management (ITSM) guidelines, and that’s just in ITIL v2. ITIL v3 has five other books that update some of v2, but also introduce new ITSM strategies.
SearchCIO-Midmarket.com and SearchCIO.com recently ran a survey asking our readers about their ITIL use. We haven’t pulled together all of the results yet, but here’s a preview: When we asked readers to choose up to three areas in which they would like to see improvements to ITIL, they said:
- 35.4% — ITIL should provide more information on how ITIL works with other process-improvement methodologies, like Six Sigma and Lean.
- 31.1% — ITIL should offer more prescriptive advice vs. just guidance.
- 12.9% — ITIL should include more specific advice on transition from v2 to v3.
- 12.5% — ITIL needs to be clearer on the differences between v2 and v3.
- 8.1% — ITIL v3 is too complex.
So, it’s clear that people would like more guidance and less complexity, but anecdotally, a few IT shops and service providers I’ve talked to recently said that ITIL does just the opposite: It clears up some complexity.
When an IT service catalog is being put together, ITIL tells the business and IT what terminology to use, they say.
“ITIL gets people speaking the same language,” said Matt French, marketing director with Service-now, an ITSM Software as a Service provider. “It makes it clear what an incident or a request is, and helps an organization with [corporate] terminology [that is different across the company] use the same terminology.”
ITIL also helps IT set the right expectations for service delivery in terms of service levels and what is possible — and not possible — as far as services the business wants to see and what IT can realistically deliver. It does this by setting the scope of an IT service catalog project, including taking inventory of the skill sets IT has on hand (or not) to deliver a service, and helps organizations choose a set of standard services.
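To make that scoping exercise concrete, a catalog entry can be as simple as a structured record. This is a hypothetical sketch, not anything prescribed by ITIL; the field names, services and skill sets are all illustrative:

```python
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    """One standard service in an IT service catalog (illustrative fields only)."""
    name: str
    description: str
    service_level: str                  # e.g., "99.9% uptime, 4-hour response"
    required_skills: list[str] = field(default_factory=list)

# A minimal catalog: standard services the business can request
catalog = [
    CatalogEntry("Email", "Hosted mailbox, 2 GB quota",
                 "99.9% uptime", ["Exchange admin"]),
    CatalogEntry("VPN access", "Remote access to the internal network",
                 "Next-business-day setup", ["Network admin"]),
]

# Scoping check: flag services IT lacks the on-hand skills to deliver
on_hand = {"Exchange admin"}
gaps = [e.name for e in catalog if not set(e.required_skills) <= on_hand]
print("Skill gaps for:", gaps)
```

The point is less the code than the discipline: once services, service levels and required skills are written down in one place, the gap between what the business wants and what IT can realistically deliver becomes visible.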
Any advice on how to use ITIL to reduce complexity, or how you have been able to simplify ITIL at your shop? I’d like to hear from you. Email me at email@example.com.
If the sagging economy has forced midsized companies to delay hiring more IT staff, maybe migrating to Windows 7 can move them off that dime.
In a recent report by IDC, an impressive number of midsized companies that migrated to Windows 7 said they realized a full return on their investment in just seven months. Replacing Windows XP and Windows Vista also significantly reduced the time help desks spend dealing with malware, downtime and reboots.
One midsized company says the money saved in migrating to Windows 7 has allowed it to hire some much-needed developers.
“Windows 7 gave us more cash to work with because we could throw it on a couple of hundred older PCs, so we didn’t have to buy new ones. Those savings will let us hire a couple of young developers to work on some internal applications we need pushed out,” said Joe Harmon, an IT purchasing agent with a midsized regional health care provider in western New York state. “I was surprised. Microsoft usually costs me money with some of their licensing plans.”
Costs were down in three important labor categories analyzed in the report: IT labor hours per PC, per year for deployment (down 45%); IT labor hours per PC, per year for service desk support (down 65%); and IT labor hours per PC, per year for PC and operating system support (down 55%). In the 14 categories where a set of common end-user activities related to the operation of Windows 7 was measured, the savings amounted to 43 hours of productivity per year, per user.
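As a rough illustration of how those percentages compound, here is a back-of-the-envelope calculation. The baseline hours are hypothetical placeholders of my own; only the reduction percentages come from the report:

```python
# Hypothetical pre-migration IT labor hours per PC, per year (not from IDC)
baseline = {"deployment": 10.0, "service_desk": 20.0, "os_support": 15.0}

# Reductions reported after the Windows 7 migration
reduction = {"deployment": 0.45, "service_desk": 0.65, "os_support": 0.55}

saved = {k: baseline[k] * reduction[k] for k in baseline}
total_saved = sum(saved.values())
print(f"Hours saved per PC, per year: {total_saved:.2f}")

# Across a fleet, the savings scale linearly
fleet = 200  # e.g., the couple of hundred older PCs mentioned above
print(f"Fleet-wide: {total_saved * fleet:,.0f} hours per year")
```

Even with modest assumed baselines, a couple of hundred machines turn small per-PC reductions into thousands of reclaimed labor hours, which is where the budget room for new hires comes from.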
Like IT professionals at other midmarket companies, Harmon also migrated to Windows 7 because Microsoft’s technical support for Windows XP, which includes regular delivery of security patches, is ending. Harmon said the built-in security in Windows 7 is superior to that of Windows XP, so he won’t be as reliant on security patches.
It’s nice to hear that some financial relief has finally arrived for SMBs, given how the Great Recession has ravaged them.
Every once in a while I like to check out what Microsoft admins are downloading these days. I sometimes do a search on Google, but to get a feel for the work being done in Microsoft shops, I always return to the Microsoft Download Center.
It lists the top five free Microsoft downloads overall, and some of the usual suspects are there: Windows XP updates and Office compatibility tools. This goes to show that shops are still holding onto the older Windows OS — and moving some people to newer versions of Office.
Here are the top five free Microsoft downloads and the company’s descriptions for them:
Microsoft Office Compatibility Pack for Word, Excel and PowerPoint File Formats. Open, edit and save documents, workbooks and presentations in the Open XML file formats, which were introduced to Microsoft Office Word, Excel and PowerPoint beginning with Office 2007 and continuing with Office 2010.
DirectX End-User Runtime. Provides updates to 9.0c and previous versions of DirectX — the core Windows technology that drives high-speed multimedia and games on the PC.
Update for Windows XP (KB932823). Resolves an issue in which a user is unable to use Windows Internet Explorer 7 to download files on a computer that is running Windows XP with IME enabled.
.NET Framework Version 2.0 Redistributable Package (x86). Installs the .NET Framework runtime and associated files required to run applications developed to target the .NET Framework v2.0.
Microsoft .NET Framework 4 (Web Installer). Downloads and installs the .NET Framework components required to run on the target machine architecture and OS. An Internet connection is required during the installation. .NET Framework 4 is required to run and develop applications to target the .NET Framework 4.
Digging further, specifically looking at server management tools, what surprised me was that four of the top five weren’t tools in the sense that they fixed system problems. One is a case study on how a business benchmarked its PHP applications on Windows Server 2008, and there are a couple of how-tos, one on an Office Communication Server deployment, another on European data compliance.
Here are the free server management tools that have been downloaded the most out of 4,821 choices, and Microsoft’s descriptions:
Security Update for Windows Server 2008 RC0 for Itanium-based Systems (KB941644). A security issue has been identified in TCP/IP that could allow an attacker to compromise your Windows-based system and gain control over it.
Microsoft Windows Server 2000 Assessment Configuration Pack for European Union Data Protection Directive (EUDPD).
This configuration pack contains configuration items intended to help you establish and validate a desired configuration for your Windows 2000 servers in order to support your European Union Data Protection Directive compliance efforts.
Customer Solution Case Study: Windows Server 2008 Charts a Secure and Flexible Roadmap for Virtual Map.
Optimization at work in Microsoft. A presentation at an executive breakfast seminar, The Business Impact of Infrastructure Optimization, held Feb. 6, 2007.
So do these downloads sync up with what’s going on in your shop, or do you have a set of your own free tools that you can’t live without? I’d like to hear about it. Email me at firstname.lastname@example.org.
Google CEO Eric Schmidt’s statement last week about the end of online anonymity recalls similar words by one of his former colleagues, Scott McNealy of Sun, more than a decade ago.
McNealy caused a stir back in 1999 when he said, “You have zero privacy. … Get over it.” McNealy was referring to the Intel Pentium III processor, which had a feature that could uniquely identify a user. The Internet had nothing to do with privacy in that context, but at a time when people were just getting used to using their credit card online, everybody who heard the statement could make the leap to the perils of e-commerce.
Schmidt said that society is facing major disruptions due to the incredible amount of online data being generated (“5 exabytes [or 5 billion GBs] … every two days” he said) — mostly user-generated content via blogs, message boards, Twitter, Facebook, etc.
Online anonymity is a paradox. People hide behind an anonymous email or forum comments, but there are ways to track you down. In addition, Facebook users may forget that the pictures that they posted from Friday night’s trip to the bar can be seen by everybody, even their bosses.
Predictive analysis of consumer behavior is inevitable, he said. “If I look at enough of your messaging and your location, and use artificial intelligence,” Schmidt said, “we can predict where you are going to go.”
That sounds a lot like the movie version of Philip K. Dick’s Minority Report, where Tom Cruise is pelted with personalized messages from every billboard he passes. But this is not science fiction. Electronics retailer Best Buy is partnering with Shopkick to enable your smartphone to communicate with the store for promotions and rewards points as you shop.
Online anonymity will be a casualty of the data-saturated world, Schmidt said. “The only way to manage this is true transparency and no anonymity. In a world of asynchronous threats, it is too dangerous for there not to be some way to identify you. We need a [verified] name service for people. Governments will demand it.”
Ironic, isn’t it? People yelled in protest at McNealy’s words in 1999. I doubt Schmidt’s will make much of a ripple, since nowadays the consumer is a willing accomplice in the end of anonymity.
Get over it? Now it’s more like, who cares?
In the past few years, Ross Pettit, client principal at Chicago-based ThoughtWorks Inc., has seen a shift in client requests. The agile software development consulting firm’s projects are still mainly grounded in custom application development but, more often, he said, organizations want to apply agile best practices to non-IT-related projects. Pettit suggests following these initial steps when adopting an agile project approach:
Develop a release planning stage in which aspects of the project are divided into smaller, more manageable chunks. This stage involves defining the problem in business, not technical, terms, choosing and pairing up team members from different aspects of the business and assigning project facilitators who can step in to remove obstacles.
Every week there is a checkpoint to gauge progress, the understanding of the problem, what needs to be done next, and how the team is tracking against the agreed-upon solution path. By having the primary stakeholders involved, the business problem is laid out for all to see and obstacles can be removed. “Week after week, there is tremendous transparency and exposure to all the stakeholders …,” Pettit said. “This allows stakeholders to make the resources available for what needs to be done, immediately.”
Be retrospective during each checkpoint. Ask what worked well, what worked poorly, what was confusing, and what to change. “On a weekly basis, this builds in mechanisms that create continuous improvements … where you have continuous planning and tremendous visibility,” he said.
It’s OK to fail, but fail fast. “The nice thing about agile is that it is not only OK to fail, but it’s really good to fail,” he said. “The more often you try and fail, the more you learn about the problem in front of you.”
Start with the hardest nut to crack. Pettit has seen agile projects fall apart because project teams decide to go after the simpler tasks first. “Too often, I see projects fail because they were able to get two or three easy things accomplished, then they get to the more difficult ones they were putting off, and they say ‘I don’t know if we’re up to that one,’” he said.
And even though a project may be focused on a business problem, the CIO will be called on to act as an agile executive. The CIO is the one who can pay attention to all the data coming out of all the projects. “The CIO can make decisions outside of the context of the project, that others may not feel empowered to make, and will see problems that others may not by looking from the outside in,” he said.
Working on a new project and hitting a few bumps? Send me your project problems and I’ll post them on our blog — without your name, or company name, of course — to see if your peers have any advice. Email me at email@example.com.
For some firms that dole out smartphones to their employees, the mobile security policy might consist of remotely wiping the phone if it’s lost or stolen, or if the employee leaves the company. Other than that, users are left pretty much on their own.
That’s not enough, especially as smartphones and apps become smarter and more pervasive. It’s one thing for a company to provision its own smartphone. But many employees are bringing their own smartphones and doing company business on them — which some companies encourage, by the way, to cut down on cell, data plan and management costs.
Which makes last week’s news from the Black Hat conference all the more unnerving. Researchers announced that wallpaper apps on Android phones can collect and transmit data such as phone numbers and messages. In other cases, apps can be installed remotely and used to drop malicious payloads onto a phone.
Given the amount of work being done out of the palm of one’s hand these days, and the fact that mobile health care app stores are coming, the news is startling.
As with anything in security, enough is never enough. Checklists are a start, but which checklists are the right ones? For midmarket IT managers looking for some answers, SearchCIO-Midmarket.com’s sample mobile device management policies and templates are a good place to start.
I don’t want to spoil the fun or productivity that users are enjoying with iPhones, BlackBerrys and Androids, but if attention isn’t paid to them by IT security, then they will just become another access point into your networks.
How frustrating it is to be one of the little guys, without the resources to bring to bear on big projects. Well, those frustrations may actually be easier to deal with than the alternative: being a large company entrenched in legacy systems.
I came across a blog post recently by Scott Adams, describing a fantasy world where countries weren’t so paralyzed by legacy systems that they couldn’t move forward. Older systems have too many vested interests, making it difficult to change because “anything that has been around for a while is a complicated and inconvenient mess compared to what its ideal form could be,” Adams wrote.
This got me thinking about the real legacy systems holding back IT: The projects that never go anywhere, not because the project itself is bad, but because the change would overwhelm the systems, strategies, processes and workflows that may not integrate with the change. How, too often, the need for agility is acknowledged, but never addressed. And, more importantly here, how the bigger you are, the more difficult it is to change.
So here’s to all the small IT shops out there, where each member of the staff wears multiple hats and puts in the extra hours because, let’s face it, who else will? Being smaller means you’re closer to the top, closer to making a difference. Yes, your budget is probably smaller and you can’t get to all the things you know you need to tackle, but there are also far fewer hoops to jump through and approval signatures to get.
If you take all of that into consideration, maybe the grass is greener on this side: You can quickly adopt new technologies and there is less red tape. And you probably aren’t so buried in legacy systems that every infrastructure change requires duct tape and a sprinkle of IT magic to get it working.
You can’t talk about agile data modeling without mentioning Scott Ambler, whom many consider the authority on, if not the founder of, agile data modeling.
In this tip, Ambler takes readers through an iterative approach to requirements modeling. His definition of agile data modeling is straightforward: “Evolutionary data modeling is data modeling performed in an iterative and incremental manner … done in a collaborative manner.”
Taking an agile approach to data modeling can help resolve one of the most onerous data modeling problems: very slow and complex notation development.
As explained by Burton Group senior analyst Joe Maguire, building data models takes so long because the practitioners developing notations aren’t seeking simplicity. “People designing notations are trying to get tenure, a PhD, when they design notations,” he said. “They don’t get tenure for making things simpler.”
Which leads vendors to incorporate more complex notation designs into their tools, he said. Data modelers are stuck working with these tools and forcing users (the consumers of this data) to provide information that covers every aspect of the requirements in these notations.
So how can agile data modeling help? Agile recognizes that there will be change, whether you are developing software or a data model, but beyond that it allows you to build a data model with less rigorous notations. System requirements, or information requirements, that are especially subject to change and ones that are less susceptible to change can be identified and put in different notation buckets.
“Then you can alter modeling behavior to model stable requirements earlier in the process with confidence, and not model unstable requirements too early in the process. That helps accelerate the process of data modeling,” he said.
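A minimal sketch of the bucketing idea Maguire describes, separating stable from volatile requirements so the stable ones can be modeled early with confidence. The requirement names, volatility scores and threshold here are all illustrative assumptions, not anything from Burton Group:

```python
# Partition information requirements by how likely they are to change,
# so low-churn ones can be modeled early and high-churn ones deferred.
requirements = [
    ("customer_id", 0.05),        # (requirement, estimated volatility 0..1)
    ("order_total", 0.10),
    ("loyalty_tier_rules", 0.80),
    ("promo_bundle_layout", 0.90),
]

THRESHOLD = 0.5  # arbitrary cut line between "stable" and "volatile"

stable = [name for name, v in requirements if v < THRESHOLD]
volatile = [name for name, v in requirements if v >= THRESHOLD]

print("Model now:", stable)    # stable requirements: model early, in detail
print("Defer:", volatile)      # volatile requirements: don't over-model yet
```

In practice the volatility estimate would come from the business significance of each requirement, as Maguire suggests, rather than a made-up score; the point is simply that the two buckets get different modeling rigor at different times.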
And this needs to be done on a business, not technology level. “If the taxonomy in my head as data modeler is motivated by the business significance of the requirements, that’s something I think agile techniques would really improve, if [data modelers] took that kind of taxonomy to heart.”
This is only a snippet of his thoughts on agile data modeling and a much larger issue Maguire is taking on at Burton Group’s Catalyst conference in San Diego this week. Here’s the overarching theme of his talk: Agile software development has an offshoot: Agile data modeling. The decision to use agile data modeling depends on local realities about information architecture, buy-or-build approaches to software, democratization of data access, and the company’s commitment to business analytics. For IT strategists, the decision to use agile data modeling is further complicated by the rancor that accompanies the debate. The session will touch on:
- Why are typical data-modeling initiatives so cumbersome, and what can be done about that?
- Why do data modelers resist agile data modeling? Is the resistance justified? Unjustified?
- How can IT leaders mediate the disputes between agile software developers and data management professionals? Are these disputes a healthy phenomenon, or a symptom?
- What kind of business environments and IT initiatives are amenable to agile data modeling?
- How can agile data modeling techniques be improved to become even more agile?
I’d like to hear from you if you are taking on agile data modeling, and learn what obstacles or benefits you’ve encountered as a result. Email me at firstname.lastname@example.org.
A lack of formal standards has made cloud computing nebulous so far. If a company wants to switch vendors, move platforms or build its own infrastructure, the confusion mounts, since most providers use a proprietary platform like Amazon’s EC2 or Salesforce’s Force.com.
The OpenStack project could improve portability and reduce costs by standardizing more features, meaning users will be able to move among vendors and even bring the cloud in-house down the road. While OpenStack provides compute capacity and storage on demand, serving the needs of larger companies looking to build complex cloud environments, the standards it is setting could also make it an attractive option for SMBs.
Other open source offerings are available from Eucalyptus, which provides infrastructure software for building cloud computing environments, and from Cloud.com, which offers a free community edition alongside its open source and proprietary editions.
While the OpenStack project itself was designed to serve scientific computing needs, it also aims to fill another gap in the cloud computing marketplace: As a cloud platform that doesn’t have a single commercial owner, it puts the business back in charge of its own data.
Could the standards and portability these open source cloud projects aim for open the door for widespread midmarket adoption?