For some firms that dole out smartphones to their employees, the mobile security policy might consist of remotely wiping the phone if it’s lost or stolen, or if the employee leaves the company. Other than that, users are left pretty much on their own.
That’s not enough, especially as smartphones and apps become smarter and more pervasive. It’s one thing for a company to provision its own smartphone. But many employees are bringing their own smartphones and doing company business on them — which some companies encourage, by the way, to cut down on cell, data plan and management costs.
Which makes last week’s news from the Black Hat conference all the more unnerving. Researchers announced that wallpaper apps on Android phones can collect and transmit data such as phone numbers and messages. Other apps can be installed remotely and used to drop malicious payloads onto a phone.
Given the amount of work being done from the palm of one’s hand these days, and with mobile health care app stores on the way, the news is startling.
As with anything in security, enough is never enough. Checklists are a start, but which checklists are the right ones? For midmarket IT managers looking for answers, SearchCIO-Midmarket.com’s sample mobile device management policies and templates are a good place to start.
I don’t want to spoil the fun or productivity that users are enjoying with iPhones, BlackBerrys and Androids, but if IT security doesn’t pay attention to them, they will become just another access point into your networks.
How frustrating is it to be one of the little guys without the resources to bring to bear on big projects? Well, those frustrations may actually be easier to deal with than the alternative: being a large company entrenched in legacy systems.
I came across a blog post recently by Scott Adams, describing a fantasy world where countries weren’t so paralyzed by legacy systems that they couldn’t move forward. Older systems have too many vested interests, making it difficult to change because “anything that has been around for a while is a complicated and inconvenient mess compared to what its ideal form could be,” Adams wrote.
This got me thinking about the real legacy systems holding back IT: the projects that never go anywhere, not because the projects themselves are bad, but because the change would overwhelm the systems, strategies, processes and workflows that may not integrate with it. How, too often, the need for agility is acknowledged but never addressed. And, more importantly here, how the bigger you are, the more difficult it is to change.
So here’s to all the small IT shops out there, where each member of the staff wears multiple hats and puts in the extra hours because, let’s face it, who else will? Being smaller means you’re closer to the top, closer to making a difference. Yes, your budget is probably smaller and you can’t get to all the things you know you need to tackle, but there are also far fewer hoops to jump through and approval signatures to get.
If you take all of that into consideration, maybe the grass is greener on this side: You can quickly adopt new technologies and there is less red tape. And you probably aren’t so buried in legacy systems that every infrastructure change requires duct tape and a sprinkle of IT magic to get it working.
You can’t talk about agile data modeling without mentioning Scott Ambler, whom many consider the authority on, if not the founder of, agile data modeling.
In this tip, Ambler takes readers through an iterative approach to requirements modeling. His definition of agile data modeling is straightforward: Evolutionary data modeling is data modeling performed in an iterative and incremental manner … done in a collaborative manner.
Taking an agile approach to data modeling can help resolve one of the most onerous data modeling problems: very slow and complex notation development.
As explained by Burton Group senior analyst Joe Maguire, building data models takes so long because the practitioners developing notations aren’t seeking simplicity. “People designing notations are trying to get tenure, a PhD, when they design notations,” he said. “They don’t get tenure for making things simpler.”
Which leads vendors to incorporate more complex notation designs into their tools, he said. Data modelers are stuck working with these tools and forcing users (the consumers of this data) to provide information that covers every aspect of the requirements in these notations.
So how can agile data modeling help? Agile recognizes that there will be change, whether you are developing software or a data model, but beyond that it allows you to build a data model with less rigorous notations. System requirements, or information requirements, that are especially subject to change and ones that are less susceptible to change can be identified and put in different notation buckets.
“Then you can alter modeling behavior to model stable requirements earlier in the process with confidence, and not model unstable requirements too early in the process. That helps accelerate the process of data modeling,” he said.
And this needs to be done on a business, not technology level. “If the taxonomy in my head as data modeler is motivated by the business significance of the requirements, that’s something I think agile techniques would really improve, if [data modelers] took that kind of taxonomy to heart.”
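Maguire’s idea of sorting requirements into stable and unstable notation buckets can be sketched in a few lines. This is a minimal illustration, not his method; the requirement names and the `volatility` field below are hypothetical, standing in for the business-level judgment a data modeler would actually make:

```python
from dataclasses import dataclass

@dataclass
class Requirement:
    name: str
    volatility: str  # "stable" or "unstable", judged by business significance

def bucket_requirements(requirements):
    """Split requirements into notation buckets by expected rate of change."""
    stable = [r for r in requirements if r.volatility == "stable"]
    unstable = [r for r in requirements if r.volatility == "unstable"]
    return stable, unstable

# Hypothetical requirements for a retail data model:
reqs = [
    Requirement("customer identity", "stable"),
    Requirement("loyalty-program rules", "unstable"),
    Requirement("order history", "stable"),
]
stable, unstable = bucket_requirements(reqs)

# Model the stable bucket early with confidence; defer the unstable bucket.
print([r.name for r in stable])    # ['customer identity', 'order history']
print([r.name for r in unstable])  # ['loyalty-program rules']
```

The point of the triage is sequencing: the stable bucket can be modeled rigorously up front, while the unstable bucket is deliberately left loose until the business requirements settle.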
This is only a snippet of his thoughts on agile data modeling and a much larger issue Maguire is taking on at Burton Group’s Catalyst conference in San Diego this week. Here’s the overarching theme of his talk: Agile software development has an offshoot: Agile data modeling. The decision to use agile data modeling depends on local realities about information architecture, buy-or-build approaches to software, democratization of data access, and the company’s commitment to business analytics. For IT strategists, the decision to use agile data modeling is further complicated by the rancor that accompanies the debate. The session will touch on:
I’d like to hear from you if you are taking on agile data modeling, and learn what obstacles or benefits you’ve encountered as a result. Email me at firstname.lastname@example.org.
A lack of formal standards has made cloud computing nebulous so far. If a company wants to switch vendors, move platforms or build its own infrastructure, the confusion mounts, since most providers use a proprietary platform like Amazon’s EC2 or Salesforce’s Force.com.
The OpenStack project could improve portability and reduce costs through more standard features, meaning users will be able to move among vendors and even move the cloud in-house down the road. While OpenStack provides storage and processing power on demand, serving the needs of larger companies looking to build complex cloud environments, the standards it is setting could also make it an attractive option for SMBs.
Other open source offerings are available from Eucalyptus, which provides infrastructure software for building cloud computing environments, and from Cloud.com, which offers a free community edition alongside its open source and proprietary editions.
While the OpenStack project itself was designed to serve scientific computing needs, it also aims to fill another gap in the cloud computing marketplace: As a cloud platform that doesn’t have a single commercial owner, it puts the business back in charge of its own data.
Could the standards and portability these open source cloud projects aim for open the door for widespread midmarket adoption?
I’ve been touching base with IT folks and analysts lately to get a feel for the projects people are working on these days, and the phrases that keep coming up are IT consolidation, modernization and innovation.
A senior network administrator with a national hospice care provider is working on a data center consolidation project. The C-levels are pretty keen on leveraging cloud services to outsource IT functions, but he’s not convinced it will happen.
“From what I’ve seen, the ROI for the cloud is not there yet,” he said. “When you factor in the actual costs of building it in-house vs. hosting, it’s a lot cheaper to do it in-house.”
So that is the back and forth under way at his company: IT consolidation — Do we do it ourselves, go with a cloud provider or take a hybrid approach?
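The ROI argument above comes down to simple arithmetic: capital outlay vs. recurring fees over the life of the system. A minimal sketch of that comparison, with entirely hypothetical figures (the dollar amounts below are illustrative, not from the administrator quoted):

```python
def three_year_cost(upfront, monthly, months=36):
    """Total cost over the period: one-time capital outlay plus recurring fees."""
    return upfront + monthly * months

# Hypothetical figures for a small consolidation project:
in_house = three_year_cost(upfront=120_000, monthly=2_000)  # hardware plus ops overhead
hosted = three_year_cost(upfront=10_000, monthly=6_500)     # migration plus provider fees

print(in_house)  # 192000
print(hosted)    # 244000
```

With numbers like these, in-house wins over three years, which matches the administrator’s experience; a different mix of upfront and recurring costs, or a shorter planning horizon, could tip the scale the other way.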
The task ahead for IT is threefold, explains Burton Group analyst Joe Bugajski. IT needs to contract, modernize and innovate. SearchCIO.com senior writer Linda Tucci has written a series of stories with advice from CIOs on their approaches to IT innovation.
To get to the point where IT can help the business innovate, it first has to solve an age-old problem: IT modernization, the subject of Bugajski’s talk at next week’s Burton Group Catalyst conference in San Diego.
And tied to modernization is consolidation. To get started, the CIO and IT will have to start talking to the business to figure out what technology, what applications and what projects are really needed — and what can go. That’s where the point of innovation can begin, he explains. IT consolidation will help you free up the needed capital that will allow you to start innovating and modernizing, vs. just maintaining.
The marching orders to modernize and contract need to come from the top — meaning the board, because C-levels can come and go, he said. So I’m wondering: What are your marching orders? Are you being told to consolidate, modernize or innovate, or all of the above? Email me at email@example.com.
You share, collaborate and inform via email, phone, instant message and more — all because face-to-face time can be difficult to pull off in our increasingly busy schedules. And while some organizations have video conferencing capabilities to bring remote workers and office dwellers together, smaller organizations may not opt for the services.
However, a rise in mobile video devices and features, such as the iPhone 4’s FaceTime and the Cisco Cius, could bring video anytime, anywhere. So how important will mobile video conferencing be to the enterprise?
On the one hand, mobile video conferencing will be a way to include traveling executives or remote workers in important meetings. On the other hand, these outward-bound employees may not always be in a private, quiet space that will allow them to fully participate.
The results of a 2009 CIMI Corp. survey of collaborative behavior showed that the number of mobile workers who believed that mobile video conferencing was helpful was less than 10%. The off-site workers did not feel they could actively participate in mobile video conferencing because of a lack of facilities, lack of network capacity to support connection and lack of privacy.
Another consideration is the potential for impromptu video conferences. Organizations with traditional conference rooms may have to plan their conferences weeks in advance. Conference rooms with dedicated telepresence technology could book up quickly, but using mobile technology on a tablet or similar device would let users pop into any room for a similar experience.
Plus, there is real value in being able to show someone explicitly how to do something and being able to read visual cues that could encourage communication and collaboration. Without these cues, it’s difficult to know when it’s appropriate to “jump in” on a voice conference.
The costs and benefits have to be weighed (and one may tip the scale in your organization), but remember: It’s not just about buying these devices — you have to consider changing work patterns and other business impacts.
In the past few weeks I’ve been writing about agile projects, Scrum vs. waterfall or using the two project approaches together, and a theme that keeps coming up when I talk to project managers is cutting the fat.
For a software development project to be agile, Alex Keenan, an ERP analyst and agile project team member with a large grocery chain, believes waste should be pinpointed as you plan and implement any project and, with agile, continue to be identified with each project iteration.
He points to a Forrester Research chart that compares lean methodologies used in manufacturing and their counterparts for software development.
Sources of waste in manufacturing include: overproduction, waiting (time on hand), unnecessary transport, overprocessing or incorrect processing, excess inventory, unnecessary movement, defects and unused employee creativity.
The application development equivalents, according to Forrester Research, are: too many superfluous artifacts, broken builds, too many tool transitions, rigid architectures, analysis paralysis, late discovery of defects, rising downstream labor costs, polluted supply chain management streams and measuring effort rather than results.
CIO Niel Nickolaisen also lays out how to translate lean methodologies to IT projects in a recent column.
From lean methodologies, I have learned about the seven forms of waste and how to map those to IT processes:
• Rework: How many times in IT do we ask for do-overs?
• Waiting: For example, we often have to wait for the input of a subject-matter expert before we can make a decision.
• Overprocessing: Research tells us that 64% of the software features and functions we develop are rarely, if ever, used.
• Excess inventory: A recent study found that about 40% of our software licenses are shelfware.
• Excess motion: How many of our status reporting mechanisms actually generate value rather than churn?
• Excess movement: Am I the only one whose projects get caught in multiple decision loops?
• Overproduction: How many of us buy licenses in large batches, rather than just in time?
These are a few takes on applying lean methodologies to IT projects and processes. What works for you? Email me at firstname.lastname@example.org.
Those looking to be certified in ITIL V2 Foundation have missed their chance: As of July 1, the V2 Foundation certification is no longer offered, as the Office of Government Commerce slowly withdraws ITIL V2 in favor of ITIL V3. It’s out with the old, in with the new(ish) ITIL.
According to George Spalding, an executive vice president at IT management consulting firm Pink Elephant in Rolling Meadows, Ill., more than 300,000 individuals worldwide have taken IT Infrastructure Library (ITIL) exams since January 2009. While both V2 and V3 have been offered as of late, there were more people seeking V3 certification than V2. “It was time to get everyone up to speed on the same version, speaking the same language,” he said.
ITIL V3, introduced in 2007, emphasizes a more business-driven, top-down approach to IT management, an approach that was expected to gain more executive support. However, ITIL V3 gained momentum slowly. In an online ITIL adoption survey from Q3 2009 of organizations (mostly from North America) with 500 employees or more, the results showed that while ITIL V3 adoption was occurring, it was happening slowly and in pieces. At the time of the survey, only 4% of respondents indicated that they were implementing V3 from scratch.
Many organizations stuck with ITIL V2 Foundation for so long because they were comfortable with it. Plus, the timing was all off: Shortly after V3 was announced in 2007, we spiraled into a recession, so new initiatives were shelved. And before that, “many organizations were still maturing their ITIL V2 strategy,” Spalding said. “It can be hard to switch gears midproject and onto a new version.”
What does it really mean for users that ITIL V2 Foundation certification is no longer an option? Will this affect your IT shop?
One of the 12 principles of agile projects is that agile processes promote sustainable development. The sponsors, developers and users should be able to maintain a constant pace indefinitely, according to the Agile Manifesto.
But project managers new to the agile approach are struggling with the concept of a project with seemingly no end.
I posed this question to several experts and got a few straightforward answers. One was that the project ends when you run out of funding. Another was that it never stops: Like any other project, it is an ongoing process in which improvements and changes will constantly be made, even after it’s put into production.
Given its iterative nature, however, forecasting the end of agile projects isn’t so difficult.
Joseph Flahiff is a program manager with a national health insurance carrier. In charge of a $20 million HIPAA/EDI project that combines agile and waterfall approaches, he explains it this way:
You know just like any other project. If it’s managed with the waterfall approach, you’re probably given a scope of work: “These are the things we need you to do, and this is the goal we are trying to shoot for.” When you’re done with the scope of work, you are done with the project. What agile brings to [a project] is that you’re delivering all along, so you can see how close you are to actually being done.
Agile projects also allow you to fix problems as they arise, as opposed to a traditional software development approach in which most components are tested at the end. It’s a bottom-up development approach in which you are delivering working software all along, and you can show project sponsors the team’s burndown rate — whether the project is on schedule, ahead of it or running behind, he said.
“Then the leadership team or steering committee can actually take action and become involved, as opposed to looking at the [product] at the last minute and having a firefight,” Flahiff said.
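The burndown forecast Flahiff describes is straightforward to compute: divide the remaining work by the team’s observed velocity. A minimal sketch, using hypothetical story-point figures rather than anything from his actual project:

```python
def burndown_forecast(total_points, completed_per_sprint, sprints_budgeted):
    """Project remaining work from observed velocity to see if the team
    will finish within the budgeted number of sprints."""
    velocity = sum(completed_per_sprint) / len(completed_per_sprint)
    remaining = total_points - sum(completed_per_sprint)
    sprints_needed = len(completed_per_sprint) + remaining / velocity
    return sprints_needed, sprints_needed <= sprints_budgeted

# Hypothetical project: 200 story points, 10 sprints budgeted, 3 sprints done.
sprints_needed, on_track = burndown_forecast(200, [22, 18, 20], 10)
print(round(sprints_needed, 1), on_track)  # 10.0 True
```

Because the forecast updates after every iteration, a steering committee can see a slipping trend several sprints out, rather than discovering it at the last minute.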
Knowing when to end agile projects is but one quandary. Another project manager would like to know how to ensure that all the development iterations come together at the end. If you have any advice, email me at email@example.com.
Where does ITIL fit in with SMBs? It seems more organizations are following ITIL best practices in their overall IT operations and service delivery — creating some form of a service catalog, practicing change management strategies and focusing on service design — but not following ITIL frameworks full circle. Is the trend of ITIL a la carte a good thing?
For example, my colleague Christina Torode recently wrote a story about SMBs and service catalogs — and how many are using ITIL best practices to build a portal or a catalog that works for their specific organization, sometimes using SharePoint and workflow processes, without following ITIL to the letter.
Has ITIL matured to the point of being mainstream, as organizations learn how to pick and choose the ITIL best practices that best suit them? In other words, are they taking what they’ve learned about ITIL in the past and applying it in a more palatable way?
According to Evelyn Hubbert, a senior analyst at Forrester Research Inc., many of her clients have picked the critical pieces of ITIL they need to improve efficiency and effectiveness and adopted them as ITIL best practices. However, without accurate metrics measuring where IT was before the implementation and how ITIL has improved it, it’s harder to continue moving forward after the quick wins.
“[IT] cannot say where they are before they took off and don’t know where they have gotten to and cannot justify the benefits,” Hubbert wrote in an email. “Therefore, the effort is cut.”
Further, she said, ITIL adoption in terms of service delivery (a piece many organizations fail to add) needs to continue for organizations to stay competitive moving forward — especially as technologies continue to grow and change.
“I believe that ITIL will be part of most IT organizations … by 2012,” she said. “With clouds, virtual worlds, etc., the topics of service-level management, service portfolios [and] cost models will become much more critical.”
So the question remains: How much ITIL is enough?