Are you dissatisfied with your traditional relational database management system (RDBMS) for business intelligence (BI)?
You’re not alone.
According to Forrester Research Inc., an RDBMS has always been an awkward fit for BI. When you need to find relationships that require analyzing many-to-many correspondences; when the variables themselves aren’t all of the same kind; or when you don’t know, going in, exactly which relationships you’re looking for, traditional spreadsheets and their more sophisticated relational-database progeny come up short. Even if you know what you’re looking for, a traditional RDBMS requires time-consuming tuning to get the job done. That’s just not practical in the modern business landscape. When the questions are changing faster than the BI answers can be provided, it’s time for something new.
In fact, in the search for BI agility, most companies will jettison their current RDBMS over the next decade for BI needs, Forrester BI expert Boris Evelson predicted.
Last week, I spoke to David Gallaher, IT services manager at the National Snow and Ice Data Center, who went to an object-oriented database because a traditional RDBMS was of no use.
“We have tried to shoehorn all kinds of data into these constructs, and now Big Data is where we have really run into the limitation of what you can do with these old constructs, where everything has to fit into a table,” Gallaher told me. “Well, what if my data doesn’t really fit into a table?”
In a report published in May, Evelson discussed several new strategies for extracting relationships out of ever-more-complex data sets, and reviewed four relevant BI database management system technologies that have already arrived or at least are on their way. Here’s the skinny:
Columnar DBMS: Although traditional spreadsheets — still the most popular BI tool — can always analyze a row or a column, the emphasis in some new DBMSes is shifting to the power and flexibility of columnar analysis. Evelson believes there are distinct advantages with a columnar DBMS. It compresses data better than a row-based RDBMS, because everything in a column is of the same type. Indexing is an easier task than it would be in a row-based RDBMS because each column “already represents its own index,” he said. “It can keep the database size roughly equal to that of the raw data set — or sometimes cut it in half,” he added.
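Evelson's compression point can be made concrete with a minimal sketch. The data and the encoding below are illustrative only — real columnar engines use far more sophisticated schemes — but they show why a column of same-typed, often-repeating values shrinks so well:

```python
# Toy illustration: values in a column share a type and often repeat,
# so even simple run-length encoding collapses them dramatically.

def rle_encode(column):
    """Run-length encode a column into [(value, run_length), ...]."""
    runs = []
    for value in column:
        if runs and runs[-1][0] == value:
            runs[-1] = (value, runs[-1][1] + 1)
        else:
            runs.append((value, 1))
    return runs

# A "region" column, as a columnar engine would store it (invented data).
region_column = ["EAST"] * 4 + ["WEST"] * 3 + ["EAST"] * 2

print(rle_encode(region_column))
# Nine stored values collapse into three (value, count) pairs.
```

A row-based layout interleaves regions with other fields of other types, so runs like this never form — which is the intuition behind Evelson's claim that a columnar DBMS can keep the database near, or below, the raw data size.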
Many DBMS vendors already offer columnar or hybrid row-based and columnar systems. They range from such mainstream vendors as IBM (Netezza), Microsoft (PowerPivot), and EMC Corp. (Greenplum) to such pure-play columnar RDBMS vendors as Hewlett-Packard Co. (Vertica), SAP AG, Sybase Inc. (IQ), Infobright Inc., and 1010data Inc.
In-memory index DBMS: This is the most agile and flexible of the four technologies because the entire relational database is either in memory or can be swapped rapidly into memory. That flexibility and agility, however, add risk. One risk is that business users could arrive at a wrong answer because they’re no longer constrained by the rigid data models typical of an RDBMS.
It should also be kept in mind that the functions offered by in-memory vendors vary widely, Evelson warned. Among other questions, business pros should inquire whether an in-memory DBMS can be accessed by their other BI tools. Another issue is that if Big Data is being used, the entire data model might not fit into a single memory space.
When sizing applications for a single memory space, users should consider the size of the raw data set, compression ratios and the number of concurrent users, he advised. If the total exceeds a few hundred gigabytes, he suggested picking a vendor that can “dynamically swap chunks of your model in and out of [random-access memory],” or one of the hybrid in-memory databases. The vendor list includes Tibco Software Inc. (Spotfire), Tableau Software Inc., SAP (HANA), and MicroStrategy Inc., among others.
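Evelson's sizing advice reduces to back-of-the-envelope arithmetic. The compression ratio, per-user overhead, and threshold below are illustrative assumptions, not vendor figures:

```python
# Rough working-set estimate for an in-memory DBMS: compressed data
# plus a workspace allowance per concurrent user (both invented numbers).

def estimate_memory_gb(raw_data_gb, compression_ratio, concurrent_users,
                       per_user_overhead_gb=0.5):
    compressed = raw_data_gb / compression_ratio
    return compressed + concurrent_users * per_user_overhead_gb

needed = estimate_memory_gb(raw_data_gb=800, compression_ratio=4,
                            concurrent_users=100)
print(f"{needed:.0f} GB")  # 250 GB: 200 GB compressed + 50 GB workspace

# Past a few hundred gigabytes, per Evelson, look at vendors that swap
# chunks of the model in and out of RAM, or hybrid in-memory databases.
if needed > 300:
    print("Consider a swapping or hybrid in-memory vendor.")
```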
Inverted-index DBMS: According to Evelson, this is a useful database technology when data is complex, content is unstructured and the user’s hypothesis is vague. By building the index first, an inverted-index BI DBMS upends the RDBMS practice of loading the database first and worrying about tuning it later. “This approach builds one big index, but instead of just pointing to data sources — as traditional search engines like Google and Yahoo do — it embeds data in the index itself,” he explained.
The inverted index works well for applications that use data from a variety of sources and that incorporate structured as well as unstructured content. BI pros should consider an inverted index when a project requires numerous data marts to get around the limitations of traditional and even multidimensional DBMSes. An RDBMS assumes you know what you’re looking for, “but BI end users often don’t,” he noted. Search-style navigation lets BI users zero in on what they want by subtracting what they know they don’t want. Attivio Inc. and Endeca Technologies Inc. offer an inverted-index DBMS.
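A toy sketch of the embed-data-in-the-index idea, with invented documents — not Attivio's or Endeca's actual design. Each posting carries the record itself, so a hit returns data directly instead of a pointer to chase, and users can navigate by subtracting what they don't want:

```python
from collections import defaultdict

# term -> [(doc_id, embedded record), ...]
index = defaultdict(list)

documents = {  # invented sample records from two different sources
    1: {"text": "ice sheet thickness survey", "source": "NSIDC"},
    2: {"text": "retail sales survey", "source": "ERP"},
}

for doc_id, record in documents.items():
    for term in record["text"].split():
        # Embed the record alongside the posting, rather than just
        # pointing back at the data source.
        index[term].append((doc_id, record))

# Zero in by subtracting what you know you don't want:
hits = {d for d, _ in index["survey"]} - {d for d, _ in index["retail"]}
print(hits)  # {1}
```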
Associative DBMS: It’s tough to make predictions, especially about the future, as Yogi Berra is said to have noted. That’s why some business users are insisting that everything gets filed away in the data warehouse because who knows when it might come in handy. An associative DBMS attempts to link everything together, allowing any trend to be pulled out at any time. “Imagine a data warehouse that can store all-to-all relationships — associations or vectors — between every entity and every attribute in your domain, with counters, aggregates and indexes for every intersection,” Evelson said. Oh, my! But it will cost you. The factor used to calculate the size of an associative DBMS as a multiple of the raw data set is as high as 10 in the associative databases used in academia, he said.
An associative DBMS also requires purpose-built graphical user interfaces, and is not easily accessed by queries based on the Structured Query Language and the Multidimensional eXpressions language. Rather than thinking in traditional “where clauses,” users of associative DBMSes can let their imaginations run wild, finding connections and analogies that — you guessed it — don’t necessarily line up neatly by rows. Saffron Technology Inc., Ingres Corp. (VectorWise), Illuminate Solutions Inc. (iLuminate), LazySoft Ltd. (Sentences), and Splunk Inc. (a variation on an associative DBMS) are in the vanguard.
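The all-to-all idea can be sketched as a counter over every attribute intersection. The names and data are invented, and real associative engines go much further, but the toy also hints at why the storage multiple climbs toward 10x — every pairing is materialized:

```python
from collections import Counter
from itertools import combinations

# (attribute, attribute) -> co-occurrence count, for every intersection.
associations = Counter()

transactions = [  # invented transaction records
    {"customer:alice", "product:skis", "region:east"},
    {"customer:alice", "product:boots", "region:east"},
    {"customer:bob", "product:skis", "region:west"},
]

for attrs in transactions:
    for pair in combinations(sorted(attrs), 2):
        associations[pair] += 1

# Any trend can be pulled out later, with no predefined schema:
print(associations[("customer:alice", "region:east")])  # 2
```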
Companies that have already installed virtual desktops are considered trailblazers even now, so the technology certainly wasn’t fully baked back in 2008, when Dustin Fennell, CIO at Scottsdale Community College in Arizona, decided to use desktop virtualization to give 13,000 students and 1,000 employees anytime, any-device access to data and applications.
Desktop virtualization is still uncharted territory for many organizations and CIOs, such as Maytee Aspuro, CIO at the Wisconsin Department of Children and Families. That’s why she and Fennell both had backup plans in case their application and desktop virtualization projects blew up in their cutting-edge faces.
Aspuro and her team are virtualizing 1,200 desktops using VMware desktop virtualization and Unidesk virtual desktop management technologies. The pilot phase in 2010 called for hiring a new staff that could virtualize 350 desktops within eight months. The time frame unnerved her because she had walked into a freshly minted organization: The department was new, created by the merger of three government agencies, and it had 30 vacant IT staff positions.
So, while Aspuro’s team began building a platform for a virtual desktop infrastructure (VDI), she bought Lenovo laptops fully loaded with applications for employees in the field. Fortunately for her, the pilot phase went well, and the remaining 850 devices, old and new, will be repurposed as virtual desktops, including the Lenovo laptops.
“With such a tight timeline, and because we hadn’t done VDI before, we needed a fallback plan that we could put in place in only a few weeks,” Aspuro said.
Fennell calls his contingency plan a hybrid mode in which users could access their data and applications on his college’s Web portal, using VDI, application virtualization and provisioning technologies from Citrix Systems. The applications also were installed locally on college-owned devices, so users could try the Web portal and compare its performance with that of the apps running on their college desktops.
This hybrid approach also “gave users a level of comfort that, if [the Web portal] crashed, they had their application locally installed as well,” Fennell said.
After a year, as students became comfortable with the Web portal’s performance, Fennell’s team began removing the locally installed applications, and all new apps became Web-portal-accessible only.
It wasn’t exactly a contingency plan, but more of a reassurance to users getting used to a new services delivery model. Still, phasing in desktop virtualization is highly recommended, whether it’s done to comfort end users or to make sure that the technology actually does what it’s supposed to do in a complex computing environment that has a lot of room for error.
Let us know what you think about this blog post; email Christina Torode, News Director
The day of the 10-year outsourcing deal, cooked up in the boardroom backroom and conferred on a sole provider on the promise of 10% — make that 20% — savings on Day 1, is over, at least for the rich and famous. (It actually died about the same time Lehman Brothers did.)
Less glibly: That outsourcing model is no longer viable for large enterprises with complex IT environments that are determined to leverage utility computing (cloud, Software as a Service), exploit cutting-edge technology and unload routine IT services to gain a competitive advantage. To achieve that kind of smart IT service delivery, enterprises — especially their CIOs — need to be dealing with multiple suppliers.
Of course, the devil is in the details: How do you actually do this? That was the burning question at a news briefing yesterday morning with HP Enterprise Services before the company’s announcement of a new offering. The HP Multi-Supplier Integration Service, or MSI, aims “to help enterprises and governments gain control of multivendor service environments, improving overall IT performance and quality while optimizing costs.”
You can’t beat that offer. The question is, can you afford it?
Getting this outsourcing model right is really hard. As HP correctly notes, these models “challenge IT leadership to ensure efficient workflow, timely problem resolution and adequate service-level performance.” In other words, these deals require the foresight of a Steve Jobs, the ruthlessness of a Larry Ellison and the wisdom — and wealth (we’ll get to that later) — of Solomon. When I asked Peter Yates, chief technology officer for HP Enterprise Services, to explain the mechanisms HP will use to wrangle this IT herd of disparate and even competing interests, he, not surprisingly, demurred. That’s HP’s “secret sauce,” he said.
What Yates did note, however, is that central to success in this outsourcing model is making suppliers “play nice together.” How do you get disparate and even competing suppliers to play nice together for the good of the customer? The terms need to be spelled out right up front, in the RFP. And — here’s the money question — the deal has to be so big and so good that the suppliers are willing to agree to those terms, he said. Vendor loyalty takes on a whole new meaning.
“It’s the new ‘stickiness’,” said Rob Taylor, vice president of data center services for HP Enterprise Services. He and Yates also said that this model and their integration services are aimed squarely at very large enterprises with lots of resources, including IT resources. A smaller company with fewer resources might want to stick with that sole-provider model, Yates said.
As I learned in my recent reporting on CIO Linda Jojo’s multivendor outsourcing deal, getting it to work right, with an end-to-end service-level agreement, is rare. It’s hard to govern. There needs to be a detailed strategy for managing all those moving parts, including: knowing when to move what to the cloud, what to keep close to the internal-IT vest, which suppliers to go with, and when a supplier absolutely needs to be fired and replaced by someone better-suited to the job. I have no doubt that the brainiacs at HP can help CIOs do a better job at this (after you’ve hammered them on conflict-of-interest issues). But you’d better be very ambitious and working for somebody with deep pockets.
When it comes to virtualization licensing terms, what is it going to take for some independent software vendors (ISVs) to stop dragging their feet?
When I asked IT executives at the recent Gartner Catalyst conference in San Diego about the biggest challenges of desktop virtualization deployments, most of them said that dealing with “some” ISVs remains a real pain.
In fact, some IT executives are removing some ISVs’ software applications from their desktop and application virtualization plans because they fear the ISVs will change licensing terms.
As one executive put it, “It’s not so much a challenge to get them to understand what we’re doing; it’s that their licensing is a moving target.” As more businesses adopt a virtualization model (which removes reliance on a given piece of hardware and allows multiple users to access the same software), some ISVs apparently view the trend as a threat to profits. “So, what might be OK today, six months later or a year, [the ISV] may say it’s changing our terms,” this executive said.
Some ISVs just don’t want to acknowledge that their customers are moving to a multi-tenant computing environment, but this lack of acknowledgement could lose them a lot of customers. Of course, not all ISVs fit this bill. For the most part, ISVs are working hard to accommodate virtualized applications and desktops, IT executives say.
This isn’t the first time we’ve written about the virtualization licensing-terms dilemma, and given the attitude of some ISVs, it likely won’t be the last. To recap some of the advice from one of those virtualization licensing stories, here are two tips on negotiating licensing terms, courtesy of licensing expert Paul DeGroot, formerly with research firm Directions on Microsoft:
Negotiate software licenses based on named users. The cost of licensing software in a virtual environment based on processors can add up fast. Data volume for many businesses is going up, and in turn, the number of processors they need to license is rising too, while the number of users is remaining the same or even decreasing.
Look to retrofit existing software licensing terms for a virtual environment. Some vendors offer amendments to existing licensing agreements to account for running software on a virtual machine. IBM has a Sub-capacity licensing program in which customers can sign a contractual amendment that accounts for server licenses on a concurrent basis rather than on a named basis.
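The arithmetic behind DeGroot's named-user tip looks roughly like this. The prices are invented placeholders, not real list prices; only the scaling behavior matters:

```python
# Per-processor licensing scales with data volume (more data -> more
# cores to license); named-user licensing scales with headcount.

def per_processor_cost(processors, price_per_processor=20_000):
    return processors * price_per_processor

def named_user_cost(users, price_per_user=300):
    return users * price_per_user

# Data volume doubles, so the processor count doubles; users stay flat.
print(per_processor_cost(8), per_processor_cost(16))  # 160000 320000
print(named_user_cost(500), named_user_cost(500))     # 150000 150000
```

With flat or shrinking headcount and growing data, the named-user line stays put while the per-processor line keeps climbing — which is exactly the trap the tip warns about.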
Let us know what you think of this blog post; email Christina Torode, News Director.
In our SearchCIO.com tip sheet this week on outsourcing strategies for emerging tech, outsourcing adviser Andy Sealock explains how contracting for new technology is different from procuring traditional IT services. He passed along seven points that his clients at Pace Harmon LLC take into consideration when they’re writing a contract for new IT. Here are two Sealock suggestions for steps you can take in conjunction with the contract to strengthen your outsourcing strategy:
- Take an equity stake in the supplier: “An equity stake changes the dynamic of the relationship,” Sealock said. For one, it allows you to stipulate a certain number of seats on the supplier’s board. In any deal for emerging tech, keeping tabs on your project is critical to its success. “Putting members on their board is about as deep an embedding as you can get,” he said. Second, if the supplier is a startup, an equity stake gives you leverage that a contract alone cannot.
- Offer co-branding and marketing alliances: Letting a developing tech company put its logo or trademark on your product or on your marketing materials can be extremely valuable to that company, given your wider distribution channels. That in turn helps realize your main aim in the negotiations, Sealock said — namely, to motivate this new tech company to sink its scarce resources into the areas that benefit you most.
Check this blog soon for Sealock’s latest thoughts on calculating total cost of ownership (TCO) on outsourcing deals. Hint: They involve getting engineers to think like finance people and finance people to think like engineers. Guess which group is harder to morph?
There was an interesting side conversation during the Q&A portion of a session on enterprise mobility at last week’s Gartner Catalyst Conference.
Someone asked the panel what they thought about privacy on mobile devices. What if sharing information on mobile apps gets to the point where your insurance provider knows too much about you, for example?
As we reported in a past story on enterprises’ mobile app plans, some health care providers and pharmaceutical companies are considering apps that would tell patients when to take their medication.
One audience member pondered this question: What if a health care provider decides to sell that type of information, and the next thing you know, your insurance provider shuts off your prescription because you aren’t taking the pills as scheduled?
That’s a scary scenario, but given just how much information we are willing to share over mobile devices and on social media networks, it’s not an impossibility.
But as panel moderator and Gartner analyst Paul DeBeasi said, “Only old people care about privacy,” repeating something his teenage son had said to him. His response had been that his son would care when he’s older. (What had bothered DeBeasi more than the generation gap around privacy was the possibility that information is probably being collected about people that they don’t even know about.)
But is it true that older generations are more cautious and younger generations have no privacy boundaries? The panel thought so.
“Let’s face it, younger generations are more than willing to share personal information — they want to share personal information,” said panel member Randy Nunez, advanced networks and mobility director at Ford Motor Co. “And until they run into situations of ‘How does my insurance company know what my medical practices are,’ until [privacy issues] start impacting them personally, there’s going to be a lot we give up in terms of privacy and security.”
Does that mean that enterprises also will have to give up a lot in terms of privacy and security? Or will the right controls around an enterprise mobility strategy put a stop to “over-sharing?” Then again, how do you balance controls when personal information is mingled with corporate data on a mobile device? And what happens when you ask employees to buy their own devices?
Let us know what you think about this blog post; email Christina Torode, News Director.
I did a profile this week on CIO Rick Roy’s push to plot an enterprise mobility strategy for CUNA Mutual Group. I was impressed by a number of things: his data-driven approach to gathering requirements; his engagement of the top brass; his anticipation of the cultural implications of this radical change; and, not to go unmentioned, the 18 personas (personae?) his team developed for modeling the mobile computing requirements of CUNA Mutual’s 4,000 field and corporate employees.
Here’s the part of Roy’s enterprise mobility strategy story that’s ringing in my ears today: “When you’re in the corporate world,” he told me, “I think it’s easy to get comfortable with what you have. Yet the reality is, the speed of innovation, the velocity of change that we’re seeing and the acceleration of that velocity is just so enormous.”
So enormous. Lately I’ve been thinking a lot about the velocity of change. (Just this week, for example, pondering why American citizens were not storming the Capitol to protest the ineptitude of elected officials, I chalked it up to the velocity of change. We can’t get it together fast enough to affect a situation spiraling out of control.)
But back to CIOs and mobility. For Rick Roy, the velocity of change in the mobile world forced him and his team to look beyond the central tenet of a well-run IT environment of the last decade — standardization — to a flexible delivery model that could keep pace with mobile demands.
The mobility world is whirling ahead so fast that CIOs can’t catch their breath long enough to take advantage of the technology. If the guy I talked to yesterday is correct, you can inhale. Enterprise mobility is about to reach — a tipping point!
“I think we’re going to hit a point of stability pretty soon,” said Brian Reed, chief marketing officer (and YouTube presence) at mobility management vendor BoxTone.
By this fall, Reed says, Android will stabilize, offering security levels on par with those of BlackBerry and Apple’s iOS, or be well on its way there. CIOs will be able to use the same mobile policy for every device running these top three operating systems, making it easier to “say yes” to mobile devices. That will take some of the fury out of the mobile tornado tearing through the enterprise, or, as Reed put it, ease the “big squeeze” CIOs are now feeling from the rank and file (on the one side) demanding to use their personal devices for work, and from line-of-business people (on the other side) screaming for mobile apps. The very next thing — as in, the next six months — CIOs should do to buy some time on enterprise mobility is to get “some quick wins” around apps.
“The easiest way to do that? Look at the app portfolio you already have and see if you have any mobilized, and go ahead and say yes. In fact, get out in front of it, and say, ‘We’ve done research and found that Salesforce [or whatever field-force automation tool you use or whatever retail point-of-sale software you use] is already mobilized and we are going to deploy that and manage it for you,’” Reed said.
In 2021, cloud computing is simply computing, corporate office parks are senior housing facilities and the IT organization of the future has been absorbed by the business.
Oh, and Apple has lost its proprietary hold on mobile application development — in court, no less — giving every company out there the ability to build its own app store — and sell those apps.
These were some of the predictions made by Gartner analysts Chris Howard and Jack Santos during the kickoff of the Gartner Catalyst show this week in San Diego. Howard and Santos made them in jest during an end-of-day skit — Santos playing dual roles as an IT staffer and a business user of the future — but some of these predictions are taking shape in the here and now, they said.
To back up a bit, the IT organization of the future will undergo drastic shifts in the following order, according to Gartner:
Internal IT becomes an internal cloud. This shift is inevitable, given the demand from enterprise employees and customers for an on-demand service experience. It will require IT to emulate or “start to think like” external cloud providers. IT will have to figure out chargeback and self-service provisioning; above all, it will have to start to develop a services catalog. IT also will have to figure out how to get the most out of a shared services model in such areas as capacity management in a virtualized environment. In terms of security, IT will need to nail down identity management, among many other security responsibilities.
IT becomes a services broker of its own services and those provided by third parties — namely cloud providers. This puts IT into the position of showing the business which applications and data make sense in-house or with a cloud provider, and how to vet the providers on behalf of the business.
Key to this is IT’s ability to grill cloud providers on their services capabilities, one critical criterion being security. For example, does your cloud provider wipe out your data before it houses another customer’s data on the same equipment? This is a question that IT is likely to ask, versus a business user, Gartner said.
As a services broker, IT will decide which apps are cloud-ready and which are not. It’s not a matter of service denial when it comes to cloud providers, Gartner said, but of helping the business make the right choices. Above all, the IT organization of the future will continue to vet outsourcing partners.
Critical questions include these: Does the provider let you know if and when access attempts are made on your data that they house? Does the cloud provider allow you to perform security audits on it? What are the migration path options to another provider? Who will build the back-end connections from your data in the cloud to other applications in your organization or to data housed by another cloud provider?
In a few years, the cloud will no longer be referred to as the cloud, because it will simply be the way IT services are provisioned. Hybrid computing is the new term, reflecting that many enterprises will build an internal or private cloud that integrates and shares services with public cloud providers.
The hybrid approach will prevail, given that enterprises will not let certain data or applications live on a public cloud, for many reasons including regulatory compliance. Enterprises recognize the need to move commodity services and apps, as well as infrastructure, to the public realm to cut costs and gain scalability and agility.
IT will become a function of the business. Gartner’s Howard described the days when the IT function was considered so separate from the business that it was housed in a different office. Not so now: Already IT is being looked on as another service or function within the business. “Math was once considered a department,” Gartner’s Santos said. “Send that to the math department, because only a few people could do the math. Now, IT isn’t something [like math] that only a few people can do. Business people think [IT] is part of their job.”
Here are a few other takeaways about the IT organization of the future:
- Code-writing will become less important, and infrastructure and application integration more important within the enterprise and with external providers.
- Enterprises may start to emulate the business models developing in other countries in which a business function or even an entire business can be built for a specific purpose in a virtualized or cloud environment, then torn down once the project or purpose is complete.
- Albeit obvious, less business will be done in the office, given the “anywhere” computing that the cloud and virtualization enable. “There will be no there anymore,” Gartner’s Santos said. “The office is a virtual concept.”
- Application portfolios, as well as how and why applications are developed, will be led by your customers and their mobile, on-demand, “anywhere” needs.
- Enterprise IT will struggle with managing the blurred lines between corporate and personal personas, as well as the data and devices tied to those personas.
I am just scratching the surface as far as predictions being made here at Gartner Catalyst. In the coming weeks, the hybrid IT concept, IT as a services broker and developing a fraud prevention program will be among the topics we explore.
Let us know what you think about this blog post; email Christina Torode, News Director.
There are a few annual events I adore even though I’ll never ever go to any of them: Burning Man, SXSW, TED, and this time of year, Comic-Con. When you’re hyper-focused on IT innovation, you sometimes miss good stuff disguised as frivolity.
As we speak, San Diego is being overrun by 125,000 nerds, many dressed like superheroes, video game characters, zombies, stuffed animals, space aliens or manga hotties. All are ostensibly clamoring to witness the next big thing in comics, movies, TV and Web video. To be clear, these are not the cowering, bashful nerds from back in high school. These are angry, hipster nerds drunk with nerd power and full of nerdy attitude. They’re intolerant of boredom, enraged by sameness. They want all of the tools and technologies at their disposal employed to deliver new diversions in the wildest, weirdest way possible.
Which is why they all go to Comic-Con, then complain incessantly about Comic-Con. There are plenty of IT folks in attendance in San Diego this week, but not in an official capacity. Unless you’re the CIO of Marvel Comics or Hasbro, you may never have even heard of Comic-Con. But you should have. We in the enterprise technology arena could learn a few things from our geeky brothers in arms.
The Comic-Con faithful support those dedicated to their interests. But they instinctively know something fundamental about creativity and the artistic pursuits. Wherever thousands gather in its name, creativity has fled. Real creativity is born of rejection — rejection of the current, the popular, the safe, the known. It’s why some of the best stuff at Comic-Con happens in tents in a park down the street from the convention itself. There are always at least as many people boycotting Comic-Con as attending it.
I make this observation while sitting in another hotel conference room listening to another garish, noisy, colorful presentation on another current, popular technology choice. In today’s case, it happens to be cloud computing, but it could easily be mobility or virtualization or social networking or BI. How many of these sessions have we all endured? Endured them, even as we had a nagging sense that the really good stuff was probably being discussed by a small, rebellious group huddled in a tent down the road. Why do we put up with it — silently for the most part?
Where’s our awkward hipster’s sense of outrage?
If there’s one lesson from Comic-Con for IT leaders it’s this: It’s time for them to let their snarky geek flag fly. Comic-Con is all about creativity and its continuous pursuit. That’s not so different from what we’re after in our quest for technological innovation. Innovation is, after all, another flavor of creativity. Innovation and creativity are brothers from the same mother. Innovation is creativity with more moving parts and a better credit score.
And innovation, like creativity, tends to wither when we all gather in one place around what is popular and current and safe and known. As my hero, Hunter S. Thompson, once said: “When the going gets weird, the weird turn pro.”
In enterprise technology, every day is Comic-Con. What we need are more angry nerds.
State Street Corp. announced this week that it is cutting 530 of its “non-client-facing” IT employees over the next 18 to 20 months, and shifting an additional 320 similar IT workers to outsourcing vendors IBM and Wipro Technologies. Application maintenance services are going to Wipro, while IBM will provide infrastructure support. The layoffs, which amount to 21% of State Street’s 4,000 IT employees worldwide, are part of a “multi-year business operations and IT transformation program” intended to increase the efficiency of IT operations and focus more on innovation, bank officials stated — but I sort of knew that.
We recently spoke with State Street CIO Chris Perretta about his technology transformation at State Street, running a podcast just last week about the launch of a private cloud and his IT team’s laser focus on such innovations as Big Data processing. Perretta told me he has an organization now that “makes sense” to him, referring to the important role his chief architect and chief scientist play in finding IT trends that feed a “pipeline of innovation.”
Reaching Perretta this morning by phone, I asked him how many of the 850 lost or reassigned IT jobs were due to IT transformation. Before he uttered a word, the bank’s public relations specialist offered, “all of them, really.” Perretta was more reflective. Days like this remind IT people like him just how fast technology moves, and “our jobs have to reflect that,” he said. In addition, State Street has always pressured employees “to do those tasks which differentiate us with our customers.” The technology jobs being eliminated or moved off to vendors are “incredibly crucial to us,” he noted, but are more efficiently done by vendors invested in those technologies. State Street “gets to leverage” that vendor know-how and dedicate its IT people to things that are “out there on the technology edge.”
“So, for instance, our people designed our cloud; our people designed even the implementation of the hardware that we’re running. And it is our people who are designing our most innovative applications,” Perretta said. “Those are the jobs that we want to grow and keep within our employees. That’s the intellectual property we want to develop.”
The reassignment and loss of IT jobs are the consequences of a new operating model, Perretta said, and are driven by such new technologies as State Street’s private cloud and by IT’s move to Lean development principles. Automation eliminates some jobs. Outsourcing allows the bank to shift fixed costs to variable costs, a powerful advantage with technology changing so fast. “Sometimes that involves dislocation, and that’s unfortunate,” he said, but State Street has always watched the cost line. “And we want to make sure that what we do spend is spent in a way that makes a difference, so there you have it.” Whether these same forces will result in more IT layoffs, or how much money he saves by this shakeup, he declined to say.
Of course, State Street is not the only company shedding IT jobs. The federal government said today that it is closing 800, or 40%, of its data centers, a move that would save billions of dollars. The federal government’s outgoing CIO Vivek Kundra told The New York Times that the consolidation was “part of a broader strategy to embrace more efficient, Internet-era computing,” in particular, cloud computing. No word yet of layoffs.
Technology changes us. Perretta said he just bought an old typewriter and put it on his desk — the old manual kind, to remind him of that change in just his own lifetime. That’s the reality of the IT field, and because IT is integral to most business now, that’s the reality of many, many other fields as well — and the reason, in part, for a jobless recovery and why unemployment remains high, especially for the “non-client-facing.” Good luck to you.