In her recently published book The New IT: How Technology Leaders Are Enabling Business Strategy in the Digital Age, author Jill Dyché lists the characteristics of an innovation-ready company. One of those characteristics? Building strategic partnerships.
These partnerships aren’t like the partnerships of yesteryear where a project was chunked by task and then assigned to whoever might do that work best. Strategic partnerships are built on collaboration, Dyché said. “The quid pro quos are different,” she said. “It’s not just about financial quid pro quos anymore like classic partnerships. It’s more about new ideas, product improvements and incubation of new functionality as well.”
By way of example, Dyché pointed to Ron Guerrier, vice president and chief information officer at Toyota Financial Services (TFS). “Ron has an innovation lab where his vendor partners are invited to come in and essentially do what they do best,” Dyché said. They can test out a new application by loading it onto one of the tablets in the innovation laboratory, and the business folks can give the app a test drive.
Guerrier will get feedback from the business through an automatic Q&A on where the application succeeds and where it falls down, but so will the vendor, who can use the feedback to drive bigger improvements. “The partnership is not only a delivery partnership, it’s also an incubation partnership, if you will, a product roadmap partnership,” Dyché said.
Tapping talent outside the four walls
Building strategic partnerships means recognizing that the internal ecosystem isn’t enough if businesses want to remain competitive. Here’s how Karen Dahut, vice president of the strategic innovation group at the consultancy Booz Allen Hamilton, put it at the recent Chief Innovation Officer Summit in New York City: “Companies that believe that the best ideas only reside in their brick and mortar will not be successful. You have to engage in the broader global ecosystem.”
Booz Allen did just that when it partnered with Microsoft, Intel and Allscripts to build Allscripts Wand, a Windows 8 tablet-based application for the medical records organization. “Great innovation happens when great companies work together, partner together and collaborate in meaningful ways,” Dahut said.
And it means tackling big questions and paving the way for new markets. Cathryn Gunther, vice president of strategy and commercial model innovation at Merck & Co. Inc., talked about how the pharmaceutical giant is trying to improve customer engagement. “We’re working with a variety of different technology companies,” she said at the Chief Innovation Summit. “[The list includes] vendors to design some platforms to improve the engagement of consumers and their health in conjunction with their healthcare providers.”
CIOs and IT professionals have been slow to adopt software-defined networks for a number of reasons: security concerns, lack of familiarity and a shortage of employees with the required skill sets. But attitudes are changing, according to a new survey from Logicalis, the international IT solutions and managed services provider.
The firm’s second annual Optimal Services Study, a survey of top IT pros spanning 24 countries, found that 61% of the CIOs who responded said they expect software-defined solutions to impact their IT services and delivery strategies over the next three years, 34% said over the next two years, and 10% said over the next 12 months.
“Using the kind of policy-driven, programmable toolsets made available through software-defined solutions, CIOs can free time and resources that can be better spent focused on IT service delivery and management throughout the enterprise and can both reduce cost and complexity as well as increase the speed and flexibility with which they can respond to IT requests going forward,” Mike Martin, senior vice president of Solutions and Services at Logicalis, said in the report.
The benefits of SDN are widely cited. It can lower costs because it eliminates much of the expensive, purpose-built hardware that traditional network upkeep requires. It also means the network administrator or engineer can simply add more virtual switches or routers instead of physically visiting each switch or router. And it provides the flexibility to move to the cloud, whether public or private, and lets network engineers and administrators respond quickly to changing business requirements.
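That centralized programmability is the key idea. The Python sketch below is purely illustrative — the classes and rule format are invented for this example and do not correspond to any real SDN controller API — but it shows the difference in kind: one programmatic call updates policy across the whole network, instead of an engineer configuring each device by hand.

```python
# Illustrative sketch (hypothetical API, not a real SDN controller):
# a central controller pushes one policy change to every virtual
# switch at once, rather than an engineer visiting each device.

class VirtualSwitch:
    def __init__(self, name):
        self.name = name
        self.flow_rules = []  # match/action rules installed on this switch

    def install_rule(self, rule):
        self.flow_rules.append(rule)

class Controller:
    def __init__(self):
        self.switches = []

    def add_switch(self, switch):
        self.switches.append(switch)

    def push_policy(self, rule):
        # One call reprograms the entire network.
        for switch in self.switches:
            switch.install_rule(rule)

controller = Controller()
for name in ["edge-1", "edge-2", "core-1"]:
    controller.add_switch(VirtualSwitch(name))

# Block traffic to a deprecated subnet everywhere, in one step.
controller.push_policy({"match": "dst=10.0.9.0/24", "action": "drop"})
```

Adding capacity works the same way: instantiating another VirtualSwitch and registering it with the controller takes one line, which is the "add more virtual switches instead of buying hardware" point in miniature.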
However, the move toward SDN does require a new set of skills from IT, and it seems CIOs are willing to pay a premium for them. The survey found that 65% of responding CIOs are prepared to pay more for the skills required to make the most of SDN solutions; a quarter said they would pay 5% to 10% more and a fifth said they’d pay 10% to 20% more.
Even better, SDN skills paired with business acumen
Want to be even more valuable in a CIO’s eyes? Then, according to the report, pairing IT skills, such as SDN knowledge, with business acumen is the way to go.
“CIOs are recognizing that, as they transform their IT departments to be internal service providers, they will need to embrace the move from technology management to business service delivery, something that requires a new type of skill set with an emphasis on melding technological understanding with business acumen,” Martin said.
The value of business acumen coupled with IT skills was apparent in the survey, with 64% of the CIOs responding that they look first for business skills like communication, service management and business analysis.
“Each CIO will be examining the organization’s staff, investing in and cultivating the business skills of existing IT team members as well as replacing outdated technology skills and people with those that understand the shift to a world where business needs now dictate technology decisions,” Martin said.
After five decades of enterprise application bloat, it’s finally happened: The business is getting on board with IT and agrees that there need to be fewer applications. That’s what Forrester analyst Phil Murphy writes in “The Secret to Rationalizing Applications: Start With the End in Mind.”
For years IT has had to live with disorganized and bloated application portfolios that have sucked up time and money because the business side hasn’t fully understood the depth of the problem. But that is starting to change.
The reason the business is starting to wake up to the expense of a bloated application portfolio, Murphy writes, is the elephant in the room: the “keep-the-lights-on” cost of existing applications. In some firms, that cost exceeds 90% of the technology budget, leaving just 10% for innovation, which makes application bloat a business problem now. If rationalization is done, and done correctly, he writes, the keep-the-lights-on-to-innovation ratio can be brought back to a more acceptable level, such as 80/20 or, even better, 50/50, freeing up funding and people so that more innovation can happen.
Still, wanting to do something doesn’t make it so. Although companies have realized it’s in their best interest to rationalize their application portfolios, Murphy writes that, unfortunately, many rationalize incorrectly. Based on hundreds of client inquiries, Murphy has found that most companies attempt to rationalize their application portfolios two or more times before seeing any real success. Others just give up after the first failure.
How to incorrectly rationalize applications
Murphy writes that Application Development and Delivery (AD&D) leaders that fail often follow a predictable series of steps:
– They create a list of application names.
– They gather metrics about each application over several months: technical makeup (platform, language, database, vendor), user satisfaction and some indication of each application’s business purpose.
– After this information is collected, it is dumped into a spreadsheet. But spreadsheets have limitations, and at this point companies begin to struggle to relate data elements to one another to compose a meaningful view.
– Then companies try to draw insight from the data they have collected, and this is where a “Eureka!” moment should happen: The depth of the problem should become clear, and the data should point business leaders to an obvious redirection of resources.
But it often doesn’t.
So why does this strategy not work? AD&D leaders often step into this process without first figuring out what they are going to do with the data they’ve collected, leaving them with data that yields little usable information.
This is the point where AD&D leaders either give up or try again. The problem is that people who were on board with the project before are now skeptical, so a second attempt may meet opposition.
How to successfully rationalize applications
So how do you successfully rationalize your applications? Murphy’s suggestion to AD&D leaders is to start with the end in mind. Here’s his four-step process:
– Define the goal. Murphy writes that portfolio management and app rationalization share the same goal: transparency into resource allocation and consumption. He argues that transparency into consumption patterns will bring insight into how consumption may need to change.
– Define the problem. Why do you need more transparency into resource allocation and consumption? What problem might this solve? Is it to reduce operation costs, reduce waste, eliminate redundant apps, increase resources for innovation?
– Define the audience. Who are you trying to convince that applications need to be winnowed? Are you talking to developers? Technology management professionals? The data you collect on application bloat and the insight you provide need to be tailored to your audience. If your audience is business leaders or executive management, Murphy advises skipping technical jargon and metrics when trying to win them over.
– Define (and collect) the data that will resonate with your audience. Figure out which data (or combinations of data) will create a compelling story for whomever it is you are presenting to. Then go out and get it, working with other departments, such as finance and internal audit, if need be, to get it.
Though application bloat has in the past been solely an IT issue, it no longer is. Successfully cleaning out your application portfolio will not only save money but also give IT more time to innovate, two outcomes any business needs to thrive.
Let us know what you think about the story; email Kristen Lee, features writer, or find her on Twitter @Kristen_Lee_34.
Digital trends are transforming how businesses work, Abbie Lundberg, president of Lundberg Media, a content and engagement company specializing in CIOs and digital business, said at the Oracle CloudWorld Conference in Boston last month.
In order to make the transformation to digital business and remain competitive, Lundberg said companies must recognize that traditional IT leadership isn’t enough anymore. Businesses must foster digital leadership, which requires IT and the business to work together to make technology decisions.
Lundberg’s advice stems from a 2014 survey of 750 business and IT leaders conducted with Harvard Business Review and Oracle probing IT leadership roles in the digital age. One interesting statistic she found was that 60% of respondents said they were directly involved in making IT decisions. However, only 27% of those respondents actually worked in IT.
“So this is a real shift that we’re seeing today. If we’d asked this question a few years ago we never would have seen these kinds of numbers,” Lundberg said.
For example, nearly half of respondents said that within their companies cloud management is now a blended responsibility between IT and the business. (The more technical aspects of cloud, such as deployment and integration, remain more the domain of IT.) Specifically:
- 26% of respondents said that IT and the business were equally responsible for cloud deployment.
- 27% said IT and the business were equally responsible for integration.
- 46% said IT and the business were equally responsible for vendor selection.
- 47% said IT and the business were equally responsible for cloud requirements.
“There was almost unanimous support for this idea that collaboration is really critical to getting value from IT investments,” Lundberg said, adding that she is starting to see more crossover roles in IT and the business.
For example, the head of marketing at a major credit card company is in the process of getting his master’s in information system process management. “He sees that as his future. The future of marketing is technology,” Lundberg said.
Emerson recalibrates to serve digital customers
Emerson, a St. Louis-based global manufacturing and technology company that is one of the largest power equipment manufacturers in the United States, is an example of an old-school business that is taking digital leadership seriously, Lundberg said, citing a talk she heard by the company’s president and COO, Edward Monser.
“He said something that really made me sit up. He said changing models for customer engagement because of digitization was one of the top three risk factors that they discussed with their board of directors,” she said.
The 125-year-old Emerson has built its business on cultivating long-standing relationships with certain companies. However, those customers are now bringing new people in who have different needs and a different way of working. Emerson needed to adjust.
Recognizing that collaboration within the company was needed, Monser, along with Kathy McElligott, Emerson’s CIO, and the head of strategy, created a business-IT strategy board made up of 25 leaders, with McElligott the only board member from IT, Lundberg relayed. The board meets four times a year to discuss business strategy and dive deeply into specific topics such as information security and the digital customer.
McElligott told Lundberg that traditional business and IT “lines have blurred. IT is extending into marketing and engineering… The business now has to learn about IT and how it works.”
Intel Corp. really, really wants to drive the wearables bus. At the International Consumer Electronics Show (CES) earlier this month, the chip company unveiled Curie, a “system on a chip,” or SoC, that’s no bigger than a button. The tiny chip can be fitted into things like bracelets, bags and sunglasses, and was designed specifically for wearable device developers.
Jewelry, sunglasses and even flying wearable cameras aside, Intel’s wearable interests are bigger — and much more interesting — than just retail. Last August, the company announced it was forming a new partnership with The Michael J. Fox Foundation for Parkinson’s Research. To help advance the understanding of the neurodegenerative disorder, Intel outfitted 16 patients and nine control volunteers with wearable devices that collected data 24 hours a day, seven days a week, to track things like sleep quality and tremors.
These smart watches are “generating 300 measurements per second — roughly a gigabyte of data per day, per patient,” Ron Kasabian, vice president of big data solutions at Intel, said at Strata + Hadoop World in New York City last fall. “We take that data, take it from the device through the patient’s handheld or cellphone and load it into a cloud-based infrastructure we built.”
The streams of data are ingested and cleansed. (“The accelerometers in some of the devices occasionally emit inaccurate measurements, so we clean those out,” he said.) And then the data is loaded into Hadoop, a distributed computing framework that can process large data sets, for analysis.
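The cleansing step Kasabian describes could look something like the range filter sketched below. The reading format and the plausibility threshold are assumptions made for illustration; Intel's actual pipeline is not public in this detail.

```python
# Illustrative sketch: drop accelerometer readings outside a plausible
# physical range before loading the stream for analysis. The +/-16g
# threshold and dict format are assumptions, not Intel's actual rule.
PLAUSIBLE_G = 16.0

def cleanse(readings):
    """Keep only readings whose x/y/z values fall in a plausible range."""
    return [
        r for r in readings
        if all(abs(r[axis]) <= PLAUSIBLE_G for axis in ("x", "y", "z"))
    ]

stream = [
    {"x": 0.1, "y": 0.0, "z": 9.8},    # normal reading
    {"x": 500.0, "y": 0.0, "z": 9.8},  # sensor glitch, clearly unphysical
    {"x": -0.3, "y": 0.2, "z": 9.6},   # normal reading
]
print(len(cleanse(stream)))  # → 2 (the glitch is filtered out)
```

At 300 measurements per second per patient, even a simple filter like this runs millions of times a day, which is why the cleansing happens in the cloud pipeline rather than on the device.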
Compared with the information collected from the traditional dexterity tests given to Parkinson’s patients every three to six months, the streams of sensor data are a potential goldmine, Kasabian said. In the future, researchers hope to harness the power of big data by combining de-identified patient data to uncover correlations that can answer questions such as why the disease progresses faster in some patients than in others.
In this third of three blog posts on IDC’s IT outlook for 2015, analysts tackled the topic of cloud.
Here IDC’s Robert Mahowald, program vice president of SaaS and cloud services; Mary Johnston Turner, research vice president for enterprise system management software; and Rick Villars, vice president of data centers and cloud, outlined three ways in which the cloud will shake up IT ecosystems in 2015 and how IT leaders can prepare.
Hybrid cloud adds complexity to IT environments
IDC predicts that by 2016 more than 65% of enterprise IT organizations will commit to hybrid cloud technologies.
“This is going to really drive a number of changes across the structure, management, and just operational velocity of many IT organizations,” Turner said.
Enterprises are being driven to adopt hybrid cloud for a number of reasons, Turner said. These include the need for cloud service diversity, the demand by end users for more and more self-service IT, and the growth of OpenStack and open source containers.
“So for IT organizations this is going to create an environment where the real demands for what you have to do every day are not so much about the care and feeding of individual components but really about ensuring the end-to-end delivery of IT services, or IT as a service, that’s defined in terms of policies and SLAs and user experience and compliance,” Turner said. “And for many enterprise IT teams it means that you’re going to have to do a lot of learning in a quick amount of time.”
IT is going to have to master new tools and standards as well as understand what’s going on in open source, because Turner predicts open source will be an important enabler for hybrid environments.
How complex will these hybrid cloud environments be? IDC estimates that 60% of enterprises will subscribe to 10 or more cloud services, and IT will not always know about all of them. On top of that, 25% of those services may go out of business within a couple of years, Turner said.
“So it’s going to be a very dynamic, fast-changing kind of churn to IT environments, which is very, very different from what we’ve seen traditionally, where change was really centrally planned and managed and executed slowly,” Turner said.
This means there will be a lot of concern about maintaining performance and security across all kinds of applications.
Turner said IT teams along with major stakeholders will have to jointly review corporate policies around issues such as data protection and risk management to really make sure those policies are appropriate for a hybrid cloud architecture.
Data privacy regulations will determine cloud use
IDC predicts that in 2015 65% of the selection criteria for enterprise cloud will be shaped by efforts to comply with data privacy legislation.
“We think there’s going to be a muddle of sorts for a year or so when the SaaS and PaaS providers begin to branch out from where they’re hosted to get at the new geographies,” Mahowald said.
He added that, as these providers branch out, they don’t fully understand who will be responsible not only for the SLA but for end-to-end security. Making matters more complex, the end-to-end service will likely involve a whole chain of cloud providers.
“It used to be, you know, I bought software, I installed it in my data center, I ran it locally, it was pretty easy to figure out where the fault might lie. Now all of a sudden there’s a chain of providers each providing a discrete piece of either capability or delivery and that makes it much more muddled,” Mahowald said.
However, Mahowald said new services are emerging, such as “indemnification as a service,” that help to figure out who is responsible for what between providers and who will bear the monetary costs should there be losses.
Even so, Mahowald urges companies to do their due diligence and make sure they are getting what they need in terms of compliance.
In addition to having a governance, risk and compliance committee, enterprises should form a service management team as part of the CIO/IT organization; that group is in charge of implementing cloud technologies within the company. Mahowald urged CIOs to join CISOs in learning the laws that govern privacy and compliance. It is increasingly important, he said, for IT professionals to understand the legal aspects of cloud computing, whether that means privacy laws, cloud contracts or local regulations.
“At the end of the day when the lawyers show up we think it’s super important to have an understanding of where your data and your assets are when they’re not in your data center,” Mahowald said.
Managing risk in sourcing IT services
IDC predicts that in the next year or so 75% of IaaS provider offerings will be redesigned, rebranded, or phased out.
“We do expect to see some players who announced significant cloud efforts over the last couple of years begin to backtrack from or phase out those efforts,” Villars said.
Because many of the initial cloud implementations that service providers built are proving difficult to enhance, change, and modify, Villars said providers are now rethinking and changing their cloud services.
“We see many [service providers] going now and making significant [re-architectures] in their environment to create more evergreen networks or evergreen solutions that will allow them to be much more flexible about continuing to introduce new technology, introducing new capabilities without having to do a major rebuild,” Villars said. However, over the next couple of years this transition will be a significant issue for companies to sort out, Villars said.
And companies will have to incorporate the ability to switch vendors into their IT planning and governance processes.
Villars gave two pieces of advice to help companies deal with this transition.
- “Go back to service providers that you’re currently working with as well as those that you’re evaluating [and] demand deeper insights into the road map, into how the road maps are going to enable this kind of evergreen operation so we don’t have to go through these kinds of rebuilds again.”
- He also said to make sure that, when your company is talking to a service provider, “refresh cycles” or rebuilds are not part of the conversation. “This is supposed to be an evergreen environment so make sure the conversation is including that idea of long term [continuous] operations.”
For more IDC 2015 forecasts, check out “IDC prediction for IoT 2015: It’s a doozy” and “IDC 2015 security predictions: How to keep up with the bad guys.”
As 2014 draws to a close, security is at the forefront of everyone’s minds. The most recent unsettling security incident is North Korea’s alleged involvement in the Sony data breach and the implications of that type of cyber attack for other private companies. How should companies prepare for security in 2015?
“The bad guys are going to grow,” Pete Lindstrom, IDC research director of security products, said during the company’s recent 2015 security prediction webinar. “They’re going to adapt and innovate, and so we have to really mirror and match that and hopefully get ahead of them in some ways moving forward.”
An attacker can innovate faster than a regulation, warned Lindstrom. “We have to keep in mind that these folks are nimble and they’re going to get around any kind of… enforced controls that are out there,” he said.
Here are four areas of security outlined in the IDC webinar that IT leaders should consider for 2015:
Prioritize security spending by risk
The first step is to figure out where to invest your money. Companies don’t have enough money to do it all and protect everything, so some analysis is needed to figure out where to invest strategically.
“You need to put this whole concept of risk mitigation on the top of your agenda,” Charles Kolodgy, IDC research vice president of security products, said. “Many more organizations will have to start looking at their security spending by risk because they just don’t have enough money… to protect [everything].”
Kolodgy suggests looking into analytics and software that may be able to help your company get a better understanding of how best to deal with security investments. IT needs to be able to quickly adjust to emerging threats, he added. And old strategic investments are becoming liabilities.
“You need to have a team of security professionals and I think that team should also include a business person… so that they can look at metrics to help with your decision-making,” Kolodgy said.
Lindstrom added: “We’re all better off as we get our arms around understanding economic impacts and probabilities… and get away from this age-old, fear, uncertainty, and doubt kind of approach to securing our enterprise.”
Put threat intelligence to work
“[Threat intelligence is] not about just generating data as much as it is about figuring out how to get to that intelligence side of things,” Lindstrom said.
To use threat intelligence successfully, Kolodgy said, companies will need to vet vendors carefully to make sure they’re getting full visibility.
“The problem is that… there’s a wide range of providers that are both established security vendors, established telecommunications vendors, and a lot of new guys,” Kolodgy said. He advises companies to focus on whether vendors are creating their own intelligence or just amalgamating intelligence. In other words, “are they a secondary or primary source of information?”
Kolodgy said that it is critical for a company to know this as they build out the usage of threat intelligence “because you could have duplication.”
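Kolodgy's duplication point can be sketched simply: if a secondary feed merely amalgamates a primary source, the overlap between the two shows how little unique coverage a buyer is actually paying for. The feed contents below are invented example indicators (documentation IP ranges), not real threat data.

```python
# Illustrative sketch: two threat feeds overlap because one vendor
# amalgamates another's intelligence. Deduplicating by indicator
# reveals how much unique coverage each feed really adds.
primary_feed = {"203.0.113.7", "198.51.100.22", "192.0.2.55"}
secondary_feed = {"203.0.113.7", "198.51.100.22", "203.0.113.99"}

combined = primary_feed | secondary_feed    # union: all unique indicators
duplicates = primary_feed & secondary_feed  # intersection: paid for twice

print(len(combined))    # → 4 unique indicators
print(len(duplicates))  # → 2 indicators duplicated across feeds
```

Here two of the secondary feed's three indicators duplicate the primary source, so it contributes only one new indicator, exactly the kind of finding that tells you whether a vendor is a primary or secondary source.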
Regardless, having some sort of program in place is key because the software vendors provide allows companies to “pick that needle out of the haystack,” Lindstrom said. It can tell you that you are at risk under certain circumstances, from a certain actor, and that certain resources need extra protection.
Kolodgy also suggests automating threat intelligence because there is a shortage of IT security talent.
Manage data better, especially in the cloud
“We need to manage the data a lot better than we do because it is a potential liability,” Kolodgy said. That’s especially true because everyone and everything is moving to the cloud.
“It’s in a lot of respects a little disappointing that we’re at the stage we’re in given the nature and sensitivity of data. And [it is] certainly worth pointing out that this also includes the new and improved cloud-based file transfer services and the like from our data stuff,” Lindstrom said. But like it or not, there is no avoiding the cloud at this stage in the game, he said.
Lindstrom suggests “[tethering] your [cryptographic] key into your environment.” He added that “maintaining them under your control is going to be crucial to your long term strategic success around encrypting data and deploying it in the cloud.”
You need to have direct access to your cryptographic keys at all times, Kolodgy added.
“You [also] need to have policies,” he said. It is important for a company to determine which specific categories of information require confidentiality. Once those categories are pinpointed, policies must be put in place.
But in order to do all of this successfully, Kolodgy said it has to be a team effort between the business side, the compliance auditors, and the security team.
Leverage SaaS to keep pace with attackers
Because attackers can innovate much faster than companies can right now, Kolodgy pointed out, it’s important to leverage SaaS, and the agility that comes with it, to compete with attackers and stay a step ahead of them.
“You’re not going to have time to roll out a product and train people and hire people,” Kolodgy said.
Either way, companies don’t really have a choice anymore.
“If our data centers are moving to the cloud, our security has got to move with it,” Lindstrom said. He advises that companies leverage outsourced managed security services because if you’re not “you’re probably missing out on the real great insight that they can gain from attacks going on all over the place.”
It’s official: The FBI linked the Sony Pictures hack to North Korea today, as Associate Editor Fran Sales reports in this week’s Searchlight news roundup. You can read the full FBI statement here.
The destruction and leaking of sensitive corporate data by a group calling itself the Guardians of Peace was in retaliation for The Interview, a movie that depicts an assassination plot against North Korean leader Kim Jong Un.
In a press conference following the FBI’s announcement, President Barack Obama made more news, saying that he thought Sony’s decision to cancel the Dec. 25 release was a mistake.
“Sony’s a corporation; it suffered significant damage; there were threats against its employees; I am sympathetic to the concerns that they faced. Having said all that, yes, I think they made a mistake,” Obama said.
The president said he wished Sony had discussed the issue with him first, because he would have advised the company to not let a dictator in another country bully them into pulling what was clearly a satirical movie.
“We cannot have a society in which some dictator someplace can start imposing censorship here in the United States. Because if somebody is able to intimidate folks out of releasing a satirical movie, imagine what they start doing when they see a documentary that they don’t like, or news reports that they don’t like. Or, even worse, imagine if producers and distributors and others start engaging in self-censorship, because they don’t want to offend the sensibilities of somebody whose sensibilities probably need to be offended,” Obama said.
In other news this week, Yahoo CEO Marissa Mayer tries to restore the company to its former tech glory; Apple Pay may soon face a worthy rival in Samsung; and Sony is working on a clip-on wearable that may give Google Glass a run for its money. Check out these items and more in this week’s Searchlight.
The International Institute for Analytics (IIA), a research firm based in Portland, Ore., recently discussed ten predictions for 2015. Some were conventional — Prediction #7: Hadoop will go mainstream. Some were thought-provoking — Prediction #2: Storytelling will be the hot new skill in analytics. Should CIOs consider hiring journalists to do that job?
And one stood out because it seemed, well, ominous — Prediction #9: Analytics, machine learning, cognitive computing will increasingly take over the jobs of knowledge workers. Tom Davenport, co-founder of the IIA, professor of management and information technology at Babson College and analytics thought leader, said — and has been saying for years now — that business leaders need to be preparing for this now. They should consider how to “prepare knowledge workers to augment the work of smart machines rather than be automated by them,” he said.
Automation is already happening. Journalists, lawyers and even teachers are standing by while parts of their job descriptions are being taken over by things like predictive coding, knowledge-based curriculum design or automated earnings reports. While the technology is “still quite fragmented,” Davenport said during the IIA 2015 predictions webinar, “there’s probably not a knowledge worker problem out there that can’t be addressed by some system.”
There are benefits to the advancing tech. In many cases, as fellow IIA faculty member Robert Morison pointed out, “what we’re doing is better equipping people, and if we could do that at scale, it could make an enormous difference,” he said.
Jeremy TerBush, vice president of analytics at Wyndham Worldwide Corp., explained on the call that his team developed internal pricing systems that rely on algorithms to project tomorrow’s vacation rental prices. The cognitive computing program has not had an impact on the company’s workforce. “We’ve seen it hasn’t automated away any jobs,” he said. “It’s just allowed us to be more focused on managing our inventory better.”
The system works about 80% of the time. “But 20% of the time, the prices are overridden by our revenue management team, who is closer to the market and picks up on things the algorithms are missing,” he said.
Automation can provide efficiency, help businesses make better decisions and save on costs. But (cue the sounds of dismay) there is another side of the coin businesses may not be considering: What are they at risk of losing? Will automation simply deepen the divide between haves and have-nots?
Said Davenport: “I suspect the people who you need to do that are your most experienced and expert pricing analysts — and not the ones fresh out of school. Because, as we were saying, oftentimes the entry level work can be done by computers, it’s the hard cases humans need to override or augment.”
The question is, said Morison, “How does someone become an experienced pricer when all of the entry-level work is done by machines? Who learns to be the experienced expert?”
Data breaches have unfortunately become the norm. But the now infamous Sony breach has opened the eyes of the IT world to the fact that we haven’t seen the end of what cyber attacks have in store for enterprises.
Breaches can do more than just expose sensitive information; as the Sony hack shows, they can be personally malicious. The attack, which used “wiper” malware to steal and delete corporate data, sought to harm Sony employees, Associate Site Editor Fran Sales reports. The attack was also highly sophisticated, according to experts — sophisticated enough to get by the security defenses of 90% of the private industry, according to the FBI cyber division’s Joseph Demarest Jr.
In addition to laying out how the Sony hack was different from other corporate attacks, Sales provides tips on how to protect yourself and your company from breaches like this. Good luck.
In other news this week, IBM and Apple have released 10 of the anticipated 100 apps in the IBM MobileFirst suite; Microsoft now accepts virtual currency; The Washington Post details the demise of The Pirate Bay; and more in this week’s Searchlight.