It’s human nature to want to categorize the world around us. But is this desire to classify becoming a bigger part of our nature at the expense of creativity because of … IT?
Computers encourage us to classify and categorize, but following a corporate or coded script doesn’t exactly lend itself to creativity.
Our recent SearchCIO360 breakfast speaker, Professor Gary King, used customer service reps as an example of how data classification or categorization can stifle innovation. Most companies develop categories for customer complaints and the rep is typically asked to type information into a program with fixed complaint categories.
“There are huge efforts to convince [the reps] not to come up with a new idea because, if they do, you have to reclassify millions of previous data points. So what you are doing is taking the people who are good at innovating, the ones with the most insight into the customers, and telling them to please not come up with any new ideas,” King said.
We are allowing computers to make decisions for us, which can be a wonderful thing — if the data being used to make those decisions remains relevant. Many times, it doesn’t.
To err is non-human and human
Take King’s research into the solvency of U.S. Social Security benefits. If you think some of the data classification and management methods in your office are outdated, you might feel a little better about them in comparison to the Social Security Administration’s use of a data analysis method developed 75 years ago, according to King. This method, used to forecast mortality rates, was developed by demographers at a time when obesity was not a problem and smoking was not considered the death knell it is today. Still, this 75-year-old method is the one being used today to determine when Social Security may run out of money.
By King’s calculations, that date turns out to be around 2032. A bit of “a bummer,” as he said, but a more accurate estimate: his method combines the deep qualitative knowledge that demographers have accumulated over the last 350 years with new mortality factors (such as the rise in obesity), and automates the calculations with his own algorithms.
This is an oversimplification — King’s research was not limited to Social Security solvency; he also aggregated worldwide mortality data, analyzing 150,000 cross-sections. But his point was that he did not seek out new data; he simply analyzed existing data better, relying on a combination of qualitative (characteristics identified by humans) and quantitative (characteristics that can be measured) computer-aided methods.
“Fully human is inadequate, but fully automated or fully quantitative — meaning Excel spreadsheet with no labels — fails too. You need some qualitative information to decide what you’re trying to quantify. What really is needed is computer-assisted, human-controlled technology to take qualitative information, systematize it and then provide it back to the human being who actually ends up making the decisions,” he said.
Another case in point on the benefits of combining a human touch with statistically driven tech? One of King’s colleagues was running a complex data analysis that caused him to run out of room on his computer. IT told him that it would cost $2 million to give him a system to support his data analysis needs. Instead, a couple of King’s grad students spent two hours developing a new algorithm that does the job in 20 minutes — on the guy’s laptop!
The title of King’s talk at our breakfast was “Big Data is not about the Data.” His contention is that you can have all the data you want and the most powerful computer you need, but without the right analytics — which includes that human, qualitative element — you could be looking at some costly, outdated data.
As one CIO explained, “There is a real sense of urgency to figure out how to build the right skill set to take advantage of qualitative data, qualitative people.”
It’s a dangerous cyberworld out there. The news carries weekly, sometimes daily, reminders of the potentially catastrophic impact a data breach can have on even the largest of enterprises. Besides the loss or corruption of information itself, there’s the loss of trust from customers who may well decide to bring their business elsewhere. In this information-driven economy, more than ever before, it’s become abundantly clear that a well-honed security strategy is imperative. That makes findings from a recent Ponemon Institute study (sponsored by Sophos) all the more surprising.
Released this month, The Risk of an Uncertain Security Strategy indicates that many SMB organizations are simply unclear about their security strategy and the threats they face. Included are about 2,070 responses from individuals in charge of their SMB company’s security and risk management. Among the respondents were CIOs/heads of corporate IT, heads of IT security, heads of risk management, CFOs, CEOs and chief operations officers. CIOs/heads of corporate IT made up 61% of the sample.
If any one of these takeaways sounds like it could be coming from your organization — or if you’re simply not sure — it’s time to start strategizing.
- One-third of respondents admit they don’t know if their organization experienced a cyberattack in the past year. This lack of knowledge equals a lack of “actionable intelligence” going forward. These respondents claim that in order to remedy the situation, they will invest in big data analytics and network traffic intelligence over the next three years.
- Respondents in the most senior positions knew the least about cyberthreats to their organization. This uncertainty indicates that the further an individual is from dealing with security on a daily basis, the less they understand the pervasiveness of the risks. According to study findings, 58% of respondents say management doesn’t think cyberattacks are a serious risk.
- Respondents estimate the cost of disruption to normal operations exceeds the cost of damages or theft of IT assets and infrastructure. This clashes with findings in other Ponemon Institute studies where the theft of intellectual property is the most expensive consequence of cybercrime. In this study, respondents appear unable to determine the actual cost of lost or stolen information assets.
- Respondents indicated that company-issued mobile devices and bring your own device (BYOD) raise bigger security concerns than do cloud applications and IT infrastructure services. But these concerns fail to translate into extensive adoption and use of mobile devices, especially personal devices. To lower these BYOD risks, respondents claim their organizations will invest in such protections as Web application firewalls for mobile apps and endpoint management.
- Respondents’ confidence in their cybersecurity awareness and strategies seemed to cluster by industry. For example, respondents in financial services indicated a strong understanding and awareness, which can be attributed to the numerous data protection regulations they deal with on a regular basis. Not surprisingly, the technology sector, too, is more security aware, likely thanks to the IT expertise in these organizations. Retail, education and research, and entertainment expressed the lowest levels of awareness.
- Respondents indicated that chief information security officers and senior management are rarely involved in IT security decision making or priority setting. Thirty-two percent of respondents said the CIO of their company is responsible for setting these priorities; 31% say no single function owns the responsibility.
So what happens when cybersecurity fails to be a priority in SMBs and no one seems to know the plan, or even if there is one? The study suggests it can become a vicious cycle. “Uncertainty about how these issues affect an organization’s security posture could lead to sub-optimal decisions about security strategy,” the authors note. And even as boards of directors and higher-level management are beginning to show greater interest in cybersecurity and risk, if IT executives don’t use the best available information in order to make decisions, it will be more difficult to make the business case for investing in the right expertise and technologies.
Innovation is the term du jour for CIOs, but figuring out the best ways to foster creative, out-of-the box thinking isn’t easy. Maybe that’s why there was an entire session devoted to it at the recent Society for Information Management (SIM) annual conference.
“Accelerating Value with Social Media Led Innovation,” presented by Gartner Inc.’s Christopher Sprague, billed itself as a social media session but, in truth, it could just have been called a social session. That’s because the real crux of generating new and interesting ideas, it seems, boils down to good old-fashioned communication.
Here are five quick ideas from Sprague’s session on how to get that communication going.
1. Create a culture of innovation: It starts with leadership. An attendee from a large pharmaceutical company shared how a new president of research and development tied innovation, new ideas and new science to survival. In other words, innovation needed to become part of the department’s DNA. “We started by defining how we wanted to build that culture,” the attendee said, and, once defined, how to help support that culture. A notable aspect of support: the best ideas were backed by financial investment.
2. Talk horizontally and vertically: The “if you build it, they will come” IT Hail Mary doesn’t work. Getting IT together with the business just to talk, on the other hand, can generate good ideas. That’s precisely how one SIMposium attendee described overcoming the barriers of siloed, top-down communication. “Once you bring people together to talk horizontally, that’s what sparks the ideas,” he said. The danger here, he noted, is falling into a trap of all talk and no action; employees need a way to funnel their ideas into the hands of decision makers. “If it stays with the worker bees, it never gets sponsored,” he said.
3. A catchy name helps: IT employees at Computer Aid Inc. call it “Future Fridays.” That’s when 16 or so IT thought leaders — a mix of idea generators and builders — spend an hour brainstorming. The program turned out to be so successful that the IT department built a formal framework for the submission and evaluation process, and prizes were awarded to the best ideas. Future Fridays is about a year old now, and in that time 400 ideas have been submitted and about 100 of them have been implemented, a Computer Aid employee said.
4. Get down and dirty, if you have to: Sometimes, communication can be the biggest impediment. Personalities clash and people disagree and getting everyone in a room together for an hour of idea generation just doesn’t work all that well. That’s why one SIMposium attendee advised doing the unthinkable: Take the people who don’t get along and turn them into office mates. Getting through an hour-long meeting is one thing; sharing an office is another. “We took our director of supply chain and he got an office with the plant manager of one of our manufacturing facilities,” she said. “What we started to see were real results because they had to ‘live’ together.”
Gartner’s Sprague could see the wisdom in the decision: “Bad behavior permeates the cycle. … The stone throwers have to [realize] at some point if you don’t play socially, well, you’re not in the game.”
5. Crowdsource the problem: Here’s where the social media part comes into play. Think about third-party platforms from companies such as InnoCentive and Kaggle to overcome technical challenges or roadblocks. There, businesses can post problems for a community of analytics and tech junkies to grind through. Not comfortable making proprietary data public? InnoCentive, for one, has a Software as a Service option so that businesses can bring the platform in-house and use it internally.
In a new research report, McKinsey Global Institute projects that open data can “help unlock” $3 trillion to $5 trillion in economic value annually across transportation, consumer products, electricity, oil and gas, health care and consumer finance.
How does McKinsey see value being created? McKinsey researcher Michael Chui, co-author of Open Data: Unlocking Innovation and Performance with Liquid Information, counted the ways during his presentation at Strata Conference + Hadoop World 2013 in New York.
1. Open data creates transparency. Openness and transparency are words commonly heard in connection with governmental agencies, and for good reason: The public sector is a leader in the open data trend. “In the United States, California and Texas have identified millions of dollars a year in savings by releasing budgetary information and enabling citizens to spot potential opportunities to cut costs,” according to the report. But Chui argues open data can be useful to the private sector as well. Businesses that open up data can build deeper relationships with their customers by, for example, giving greater visibility into household energy spending, medical expenses and how financial products are built.
2. Open data improves performance. Open data can help businesses compare performance. Several of the examples contained within the report suggest sharing benchmarking data to create operational and even project management efficiencies. For example, sharing budgetary information can help keep procurement costs in check or opening up manufacturing benchmarks can help businesses increase precision in production. “About a third of the value from open data comes from benchmarking,” Chui said.
3. Open data creates new products and services. Think The Climate Corporation. Established by two former Google employees and purchased by the Monsanto Company for about $1 billion, The Climate Corp. supplies farmers with weather insurance by incorporating years and years of open data from places like the National Weather Service into its analysis. Insuring against harsh weather conditions isn’t new, but how Climate Corp. doles out that insurance is. The process is a more streamlined, self-service model. Farmers can purchase coverage online; the policy is customized to their specific location and the policy pays out after the coverage period ends.
4. Open data matches supply and demand. “We’ve studied a problem we’ve called ‘education to employment,'” said Chui. The problem, in short: Students don’t know what skills are needed to acquire a job and employers don’t know what skills potential employees have. In fact, according to the McKinsey report, “Today, school reputation is often used by many employers as a proxy for a candidate’s skill level.” Open data could help change that. How? The report points to Mozilla’s Open Badges platform as an example: Users earn badges by demonstrating mastery of certain skills, such as proficiency in a programming language. Businesses can study the badges earned by their own employees and seek out others who match the profile.
5. Open data helps to collaborate at scale. “Those of us who are geeks remember the Eric S. Raymond thing about open source: With enough eyeballs, all bugs are shallow,” Chui said. “That’s in code.” Extend that argument to data, and could open data make all insights shallow? “That’s probably not exactly true, but if you have more eyeballs working on data, you’re more likely to get better insights and better analysis,” he said.
For midmarket and small-business CIOs, hearing about the latest technologies or IT strategy trends sometimes amounts to little more than cacophonous buzz. What’s cutting edge in tech is often explained or measured in relation to its effect on the enterprise — and is often well outside a midmarket budget.
One example of this (or so I thought) is three-dimensional printing. It’s getting a lot of buzz these days, especially as the types of objects that can be created move beyond plastic tchotchkes and headline-grabbing 3-D-printed guns to things as stunningly complex and useful as artificial limbs and replacement organs for humans.
So, naturally, my ears perked up when analyst Daryl Plummer listed 3-D printing among Gartner’s top IT predictions — things expected to be real game-changers in the next few years — at the recent Gartner Symposium/ITxpo. Yes, IT people, this is something you need to pay attention to, Plummer said. For large companies, especially manufacturers and retailers, this is going to change if not everything, then at least a heck of a lot. As Plummer pointed out, when you start printing products, distribution systems change, the software changes, the way the work is done changes. “Just like the industrial revolution, people will begin to change where they live, what jobs they do, what products they produce, and I can guarantee that will affect an enterprise,” he said.
Right, but what about smaller companies? Well, Plummer hit right on my thought that 3-D printing sounds like something for really big companies with really big bucks to worry about. And for the most part right now, it is. But like every other way-out-there technology, prices are coming down. While high-end models (the sort you’d use to print an organ, for example) run well into the tens of thousands of dollars, consumer versions — albeit obviously smaller, slower and less complex — can now be had for less than $3,000. And that price point is why CEOs and CIOs of smaller businesses need to start paying attention to this technology, even if they never plan to print so much as a plastic figurine.
As Plummer put it, technically, 3-D printing is “simply an additive process for building up physical layers, based on a digital template.” That’s technically speaking. Let loose upon the world, Plummer opined, it will also most likely be a really easy way to counterfeit physical goods. As such, it could be a really easy way to cause big-time damages.
“What about the small business that creates little artistic designs? Maybe a bowl or cup or some kind of wall hanging, and they have a digital template for that and it’s stolen and now people are printing out their own?” Plummer said. “Those people are being affected directly and immediately by the loss of business they could otherwise have had.”
How fast does he think people will start stealing ideas and using 3-D printing to create business-busting fakes? Pretty fast. The U.S. already sees about $300 billion in stolen intellectual property each year. Gartner predicts that by 2018, globally, 3-D printing alone will account for $100 billion per year in stolen intellectual property.
It will become easy to steal an entire business, Plummer stressed — not just a product, but a whole business. “So 3-D printing becomes a risk point for us on intellectual property,” he said. This means CEOs and CIOs, especially those in manufacturing and retail, need to start thinking now about how they plan to protect intellectual property. The designs of bowls and decorative wall hangings were one example, but others include things like car parts and all the little things that make up the bigger things in products all around us.
It sounds pretty doom and gloom, but preparing now will go a long way later. And lest we get too pessimistic about how 3-D printing is just going to complicate lives, remember that it can save lives too. As Plummer summed it up: The threats are real, but so are the opportunities.
So how are you preparing for the 3-D printing revolution? Leave a comment and let us know.
This report is by Bianca Rawson, a fourth-year student at Northeastern University in Boston, Mass., and an editorial assistant with SearchCIO-Midmarket.com.
Big data presenters at MIT’s EmTech 2013 conference expressed great enthusiasm about the power and evolution of big data — but they were also forceful in their assertion that it needs to be used cautiously.
“We’ve said that this year, the next frontier of big data will be the individual,” said Jason Pontin, editor of the MIT Technology Review, who moderated the kick-off big data discussion at the Oct. 9-11 conference, which focuses on emerging technologies. He noted that, in areas such as health care and data accessibility, big data is becoming highly personalized.
Panelists took Pontin’s sentiments a step further: They said that we’re already there, and, while extremely useful, “highly personalized” data should raise some concerns.
Kate Crawford, a principal researcher at Microsoft Research, was the main voice in identifying those big data concerns and clarifying why individuals, and the companies they work for, need to be aware that personal data is everywhere. Crawford specifically focused on some of the biggest big data myths: the myths of objectivity, data discrimination and the end of anonymity.
Pointing to Google Flu Trends, Crawford explained that Google, treating data from mobile, Twitter and Internet users as an objective basis for predicting flu patterns, forecast twice as many flu cases as the Centers for Disease Control and Prevention actually recorded for the year. How did Google get it so wrong? Crawford claims it comes back to the myth of objectivity, wherein “[data] requires an enormous amount of care and thinking in terms of how we use it.”
Drawing on a big data study conducted by the University of Cambridge and intended to raise awareness about data discrimination, Crawford said that solely by studying Facebook “likes,” researchers were able to predict somebody’s sexuality, ethnic background, religious beliefs and physical as well as mental health. Not only that, but they did so with incredible accuracy.
In the wrong hands, that data can be a powerful deal maker or deal breaker. If that information were accessed by a bank or a landlord, an individual with what’s considered an undesirable data profile might never see a loan offer or apartment agreement, Crawford argued.
In a technological age in which it takes 12 points on a fingerprint to identify an individual but only four pieces of cell phone data to do the same, organizations big and small must establish strong data safeguards and policies. Individuals, meanwhile, should have the freedom to “opt out,” rather than handing an organization complete freedom to track them and draw conclusions that won’t always be correct.
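The “four pieces of cell phone data” finding reflects a general property of quasi-identifiers: each additional attribute shrinks the crowd a person can hide in. Here is a minimal sketch using a hypothetical population (the attributes, value ranges and population size are illustrative inventions, not the study’s data):

```python
import random
from collections import Counter

random.seed(7)

# A hypothetical population described by a few coarse attributes.
# No single field identifies anyone on its own.
ZIP_CODES = [f"02{n:03d}" for n in range(100)]   # 100 possible ZIP codes
BIRTH_YEARS = list(range(1950, 2000))            # 50 possible birth years
GENDERS = ["F", "M"]

population = [
    (random.choice(ZIP_CODES), random.choice(BIRTH_YEARS), random.choice(GENDERS))
    for _ in range(1_000)
]

def unique_fraction(records, num_attributes):
    """Fraction of people uniquely pinned down by the first N attributes."""
    counts = Counter(rec[:num_attributes] for rec in records)
    return sum(1 for rec in records if counts[rec[:num_attributes]] == 1) / len(records)

for n in range(1, 4):
    print(f"attributes used: {n}, uniquely identifiable: {unique_fraction(population, n):.0%}")
```

Even with only three coarse attributes, most of this toy population becomes uniquely identifiable, which is why combining just a few data points, as in the cell phone study, erodes anonymity so quickly.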
You may think we’ll never get to that point, but it’s already happening — just look at how the European principality of Andorra is handling its big data. Alas, the end of anonymity is already here.
From driverless cars to the new old age to the end of privacy, there doesn’t seem to be a cultural hot button Eric Schmidt, the executive chairman of Google Inc., can’t opine on. He was one of a handful of “mastermind keynotes” at the 2013 Gartner Symposium/ITxpo, but his talk wasn’t all philosophy. When asked what he’s learned in the last couple of years about the needs of the enterprise, here’s how he began his response:
“Let’s talk about incumbents versus brand new startups. If you’re starting a new company, literally as a founder or a small team, you would never have the traditional IT data center room. What you would do is use cloud-based services, which are Workday, Amazon Web Services, Google, Salesforce, Netsuite. And you would build your company out of that. It looks to me like companies up to on the order of 1,000 people can operate that way with pretty reliable systems that are scaled for the level of operations that sized company would have.”
It’s not surprising to hear newer, smaller companies are taking a more non-traditional IT route compared with older, bigger companies. And yet I couldn’t help wondering if the real nugget for CIOs was in what wasn’t said: Is the division between new companies and established enterprises getting so big that those large, entrenched and relatively inflexible companies are destined to falter, done in by the weight of their legacy technology? What is the tipping point? When exactly should these incumbents, as Schmidt called them, cut their losses with legacy systems and go the way of the upstarts?
Alas, for the larger enterprise, Schmidt didn’t have much advice beyond “start small.” Here’s what he said: “A typical adoption venture for our stuff is either an email replacement because the email systems are inflexible with respect to the existing customer problem … and the other choice is the data sharing model, as in [Google Drive] and so forth. People typically start there and then they expand over time.”
Oh, and another thing Schmidt noted about companies that aren’t tied to the traditional IT model: “In those kinds of companies, you give the employees a budget and say, ‘Go buy your own computer, and connect it into the Internet and off you go.’ It’s that simple of a model.”
The movie Moneyball turned statisticians into superheroes, but the baseball industry, for the most part, still operates on crude analytics. That’s according to Hyoun Park, president and chief research officer of Boston-based Blue Hill Research Inc.
“The biggest problem fundamentally in baseball is that you get a lot of consumption of analytics,” said Park, a featured speaker at Predictive Analytics World in Boston last week. “And what you get is data crunching without context.”
Take scouting reports, one of three key types of data used in baseball. That’s where experts try to predict how a 16-year-old will perform in 15 years. They use simple metrics to size up the skills of potential players (and potential multimillion-dollar investments) — sometimes even ranking their ability on a scale of 1 to 10, Park said.
“Not only is that not granular enough to understand what is really happening, you don’t have enough information to create a good predictive model,” he said.
Another baseball data source is the scorecard, a log- or event-based analysis of the game itself. Baseball, Park said, can be boiled down to two real actions: Scoring runs or making outs. For the team up to bat, the goal is to score as many runs as possible before getting three outs. Analyzing scorecard data can provide insight into what’s going to occur based on what’s already happened – and, in some cases, that means casting new light on old thinking.
“One of the most popular strategies in baseball is the sacrifice: the idea that if you have someone on first base, you can move the ball along, get this person to second at the cost of an out,” Park said.
But the data crunching shows something different. It may seem as though a base runner’s chance of making it home increases as he goes from first to second, but sacrificing an out to get him there actually reduces his chances of scoring a run. “It’s a very interesting [example],” Park said. “We don’t often think of predictive analytics by looking at the positive and negative correlations together at the same time.”
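Park’s sacrifice-bunt point can be sanity-checked against a run-expectancy table. The two values below are typical league-average numbers (they vary by season and are used here purely for illustration, not as figures from his talk):

```python
# Expected runs scored in the remainder of an inning, by base-out state.
# Real tables are computed from seasons of play-by-play data; these two
# entries are typical modern league-average values, used for illustration.
RUN_EXPECTANCY = {
    ("runner on 1st", 0): 0.86,  # before the bunt: runner on first, no outs
    ("runner on 2nd", 1): 0.66,  # after a "successful" bunt: runner on second, one out
}

before = RUN_EXPECTANCY[("runner on 1st", 0)]
after = RUN_EXPECTANCY[("runner on 2nd", 1)]

# Trading an out for a base moves the runner closer to home but lowers
# the expected runs for the rest of the inning.
print(f"expected runs before bunt: {before:.2f}")
print(f"expected runs after bunt:  {after:.2f}")
print(f"change: {after - before:+.2f}")
```

The positive effect (a runner in scoring position) and the negative effect (one fewer out to work with) have to be weighed together, which is exactly the point Park was making.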
The box score
Another metric relic is the box score, a dashboard that hasn’t changed for years. “It represents a very consistent data point for high-level baseball activity,” Park said.
The tough thing about data crunching the box score? It provides a good understanding of what’s happening in a particular game, but the view is too narrow for prediction. It doesn’t, for example, consider how external factors, such as weather, could impact a team or how variability impacts the data.
“You’ll find that a lot of teams try to make decisions based on predictive modeling for a single game,” Park said. “That doesn’t really work well because … a small sample size leads to a high potential of variability.”
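That small-sample caution is easy to demonstrate with a quick simulation. The sketch below (a hypothetical .300 hitter, not data from Park’s talk) compares the spread of batting averages measured over single games versus full seasons:

```python
import random
import statistics

random.seed(42)

TRUE_AVG = 0.300  # the hitter's "true" talent level

def simulated_average(at_bats: int) -> float:
    """Batting average over a simulated stretch of at-bats."""
    hits = sum(random.random() < TRUE_AVG for _ in range(at_bats))
    return hits / at_bats

# Measure the same hitter over 1,000 single games (4 at-bats each)
# and 1,000 full seasons (500 at-bats each).
game_avgs = [simulated_average(4) for _ in range(1_000)]
season_avgs = [simulated_average(500) for _ in range(1_000)]

print(f"spread (std dev) over single games: {statistics.stdev(game_avgs):.3f}")
print(f"spread (std dev) over full seasons: {statistics.stdev(season_avgs):.3f}")
```

A single game tells you almost nothing: the same hitter can look like a .000 or a .750 batter over four at-bats, while season-long averages cluster tightly around the true .300.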
Why it matters to your business
It’s these kinds of lessons that can be applied to businesses, said Park, whose aim is to parlay these insights into better business analytics. The metrics many companies use to evaluate employees are on par with (or inferior to) baseball scouting reports, and such simplistic ranking systems aren’t limited to employee evaluations; they’re also used to measure customer satisfaction and even product development, Park said. To really add value to the business, he advises (re)building predictive models that use rich, granular data.
Sales departments provide another example. The prevailing mindset there is to push forward, sometimes without studying the data and looking at how current events might impact what happens in the future. That means sales teams can fall into the trap of seeing any progress as good progress while ignoring other risk factors (*ahem* sacrifice bunt) that may mean losing the sale altogether.
Finally, small sample sizes can lead to analyzing data in a vacuum. Without considering context, both external and internal, businesses end up crunching numbers that yield only limited insight.
If it wasn’t obvious before, it’s definitely crystal-clear now: We’re engulfed in a worldwide march toward mobile. For the previously uninitiated, this realization comes in the aftermath of the most recent iPhone unveiling at the Sept. 10, 2013, Apple Special Event, where the tech giant announced a more “affordable” smartphone encased in a plastic shell. Android and Windows phone manufacturers have been making more reasonably priced smartphones for quite some time, so few analysts were surprised by Apple’s announcement. In fact, Forbes reports “met expectations” as a recurring phrase in a number of analysts’ reports.
The rumor mill started to roar before Apple’s June 2013 Worldwide Developers Conference, when tech blogs started hearing about a “cheap” iPhone. At last week’s special event, Apple announced the new iPhone 5s, an evolution of the iPhone 5 released last year, and a completely new model, the iPhone 5c (‘c’ for ‘cheap’?). While the iPhone 5c costs $100 less than the new iPhone 5s, its still-steep price tag — $99 on-contract, $549 off contract for the 16GB model — isn’t an accessible buy for a solid chunk of the population, both in the U.S. and in target markets abroad.
Apple’s decision to create a plastic-clad, more affordable smartphone speaks volumes about the direction mobile is going — both among consumers and in SMBs. What does this march toward affordable mobile mean for business, exactly? With smartphones becoming more accessible, bring your own device (BYOD) will likely ramp up from here. When employees have amazing devices capable of handling personal and work matters, they’re going to want to use them — both on company time and, perhaps, on the company’s dime.
If small-business CIOs are not crafting reasonable, effective BYOD policies now, as the consumerization of mobile rockets forward, then they’re falling behind. The first step is to understand what SMB customers and employees value, and develop policies from there. More often than not, it all comes down to these three things: simplification, accessibility and affordability.
If employees have one awesome device capable of handling both personal and business matters, why would they want to carry an additional company-issued device? Of course, security comes into play the instant you mix business with personal. SMB employees using one device could go from accessing confidential company data to posting a status on Facebook to taking a call from Mom in a 1-minute span. As complicated as that multitasking sounds, the fact that it is all on the same device is a convenience that employees aren’t likely to surrender once they’ve gotten a taste.
Piggybacking off of “simplification,” smartphone users value accessibility. When employees have a personal device capable of so much, the last thing they want are restrictions placed on their activities. What CIO has time to orchestrate that anyway?
Having a smartphone or tablet has revolutionized the way employees are able to work: responding to emails on the go, accessing important information from home and simply being available even when they aren’t in the office. This brings a whole new perspective to how business and IT operate, and adds a layer to mobile device policy considerations.
As mentioned earlier, Apple wasn’t the first company to make an affordable smartphone, but if the Ferrari of smartphone manufacturing is coming off its high horse to make a phone more accessible across demographics, CIOs had best take notice. Smartphones and tablets are becoming more available to those who couldn’t splurge before. According to the Pew Research Center’s Internet & American Life Project, smartphone ownership among U.S. adults increased from 35% in May 2011 to 56% in May 2013. Over the same span, the percentage of U.S. adults with a non-smart cell phone dropped from 48% to 35%. Finally, the percentage of those without a cellphone fell from 17% to 9% over the two-year span.
Other findings from this study suggest that smartphone ownership increases as yearly income increases. For adults in the 30-to-49 age range, 47% of those making less than $30,000 own smartphones. The numbers increase from there — 68% of those with annual incomes between $30,000 and $75,000 own smartphones, and 86% of those making more than $75,000 do. With Apple’s introduction of the iPhone 5c, those numbers are only likely to increase.
Developing sound but flexible BYOD policies is a must. Other emerging bring-your-own movements to consider when developing a BYO policy: bring your own apps (BYOA), bring your own cloud (BYOC) and bring your own network (BYON). All in all, “bring your own” is hot, and SMB CIOs must update policies as smartphones suffuse all demographics.
What do you think will come of this iPhone announcement in the SMB sphere? Will employees insist on BYOD? Will consumers start ditching expensive smartphones for cheaper ones? Can businesses afford to give employees company-issued smartphones? Sound off in the comment section and let us know what you think will be the biggest business-related side effect of making smartphones more widely accessible.
If you missed Thomas Friedman’s recent column on the “sharing economy,” it’s worth going back and taking a look. Friedman documents how a spark dubbed “airbed and breakfast” exploded into what most of us now refer to as Airbnb Inc., an online platform that matches travelers looking for short-term rentals with those who have the space to rent out.
Friedman calls this the “sharing economy”; others have dubbed it “collaborative consumption.” Regardless of what you call it, it’s weaving its way into just about every facet of life: from the home to the car to the office to the very clothes you’re wearing.
“There’s this sea change and mind shift happening around the American dream,” Brian Harrington, chief marketing officer for Zipcar Inc., said at a recent smarter cities event. “It used to be ownership, but now it’s access and convenience.”
Could the IT department be next? Let’s be clear: The concept behind platforms and services like these isn’t new. eBay, Craigslist and even Zipcar, a car-sharing company that got its start in Boston, Mass., have all been around for more than a decade. But in the last few years there’s been an uptick in resources offered through peer-to-peer platforms, which some have argued is directly related to the downturn of the economy.
The willingness to embrace a rent-rather-than-own mentality could be a real opportunity for CIOs and IT departments, according to Deloitte Consulting. Deloitte suggests CIOs think about collaborative consumption “in terms of ‘excess capacity.'” That can be computers, server space, unused office space — whatever’s on hand.
And then Deloitte says this: Enterprises can take advantage of collaborative consumption in two primary ways: They can generate revenue by selling their own excess capacity (directly to consumers or to other businesses), or they can save money by buying or renting another organization’s excess capacity at a lower cost.
Could the sharing economy come to IT departments? And if the answer is yes, would it really make a difference by providing additional revenue or cost savings? Leave a comment and weigh in.