Facebook image via Shutterstock
What are the future prospects for Facebook at Work? Find out in this week’s roundup.
1. CIOs’ response to Facebook at Work? Never say never – Kristen Lee and Linda Tucci (SearchCIO)
Facebook is trying to make its way into the enterprise with a new collaboration tool called Facebook at Work. But will it work? CIOs and enterprise collaboration experts weigh in.
2. IT pros disappointed in Microsoft response to Azure outage – Ed Scannell (SearchCloudComputing)
Microsoft’s slow response to the recent Azure outage left some users wondering if they should entrust critical business data to the cloud environment.
3. Microsoft Skype for Business to replace Lync – Gina Narcisi (SearchUnifiedCommunications)
Lync is being rebranded as Microsoft Skype for Business, helping to fuse enterprise UC needs with the usability employees want.
4. Amazon bolsters AWS security, adds encryption key management – Rob Wright (SearchCloudSecurity)
Newly announced Amazon Web Services security features include an encryption key management service intended to boost cloud security and strengthen appeal of AWS to enterprises.
5. Seagate bolsters ClusterStor HPC storage systems for Hadoop, Lustre – Carol Sliwa (SearchStorage)
Seagate beefs up ClusterStor storage systems acquired from Xyratex, with new Hadoop optimization tools, Lustre update and secure data appliances.
Patches image via Shutterstock
Why did Microsoft hold off on an Exchange update? Find out in this week’s roundup.
1. Microsoft delivers hefty batch of patches – Toni Boger and Jeremy Stanley (SearchWindowsServer)
Microsoft patched two zero-day vulnerabilities in the year’s largest Patch Tuesday update, but it delayed an Exchange update.
2. Juniper CEO shuffle creates uncertainty, excitement – Shamus McGillicuddy (SearchNetworking)
Leadership problems and troubled negotiations with an unnamed customer spurred the resignation of Juniper CEO Shaygan Kheradpir.
3. U.S. Postal Service latest government target for cyber attack – Warwick Ashford (ComputerWeekly)
Hackers breach the networks of the U.S. Postal Service in the latest of a series of attacks on US government agencies.
4. Startup Cazena talks up big data in the cloud at AWS re:Invent – Scot Petersen (SearchDataManagement)
Cazena, a stealth-mode startup founded by former Netezza executives, is looking to solve the pain points that many enterprises have with managing and analyzing big data in the cloud.
5. IT pros consider KACE systems management as a service – Diana Hwang (SearchEnterpriseDesktop)
Dell Software is investing in the cloud and IT pros weigh in on KACE as a service-based system management offering.
Wearable technology image via Shutterstock
By James Kobielus (@jameskobielus)
Behavior is something we usually measure and analyze at the personal level. By that, I mean we tend to measure whether such-and-such a person went here, said that, and did that. In doing so, we almost always abstract away the finer-grained intrapersonal behaviors involved in all of that. We rarely measure the specific behaviors of the person’s legs, arms, hands, torso, face, tongue, eyes, brain, and other organs that made it possible for them to do all that.
We abstract away these lower-level details because they are usually irrelevant to behavioral analyses we are performing. For example, the specific sequence of movements of your customers’ hands across their smartphones’ touchscreen applications is immaterial if you’re simply trying to determine the circumstances under which they’ll click your “buy” button. By the same token, the specific accent that inflects how they speak the word “buy” into your voice-recognition application has no bearing on their decision to do so.
However, as the Internet of Things (IoT) pushes more deeply into our lives, we’ll start to rethink these assumptions. IoT-enabled wearable devices will incorporate interfaces that respond to inputs that are primarily tactile, gestural, ocular, muscular, motion-sensitive, voice-activated, and brainwave-triggered in nature. To keep pace with innovations in wearable devices, IoT behavioral analytics will need to enable the user experience to predictively morph in keeping with people’s changing circumstances and intentions.
In that regard, I recommend this recent article on innovations in cognitive-computing technology that can predict how people pose, move, and gesture in various activities. As author Derrick Harris notes, these analytics have the potential to improve gesture-recognition capabilities built into wearable devices, and also to enable better simulation of real human behavior in computer animations. The deep-learning algorithms that have been developed can accurately predict the positions of people’s arms, legs, joints and general body alignment in various activities. These advances, according to Harris, “could lead to better gesture-based controls for interactive displays, more-accurate markerless (i.e., no sensors stuck to people’s bodies) motion-capture systems, and robots (or other computers) that can infer actions as well as identify objects.”
Conceivably, predictive behavioral analytics of this sort might be used in IoT wearables to drive more fine-tuned gestural interfaces. Wearables, through embedded and/or cloud-based algorithms, could anticipate what the wearer will do or intend next. Behavioral predictions could seamlessly guide the wearer toward those ends by, for example, adjusting the gestural, tactile, visual, or auditory device interfaces in real time.
Clearly, immersively wearable user experiences are just around the corner, and cognitive-computing algorithms will tailor them to our individual physiologies like a virtual, dynamic epidermis.
Dell image via Shutterstock
Is the IT industry impressed by Dell’s latest tablet? Tune into this week’s roundup to find out.
1. Dell tablet, security efforts result in new 2-in-1 device – Diana Hwang (SearchEnterpriseDesktop)
The latest Dell tablet includes security features that impressed IT industry watchers, but Dell’s product integration poses challenges.
2. Infosec services firms Accuvant and FishNet to merge – Brandan Blevins (SearchSecurity)
The union of fierce rivals Accuvant and FishNet promises to combine two of the largest vendors offering information security services and consulting in the U.S.
3. Tintri channel chief: Cloud transition demands partners spend money – Spencer Smith (SearchCloudProvider)
Partners must develop a balanced portfolio between on-premises and cloud-based offerings and endure the recurring revenue model’s cash flow trough, says Tintri’s Americas channel chief.
4. The real Cisco OpenStack story begins with policy control – Rivka Gewirtz Little (SearchSDN)
Cisco has seeded its Group-Based Policy language into the new OpenStack Juno release, saying it will drive a standardized policy abstraction layer that works across multivendor networks.
5. Microsoft gains on Amazon as IT pros weigh AWS vs. Azure – Beth Pariseau (SearchAWS)
AWS remains the 800-pound gorilla in cloud, but Microsoft Azure has something it doesn’t: a preexisting relationship with enterprises.
Windows Server image via Shutterstock
Are Docker and Microsoft a match made in heaven? Find out in this week’s roundup.
1. Microsoft tries Docker containers on for size – Ed Scannell (SearchWindowsServer)
Continuing its commitment to not lock users into Windows-only products, Microsoft will build support for Docker containers into Windows Server next year.
2. Google adds security and flexibility to latest Android mobile OS – Warwick Ashford (ComputerWeekly)
Google is to begin rolling out the latest, most enterprise-friendly version of its Android mobile operating system.
3. Dropbox hack denied, but company encourages 2FA use anyway – Rob Wright (SearchCloudSecurity)
Dropbox denied reports that a hacker had obtained 6.9 million customer usernames and passwords from the cloud storage service, but encouraged customers to use its 2FA security feature regardless.
4. Experts say Symantec break-up makes sense for storage – Garry Kranz (SearchDataBackup)
The Symantec breakup came about because executives realized its storage and security products were sold to different sets of customers, storage experts say.
5. AWS and Google stage showdown in big data cloud services – Beth Pariseau (SearchAWS)
Big data in the cloud is more than just MapReduce. In a growing market for big data, capturing the next big innovation is key.
Verizon image via Shutterstock
Should Verizon focus more on its private cloud? Tune into this week’s roundup to find out.
1. Verizon changes course, focuses on private cloud service – Trevor Jones (SearchCloudComputing)
Verizon has turned its focus away from IaaS and toward private cloud for enterprise customers, but analysts are still advising caution.
2. CIOs react to HP split: What took so long? – Linda Tucci (SearchCIO)
Can the HP breakup deliver enterprise innovation and better service? CIOs say the Silicon Valley legend has nothing to lose by trying.
3. Analysis: Symantec split was a long time coming – Brandan Blevins (SearchSecurity)
The long-anticipated Symantec split will leave one company focused entirely on security, but experts caution that it’s just the first step in fixing the many problems in Big Yellow’s product lines.
4. Next round of HIPAA audits nears horizon – Shaun Sutner (SearchHealthIT)
Following last year’s pilot, the next round of HIPAA health data security audits is expected to start, behind schedule, in early 2015.
5. Eight big data myths that need busting – Nicole Laskowski (SearchCIO)
Can CIOs make big data the new normal by 2020? It starts with helping their companies distinguish big data facts from big data fiction, says Gartner analyst Mark Beyer.
Microsoft Windows image via Shutterstock
With all the hype surrounding Windows 10, will enterprises be impressed? Find out in this week’s roundup.
1. Is Windows 10 enterprise-ready? IT’s not convinced – Diana Hwang (SearchEnterpriseDesktop)
Microsoft previews Windows 10 with more enterprise controls but IT pros need more details before considering a migration.
2. Rackspace tackles bug with full Xen reboot – Trevor Jones (SearchCloudComputing)
Rackspace went a step further than Amazon with its Xen reboot, taking down its entire public cloud region by region to address the bug.
3. JP Morgan breach affects 7 million small businesses – Warwick Ashford (ComputerWeekly)
US bank JPMorgan Chase says a data breach in August affected up to 76 million households and seven million small businesses.
4. Cloud security experts call for global data privacy standards – Rob Wright (SearchCloudSecurity)
A recent study from the Cloud Security Alliance shows strong support for global data privacy standards as well as a consumer bill of rights, but there are major obstacles for privacy in the cloud.
5. Red Hat Storage Server 3 update brings capacity, analytics features – Carol Sliwa (SearchStorage)
Red Hat’s new Storage Server 3 software update adds support for snapshots and hot-button capabilities such as petabyte-scale capacity, SSDs and Hadoop.
Analytics image via Shutterstock
By James Kobielus (@jameskobielus)
Speed isn’t always a value. Faster data is not necessarily better data. If the data whizzes by faster than you can extract value, it’s a waste.
Stream computing is much more than low-latency middleware. Its value-added applications are several. It supports high-throughput filtering and analysis across disparate data streams. It delivers real-time updates to consuming applications. It enables rich query of high-velocity data. And it provides continuous updates of pre-processed intelligence to downstream repositories, ranging from small databases to big-data clusters.
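To make the filter-and-aggregate pattern concrete, here is a minimal, hypothetical sketch in Python using plain generators. It is not tied to InfoSphere Streams or any specific product; the sensor names, threshold, and window size are all illustrative assumptions.

```python
from collections import deque

def event_stream():
    # Simulated high-velocity source: (sensor_id, reading) tuples.
    # In a real stream-computing platform this would be a live feed.
    for event in [("s1", 10), ("s2", 99), ("s1", 12), ("s2", 101), ("s1", 11)]:
        yield event

def filter_stream(events, threshold=50):
    # High-throughput filtering: drop readings below the threshold
    # before they ever reach downstream consumers.
    for sensor, value in events:
        if value >= threshold:
            yield sensor, value

def windowed_average(events, window=2):
    # Continuous pre-processing: a rolling average pushed downstream
    # as each new event arrives, rather than computed in batch.
    recent = deque(maxlen=window)
    for sensor, value in events:
        recent.append(value)
        yield sensor, sum(recent) / len(recent)

if __name__ == "__main__":
    for sensor, avg in windowed_average(filter_stream(event_stream())):
        print(sensor, avg)
```

The point of the sketch is the shape of the pipeline: events flow through filtering and aggregation one at a time, so results update continuously instead of waiting for a batch job over data at rest.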
In all of these ways, stream computing is a central component of any comprehensive big-data infrastructure. This recent article does a good job explaining how stream computing platforms, such as IBM InfoSphere Streams, can complement Hadoop, enterprise data warehouses (EDWs), in-memory databases, and other big-data platforms that are optimized for data that spans the latency spectrum from “at-rest” to “in-motion.”
What I found especially interesting was the discussion of “live data marts” that are refreshed by stream computing. Author Kai Wähner describes the concept as one of “provid[ing] end-user, ad-hoc continuous query access to this streaming data that’s aggregated in memory….A live analytics front end slices, dices, and aggregates data dynamically in response to business users’ actions, and all in real time.”
What’s useful about this “live data mart” concept is that it blurs the increasingly arbitrary distinctions between “in-motion” and “in-memory,” on the one hand; “in-motion” and “at-rest,” on the other; and also (if it were possible to have a third hand) “in-motion” and “in-process.” The purpose of stream computing is to drive speedier results through delivery of live intelligence into live business processes. Ideally, every “at-rest” big-data repository, be it an EDW, Hadoop, or whatever, can and should host live data in order to drive live decisions.
Live data marts should live on a converged infrastructure of stream computing, complex event processing, and various real-time-optimized big-data platforms, including the EDW. I’m happy that Wähner picked up on the notion that stream processing can figure into an EDW modernization strategy. I prefer to call this the “live EDW”:
- Using stream computing to filter and reduce EDW storage costs
- Leveraging the structured, unstructured, and streaming data sources required for deep analytics centered on the EDW
- Joining streaming and other unstructured data sources with existing EDW investments
- Delivering improved business insights from the EDW to operations for real-time decision-making
Essentially, the “live EDW” would aggregate at least one streaming source with other lower-latency sources into a conformed, continually refreshed in-memory data structure that drives real-time business processes.
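As a rough illustration of that aggregation idea, the following hypothetical Python sketch conforms a streaming source against a static, EDW-style reference table inside a continually refreshed in-memory view. The customer IDs, tiers, and amounts are invented for the example.

```python
# Static, lower-latency reference data (an EDW-style dimension table).
reference = {"c1": "Gold", "c2": "Silver"}

# The conformed, continually refreshed in-memory "live data mart".
live_view = {}

def apply_event(event):
    # Each streaming event refreshes the in-memory view in place,
    # conformed against the reference dimension, so consuming
    # processes always query up-to-the-moment state.
    customer, amount = event
    tier = reference.get(customer, "Unknown")
    total = live_view.get(customer, {}).get("total", 0) + amount
    live_view[customer] = {"tier": tier, "total": total}

# Simulated streaming source of (customer, amount) events.
for event in [("c1", 100), ("c2", 50), ("c1", 25)]:
    apply_event(event)
```

After the three events, `live_view` holds one conformed record per customer, combining the streamed totals with the static tier attribute: the in-memory structure that a real-time business process would query.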
Security image via Shutterstock
Should you be concerned about the Bash security vulnerability? Find out in this week’s roundup.
1. Attackers already targeting Bash security vulnerability – Brandan Blevins (SearchSecurity)
Exploits are already being written and rewritten for the ‘Shellshock’ Bash security vulnerability, which was announced just days ago, increasing the urgency for enterprises to remediate it quickly.
2. HP SDN app store is open for business with eight OpenFlow apps – Shamus McGillicuddy (SearchSDN)
HP’s marketplace for SDN apps is now open. Download apps from F5, Blue Cat, Kemp and others for HP’s OpenFlow controller.
3. Experts: Expect cloud breaches to endanger data privacy – Rob Wright (SearchCloudSecurity)
Attendees and speakers at the CSA Congress and IAPP Privacy Academy stressed the need for better data classification to reduce the effects of cloud breaches.
4. CloudBees move shows PaaS is no place for the little guy – Trevor Jones (SearchCloudComputing)
CloudBees is the latest small PaaS provider to bow out, leaving enterprise IT questioning the market as larger vendors squeeze out remaining players.
5. NetApp brings out new version of StorageGrid object storage – Dave Raffo (SearchStorage)
NetApp expands its low-profile StorageGrid object storage with a Webscale version to go beyond its healthcare niche.
By Greg Lord (@GregLord11)
Businesses are focused on pursuing the holy grail of higher revenue and lower costs. As they evaluate new technology solutions to help drive this growth, CIOs are changing the way they deploy and manage business-critical applications, striving to leverage the ubiquity and cost efficiencies of the Internet for application delivery. Although every organization’s IT strategy and approach to application delivery varies, the common requirement across all organizations is that end users need fast, reliable, and secure access to all their business applications. This requirement has become increasingly challenging given the complexity of application distribution across multiple data centers, end users located all over the world using various devices, and a growing list of business applications, such as customer relationship management (CRM), collaboration, product lifecycle management (PLM), and support portals, that users rely on every day.
For CIOs to successfully deliver applications to end users within an organization, they need to understand the challenges of managing the Internet connection between an end user’s device and the data center where a particular application is hosted, specifically at the enterprise level. The Internet was not designed to handle the demands and requirements of business use. Given the legacy architecture and logic of the Internet, the selection of routes between end users and data centers is extremely inefficient. Once a route is selected, the transmission of data along it is slow and error-prone. The Internet itself, and large Internet-connected cloud data centers, are prone to congestion and downtime. In addition, mobile devices have different operating systems, browsers, and connection types that introduce further complexity. The Internet offers no inherent web security protection, and it can be very difficult to gain visibility into, or manage and control, applications being delivered over it.
These challenges are what we call “The Enterprise Internet Problem,” which can result in lost revenue due to partner and customer frustration with the poor response times and spotty availability. There are negative impacts on end-user productivity due to long load times as well as data loss vulnerabilities. Frustrated IT organizations struggle to troubleshoot issues and support complex application delivery architectures, let alone find the time to try to optimize the end-user experience.
To begin addressing the Enterprise Internet Problem, organizations typically try one of the following two approaches:
1. Implement a solution that lives within the four walls of the data center – either a physical hardware box or a virtual appliance.
The data center could be an organization’s own data center or the data center of its cloud or hosting provider. Any way you slice it, this approach doesn’t work, because organizations need a symmetrical solution that addresses both ends of the application delivery path, and IT organizations can’t possibly implement a box or virtual appliance in every data center and every end-user location. This approach also introduces additional cost and complexity, because organizations need to purchase, implement, and support these solutions, a challenge that is compounded as applications inevitably move across data centers and cloud environments over time.
2. Continue to invest in maintaining private network infrastructure.
This approach works to a certain extent, in that it helps address Internet performance and reliability issues, but it doesn’t scale because it limits access to applications and restricts organizations from leveraging the cost efficiencies and ubiquity of the Internet.
To solve the Enterprise Internet Problem, organizations need to look at various options, including a possible move to the cloud. Instead of requiring IT organizations to take on the burden of deploying and managing these critical capabilities on their own, cloud-based platforms can provide optimal Internet route selection, connection offload, load balancing, real-time failover, web acceleration, front-end optimization, DDoS mitigation, and web application firewalls that are not constrained within the four walls of a few data centers. Deploying applications across distributed servers and networks can be effective, as it brings end users closer to the applications needed to operate a business.
By understanding and addressing these problems, organizations can position themselves to instantly enter new markets, improve customer interactions, do business via lower-cost online channels, enable end-users to get more done in less time, and realize the holy grail of higher revenue and lower costs.
Greg Lord is the Sr. Product Marketing Manager responsible for Enterprise Solutions, including Enterprise Application Delivery and Cloud Solutions, at Akamai Technologies. Before joining Akamai, Greg held several Enterprise Sales and Marketing roles at Intel Corporation, including leading Cloud & Data Center Marketing for Intel’s Americas business. Prior to Intel, Greg was an IT Manager at both Reebok and Partners Healthcare. Greg is a certified Project Manager (PMP), holds an undergraduate degree in Computer Information Systems from Bentley University, and earned his MBA from the University of Notre Dame.