How far along is your organization in its preparation for GDPR? Find out why your organization should speed up the process in this week’s roundup.
1. Importance of GDPR preparation highlighted by Equifax breach – Sonia Lelii (SearchDataBackup)
While doing preparation work for GDPR, organizations should look at the Equifax breach and understand they would have to notify customers of a problem much sooner.
2. Apache Struts vulnerability blamed for Equifax data breach – Michael Heller (SearchSecurity)
Equifax has confirmed an unpatched critical Apache Struts vulnerability was exploited in the breach that compromised the personal data of 143 million U.S. citizens.
3. Healthcare quality goals set for telehealth, interoperability – Kristen Lee (SearchHealthIT)
In a media briefing, NQF experts and telehealth committee members discuss work being done to develop quality measures for telehealth and interoperability.
4. The dos and don’ts of brand content marketing – Jesse Scardina (SearchContentManagement)
As consumers get savvier about avoiding traditional advertising, companies build content libraries to establish themselves as trusted providers of information.
5. DHCP server exploit highlights September Patch Tuesday – Dan Cagen (SearchWindowsServer)
Microsoft patched 76 vulnerabilities on September Patch Tuesday, but Windows Server administrators only have a few critical updates that require immediate attention.
What do you expect the biggest news to be from this year’s Microsoft Ignite conference? Check out what’s on tap in this week’s roundup.
1. On tap for Microsoft Ignite conference: Adobe, Dynamics 365, LinkedIn – Jesse Scardina (SearchCRM)
New Adobe integrations and further integrations of existing products are possible announcements from Microsoft Ignite 2017.
2. SHA-1 hashes recovered for 320M breached passwords – Michael Heller (SearchSecurity)
Security researchers once again proved how easy it can be to recover SHA-1 hashes by cracking the hashes on nearly 320 million passwords related to data breaches.
3. CEO sees clouds lifting for Cloudian object storage – Carol Sliwa (SearchCloudStorage)
Cloudian CEO Michael Tso notes favorable trends for object storage: Capacities are increasing to petabytes, and customers want to use different clouds for different workloads.
4. Windows DevOps shops quickly gain on Linux counterparts – Beth Pariseau (SearchITOperations)
Early adopters of DevOps embrace open source software, but enterprise Microsoft shops have made strides in 2017, as have Microsoft’s DevOps products.
5. Intermedia expands UCaaS platform with AnyMeeting acquisition – Katherine Finnell (SearchUnifiedCommunications)
In UC news, Intermedia gains web conferencing and webinar technology with its AnyMeeting acquisition, while VOSS adds features to its UC analytics service.
What do you think of the Intel Management Engine kill switch? Check out its connection to a certain NSA program in this week’s roundup.
1. Intel kill switch code indicates connection to NSA – Michael Heller (SearchSecurity)
Researchers discovered an Intel kill switch hiding in one of the chip maker’s software products along with references to an NSA program focused on secure computing.
2. CloudBees updates UX with new version of Jenkins Enterprise – Darryl Taft (SearchCloudApplications)
CloudBees delivers a new user experience based on its Blue Ocean open source project as part of the latest version of the company’s Jenkins Enterprise offering.
3. Salesforce unveils image recognition technology for Social Studio – Jesse Scardina (SearchSalesforce)
Salesforce’s Einstein Vision for Social Studio currently only works with Twitter, but other integrations are expected.
4. Tegile becomes the latest Western Digital acquisition – Garry Kranz (SearchSolidStateStorage)
Western Digital’s acquisition spree continues with the pickup of flash vendor Tegile Systems. The drive-maker says Tegile storage complements its ActiveScale object storage.
5. VMware gets in tune with Office 365 mobile app management – Colin Steele (SearchMobileComputing)
VMware’s support for the Microsoft Graph API for Intune means Workspace One users can manage Office 365 mobile apps directly, without using a separate console.
Which cloud infrastructure provider is best? Check out Gartner’s rankings in this week’s roundup.
1. AWS, Azure tie for top spot in 2017 Gartner ranking – Jason Sparapani (SearchCIO)
The stellar scores of the top two cloud providers in this year’s Gartner ranking reflect tremendous innovation, says analyst Elias Khnaser.
2. Oracle Blockchain Cloud Service set to be unveiled – Adam Hughes (SearchOracle)
Following competitors Microsoft Azure and IBM, Oracle plans to release its Blockchain Cloud Service to give its cloud customers a higher level of security.
3. Cisco-Springpath buy carries intellectual property control – Antone Gonsalves (SearchNetworking)
The Cisco-Springpath acquisition places crucial HyperFlex technology under the networking vendor’s control.
4. iPhone Secure Enclave firmware encryption key leaked – Michael Heller (SearchSecurity)
Experts and Apple say despite the leak of the iPhone Secure Enclave Processor encryption key that can be used to decrypt firmware code, user data and biometric information are still safe.
5. IBM Spectrum Protect Plus tackles VM data protection – Sonia Lelii (SearchDataBackup)
The new IBM Spectrum Protect Plus takes another shot at virtual server data protection, providing backup and recovery for VMware vSphere and Microsoft Hyper-V.
Who wins: Private or public cloud? Check out what many IT leaders think in this week’s roundup.
1. Public vs. private cloud cost comparison finds enterprise winners – Robert Gates (SearchDataCenter)
Nearly half of enterprises do not save money with the public cloud, and say they have figured out how to beat public cloud costs with their own private cloud.
2. Cisco revenues fall, likely to go lower – Antone Gonsalves (SearchNetworking)
Top-line Cisco revenues fell in the July quarter, as the company’s legacy switching business continued to weaken. Cisco could soon reach two straight years of revenue declines.
3. NotPetya ransomware impact costs Maersk hundreds of millions – Michael Heller (SearchSecurity)
Danish shipping giant A.P. Moller-Maersk said the NotPetya ransomware attacks severely damaged business processes and the impact has been estimated at as much as $300 million in lost revenue.
4. PC market decline coming to an end – Priyanka Kethar (SearchEnterpriseDesktop)
The PC market decline has encouraged vendors to innovate in order to survive. Thanks to this and several other factors, the decline is expected to slow markedly.
5. IBM cracks the code for speeding up its deep learning platform – Ed Burns (SearchBusinessAnalytics)
Using more GPUs when training deep learning models doesn’t always deliver faster results, but new software from IBM shows it can be done.
By James Kobielus (@jameskobielus)
Deep learning (DL) feels monolithic. This branch of artificial intelligence (AI) routinely achieves amazing results in computer vision, speech recognition, natural language processing, and many other applications. However, it does so by leveraging architectures that are deeply hierarchical, massively parallel, and intricately neural.
The essence of a monolithic stack is that its processing nodes are tightly coupled. The core criteria for determining whether any architecture is tightly coupled include whether processing nodes have direct physical interconnections, engage in synchronous communication amongst themselves, process models that have strong typing and complex object trees, execute process-logic control centrally, bind services in static fashion, and have strong platform and language dependencies.
Most of those criteria apply in spades to today’s DL processing architectures. GPUs, the core hardware substrate, support incredibly scalable, efficient, and fast processing of DL models in a single tightly coupled server or cluster. However, they’re not geared to distributing the layers and nodes of a DL model as microservices across a cloud-native computing fabric, nor are they optimized for loose coupling of feedforward, backpropagation, and other neural-net internode communications.
Almost every technical discussion of DL architectures, such as this recent blog on computer-vision architectures, proceeds from the assumption that the convolutional, pooling, encoding, inception, residual, discriminator, and other layers run on a tightly coupled, high-performance single-node hardware platform (based on GPUs, CPUs, FPGAs, and other technologies). As a proof point, check out this article from earlier this year in which I discuss how DL models’ fast matrix manipulations are still largely executed on GPU-based single-node co-processors, while much of the heavy lifting of DL model training takes place in multi-node Spark clusters that are horizontally scalable, CPU-based, and in-memory.
Acutely aware of DL execution’s traditional orientation toward single-node architecture, I took great interest in IBM Research’s recent announcement of its Distributed Deep Learning (DDL) technology. The DDL software library, which is still in technical preview, enables a DL neural-net model to execute transparently across distributed environments that consist of up to 64 IBM PowerAI servers. In IBM’s implementation, the PowerAI cluster is configured with an aggregate processing capacity of 256 NVIDIA GPUs. DDL provides an API that enables TensorFlow, Caffe, Torch, and Chainer developers to scale out DL model execution across PowerAI clusters for accelerated training and other functions. Under the covers, DDL-enabled apps and PowerAI clusters use a “multiring” algorithm that dynamically optimizes cross-node network paths to automatically balance latency and bandwidth utilization across a distributed DL cluster.
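IBM has not published the internals of DDL’s “multiring” algorithm, but the basic ring pattern that such schemes build on is well known. The sketch below is my own minimal, single-ring all-reduce in plain Python: each worker’s gradient is split into one chunk per worker, a reduce-scatter phase circulates and sums the chunks, and an all-gather phase circulates the finished sums, spreading bandwidth evenly around the ring.

```python
# Minimal single-ring all-reduce sketch (illustrative only, not IBM's DDL).
# After 2*(n-1) steps, every worker holds the full elementwise gradient sum.

def ring_allreduce(grads):
    """grads: one gradient list per worker, all the same length.
    Mutates the lists in place so every worker ends with the summed gradient."""
    n = len(grads)
    size = len(grads[0])
    bounds = [i * size // n for i in range(n + 1)]  # near-equal chunk edges

    def recv_add(w, src, c):
        # Worker w receives chunk c from its ring neighbor and accumulates.
        for i in range(bounds[c], bounds[c + 1]):
            grads[w][i] += grads[src][i]

    def recv_copy(w, src, c):
        # Worker w receives a fully reduced chunk c and overwrites its own.
        grads[w][bounds[c]:bounds[c + 1]] = grads[src][bounds[c]:bounds[c + 1]]

    for s in range(n - 1):          # reduce-scatter: build partial sums
        for w in range(n):
            recv_add(w, (w - 1) % n, (w - s - 1) % n)
    for s in range(n - 1):          # all-gather: distribute finished sums
        for w in range(n):
            recv_copy(w, (w - 1) % n, (w - s) % n)
    return grads
```

Because every worker sends and receives one chunk per step, no single node becomes a bandwidth hot spot, which is the property a multiring variant would presumably tune further across heterogeneous network paths.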
IBM’s announcement represents an exciting milestone in the decoupling of the DL ecosystem from single-node execution architectures. However, it only scales out the horizontal execution of entire models, across all layers and neural nodes. It does not decouple the execution of processing layers within any specific DL neural-net model. For that latter capability, I recommend the work that Facebook is doing in the decoupling of Caffe2 model execution across multi-node GPU server clusters. In particular, Facebook is implementing decoupling in two areas of its multi-node DL architecture:
- Decoupling cross-layer dependencies in the computation of gradients when executing a common DL model: As Facebook researchers state it in this recent study, “In order to scale beyond the 8 GPUs in a single Big Basin server, gradient aggregation has to span across servers on a network. To allow for near perfect linear scaling, the aggregation must be performed in parallel with backprop[agation]. This is possible because there is no data dependency between gradients across layers. Therefore, as soon as the gradient for a layer is computed, it is aggregated across workers, while gradient computation for the next layer continues.”
- Decoupling cross-thread dependencies in the parallelized execution of various subgraphs within a common DL model: As the paper states, “Caffe2 supports multi-threaded execution of the compute graph that represents a training iteration. Whenever there is no data dependency between subgraphs, multiple threads can execute those subgraphs in parallel.”
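The overlap described in the first bullet can be caricatured in a few lines: as soon as a layer’s gradient is computed, its cross-worker aggregation is handed off to run in the background while backprop proceeds to the next layer. This is a toy stand-in of mine (real systems use NCCL or MPI collectives, not a Python thread pool):

```python
# Toy sketch of overlapping gradient aggregation with backpropagation.
# Because layer L's gradient has no data dependency on layer L-1's,
# its cross-worker aggregation can run while backprop continues.
from concurrent.futures import ThreadPoolExecutor

def backprop_with_overlap(layer_grads_per_worker):
    """layer_grads_per_worker: one entry per layer (top of the net first),
    each a list of that layer's per-worker gradient values to be summed."""
    pool = ThreadPoolExecutor(max_workers=4)
    pending = []
    for grads in layer_grads_per_worker:
        # "Compute" this layer's gradient, hand its aggregation to the
        # pool, and immediately continue down to the next layer.
        pending.append(pool.submit(sum, grads))
    aggregated = [f.result() for f in pending]  # sync before optimizer step
    pool.shutdown()
    return aggregated
```

The only synchronization point is at the end, before the optimizer step, which is what allows the near-linear scaling the Facebook researchers describe.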
As modularized decoupling of the AI ecosystem proceeds, the ability to farm out execution of entire DL models, components thereof, and/or inter-component communications will become critical for scaling and acceleration. As I discussed in this recent Wikibon research note, DL development will evolve into the modeling of these capabilities as functional primitives for containerization, orchestration, and management as microservices. Eventually, the functional primitives exposed as microservices will include both the coarse-grained capabilities of entire DL models (e.g., classification, clustering, recognition, prediction, natural language processing) and the fine-grained capabilities (convolution, recurrence, pooling, etc.) of which those models are composed.
In the emerging world of radically decoupled DL, these functional-primitive microservices will have the following capabilities:
- Support accelerated DL development through an abstraction layer that compiles declarative program specifications down to DL model assets at every level of granularity;
- Call each other as submodules within a more versatile, adaptive DL architecture;
- Invoke RESTful APIs for dynamic binding with other DL modules;
- Share cross-module variables through stateless, weakly typed DL semantics;
- Orchestrate complex patterns within a distributed DL control plane; and
- Steer clear of monolithic OS and language lock-ins.
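To make the list above concrete, here is a hypothetical sketch of one fine-grained primitive (max pooling) exposed behind a RESTful endpoint so other DL modules could bind to it dynamically. It uses only the Python standard library; the route, names, and JSON shape are all my inventions, not any real DL platform’s API.

```python
# Hypothetical microservice exposing a fine-grained DL primitive over REST.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

def max_pool_1d(values, window):
    """Non-overlapping 1-D max pooling: the primitive this service exposes."""
    return [max(values[i:i + window]) for i in range(0, len(values), window)]

class PrimitiveHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = json.loads(self.rfile.read(int(self.headers["Content-Length"])))
        result = max_pool_1d(body["values"], body["window"])
        payload = json.dumps({"result": result}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):
        pass  # keep the sketch quiet

# Bind to an ephemeral port and serve in a background thread.
server = HTTPServer(("127.0.0.1", 0), PrimitiveHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
```

A caller would POST `{"values": [...], "window": 2}` and read back the pooled result, with no knowledge of the language, platform, or hardware behind the endpoint, which is precisely the loose coupling the bullets above call for.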
Before long, DL functionality will be so easy to decouple that you’ll be able to embed whole models, or even the tiniest pieces of them, at the edge in mobile devices, Internet of Things (IoT) endpoints, and so on. Actually, that trend is well along, as can be seen in Intel’s recent release of a USB-based “Neural Compute Stick” for development of embedded DL.
DL functionality will soon be decoupled so thoroughly and disseminated so broadly that it will seem to disappear. Through cloud-native computing and the IoT, DL DNA will be literally everywhere.
Just who exactly are the Shadow Brokers? One expert shares his opinion in this week’s roundup.
1. Who are the Shadow Brokers? Signs point to an intelligence insider – Rob Wright (SearchSecurity)
At Black Hat 2017, security researcher Matt Suiche analyzed the Shadow Brokers dumps, postings and behavior to get to the bottom of one of the infosec industry’s biggest questions.
2. Mitel eyes hybrid UC with ShoreTel acquisition – Antone Gonsalves (SearchUnifiedCommunications)
Mitel expects the ShoreTel acquisition to double its cloud business and help accelerate customers’ migration to hybrid UC services.
3. Service Cloud Lightning upgrades give agents improved UI, mobile app – Jesse Scardina (SearchSalesforce)
With the latest upgrades, Service Cloud Lightning enables easier contact center customization and provides a mobile app for agents.
4. Chief data officer jobs call for nurturing data ethos in companies – Jack Vaughan (SearchDataManagement)
Chief data officer jobs now centered on defense against risk will give way to ones emphasizing innovation. To make that shift, CDOs must nurture a data culture, MIT panelists say.
5. New Microsoft fuzz testing service brings AI, automation to developers – Ramin Edmond (SearchEnterpriseDesktop)
Microsoft Security Risk Detection automatically tests application code for errors and vulnerabilities so developers can fix issues within an app before releasing it to users.
By James Kobielus (@jameskobielus)
It’s often said that identical twins can read each other’s minds. Believe that if you wish, but there’s no doubt that sharing a common birth experience, genome, and appearance tends to produce synchronicity among certain siblings’ minds.
Twins are innately attuned to each other’s best interests. In the industrial Internet of Things (IoT), the concept of a “digital twin” is gaining traction as a foundation for building analytic intelligence that mirrors and manages specific physical entities. Essentially, a digital twin describes the current configuration, state, condition, behavior, location, and other attributes of some physical device that possesses an IoT connection. As a data construct, the digital twin’s core functions are twofold: to aggregate, manage, and analyze the sensor data emitted by its IoT-connected physical counterpart, and to support data-driven functions such as simulation, monitoring, maintenance, diagnostics, and other life-cycle management functions vis-à-vis that analog entity.
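The twofold role just described can be sketched as a small data construct. Everything here, the class, the field names, and the toy diagnostic, is illustrative rather than drawn from any particular IoT platform:

```python
# Minimal digital-twin sketch: aggregate a physical device's sensor
# readings and support simple monitoring/diagnostic queries over them.
from dataclasses import dataclass, field

@dataclass
class DigitalTwin:
    device_id: str
    readings: dict = field(default_factory=dict)  # sensor name -> values

    def ingest(self, sensor: str, value: float) -> None:
        """Aggregate one reading emitted by the physical counterpart."""
        self.readings.setdefault(sensor, []).append(value)

    def current_state(self) -> dict:
        """Latest value per sensor: the twin's mirror of device condition."""
        return {s: vals[-1] for s, vals in self.readings.items()}

    def anomalies(self, sensor: str, limit: float) -> list:
        """A toy diagnostic: all readings that exceeded a threshold."""
        return [v for v in self.readings.get(sensor, []) if v > limit]
```

A real twin would layer simulation, maintenance scheduling, and ML-driven diagnostics on top of exactly this kind of aggregated state.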
However, the notion of a digital twin isn’t intrinsically limited to industrial applications. In a larger sense, it’s a modeling paradigm that can be applied to any IoT-connected endpoint, including human beings. It can serve as a framework for developing, training, and managing any intelligent virtual assistant that acts on behalf of some endpoint, such as a user connecting into the IoT via their smartphones, smart appliances, and so forth. To serve this function, the digital twin would need to encode the knowledge and interest graphs that intelligent agents use to calculate what data-driven actions are optimal for us at any time.
It’s in this regard that my colleague George Gilbert discussed digital twins in a recent Wikibon Premium research note. He defines digital twin as “a data representation or model of…any entity involved in a business.” In other words, he broadened the concept to apply to the “IoT&P,” where the “P” stands for “people,” “process,” “product,” “partner,” and pretty much anything else (regardless of initial) that’s connected, emits sensor data, and for which a data-centric replica might be built, managed, and analyzed.
Artificial neural networks and other machine learning (ML) algorithms drive much of the analytics in digital-twin applications. Many of these models use supervised learning methods to boost the accuracy of predictions, classifications, pattern recognitions, and other ML-driven results. Consequently, digital twins serve as containers for the “ground truth” data needed to train the ML models that infuse sensor-driven intelligence into the agents that operate on behalf of connected endpoints.
In other words, digital twins can be the foundation of agent-centric application architectures on the IoT&P. There can be as many digital twins as there are sensor-equipped people and devices connected to the network. For each of us, there might be distinct twins associated with each of our physical environments (home, office, car, etc.) and physical possessions (smartphones, smart appliances, etc.). Each of those device-associated twins might, in turn, inherit identities, preferences, and other attributes from the digital twins associated with us as connected individuals.
Personalization would be the key use case for digital twins in the human-facing IoT&P. To enable 360-degree personalization, an individual’s digital twin would need to manage the gamut of historical, real-time, and other data flooding in from all connected devices and all mobile, social, Web, and other applications with which a user engages. Personalization would also depend on the ability of our digital twins to engage with the corresponding twins of other individuals, groups, and organizations within our social graph.
The twins of all of those engaged endpoints might, in turn, share training data and federate ML models to drive dynamic optimization of every experience within our own personal sphere of the IoT&P. Orchestrated into more complex multilateral relationships involving people, organizations, and machines, and benefiting from intricate feedback loops, the twins might even drive cognitive optimization of “swarming” behaviors of the sort that I discussed in this post. Within a hierarchical structure of endpoints, the more stable endpoint-swarms might even gain “endpoint” status in their own right. This would happen with the creation of a digital twin that encodes the swarm’s collective interests and drives the creation of ML-driven intelligent agents to advance those interests.
As the IoT blurs the virtual and physical worlds, digital twins will help us to manage the intricate interplay among ourselves and the devices that, we hope, will always have our best interests at heart.
Do you think we need to worry more about AI taking jobs? Find out how the combination of AI and jobs will change the way workplaces look in the future in this week’s roundup.
1. AI taking jobs isn’t the problem; how it will change them is – Ed Burns (SearchBusinessAnalytics)
The combination of AI and jobs will change the way workplaces look in the future, experts say. People will have to focus more on their areas of strength.
2. Dark web markets’ shutdown may lead to more arrests – Michael Heller (SearchSecurity)
Cooperation between law enforcement from around the world led to the shutdown of AlphaBay and Hansa dark web markets and potential leads of illegal vendors.
3. Infrastructure, endpoints to drive UC tech budget boom in 2018 – Katherine Finnell (SearchUnifiedCommunications)
UC budgets are expected to rise in 2018 as organizations look to update aging voice infrastructure, promote user adoption and extend UC into contact centers.
4. Android apps on Chromebooks hurt appeal as thin clients – Ramin Edmond (SearchVirtualDesktop)
All future Chromebook releases will support Android apps, which brings concerns regarding their security, management and overall usefulness as thin clients.
5. Meet James, the new QA on the team – he’s also a virtual robot – Valerie Silverthorne (SearchSoftwareQuality)
In a DevOps world full of automated software testing, things are about to get even more interesting. James is a virtual robot designed to handle CX QA — for $5 an hour.
Are you surprised by Citrix’s latest CEO change? Find out why the move was made in this week’s roundup.
1. Tatarinov out as Citrix CEO in surprise shake-up – Ramin Edmond (SearchVirtualDesktop)
Citrix has named its fourth CEO in less than two years, appointing its CFO to replace Kirill Tatarinov. The move has analysts and IT pros questioning the company’s motivation.
2. Symantec certificate authority business reportedly for sale – Peter Loshin (SearchSecurity)
As Google and Mozilla prepare plans to reduce trust for Symantec’s certificate authority, the antivirus vendor is reported to be seeking a buyer for its web certificate business.
3. Microsoft partnerships boost Skype for Business capabilities – Katherine Finnell (SearchUnifiedCommunications)
Microsoft partnerships highlighted new products and services at its Inspire conference. Several partner companies had a keen focus on improving Skype for Business integrations.
4. IBM to debut z Systems mainframe with beefed-up security – Ed Scannell (SearchDataCenter)
IBM is poised to leap forward in cybersecurity with a refreshed z Systems mainframe built to handle pervasive encryption.
5. Azure SQL Data Warehouse turns up the heat, expands processing power – Jack Vaughan (SearchSQLServer)
Spirited competition is under way among cloud providers as they enhance large-scale relational data warehouses in the cloud. An Azure SQL Data Warehouse update by Microsoft is the latest example.