A number of traditionally hardware-focused IT vendors now pursue software, services and consumption-based pricing. Cisco provides an example in the networking field, making its first move to sell a switch as a subscription in 2017. Server-maker HPE in June announced plans to deliver its entire hardware lineup as a service by 2022. Storage vendors are moving in this direction as well: the NetApp subscription model provides one example.
NetApp channel executives in March discussed the company’s hardware-to-software transition and partner enablement strategy. Mathew Chacko, director, channel sales, consumptive solutions at NetApp, and Mara McMahon, global principal consultant, Fueled by NetApp, provided additional details in a statement describing the company’s subscription strategy and how partners fit into this approach.
NetApp’s strategy is built around its portfolio of managed and unmanaged consumptive offerings associated with the company’s private and public cloud solutions.
“NetApp, as a whole, believes that these offerings enable partners to embrace a land-adopt-expand-renew model in subscription-based selling,” according to Chacko and McMahon.
With land, adopt, expand, renew — or LAER — the seller seeks to create a long-term relationship with the buyer. The “land” step represents the initial sale, while “adopt” covers activities designed to ensure customers successfully use the offering. This phase is particularly important with cloud-based, subscription software and services — organizations are unlikely to renew an offering users don’t embrace. The “expand” step encompasses up-selling and cross-selling initiatives that extend use. All the steps point to the goal of customer renewal.
NetApp uses LAER in its partner program, with support coming from the company’s Fueled by NetApp (FBNA) program. FBNA equips NetApp partners with best practices for go-to-market strategy and planning. FBNA covers pricing, packaging, defining use cases, creating service-level agreements, messaging, sales training, demand generation and market awareness, the company executives noted. They added that the best practices create a foundation for selling subscription services.
NetApp subscription model for public, private clouds
NetApp’s lineup of consumption-based offerings gives partners various entry points to customer sales. NetApp’s Cloud Volumes ONTAP, for instance, offers a way to land deals in the public cloud, putting partners in a position to help customers as they migrate to AWS, Microsoft Azure or Google Cloud Platform, according to NetApp. Cloud Volumes ONTAP offers SnapMirror replication for disaster recovery, which NetApp said is a typical first step in a customer’s cloud journey.
In keeping with the NetApp subscription model, Cloud Volumes ONTAP is available as an hourly pay-as-you-go subscription or as a one-, two-, or three-year subscription.
Meanwhile, NetApp’s hyper-converged infrastructure (HCI) offering, HCI Cloud Consumption, lets partners land customers in a private cloud setting, NetApp noted. The offering is available as an annual subscription.
“More and more partners are establishing private managed cloud and multi-cloud support services themselves based on NetApp technology,” according to Chacko and McMahon.
At the adoption step, the NetApp partner shifts to helping customers migrate production workloads to the public cloud. In private cloud environments, partners help customers incrementally deploy “right-sized compute and storage nodes” to boost adoption, according to NetApp.
NetApp partners have a few options for expanding upon the initial sale. For public cloud use cases, for example, partners can tap NetApp’s Cloud Data Services portfolio, which includes Cloud Sync for migration and Cloud Insights for hybrid multi-cloud monitoring and optimization. And integration between Cloud Data Services and NetApp’s HCI platform lets partners extend the company’s private cloud offerings to enable hybrid multi-cloud data management, according to NetApp.
Another expansion opportunity: Partners can position NetApp Kubernetes Service for application development on the company’s HCI platform or in the cloud.
“This would enable partners to attach high-value services focused on application lifecycle management in a hybrid multi-cloud environment,” the NetApp officials said.
To complete the LAER sequence, NetApp offers what the company describes as an “increasingly simplified renewal plan” that aims to help partners quickly finalize deals.
Remaining trusted advisors
The LAER approach within the NetApp subscription model is intended to help partners avoid churn and encourage customers to renew subscriptions. Successful adoption and the resulting renewals cement the partners’ trusted advisor role among customers.
This model, according to NetApp, “provides NetApp partners with an opportunity to generate multiple, recurring revenue streams that have the potential to increase over time.”
Organizations adopting DevOps principles and moving toward the new world of cloud-native applications may be overlooking an important element in this modernization push: the data problem.
That’s the thinking of Jeff Bozic, principal architect in Insight Enterprises’ cloud and data center division. Data management, he said, involves a host of issues, not the least of which is the basic question of where the data exists in a complex hybrid IT environment.
“It’s a big challenge and it’s only going to get worse as we create more data,” Bozic said.
Data problem: key factors
A number of factors contribute to the data problem. Bozic cited the DevOps movement, public cloud adoption, microservices-based applications residing in containers, new data sources such as IoT streams, and the pressure to create dynamic apps that react faster to user needs.
Those considerations put a strain on traditional data flows, databases and extract, transform, load (ETL) processes, according to Bozic. The newer development techniques, application models and performance expectations raise data-centric questions for organizations.
“How do these data structures need to start changing, and how do the databases need to start changing?” Bozic asked. “How do I protect that data? Maybe my traditional data protection solution may not make sense. The disaster recovery process may not make sense. I want apps to be portable to run in different clouds; data may have to start moving. How do I know where it is, and how do I follow it?”
Blithely acquiring more up-to-date database technology — NoSQL offerings versus traditional relational databases, for example — may not be the answer, however. Similarly, organizations that immediately attack data structure as a point problem, without paying attention to the broader organizational context, may be in for some difficulty, Bozic suggested. He emphasized the importance of getting various internal teams talking together as a precursor to fixing the data problem.
Bozic pointed to Agile as a method for breaking down silos within an organization and getting the database team, the infrastructure team, the security team and other groups working together to take on challenges. Breaking down cultural barriers is the first step toward understanding how to change an enterprise’s data structure and paving the way toward cloud-native development patterns, he noted.
Success is unlikely “if you don’t have the culture of becoming Agile and processes in place and cross-functional teams,” Bozic explained.
Smart regions, which seek to unite neighboring cities and towns around joint smart city projects, have begun taking shape across the U.S.
A pioneering force behind the emerging smart region trend is the Institute for Digital Progress (iDP), a nonprofit based in Phoenix. IDP developed its focus on smart region initiatives after working one-on-one with 22 cities and towns in the Greater Phoenix region.
“While we were doing a lot of great work, working with these cities individually, we realized that in order to be competitive, we were going to need to get the [cities and towns] to think together, act together, innovate together and procure technology together. … We realized that our competitive advantage was scale,” said Dominic Papa, executive director and co-founder of iDP.
IDP launched the Greater Phoenix Smart Region Initiative in 2018, creating a consortium comprising the 22 cities and towns iDP was working with, as well as Arizona State University.
Smart region benefits
One of the goals of iDP’s smart region initiative is to “disrupt government” and change the traditional procurement processes that often bog down government technology projects, Papa said.
The consortium, he said, has complete buy-in from state and local government stakeholders, who aim to develop cooperative purchasing contracts that will open up regional collaboration, innovation and procurement. By procuring technologies as one regional entity, consortium members can gain greater economies of scale and purchasing power than they could attain on their own.
“While procurement has been the messy part of this, we are actively working to solve it, and that is kind of the most exciting part,” he noted.
As for smart city projects, the consortium has set out to identify four to five urban challenges that it can collectively tackle. He added that part of the consortium’s efforts will include adopting interoperable technology platforms and systems. “It doesn’t make sense for the City of Phoenix to buy a transportation solution to speed up traffic through Phoenix for [traffic] to come to a dead halt in the City of Tempe,” he said.
Roles for channel partners
The Arizona smart region consortium will eventually seek partnerships with channel firms, Papa said. “We are just getting [the consortium] up and off the ground. We built the operational framework and the structure for how the consortium will run, but the channel partners are going to play a critical role in this.”
Partners, for example, can offer the technical expertise the consortium needs to deploy smart city projects. “We understand the problems … but then we need the channel partners to bring the deep understanding of the technologies and how those technologies can adapt and fit and address those challenges and issues,” he said.
He noted that North Texas, North Florida, Cincinnati, and Central Coast California all have smart region initiatives underway. He hopes to see various smart regions eventually forming a country-wide network that will share best practices and knowledge.
“The more of these smart regions that there are, the better we will be,” he said.
Microsoft has dropped planned policy changes that would have compelled Microsoft channel partners to pay for software they have been using under internal use rights licenses.
Amid considerable partner dissent, Microsoft on July 12 rolled back its plan to cut IUR licenses, a move that would have put partners on the hook for thousands of dollars in fees. The change in direction comes a few days after the new IUR policy surfaced and just before the upcoming Microsoft Inspire partner conference, which runs July 14 to 18 in Las Vegas. Microsoft revealed a number of investments in Microsoft channel partners ahead of the event.
Gavriella Schuster, corporate vice president of Microsoft’s One Commercial Partner organization, discussed the company’s decision to rescind the planned policy changes in a blog post:
“Given your feedback, we have made the decision to roll back all planned changes related to internal use rights and competency timelines that were announced earlier this month. This means you will experience no material changes this coming fiscal year, and you will not be subject to reduced IUR licenses or increased costs related to those licenses next July as previously announced.”
The IUR change was to go into effect July 1, 2020.
In panning the software vendor’s proposed changes, Microsoft channel partners cited a combination of new licensing costs and a diminished ability to gain experience with products. Channel companies use vendor technology in-house to run their businesses and then deploy products for customers in a use-what-you-sell strategy.
In 1962, Ross Perot was an IBM mainframe salesman with a vision for providing technical expertise and services around the big machines.
He left IBM that year to launch Electronic Data Systems (EDS) in Dallas with an initial investment of $1,000. In the ensuing years, EDS helped pioneer what would become the IT services industry. EDS sold compute time on underutilized mainframes, built and managed computing centers for customers and created a network of data centers to run clients’ workloads remotely. Those services would eventually morph into IT outsourcing and, decades later, cloud computing.
Perot, who died July 9 at the age of 89, was ahead of his time at EDS and seemed to have a keen sense of repeating historical patterns. General Motors purchased EDS in 1984 and Perot transitioned from entrepreneur to GM board member. The shift proved unsuitable: Perot in short order became fed up with what he viewed as the car maker’s stifling bureaucracy and eventually departed GM-EDS through a $700 million buyout. In 1988, Perot was back in entrepreneurial mode, founding Perot Systems Corp. Perot’s investment company that launched the new venture was aptly called HWGA Partners, with the acronym standing for Here We Go Again.
EDS took Perot to court, claiming he had violated the terms of a non-compete agreement. Perot, however, asserted he was free to compete, provided he did so on a not-for-profit basis. The lawsuit played out in the Fairfax County (Va.) Circuit Court, where Perot amusingly told reporters no one could beat his prices, alluding to his non-profit status.
The judge sided with Perot on the question of competition, ruling that Perot was free to seek contracts provided those deals didn’t contemplate a profitable return. That restriction expired in late 1989 and Perot Systems was free to make money.
With the lawsuit out of the way, Perot Systems went on to capture projects with high-visibility clients and, in another example of circularity, advised Perot’s former employer, IBM, on the outsourcing business.
EDS and Perot Systems are gone now, with EDS finding its way via acquisition into DXC Technology and Perot Systems absorbed within NTT Data. But the industry Perot helped create continues to grow — Gartner forecasts global IT services spend of $1.065 trillion in 2020 — and evolve.
What do you get when you cross the CRISP-DM model with Microsoft’s TDSP? Well, yes, another acronym. But Cognilytica, a market research firm in Washington, D.C., believes the pairing provides a methodology for AI project management.
CRISP-DM stands for Cross-Industry Standard Process for Data Mining, a step-by-step approach for launching a data mining project. CRISP-DM was created in the late 1990s, before the current surge in AI investment. TDSP, or Team Data Science Process, is a methodology for implementing data science initiatives. Microsoft introduced TDSP in 2016, with the aim of facilitating predictive analytics offerings and intelligent applications.
Cognilytica’s Cognitive Project Management for Artificial Intelligence (CPMAI) methodology combines CRISP-DM, TDSP, Agile and its own thinking and research on best practices for AI project management. The company offers training and certification on CPMAI.
Building upon the CRISP-DM model
Ron Schmelzer, managing partner and principal analyst at Cognilytica, said AI projects call for a different take on project management compared with traditional IT initiatives. Specifically, an approach for running an AI, machine learning (ML) or cognitive technology project must be “much more data centric,” he noted.
CRISP-DM gets an AI adopter partway there. The methodology starts with understanding a business’ data mining goals and works its way through phases such as data collection, preparation and modeling. CRISP-DM, however, “is not AI-specific and is missing some AI details,” noted Kathleen Walch, managing partner and principal analyst at Cognilytica.
CPMAI takes the data centricity of the CRISP-DM model and adds TDSP, which Microsoft describes as an “agile, iterative data science methodology.” The Team Data Science Process encompasses a data science lifecycle definition and a standardized project structure along with infrastructure, resources, tools and utilities.
Cognilytica, meanwhile, contributes components such as best practices from in-production AI implementations, ML model training approaches and ML model evaluation.
Broader AI discussion
Schmelzer and Walch discussed CPMAI during last month’s AI World Government conference in Washington, D.C. Other conference speakers outlined a three-step process for launching AI projects and discussed conversational AI’s potential in the public sector.
AI World will take a wider-angle view of the technology when the conference convenes Oct. 23-25 in Boston. The event will consider AI in manufacturing, healthcare, pharmaceuticals and financial services among other markets.
Dell Technologies continues to report strong channel growth and highlight the cross-selling opportunities inherent in its portfolio.
Looking back at its fiscal year 2020 first quarter, which ended May 3, Dell painted a healthy picture of sustained momentum in partner sales. According to Cheryl Cook, senior vice president of global marketing at Dell, channel sales grew 15% year on year in the first quarter, while distribution rose by 10%. Partner sales of Dell’s client products increased 9% year on year, servers by 18% and storage by 26%. Cook noted that Dell’s reporting of storage sales included the performance of its converged infrastructure products such as VxRail. Dell Technologies partners brought in 15,000 new customers in the first quarter.
“We think the strategy and the breadth of the portfolio are helping us be in a good position to win,” Cook said.
A focus on cross-selling
Since the early days of Dell’s merger with EMC, the company has relied on the breadth of its portfolio as a crucial differentiator in the market. The cross-selling potential of the portfolio was a theme at this year’s Dell Technologies World conference, which ran in conjunction with its Global Partner Summit, from April 28 to May 2 in Las Vegas.
At the event, the vendor unveiled the Dell Technologies Partner Program, promising greater ease in engaging VMware, Pivotal, SecureWorks and other business units under the Dell Technologies umbrella. The program simplified partners’ tier requirements by counting sales revenue from Dell Technologies subsidiaries toward tier credits. Dell also consolidated its partner training and certification requirements.
Scott Winslow, president and founder of Winslow Technology Group, an IT solutions and consulting firm headquartered in Waltham, Mass., said the new partner program made several improvements to the Dell EMC Partner Program it replaced. He added that the vendor is doing more to enable cross-selling.
“I think the big change for us, and we see it as a positive, is doing more about getting [Dell’s] strategically aligned businesses working together,” Winslow said.
Winslow Technology Group has ramped up its cross-selling engagements as Dell builds tighter integrations between its business units. “We were doing some VMware business [before], but now we are doing a lot more VMware business,” Winslow said. The company is also doing more business with SecureWorks, Dell’s managed security services company.
“We see [Dell Technologies subsidiaries] sales teams working together a lot more closely than they have been in the past. … I would say we are doing more cross-selling, more collaboration … and coming up with solutions for customers that are more comprehensive,” he said.
More work to be done
Even as Dell takes bold steps to unify its various business units for partners, the company recognizes more work is needed. Cook said the vendor continues to focus on creating a simpler operating experience.
At the 2019 Global Partner Summit, Dell senior vice president of global partner strategy, programs and operations, Darren Sullivan, highlighted ways in which the vendor is streamlining channel business processes. Initiatives included new investments in automation as well as more integration between Dell’s business lines.
Winslow said he is optimistic that the vendor is moving in the right direction. He acknowledged that complexity remains.
“The nirvana would be one partner portal, one deal [registration] page, one partner program for all” Dell Technologies subsidiaries, Winslow said. “They’re not there yet, but they are taking steps toward unifying these … disparate strategically aligned businesses to be more ‘the power of one.’”
Startup Armorblox is looking to carve out a space in the email security software market with the help of channel partners.
The company, based in Cupertino, Calif., touts its platform’s use of natural language understanding (NLU) as its key differentiator among competitors. Armorblox formalized its inaugural partner program last week; the program is overseen by Brian Harmon, senior vice president of sales and channel, who joined the company in March. Harmon said the platform’s NLU capabilities have sparked the interest of managed security service providers (MSSPs).
“Many of [the MSSPs] want to understand how they can support multiple customers and … manage their email security and data security as well as document management platforms,” Harmon said.
Natural language understanding in email security software
While Armorblox follows the same metadata-driven approach as its competitors, Harmon said the platform’s natural language processing and deep learning capabilities extend email security to the contextual level. The technology drills into communications to determine “what messages are intended for, what their content is all about — the who, what, when, where and why.”
“The NLU component is absolutely the game-changing technology we are bringing to the market. That is the core of what we do,” he said.
One way Armorblox detects threats is by building profiles of employees’ writing styles. If the platform sees a suspicious aberration in how an employee typically composes emails, Armorblox can trigger an alert. The alerts can be configured to go to security operations centers or users for verification. The platform can also quarantine messages for further examination, Harmon said.
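The profiling approach described above can be sketched in a few lines. This is a hypothetical illustration of writing-style anomaly detection, not Armorblox’s actual model; the features (word length, sentence length, punctuation habits) and the z-score threshold are illustrative assumptions.

```python
# Hypothetical sketch of per-sender writing-style profiling: score a new
# message against simple stylometric features from the sender's history.
# Feature choices and threshold are illustrative, not the vendor's model.
import statistics

def style_features(text: str) -> dict:
    """Extract a few simple stylometric features from an email body."""
    words = text.split()
    sentences = [s for s in text.replace("!", ".").replace("?", ".").split(".")
                 if s.strip()]
    return {
        "avg_word_len": sum(len(w) for w in words) / max(len(words), 1),
        "avg_sent_len": len(words) / max(len(sentences), 1),
        "exclamations": text.count("!"),
    }

def is_anomalous(history: list, new_message: str, z_threshold: float = 3.0) -> bool:
    """Flag new_message if any feature deviates strongly from the sender's norm."""
    profiles = [style_features(m) for m in history]
    new = style_features(new_message)
    for feature, value in new.items():
        past = [p[feature] for p in profiles]
        mean = statistics.mean(past)
        stdev = statistics.stdev(past) or 1e-9  # avoid division by zero
        if abs(value - mean) / stdev > z_threshold:
            return True  # would route an alert to the SOC or user for verification
    return False
```

A real system would use far richer features and learned models, but the shape is the same: build a baseline per sender, then alert on statistical deviation rather than fixed signatures.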
In addition to email, Armorblox has created integrations with Slack, Office 365 and other messaging platforms, as well as document management systems. “We are taking a look at other messaging and human-to-human communication … as a threat vector for attackers to go after,” he said.
Commitment to channel sales
In addition to MSSPs, the Armorblox partner program targets value-added resellers and distributors. The program features two tiers, Preferred and Premier, and provides partners with support, training and preferred pricing.
Although Armorblox has a direct sales force, Harmon said he and his reps have been firmly committed to the channel model from the start. “We will never compete with a partner and take a deal direct,” he said, adding that Armorblox will reward partners who register the most opportunities.
The company is currently working with about 12 partners in the U.S. “On the [partner] recruitment side, I am not looking to create a huge … ecosystem. I want very specific targeted partners in each region that are specialists in their area [and] are bringing innovation into their market,” he said.
Harmon said Armorblox will expand its partnering efforts outside the U.S. on an opportunistic basis.
“As we expand and grow … I will expand the program to match that,” he said.
Titus, a data classification software vendor based in Ottawa, today launched a new partner program and moved to a 100% channel model to fuel the company’s growth.
Integration with other security products is an important part of the Titus strategy. The company’s products use machine learning algorithms to automate the identification and classification of data. This task comes at the front end of broader security engagements, which makes Titus software “the tip of the spear” for other security offerings such as cloud access security brokers (CASBs), data loss prevention (DLP) systems and next-generation firewalls, according to Mike Kuehn, chief revenue officer at Titus.
Specifically, Titus data classification products integrate with Netskope’s CASB offering, Forcepoint’s DLP suite and Palo Alto Networks firewalls among other solutions. Titus aims to work with its technology allies’ channels as well as their products.
“Palo Alto Networks only sells through partners,” Kuehn noted. “By injecting our technology into that partner community, they can provide an enhanced solution for their sales and existing customers.”
Titus data classification and the channel
Although Titus previously worked with channel firms, the data classification software company’s approach was more opportunistic and lacked a focus on partner enablement, Kuehn said. The new channel initiative, however, provides self-paced certification programs with sales and technical modules, a partner portal and 30 points of margin protection for Elite-level partners that register Titus data classification opportunities. The vendor is also putting more pre-sales engineering resources in the field to support partners.
The 100% channel approach goes beyond the go-to-market strategy to include other facets of the company, from product engineering to support, Kuehn said.
“If you’re not all in on working with the channel … as an organization, it is really difficult to build an effective program in the field.”
Kuehn said Titus works with value-added resellers (VARs) that sell security offerings, VARs with cloud practices that need to protect customer data migrating to the cloud, managed service providers and global systems integrators. The latter, Kuehn said, help customers architect broader data protection strategies that begin with a data identification and classification phase.
As a Blackstone Group portfolio company, Titus’ goal is to self-fund growth to the tune of 30%-plus per year. The company’s channel partner focus is “our means to accelerate growth,” Kuehn said.
Sensu Inc. is finding an audience for its multi-cloud monitoring tool among managed service providers, which are integrating the product with their ticketing, configuration management database and event logging systems.
The Portland, Ore., company’s offering lets organizations monitor public clouds such as AWS, Google Cloud Platform (GCP) and Microsoft Azure and private clouds built on OpenStack, VMware and Xen. Sensu earlier this month launched a redesigned tool that offers multi-tenancy and support for container and Kubernetes workloads.
“We are seeing an uptick in popularity among MSPs,” noted Sean Porter, CTO at Sensu. “We know MSPs are dealing with a very wide breadth of technologies and we want to make sure we service that.”
Multi-cloud monitoring attracts MSPs
8K Miles Software Services, an MSP and cloud consultancy based in Pleasanton, Calif., has been using Sensu for nearly two years. Sudish Mogli, the company’s CTO, said Sensu’s multi-cloud monitoring capabilities are a major attraction. 8K Miles focuses on healthcare and pharmaceutical companies that often inhabit more than one cloud.
Many of the tools 8K Miles evaluated before selecting Sensu were specific to a particular cloud, Mogli said. “We were looking for something that we could spin up in any cloud environment,” he said. “Sensu met our requirement.”
Container support is another plus in Sensu’s favor, Mogli added. “We also have to look at the future in terms of how applications are being containerized” and designed for serverless architectures, he said.
Cloud monitoring tool integration
Sensu has become 8K Miles’ go-to public cloud and hybrid cloud monitoring tool, but it doesn’t exist in a vacuum. The company has built an automation that lets it take Sensu-generated alerts and automatically open tickets in service management tools such as ServiceNow, Mogli said. Other MSPs have also taken the approach of integrating cloud monitoring tools and service management systems.
Porter said that type of integration is prevalent among customers using Sensu’s multi-cloud monitoring tool. He said MSPs integrate Sensu into ticketing applications, CMDBs and event logging systems such as Splunk. Sensu can serve as a backbone and interconnection between an organization’s systems and data formats, he added.
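The alert-to-ticket integration described above can be sketched as a small handler script. This sketch assumes the Sensu Go handler convention of piping the event to the handler as JSON on stdin and ServiceNow’s REST Table API for creating incidents; the instance URL, credentials and field mapping below are placeholders, not 8K Miles’ actual automation.

```python
# Sketch of a Sensu-to-ServiceNow handler: read a Sensu event from stdin
# and open an incident via ServiceNow's Table API. Instance URL and
# credentials are placeholders; error handling is omitted for brevity.
import base64
import json
import sys
import urllib.request

SNOW_URL = "https://example.service-now.com/api/now/table/incident"  # placeholder
SNOW_USER, SNOW_PASS = "integration_user", "secret"  # placeholder credentials

def build_incident(event: dict) -> dict:
    """Map a Sensu event into a ServiceNow incident payload."""
    check = event.get("check", {})
    entity = event.get("entity", {})
    return {
        "short_description": f"Sensu alert: {check.get('metadata', {}).get('name')} "
                             f"on {entity.get('metadata', {}).get('name')}",
        "description": check.get("output", ""),
        "urgency": "1" if check.get("status", 0) == 2 else "3",  # critical vs. warning
    }

def open_ticket(event: dict) -> None:
    """POST the incident to ServiceNow with basic auth."""
    payload = json.dumps(build_incident(event)).encode()
    auth = base64.b64encode(f"{SNOW_USER}:{SNOW_PASS}".encode()).decode()
    req = urllib.request.Request(
        SNOW_URL,
        data=payload,
        headers={"Content-Type": "application/json",
                 "Authorization": f"Basic {auth}"},
        method="POST",
    )
    urllib.request.urlopen(req)  # production code would check the response

# A Sensu handler would invoke: open_ticket(json.load(sys.stdin))
```

The same pattern extends to CMDB updates or log forwarding: the monitoring tool emits a structured event, and a thin handler translates it into the service management system’s API.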
8K Miles, meanwhile, plans to continue its association with Sensu’s multi-cloud monitoring tool. Mogli said other vendors’ monitoring products have come and gone over the past few years, but the company has been able to stay with Sensu because of the range of technologies it covers. “At the moment, I don’t see a gap,” he noted.