Cliff Saran’s Enterprise blog


March 22, 2019  1:23 PM

Welcome to the new legacy

Cliff Saran Profile: Cliff Saran
Uncategorized

IT budgets in general do not grow at a rate that can sustain stellar financial performance across the IT industry.

However, Gartner’s latest spending forecast projects that worldwide software spending will grow 8.5% in 2019, and a further 8.2% in 2020 to total $466 billion. According to Gartner, organisations are expected to increase spending on enterprise application software in 2019, with more of the budget shifting to software as a service (SaaS).

Among the IT firms hoping to capitalise on this shift is SAP, with S/4Hana, the company’s updated ERP system that runs off an in-memory database.

Those organisations still running SAP’s older ERP Central Component (ECC) system are only guaranteed support until 2025; beyond this date, SAP has not made a firm commitment to carry on support.

Does SAP see Hana as a cash cow?

Looking at the transcript of SAP’s Q4 2018 earnings call, posted on the Seeking Alpha financial blogging site at the end of January 2019, “Hana” is referenced 29 times. “It’s a Hana world,” proclaimed CEO Bill McDermott. When asked about the company’s plans for growth, McDermott claimed that Hana is the ultimate platform for a modern enterprise. “You think about what we can do with Hana as database-as-a-service, you think about Leonardo with predictive AI and deep machine learning and IoT, we’re going to double down on these things.”

Putting it more bluntly, the SAP CFO, Luka Mucic, said: “The S/4Hana upgrade cycle drives potential for substantial further renovation of a company’s IT architecture and gives us multiple cross-selling opportunities.”

However, as Computer Weekly recently reported, a new study from Resulting IT questioned whether IT decision makers can build a compelling business case for the upgrade from ECC. Computer Weekly spoke to former Gartner analyst Derek Prior, who co-authored the report.

While Prior acknowledged that SAP S/4Hana has plenty to offer, he said: “It is different to ECC. You spent decades bedding in ECC, then it all changes with S/4Hana, which is quite different.”

Complexity and risk of an ERP migration

S/4Hana is not simply an upgrade to the latest version of SAP. It is entirely different, and has new functionality that may not map easily onto how the business currently operates with ECC. The majority of people who took part in the Resulting IT study are more likely to choose a brownfield deployment than to redevelop from scratch everything they have built in ECC. Resulting IT believes that, with an estimated 42,000 SAP customers on ECC, upgrading to S/4Hana is going to cost £100bn globally in IT consulting fees.

This translates into a huge expenditure for organisations. In the case of a brownfield deployment, the potential benefits of upgrading to S/4Hana are minimal – organisations will essentially spend a lot of money and may well experience major disruptions by embarking on a new ERP project that effectively delivers more-or-less what they already run on ECC.

Support beyond 2025

Clearly SAP wants people to buy into S/4Hana, and may well incentivise organisations to purchase it. However, organisations should not be held to ransom by their IT supplier. There is no good reason to spend money unless there is a compelling business case. A vague reference to digital transformation and AI-empowered business processes sounds good on paper. In reality, organisations will need to construct a watertight business case before upgrading from ECC to S/4Hana.

Just as with legacy banking systems, ECC customers can use a third party for support and still take advantage of emerging technology and undergo their digital transformations. If a stable ECC can be maintained and supported, organisations can then focus on adding functionality around it, without the need to migrate to S/4Hana. New products and services can be developed entirely separately from the existing ERP, while ECC, supported by a third party, remains the system of record. The core ERP metamorphoses into a legacy system.

As SAP’s recent Q4 2018 results show, the IT industry relies almost entirely on organisations upgrading. Often these upgrades either add little value or, as in the case of S/4Hana, deliver new functionality that customers end up not using. So, given that IT budgets are tight, upgrade if there is a compelling business case. If there isn’t, now is the time to add SAP ECC to the legacy IT estate.

March 11, 2019  9:09 AM

Web @ 30: Thank you for global connectivity

Cliff Saran Profile: Cliff Saran
Uncategorized

In the three decades since its introduction, the web has informed, educated and entertained society. It has given musicians, artists and businesses of any size connectivity to a global audience. Anyone with something to say can express themselves and publish on the web.

Sir Tim Berners-Lee could not have anticipated that his invention would have such a profound impact on society. Who would have thought in 1989 that, with just a few taps on a touchscreen-enabled device or the click of a mouse, a globally connected web would make it possible for someone to stream music and movies; transfer money and pay for goods instantly; order a pizza; book a foreign holiday; and arrange a taxi pickup?

Online replaces high street shopping

Blockbuster, Maplin and many high street retailers have failed to capitalise on the opportunities the web offers. Instead, the likes of Spotify, Netflix and the behemoth, Amazon, are taking a bigger and bigger share of people’s wallets. Perhaps Maplin’s next chapter, as an eclectic online bazaar for all things tech and electronic, may turn around the business.

In the UK, department stores like Debenhams and House of Fraser have failed to stem the decline in sales. People can buy things far more easily online than by trying to track down something they really want to purchase on the high street. John Lewis, a company renowned for its peerless customer service, is another department store coming under the spotlight. In its financial statement, the retailer attributed its poorer than expected results partly to increased IT costs. “Over the last few years we have steadily increased IT investment to set ourselves up for the future. A number of those significant new systems are now operational resulting in incremental maintenance, support and depreciation costs,” the company stated.

This shows that the John Lewis Partnership is looking at investing in the future. The only way it can address the online threat is to invest heavily in IT. Similarly, the princely sum of £750m that Marks & Spencer recently paid for half of Ocado’s retail business shows that investing in technology is the only sure way to keep up with the likes of Amazon, especially since the e-commerce giant acquired Whole Foods in 2017 for a whopping $13.7bn – 18 times what Ocado is receiving from M&S. The acquisition of Whole Foods has put Amazon in direct competition with the likes of Waitrose (part of the John Lewis Partnership) and M&S, which may be the reason behind M&S’ Ocado tie-up.

When it was set up in 1994, Amazon was just an online bookstore. It quickly took a heavy toll on booksellers such as Waterstones in the UK, and later music stores began to see sales plummet. Remember Tower Records and Virgin Megastores? HMV is struggling to remain relevant.

Connectivity creates business opportunities

Thanks to its global reach, the web has enabled companies to connect to one another, creating complex business ecosystems in which organisations can find a niche to add value. Ocado, in fact, could be regarded as a Warehouse-as-a-Service business – providing distribution and online deliveries for Waitrose, Asda and, through its new business venture, M&S.

Even Royal Mail is not immune. It has finally come round to the idea that there is a business in delivering people’s Amazon purchases. It even handles Amazon returns, without the need for the customer to print out a return label. Numerous newsagents and dry cleaners are official drop-off and click-and-collect partners for online stores. Argos’ click-and-collect and drop-off service for eBay buyers and sellers shows that the high street can adapt.

Changing trade connectivity

The winners on the web will be the organisations that have agile business models and can adapt quickly to new opportunities. One can imagine that a hotel group like Hilton would never have contemplated that its business could be disrupted by a web service that owned no hotels – but this is exactly what Airbnb has done. Now, thanks to the web, Alibaba can offer a global trading hub, connecting Chinese manufacturing directly to anyone who needs something made. Thanks to the global reach of the web, anyone who feels they can spot a product with potential, and is prepared to take a punt, can connect with suppliers based anywhere in the world and become a distributor.

While bricks-and-mortar businesses have needed to comply with local laws, pay business rates, invest in buildings and hire tax-paying staff, online businesses have used global web connectivity to flout local regulations, get around employment law by not having permanent staff, and relocate their head offices to tax havens.

This has meant that traditional firms are at a disadvantage, and today’s web appears to be owned by a few mega-businesses.

As the web turns 30, perhaps now is the time to sit back and evaluate how best to curb some of its excesses.

The House of Lords Select Committee on Communications’ Regulating in a Digital World paper, published on March 9, warns: “The digital world has become dominated by a small number of very large companies. These companies enjoy a substantial advantage, operating with an unprecedented knowledge of users and other businesses. Without intervention the largest tech companies are likely to gain more control”.


March 6, 2019  2:29 PM

WWW @ 30: Why the Web needs to remain open

Cliff Saran Profile: Cliff Saran
Uncategorized

While the internet existed well before the World Wide Web (WWW), the web changed everything.

Its success has as much to do with the simplicity of using a web browser as with the fact that it was put into the public domain, and with the timing of its invention.

A lesson from the past

During the 1980s the TCP/IP protocol evolved to the point where basic command line tools could be used by Unix users and admins to share documents between networked computers.

The internet was predominantly used in academia. In the commercial space, proprietary email services offered walled gardens, only available to subscribers. But in 1988, thanks to Vint Cerf, the internet was opened to commercial email services. This opened up the internet to everyone, and laid the foundations for a global communications network connecting HTTPD web servers to users’ HTTP web browsers.

In his March 1989 Information Management: A Proposal paper, Sir Tim Berners-Lee describes the original premise for the WWW as an approach to enable the people at Cern to share documents easily.

While it started at Cern, within three years the WWW and the HTTP protocol were in the public domain. Then it took off.

No one owns the web

There have been plenty of attempts to make the web proprietary, but in its purest form, the WWW has remained free and open. However, the web represents many more things today compared to 30 years ago. It is the basis of social media platforms, music and video subscription services and global online shopping centres. Every business wants to own its customers’ web experience. But this is not why the web has been so successful.

Last year, Berners-Lee published an open letter in which he explained why the web needs to be more open, rather than users’ experiences being defined by the web giants. He argued that, just like a software product, the web itself can be refined and the “bugs” ironed out.

Just as CompuServe and AOL had walled gardens before the web, now the likes of Amazon, Facebook and Netflix often represent people’s primary experience of the web. They are not in the public domain. And while they may be built on open source, they are commercial services, effectively closed off from a WWW that offers free access to all. Three decades on from its invention, now is the time for society to consider how the web should evolve and what role commercial exploitation of the web should play.


March 1, 2019  3:41 PM

GDPR: Irish Data Protection Commission may show where the WWW is heading

Cliff Saran Profile: Cliff Saran
Uncategorized

The Irish Data Protection Commission’s (DPC) annual report makes interesting reading, given that the World Wide Web is celebrating its 30th birthday this month.

People regularly give away vast amounts of personal data through social media and instant messaging platforms like Facebook, Instagram, WhatsApp and Twitter.

These web giants need to comply with GDPR. But in its annual report, the DPC said it has 15 statutory inquiries open in relation to multinational technology companies’ compliance with GDPR.

The firms investigated are: Apple, Facebook, Instagram, LinkedIn, Twitter and WhatsApp.

For Apple, the DPC said it is examining whether the company has discharged its GDPR transparency obligations in respect of the information contained in its privacy policy and online documents regarding the processing of personal data of users of its services.

As for Facebook, the DPC said it is conducting several investigations into the social media platform’s compliance with GDPR. In relation to a token breach that occurred in September 2018, the DPC said it was looking at whether Facebook Ireland has discharged its GDPR obligations to implement organisational and technical measures to secure and safeguard the personal data of its users. The DPC said it was also looking into whether Facebook has met its GDPR breach notification obligations.

Facebook, LinkedIn, WhatsApp and Twitter are all being investigated in relation to how they process personal data.

GDPR and advanced analytics

The wording in the annual report concerning the LinkedIn inquiry is particularly intriguing. In the report the DPC states it is: “Examining whether LinkedIn has discharged its GDPR obligations in respect of the lawful basis on which it relies to process personal data in the context of behavioural analysis and targeted advertising on its platform.”

The fact that the DPC is looking at LinkedIn’s use of behavioural analysis is certainly very interesting. The web giants rely on understanding their users better than they know themselves. This level of AI-enabled advanced analytics and machine learning is now available to more and more organisations, not just the multinational tech companies the DPC is investigating.

The outputs from the DPC’s investigations are very likely to heavily influence the way organisations use advanced analytics on web data that can identify individuals.

Ultimately, it may even influence how the WWW evolves, and whether today’s web giants, as well as those in the making, will be able to sustain business models that see them through the next 30 years.


February 19, 2019  9:39 AM

WWW at 30: the worldwide swamp

Cliff Saran Profile: Cliff Saran
Uncategorized

Next month will be the 30th anniversary of the world wide web. In 1989, who would have thought the web would touch every aspect of people’s lives – not only in a good way, but also in ways that seem to undermine the fabric of society? An elegant way for researchers across the globe to collaborate has evolved from a platform for free speech into a swamp seeping with disinformation, hate, paedophilia and online bullying.

For instance, the BBC’s Countryfile, broadcast on Sunday 17 February, reported on how illegal gambling rings live stream blood sports like hare coursing and cock fighting over Facebook and YouTube. And on today’s web, it seems open debate and fair comment can lead to a tirade of abuse targeted at anyone who appears to hold a different opinion.

People must understand they are being nudged

The DCMS’ Fake News and Disinformation report discusses at length how easy it is for organisations to target social media users en masse in the same way online marketing campaigns are used to sell and recommend products.

The techniques have become increasingly sophisticated. Behavioural economics uses so-called “nudge” techniques to try to influence people. It seems the ability to target individuals online through carefully crafted online advertising campaigns with subliminal messaging is moving beyond the big marketeers and state agencies with a subversive agenda. Now, a service called TheSpinner claims: “TheSpinner enables you to subconsciously influence a specific person, by controlling the content on the websites he or she usually visits.” Sold as a service starting at just $29.00, anyone can sign up and target another individual, such as in the run-up to a marriage proposal, by ensuring their special person sees a series of 10 related articles when they are online. “People need to be aware this technology can be used,” warns Bridget Kenyon, global chief information security officer at Thales, mirroring one of the findings in the DCMS report.

Digital literacy is key

Facebook, YouTube, et al, take the original premise of the web and democratise information sharing to the point that anybody can post an update, image or video anywhere and at any time. Anyone can receive this post, no matter how irrelevant or inappropriate it is. However, as the DCMS recommends: “Digital literacy should be a fourth pillar of education. People need to be resilient about their relationship with such sites, particularly around what they read and what they write.”


February 13, 2019  9:41 AM

GE Digital addendum

Cliff Saran Profile: Cliff Saran
Uncategorized

The term “digital” is referenced only twice in the 87 minutes of GE’s Q4 2018 earnings call, which took place at the end of January. In a transcript of the call, posted on the Seeking Alpha financial blogging site, none of the financial analysts who participated asked about the company’s digital strategy, in spite of GE announcing a new $1.3bn digital business on December 13 2018. Their main concern was the company’s Power business, which is referenced 62 times in the transcript.

GE was among the industrial giants that showed huge potential in evolving from a company that makes big machines to one that sells software-powered services. That was the promise, but where is it now? Last December, Bill Ruh, the GE chief who led this strategy, left the company.

Computer Weekly first started reporting on the GE story in 2013, when Ruh was vice-president for software at GE Research. At the time, Ruh talked about how industrial IoT would enable GE to predict machine failures and power a service-led business. Ruh eventually headed up the company’s digital division, GE Digital.

Reimagining industry

GE joined the ranks of traditional organisations pioneering platform businesses. Its former CEO, Jeff Immelt, was regarded as a digital visionary. Under his watch, GE styled itself as a 124-year-old startup. In a 2015 McKinsey article he wrote: “We want to treat analytics like it’s as core to the company over the next 20 years as material science has been over the past 50 years.”

In the company’s Q1 2017 earnings statement, released just a few months before he stepped down, Immelt stated: “GE is continuing its portfolio transformation and investing in innovations in GE Digital and GE Additive.” The term “digital” appears five times in this statement.

Immelt’s tenure as the CEO of GE was plagued with problems in the company’s power division, something that remains a big issue.

In June 2017, he was replaced by John Flannery, who has subsequently been replaced by Larry Culp. Wind the clock forward to Q4 2018 and, in the latest earnings release on Seeking Alpha, “digital” is referenced only twice, to announce that GE Digital will be spun off as a separate company.

GE also announced an agreement to sell a majority stake in field service management firm ServiceMax, a company it acquired in 2017. As Computer Weekly reported at the time of the acquisition, ServiceMax was among the key components of GE’s digital strategy. When combined with the concept of running digital twins of customers’ machines, Ruh believed GE could move beyond simply predicting machine failures. Instead, he said, GE would be able to deliver business outcomes to its customers, such as higher production yield.

Digital version 2.0

Analyst house Forrester believes one of the positives to come out of the demise of GE Digital version 1 is that the new version 2 business will be able to operate as a separate company. “GE Digital can focus on developing as a software business and not an internal IT shop for GE industrial units,” Forrester noted.

However, the challenge GE has faced all along is that it is not recognised for its software business. Immelt acknowledged this challenge in a 2017 Harvard Business Review article. “It will take years for GE to fully reap the benefits of the transformations,” he wrote.

When Computer Weekly met Ruh in July 2018, he described how Saudi company Obeikan was building food and beverage applications on top of the GE Digital Predix platform, helping GE expand the reach of its software platform. Now that Ruh has left GE, what will GE Digital version 2 evolve into?

Growing a software arm is hard. Looking at GE’s progress to date, there are wider lessons that can be gleaned. Every business that wants to compete effectively with agile startups and the web platform giants will need to go through challenging times before its transformation is complete. Those who start the journey may not be the ones who complete it.

Incidentally, “Additive”, which Immelt highlighted in his 2017 trading statement, refers to using 3D printing in manufacturing. And yes, GE does have an additive manufacturing division. But it is not mentioned a single time in the transcript of the Q4 2018 earnings call, posted on Seeking Alpha.


February 8, 2019  11:41 AM

IT misaligned with business’ appetite for data

Cliff Saran Profile: Cliff Saran
Uncategorized

A recent pop-up event at Lloyds Bank in London, hosted by Domino Data Lab, has highlighted one of the age-old ailments of IT: the illness of being misaligned with the business.

The latest example of this was succinctly described by MoneySuperMarket.com’s analytics head, Harvinder Atwal, during a presentation at the event.

One of the surprises to come out of his presentation is that data scientists tend to do their work on their own laptops.

Laptop data models

Data scientists tend to use their own laptops, with real customer data, to create data models. The reason they do this is that corporate IT puts in layer upon layer of process and procedure.

Data scientists have to request data access from IT. They need to negotiate with IT for the required compute resources, then wait for these resources to be provisioned. They may need to go back to IT to install query tools. “As a data scientist, you just want to use data as quick as possible,” Atwal said in his presentation. But in Atwal’s experience, IT is stuck in a 20th century operating model. “People don’t have access to data warehouses.”

Continuous improvement

Clearly, having real customer data on a laptop may well infringe data regulations, and is definitely a no-go area for IT security and the corporate governance, regulation and compliance teams. Nevertheless, without this data and the right data manipulation tools and environment, data scientists are unable to do their work effectively. Given that businesses hire data scientists in the hope they will discover some hidden meaning in the masses of data they collect, it seems illogical that access to the data and the tooling these experts need is so complex.

This becomes more of a challenge when data scientists wish to improve the accuracy of their data models by having the data analytics equivalent of continuous testing and delivery. In an ideal world, a data model will continuously improve once deployed, because insights from new customer data provide a feedback loop. The very structured approach to delivering corporate IT is not aligned with the business’s need to gain insights from data rapidly.
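
As a rough illustration of that feedback loop, here is a minimal sketch (illustrative only, not MoneySuperMarket’s actual pipeline) using scikit-learn’s SGDClassifier, whose partial_fit method lets a deployed model be updated incrementally as new, labelled customer data arrives:

```python
# Minimal sketch of a continuous-improvement loop for a deployed model.
# Assumes batches of new, labelled customer data arrive over time;
# the data and names here are illustrative, not a real production pipeline.
import numpy as np
from sklearn.linear_model import SGDClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)
classes = np.array([0, 1])

# Initial training batch (e.g. historical customer data).
X_init = rng.normal(size=(500, 4))
y_init = (X_init[:, 0] + X_init[:, 1] > 0).astype(int)

model = SGDClassifier(random_state=0)
model.partial_fit(X_init, y_init, classes=classes)

# Feedback loop: each new batch of customer data refines the model.
for batch in range(5):
    X_new = rng.normal(size=(200, 4))
    y_new = (X_new[:, 0] + X_new[:, 1] > 0).astype(int)

    score = accuracy_score(y_new, model.predict(X_new))
    print(f"batch {batch}: accuracy before update {score:.2f}")

    model.partial_fit(X_new, y_new)  # incremental update, no full retrain
```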

Domino Data Lab is one of a number of companies hoping to tackle this problem by providing a containerised environment for data scientists to work in.


January 29, 2019  9:43 AM

AI keeps seeing average people

Cliff Saran Profile: Cliff Saran
Uncategorized

There is a mantra in technology that if a program is fed bad data, it will produce bad results. The same is true of AI. The quality of a machine learning application is directly related to the quality of data it is fed.

Sébastien Boria, mechatronics and robotics technology leader at Airbus, says an AI system that takes a sample of all the people who work on a particular manual process, to understand how they work, may well come up with a machine learning model based on the average of the people sampled. He says: “Just because 50% of the population does something, does not make it the right solution.”

Statistically speaking, population samples often follow the so-called normal distribution, where the majority of people cluster around the mean and roughly 5% sit out in the tails of the bell curve. This may be fine for discarding anomalous results and one-offs, but when applied to machine learning, Boria believes the top performer’s results “may be seen by the machine as an anomaly.”
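
To make Boria’s point concrete, here is a small, hypothetical sketch: if task-completion times are assumed to be normally distributed and anything beyond roughly two standard deviations is filtered out as an outlier (the 5% in the tails), an exceptionally fast operator is discarded from the training data along with the genuinely bad readings:

```python
# Hypothetical illustration: a simple z-score filter treats the best
# performer the same way it treats noise - as an anomaly to discard.
import numpy as np

rng = np.random.default_rng(1)

# Simulated task-completion times (minutes) for a manual process.
times = rng.normal(loc=30.0, scale=3.0, size=200)
times = np.append(times, 18.0)  # one exceptional operator, far below the mean

mean, std = times.mean(), times.std()
z_scores = (times - mean) / std

# Keep only observations within ~2 standard deviations (about 95% of a
# normal distribution); everything else is treated as "anomalous".
kept = times[np.abs(z_scores) <= 2.0]
discarded = times[np.abs(z_scores) > 2.0]

print(f"kept {len(kept)} readings, discarded {len(discarded)}")
print("discarded values:", np.round(discarded, 1))  # includes the 18-minute run
```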

Miracle on the Hudson

A decade ago, Captain Chesley Burnett Sullenberger landed US Airways flight 1549 on the Hudson river in New York after both engines on his Airbus A320 failed. In the FAA transcript of the cockpit recording, air traffic control offered Sullenberger the option of an emergency landing at Teterboro airport. He replied: “We can’t do it….we’re gonna be in the Hudson.” There is no flight operations manual on how to land an A320 on a river, but Sullenberger did it, saving 155 passengers and crew.

Industry 4.0 needs quality data

The heroic actions of Sullenberger are just one example of how humans can think outside the box. It is often necessary to work around normal procedures in order to achieve extraordinary results.

AI and machine learning were among the hot topics discussed at the recent World Economic Forum in Davos. Many believe AI technology will be the fuel that powers Industry 4.0. This is not necessarily about fully automated manufacturing. Instead, relatively small changes can be scaled up, significantly improving operational efficiency. But, as Airbus’ Boria has noted, basing machine learning on the average of how people work results in average AI. Industry 4.0 needs extraordinary AI, and this means capturing exceptional data.


January 24, 2019  12:57 PM

Windows 7: Don’t expect Microsoft to handle all Windows 10 issues

Cliff Saran Profile: Cliff Saran
Uncategorized

It does not seem that long ago that IT departments were battling to upgrade PCs from Windows XP to Windows 7, as Microsoft finally pulled the plug on support for the legacy operating system.

Now it is the turn of Windows 7, which has become almost a de facto PC standard for end user computing. This is mainly down to the absolute flop that was Windows 8. The idea of trying to push a touch-screen optimised user interface without a start menu button into enterprise IT probably seems ludicrous now, but that is exactly what Microsoft tried to do with Windows 8. Businesses stuck with what they knew: Windows 7. And now, as those still on Windows 7 start counting down the days to January 14 2020, many see Windows 10 as the only route forward.

Yes, Windows 10 does have a start menu, but it is certainly not simply a more modern Windows 7. The radical change Microsoft introduced with Windows 10 is that it manages the operating system updates – not corporate IT.

Auto-updates keep Windows 10 fresh

Microsoft promises there will no longer be big bang, high risk roll-outs of new Windows operating systems. Instead, users will receive fresh features on an ongoing basis, twice a year.

Why would this not be a good idea?

There are plenty of reasons. Which IT admin would entrust Microsoft with the reliability of updates that impact an entire PC estate, potentially spanning thousands of machines? After all, it is an approach that works on Google’s Android and Apple’s iOS operating systems.

Application compatibility

But one only has to look at the app stores to read horror stories of apps that used to work until Google or Apple updated their respective OSes. The situation with Windows is potentially worse because Windows claims backwards compatibility. Enterprise software providers do not have to rush out a new version of their applications to support the latest Windows OS if their software will run on an older version in so-called “compatibility mode”. But as anyone who has undertaken application compatibility testing will appreciate, just because something should be compatible in theory does not mean it works when deployed.

Even something as ubiquitous as Google Chrome has been known to fail. What hope is there for bespoke software or third-party applications?

Microsoft may well handle the updates of Windows 10, but IT teams and third-party developers will still need to ensure their software is compatible with each update.


January 15, 2019  11:37 AM

AWS mixes toxic cocktail for open source

Cliff Saran Profile: Cliff Saran
Uncategorized

There is currently a crisis unfolding in the open source world, with a number of companies changing their licensing to protect revenue. This has arisen due to a potentially toxic situation where public cloud providers have introduced managed services based on free open source products.

Is AWS toxic to open source?

Open source software providers generally commercialise their offerings by developing value-added services, additional enterprise-grade products and support. This is the kind of model adopted by NoSQL database provider MongoDB, which can reap the rewards of developing the open source NoSQL database as and when commercial customers pay for its enterprise features. This funding is essential in open source software because it helps smaller open source companies continue to invest development time in open source products.

However, if a public cloud provider offers its own managed service, that funding quickly diminishes. In October 2018, MongoDB introduced a so-called server-side public licence, which effectively curbs the ability of public cloud providers to build their own commercial services based on the free, open source technology.

So has AWS decided to pay for this licence? It seems it may not have.

AWS DocumentDB: It’s MongoDB compatible

AWS recently introduced DocumentDB, a managed NoSQL database service compatible with MongoDB 3.6. While this is not the latest MongoDB release, AWS now has a rival cloud-hosted NoSQL product. From a customer perspective, AWS can offer unmatched scale and resiliency across multiple availability zones.
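
In practical terms, compatibility means DocumentDB speaks the MongoDB 3.6 wire protocol, so existing MongoDB drivers can, in principle, simply be pointed at a DocumentDB cluster endpoint. A minimal sketch using the Python driver pymongo might look like this (the endpoint, credentials and CA bundle path are placeholders, not a real configuration):

```python
# Hypothetical sketch: connecting to an Amazon DocumentDB cluster with the
# standard MongoDB Python driver. The endpoint, credentials and CA bundle
# path are placeholders; DocumentDB requires TLS, hence the certificate file.
from pymongo import MongoClient

client = MongoClient(
    "mongodb://mydbuser:mypassword@mycluster.cluster-xxxx.eu-west-1.docdb.amazonaws.com:27017",
    tls=True,
    tlsCAFile="rds-combined-ca-bundle.pem",  # Amazon-supplied CA bundle
    retryWrites=False,  # DocumentDB does not support retryable writes
)

db = client["shop"]
orders = db["orders"]

# The same driver calls used against MongoDB work unchanged.
orders.insert_one({"customer": "A123", "total": 42.50})
print(orders.find_one({"customer": "A123"}))
```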

Clearly, there are many organisations that have built MongoDB systems on-premise, and many use cases where on-premise makes sense. But managed cloud services represent the path enterprise IT is taking. MongoDB even offers its own, called MongoDB Atlas, available on the AWS marketplace.

It would appear that AWS allowed an open source project to thrive and built business offerings around that product, but when the licensing for that product changed, AWS introduced a rival compatible offering. It may well make commercial sense. But morally, is this the right approach for a company with such a dominant position in the public cloud market to willingly take?

The internet would not exist without open source software. By adding popular and innovative open source projects to their platforms, public cloud providers lower the technical barriers for enterprises to adopt new technologies. But unless there is a viable way to fund innovative open source companies, public cloud companies like AWS will eventually kill off open source innovation.

