Open Source Insider


October 25, 2017  1:36 PM

7 years of open source: GitHub, Puppet, DataStax, Severalnines

Adrian Bridgwater

As the Computer Weekly Open Source Insider blog approaches its 1,000th post since starting in June 2010, we feature a number of comments from movers (and hopefully shakers) in this space.

Looking back seven years, it seemed more than slightly hopeful to dedicate a whole column to open technologies, platforms, tools, software languages and wider open design principles.

Today, things are different, as we know… and even Microsoft ‘hearts’ Linux, right?

Let’s hear from Puppet, GitHub, DataStax and Severalnines.

Puppet

Nigel Kersten, chief technical strategist at Puppet, speaks first.

“Since 2010 we’ve seen almost all significant software in the infrastructure automation space include a large open source component, so much so that it’s difficult to envisage an entirely proprietary new major player,” said Kersten.

“Take VMware (whose technology is built on proprietary software): the firm has open sourced its SDKs and container runtimes — and we’ve seen Microsoft become an authentic and impactful open source contributor. Plus, open source projects like Kubernetes, Docker and our own Puppet have all changed the way infrastructure is managed,” he added.

GitHub

Mike McQuaid is senior open source engineer at GitHub.

“In 2010 GitHub had 1 million open source repositories; by the beginning of 2017 this had grown to 67 million. GitHub has become the default place for open source software development and this has been hugely beneficial for the open source community. The workflow for contributing to most open source projects is now the same: create a pull request,” said McQuaid.
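For readers less familiar with that workflow, here is a minimal, hypothetical sketch (not GitHub’s own example code) of opening a pull request programmatically via GitHub’s REST API with Python and the requests library. The repository, branch names and token are placeholders.

```python
# Illustrative only: open a pull request via the GitHub REST API.
# OWNER, REPO, the branch names and the token are hypothetical placeholders.
import os
import requests

OWNER, REPO = "example-org", "example-project"
url = f"https://api.github.com/repos/{OWNER}/{REPO}/pulls"

response = requests.post(
    url,
    headers={
        "Accept": "application/vnd.github+json",
        "Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}",  # token read from the environment
    },
    json={
        "title": "Fix typo in README",
        "head": "my-fork:fix-readme-typo",  # source branch, in user:branch form for a fork
        "base": "main",                     # target branch
        "body": "Small docs fix, as discussed.",
    },
    timeout=30,
)
response.raise_for_status()
print("Opened pull request:", response.json()["html_url"])
```

In day-to-day use most contributors open the same request through the GitHub web interface or the command line rather than the API, but the shape is identical: a source branch, a target branch and a description.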

DataStax

Patrick McFadin is VP of developer relations at DataStax.

McFadin asks — what’s changed over the last seven years?

“The biggest change has been the use of more open source and specifically more distributed computing. First it was Hadoop, then Cassandra and Spark. Almost every action you make today on a PC or phone relies on open source projects to store that data. That wasn’t the case seven years ago,” he said.

He continues… what do the next seven years hold?

“More focus on cloud deployment of open source infrastructure around data and machine learning for production, and more around how open source and proprietary software work together. Building sustainable businesses around open source is, I believe, the best strategy for keeping the best people available focused on the challenges we have around data,” said McFadin.

Severalnines

Vinay Joosery is CEO at Severalnines.

“While the commercial database market is seeing a regular decline in market share overall, open source database technologies and companies continue to expand and grow… but they have a long way to go. Open source database vendors still only account for a few percent of a $40 billion database market; the rest continues to be largely dominated by Oracle, Microsoft and IBM,” said Joosery.

Joosery thinks that while new entrants have been able to build some decent businesses around their software, they have not been able to monetise it to the same extent as, for instance, the cloud vendors have. Developers, for example, like using cloud-based services to quickly and conveniently spin up database instances for their projects.

“Where this is all going is getting harder to predict. On one side, we have hundreds of smaller vendors building different variants of open source databases which are often application-specific. On the other hand, new revenues generated in the market are mostly going to the cloud-based DBaaS businesses,” said Joosery.

Joosery insists that his firm’s focus remains the same… to give users the ability to securely automate and manage open source database technologies using proven methodologies with enterprise-grade tools at a fraction of the cost of those currently dominating the commercial database market.

October 23, 2017  11:49 AM

7 years of open source: Cloud Foundry, Diffblue & Quest

Adrian Bridgwater

As the Computer Weekly Open Source Insider blog approaches its 1,000th post since starting in June 2010, we feature a number of comments from movers (and hopefully shakers) in this space.

Looking back seven years, it seemed more than slightly hopeful to dedicate a whole column to open technologies, platforms, tools, software languages and wider open design principles.

Today, things are different, as we know… and even Microsoft ‘hearts’ Linux, right?

Let’s hear from the Cloud Foundry Foundation, coding tools startup Diffblue and software management tools company Quest.

Cloud Foundry

Abby Kearns is executive director of the Cloud Foundry Foundation.

Looking at the macro-level progression of technologies across the seven-year period in question, Kearns offers some insight into where open source ‘weight’ has been felt over what is most of a decade.

“In mid-2010, cloud computing was beginning to take hold. Amazon Web Services (AWS) was synonymous with the term. While AWS still holds a majority share, the companies growing and building the cloud are doing so on the back of open source technologies like Cloud Foundry, Kubernetes and the Open Container Initiative,” said Kearns. “To put it another way: open source had won the datacentre by 2010. In 2017, it has won the next generation of enterprise technology: the cloud.”

Diffblue

Daniel Kroening is professor of Computer Science at Oxford University and founder of Diffblue, maker of an automated code quality testing and management tool.

“Throughout these past seven years, open source has been thoroughly embraced by companies – nine out of ten UK enterprises currently use open source software to reduce IT project costs by over £30,000, with 78% of companies using it. The community has earned a lot of trust. However, software developers often erroneously assume that open source components are reliable, patched and up to date but more than 50% of the Global 500 use vulnerable open source components,” said Kroening.

Kroening and team say that their goal with Diffblue is to automate ‘all traditional’ coding tasks: bug fixing, test writing, finding and fixing exploits, refactoring code, translating from one language to another and creating original code to fit specifications.

Quest

John Pocknell is senior product manager at Quest.

Pocknell says that since the turn of the decade, the open source movement has played a critical role in the evolution of technology and the way products or services are built.

“Innovation thrives on the contributions from the developer community and despite resistance in the early days, it’s given rise to open development platforms like GitHub and Docker, but also new toolsets for development, management, and migration of databases. Now companies who were built in the age of hardware can still thrive in the open, digital-first era, thanks to open source,” said Pocknell.


October 21, 2017  5:06 PM

7 years of open source: Digital Guardian, SUSE, Red Hat & Sonatype

Adrian Bridgwater

As the Computer Weekly Open Source Insider blog approaches its 1,000th post since starting in June 2010, we feature a number of comments from movers (and hopefully shakers) in this space.

Looking back seven years, it seemed more than slightly hopeful to dedicate a whole column to open technologies, platforms, tools, software languages and wider open design principles.

Today, things are different, as we know… and even Microsoft ‘hearts’ Linux, right?

Let’s hear from data loss prevention software company Digital Guardian, enterprise Linux distribution specialists SUSE and Red Hat, plus software repository and software supply chain automation firm Sonatype.

Digital Guardian

Thomas Fischer is global security advocate at Digital Guardian. Saying that the speed with which open source has become an industry norm is remarkable, Fischer is upbeat about its wider application at the enterprise level going forward.

“Today, open source software is not only used by businesses, but also as the foundation of many cloud-based vendor offerings. Open source software has also become a key element of the cybersecurity arsenal, from simple tools like Netcat through to more complex, customisable tools like the Social Engineer Toolkit (SET). In fact, so well known are these kinds of open source tools that they even feature in popular TV shows like Mr Robot,” said Fischer.

SUSE

Danny Rowark is regional director for the EMEA West region at Germany-headquartered open source perennial SUSE.

Rowark insists that, over the last 25 years, the open source community of thousands of developers and companies has produced more innovative technology than individual companies ever could.

“United and based on transparent open-source systems, the community emerged as a breeding ground for innovative technology. Examples of open-source innovation include software-defined networking and IaaS, containers, Cloud Foundry, OpenStack and Ceph or the DevOps concept — essential technologies for modern business models in and out of the cloud, forming the basis of digital transformation,” said Rowark.

Red Hat

James Read is senior solutions architect at Red Hat.

Read has pointed out that open source continuously evolves over time.

“Since 2010 (when this column started), large enterprises have increasingly adopted open source as part of their strategic direction. Linux and open source are omnipresent in enterprise IT and the community has been the birthplace of technologies like containers, automation tools and the software defined datacentre. These are technologies on which these large enterprises increasingly rely to embrace digital transformation successfully and move towards a software defined business model,” said Read.

Sonatype

Derek Weeks is vice president and DevOps advocate at Sonatype.

Weeks points out that organisations today enjoy an infinite supply of open source component parts to build software applications.

“[As much as] 80% to 90% of every modern application consists of open source components, largely developed and maintained by a highly responsible community of volunteer contributors who provide fixes for vulnerabilities that transpire,” said Weeks.

Weeks continued by saying that the ‘onus’ is on organisations to govern the quality of open source components within their software supply chains.

“Evidence shows that those who deploy DevOps-native automation to software supply chain governance improve application quality by up to 63%. Those that don’t, face increased liability due to gross negligence and consequential security breaches,” he concluded.
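To make Weeks’s point concrete, below is a minimal, hypothetical sketch of the kind of automated supply chain gate he describes: a build step that blocks known-vulnerable components. The component names, versions and the hard-coded vulnerability map are illustrative stand-ins for a real advisory feed, not anything Sonatype ships.

```python
# Illustrative supply chain gate: fail the build if any declared dependency
# matches a known-vulnerable version. The vulnerability set below is a
# stand-in for a real advisory feed; entries are examples only.
import sys

KNOWN_VULNERABLE = {
    ("commons-collections", "3.2.1"),  # example entry, not a live advisory lookup
    ("struts2-core", "2.3.31"),
}

def gate(declared_dependencies):
    """declared_dependencies: iterable of (name, version) tuples from the build manifest."""
    hits = [dep for dep in declared_dependencies if dep in KNOWN_VULNERABLE]
    for name, version in hits:
        print(f"BLOCKED: {name} {version} is a known-vulnerable component")
    return 1 if hits else 0  # non-zero exit code fails the pipeline stage

if __name__ == "__main__":
    # Hypothetical manifest contents for demonstration purposes.
    deps = [("commons-collections", "3.2.1"), ("guava", "31.1-jre")]
    sys.exit(gate(deps))
```

In practice a check like this runs inside the CI pipeline, so a flagged component fails the build before it ever reaches production.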


October 20, 2017  6:04 PM

Microservices served on blockchain, in open source

Adrian Bridgwater

Cloud application marketplace company Wireline is working with open source blockchain project developer Qtum.

The new union is intended to provide a conduit to consuming microservices at [web] scale using blockchain at the core.

As we know, microservices offer the ability to create Application Programming Interfaces (APIs) without having to manage the underlying hardware and software infrastructure.
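As a purely illustrative sketch of that idea (an assumption on our part, not code from Wireline or Qtum), here is the sort of function-as-a-service handler such a microservice reduces to: the developer writes the API logic, while the hosting platform supplies the request event and runs the servers. The example follows the common AWS Lambda proxy-integration convention in Python.

```python
# Illustrative serverless-style API handler: no server or infrastructure code,
# just the request/response logic. Follows the AWS Lambda proxy convention.
import json

def handler(event, context):
    # 'event' carries the HTTP request; the platform supplies it at runtime.
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```

The point of the pattern is that scaling, patching and routing are the platform’s problem, which is what makes a marketplace of small, composable services plausible.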

Wireline is building an ecosystem based on a microservices exchange; it is a marketplace for developers to test and integrate open source apps. The company is also seeking to create the largest open source developer fund, whereby developers building software critical for serverless architecture and the blockchain ecosystem are rewarded.

“This cooperation opens up new possibilities for both Qtum and Wireline developers for connecting distributed applications,” said Lucas Geiger, CEO at Wireline. “With Qtum we want to change this. Through the combination of both our efforts through the Qtum Foundation and Wireline Developer Fund, we will create an opportunity to monetize existing apps as well as support the creation of new ones.”

Qtum aims to establish a platform designed to bridge the gap between blockchains and the business world. The firm’s strategy includes providing a toolset to standardise the workflow of businesses and a hub of tested and verified smart contract templates that address various specialized business-use cases.

“We’re excited to collaborate with Wireline as we have a shared mission — to decentralise the application market and create a platform to facilitate the use of blockchain in business,” said Patrick Dai, co-founder and chairman of the Qtum Foundation. “We think this is beneficial to our developer community and will dramatically reduce the time to market for many of their apps.”

The Qtum blockchain application platform combines the functions of Bitcoin Core, an account abstraction layer allowing for multiple virtual machines and a proof-of-stake consensus protocol aimed at tackling industry use cases.

The Qtum Foundation, headquartered in Singapore, is the decision-making body that drives the project’s development.


October 20, 2017  1:14 PM

7 years of open source: Twilio, Synopsys & Veracode

Adrian Bridgwater

As the Computer Weekly Open Source Insider blog approaches its 1,000th post since starting in June 2010, we feature a number of comments from movers (and hopefully shakers) in this space.

Looking back seven years, it seemed more than slightly hopeful to dedicate a whole column to open technologies, platforms, tools, software languages and wider open design principles.

Today, things are different, as we know… and even Microsoft ‘hearts’ Linux, right?

Let’s hear from code analysis and vulnerability software company Veracode, cloud communications platform firm Twilio and Synopsys Software Integrity Group.

Synopsys

“What was once considered fringe and anti-establishment has now become the norm powering some of the largest technological innovations of our times. In the fields of artificial intelligence, machine learning, autonomous driving and blockchain, OSS leads the way,” said Mel Llaguno, open source solution manager at Synopsys Software Integrity Group.

Twilio

“These days it feels like there is an open source project for just about any computer science challenge that exists. Thanks largely to SourceForge and GitHub, community development and collaboration is easier than ever, leading to more creation and sharing between developers and, as a result, more innovation,” suggests Patrick Malatack, VP of product & GM of messaging at Twilio.

That said, Malatack notes that scaling and operating the services you’re building is still an incredible development challenge.

“At Twilio, we believe serverless computing paired with web services like AWS, Stripe and our own communication APIs will have the same effect on operations that open source has had on developer collaboration,” said Malatack.

Veracode

Veracode solution architect Chris Campbell says that open source software is clearly enabling businesses to lean on community expertise and deliver value from applications faster than ever before.

“But as recent high-profile breaches have shown us, there are tangible consequences to customers and employees if the vulnerability risk associated with OSS components isn’t managed effectively,” said Campbell.

Veracode’s 2017 State of Software Security report suggests that 88% of Java applications have at least one vulnerability from OSS components.

“The tools already exist to record and deal with OSS risk; many businesses now need to build these into their application security programs as a top priority,” notes Campbell.


October 16, 2017  4:53 PM

Huawei Kirin 970 AI processor — an open platform & ecosystem

Adrian Bridgwater

Huawei Consumer Business Group has introduced its first mobile offering with a distinct, dedicated Artificial Intelligence (AI) chipset, the Kirin 970.

The Kirin 970 is positioned as an inherently ‘open’ technology, a platform for onward development by programmers looking to put AI-driven functionality into applications and information services targeted at the mobile arena, and the Chinese telecommunications and cloud company has shipped it in its new Mate Series devices.

Intelligent machines

Preferring to label its newest Mate 10 and Mate 10 Pro devices as so-called ‘intelligent machines’ rather than ‘plain old’ smartphones, Huawei insists that this generation of products can now combine a degree of both cloud AI and on-device AI.

The company suggests that the combination of these two streams of AI should now be viewed as a programming platform opportunity for developers who want to bring more AI functionality into the software they produce.

Sensing sensors

Part of where Huawei says developers can look for AI-enabled software innovation is the fact that the Kirin 970 does (of course) ship on a mobile device. This in and of itself opens the door for AI analytics focused on the data produced by a smartphone’s many sensors.

With contemporary phones now featuring perhaps 20 or so sensors (accelerometer, gyroscope, temperature gauge, camera and light sensor, pedometer and so on)… there is a good chance to start channelling some of these information streams into AI-centric applications.

According to an official Huawei press statement, “By combining individual and collective intelligence for on-device AI, the new Huawei Mate Series delivers real-time responses to users, including AI-powered Real-Time Scene and Object Recognition and an AI Accelerated Translator. Kirin 970 is an open, mobile AI computing platform for third parties to create new and imaginative AI applications and which extends Huawei’s processing capabilities to the entire value chain.”

Speaking at a launch event in the Bavarian city of Munich this month was Christophe Coutelle, director of software engineering technology development and branding at Huawei Consumer Business Group (CBG).

Coutelle reflected upon Huawei’s central positioning for these technologies and promised that software application development professionals will find the company to be both engaging and helpful in its dealings with programmers who seek to create new AI tools with the Kirin 970 chipset.

An open AI ecosystem

Huawei wants third party apps to be driven by its own approach to AI chipset engineering.

“This is one of the first flagship devices that ship with the Android Oreo operating system. We believe that computing is shifting from mobile first to AI first. Mate 10 will also include Android’s neural network API so developers can enable more AI functionality in their applications… this will come in a software update scheduled for 2018. The device also ships with Google Play Protect to provide a security layer,” said Jamie Rosenberg, VP of Android and Google Play.

NOTE: Breakout sessions at the Mate 10 launch event featured discussions with Kevin Ho in his position as president of Huawei’s handset product line. Pressed on whether Huawei would ultimately see itself working to sell chipset engineering design Intellectual Property (a la ARM, for want of an obvious example), Ho would not elaborate beyond saying that whitepapers would be forthcoming detailing the firm’s specific plans for the developer ecosystem it wants to build.

The Google Assistant has also been optimised to work on the Mate 10.

The Huawei Mate 10 and Mate 10 Pro launch with Huawei’s EMUI 8.0 powered by Android 8.0.



October 3, 2017  10:40 AM

Databricks ‘lays foundations’ for machine learning (sorry)

Adrian Bridgwater Adrian Bridgwater Profile: Adrian Bridgwater

Everybody is loving that thing we’re all calling machine learning, aren’t they?

Splunk wants to make it mainstream, several other firms want to demystify it (or probably democratise it… or both) and Databricks wants to basically just make sure we all get a slice of the pie.

The firm says that while Artificial Intelligence (AI) and machine learning technologies are hot… the lack of data science and software engineering skills in this space is holding us back.

Spark up

The company is known for its Databricks Unified Analytics Platform and was founded by the team that created Apache Spark – the open source processing engine built around speed and sophisticated analytics.

Databricks is keen to tell us that it has opened a new EMEA headquarters in London and there is probably a list of new VP-types to help celebrate the fact.

Of more interest (arguably), we can look at how the firm’s analytics platform might now help bridge the gaps that exist around automating analysis of data and deploying it to suit the business — this could be the real sensitivity point.

“We are seeing huge and rapidly growing demand within enterprises for help with their machine learning and AI strategies and specifically their Spark deployments,” commented Ali Ghodsi, CEO at Databricks. “The demand is particularly strong in Europe and as such, it is our top priority for global expansion. With the opening of our London office, we will have a stronger springboard to drive growth and better serve our customers in EMEA.”

Shell has said that working with Databricks has allowed the petrochemicals giant to use big data across its business and create operational efficiencies in its supply chain.

Fertile breeding, in London

Deputy Mayor of London for Business Rajesh Agrawal has added appropriate platitudes and claims that the UK capital is “a fertile breeding ground” for companies developing disruptive technologies.

In an effort to accelerate global growth, the company recently raised $140 million from the likes of Andreessen Horowitz, New Enterprise Associates and Battery Ventures.


September 22, 2017  6:45 PM

Shawn Powers on Ansible: inside the mind of a veteran sysadmin

Adrian Bridgwater

This is a guest post for the Computer Weekly Open Source Insider column written by Shawn Powers — describing himself as Linux Journal editor, CBT Nuggets trainer, writer and geek all wrapped into one, Powers has been teaching IT for more than a decade and his specialties are Linux, Chef and integrating multiple platforms for larger networks.

Powers has just launched a new Ansible training series – Ansible Essentials – to help users develop the knowledge and skills required to install, configure and manage Ansible, the open source configuration management, application deployment and task automation platform.

The Ansible Essentials series from Powers covers playbooks, roles, templates, loops and conditionals, Ansible Tower and more.

Powers writes as follows…

I love to do system administration and operations work. I’ve been doing it for over two decades and few things bring me as much joy as solving a problem with a keyboard and green text on a black background.

I do not, however, enjoy doing the same thing more than once.

Like most veteran sysadmins (many of whom have changed titles in the migration to a DevOps world), I have a /usr/local/bin folder chock full of scripts that have automated my workflow thousands of times. Some of them are brilliant. Some of them are horrible insecure kludges. None of them are as elegant as Ansible.

When I describe Ansible to folks, I explain that it’s as if all of my BASH scripts were written by a team of professional programmers. Where I have a bunch of hacks, Ansible provides elegance. It’s not that Ansible does things I couldn’t have done on my own, it’s that with Ansible it’s done efficiently, securely and admittedly better than I could do on my own.

Basically, Ansible is what we always knew we could accomplish with scripting, but never had the time to fully realise because there was too much work to do. If I had Ansible 15 years ago, I might have been able to actually work a 40-hour week as opposed to the 60-70 hours I put in most weeks.

What is Ansible, I mean really?

That doesn’t really answer the question of what Ansible “is”, but simply saying it’s a configuration management and automation tool in the DevOps world doesn’t do it justice.

Ansible allows you to apply management tasks to 1000 computers with the same amount of effort as doing those tasks to one single computer.

It doesn’t need a fancy server running; in fact, it doesn’t need any server running at all. Ansible is simply a client-side toolset that you can run from your laptop. It also doesn’t have any agent program running on the machines it manages. It uses plain old SSH, just like us crusty old sysadmins have been using for years.

If I’m being completely honest, it’s the lack of “stuff” that makes Ansible good.

Using either ad-hoc command-line commands or simple-to-read text ‘playbooks’ full of commands, Ansible handles day-to-day tasks with minimal effort. I can confidently say it’s easier to use Ansible than it is to learn the BASH scripting which previously defined my IT world. Ansible is faster, easier, more scalable and better than my handmade tools could ever hope to be.
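As a flavour of what that looks like in practice, here is a minimal sketch (illustrative only, not taken from the Ansible Essentials course): an ad-hoc ping of every host, then a tiny playbook that installs and starts nginx on a hypothetical ‘webservers’ inventory group. It is wrapped in Python purely so the example is self-contained; in real life you would run the ansible and ansible-playbook commands straight from the shell, and it assumes Ansible is installed and an inventory is configured.

```python
# Illustrative sketch: drive Ansible's two everyday modes from Python.
# Assumes Ansible is installed and an inventory with a 'webservers' group exists.
import subprocess
import tempfile
import textwrap

# Ad-hoc mode: ping every host in the inventory over plain SSH.
subprocess.run(["ansible", "all", "-m", "ping"], check=True)

# Playbook mode: ensure nginx is installed and running on the 'webservers' group.
playbook = textwrap.dedent("""\
    - hosts: webservers
      become: true
      tasks:
        - name: Ensure nginx is installed
          package:
            name: nginx
            state: present
        - name: Ensure nginx is running
          service:
            name: nginx
            state: started
""")

with tempfile.NamedTemporaryFile("w", suffix=".yml", delete=False) as f:
    f.write(playbook)
    path = f.name

subprocess.run(["ansible-playbook", path], check=True)
```

The same playbook applied to one machine or a thousand is identical, which is the point Powers makes about effort.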

… and thanks to its open source nature, it’s completely free.


September 13, 2017  12:31 PM

Richard Morrell: a brief history (of life) in open source

Adrian Bridgwater

This is a guest post for the Computer Weekly Open Source Insider blog written by Richard Morrell, group CTO at Falanx Cyber Technologies and CISO of the Cloud Security Alliance.

Morrell looks back fondly on seven years of the Open Source Insider blog as we approach our 1000th post.

Hello open source world

Emerging from the shadows they came, predominantly genuinely well-meaning… early open source advocates were hobbyists bent on creating a new world order.

They were in the process of defining not just an operating system but a hierarchical meritocracy built around an ecosystem of licences giving birth to libraries, utilities, frameworks, applications and more importantly the skeleton of Internet and mobile computing.

I’ve been very fortunate to make a living out of open source since 1998. A founding staffer at many of the open source startups that grew out of the Bay Area and Silicon Valley Linux User Groups, I worked at Linuxcare and VA Linux and founded my own global security project and ecosystem, SmoothWall, plus the company of the same name (still trading today, protecting millions of networks globally, including major retail brands and governments).

I worked with Red Hat until the end of last year and am now at Falanx in the UK – a firm building possibly the fastest and most intelligent security platform to ever emerge from the open source community.

I’ve earned my chops and seen it all en route.

Friendly journalists, really

So there have been calls to arms, friendly journalists who were always there to write editorial or who would use us to correct stories or to check content.

Journalists gave us a lot of important validation; to see yourself in print media other than on Slashdot was a massive vote of confidence. None more so over the last seven years than the Open Source Insider blog on Computer Weekly.

Over the years it has really challenged the market to stand up and be counted, giving important credence to the dynamic way open source was changing not just the server landscape and the backbone of mobile development, but also acting as an important witness to the three critical building blocks of technology we take for granted.

Those blocks being cloud, communications and portability.

These three may not seem to jump off the page, but cloud has succeeded by giving the major vendors and the major social media and retail platforms the ability to offset the cost of ownership of Microsoft Windows or Microsoft instances in a virtualised environment. They do so by using custom-built, fit-for-purpose, Linux-derived OS instances and services to spin up at-scale environments and architectures, and by using the inherited security and network controls to tie them all together — scalable, secured, built on code looked at by a million pairs of eyes in the community.

The disruption of emerging messaging protocols over the last seven to ten years first saw open source clients piggyback on ICQ/MSN/AIM and other standards and then, as vendors such as Microsoft and AOL made life harder to play at the API level, replace those vendors with standards such as AMQP and XMPP.

Platform games

Lighter and more secure, a published open standard with an entirely open ecosystem has meant the emergence of the likes of WhatsApp: a platform developed entirely by harnessing the open source framework Erlang to become the dominant market winner in messaging — and then building an open source security framework for protecting messages in transit and at rest using a PKI built around OpenSSL.

This again demonstrates how the use of open source is hands down the clear unrestrained factor in where we are going.

Portability has become a watchword for the automation of how we get applications to live. Whether you are designing and developing your application using Python, Ruby, Java or any other open framework in your open source development environment or tooling, the chances are you are going to use technologies such as GitHub or GitLab.

Automate for speed

If you are going to automate to speed up and prove your workings using tools such as Ansible or Chef, you aren’t going to be using proprietary platforms for pushing out and managing those applications in their lifecycle.

From the cloud to mobile application development to the fast-emerging Internet of Things, open source has dominated and done so with grace and aplomb.

But has the legacy IT industry kept pace?

Established system integrator partners who traditionally resold or partnered with the heavyweight platforms, operating systems and application vendors continue to be profitable but there is now a defined ‘them and us’.

No longer are the big system integrator partners the automatic go-to choice for customers. Unable to hire and retain the talent they need, and still working out to a degree how to monetise open source, they watch the chasm continue to grow, and with that split comes a defined lack of innovation from the old world order. That innovation continues in the open source community, and from the least likely sources.

A feeder system has emerged, spurred by articles such as those published in Computer Weekly, demystifying the dark arts of the community. That feeder system of Meetups and Eventbrite meetings means that on any day of the week, somewhere on the planet, there are at least two dozen meetups ranging from Java users to Python developers, across cloud, security, mobile development and every topic you can imagine.

User group goodness

When we started with Linux, the ecosystem’s Linux User Groups (LUGs) were friendly if combative bastions of pride, with fierce mailing lists and (to be totally honest) dubious grooming. Now Meetups and Eventbrites have taken their place, offering specialised homes for evolution and integration, communication and innovation.

Richard Morrell on open source: been there, done it, got lots of T-shirts.

This is innovation often away from the radars of vendors, but doing one important thing: sharing use cases and learning from each other. These two emerging standard bearers have importantly provided a structure for user communication and education, strengthening the ambition of developers and operations staff alike to take huge leaps in utilising OSS.

If someone had said in 2001 that Microsoft would be working so much in open source I’d have laughed so hard then told you that you needed assistance.

A sea change within Microsoft, led by the openness team and then by Satya Nadella and his vision for a new Microsoft, is very real. Customers are benefitting from the investment that Microsoft, Red Hat and others are putting in to cohabit alongside and support each other, a prima facie example of how open source is very much here to stay.

As I said at the beginning, supporting us along the way are (in my estimation) eight journalists who have given us standing and air cover. The Computer Weekly Open Source Insider blog has been huge for so many of us. Here’s a glass raised to what is now seven years, and let’s see where we can take the open source dream next.


September 13, 2017  11:34 AM

Swiss financial services outfit SIX rolls for Red Hat

Adrian Bridgwater

Linux has always done well in the server room.

As we know, Linux is particularly good at uptime and at running multiple processes at once, and configuration changes can be executed while systems are live and running. Linux is also eminently controllable and so well suited to the management backend… the list goes on and it’s a long one.

Swiss Red Hat

News this week then sees open source behemoth Red Hat confirm its work with Swiss financial services provider SIX. The numerically named organisation has deployed Red Hat Enterprise Linux as the primary operating system supporting its IT infrastructure.

As part of a modernisation programme, SIX will run a DevOps strategy supported by Red Hat OpenShift Container Platform and Red Hat JBoss Enterprise Application Platform.

SIX operates an infrastructure for the Swiss financial sector and provides financial services for businesses around the world. The organisation is aiming to use new application architectures such as microservices and containers to optimise its use of private cloud technologies.

According to Red Hat’s Switzerland country manager Léonard Bodmer, “Red Hat OpenShift Container Platform is Red Hat’s enterprise-grade Kubernetes container application platform. It enables organisations like SIX to build, manage and scale cloud-native applications alongside legacy applications across hybrid and multi-cloud environments, in a stable, reliable and secure manner. Red Hat OpenShift Container Platform provides tools and processes to support DevOps, including automated provisioning, management and scaling of applications.”

JBoss usage

SIX has also selected Red Hat JBoss Enterprise Application Platform as its Java EE-based application runtime platform in order to deliver greater enterprise-grade security, performance and scalability across its environments.

By using Red Hat Enterprise Linux as a standard operating system backbone for all of its infrastructure, SIX says it expects to gain performance benefits whilst simultaneously helping to streamline administrative work — SIX will also use Red Hat Satellite as its management console to oversee its Red Hat Enterprise Linux footprint.

Image credit: SIX


