CW Developer Network


March 8, 2018  5:25 PM

Densify Cloe aims to wise up resource-dumb smart-apps  

Adrian Bridgwater

This is the age of women in technology, male appreciation of feminism and the drive for equality (although of course the first computer programmer was indeed a woman), so should we be using the ‘she’ pronoun for technology products and services as we sometimes do with countries and ships?

Cloud data management company Densify (the artist formerly known as Cirba) might think it’s okay, because the firm has named its latest product Cloe.

Actually not intended to convey any element of femininity, CLOE stands for CLoud Optimisation Engine – it is a software tool designed to provide machine-learning based analytics to optimise public cloud consumption.

But that makes this a channel story, not a developer story, doesn’t it?

Skynet self-awareness

No it doesn’t. Densify says that its technology enables applications to be self-aware of their resource needs. The implication for developers is that, with a larger percentage of architectural provisioning and application resource management taken care of by this layer, they will be able to focus on the things developers enjoy, like functional features along with look and feel.

Densify claims that Cloe beta customers are now saving an average of 40% on public cloud costs, with some exceeding 80%.

As we move to cloud (and the notion of cloud native application development) it is important to remember that application demands can fluctuate every day, hour and minute of the week.

Add to this complexity the fact that, in parallel, every single cloud compute instance can have millions of permutations.

What Cloe does is analyse cloud usage patterns so that Densify (the higher-level product from which the company takes its brand name) can proactively make applications self-aware of their resource needs – matching application need to available cloud resources.
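
To make that ‘matching’ idea concrete, here is a minimal, purely illustrative Python sketch of the kind of right-sizing decision such an optimisation engine automates – the instance catalogue, usage figures and headroom factor are all invented for the example and say nothing about how Densify actually implements Cloe.

```python
# Illustrative only: choose the cheapest instance type that covers observed demand.
# The catalogue and usage numbers below are invented for this sketch.

OBSERVED = {"cpu_cores": 1.6, "memory_gb": 5.2}   # e.g. 95th-percentile usage

CATALOGUE = [
    {"name": "small",  "cpu_cores": 2, "memory_gb": 4,  "usd_per_hour": 0.05},
    {"name": "medium", "cpu_cores": 2, "memory_gb": 8,  "usd_per_hour": 0.09},
    {"name": "large",  "cpu_cores": 4, "memory_gb": 16, "usd_per_hour": 0.18},
]

def right_size(observed, catalogue, headroom=1.2):
    """Return the cheapest instance that fits observed demand plus headroom."""
    need_cpu = observed["cpu_cores"] * headroom
    need_mem = observed["memory_gb"] * headroom
    fits = [i for i in catalogue
            if i["cpu_cores"] >= need_cpu and i["memory_gb"] >= need_mem]
    return min(fits, key=lambda i: i["usd_per_hour"]) if fits else None

print(right_size(OBSERVED, CATALOGUE)["name"])   # -> 'medium' in this toy data
```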

Smart-apps, resource-dumb

“While applications perform an infinite number of critical tasks, they aren’t resource-smart and they are often allocated massive amounts of unneeded public cloud resources,” said Gerry Smith, CEO, Densify.  “Cloe becomes the resource intelligence of an application, allowing that application to be self-aware of its resource usage patterns and to re-align public cloud use to its needs.”

According to magical analyst house Gartner, cloud services can [typically] have a 35% underutilisation rate in the absence of effective management, as resources are oversized and left idling.

Densify Cloe offers multicloud support so that applications are provided with the right resources even when simultaneously using multiple cloud vendors; Cloe recommends the best cloud technologies for the given application.

March 6, 2018  12:45 PM

What’s so ‘unified’ about universal data analytics?

Adrian Bridgwater

Databricks styles itself as an analytics software company with a ‘unified’ approach – the unification in this sense is supposed to suggest that the software can be applied across a variety of datasets, exposed to a variety of programming languages and work with a variety of call methods to extract analytical results from the data itself.

This UAP (as in Unified Analytics Platform) product also does the job of data warehouse, data lake and streaming analytics in one product, rather than three different ones.

So, more unifying unity, essentially… and that unification spans three basic data domains.

  • Data warehouse – trusted data that has to be reliable, kept for years – but is difficult to get everyone using
  • Data lake – great for storing huge volumes of data, but data scientists are typically the only ones able to use it
  • Data that exists in streaming analytics engines – great for working on data mid-stream – but can’t do the jobs of the other products.

It also unifies the data into one place, making it easier for data engineering, data science and business teams to all get what they need out of big data.

Product updates

Ground level definitions out of the way, what has Databricks been doing to add to unified utopia?

The company has this month announced support for Apache Spark 2.3.0, the open source cluster-computing framework, on Databricks’ Unified Analytics Platform. This means that the company is the first vendor to support Apache Spark 2.3 within a compute engine, Databricks Runtime 4.0, which is now generally available.

In addition to support for Spark 2.3, Databricks Runtime 4.0 introduces new features including Machine Learning Model Export to simplify production deployments and performance optimizations.

“The community continues to expand on Apache Spark’s role as a unified analytics engine for big data and AI. This is a major milestone to introduce the continuous processing mode of Structured Streaming with millisecond low-latency, as well as other features across the project,” said Matei Zaharia, creator of Apache Spark and chief technologist and co-founder of Databricks. “By making these innovations available in the newest version of the Databricks Runtime, Databricks is immediately offering customers a cloud-optimised environment to run Spark 2.3 applications with a complete suite of surrounding tools.”

The Databricks Runtime, built on top of Apache Spark, is the cloud-optimised core of the Databricks Unified Analytics Platform that focuses on making big data and artificial intelligence accessible.

In addition to introducing stream-to-stream joins and extending new functionality to SparkR, Python, MLlib and GraphX, the new release provides a millisecond-latency Continuous Processing mode for Structured Streaming.

Data developers

Instead of micro-batch execution, new records are processed immediately upon arrival, reducing latencies to milliseconds and satisfying low-latency requirements.

This means that developers can elect either mode—continuous or micro-batching—depending on their latency requirements to build real-time streaming applications with fault-tolerance and reliability guarantees.
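
For developers wanting to see what that choice looks like, the sketch below uses the PySpark Structured Streaming API introduced with Spark 2.3; the Kafka broker address, topic name and checkpoint path are placeholders you would swap for your own.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("continuous-demo").getOrCreate()

# Read a stream from Kafka (broker address and topic are placeholders).
events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "events")
          .load())

# Continuous processing, new in Spark 2.3: records are handled as they arrive,
# with checkpoints written at the stated interval. Swap the trigger for
# processingTime="10 seconds" to fall back to classic micro-batching.
query = (events.selectExpr("CAST(value AS STRING) AS value")
         .writeStream
         .format("console")
         .option("checkpointLocation", "/tmp/continuous-demo-checkpoint")
         .trigger(continuous="1 second")
         .start())

query.awaitTermination()
```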

The new model export capability also enables data scientists to deploy machine learning models into real-time business processes.

There is, arguably, unification here of many types and at many levels.


March 5, 2018  10:24 AM

Phantom SOARs into Splunk, at machine-speed

Adrian Bridgwater

Machine data analytics company Splunk has acquired security orchestration firm Phantom Cyber Corporation.

Phantom is more widely known as a SOAR player – that’s Security Orchestration, Automation and Response.

Splunk CEO Doug Merritt is on the record with the customary niceties designed to resonate with similar platitudes from Oliver Friedrichs in his capacity as founder and CEO of Phantom.

Both chiefs have suggested that Splunk plus Phantom is a positive for software engineers involved with security orchestration.

It is, in effect, big data plus SOAR.

SOAR, at machine-speed

SOAR platforms bid to improve the efficiency of security operations by automating tasks, orchestrating workflows, improving collaboration and enabling security software/data developers and their operations counterparts to respond to incidents ‘at machine speed’, as they say.
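
As a rough flavour of what ‘responding at machine speed’ means in code, here is a deliberately simplified Python sketch of a phishing-triage playbook; the helper functions are stand-ins invented for the example and are not Phantom’s or Splunk’s actual APIs.

```python
# Purely illustrative sketch of the kind of playbook a SOAR platform automates;
# the helper functions below are stand-ins, not Phantom or Splunk APIs.

def lookup_file_reputation(file_hash: str) -> str:
    # Stand-in for a threat-intelligence lookup.
    return "malicious" if file_hash.startswith("bad") else "unknown"

def quarantine_message(mailbox: str, message_id: str) -> None:
    print(f"quarantined message {message_id} in {mailbox}")

def block_sender_domain(domain: str) -> None:
    print(f"blocked {domain} at the mail gateway")

def open_ticket(severity: str, summary: str) -> None:
    print(f"[{severity}] ticket opened: {summary}")

def handle_phishing_alert(alert: dict) -> None:
    """Triage a suspected phishing e-mail without waiting for an analyst."""
    verdict = lookup_file_reputation(alert["attachment_hash"])
    if verdict == "malicious":
        quarantine_message(alert["mailbox"], alert["message_id"])
        block_sender_domain(alert["sender_domain"])
        open_ticket("high", f"Phishing confirmed: {alert['subject']}")
    else:
        open_ticket("low", "Phishing alert needs analyst review")

handle_phishing_alert({
    "attachment_hash": "bad0123", "mailbox": "user@example.com",
    "message_id": "42", "sender_domain": "evil.example", "subject": "Invoice",
})
```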

According to the magical box-loving analysts at Gartner, by year-end 2020, 15% of organisations with a security team larger than five people will be using SOAR tools for orchestration and automation reasons – up from less than 1% today (in 2018).

According to a press statement, “Customers will be able to use Splunk technology for orchestration and automation as an integral part of their Security Operations Center (SOC) platform to accelerate incident response while addressing the skills shortage.”

Splunk now talks about using automation capabilities to help solve automation challenges in a widening range of use cases, including Artificial Intelligence for IT Operations (AIOps).


March 5, 2018  7:37 AM

What shape is software automation DNA?

Adrian Bridgwater

Automation is driving software application development architectures, but its shape, form and process is rarely defined with any clarity.

In real terms, we might explain automation’s application within applications (if you will pardon the expression) by pointing to the categorisation, validation and contextualisation of data loads that move in specific directions in light of their relationship with each line of business function.

When and where we can do this, we can automate the next (similarly categorised, validated and contextualised) flow of data that has to move through the total system.
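
A toy Python sketch of that categorise-validate-contextualise-route loop might look like the following; the categories, rules and routing targets are invented purely to illustrate the pattern.

```python
# Illustrative sketch of the categorise -> validate -> contextualise -> route idea;
# the categories and rules are invented for the example.

def categorise(record: dict) -> str:
    return "invoice" if "amount" in record else "enquiry"

def validate(record: dict, category: str) -> bool:
    if category == "invoice":
        return record.get("amount", 0) > 0
    return bool(record.get("text"))

def contextualise(record: dict, category: str) -> dict:
    route = "finance" if category == "invoice" else "support"
    return {**record, "category": category, "route": route}

def automate(record: dict):
    category = categorise(record)
    if not validate(record, category):
        return None                       # flag for a human, don't automate
    enriched = contextualise(record, category)
    return enriched["route"]              # next system in the flow

print(automate({"amount": 120.0, "customer": "ACME"}))   # -> finance
```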

Automation DNA

It is at this point that we can start to ‘automatically discover business processes’ based upon defined and described logic.

This key element is part of what Worksoft is doing with its automation software. The company has now re-architected its core platform for automating cloud-based applications with functions from its partner network and third party mobile solutions in areas such as ALM, defect management and Continuous Integration/Continuous Delivery (CI/CD).

“We spent three years completely re-architecting our platform to be more modular and [to support industry] with mobile, HTML5 web apps, process discovery [and] pre-built optimisations,” said Shoeb Javed, chief technology officer, Worksoft.

The company now claims to span core software testing capabilities and functions in automation for business process discovery, compliance, documentation, risk analysis and robotic process automation (RPA).

Transferable automation

The platform offers what has been called ‘transferable automation’ – so that ‘automatically discovered business processes’ can be used at scale across the business, IT application teams, QA departments and operations.

Worksoft integrations with ALM and defect management software include HP ALM, SAP Solution Manager, IBM RQM, JIRA and ServiceNow.

Support for CI/CD tools includes Jenkins, Visual Studio and TeamCity. Integration with open source tools includes SoapUI and Selenium.

Mobile support includes Perfecto and Experitest.

Image source: Worksoft


February 26, 2018  3:31 PM

SAP Cloud Platform SDK for iOS reflects ‘componentised’ era for software

Adrian Bridgwater

German softwarehaus SAP has pointed to a changing set of models which could (and for many arguably should) change the way software is bought.

In the world of composable cloud-based componentised decoupled software, we now have the chance to use Platform-as-a-Service (PaaS) technologies in a way that suits the more modular way that software application developers will naturally want to work inside Agile programming teams running in Continuous Delivery (CD) cycles.

To get to the product news, SAP is now offering a new consumption-based commercial model for customers to ‘consume’ the SAP Cloud Platform.

The company reminds us that the SAP Cloud Platform itself is the raison d’être behind the creation of its higher-level design and architecture templating offering, SAP Leonardo.

SAP Cloud Platform president and chief technology officer Björn Goerke has claimed that the company’s new consumption-based commercial model makes it easier to use the ever-increasing SAP portfolio of platform and business services.

“Innovations in mobile delivery with consumer-grade experiences give organisations [and the developer teams that build the software they use] more freedom and agility to create and reinvent,” said Goerke.

SAP is using the term ‘low-touch customer experience’ as a new way of explaining how software will be bought. This is meant to explain the way that software can be configured and subsequently used by software developers using SAP Cloud Platform services.

Cloud credits

The purchasing process is facilitated by what SAP has called ‘cloud credits’, so that any available SAP Cloud Platform service can be activated via a single ‘provisioning cockpit’.

The commercial model is supposed to provide transparency into the usage of each service consumed through regular metering, reporting and detailed accounting analytics on the customer’s cloud credit consumption and balance.

To put all of that in layperson’s terms… a software developer who wants to use lots of cloud technologies and drink deep from the SAP stack of essentially cloud-based platforms, applications, tools and related services can now agree a statement of work with his or her manager that allows access to a specified level of functions (amount of storage, number of applications, number of analytics calls). That developer also gets an entitlement to use all other elements of the stack within defined limits, so that when they want to initiate a new project using new parts of the total SAP stack they can do so with minimal fuss, because the facility is already there. It is, therefore, called low-touch software, delivered with a consumer-like experience.
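
Reduced to pseudo-practical terms, the entitlement-and-metering idea looks something like the Python sketch below – the services, rates and credit balance are made up for illustration and bear no relation to SAP’s actual pricing or APIs.

```python
# Illustrative sketch of consumption-based 'cloud credits' metering;
# services, rates and balance are invented, not SAP Cloud Platform's API.

ENTITLEMENT = {"balance": 1_000.0}   # credits agreed up front

RATES = {
    "storage_gb_month": 0.2,
    "analytics_call": 0.001,
    "app_instance_hour": 0.5,
}

def consume(service: str, quantity: float) -> bool:
    """Meter a usage event against the remaining credit balance."""
    cost = RATES[service] * quantity
    if cost > ENTITLEMENT["balance"]:
        return False                  # would exceed the entitlement
    ENTITLEMENT["balance"] -= cost
    return True

consume("app_instance_hour", 24)      # spin up a new app for a day
consume("analytics_call", 10_000)     # a burst of analytics calls
print(f"credits remaining: {ENTITLEMENT['balance']:.2f}")
```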

SAP Cloud Platform SDK for iOS

Related to this news, SAP also announced an enhanced SAP Cloud Platform SDK for iOS to provide consumer-grade mobile work experience with new controls and tighter integration with the Xcode integrated development environment, as well as integrating with other SAP Cloud Platform capabilities for mobile apps.

Developers can now access APIs from SAP API Business Hub within the apps and translate apps using the SAP Cloud Platform Translation Hub.

In addition, analytic controls enable real-time data analytics and visualisations within mobile apps from SAP S/4HANA, SAP Big Data Services and SAP HANA.

Additional updates are also supposed to make it easier to use iOS device capabilities in end-to-end processes for administrative tasks, such as onboarding. Also included are code examples, which explain how to trigger SAP Leonardo processes with a mobile device, such as image recognition.

SAP also introduced two new iOS mobile applications:

  • SAP Insurance Sales Assistant enables an insurance agent to manage sales activities by providing customer insights via a customer view and an overview of performance KPIs.
  • SAP Asset Manager is an iOS app that uses SAP S/4HANA as well as SAP Cloud Platform as the IoT platform for managing work orders, notifications, condition monitoring, material consumption, time management and failure analysis.

So once again we’re looking at automation as a key element of the activity going on here. This is automation in the sense of automated control options within the wider software stack and cloud PaaS being used … and also automation here in the sense of pre-integrated options for deploying more software (with low-touch consumer grade experiences) from the total stack and platform itself.

The consumption-based model of software is here… the road ahead promises to bring the outcomes-based model where software delivery is priced based upon its effectiveness for a given use case… but that’s another story, soon to be told.

Image credit: SAP newsroom


February 26, 2018  11:22 AM

Mobile World Congress 2018 #MWC18: confessions from the show floor

Adrian Bridgwater

Mobile World Congress is a love it or hate it affair. Three days of treading what is now Europe’s largest technology conference ought to be enough to put anybody off for life, yet most of us seem to come back year after year for more. So why do we do it exactly?

Mobile World Congress is all about mobile (smart) phones, obviously. But it is also a frenzy of industry activity that centres around the emerging technologies in telco carrier service provision and network management.

It is also an exhibition forum to showcase all of the new advances in Artificial Intelligence (AI), big data analytics and cloud computing that drive every device from the desktop, to the handheld and onward to the machines that populate the Internet of Things (IoT).

We’re not even supposed to call them smartphones anymore anyway.

High-end mobile devices now outstrip the basic desktop PCs of perhaps a decade ago — at least in terms of storage, application sophistication and access to network-based resources.

This is of course the real point here: the mobile device today often plays the role of ‘front-end interface’, with all the real processing power, analytics and intelligence going on back in the cloud datacentre.

But this story is not meant to be a short history of mobile computing. This is meant to be an insight into the event itself. What’s it like to walk the floors of the Fira de Barcelona (literally, the ‘trade fair’ of Barcelona) and what’s going on this year?

Battle ready

Make no mistake, Mobile World Congress (MWC) is not for the faint hearted. The walk from Hall 1 through to Hall 8 measures out perfectly at a mile. Fitbit walking points are easily earned, shoe leather is quickly worn and hearts are frequently crushed when you realise your 9am, 10am and 11am meetings are in halls 7, 1 and 8 respectively.

Technology journalists often run entire MWC preview stories based on what items they’re fitting into their backpack for the show. If you’re looking for some advice on this season’s ‘most trusty camera monopod’, then these are the stories you should read.

Arguably more pressing (than tripods at least) is the subject of food. The press centre offers not much more than a plate of dried out cookies and the lines for industrially produced sandwiches are never appealing.

The smart money is on packing some new age nut and oat bars with an additional supply of those little tins of tuna that come in three packs with ring pulls. Additional essential bag paraphernalia (monopods excepted) should also include hand sanitizer, phone chargers and photo ID.

Getting into the venue is a task in itself.

The Spanish door guards at the Fira are notoriously unwavering in their insistence on ID, and pleas of ‘look, I have a badge and why would I fly all the way here to Spain and stand here with all these muesli bars and handwash’ hold very little water.

As a special note to the GSMA (who has implemented new rulings), although the show promotional materials do suggest that European ID cards are accepted as entrance ID, they refused a British (EU, for now) driving licence point blank.

“Go back to your hotel and get your passport, we will see you again in about an hour,” said the security guard, who looked pained at how strict the rules were… as he took his glove off to shake my hand as a human-level apology.

Once more unto the breach

Walking the show floor this year delivers the same heady mix of beeps, blips and bings that it always does. It’s like Las Vegas, but with smart city dioramas and 5G roadmap discussion instead of fruit machines and free cocktails.

Despite both Formula 1 and professional darts now banning grid girls and walk-on models respectively, it appears that most stands here are fronted by girls in short skirts rather than boys in tight t-shirts. That being said, the audience is so deeply, genuinely geeky that they appear to be more interested in the gadgets.

Briefings are already underway and the key themes as suggested here are already showing up as the main talking points: 1. Artificial Intelligence, 2. 5G and LTE, 3. IoT & edge computing, 4. Software-Defined Networks (SDN), 5. Big data & data analytics, 6. Enhanced Voice Services (EVS), 7. Net neutrality and borderless connectivity.

Somewhere in among all this madness there will be a few device product launches… but the backbone appears to be significantly more of a discussion point than the shiny shiny in your pocket.

For the real product news please see MWC 2018: Samsung Galaxy S9 launch kicks off annual mobile festival and more at computerweekly.com/news.


February 21, 2018  10:10 AM

Visa Developer launches: that priceless API will do nicely

Adrian Bridgwater

Mastercard has ‘Priceless’ and American Express has ‘That’ll do nicely’ and ‘Don’t leave home without it’, but what does Visa have?

The answer appears to be the (arguably less catchy) ‘It’s everywhere you want to be’ and ‘More people go with Visa’, or so it seems.

Perhaps Visa will come up with a catchier catchline for its newly launched Visa Developer programme.

Targeted at software application developers (obviously), Visa Developer is intended to help move the Visa retail payments network to an open platform that will drive new software developments in payments and commerce — and those developments are hoped to come from financial institutions, merchants and technology companies.

Charlie Scharf, chief executive officer, Visa Inc has said that at launch, the new platform offers access to some of Visa’s payment technologies and services including account holder identification, person-to-person payment capabilities, secure in-store and online payment services such as Visa Checkout, currency conversion and consumer transaction alerts.

Beta trial partners – including Capital One, CIBC, Emirates NBD, National Australia Bank (NAB), RBC, TD Bank, Scotiabank, TSYS, U.S. Bank and VenueNext – have already created prototype applications.

Open platform components here include:

  • A globally accessible developer portal.
  • An open platform that provides access to Visa APIs and software development kits.
  • A testing sandbox that offers application developers a plug and play experience, as well as access to Visa test data.
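
For the curious, a first call against that sandbox looks roughly like the Python sketch below; Visa’s APIs use two-way SSL (a client certificate) plus basic authentication, and the hello-world endpoint path shown follows the portal’s getting-started walkthrough, so check it against your own project before relying on it.

```python
# Hedged sketch of a first call against the Visa Developer sandbox.
# The endpoint path and credential file names follow the portal's
# hello-world walkthrough and should be verified against your project.
import requests

USER_ID = "your-user-id"                        # issued when you create a project
PASSWORD = "your-password"
CLIENT_CERT = ("cert.pem", "privateKey.pem")    # downloaded from the portal

response = requests.get(
    "https://sandbox.api.visa.com/vdp/helloworld",
    auth=(USER_ID, PASSWORD),   # basic auth credentials
    cert=CLIENT_CERT,           # two-way SSL client certificate and key
    timeout=10,
)
print(response.status_code, response.json())
```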

Visa’s vision for its global developer engagement program includes the creation of a marketplace for financial institutions, merchants and technology companies to collaborate on digital commerce applications and services.

Does Visa need a developer catchline now?

How about ‘That priceless API will do nicely, don’t enter your SDK without it’? Nah… turns out it’s ‘Deliver the commerce experience of tomorrow’… hmm, still needs work.


February 12, 2018  8:28 AM

RHEL trends: secure automation, (any) cloud-native performance… and automation, again

Adrian Bridgwater

As part of a series of analysis posts designed to look at the major (and some slightly lesser) open source distributions and what they have achieved over the last year, let us briefly revisit Red Hat’s major release, which came in the shape of Red Hat Enterprise Linux 7.4, or simply RHEL, to its friends.

As with so many platform level software releases at this time, Red Hat went for workload automation controls as a key theme.

In particular, Red Hat went for automation efficiency that can be applied to cloud native applications, securely, obviously.

Any cloud

The firm wants RHEL to be seen as an operating system that works across physical servers, virtual machines and hybrid, public and multi-cloud footprints. So that was many cloud, any cloud – often employing containers – in case you weren’t counting.

Even in 2018, open source still attracts more criticism for its security stance than proprietary systems do.

Security updates

Red Hat has tried to address this issue with updated audit capabilities to help simplify how administrators filter the events logged by the audit system, gather more information from critical events and to interpret large numbers of records.

The firm also introduced a USB Guard feature that allows for greater control over how plug-and-play devices can be used by specific users to help limit both data leaks and data injection.

Enhanced container security functionality, with full support for using SELinux with OverlayFS, helps secure the underlying file system and provides the ability to use Docker and user namespaces together for fine-grained access control.

Other key trends

Among other key trends, RHEL 7.4 included new features designed to improve the performance of both networking and storage.

For the deep geek value here – Red Hat added support for NVMe Over Fabrics, which helps connect to high-performance NVMe storage devices located in a datacentre over either Ethernet or InfiniBand fabric infrastructures.

For Linux containers – the latest version of Red Hat Enterprise Linux Atomic Host refines enterprise-grade Linux container security and adds support for package layering, providing a means of adding packages like monitoring agents and drivers to the host operating system.

We also saw the introduction of LiveFS as a Technology Preview, which enables users to install security updates and layer packages without a reboot… so, kind of more automation goodness again really.

Red Hat’s affable public relations department contrived to put these words in the mouth of Jim Totton, vice president and general manager, platforms business unit at Red Hat – as noted below:

“The modern enterprise will not be solely based in physical servers or cloud services; rather, the path to digital transformation weaves across four distinct technology footprints. The latest version of the world’s leading enterprise Linux platform supports each of these deployment methodologies with new security features, improved performance and introduces new automation capabilities to cut through the inherent complexities of heterogeneous datacenters.”

Those four footprints (in case you missed the reference) are:

  • Traditional physical servers – in server rooms or in-house datacentres.
  • Virtual machines – in public cloud datacentres.
  • Next-generation cloud – servers in a combination of the two above locations to provide ‘hybrid’ cloud services with workload orchestration.
  • Container services – which encapsulate discrete components of application logic provisioned only with the minimal resources needed.

So in summary, RHEL is all about automation performed with an (even more) focused eye on security and cloud-native performance for any cloud and any virtualisation layer within the total fabric of computing… oh yes, and it’s all about automation, again, throughout, that’s how it works.


February 9, 2018  10:21 AM

Progress goes on louder on React, Xamarin & Fluent-themed design

Adrian Bridgwater

Application development and deployment company Progress has come forward with new releases of the Progress Telerik and Progress Kendo UI tools.

New capabilities include ‘Fluent’ design themes and support for frameworks including React, Angular, Vue and Xamarin.

The firm insists that it is the first vendor to offer React UI components, built from the ground up… and the first to add Fluent-inspired themes to its component suites.

Progress Kendo UI is a UI library for data-rich web applications with an accompanying toolkit of functions.

Faris Sweis, senior VP and GM of developer tooling at Progress, says the Kendo UI library now includes “native support for React, giving developers the option to select the framework of their choice and move between them freely, all while relying on the robust UI components available in Kendo UI”.

Progress says it delivers pure, high-performance React UI components, written from the ground up for React architectures and without any jQuery dependencies.

Kendo UI for Angular enhancements come in response to customer feedback – here we can see support for Angular with new components, including TreeView, Window, Splitter and Gauges.

“Kendo UI tightly integrates with Angular, empowering developers to build next-generation UIs that live up to any modern website design requirement, including native and responsive Angular web applications,” said Sweis.

Kendo UI for Vue adds four new Vue components: Pivot Grid, Gantt Chart, Progress Bar and Splitter, allowing Vue developers to carry over productivity benefits usually experienced when creating web UI with the Kendo UI library.

Other new features include a new Fluent theme: a Fluent-inspired theme comes to the Telerik control suites targeting WPF and WinForms, allowing applications built with them to fit in with the appearance of the Windows operating system.

“Telerik UI for Xamarin now enhances the design time experience previously unavailable for Xamarin.Forms developers. This feature allows .NET developers to drag and drop components on the screen quickly. In addition, the new release includes five new controls that enable fast development of Xamarin apps,” said Sweis.

In addition to support for Microsoft SQL Server, Microsoft SQL Azure Database, Oracle and MySQL, Telerik Reporting and Telerik Report Server now enable easy access to more than 30 new data sources such as Salesforce, MongoDB and Amazon Redshift.


February 5, 2018  8:34 AM

Snowflake aims to unify cloud data toolsets

Adrian Bridgwater

Why did Snowflake Computing call itself Snowflake Computing?

Was it because its founders are snowflake generation kids who are incapable of taking a few hard knocks here and there? Um, ah no, it’s not that.

Is it because Snowflake Computing is headquartered in Alaska, Iceland or some other chilly clime? Nope, wrong again.

In truth, it’s probably because snowflakes are all one-off creations and corporate spin doctors like to slap the word ‘unique’ on every product and service they attempt to bring to market. So just how different (if not wholly unique) from other vendors is Snowflake?

Snowflake sells a cloud-based data storage and analytics service called Snowflake Elastic Data Warehouse that allows corporate users to store and analyse data using cloud-based hardware and software.

The company’s approach appears to be differentiated enough to earn it some weighty investment input. Snowflake this January (2018) secured a $263 million (£187 million) round of funding, which it will now use to expand its cloud data warehouse business in Europe and Asia. The company is now valued at $1.5 billion (£1.06 billion).

That expansion effort appears to be happening already: the firm’s 2018 Cloud Analytics World Tour starts this February in London.

Unifying cloud data tools

“Across the world, people at organisations of all sizes are trying to unify their tools, people and data for more effective and insightful analytics,” said Snowflake’s chief marketing officer, Denise Persson. “The Cloud Analytics World Tour will help attendees escape the restraints of legacy technology, empower everyone in the organisation with data and create a truly global enterprise data warehouse that unlocks all the value hidden in all of their data.”

Centring on the theme ‘Unite the Data Nation’, these events will share best practices and lessons learned in data warehousing and data analytics. Snowflake grandly promises ‘evidence-based insights’, as opposed to corporate conjecture, presumably.

Spanning February 6th – May 10th, the tour will traverse 14 cities across the USA, Europe and Australia.


