CW Developer Network


June 21, 2017  8:41 PM

Boomi accelerators, Dell really does make software

Adrian Bridgwater

Let’s not beat about the bush because we already know that software runs the world – and that this means that every company is now a software company.

So resounding is this truth that we must now accept that those bastions of the IT industry that grew up as hardware companies are also attempting to convince us that they too are software firms.

IBM’s move to offload its hardware business in the latter Gerstner years and focus on becoming a services company (clue: services are made of software) was more obvious (and perhaps more successful) than some such pivots.

Machinations & manoeuvrings

Arguably less obvious were the software-centric machinations & manoeuvrings that have emanated from Dell.

Let’s be clear, any predominantly hardware-centric business will have plenty of software application developers and a good portion of its own hand-crafted proprietary code running inside its own operation. Intel is a great example of this.

Intel made software even before we knew of the Intel Developer Forum (IDF) event and all the later moves the chip giant has made in software. Even if some of Intel’s software moves (ouch! Mashery – help, do we want it or not?) were not as successful as its wider approach, we still knew Intel was a software business.

Can we say the same of Dell? Don’t we still think of ‘configure to order’ desktop towers, (arguably) overpriced replacement laptop batteries and a declining business model beset with post-millennial plant closures?

Of course Michael Dell is a smart cookie and his firm had actually started its relationship with EMC as far back as 2001 (the merger happened in 2016). Storage means data and data means software just as Heinz makes beans and beans means Heinz. So, given this rumination-fuelled preamble, what has Dell done recently and is it of command line level software substance for real developers?

Does Dell software have substance?

Let’s look at Boomi. Dell Boomi AtomSphere is an on-demand multi-tenant cloud integration platform for connecting cloud and on-premises applications and data. As detailed on TechTarget, the platform enables users to design cloud-based integration processes called Atoms and transfer data between cloud and on-premises applications. Each Atom defines what is necessary for the integration.

The Dell Boomi boys and girls and gender-neutral individuals have been busy recently and this month sees the team announce the availability of its Spring 2017 updates. With a promise to move, manage, govern and orchestrate data across hybrid IT architectures, the latest release is distinguished in its use of so-called ‘no code application integration accelerators’ and new enterprise scale DevOps features.

What is a no code application integration accelerator?

A no code application integration accelerator isn’t actually a thing, i.e. not in the same sense that a banana or an operating system is a thing. It is Dell’s preferred term used to describe Boomi’s pre-defined data mappings, pre-built tools and reusable components that can be used by developers in what manifests itself as a drag-and-drop (hence, no code) data integration and application development environment.

According to Dell, “Re-using common components increases flexibility, reduces duplication and simplifies updates. This saves time and development effort, especially for organisations with varied businesses, organisations growing through acquisition, as well as conglomerates and organisations with semi-autonomous business units.”
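
To make the idea concrete, here is a minimal sketch of what a pre-defined, reusable data mapping amounts to. The names are our own invention rather than Boomi’s actual API; the point is that the mapping is declared once as data and then reused by any integration that needs it.

```python
# A minimal sketch of a reusable, pre-defined data mapping. The names
# here are illustrative, not Boomi's API: the mapping is declared once
# as data (not code) and reused by any integration that needs it.

# Pre-defined mapping: source system field -> target system field
CRM_TO_ERP_MAPPING = {
    "customer_name": "account_name",
    "customer_email": "contact_email",
    "postcode": "postal_code",
}

def apply_mapping(record: dict, mapping: dict) -> dict:
    """Translate a record from one system's schema to another's."""
    return {target: record[source]
            for source, target in mapping.items() if source in record}

# Any number of integrations can reuse the same component
crm_record = {"customer_name": "Acme Ltd",
              "customer_email": "ops@acme.example",
              "postcode": "SW1A 1AA"}
print(apply_mapping(crm_record, CRM_TO_ERP_MAPPING))
# {'account_name': 'Acme Ltd', 'contact_email': 'ops@acme.example',
#  'postal_code': 'SW1A 1AA'}
```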

Fortified DevOps features

Boomi’s fortified DevOps features provide new enterprise capabilities including options for Docker container deployment. The firm points to the ability to create pre-configured containers and says that Docker containers make it simple and flexible to spin up and down instances of Boomi with pre-defined configurations, increasing developer productivity and speeding up application integration.
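
As a hedged sketch of that spin-up-and-down idea, the Docker SDK for Python can start container instances from a pre-defined configuration. The image name and environment variable below are hypothetical placeholders, not Boomi’s documented settings.

```python
# A hedged sketch of spinning pre-configured container instances up
# and down with the Docker SDK for Python. The image name and the
# environment variable are hypothetical placeholders, not Boomi's
# documented settings.
import docker

client = docker.from_env()

def spin_up_runtime(name: str, install_token: str):
    """Start a container instance from a pre-defined configuration."""
    return client.containers.run(
        "example/integration-runtime:latest",          # hypothetical image
        name=name,
        detach=True,
        environment={"INSTALL_TOKEN": install_token},  # hypothetical variable
    )

# Spin an instance up on demand...
runtime = spin_up_runtime("dev-runtime-01", "token-from-your-account")

# ...and back down again when the work is done
runtime.stop()
runtime.remove()
```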

There are also data governance and security goodies in here for those with an interest.

Dell, the software company then? Yes of course it is.

Dell, the key go-to player for hardcore command-line-dwelling programmers really cutting it in the new world of emerging languages, microservices and polymorphic data types reverberating around services-centric virtualised computing environments? Well, perhaps.

June 14, 2017  1:17 PM

Do HR software developers build ‘healthy’ applications?

Adrian Bridgwater

Do human resources (HR) software application developers build ‘healthy’ applications? This is the question.

To be clearer – we talk about ‘application health’ all the time in terms of a piece of software’s ability to exist successfully inside the use case, system and environment that it was built to serve.

But that is not what we are talking about here.

HR applications have to exhibit ‘application health’ (as above) in just the same way as any other app. But they also have to be constructed to deal with the nuances of human behaviour and so be inherently engineered to serve the real needs of human beings in the workplace.

SAP SuccessFactors

German softwarehaus SAP staged its SuccessConnect 2017 conference and exhibition in London this week to examine how its HR software division is expanding its product line and tuning its work in this space.

As noted here, SAP SuccessFactors suggests that the architectural structure of HR software has to be able to work to the unpredictable nuances of the ‘human mental model’ and so must be inherently online, mobile (so accessible anywhere) and flexible. If we can make HR software ‘touchy-feely’ in this way, then it may be more likely to be accepted and successfully used for future business success.

So there is a need not just for healthy applications, but for apps that are capable of bending to serve the needs of workers in the workplace.

Boycott factor

If developers fail to get HR applications right, then the suggestion here is that:

  • a) they may have developed an amazing HR application with superb functionality, performance, robustness and overall operational excellence but…
  • b) users end up baulking at the way the app operates (and indeed the workplace functions that it compels them to perform) and so they logically start to boycott the application itself, which renders the business function it seeks to deliver useless.

Speaking at SAP SuccessConnect 2017 this year was Julian Simée in his role as senior manager for corporate HR strategy & project lead for ‘Digitalization HR’ at Lufthansa Group.

“We have to think about what transformative processes mean for actual employees [when we implement SAP SuccessFactors], otherwise, they will end up boycotting the transformative change that we are trying to apply on an end-to-end basis across the company,” said Simée.

SAP has also pointed to recent research which suggests that healthy, thriving employees are key to creating sustainable, profitable organisations.

  • An internal study at SAP showed that a 1-point increase in its internal well-being survey score impacts the business by $75-85 million in operating costs.
  • Organisations with cultures and programs supporting employee health and well-being are perceived as more desirable by job candidates.
  • Increases in employee well-being have been associated with a 35% lower turnover rate. This translates into savings of approximately $19.5 million for a typical 10,000 person company (a back-of-envelope sketch of that arithmetic follows below).
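
That last figure is easy to sanity-check. In the back-of-envelope sketch below, the baseline turnover rate and per-departure replacement cost are assumptions chosen purely to show how such a number might be reached; they are not figures from SAP’s study.

```python
# A back-of-envelope check on the turnover claim above. The baseline
# turnover rate and per-departure replacement cost are assumptions
# chosen purely to illustrate the arithmetic; they are not figures
# from SAP's study.
employees = 10_000
baseline_turnover = 0.13        # assumption: 13% of staff leave per year
cost_per_departure = 43_000     # assumption: dollar cost to replace one leaver
reduction = 0.35                # from the claim: 35% lower turnover

departures_avoided = employees * baseline_turnover * reduction  # 455
savings = departures_avoided * cost_per_departure
print(f"{departures_avoided:.0f} departures avoided, ~${savings / 1e6:.1f}M saved")
# 455 departures avoided, ~$19.6M saved
```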

Health & wellbeing

SAP says that its SAP SuccessFactors HCM Suite supports organisational health and wellbeing goals — and that establishing sustainable cultures to support health and wellbeing should be a part of all application design.

Speakers at SuccessConnect did acknowledge press questions relating to ‘rejection and push-back’ from employees who do not want to be forced to adopt healthier lifestyles. The responsibility, it appears, sits with management to ‘push down’ these actions – great news for workers, not such great news for the self-employed.

The truth here appears to be that health and wellbeing form a complex subject, one that is perhaps not always an initial consideration for software application developers throughout the app design and build process.


June 12, 2017  2:40 PM

New Relic prescribes data health for DevOps

Adrian Bridgwater

Digital intelligence company New Relic has added extra functions to its eponymously named Digital Intelligence Platform.

The company has introduced what it calls a Health Map.

High-density APM

This is the firm’s play to offer its own Application Performance Management (APM) software alongside its own (similarly eponymously named) infrastructure products.

The end result offers what has been called a “high-density view” of applications and the infrastructure supporting those applications.

“By standardizing monitoring within a single cloud platform, customers will be able to work better together to pinpoint issues and optimise their dynamic cloud or hybrid environments, in particular those leveraging Amazon Web Services (AWS) products,” said the company, in a press statement.

It is true that, as firms adopt microservice architectures, it becomes more difficult to pinpoint performance issues within the application stack.

New Relic’s new Health Map feature claims to bring together data on application and infrastructure performance into a single prioritised view.

DevOps factor

This unified view is supposed to allow DevOps teams to understand whether the source of a performance issue lies in the application code or in the infrastructure layer.
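
For the programmatically inclined, the raw material for such a view has long been available over New Relic’s public REST API (v2). Here is a minimal sketch that pulls application health and sorts the worst offenders to the top; the endpoint and header come from the v2 API, though the field names should be checked against the current documentation.

```python
# A minimal sketch of pulling application health over New Relic's
# public REST API (v2) and sorting the worst offenders to the top.
# The endpoint and header are from the v2 API; field names should be
# checked against the current documentation.
import requests

API_KEY = "your-new-relic-api-key"  # placeholder

resp = requests.get(
    "https://api.newrelic.com/v2/applications.json",
    headers={"X-Api-Key": API_KEY},
)
resp.raise_for_status()

# Prioritise: unhealthy applications first
order = {"red": 0, "orange": 1, "yellow": 2, "green": 3}
apps = sorted(
    resp.json().get("applications", []),
    key=lambda app: order.get(app.get("health_status"), 4),
)
for app in apps:
    print(app.get("health_status"), app.get("name"))
```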

New Relic Infrastructure has 20 out-of-the-box integrations with AWS products, including new connections to Amazon Kinesis Firehose, Amazon Elasticsearch Service, Amazon Route 53, Amazon EC2 Container Service and Amazon EC2 Container Registry.

In addition, AWS Billing and Cost Management features have been introduced.

The wider concepts expressed here are intended to suggest that developers and system administrators in DevOps teams can standardise the monitoring of customised services alongside dynamic infrastructure instances within the context of the applications that they support.


June 9, 2017  12:00 PM

Bi-directional curious? Veritas tackles multi-cloud migration maladies

Adrian Bridgwater

Information management company Veritas Technologies is targeting the headaches that the firm suggests might be thrown up by running multi-cloud operations.

Hybrid cloud deployments featuring a mix of public and on-premises environments are conceptually sound, but the reality of deploying applications (and the data storage they consequently need) in these scenarios throws up not just migration headaches (moving from one cloud to another), but the (arguably) more painful idea of what Veritas has labelled bi-directional cloud migration.

Bi-directional curious?

Bi-directional cloud migration? Yes, bi-directional cloud migration i.e. data, applications and smaller computing components moving from one cloud (could be the same type, could be different) to another.
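
To make the term concrete, here is a hedged sketch of bi-directional object movement between two public clouds using each provider’s standard SDK. This illustrates the concept only; it is not Veritas’s tooling, and the bucket names are placeholders.

```python
# A hedged sketch of bi-directional object movement between AWS S3
# and Google Cloud Storage using each provider's standard SDK. This
# illustrates the concept only; it is not Veritas's tooling, and the
# bucket names are placeholders.
import boto3
from google.cloud import storage

s3 = boto3.client("s3")
gcs = storage.Client()

def s3_to_gcs(key: str):
    """One direction: copy an object from S3 into GCS."""
    body = s3.get_object(Bucket="my-s3-bucket", Key=key)["Body"].read()
    gcs.bucket("my-gcs-bucket").blob(key).upload_from_string(body)

def gcs_to_s3(key: str):
    """...and the reverse direction, back again."""
    body = gcs.bucket("my-gcs-bucket").blob(key).download_as_bytes()
    s3.put_object(Bucket="my-s3-bucket", Key=key, Body=body)
```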

What Veritas has worked on (and now released) are technologies to handle bi-directional cloud migration for critical workloads and snapshot-based data protection optimised for multi-cloud environments.

These latest technologies come in part as a result of recent alliance and partnership announcements with a variety of public cloud providers including Google, IBM and Microsoft.

“Today, organisations looking to fast track cloud adoption in the multi-cloud world face compounding challenges, like a lack of data visibility, complicated migration paths that also frequently result in cloud lock-in, and a mistaken perception that because data is in the cloud, it does not need to be protected. Also, as organisations globally become more reliant on digital information, with data spread across multiple clouds, ensuring governance across the board is becoming increasingly complex,” said the company, in a press statement.

The technical details 

Veritas Information Map and S3 Connector — the Information Map S3 Connector provides a real-time picture and interactive view of unstructured data assets residing in S3-enabled cloud storage repositories.
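
As a minimal illustration of that visibility step (emphatically not the Information Map itself, and with a placeholder bucket name), one can inventory an S3 bucket and tally the unstructured data it holds:

```python
# Not the Information Map itself: a minimal illustration of the
# 'visibility' step, inventorying an S3 bucket to see how much
# unstructured data it holds. The bucket name is a placeholder.
import boto3

s3 = boto3.client("s3")
paginator = s3.get_paginator("list_objects_v2")

total_bytes, total_objects = 0, 0
for page in paginator.paginate(Bucket="my-s3-bucket"):
    for obj in page.get("Contents", []):
        total_objects += 1
        total_bytes += obj["Size"]

print(f"{total_objects} objects, {total_bytes / 1e9:.2f} GB of unstructured data")
```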

According to Veritas, “Visibility is the critical first step to making informed data retention, migration and deletion decisions. Data visibility also helps to ensure compliance with data regulations around the world, including the GDPR in May 2018.”

The Veritas 2017 GDPR Report suggests that only one in 10 enterprises polled globally believe GDPR compliance of data in the cloud is their organisation’s responsibility. This of course is a false assumption. Under GDPR, organisations are just as responsible for adhering to the same data privacy mandates for data stored in the cloud—including any public cloud—as well as data stored on-premises.

“Organisations moving to a multi-cloud model find that it is increasingly difficult to visualise, protect and manage their cloud-based data across different environments,” said Jason Tooley, VP for Northern Europe at Veritas. “This challenge impacts decision-making and creates unnecessary risk, especially as the legislative landscape grows in complexity and severity. The new Veritas solutions focused on data visualisation, workload protection and cloud migration address customer concerns as they continue down the path of digital transformation and ensure compliance as GDPR comes into effect next year.”

Other news here includes notes on Veritas CloudMobility, a product engineered to migrate complex workloads from an on-premises datacentre to the cloud with a single click.

The automated solution gives organisations the flexibility to cost-effectively migrate workloads to the cloud of their choice. If business goals change, the workloads can be migrated back on-premises.


June 7, 2017  2:21 PM

What to expect from Nutanix .NEXT 2017

Adrian Bridgwater

Darling of the hyperconverged enterprise cloud space Nutanix holds its annual .NEXT conference, exhibition, symposium, expo (Ed – we get it, it’s a ‘do’) in the American capital of Washington DC later this month… but what can we expect?

The event itself is staged from June 28 – 30, 2017 in Washington D.C. at the Gaylord National Resort and Convention Center.

Building the future of the enterprise cloud

No conference today is staged without a snappy/cheesy tagline and Nutanix has not failed to deliver on that level — we will be ‘building the future of the enterprise cloud’ at .NEXT.

Last year’s statistics tally up just over 2,500 attendees — and in addition we saw 67 breakout sessions with over 120 speakers.

This year’s full agenda is presented here and is (arguably) highly likely to surpass those figures.

This year the Computer Weekly Developer Network also notes that Nutanix has added speakers from Google, Dropbox, Harvard Business School and the Indiana Pacers.

Indiana Pacers who? Well yes… that’s a team dedicated to a sport reputedly popular in the new world colonies known as basketball. As with a lot of modern sports teams, we will no doubt be hearing how the gentlemen involved with this organisation have been fuelled by the use of data analytics and cloud computing advancements in association with Nutanix.

Other speaker highlights include:

  • Diane Greene, senior vice president, Google Cloud;
  • Kirk Skaugen, executive vice president and president of Lenovo’s datacentre group;
  • Deepak Malhotra, Eli Goldston professor of business administration at Harvard Business School;
  • Adriana Gascoigne, CEO and Founder of Girls in Tech;
  • John Bates, author of Thingalytics;
  • Former NASA principal investigator and CTO of Houston Mechatronics, Nicolaus Radford;
  • Not Impossible Labs founder and CEO Mick Ebeling;
  • Nasdaq President and CEO Adena Friedman;
  • SAP CEO Bill McDermott.

Women in tech

According to Nutanix, “With our new CIO Wendy M. Pfeiffer (previously from GoPro) onboard, we continue to support diversity at Nutanix. The prominent female speakers at .NEXT DC include: Diane Greene, Senior Vice President, Google Cloud, Adriana Gascoigne, CEO and Founder of Girls in Tech and Nasdaq President and CEO Adena Friedman.”

IoT: How to increase profits and avoid disasters

Let’s pick one session from the many (after all, why not?) and choose someone who we have featured before on this column — Dr. John Bates, CEO of Plat.ONE and Author of Thingalytics.

As the IoT changes the world and each device, from tractors to refrigerators to ships, is digitised and connected to the cloud model of service-based computing, Nutanix says this presents an opportunity for businesses to profit from the digital data those devices create.

However, for many organisations there is trepidation around how to get started in IoT – particularly with regard to finding a profitable business model that makes the investment worthwhile, as well as avoiding expensive disasters.

Bates will discuss and explore new business models that have been proven to increase profits, as well as techniques to minimise the risk of disasters.

What to expect

A good event to come?

Yes for sure, Nutanix is deeply technical to the extent that it can normally tell a story without necessarily using the words ‘digital transformation’. Certainly we would hope to find the company explaining how the mechanics of the cloud model are really impacting the shape and form of applications (in terms of behaviour and connection and requirements for memory, processing and data throughput and so on) in this modern age.

Tell us how cloud works – on the inside – please Nutanix. We’re listening.


June 7, 2017  8:57 AM

TIBCO puts more AI into Spotfire data discovery

Adrian Bridgwater

TIBCO is in the middle of a raft of product updates and extensions.

The data integration firm’s TIBCO Spotfire Data Catalog is described as a data connectivity and data management product with a ‘composite join capability’ that automatically finds and relates structured and unstructured data.

Smart data unification

This is, if you will, something that we could call ‘smart data unification’.

The Spotfire Data Catalog aims to connect multiple relational and big data sources.

“Spotfire Data Catalog makes navigating your corporate data as easy as searching the web for content,” said Brad Hopper, vice president, product strategy, TIBCO. “With machine learning under the hood and the ability to drag-and-drop your way to a custom Spotfire data mart, our data catalog product helps customers break down data silos and increase the value of their data assets.”

NOTE: A data mart is a repository of data that is designed to serve a particular community of knowledge workers.

How does it work?

Spotfire Data Catalog continuously indexes databases and data lakes, applying text-mining algorithms to discover relationships among tables and documents, as well as concepts and sentiment.

These elements can be discovered in business user searches, or may be presented as recommendations. The user can then pick and choose what to add to what becomes a virtual data mart and then share a connection to this virtual data mart for analysis with Spotfire.
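
As a hedged, minimal analogue of that text-mining step, one can compare table and document descriptions with TF-IDF and score how closely they relate. This is an illustration of the technique, not TIBCO’s implementation.

```python
# A hedged, minimal analogue of the text-mining step: compare table
# and document descriptions with TF-IDF and score how closely they
# relate. An illustration of the technique, not TIBCO's implementation.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

items = {
    "sales_orders":    "customer order lines with product, quantity and revenue",
    "crm_accounts":    "customer account records with region and revenue band",
    "support_tickets": "free-text customer complaints and sentiment notes",
}

names = list(items)
vectors = TfidfVectorizer().fit_transform(items.values())
similarity = cosine_similarity(vectors)

# Recommend the most closely related pairs to the business user
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        print(f"{names[i]} <-> {names[j]}: {similarity[i, j]:.2f}")
```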


June 7, 2017  5:02 AM

What to expect from TIBCO Now Berlin 2017

Adrian Bridgwater

Digital transformation blah blah blah, there’s usually more to TIBCO (originally The Information Bus Company) than the usual industry padding and rhetoric, so what could its Now 2017 conference and exhibition hold in store?

The firm positions itself as an integration specialist (the clue is in the name) as well as a purist in the areas of API management and analytics.

A somewhat sweaty, humid Berlin then plays host to TIBCO this week at the Estrel Hotel and Conference Center.

TIBCO CMO Thomas Been thinks it’s all about “going beyond” what might have previously been considered to be traditional Business Intelligence (BI) tools.

“TIBCO focuses its efforts on interconnecting data and augmenting the results. This conference is a testament to the industry demand for platforms to fuel digital initiatives. The variety of product announcements that will be unveiled at the event are TIBCO’s response to that need,” said Been.

Attendees (you can call them ‘thought leaders’, if you absolutely must) at this event span a range of industries from transportation to finance.

Conference highlights will include keynotes from TIBCO executives such as Murray Rode, chief executive officer, Matt Quinn, executive vice president and chief technology officer, and Thomas Been, chief marketing officer.

Additional keynote presentations feature customers including Matt Harris, head of IT, Mercedes-AMG Petronas Motorsport; Wim Liet and Hans Tonissen, IT leaders, Nederlandse Spoorwegen; and Erwin Vezin, vice president, enterprise architecture, AccorHotels.

This event has sister conferences staged in San Diego and Singapore.


June 1, 2017  1:31 PM

Automation everywhere, Redgate automatic SQL code formatting

Adrian Bridgwater

SQL (Structured Query Language) is a standardised programming language used for managing relational databases and performing various operations on the data in them. So then, SQL databases – and the SQL code used to manage them – have become central to business operations.

All is well in SQL land so far then.

Yes, but there are multiple styles for writing SQL code — and this means it can be difficult for companies to ensure consistency, particularly in collaborative DevOps environments.

Could the answer lie in automation? Specifically… automation to provide formatting.

Formatting finesse

Redgate Software’s SQL Prompt introduces a new automatic SQL code formatting option for this exact purpose.

Developers write in their own style, click a button in SQL Prompt and it automatically reformats their work to match the chosen company style. This means developers can code in their own way, while the company gets the benefits of standardisation and speed when it comes to managing and updating SQL code.

Developers can copy and paste code into an online coding engine on the Redgate website and try out the feature for themselves.
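
SQL Prompt itself is proprietary, but the underlying idea is easy to demonstrate with the open-source sqlparse library: take SQL written in any personal style and reformat it to one shared ‘company style’. A minimal sketch:

```python
# SQL Prompt is proprietary; as a minimal open-source analogue, the
# sqlparse library shows the same idea: reformat SQL written in any
# personal style into one shared 'company style'.
import sqlparse

messy = "select id,name from customers where region='EMEA' order by name"

clean = sqlparse.format(
    messy,
    reindent=True,         # one clause per line, indented
    keyword_case="upper",  # the 'company style' choice
)
print(clean)
# Prints something like:
# SELECT id,
#        name
# FROM customers
# WHERE region='EMEA'
# ORDER BY name
```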

IntelliSense-style code completion

According to Redgate, “SQL Prompt is already the industry standard tool for writing, formatting and refactoring SQL Code in both SQL Server Management Studio (SSMS) and Visual Studio (VS). Using IntelliSense-style code completion, it strips away the repetition of coding, allows users to share frequently used snippets of SQL, and dramatically reduces the time required to write SQL.”

The new formatting engine in SQL Prompt v8 lets users create, share and save multiple code formatting styles, switch between them and try out new styles created by others in either SSMS or VS.

 “This may not sound important but it’s a huge step,” says Grant Fritchey, Microsoft Data Platform MVP. “As companies move towards a DevOps style of working, we’re seeing the responsibility for developing databases using SQL code spread across teams and between teams. Now, rather than having the code written in a myriad of different ways, it can all be formatted in one style in one click. That makes sharing it, rewriting it, and developing it further incredibly simple.”

Opinions vary wildly on capitalisation, how objects are referenced, spacing and aligning, text coloring, and even commas and semi-colons. This in turn leads to database developers and Database Administrators (DBAs) having many different coding rules and styles. As a direct consequence, developers often have to work with code that – for them – is difficult to read and understand, which slows work down and can introduce errors into the code.

Jamie Wallis, Redgate product marketing manager, sums up by saying that, while developers can still argue about the best way of writing SQL, with the new tools they can now format it in one click to their preferred style, work on it… and then return it to the original style if needed.

Wallis nicely concludes that the arguments will probably continue, but the delays of working with SQL won’t.


May 31, 2017  8:16 AM

BlueData: developer data science is a team sport

Adrian Bridgwater

Big data analytics and infrastructure deployment company BlueData has attempted to explain why developers and data scientists working on the front line of big data implementations need to consider themselves as team players.

VP of marketing Jason Schroedl spoke to the Computer Weekly Developer Blog to explain that the data science community has to play a team game to the extent that it actually adopts its own kind of DevOps model.

#1 job in the USA

The claim is that this new role of ‘data scientist’ is the #1 job in the USA for the second year in a row. 

But who are these people?

Schroedl says they are highly skilled at developing advanced analytical models and prototypes and often come from a core background in software engineering, as developers with an appreciation for higher-level system architecture needs.

But there’s a problem with data science

The issue is that the siloed efforts and custom-crafted prototypes of individual data scientists can be difficult to scale, reproduce and share across multiple users. 

“What works for an ad-hoc model in development may not necessarily work in production; what works as a one-off prototype on a laptop might not work as a consistent and repeatable process in a distributed computing environment,” said Schroedl.

How do we fix this issue? Here’s your team sport twist.

The suggestion from BlueData’s Schroedl is that there now needs to be a coming together of multiple data scientists, data engineers, data analysts and developers who have different skillsets and different knowledge of specialised tools.

“What’s needed is an approach that brings the agility, automation, and collaboration of DevOps to these data science and engineering teams. They need to operationalise the data science lifecycle in a streamlined and repeatable way. They require an Agile and lean process that enables them to iterate quickly and fail fast. They need the ability to easily share data, models and code in a secure distributed environment. Finally, they need the flexibility to use their own preferred tools and try out new technologies in the rapidly changing field of data science,” said Schroedl.
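
As a minimal sketch of that repeatable, shareable idea (our illustration using scikit-learn, not BlueData’s actual stack), the whole modelling workflow can live in a single pipeline object that any team member can reload and re-run, rather than an ad-hoc laptop script.

```python
# A minimal sketch of the repeatable, shareable idea using
# scikit-learn (our choice of illustration, not BlueData's stack):
# the whole workflow lives in one pipeline object that colleagues
# can reload and re-run, rather than an ad-hoc laptop script.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
import joblib

X, y = make_classification(n_samples=200, random_state=0)

pipeline = Pipeline([
    ("scale", StandardScaler()),      # preprocessing travels with the model
    ("model", LogisticRegression()),
])
pipeline.fit(X, y)

joblib.dump(pipeline, "churn_model.joblib")   # share the exact artefact
reloaded = joblib.load("churn_model.joblib")  # a colleague reproduces it
print(reloaded.score(X, y))
```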

Team sport DevOps in a big data-as-a-service world? That’s bdass.

 


May 23, 2017  5:11 PM

What is data warehouse automation?

Adrian Bridgwater

WhereScape is a data warehouse automation firm.

But what does that mean?

WhereScape is the company name. Data (in its plural form) and datum (in singular) we will take as read. A data warehouse is a federated repository for all the data that an enterprise’s various business systems collect and that repository may be physical or logical.

So that still doesn’t tell us what a data warehouse automation firm does.

In general terms, WhereScape exists to provide software/data management tools that automate designing, building and operating enterprise data warehouses — that’s the automation factor.

To quote the Data Warehouse Institute (TDWI) from WhereScape’s own pages, “Data warehouse automation is much more than simply automating the development process. It encompasses all of the core processes of data warehousing including design, development, testing, deployment, operations, impact analysis, and change management.”

Warehouse, or Vault?

Not content with logically naming its product WhereScape Data Warehouse Express™ (a product which does not exist), the firm has cleverly employed the use of a marketing department to call its latest product WhereScape Data Vault Express™.

Ah, but there’s a reason for that, i.e. Vault Express (Ed – no, it’s not a Drive-Thru service) uses the Data Vault 2.0 approach to Business Intelligence (BI) data modelling.

User Mike Magalsky holds the role of enterprise data architect at memory & storage firm Micron. Magalsky explains that WhereScape allows his team to prototype data jobs and use its visualisation power.

“If a picture is worth a thousand words, a prototype is worth a million,” said Magalsky.

Parallel power

In terms of function, Data Vault 2.0 uses parallel database processing for large data sets. There are also up-to-date documentation goodies, plus native code generation and change management capabilities.
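
One reason Data Vault 2.0 can exploit parallelism is its use of hash-based keys: each loader derives a hub key by hashing the business key rather than queueing on a database sequence, so loads can run independently. A minimal sketch of that key derivation:

```python
# Data Vault 2.0 derives hub keys by hashing the business key rather
# than drawing on a database sequence, so every load job can compute
# the same key independently and in parallel. A minimal sketch:
import hashlib

def hub_hash_key(*business_key_parts: str) -> str:
    """Deterministic hash key from a (possibly composite) business key."""
    normalised = "||".join(part.strip().upper() for part in business_key_parts)
    return hashlib.md5(normalised.encode("utf-8")).hexdigest()

# Any loader, on any node, derives the same key for the same customer
print(hub_hash_key("ACME-0042"))
print(hub_hash_key("acme-0042 "))  # identical key after normalisation
```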

“IT teams continuously wrestle with delivering solutions at a pace fast enough to answer the intelligence needs of the business today, while ensuring their implemented infrastructure will also position them to quickly respond as business evolves in the future,” said Mark Budzinski, CEO of WhereScape.

WhereScape Data Vault Express includes Data Vault 2.0-enabled versions of:

  • WhereScape 3D – a design tool for quickly designing and reality-testing Data Vault and analytics projects;
  • WhereScape RED – an integrated development environment for rapidly building, deploying, managing and updating Data Vault.


