CW Developer Network


June 12, 2017  2:40 PM

New Relic prescribes data health for DevOps

Adrian Bridgwater

Digital intelligence company New Relic has added extra functions to its eponymously named Digital Intelligence Platform.

The company has brought forward what it calls a Health Map.

High-density APM

This is the firm’s play to offer its own Application Performance Management (APM) software alongside its own (similarly eponymously named) infrastructure products.

The end result offers what has been called a “high-density view” of applications and the infrastructure supporting those applications.

“By standardizing monitoring within a single cloud platform, customers will be able to work better together to pinpoint issues and optimise their dynamic cloud or hybrid environments, in particular those leveraging Amazon Web Services (AWS) products,” said the company, in a press statement.

It is true that, as firms adopt microservice architectures, it becomes more difficult to pinpoint performance issues within the application stack.

New Relic’s new Health Map feature claims to bring together data on application and infrastructure performance into a single prioritised view.

DevOps factor

This unified view is supposed to allow DevOps teams to understand whether the source of a performance issue lies in the application code or in the infrastructure layer.

New Relic Infrastructure has 20 out-of-the-box integrations to AWS products, including new connections to Amazon Kinesis Firehose, Amazon Elasticsearch Service, Amazon Route 53, Amazon EC2 Container Service and Amazon EC2 Container Registry.
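It is worth remembering what such integrations actually consume: the metrics AWS itself exposes through CloudWatch. As a purely illustrative sketch (this is not New Relic code, and the domain name and account ID are hypothetical), pulling an Amazon Elasticsearch Service health metric with Python's boto3 might look like this:

```python
# Illustrative only: the kind of CloudWatch metric data an AWS
# infrastructure integration consumes. Domain name is hypothetical.
from datetime import datetime, timedelta

import boto3

cloudwatch = boto3.client("cloudwatch", region_name="eu-west-1")

response = cloudwatch.get_metric_statistics(
    Namespace="AWS/ES",                  # Amazon Elasticsearch Service metrics
    MetricName="ClusterStatus.red",
    Dimensions=[
        {"Name": "DomainName", "Value": "my-search-domain"},  # hypothetical
        {"Name": "ClientId", "Value": "123456789012"},        # hypothetical account ID
    ],
    StartTime=datetime.utcnow() - timedelta(hours=1),
    EndTime=datetime.utcnow(),
    Period=300,                          # five-minute buckets
    Statistics=["Maximum"],
)

for point in sorted(response["Datapoints"], key=lambda p: p["Timestamp"]):
    print(point["Timestamp"], point["Maximum"])
```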

In addition, AWS Billing and Cost Management features have been introduced.

The wider suggestion here is that developers and system administrators in DevOps teams can standardise the monitoring of customised services alongside dynamic infrastructure instances, within the context of the applications that they support.

June 9, 2017  12:00 PM

Bi-directional curious? Veritas tackles multi-cloud migration maladies

Adrian Bridgwater

Information management company Veritas Technologies is targeting the headaches that the firm suggests might be thrown up by running multi-cloud operations.

Hybrid cloud deployments mixing public and on-premises elements are conceptually sound, but the reality of deploying applications (and the data storage they consequently need) in these scenarios throws up not just migration headaches (moving from one cloud to another), but the arguably more painful idea of what Veritas has labelled bi-directional cloud migration.

Bi-directional curious?

Bi-directional cloud migration? Yes, bi-directional cloud migration i.e. data, applications and smaller computing components moving from one cloud (could be the same type, could be different) to another.

What Veritas has worked on (and now released) are technologies to handle bi-directional cloud migration for critical workloads and snapshot-based data protection optimised for multi-cloud environments.

These latest technologies come in part as a result of recent alliance and partnership announcements with a variety of public cloud providers including Google, IBM and Microsoft.

“Today, organisations looking to fast track cloud adoption in the multi-cloud world face compounding challenges, like a lack of data visibility, complicated migration paths that also frequently result in cloud lock-in, and a mistaken perception that because data is in the cloud, it does not need to be protected. Also, as organisations globally become more reliant on digital information, with data spread across multiple clouds, ensuring governance across the board is becoming increasingly complex,” said the company, in a press statement.

The technical details 

Veritas Information Map and S3 Connector — the Information Map S3 Connector provides a real-time picture and interactive view of unstructured data assets residing in S3-enabled cloud storage repositories.

According to Veritas, “Visibility is the critical first step to making informed data retention, migration and deletion decisions. Data visibility also helps to ensure compliance with data regulations around the world, including the GDPR in May 2018.”
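Information Map is of course a commercial product, but the first rung of 'visibility' can be sketched in a few lines of code: walk an S3 bucket and summarise the unstructured data living there. A minimal boto3 sketch follows (the bucket name is hypothetical, and this is emphatically not the Veritas connector itself):

```python
# Minimal sketch of S3 data "visibility": tally object counts and sizes
# by file extension. Bucket name is hypothetical.
import os
from collections import Counter

import boto3

s3 = boto3.client("s3")
counts, sizes = Counter(), Counter()

paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket="corporate-unstructured-data"):
    for obj in page.get("Contents", []):
        ext = os.path.splitext(obj["Key"])[1].lower() or "(no extension)"
        counts[ext] += 1
        sizes[ext] += obj["Size"]

for ext, n in counts.most_common():
    print(f"{ext}: {n} objects, {sizes[ext] / 1e9:.2f} GB")
```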

The Veritas 2017 GDPR Report suggests that only one in 10 enterprises polled globally believes GDPR compliance of data in the cloud is its organisation's responsibility. The majority view is, of course, a false assumption. Under GDPR, organisations are just as responsible for adhering to data privacy mandates for data stored in the cloud (including any public cloud) as for data stored on-premises.

“Organisations moving to a multi-cloud model find that it is increasingly difficult to visualise, protect and manage their cloud-based data across different environments,” said Jason Tooley, VP for Northern Europe at Veritas. “This challenge impacts decision-making and creates unnecessary risk, especially as the legislative landscape grows in complexity and severity. The new Veritas solutions focused on data visualisation, workload protection and cloud migration address customer concerns as they continue down the path of digital transformation and ensure compliance as GDPR comes into effect next year.”

Other news here includes notes on Veritas CloudMobility, a product engineered to migrate complex workloads from an on-premises datacentre to the cloud with a single click.

The automated solution gives organisations the flexibility to cost-effectively migrate workloads to the cloud of their choice. If business goals change, the workloads can migrate back to on-premises.


June 7, 2017  2:21 PM

What to expect from Nutanix .NEXT 2017

Adrian Bridgwater

Darling of the hyperconverged enterprise cloud space Nutanix holds its annual .NEXT conference, exhibition, symposium, expo (Ed – we get it, it’s a ‘do’) in the American capital of Washington DC later this month… but what can we expect?

The event itself is staged from June 28 – 30, 2017 in Washington D.C. at the Gaylord National Resort and Convention Center.

Building the future of the enterprise cloud

No conference today is staged without a snappy/cheesy tagline and Nutanix has not disappointed on that level — we will be ‘building the future of the enterprise cloud’ at .NEXT.

Last year’s statistics claim a tally of just over 2,500 attendees — and in addition we saw 67 breakout sessions with over 120 speakers.

This year’s full agenda is presented here and is (arguably) highly likely to surpass those figures.

This year the Computer Weekly Developer Network also notes that Nutanix has added speakers from Google, Dropbox, Harvard Business School and the Indiana Pacers.

Indiana Pacers who? Well yes… that’s a team dedicated to a sport reputedly popular in the new world colonies known as basketball. As with a lot of modern sports teams, we will no doubt be hearing how the gentlemen involved with this organisation have been fuelled by the use of data analytics and cloud computing advancements in association with Nutanix.

Other speaker highlights include:

  • Diane Greene, senior vice president, Google Cloud;
  • Kirk Skaugen, executive vice president and president of Lenovo’s datacentre group;
  • Deepak Malhotra, Eli Goldston professor of business administration at Harvard Business School;
  • Adriana Gascoigne, CEO and Founder of Girls in Tech;
  • John Bates, author of Thingalytics;
  • Former NASA principal investigator and CTO of Houston Mechatronics, Nicolaus Radford;
  • Not Impossible Labs founder and CEO Mick Ebeling;
  • Nasdaq President and CEO Adena Friedman;
  • SAP CEO Bill McDermott.

Women in tech

According to Nutanix, “With our new CIO Wendy M. Pfeiffer (previously from GoPro) onboard, we continue to support diversity at Nutanix. The prominent female speakers at .NEXT DC include: Diane Greene, Senior Vice President, Google Cloud, Adriana Gascoigne, CEO and Founder of Girls in Tech and Nasdaq President and CEO Adena Friedman.”

IoT: How to increase profits and avoid disasters

Let’s pick one session from the many (after all, why not?) and choose someone who we have featured before on this column — Dr. John Bates, CEO of Plat.ONE and author of Thingalytics.

As the IoT changes the world and each device, from tractors to refrigerators to ships, is digitised and connected to the cloud model of service-based computing, Nutanix says this presents an opportunity for businesses to profit from the digital data created.

However, for many organisations there is trepidation around how to get started in IoT – particularly with regard to finding a profitable business model that makes the investment worthwhile, as well as avoiding expensive disasters.

Bates will discuss and explore new business models that have been proven to increase profits, as well as techniques to minimise the risk of disasters.

What to expect

A good event to come?

Yes for sure, Nutanix is deeply technical to the extent that it can normally tell a story without necessarily using the words ‘digital transformation’. Certainly we would hope to find the company explaining how the mechanics of the cloud model are really impacting the shape and form of applications (in terms of behaviour and connection and requirements for memory, processing and data throughput and so on) in this modern age.

Tell us how cloud works – on the inside – please Nutanix. We’re listening.


June 7, 2017  8:57 AM

TIBCO puts more AI into Spotfire data discovery

Adrian Bridgwater

TIBCO is in the middle of a raft of product updates and extensions.

The data integration firm’s TIBCO Spotfire Data Catalog is described as a data connectivity and data management product with a ‘composite join capability’ that automatically finds and relates structured and unstructured data.

Smart data unification

This is, if you will, something that we could call ‘smart data unification’.

The Spotfire Data Catalog aims to connect multiple relational and big data sources.

“Spotfire Data Catalog makes navigating your corporate data as easy as searching the web for content,” said Brad Hopper, vice president, product strategy, TIBCO. “With machine learning under the hood and the ability to drag-and-drop your way to a custom Spotfire data mart, our data catalog product helps customers break down data silos and increase the value of their data assets.”

NOTE: A data mart is a repository of data that is designed to serve a particular community of knowledge workers.

How does it work?

Spotfire Data Catalog continuously indexes databases and data lakes, applying text-mining algorithms to discover relationships among tables and documents, as well as concepts and sentiment.

These elements can be discovered in business user searches, or may be presented as recommendations. The user can then pick and choose what to add to what becomes a virtual data mart and then share a connection to this virtual data mart for analysis with Spotfire.
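How might software 'discover relationships among tables' in the first place? One classic technique is value overlap: if most of the values in one column also appear in a column of another table, the pair is a candidate join key. The toy pandas sketch below illustrates the principle (it is not TIBCO's algorithm):

```python
# Toy relationship discovery: flag column pairs whose values overlap enough
# to be candidate join keys. Not TIBCO's algorithm, just the principle.
import pandas as pd

def candidate_joins(left: pd.DataFrame, right: pd.DataFrame, threshold: float = 0.8):
    pairs = []
    for lcol in left.columns:
        lvals = set(left[lcol].dropna())
        if not lvals:
            continue
        for rcol in right.columns:
            overlap = len(lvals & set(right[rcol].dropna())) / len(lvals)
            if overlap >= threshold:
                pairs.append((lcol, rcol, overlap))
    return sorted(pairs, key=lambda p: -p[2])

customers = pd.DataFrame({"customer_id": [1, 2, 3], "name": ["Ann", "Bo", "Cy"]})
orders = pd.DataFrame({"order_id": [10, 11], "cust": [1, 3]})
print(candidate_joins(customers, orders, threshold=0.5))
# [('customer_id', 'cust', 0.666...)]
```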


June 7, 2017  5:02 AM

What to expect from TIBCO Now Berlin 2017

Adrian Bridgwater

Digital transformation blah blah blah, there’s usually more to TIBCO (The Information Bus COmpany) than the usual industry padding and rhetoric, so what could its Now 2017 conference and exhibition hold in store?

The firm positions itself as an integration specialist (the clue is in the name) as well as a purist in the areas of API management and analytics.

A somewhat sweaty, humid Berlin then plays host to TIBCO this week at the Estrel Hotel and Conference Center.

TIBCO CMO Thomas Been thinks it’s all about “going beyond” what might have previously been considered to be traditional Business Intelligence (BI) tools.

“TIBCO focuses its efforts on interconnecting data and augmenting the results. This conference is a testament to the industry demand for platforms to fuel digital initiatives. The variety of product announcements that will be unveiled at the event are TIBCO’s response to that need,” said Been.

Attendees (you can call them ‘thought leaders’, if you absolutely must) at this event span a range of industries from transportation to finance.

Conference highlights will include keynotes from TIBCO executives such as Murray Rode, chief executive officer, Matt Quinn, executive vice president and chief technology officer, and Thomas Been, chief marketing officer.

Additional keynote presentations feature customers including Matt Harris, head of IT, Mercedes-AMG Petronas Motorsport; Wim Liet and Hans Tonissen, IT leaders, Nederlandse Spoorwegen; and Erwin Vezin, vice president, enterprise architecture, AccorHotels.

This event has sister conferences staged in San Diego and Singapore.


June 1, 2017  1:31 PM

Automation everywhere, Redgate automatic SQL code formatting

Adrian Bridgwater

SQL (Structured Query Language) is a standardised programming language used for managing relational databases and performing various operations on the data in them. So then, SQL databases – and the SQL code used to manage them – have become central to business operations.

All is well in SQL land so far then.

Yes but there are multiple styles for writing SQL code — and this means it can be difficult for companies to ensure consistency, particularly in collaborative DevOps environments.

Could the answer lie in automation? Specifically… automation to provide formatting.

Formatting finesse

Redgate Software’s SQL Prompt introduces a new automatic SQL code formatting option for this exact purpose.

Developers write in their own style, click a button in SQL Prompt and it automatically reformats their work to match the chosen company style. This means developers can code in their own way, while the company gets the benefits of standardisation and speed when it comes to managing and updating SQL code.

Developers can copy and paste code into an online coding engine on the Redgate website and try out the feature for themselves.
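SQL Prompt itself lives inside SSMS and Visual Studio, but the principle of one-click reformatting is easy to demonstrate with the open-source Python library sqlparse, used here purely as a stand-in for Redgate's engine:

```python
# One-click-style SQL reformatting, demonstrated with the open-source
# sqlparse library (pip install sqlparse). Not Redgate's engine.
import sqlparse

messy = ("select c.id,c.name from customers c "
         "join orders o on o.customer_id=c.id where o.total>100")

print(sqlparse.format(messy,
                      reindent=True,
                      keyword_case="upper",
                      use_space_around_operators=True))
# Output (roughly):
# SELECT c.id,
#        c.name
# FROM customers c
# JOIN orders o ON o.customer_id = c.id
# WHERE o.total > 100
```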

IntelliSense-style code completion

According to Redgate, “SQL Prompt is already the industry standard tool for writing, formatting and refactoring SQL Code in both SQL Server Management Studio (SSMS) and Visual Studio (VS). Using IntelliSense-style code completion, it strips away the repetition of coding, allows users to share frequently used snippets of SQL, and dramatically reduces the time required to write SQL.”

The new formatting engine in SQL Prompt v8 lets users create, share and save multiple code formatting styles, switch between them and try out new styles created by others in either SSMS or VS.

 “This may not sound important but it’s a huge step,” says Grant Fritchey, Microsoft Data Platform MVP. “As companies move towards a DevOps style of working, we’re seeing the responsibility for developing databases using SQL code spread across teams and between teams. Now, rather than having the code written in a myriad of different ways, it can all be formatted in one style in one click. That makes sharing it, rewriting it, and developing it further incredibly simple.”

Opinions vary wildly on capitalisation, how objects are referenced, spacing and aligning, text coloring, and even commas and semi-colons. This in turn leads to database developers and Database Administrators (DBAs) having many different coding rules and styles. As a direct consequence, developers often have to work with code that – for them – is difficult to read and understand, which slows work down and can introduce errors into the code.

Jamie Wallis, Redgate product marketing manager, sums up by saying that, with the new tools, while developers can still argue about the best way of writing SQL, they can now format it in one click to their preferred style, work on it… and then return it to the original style if needed.

Wallis nicely concludes that the arguments will probably continue, but the delays in working with SQL won’t.


May 31, 2017  8:16 AM

BlueData: developer data science is a team sport

Adrian Bridgwater

Big data analytics and infrastructure deployment company BlueData has attempted to explain why developers and data scientists working on the front line of big data implementations need to consider themselves as team players.

VP of marketing Jason Schroedl spoke to the Computer Weekly Developer Blog to explain that the data science community has to play a team game to the extent that it actually adopts its own kind of DevOps model.

#1 job in the USA

The claim is that this new role of ‘data scientist’ is the #1 job in the USA for the second year in a row. 

But who are these people?

Schroedl says they are highly skilled at developing advanced analytical models and prototypes and often come with a core background in software engineering as a developer with an appreciation for higher-level system architecture needs.

But there’s a problem with data science

The issue is that the siloed efforts and custom-crafted prototypes of individual data scientists can be difficult to scale, reproduce and share across multiple users. 

“What works for an ad-hoc model in development may not necessarily work in production; what works as a one-off prototype on a laptop might not work as a consistent and repeatable process in a distributed computing environment,” said Schroedl.

How do we fix this issue? There’s your team sport twist.

The suggestion from BlueData’s Schroedl is that there now needs to be a coming together of multiple data scientists, data engineers, data analysts and developers that have different skillsets and different knowledge of specialised tools.

“What’s needed is an approach that brings the agility, automation, and collaboration of DevOps to these data science and engineering teams. They need to operationalise the data science lifecycle in a streamlined and repeatable way. They require an Agile and lean process that enables them to iterate quickly and fail fast. They need the ability to easily share data, models and code in a secure distributed environment. Finally, they need the flexibility to use their own preferred tools and try out new technologies in the rapidly changing field of data science,” said Schroedl.

Team sport DevOps in a big data-as-a-service world? That’s bdass.

 


May 23, 2017  5:11 PM

What is data warehouse automation?

Adrian Bridgwater

WhereScape is a data warehouse automation firm.

But what does that mean?

WhereScape is the company name. Data (in its plural form) and datum (in singular) we will take as read. A data warehouse is a federated repository for all the data that an enterprise’s various business systems collect and that repository may be physical or logical.

So that still doesn’t tell us what a data warehouse automation firm does.

In general terms, WhereScape exists to provide software/data management tools for designing, building and operating enterprise data warehouses — that’s the automation factor.

To quote the Data Warehouse Institute (TDWI) from WhereScape’s own pages, “Data warehouse automation is much more than simply automating the development process. It encompasses all of the core processes of data warehousing including design, development, testing, deployment, operations, impact analysis, and change management.”

Warehouse, or Vault?

Not content with logically naming its product WhereScape Data Warehouse Express™ (a product which does not exist), the firm has cleverly employed the use of a marketing department to call its latest product WhereScape Data Vault Express™.

Ah but there’s a reason for that i.e. the Vault Express (Ed – no, it’s not a Drive-Thru service) uses the Data Vault 2.0 system of Business Intelligence (BI).
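For the uninitiated, Data Vault 2.0 models data as hubs (business keys), links (relationships) and satellites (descriptive history), and is notable for keying rows on hashes of business keys rather than sequence numbers. A back-of-an-envelope Python sketch of that idea follows (a generic illustration, not WhereScape-generated code):

```python
# Generic Data Vault 2.0-flavoured sketch: a hub row keyed by a hash of the
# business key, plus a satellite row carrying descriptive attributes.
# Illustrative only; not WhereScape-generated code.
import hashlib
from datetime import datetime, timezone

def hash_key(*business_keys: str) -> str:
    """Hash of normalised, delimited business keys (MD5 is common in DV 2.0)."""
    normalised = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalised.encode("utf-8")).hexdigest()

now = datetime.now(timezone.utc)

hub_customer = {
    "customer_hk": hash_key("CUST-0042"),   # hash key
    "customer_bk": "CUST-0042",             # original business key
    "load_dts": now,
    "record_source": "crm",                 # hypothetical source system
}

sat_customer = {
    "customer_hk": hub_customer["customer_hk"],
    "load_dts": now,
    "name": "Ada Lovelace",
    "segment": "enterprise",
}
```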

WhereScape user Mike Magalsky holds the role of enterprise data architect at memory & storage firm Micron. Magalsky explains that WhereScape allows his team to prototype data jobs and use its visualisation power.

“If a picture is worth a thousand words, a prototype is worth a million,” said Magalsky.

Parallel power

In terms of function, Data Vault 2.0 uses parallel database processing for large data sets. There are also up-to-date documentation goodies, plus native code generation and change management capabilities.

“IT teams continuously wrestle with delivering solutions at a pace fast enough to answer the intelligence needs of the business today, while ensuring their implemented infrastructure will also position them to quickly respond as business evolves in the future,” said Mark Budzinski, CEO of WhereScape.

WhereScape Data Vault Express includes Data Vault 2.0 enabled-versions of:

  • WhereScape 3D – a design tool for quickly designing and reality-testing Data Vault and analytics projects;

  • WhereScape RED – an integrated development environment for rapidly building, deploying, managing and updating Data Vault.


May 22, 2017  9:24 AM

MarkLogic 9 targets bitemporal messy data

Adrian Bridgwater

MarkLogic has hit version 9 of its eponymously named operational and transactional enterprise NoSQL database.

What marks this product out is its data integration capabilities i.e. it is positioned as a database for integrating data from silos.

The firm aims to fire a blow across the bows of what MarkLogic calls “cumbersome relational databases with ETL (Extract, Transform and Load) tools to integrate data from silos”.

Flexible toughness?

What you have then, if this technology proposition works for the intended data stacks and workflow use cases, is the flexibility of NoSQL combined with enterprise-hardened features like certified security and high availability.

Advanced features here include bi-temporal (or bitemporal, if you prefer) data handling, semantics and cloud support.

Bitemporal Modeling is an information modeling technique designed to handle historical data along two different timelines — this makes it possible to rewind the information to “as it actually was” in combination with “as it was recorded” at some point in time.
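The two timelines are conventionally called valid time (when a fact was true in the world) and system, or transaction, time (when the database recorded it). A minimal generic sketch of an 'as of' query over bitemporal records might look like this in Python (illustrative only, not MarkLogic's API):

```python
# Generic bitemporal sketch: each fact carries a valid-time interval and a
# system-time interval; as_of() rewinds along both axes. Not MarkLogic's API.
from dataclasses import dataclass
from datetime import date

@dataclass
class Fact:
    value: str
    valid_from: date    # when it was true in the real world
    valid_to: date
    sys_from: date      # when the database recorded it
    sys_to: date

def as_of(facts, valid_at: date, known_at: date):
    """Facts true at `valid_at`, as the database believed at `known_at`."""
    return [f for f in facts
            if f.valid_from <= valid_at < f.valid_to
            and f.sys_from <= known_at < f.sys_to]

history = [
    Fact("address: London", date(2016, 1, 1), date(2017, 1, 1),
         date(2016, 1, 5), date(9999, 12, 31)),
    Fact("address: Berlin", date(2017, 1, 1), date(9999, 12, 31),
         date(2017, 2, 1), date(9999, 12, 31)),
]

# Where did the customer live on 10 Jan 2017, per what we knew on 15 Jan?
print(as_of(history, date(2017, 1, 10), date(2017, 1, 15)))  # [] -- move not yet recorded
```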

Messin’ with messy data

MarkLogic 9 also includes what are known as ‘Entity Services’ — this function helps manage “messy” ever-changing data sources by allowing developers to define and evolve a data model (and corresponding vocabulary) that “harmonises” real-world entities, such as customers and products, and the relationships between them.

… and for developers?

There is also Optic, a new API that lets developers view their data as documents, graphs or rows… and there are also new monitoring enhancements for non-disruptive operations.

“In order to innovate and drive positive business outcomes from their most critical data assets, companies must be able to move and integrate their data across company silos and multi–hosted environments without compromising data lineage, security or governance,” said Carl Olofson, Research Vice President, International Data Corporation (IDC). 

“Few databases on the market today can offer a full package of enterprise grade data agility, governance and security features to address those requirements. MarkLogic has been able to deliver capabilities in each of these areas with its NoSQL database technology. As more companies look to gain a competitive advantage from their data, NoSQL can be a key digital transformation technology that provides a bridge to the future where companies can blend emerging third platform data with conventional managed data, taking advantage of cheap infrastructure, and elastic compute power to achieve a 360-degree view of the company’s data assets.”

Other product enhancements

  • Ops Director is a foundational new tool that eases management for system administrators across multiple clusters, cloud and on-premises systems, and production, test and development environments.

  • Telemetry is an opt-in feature that enables better and faster support by collecting, encrypting and sending diagnostic system-level information to MarkLogic so it is there immediately when needed.


May 18, 2017  11:44 AM

Why SAP says ‘Run Simple’

Adrian Bridgwater

Obviously, the Computer Weekly Developer Network is not interested in advertising, public relations spin or any of the elements that go to make up the so-called ‘marketing mix’ in any shape or form.

But the SAP ‘Run Simple’ campaign does throw up a discussion point.

Extra cheese please?

‘Run Simple’ is nothing more than a cheesy strapline designed to be emblazoned upon airport advertising hoardings and other corporate collateral, clearly.

Except it isn’t. It does actually have a meaning and purpose.

The runtime simplicity element that SAP is alluding to comes back to the word that we are repeatedly using in the context of contemporary software application development — automation.

Automation matters matter

The automation element here comes back to those core elements of the total codebase that can be automated, where defined models from one use case or another can be applied as best practice templates to another dataset, another database computation or indeed another software application task at a wider level.

This in many senses is why SAP has packaged its latest SAP Leonardo offering as it has.

SAP co-founder and all round lovable eccentric geekmeister Dr Hasso Plattner has called SAP Leonardo a ‘bounding box’ — he borrows the geometric term to describe the way SAP Leonardo surrounds and encapsulates a family of SAP products and tools to present data engineers and developers with a more automated set of functions.

The idea, believe it or not, is that this makes things run more simply.

Industry accelerators

In tangible terms, we are talking about elements such as the recently announced industry accelerators.

“Industry accelerator packages for SAP Leonardo will focus on use cases for specific industries that are a powerful differentiator for SAP. Customers will not have to assemble pieces and parts to solve a business problem. We will use included services that tailor pre-defined software elements for the specific customer implementation. Everything will come at a pre-defined price and our engagement is time-bound, so every customer has an accelerated time-to-value,” said SAP’s Leonardo lead Mala Anand.

In brief then, this is the deeply technical justification behind why SAP uses this ad slogan.

Not everybody is sold on this line of course and industry commentators have called out SAP’s simplicity play as something of a myth, saying that it is not possible to provide massive configuration for multiple markets in the way that the firm claims is possible. We provide this note just for fairness and balance.

NOTE: To other firms. No, we don’t want to deconstruct your advertising slogan on this column unless you have a specific developer runtime relevance to clarify.



