SOA Talk

October 9, 2015  5:16 PM

The ugly side of container technology

Fred Churchville
Application containerization, Container

If you’re like me, you’ve probably heard a lot of hype surrounding application containers being used in the data centers of today’s organizations. What I’ve found is that there are some important issues and pain points app managers need to be aware of if they are serious about leveraging this technology. Here are a few issues that I’ve identified through a little research:

Application sprawl

One of the issues is something that those who have worked with virtual machines (VMs) will recognize: sprawl. According to tech analyst Chris Riley, the “easily accessible” nature of containers can potentially create an “ecosystem of unmanaged containers.”

A developer can pull a container image from a public library and provision an instance on their local machine very quickly. They can then make changes to that instance and publish it again across multiple other host machines, or even spin up multiple new instances on the same machine.

If this ability is granted to a large group of developers, modification and provisioning of containers can have a “viral effect.” What happens when administrators discover instances of containers they never knew existed – and can’t tell what they are even supposed to be used for?


Security

According to tech writer Steven J. Vaughan-Nichols in an article examining the security of container services like Docker, it’s unclear just how secure containers really are. Furthermore, he says, there seems to be a lot of disagreement about how to actually keep them secure.

Vaughan-Nichols points out that running containers with Docker requires providing root privileges to the Docker daemon, opening up worrisome vulnerabilities in the event that a hacker is able to access your containers – or that “trusted” app managers get up to some untrustworthy activities.

Vaughan-Nichols also points out that the software you are running within your container could be problematic, too. Many companies are simply picking software up from container repositories, but can you verify the software’s validity? Docker does include features that verify the integrity of all Official Docker Repos, but the system is still being perfected – in fact, it will often provide a warning about suspicious software but may not necessarily prevent the software from running.
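The integrity problem Vaughan-Nichols describes comes down to content verification. As a rough illustration of the core idea – not Docker’s actual implementation, which uses signed manifests over content-addressed image digests – here is what “verify before you run” looks like in Python: pin a SHA-256 digest from a trusted source and refuse anything that doesn’t match.

```python
import hashlib

def verify_artifact(data: bytes, pinned_digest: str) -> bool:
    """Compare an artifact's SHA-256 against a digest pinned from a
    trusted source. Docker's content trust builds on the same idea,
    layering signatures on top of content-addressed digests."""
    return hashlib.sha256(data).hexdigest() == pinned_digest

# Illustrative only: the "layer" and its digest are made up inline.
blob = b"pretend-this-is-an-image-layer"
pinned = hashlib.sha256(blob).hexdigest()

print(verify_artifact(blob, pinned))         # matching content passes
print(verify_artifact(b"tampered", pinned))  # anything else is rejected
```

The hard part in practice is the first step – obtaining a digest you actually trust – which is exactly the gap the Docker tooling was still closing at the time.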

As Vaughan-Nichols puts it, you can’t just pick up a container from a place like GitHub and expect everything to be a-ok.

Application integrity

Containers may pose a frustrating issue when it comes to managing legacy applications, as well. Those critical business applications built years before your container adoption may suddenly fail to operate properly once they are containerized.

According to tech writer Brien Posey in an article weighing the pros and cons of containers, many legacy applications were designed to be connected to root systems. By placing these apps in a container, you potentially break an essential connection to those root processes, causing the app – and any apps connected to it – to malfunction.

There are steps that you can take to properly migrate legacy apps into containers. Posey points out that some providers are designed for it, but simply dropping a legacy app into a container can provoke some pretty pesky performance and security issues.

Resource management

Finally, resource management is another big issue to pay attention to. Posey points out in his article that if one application consumes a huge amount of resources, whether on purpose or by accident, that excessive consumption is likely to have a negative impact on all the other applications bundled in the container with it, causing a sort of “domino effect” of underperforming applications.

This issue can be mitigated by running each container on its own virtual machine. However, this fix also has the potential to create VM sprawl – creating even more costs and resource management issues.
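Container runtimes enforce such caps through kernel control groups. As a loose analogy only – not how Docker actually does it – the flavor of the mechanism can be sketched at the single-process level with POSIX resource limits:

```python
import resource

# Cap this process's address space at 512 MiB -- loosely analogous to a
# container memory limit, though real container limits come from cgroups.
LIMIT = 512 * 1024 * 1024
_, hard = resource.getrlimit(resource.RLIMIT_AS)
resource.setrlimit(resource.RLIMIT_AS, (LIMIT, hard))

try:
    hog = bytearray(2 * LIMIT)  # an app trying to grab far too much memory
    status = "allocation succeeded"
except MemoryError:
    status = "allocation refused: limit held"

print(status)
```

With a cap in place, a runaway application fails on its own rather than starving its neighbors; the open question for container deployments is who sets the caps and how they are tuned per workload.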

So what?

I don’t mean to sit here and bash containers or say they are a useless technology – far from it. Containers are a promising technology, and there are clear benefits for organizations that leverage them properly, intelligently and safely. But it is important to be aware of the dangers before diving in head first.

Be on the lookout for upcoming articles on sites in the TechTarget network that delve deeper into the issues surrounding containers and what you can do to fully enjoy the benefits of this technology.

October 1, 2015  9:52 PM

How do you hire the right developers – and make them stay?

Jan Stafford
Application development, Microservices, Software developer

The creative side of software development attracted Shalom Keynan to the profession. Now director of application development for Boston Heart Diagnostics, he is still fired up about improving patient care through software creation. To keep the ideas and projects flowing, however, he faces the challenge of hiring and training developers to handle mobile, cloud, microservices and other new technologies.

“Every morning, I feel very excited to get to work because I know I will discover new ways to build better software,” Keynan said. “That’s what motivates me, and I think my excitement and openness to new ideas motivate my development team.”

A day in Keynan’s work life largely consists of engagements with application development team members, who aren’t just developers. The team includes marketing, clinical researchers, scientists, developers, software QA and, most importantly, users of the applications in a project.

While Keynan uses some outside consultants for design work, most development is done internally. The team builds and manages a wide range of health care applications, such as apps that generate personalized reports for specific physicians and patients and physician and patient web portals.

Over the past decade, he and his team have had to increase their mobile development skills to meet the demand for smartphone, tablet and custom mobile device applications. Getting into mobile started a domino effect. “Creating applications for mobile has taken us deeper into the cloud and recently into building microservices,” said Keynan. In most cases, he’s worked on training his existing team on the skills needed for these new technologies.

Skills shortage? What to do?

When recruiting developers, Keynan has not seen a shortage of developers, but too few developers with the skills he needs. In some cases, he’s taken another approach to hiring. “I looked for the qualities we need in a developer, instead,” he said. These qualities include the following:

• Can the developer learn quickly?
• Does her resume show initiative and motivation?
• Does he engage in conversation and sharing ideas easily?

If these qualities are present, a lack of specific advanced skills may not matter. “If you ask me to choose between the developer who knows everything and a developer who learns quickly, I would choose the second one,” he said. “Technologies evolve quickly, and that’s why that quality is so important.”

Advice for job seekers

Keynan advises job-hunting developers to cultivate an understanding of how to relate business objectives and user needs to app requirements. Understand the business needs. Listen carefully and understand who the consumer is. To deliver what is needed, an insightful approach is as important as technology knowledge.

In too many projects, Keynan said, the user sees the end result and says: “You built what we told you to, but that’s not what we wanted.”

Retaining developers

When it comes to retaining employees, Keynan thinks managers should empower them by sharing responsibilities. “No manager can master everything,” he said. Work in tandem with other developers, not in a separated, top-down way. Also, make sure that team members share their knowledge with each other. “Being exposed to others’ techniques makes people excited about work.” Keep in mind that boredom and developer turnover go hand in hand.

Most importantly, Keynan said, team managers can retain developers by showing them how their work makes a difference in people’s lives. “That feeling that I’m helping people is what gets me to work each morning,” he said. “It’s an important motivator.”

What techniques do you employ to keep developers motivated and on board? Let us know with your comments.

October 1, 2015  3:14 PM

Keeping the lights on (and off) with IoT-enabled APIs

Fred Churchville
API, API development, API management, Application Programming Interface, iot

In an effort to reduce utility costs, some businesses are leveraging the power of APIs and IoT, the Internet of Things, to control smart devices, such as digitized light bulbs.

Recently, I talked with an IoT expert, Phani Pandrangi, about the levels of technologies and services needed to remotely control lighting, as well as the role of APIs in the process. Pandrangi, chief product officer at Kii, explains how a solution such as this is engineered and discusses how APIs are exposed and managed during the process.

The typical IoT application development team consists of firmware, service-level and API designers and developers. “A company might not have all of these skill sets in-house, because they never needed them until now,” said Pandrangi. So, enterprise architects and CIOs charged with IoT projects have to analyze what can be built with in-house staff versus what requires the support of an IoT platform vendor’s technical team. For example, Kii’s technical team recently worked with Yankon Ltd., an LED lighting manufacturer, on the aforementioned light bulb project.

To enable remote control of a bulb, technologies are needed at the bulb, cloud services, application and other levels, Pandrangi said. For example, at the device level there are ‘device agents’ that run in RAM inside the firmware of the device – in this case, a bulb. Another layer consists of back-end services, which do many things, including managing device data and providing analytics reports. Then there are application, architecture and infrastructure layers, some of which exist before the IoT project is started.

Management of smart devices requires remote connectivity and interoperability. “The user is turning the bulb on, off, changing brightness, etc.,” Pandrangi said. “That comes back to the back-end services level.” Then, at the application level, details of the app’s functionality are exposed through an API. In the light bulb case, remote management can turn the lighting on and off, adjust brightness and so on.

Next, beyond simply being able to do remote control, back-end services must include business intelligence functionality that can, for example, show patterns of usage such as average brightness levels.

Typically, back-end services are integrated and tailored to a specific architecture through APIs. “They are exposed as APIs for apps, Web apps, etc.,” Pandrangi said. “And then, when an API call is made on the cloud side, what happens, in non-technical terms, is a command that says, ‘Turn on the bulb in Phani’s living room.'” The cloud service – which contains all the information relevant to the lighting in Phani’s home – automatically sends an MQTT protocol message telling the bulb what to do.
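To make that flow concrete, here is a minimal Python sketch of the message the cloud side might publish. The topic scheme and JSON field names are invented for illustration – Kii’s actual API will differ – but the shape (an MQTT topic plus a small JSON command payload) matches the flow Pandrangi describes.

```python
import json

def bulb_command(home_id, room, action, brightness=None):
    """Build a hypothetical MQTT topic and JSON payload for a smart-bulb
    command -- the machine-readable form of 'turn on the bulb in
    Phani's living room'."""
    topic = "homes/{}/rooms/{}/bulb/set".format(home_id, room)
    payload = {"action": action}
    if brightness is not None:
        payload["brightness"] = brightness  # percent, 0-100
    return topic, json.dumps(payload)

topic, message = bulb_command("phani-home", "living-room", "on", brightness=80)
print(topic)
print(message)
```

In a real deployment, a broker delivers this message over MQTT and the device agent in the bulb’s firmware subscribes to its own topic and acts on the payload.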

The APIs device manufacturers use for IoT device development are very solution-dependent, so generic APIs won’t do, said Pandrangi. The device manufacturers customize the APIs and expose them to the device application through an IoT platform like Kii’s, which contains back-end services like user management, data management, device management, analytics and so on (as well as APIs).

Written in conjunction with Jan Stafford

September 25, 2015  5:49 PM

Will big data make kids smarter?

Fred Churchville
Big Data, Big Data analytics, Data Analytics

From Big Data Innovation Summit 2015, Boston

I recently wrote a blog post about how big data is being leveraged in the public sector. But that is not where the use of big data analytics for the public good ends – today’s educators are teaming up with data scientists to determine how these analytics can be used to create tools to help students get more out of their education.

Unfortunately, at the moment, the field of education is “almost a data-free zone,” according to Henry Kelley, former chief scientist at the Office of Energy Policy and Systems Analysis (EPSA), who says the space is plagued by small sample sizes, flawed methods and a lack of testing methods that generate needed data.

But big data is making an entrance nonetheless. In one example, providers of massive open online courses (MOOCs) are conducting massive analytics on student performance, creating what Stanford University calls a valuable data cauldron.

Likewise, the Khan Academy, a non-profit, Web-based educational organization created in 2006, used data analytics in the form of A/B testing to determine which aspects of its curriculum were leading to higher learning results – and which ones actually created poor learning results. By looking at the data collected, it was able to determine that providing a “sneak peek” of certain programming courses actually discouraged students from moving on to the lecture, resulting in much lower engagement – and learning outcomes – compared to the group that was not offered a preview.
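The statistics behind reading such an A/B comparison are straightforward. The sketch below uses made-up engagement numbers (not Khan Academy’s data) and a standard two-proportion z-test to decide whether the “no preview” group really engaged more:

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test, the standard way to judge an A/B result.
    Returns the z statistic and a two-sided p-value."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)           # pooled rate
    se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))  # pooled standard error
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal tail
    return z, p_value

# Hypothetical engagement: 540/1000 without a preview vs 470/1000 with one
z, p = two_proportion_z(540, 1000, 470, 1000)
print(round(z, 2), round(p, 4))
```

A small p-value says the gap between the two groups is unlikely to be chance, which is the kind of evidence that justifies dropping the preview.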

In a final example, the renowned educational publisher McGraw Hill has been leveraging big data analytics in order to produce Connect Insights, a program that allows both students and teachers to track classroom performance as a function of time in an AWS-based application. Using a MongoDB database, they are able to store students’ grades, submission times, upcoming assignments and more in the form of JSON data and present that data in a simplified, easy-to-use format for the user. By providing this continuous feedback to students, the creators of this application believe it will help both students and teachers discover where students may be experiencing “gaps” in their performance and ultimately determine how to improve overall learning outcomes.
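A record in such a system might look like the following JSON document – the field names here are assumptions for illustration, not McGraw Hill’s actual schema – along with the kind of simple “gap” check a dashboard could run over it:

```python
import json

# A hypothetical per-student document of the kind described: grades,
# submission times and upcoming assignments stored as JSON.
record = {
    "student_id": "s-1024",
    "course": "BIO-101",
    "grades": [
        {"assignment": "hw1", "score": 92},
        {"assignment": "hw2", "score": 61},
    ],
    "submitted_at": {"hw1": "2015-09-10T21:04:00Z", "hw2": "2015-09-18T23:59:00Z"},
    "upcoming": ["hw3"],
}

def performance_gaps(doc, threshold=70):
    """Flag assignments below a score threshold -- the kind of 'gap'
    a dashboard would surface to students and teachers."""
    return [g["assignment"] for g in doc["grades"] if g["score"] < threshold]

print(performance_gaps(record))
print(json.dumps(record["upcoming"]))
```

Because MongoDB stores documents in this JSON-like shape natively, the application can query and aggregate such records per student, per assignment or per class without a fixed relational schema.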

As exciting as the prospect is, however, there are still a number of hurdles that need to be overcome before big data can truly enable the next generation of learning. According to Kelley, these data-based programs run into familiar obstacles such as interoperability issues, formatting problems and challenges around using metadata. He also notes that the sheer complexity of education in the US and worldwide multiplies these challenges enormously – especially when trying to track the performance of students involved in complex tasks.

But there is one issue in particular that proves to be a major stumbling block for the proponents of big data in education: the privacy of the students themselves. Those familiar with the space may recall inBloom, a $100 million program sponsored by the Bill and Melinda Gates Foundation that aimed to create a nationwide database to which educators could upload data related to student performance in order to better track how certain schools and districts are performing (formerly known as the Shared Learning Collaborative). That program was shut down in 2014 after concerns over the sharing of student data led parents to believe that the privacy of their children was at risk.

Lawmakers are keeping a watchful eye on those attempting to use big data analytics in the field of education as well. In 2015, 46 states introduced a total of 182 bills regarding student privacy and 15 states passed a total of 28 data privacy laws. This is on top of the 110 bills introduced and 24 student privacy laws passed in 2014 – a whopping amount of legislation.

Hopefully educators, parents and lawmakers can finally come to an accord as to how big data can be safely leveraged in schools to improve the quality of education in the U.S. and beyond – until then, testing of these capabilities will be left to institutions such as privately held universities. But that’s certainly not stopping data scientists from trying to create the tools they believe will usher in a new generation of learning – and perhaps produce the smartest generation of students yet.

What impact do you foresee big data having on education? Let us know with your comments.


September 18, 2015  2:31 PM

Developers say: “Give us more documentation!”

Fred Churchville


Remember when you were young, and your parents demanded that you leave a note if you left the house? Well, now today’s developers and your parents have more in common than ever.

What do developers want more than anything? According to the Developer Insights Report released by the Application Developers Alliance, it’s documentation. The report, conducted in conjunction with IDC, points out that one of the top reasons development projects fail is changing or poorly documented requirements.

It’s not only a reason projects fail – documentation, overall, seems to be a pain in the neck for many developers. Working in a large company often requires working on or with other developers’ code, a task that is frustrating in itself even when said code is well documented and explained. It’s cited as one of the top ten developer frustrations, and some developers even say that they spend more time maintaining poorly documented code than actually writing new code.

And it’s not just documentation from higher-ups or other developers that frustrates programmers – sometimes API management vendors don’t document things well either. In a review of Forrester’s 2014 report on the top API management platforms, customers said they were disappointed with the documentation provided by even some of the most veteran vendors.

People are listening to these demands, thankfully. Chris Haddad of WSO2, another API management platform vendor reviewed by Forrester, says that one of the things his company prides itself on is solving the “poor documentation barrier.”

“You can go to our API store, which is very similar to a Google Play store or an Apple iTunes App Store, and you can view information about the API…read the documentation,” says Haddad. “We have a lot of ideas on the roadmap and that we are incorporating into the API management platform, such as better notification features so that you can let the community know that a new version is available or that new documentation has been released.”

Vendors such as Alpha Software are stepping up the documentation game as well. At the 2015 Alpha DevCon conference, co-founder Selwyn Rabins announced – to the sound of thunderous applause, no less – that they are actively looking to solve the “documentation scatter” problem.

“The documentation is pretty good,” says Rabins. “It’s just not always clear where to find it.”

To solve this, Rabins and his team are working on enabling easy (read: fast) searching of existing documentation topics and adding features that make it simpler to edit and contribute documentation, ensuring that information stays up to date.

“Alpha is very large,” says Rabins. “Finding documentation is going to be a big boost to productivity.”

In terms of documentation within organizations themselves – well, that is on clients and developers, who must simply heed the calls of their associates. Hopefully the rise of open source development can help, but in the meantime, it seems, one of the best things developers and clients can do for their peers is, as your parents would say, “leave a note!”

Do you experience documentation issues within your organization? Let us know what they are – and what you plan to do about it.

September 16, 2015  1:08 PM

How do you run a country using big data?

Fred Churchville
Big Data, Big Data analytics, Data Analytics, Data governance

From Big Data Innovation 2015, Boston

How do you lower the cost of sequencing a human genome? Big data. How do you accurately predict the movement and severity of deadly weather patterns like Hurricane Sandy? Big data. How do you create heat maps to identify high-risk areas and plan for large-scale emergency operations? Big data.

See a pattern developing? Big data analytics are being leveraged in all sorts of ways within the public sector in order to tackle “big picture” problems that go way beyond traditional CRM applications, like tracking how often a customer buys batteries at WalMart.

Timothy Pearsons, head scientist for the U.S. Government Accountability Office, is one of the most excited when it comes to leveraging big data to, in his words, “make good government better government.”

These efforts have certainly paid off, as the numbers show. For example, the USDA was able to successfully leverage big data analytics in order to prevent the payment of about $2.5 billion in fraudulent insurance payments. In another famed example, New York’s former mayor Michael Bloomberg was given the unofficial title of “director of analytics” when he set up a full-force “geek squad” to revolutionize the city’s ability to crack down on illegal housing conversions.

But using big data in the public sector does not come without its challenges. For a start, warns Pearsons, many data sets are far too poorly organized to be truly usable. And when the data is unreliable, the results can be disastrous, as with Google’s grossly exaggerated estimation of flu outbreaks in 2013.

And, of course, privacy is another issue. Even data tracking programs conducted with the best intentions can unintentionally result in violating people’s privacy, as was showcased in a 2013 Cambridge University study which showed that simply tracking the patterns of people’s “likes” on Facebook can easily be used to determine people’s private political views, drug use, marital status, sexuality, race and more.

The problem of working with poorly organized data may simply be a matter of waiting for the correct data to become available as data management systems improve. Acts such as the DATA Act of 2014 and the GPRA Modernization Act were put in place to ensure cooperation in data mining efforts across government agencies.

The privacy issue is a trickier one to tackle. As it stands, the Privacy Act of 1974 has been the go-to standard for protecting citizens’ sensitive information, but does that act – along with the Fair Information Practice Principles (FIPPs) it ushered in – have enough reach to protect people in the era of big data, or does it need to be revised to keep up?

It’s pretty clear that big data use in government is still somewhat in its infancy – and even Pearsons admits that key decision makers still need to get up to speed with focus on big data management. But it’s getting there – one set of data at a time.

What impact do you expect big data to have on the way government works?

September 15, 2015  6:19 PM

How big data could help prevent food shortages

Fred Churchville
Big Data, Big Data analytics, big data applications, Data Analytics

From Big Data Innovation 2015, Boston

In 2014, Randy Dowdy, a farmer from Georgia, set the highest yield ever in the National Corn Growers Association National Corn Yield Contest with 503 bushels per acre. What was the secret behind his record breaking yield? According to Erik Andrejko, head of Data Science at the Climate Corporation, it is a result of data science – using big data to support decision making processes that are crucial to production results.

And, believe it or not, big data could help prevent a potential famine in the future. According to current growing and consumption rates, agriculture experts expect a 60% shortfall in crop production by 2050. But by using technology that helps analyze and model big data, Andrejko believes farmers can prevent this from happening.

The use of technology in agriculture to increase crop yields is not new – farmers have been leveraging self-driving tractors for years. The use of big data analytics in farming is still in its early stages, but exponential decreases in storage and compute costs, increases in mobile connectivity, and advances in IoT technology are opening the door for big data to make a big impact.

So how exactly can applying data science impact farming efforts – or “farm to fork,” as Andrejko calls it? It starts with the installation of ubiquitous sensors in agricultural fields – 100 billion over the next four years. These sensors can detect key information, such as soil moisture, that will help ensure better care for hundreds of acres of crops. If these sensors can help farmers better regulate the application of nutrients like nitrogen – on which farmers spend almost $2.5 billion a year – it could revolutionize the business.
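As a toy example of turning sensor readings into a decision, the sketch below averages soil-moisture readings per plot and flags where nitrogen application should wait. The threshold, plot names and decision rule are all invented for illustration; real agronomic models are far more involved.

```python
def nitrogen_advice(readings, moisture_floor=0.22):
    """Toy decision rule: recommend holding nitrogen application on plots
    whose average soil-moisture reading falls below a floor. Readings are
    volumetric water content fractions from hypothetical field sensors."""
    out = {}
    for plot, values in readings.items():
        avg = sum(values) / len(values)
        out[plot] = "apply" if avg >= moisture_floor else "hold"
    return out

# Hypothetical readings from two instrumented plots
sensors = {"plot-a": [0.25, 0.27, 0.24], "plot-b": [0.15, 0.18, 0.16]}
print(nitrogen_advice(sensors))
```

Even this crude rule shows the pattern Andrejko describes: raw sensor streams only become valuable once a model maps them onto a concrete action in the field.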

But there are still plenty of challenges in applying data science to agricultural needs – namely, a significant amount of missing and sparse data. And even once you have the data, how do you determine which pieces actually indicate a potential impact on yield? In other words, how do you turn most of the data into useful information?

Andrejko says the answer to this is creating structural, usable models that explicitly lay out how each piece of data fits into the yield equation. By applying these models – and teaching farmers how to use them appropriately – he hopes they can use them to make higher yields a reality.

September 11, 2015  9:18 PM

Red Sox aim to make big plays with big data

Fred Churchville
Big Data, Big Data analytics, Data Analytics

From Big Data Innovation 2015, Boston

Tim Zue of the Red Sox organization explains how big data is being put to use at Fenway Park at the 2015 Big Data Innovation Summit, Boston

How is big data impacting America’s favorite pastime?

Big data first came to fame within the baseball world with the Moneyball story – the Oakland Athletics’ use of high-level data analytics to find the best players available. However, big data use in baseball is going beyond the desks of talent scouts. In the last few years, for example, it has been used by many organizations to adjust ticket pricing to more accurately reflect real-world demand and buying patterns, theoretically resulting in fairer prices for everyone. The Milwaukee Brewers are even using big data analytics in order to figure out how to turn sporadic attendees into season ticket holders.

So how are the Boston Red Sox leveraging big data? While they are still very much in the early stages, Red Sox vice president of business operations Tim Zue has a lot to say when it comes to how they are making big data analytics an integral part of their business model.

“We’re still in the second or third inning of a nine-inning game,” admitted Zue during his talk at the 2015 Big Data Innovation Summit in Boston. But while they might not have been quick on the uptake, the Red Sox are now using big data to move away from their traditionally static ticket pricing model to a more variable one that adjusts prices based on how popular (or unpopular) certain games are during the year. By examining trends surrounding attendance, they were able to create a data model that indicates where tickets for certain games may be priced too low or too high.
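At its simplest, a variable pricing model scales a base price by a demand signal derived from that attendance data. The sketch below is a made-up toy version – the bounds, scaling and numbers are assumptions, not the Red Sox model:

```python
def variable_price(base_price, demand_index):
    """Scale a ticket's base price by a demand index, where 1.0 means
    average demand. The 0.7/0.3 coefficients are arbitrary choices that
    cap both the discount and the premium."""
    factor = 0.7 + 0.3 * demand_index
    return round(base_price * factor, 2)

print(variable_price(52.00, 1.6))  # a popular weekend matchup
print(variable_price(52.00, 0.5))  # a midweek game with a weak draw
```

The modeling effort then goes into the demand index itself: turning historical attendance, opponent, weather and timing into a single number per game.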

And their use of big data does not stop at the ticket sale. Zue went on to talk about how the organization is working on aggregating data around fan behavior once they are at Fenway Park in order to improve the visitor experience. So, for example, when are fans buying the most beer? Are they looking for a particular kind? Do they have a favorite concession stand they go to?

Using this data, Zue says they will be able to create a sort of “concessions heat map” that creates a clear picture of fan behavior at games. This, he hopes, will lead to the development of a “fan dashboard” application that keeps track of common purchases and even uses geolocation beacons to deliver important information or alert fans when they are walking by a stand that offers one of their favorite items – an idea that is already being put into practice in some ballparks.

As Zue said, the Red Sox are still in the very early stages of turning their big data dreams into reality. But it will be interesting to see how they use this technology to turn America’s oldest ballpark into a place that offers a high-tech fan experience.

September 1, 2015  3:46 PM

Five API management products worth noticing

Fred Churchville
API, API management, Apigent Technologies, Application Programming Interface

With API management becoming increasingly important in the enterprise application development space, it’s not a bad idea to consider your options when it comes to a vendor. Forrester recently named their top API management vendors, and here are five that I think are interesting players in the field:

Layer 7 (CA Technologies)

A known veteran of the API scene, the Layer 7 management suite has had a head start on the competition and appears to still hold the lead after being acquired by CA Technologies. Notable aspects include its strong integration features, big data analytics capabilities, and top-tier support for mobile development – which can’t hurt given the wide interest among enterprises in a mobile-first approach. And it provides all this while still maintaining a mid-range price.


Akana

Formerly known as SOA Software, Akana provides strong API lifecycle management features and the ability to federate support for multiple API providers. In fact, it is making a name for itself by moving into the healthcare space, with goals of increasing electronic and multi-device access to health records while complying with governmental regulations and lowering operational costs.

Forrester states that Akana is “consistently strong across all of our primary evaluation criteria.”


WSO2

WSO2 is an interesting provider to look at, given that it is one of the few open source API management platforms – and the only one reviewed by Forrester. The open source model allows customers to readily extend and customize management features to suit their business needs. While WSO2 seemingly lends itself to a “do-it-yourself” approach, Chris Haddad, vice president of technology evangelism at WSO2, insists that the company strives to provide copious amounts of documentation to assist developers.

Forrester notes that WSO2 has “among the best features for API design and creation, as well as strong transformation and integration” – while remaining one of the lowest-cost products.


Mashery (Tibco)

The API manager Mashery has switched owners once again, this time traded from Intel to infrastructure provider Tibco. The API management offering, however, should continue to provide strong support for software as a service and data analytics, and it will be interesting to see what Tibco can do with the product line.


MuleSoft

MuleSoft continues to display “strengths in API design and integration,” according to the Forrester report, offering strong testing capabilities and resources for collaborative API building. It is worth noting that MuleSoft has been named a leader across three different Gartner Magic Quadrants: On-Premises Application Integration Suites, Application Services Governance and Enterprise iPaaS. The recently released Anypoint Platform for Mobile aims to help developers quickly build apps that need integration with back-end apps, data and services.

MuleSoft founder Ross Mason aims to turn the company into the next Cisco of application networking.

Which API management platforms have caught your attention? Let us know with your comments.

August 31, 2015  2:57 PM

Really, how mature is mobile development?

Fred Churchville
Application development, Mobile Application Development, mobile application management, Mobile development

“There are two types of companies out there: those who use mobile appropriately and those who are still scratching their heads.” – Matthew David, Senior Manager of Mobile Management at Kimberly-Clark

The release of a survey by Red Hat regarding the hiring practices of today’s organizations when it comes to mobile development brought to mind the following question: While it’s hard to find a company that doesn’t have plans to ramp up mobile capabilities these days, how “mature” are today’s companies when it comes to mobile?

“That’s a tough question,” said Red Hat’s vice president of Mobility, Cathal McGloin. “I would say a minority are at that point of truly understanding mobile as being a change for their business. The vast majority of them are still trying to find a way and dabbling with internal apps…and then you have the laggards at the end.”

But what is it exactly that holds an enterprise back when it comes to mobility? Is it the technology? The expertise?  Or something else?

“I think the challenge is that they don’t know where to start,” said Matthew David, senior manager at Kimberly Clark’s Center of Mobile Excellence. “You can do mobile solutions that improve the efficiency of your staff, there are mobile solutions for your sales organization, you have solutions that can help improve communication with customers…and each of those are different types of challenges.”

Another thing that David notes is a critical misunderstanding of the pure nature of mobile application development.

“One of the challenges I see – specifically among older CIOs – is that they think developing for mobile is the same as developing for Windows or for the Web. And it’s not: it’s a completely different environment. The technologies are the same, but the paradigms are completely different.”

David and McGloin are definitely in agreement in regards to the maturity of the mobile development space, with David placing most organizations at about a “one or a two out of five” when it comes to mobile development. But both McGloin and David are hopeful when it comes to where organizations will take mobile in the future.

“We’re seeing a growing understanding that success in mobile is not just about building client-sides, and hoping that somehow they’ll integrate with your organization,” says McGloin. “It’s recognizing that the back end integration is an important thing, and finding the skills to do that.”

So when can we expect mobile development to actually “mature” for most organizations? Probably in about three to five years, according to David. In fact, a big portion of organizations still consider email on smartphones and tablets to be a “mission-critical” aspect of their mobile development – which is sort of laughable considering how many business processes can be improved via strong mobile app development.

And David expects APIs to be a big part of that maturity process. In his opinion, an organization’s ability to cultivate strong API management will separate the strong from the weak, so to speak.

“When you’re looking at designing solutions, they need to be able to work across multiple platforms. And the only way to effectively do that is with a strong, scalable API infrastructure,” he says. “That API – that’s your lifeblood.”

So, in short, there’s certainly no reason to give up on mobile development, but don’t hold your breath waiting for the next big mobile revolution within the enterprise.
