The creative side of software development attracted Shalom Keynan to the profession. Now director of application development for Boston Heart Diagnostics, he is still fired up about improving patient care through software creation. To keep the ideas and projects flowing, however, he faces the challenge of hiring and training developers to handle mobile, cloud, microservices and other new technologies.
“Every morning, I feel very excited to get to work because I know I will discover new ways to build better software,” Keynan said. “That’s what motivates me, and I think my excitement and openness to new ideas motivate my development team.”
A day in Keynan’s work life largely consists of engagements with application development team members, who aren’t just developers. The team includes marketing, clinical researchers, scientists, developers, software QA and, most importantly, users of the applications in a project.
While Keynan uses some outside consultants for design work, most development is done internally. The team builds and manages a wide range of health care applications, such as apps that generate personalized reports for specific physicians and patients and physician and patient web portals.
Over the past decade, he and his team have had to increase their mobile development skills to meet the demand for smartphone, tablet and custom mobile device applications. Getting into mobile started a domino effect. “Creating applications for mobile has taken us deeper into the cloud and recently into building microservices,” said Keynan. In most cases, he’s worked on training his existing team on the skills needed for these new technologies.
Skills shortage? What to do?
When recruiting developers, Keynan has seen not a shortage of talent overall, but a shortage of developers with the skills he needs. In some cases, he’s taken another approach to hiring. “I looked for the qualities we need in a developer, instead,” he said. These qualities include the following:
• Can the developer learn quickly?
• Does the resume show initiative and motivation?
• Does the developer engage in conversation and share ideas easily?
If these qualities are present, a lack of specific advanced skills may not matter. “If you ask me to choose between the developer who knows everything and a developer who learns quickly, I would choose the second one,” he said. “Technologies evolve quickly, and that’s why that quality is so important.”
Advice for job seekers
Keynan advises job-hunting developers to cultivate an understanding of how to relate business objectives and user needs to app requirements. Understand the business needs. Listen carefully and understand who the consumer is. To deliver what is needed, an insightful approach is as important as technology knowledge.
In too many projects, Keynan said, the user sees the end result and says: “You built what we told you to, but that’s not what we wanted.”
When it comes to retaining employees, Keynan thinks managers should empower them by sharing responsibilities. “No manager can master everything,” he said. Work in tandem with other developers, not in a separated, top-down way. Also, make sure that team members share their knowledge with each other. “Being exposed to others’ techniques makes people excited about work.” Keep in mind that boredom and developer turnover go hand in hand.
Most importantly, Keynan said, team managers can retain developers by showing them how their work makes a difference in people’s lives. “That feeling that I’m helping people is what gets me to work each morning,” he said. “It’s an important motivator.”
Recently, I talked with an IoT expert, Phani Pandrangi, about the levels of technologies and services needed to remotely control lighting, as well as the role of APIs in the process. Pandrangi, chief product officer at Kii, explains how a solution such as this is engineered and discusses how APIs are exposed and managed during the process.
The typical IoT application development team consists of firmware, service-level and API designers and developers. “A company might not have all of these skill sets in-house, because they never needed them until now,” said Pandrangi. So, enterprise architects and CIOs charged with IoT projects have to analyze what can be built with in-house staff versus what requires the support of an IoT platform vendor’s technical team. For example, Kii’s technical team recently worked with Yankon Ltd., an LED lighting manufacturer, on the aforementioned light bulb project.
To enable remote control of a bulb, technologies are needed at the bulb, cloud services, application and other levels, Pandrangi said. For example, at the device level there are ‘device agents’ that live in RAM inside the firmware of the device – in this case, a bulb. Another layer consists of back-end services, which do many things, including managing device data and providing analytics reports. Then, there are application, architecture and infrastructure layers, some of which exist before the IoT project is started.
Management of smart devices requires remote connectivity and interoperability. “The user is turning the bulb on, off, changing brightness, etc.,” Pandrangi said. “That comes back to the back-end services level.” And then, at the application level, the details of the app’s functionality are exposed through an API. In the light bulb case, remote management covers turning the light on and off, changing brightness and things like that.
Next, beyond simply being able to do remote control, back-end services must include business intelligence functionality that can, for example, show patterns of usage such as average brightness levels.
Typically, back-end services are integrated and tailored to a specific architecture through APIs. “They are exposed as APIs for apps, Web apps, etc.,” Pandrangi said. “And then, when an API call is made on the cloud side, what happens, in non-technical terms, is a command that says, ‘Turn on the bulb in Phani’s living room.'” Then the cloud service – which contains all the information relevant to the lighting in Phani’s home – automatically sends an MQTT protocol message telling the bulb what to do.
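In rough sketch form, the round trip Pandrangi describes – an API call on the cloud side translated into a message bound for the device – might look like the Python below. The device IDs, topic layout and payload fields are invented for illustration; they are not Kii’s actual API or Yankon’s schema:

```python
import json

# Cloud-side state: which devices belong to which user (illustrative only).
registry = {"phani": {"living_room_bulb": {"id": "bulb-042"}}}

def handle_api_call(user_id, room_device, action, params):
    """Translate an API call like 'turn on the bulb in Phani's living
    room' into a topic and payload a broker could deliver over MQTT."""
    device = registry[user_id][room_device]        # look up the target device
    topic = f"devices/{device['id']}/commands"     # per-device command topic
    payload = json.dumps({"action": action, **params})
    return topic, payload  # a real back end would publish this via MQTT

topic, payload = handle_api_call("phani", "living_room_bulb",
                                 "set_power", {"on": True})
```

The key point is the separation of concerns: the app only makes the API call, while the cloud service owns the mapping from user to device and speaks MQTT to the bulb.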
The APIs device manufacturers use for IoT device development are very solution-dependent, so generic APIs won’t do, said Pandrangi. The device manufacturers customize the APIs and expose them to the device application through an IoT platform like Kii’s, which contains back-end services such as user management, data management, device management and analytics (as well as APIs).
Written in conjunction with Jan Stafford
From Big Data Innovation Summit 2015, Boston
I recently wrote a blog post about how big data is being leveraged in the public sector. But that is not where the use of big data analytics for the public good ends – today’s educators are teaming up with data scientists to determine how these analytics can be used to create tools to help students get more out of their education.
Unfortunately, at the moment, the field of education is “almost a data-free zone,” according to Henry Kelley, former chief scientist at the Energy Policy and Systems Analysis office (EPSA), who said the space is plagued by small sample sizes, flawed methods and a lack of testing methods that generate needed data.
But big data is making an entrance nonetheless. In one example, providers of massive open online courses (MOOCs) are conducting massive analytics on student performance, creating what Stanford University calls a valuable data cauldron.
Likewise, the Khan Academy, a non-profit, Web-based educational organization created in 2006, used data analytics in the form of A/B testing to determine which aspects of its curriculum were leading to higher learning results – and which ones actually created poor learning results. By looking at the data collected, it was able to determine that providing a “sneak peek” of certain programming courses actually discouraged students from moving on to the lecture, resulting in much lower engagement – and learning outcomes – compared to the group that was not offered a preview.
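The mechanics of such an A/B test are simple to sketch. Here is a minimal, made-up version in Python – the outcomes are invented, not Khan Academy’s data:

```python
def engagement_rate(outcomes):
    """Fraction of students who continued on to the lecture (1 = continued)."""
    return sum(outcomes) / len(outcomes)

# Variant A saw a "sneak peek" of the course; variant B did not.
preview    = [1, 0, 0, 0, 1, 0, 0, 0]
no_preview = [1, 1, 0, 1, 1, 0, 1, 1]

rate_a = engagement_rate(preview)     # 0.25
rate_b = engagement_rate(no_preview)  # 0.75
```

A real experiment would also check statistical significance (e.g. with a two-proportion test) before acting on the difference.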
In a final example, the renowned educational publisher McGraw-Hill has been leveraging big data analytics to produce Connect Insights, a program that allows both students and teachers to track classroom performance over time in an AWS-based application. Using a MongoDB database, it is able to store students’ grades, submission times, upcoming assignments and more in the form of JSON data and present that data in a simplified, easy-to-use format for the user. By providing this continuous feedback, the creators of the application believe it will help both students and teachers discover where students may be experiencing “gaps” in their performance and ultimately determine how to improve overall learning outcomes.
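Storing grades as JSON-style documents makes this kind of feedback straightforward to compute. Below is a hypothetical document of the sort described – the field names and values are invented, not McGraw-Hill’s schema:

```python
# One student's record, as it might be stored as a JSON document in MongoDB.
student = {
    "student_id": "s-1001",
    "grades": [
        {"assignment": "hw1", "score": 92, "submitted": "2015-09-10T21:04:00Z"},
        {"assignment": "hw2", "score": 61, "submitted": "2015-09-17T23:58:00Z"},
    ],
    "upcoming": ["hw3"],
}

def performance_gaps(doc, threshold=70):
    """Flag assignments below a threshold -- the kind of 'gap' a
    dashboard could surface for students and teachers."""
    return [g["assignment"] for g in doc["grades"] if g["score"] < threshold]
```

Here `performance_gaps(student)` would flag `hw2`, giving the student and teacher a concrete place to intervene.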
As exciting as the prospect is, however, there are still a number of hurdles that need to be overcome before big data can truly enable the next generation of learning. According to Kelley, these data-based programs run into familiar obstacles such as interoperability issues, formatting problems and challenges around using metadata. He also notes that the sheer complexity of education in the US and worldwide multiplies these challenges enormously – especially when trying to track the performance of students involved in complex tasks.
But there is one issue in particular that proves to be a major stumbling block for the proponents of big data in education: the privacy of the students themselves. Those familiar with the space may recall the establishment of inBloom (formerly known as the Shared Learning Collaborative), a $100 million program sponsored by the Bill and Melinda Gates Foundation that aimed to create a nationwide database to which educators could upload data related to student performance in order to better track how certain schools and districts are performing. That program was shut down in 2014 after concerns over the sharing of student data led parents to believe that the privacy of their children was at risk.
Lawmakers are keeping a watchful eye on those attempting to use big data analytics in the field of education as well. In 2015, 46 states introduced a total of 182 bills regarding student privacy and 15 states passed a total of 28 data privacy laws. This is on top of the 110 bills introduced and 24 student privacy laws passed in 2014 – a whopping amount of legislation.
Hopefully educators, parents and lawmakers can finally come to an accord as to how big data can be safely leveraged in schools to improve the quality of education in the U.S. and beyond – until then, testing of these capabilities will be left to institutions such as privately held universities. But that’s certainly not stopping data scientists from trying to create the tools they believe will usher in a new generation of learning – and perhaps produce the smartest generation of students yet.
Remember when you were young, and your parents demanded that you leave a note if you left the house? Well, now today’s developers and your parents have more in common than ever.
What do developers want more than anything? According to the Developer Insights Report released by the Application Developers Alliance, it’s documentation. The report, conducted in conjunction with IDC, points out that one of the top reasons development projects fail is changing or poorly documented requirements.
It’s not only a reason projects fail – documentation seems, overall, to be a pain in the neck for many developers. Working in a large company often requires working on or with other developers’ code, a task that is frustrating in itself even if said code is well documented and explained. It’s cited as one of the top ten developer frustrations, and some developers even say that they spend more time maintaining poorly documented code than actually writing new code.
And it’s not just documentation from higher-ups or other developers that frustrates programmers – sometimes API management vendors don’t document things well either. In a review of Forrester’s 2014 report on the top API management platforms, customers said that they were disappointed with the documentation provided by even some of the most veteran vendors.
People are listening to these demands, thankfully. Chris Haddad, vice president of Technology Evangelism at WSO2 – another API management vendor reviewed by Forrester – says that one of the things his company prides itself on is solving the “poor documentation barrier.”
“You can go to our API store, which is very similar to a Google Play store or an Apple iTunes App Store, and you can view information about the API…read the documentation,” says Haddad. “We have a lot of ideas on the roadmap and that we are incorporating into the API management platform, such as better notification features so that you can let the community know that a new version is available or that new documentation has been released.”
Vendors such as Alpha Software are stepping up the documentation game as well. At the 2015 Alpha DevCon conference, co-founder Selwyn Rabins announced – to the sound of thunderous applause, no less – that they are actively looking to solve the “documentation scatter” problem.
“The documentation is pretty good,” says Rabins. “It’s just not always clear where to find it.”
To solve this, Selwyn and his team are working on enabling easy (read: fast) searching of existing documentation topics and adding features to make it simpler to edit and contribute documentation in order to ensure that information is up to date.
“Alpha is very large,” says Rabins. “Finding documentation is going to be a big boost to productivity.”
In terms of documentation within organizations themselves – well, that is on clients and developers, who will have to heed the calls of their associates. Hopefully the rise of open source development can help, but in the meantime, it seems, one of the best things that developers and clients can do for their peers is, as your parents would say, “leave a note!”
From Big Data Innovation 2015, Boston
How do you lower the cost of sequencing a human genome? Big data. How do you accurately predict the movement and severity of deadly weather patterns like Hurricane Sandy? Big data. How do you create heat maps to I.D. high-risk areas and plan for large-scale emergency operations? Big data.
See a pattern developing? Big data analytics are being leveraged in all sorts of ways within the public sector in order to tackle “big picture” problems that go way beyond traditional CRM applications, like tracking how often a customer buys batteries at WalMart.
Timothy Persons, chief scientist for the U.S. Government Accountability Office, is among the most excited when it comes to leveraging big data to, in his words, “make good government better government.”
These efforts have certainly paid off, as the numbers show. For example, the USDA was able to successfully leverage big data analytics in order to prevent the payment of about $2.5 billion in fraudulent insurance payments. In another famed example, New York’s former mayor Michael Bloomberg was given the unofficial title of “director of analytics” when he set up a full-force “geek squad” to revolutionize the city’s ability to crack down on illegal housing conversions.
But using big data in the public sector does not come without its challenges. For a start, warns Persons, many data sets are too poorly organized to be truly usable. And when the data is unreliable, the results can be disastrous, as with Google’s grossly exaggerated estimation of flu outbreaks in 2013.
And, of course, privacy is another issue. Even data tracking programs conducted with the best intentions can unintentionally result in violating people’s privacy, as was showcased in a 2013 Cambridge University study which showed that simply tracking the patterns of people’s “likes” on Facebook can easily be used to determine people’s private political views, drug use, marital status, sexuality, race and more.
The problem of working with poorly organized data may simply be a matter of waiting for the correct data to become available as data management systems improve. Acts such as the DATA Act of 2014 and the GPRA Modernization Act were put in place to ensure cooperation in data mining efforts across government agencies.
The privacy issue is a trickier one to tackle. As it stands, the Privacy Act of 1974 has been the go-to standard for protecting citizens’ sensitive information, but does that act – along with the Fair Information Practice Principles (FIPPs) it ushered in – have enough reach to protect people in the era of big data, or does it need to be revised to keep up?
It’s pretty clear that big data use in government is still somewhat in its infancy – and even Persons admits that key decision-makers still need to get up to speed on big data management. But it’s getting there – one set of data at a time.
From Big Data Innovation 2015, Boston
In 2014, Randy Dowdy, a farmer from Georgia, set the highest yield ever in the National Corn Growers Association National Corn Yield Contest with 503 bushels per acre. What was the secret behind his record-breaking yield? According to Erik Andrejko, head of data science at the Climate Corporation, it was the result of data science – using big data to support decision-making processes that are crucial to production results.
And, believe it or not, big data could help prevent a potential famine in the future. According to current growing and consumption rates, agriculture experts expect a 60% shortfall in crop production by 2050. But by using technology that helps analyze and model big data, Andrejko believes farmers can prevent this from happening.
The use of technology in agriculture to increase crop yields is not new – farmers have been leveraging self-driving tractors for years. The use of big data analytics in farming is still in its early stages, but exponential decreases in storage and compute costs, increases in mobile connectivity, and advances in IoT technology are opening the door for big data to make a big impact.
So how exactly can applying data science impact farming efforts – or “farm to fork,” as Andrejko calls it? It starts with the installation of ubiquitous sensors in agricultural fields – 100 billion over the next four years. These sensors can detect key information, such as soil moisture, that will help ensure better care for hundreds of acres of crops. If these sensors can help farmers better regulate the application of nutrients like nitrogen – on which farmers spend almost $2.5 billion a year – it could revolutionize the business.
But there are still plenty of challenges in applying data science to agricultural needs: namely, a significant amount of missing and sparse data. And even once you have the data, how do you determine which pieces actually indicate a potential impact on yield? In other words, how do you turn all that data into useful information?
Andrejko says the answer to this is creating structural, usable models that explicitly lay out how each piece of data fits into the yield equation. By applying these models – and teaching farmers how to use them appropriately – he hopes they can use them to make higher yields a reality.
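To make the idea concrete, here is a deliberately oversimplified “structural” yield model in Python, in which each input’s contribution to the yield is laid out explicitly. The coefficients are invented for illustration and bear no relation to the Climate Corporation’s actual models:

```python
def predicted_yield(soil_moisture, nitrogen_lbs, base=150.0,
                    moisture_coef=400.0, nitrogen_coef=0.5):
    """Toy structural model: predicted yield in bushels per acre as an
    explicit sum of a baseline plus each input's contribution."""
    return base + moisture_coef * soil_moisture + nitrogen_coef * nitrogen_lbs

# e.g. 0.25 volumetric soil moisture and 180 lbs of nitrogen per acre
estimate = predicted_yield(0.25, 180)  # 340.0 bushels/acre
```

Real models would be fit from field data and cope with the missing and sparse inputs described above, but the structure – every term’s role made explicit – is the point.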
From Big Data Innovation 2015, Boston
How is big data impacting America’s favorite pastime?
Big data first came to fame within the baseball world with the Moneyball story – the Oakland Athletics’ use of high-level data analytics to find the best players available. However, big data use in baseball is going beyond the desks of talent scouts. In the last few years, for example, it has been used by many organizations to adjust ticket pricing to more accurately reflect real-world demand and buying patterns, theoretically resulting in fairer prices for everyone. The Milwaukee Brewers are even using big data analytics in order to figure out how to turn sporadic attendees into season ticket holders.
So how are the Boston Red Sox leveraging big data? While they are still very much in the early stages, Red Sox vice president of business operations Tim Zue has a lot to say when it comes to how they are making big data analytics an integral part of their business model.
“We’re still in the 2nd or 3rd inning of a nine inning game,” admitted Zue during his talk at the 2015 Big Data Innovation Summit in Boston. But while they might not have been so quick on the uptake, the Red Sox are now using big data to move away from their traditionally static ticket pricing model to a more variable one that adjusts prices based on how popular (or unpopular) certain games are during the year. By examining trends surrounding attendance, they were able to successfully create a data model that indicated where tickets for certain games may be priced too low or too high.
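A toy version of such a variable pricing model can be sketched in a few lines of Python. The demand signal and sensitivity here are invented; they stand in for the attendance trends the Red Sox actually model:

```python
def variable_price(base_price, expected_attendance, capacity, sensitivity=0.5):
    """Nudge a static base price up for high-demand games and down for
    low-demand ones, based on expected attendance."""
    demand = expected_attendance / capacity        # roughly 0..1
    return round(base_price * (1 + sensitivity * (demand - 0.5)), 2)

# A sellout rivalry game vs. a half-full midweek game, same $50 base ticket
hot = variable_price(50, 37000, 37000)    # 62.5
slow = variable_price(50, 18500, 37000)   # 50.0
```

The output of a model like this would then be sanity-checked against the historical attendance data the team collected before any price actually changed.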
And their use of big data does not stop at the ticket sale. Zue went on to talk about how the organization is working on aggregating data around fan behavior once they are at Fenway Park in order to improve the visitor experience. So, for example, when are fans buying the most beer? Are they looking for a particular kind? Do they have a favorite concession stand they go to?
Using this data, Zue says they will be able to create a sort of “concessions heat map” that creates a clear picture of fan behavior at games. This, he hopes, will lead to the development of a “fan dashboard” application that keeps track of common purchases and even uses geolocation beacons to deliver important information or alert fans when they are walking by a stand that offers one of their favorite items – an idea that is already being put into practice in some ballparks.
As Zue said, the Red Sox are still in the very early stages of turning their big data dreams into reality. But it will be interesting to see how they use this technology to turn America’s oldest ballpark into a place that offers a high-tech fan experience.
With API management becoming increasingly important in the enterprise application development space, it’s not a bad idea to consider your options when it comes to a vendor. Forrester recently named their top API management vendors, and here are five that I think are interesting players in the field:
Layer 7 (CA Technologies)
A known veteran of the API scene, the Layer 7 management suite has had a head start on the competition and appears to still hold the lead after being acquired by CA Technologies. Notable aspects include its strong integration features, big data analytics capabilities and top-tier support for mobile development – which can’t hurt, given the wide interest among enterprises in a mobile-first approach. And it provides all this while still maintaining a mid-range price.
Akana
Formerly known as SOA Software, Akana provides strong API lifecycle management features and the ability to federate support across multiple API providers. In fact, it is making a name for itself by moving into the healthcare space, with goals of increasing electronic and multi-device access to health records while complying with governmental regulations and lowering operational costs.
Forrester states that Akana is “consistently strong across all of our primary evaluation criteria.”
WSO2
WSO2 is an interesting provider to look at, given that it is one of the few open source API management platforms – and the only one reviewed by Forrester. This allows customers to readily extend and customize its management features to suit their business needs. While WSO2 seemingly lends itself to a “do-it-yourself” approach, Chris Haddad, vice president of Technology Evangelism at WSO2, insists that the company strives to provide copious amounts of documentation to assist developers.
Forrester notes that WSO2 has “among the best features for API design and creation, as well as strong transformation and integration” – while remaining one of the lowest-cost products.
Mashery (Tibco)
The API manager Mashery has switched owners once again, passing from Intel to infrastructure provider Tibco. The API management offering, however, should continue to provide strong support for software as a service and data analytics, and it will be interesting to see what Tibco does with the product line.
MuleSoft
MuleSoft continues to display “strengths in API design and integration,” according to the Forrester report, offering strong testing capabilities and resources for collaborative API building. It is worth noting that MuleSoft has been named a leader across three different Gartner Magic Quadrants: On-Premises Application Integration Suites, Application Services Governance and Enterprise iPaaS. The recently released Anypoint Platform for Mobile aims to help developers quickly build apps that need integration with back-end apps, data and services.
MuleSoft founder Ross Mason aims to turn MuleSoft into the next Cisco of application networking.
“There are two types of companies out there: those who use mobile appropriately and those who are still scratching their heads.” – Matthew David, Senior Manager of Mobile Management at Kimberly-Clark
The release of a survey by Red Hat regarding the hiring practices of today’s organizations when it comes to mobile development brought to mind the following question: While it’s hard to find a company that doesn’t have plans to ramp up mobile capabilities these days, how “mature” are today’s companies when it comes to mobile?
“That’s a tough question,” said Red Hat’s vice president of Mobility, Cathal McGloin. “I would say a minority are at that point of truly understanding mobile as being a change for their business. The vast majority of them are still trying to find a way and dabbling with internal apps…and then you have the laggards at the end.”
But what is it exactly that holds an enterprise back when it comes to mobility? Is it the technology? The expertise? Or something else?
“I think the challenge is that they don’t know where to start,” said Matthew David, senior manager at Kimberly Clark’s Center of Mobile Excellence. “You can do mobile solutions that improve the efficiency of your staff, there are mobile solutions for your sales organization, you have solutions that can help improve communication with customers…and each of those are different types of challenges.”
Another thing that David notes is a critical misunderstanding of the pure nature of mobile application development.
“One of the challenges I see – specifically among older CIOs – is that they think developing for mobile is the same as developing for Windows or for the Web. And it’s not: it’s a completely different environment. The technologies are the same, but the paradigms are completely different.”
David and McGloin are definitely in agreement in regards to the maturity of the mobile development space, with David placing most organizations at about a “one or a two out of five” when it comes to mobile development. But both McGloin and David are hopeful when it comes to where organizations will take mobile in the future.
“We’re seeing a growing understanding that success in mobile is not just about building client-sides, and hoping that somehow they’ll integrate with your organization,” says McGloin. “It’s recognizing that the back end integration is an important thing, and finding the skills to do that.”
So when can we expect mobile development to actually “mature” for most organizations? Probably in about three to five years, according to David. In fact, a big portion of organizations still consider email on smartphones and tablets to be a “mission-critical” aspect of their mobile development – which is sort of laughable considering how many business processes can be improved via strong mobile app development.
And David expects APIs to be a big part of that maturity process. In his opinion, an organization’s ability to cultivate strong API management will separate the strong from the weak, so to speak.
“When you’re looking at designing solutions, they need to be able to work across multiple platforms. And the only way to effectively do that is with a strong, scalable API infrastructure,” he says. “That API – that’s your lifeblood.”
So, in short, there’s certainly no reason to give up on mobile development, but don’t hold your breath waiting for the next big mobile revolution within the enterprise.
The idea of microservices appears to be quickly gaining fame among today’s enterprises; however, some IT experts warn that organizations may not be “culturally” prepared to leverage this new paradigm in a way that is beneficial for their business.
“People are asking about it a lot. They’re interested in what it is and how it’s helping companies become successful,” says Christian Posta, Principal Middleware Specialist/Architect at Red Hat and committer to the Apache Software Foundation. “But since it’s much more, at least in my opinion, about how an organization structures itself, it’s going to be difficult for some of these large enterprises that are structured the way that they are to adopt that architecture and really take it to its full extent. I think that’s the big challenge.”
But why is this organizational structure so important when it comes to delivering microservices? And what does it take to establish the “culture” needed to realistically adopt this approach to application development?
Arun Gupta, director of technical marketing and developer advocacy at Red Hat, says that this cultural requirement can be directly linked to Conway’s Law, a technology-based “philosophy” which claims that an organization’s application structure will closely mirror its IT organizational structure.
“If your organizational structure is one UI team, one database team, one middle-tier team, etc., your application code will start showing those layers,” says Gupta. “[These layers] might not work well with each other, and they will have to have a ‘workaround’ to work with the other layers.”
“With microservices,” he adds, “because we are following mostly the domain-driven approach, the idea is to have a cross-functional team.”
So how do you create these types of microservice-friendly teams? The first step, says Posta, is to think small and take what Amazon calls the “two-pizza team” mentality – as in, create smaller, multi-functional teams that are no bigger than what two pizzas can feed.
“You need small teams that are focused on their one service or set of services – and that team is typically made up of both developers and operations-type people, or at least developers who do a lot of operations-type stuff,” says Posta. “And to me, that’s the pinnacle of what people want when they talk about DevOps.”
But not only do you need the right team structure, says Posta – the quality of these team members matters too. Do you have people that are willing to learn? Are they interested in building a better product? Do you have developers that understand the business side of things as well?
This is a challenge for many larger enterprises, Posta believes, where business decision-makers may not see technology as a core part of their business’s ability to innovate. However, he stresses that maintaining this focus is one of the three major qualifications for success with microservices, alongside having the right team structure and creating a culture for rapid innovation.
Gupta stresses that establishing this DevOps culture should absolutely be the first step an organization takes in its microservices journey, even if the microservices effort doesn’t necessarily pan out; establishing strong DevOps practices will still benefit the organization.
“Today it’s microservices…tomorrow it’s going to be picoservices or nanoservices,” says Gupta. “So look at DevOps as the first step, because that’s in general good. Whether you’re doing microservices, monolith, or whatever it is that you’re doing…that organizational alignment is one of the biggest challenges I’m seeing.”
Have you recently made, or do you plan to make, changes in your IT organizational structure in order to prepare for microservices? Tell us about your experiences with your comments.