The move to smart devices can’t be stopped. That means new opportunities for developers, especially in transaction processing and payments. But, don’t lose sight of the legacy applications that drive many of those transactions.
By 2018, consumers in mature markets will expand, rather than consolidate, their device portfolios, resulting in the use of more than three personal devices, according to researcher Gartner’s “Predicts 2016” report. In a separate report, Gartner expects that by 2018 a majority of users will, for the first time, turn to a mobile device for all of their online activities. Yes, Gartner says “all.”
Do all of your online activities from a smartphone or tablet, and you’re essentially turning away from desktop technology. For developers it means interface design, data integration, transaction processing, and querying home-automation devices via the Internet rather than just the local in-home Wi-Fi setup.
Two key concerns, according to cloud consultant Judith Hurwitz, are scalability and compliance. Apps on the mobile device, and the corresponding back-end server processing and data serving, must be capable of scaling to peaks that might seem unrealistic, she says. Compliance becomes increasingly important. It might be a portal app for managing a patient-physician relationship or filling prescriptions through an online pharmacy. It could be placing equities orders with an online broker. Eventually, it will be interacting with your car’s diagnostics system.
Without a doubt, the rush to these new technologies is on. Developers are learning new skills and new languages. Analytics is an increasingly big part of dealing with big data. Despite this, we should not lose sight of systems that have been running at corporations for years and even decades. “Old software never dies,” Hurwitz says.
It’s an excellent point. Batch processes, such as monthly statement rendering programs at banks, can go untouched for what seems like an eternity. Updated only to reflect the appearance of printed statements (my bank recently added color and changed typefaces), the underlying logic can go for decades without being touched. The original programmers may have long since retired or died, yet these applications (we used to call them programs) remain vital. And, there may be no financial gain for the business in throwing old, fully functional applications on the scrap heap.
Sure, new technologies to support the Internet of Things are vital. Learning Apache Spark for big data streaming, Hadoop for distributed data processing, or Docker for containerization is vital for today’s work (and tomorrow’s). It’s also prudent not to lose sight of where a lot of corporate data remains.
When explaining the concept of cloud computing to friends unfamiliar with it, I usually turn to my imaginary recipe-of-the-day mobile and Web app as an illustrative example. Something that should be seen by users as the very model of simplicity gets very complicated very fast under the hood. It’s enough to make any developer lose his or her appetite. Are your apps doing something similar?
The premise is simple: suggest a daily recipe based on a variety of factors. The genius is blending my ingredient list (multiple data sources) to produce a finished product that’s easy to use, sports a great-looking user interface, and entices users to take some revenue-generating action, such as ordering ingredients or cookware online, or subscribing to a magazine.
Data source #1: User info. To get recipes, the user has to sign up, at minimum, with a user name, password, e-mail address, and postal code. The postal code is crucial, because a key function of the app is to suggest recipes based on specific location and weather conditions. That ensures a user in Maine won’t get recipes calling for collard greens, and that users in Phoenix in July won’t get recipes for steaming hot soups. Also, recipes might be sent not only for today, but for several days out, allowing the user to acquire ingredients not on hand.
Data source #2: Weather forecast based on postal code to ensure that a hearty stew, best for a cold, snowy day isn’t sent in the midst of a rare December heatwave. Obtained via API calls from a third-party service, such as Weather Underground.
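A forecast lookup like this is a one-call affair. Here’s a minimal sketch; the endpoint, parameters, and response handling are hypothetical stand-ins, not Weather Underground’s actual API.

```python
# Sketch: build a forecast query by postal code, then map the forecast
# high to a broad recipe "mood". Endpoint and field names are
# hypothetical, not any real weather service's API.
from urllib.parse import urlencode

def forecast_url(base, api_key, postal_code):
    """Build the query URL for a postal-code forecast lookup."""
    return f"{base}/forecast?{urlencode({'key': api_key, 'zip': postal_code})}"

def recipe_mood(high_temp_f):
    """Map the day's forecast high to a broad recipe category."""
    if high_temp_f <= 45:
        return "stew"    # cold, snowy day: hearty dishes
    if high_temp_f >= 85:
        return "salad"   # heatwave: no steaming hot soups
    return "any"
```

So a December heatwave (say, a high of 90) steers the suggestion engine away from that hearty stew.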
Data source #3: The app owner’s database of recipes along with photos and links to discussion threads. Maybe links to how-to videos, too.
Data source #4: Analytics that reveals which recipes are most popular by region and time of year. It’s another ingredient in determining which recipe to suggest.
Data source #5: This could also be fields in the user info database. It includes user preferences — favorite and least-favorite cuisine types, self-rated level of cooking expertise, food allergies, how often to suggest a recipe, ingredients to avoid (George H.W. Bush famously hated broccoli), etc. Capture family birthdates, and the app could suggest birthday cake recipes and gifts 10 days in advance.
Data source #6: Current and future farm-fresh ingredient availability by location. It’s no good to suggest a recipe calling for fresh cranberries if they’re out of season. Another API call to somewhere.
Data source #7: Coupon codes and other promotional enticements for purchasing non-perishable ingredients and cookware through the application.
Data source #8: Pricing comparisons at local supermarkets for meats and veggies, likely extracted via APIs from a service that collects this type of data.
Data source #9: If the user makes a purchase, multiple data sources come into play for credit-card processing, shipping address, shipment tracking, and so on.
Data source #10: History database of user actions, including which recipes were viewed, saved, printed, rated, and commented on. What items did the user purchase? Re-display a favorite recipe a year later? What other recipes did the user seek and display? Analytics could prevent five straight days of soups, even though the weather outside is frightful and might suggest that.
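Underneath, blending all ten sources boils down to a scoring problem. Here’s a minimal sketch of that blend; every weight, field name, and rule below is a hypothetical illustration, not a real recommendation engine.

```python
# Sketch: blend several data sources into one recipe score.
# Weights and field names are hypothetical; a real app would tune them
# against its analytics (data sources #4 and #10).
def score_recipe(recipe, user, weather, popularity, history):
    score = popularity.get(recipe["id"], 0.0)       # regional/seasonal popularity
    if recipe["cuisine"] in user["favorite_cuisines"]:
        score += 2.0                                # user preferences (source #5)
    if set(recipe["ingredients"]) & set(user["avoid"]):
        return float("-inf")                        # allergies/dislikes are a veto
    if weather["high_f"] <= 45 and recipe["category"] == "stew":
        score += 1.5                                # cold day favors hearty dishes
    if recipe["id"] in history["recent"]:
        score -= 3.0                                # no five straight days of soup
    return score

def suggest(recipes, user, weather, popularity, history):
    """Return the highest-scoring recipe for today."""
    return max(recipes, key=lambda r: score_recipe(r, user, weather, popularity, history))
```

The veto on allergens illustrates a design choice: some factors should be hard filters, not just score adjustments.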
In addition, there might also be integration opportunities via APIs with retail sponsors: if you’re making clam chowder and the weather is snowy, can we suggest the following cold-weather apparel items, or winter sporting goods, or vacation trips to a tropical resort?
After all this comes the matter of designing and building an application that looks great, presents all the aforementioned data as a completely seamless experience, and performs blazingly fast.
The point here is that nothing is simple. The specs for this app would be complicated. And, it takes a huge amount of talent to build an app that users enjoy and look forward to using repeatedly.
Are you building cloud and mobile apps that integrate data from a large number of sources? What’s your data mashup process and how do you ensure stellar performance? Teach us how you solved these problems; we’d like to hear from you.
The Internet of Things, it seems, has been 80% hype and 20% real products and services. That’s going to change in 2016. The technology is mature and reliable. Security is getting better. APIs to access and leverage data from IoT sensors are becoming more commonplace. And, most importantly, IoT, so far largely a consumer novelty, is expanding from the home to the industrial sector.
This is all great except for one thing. There aren’t enough developers who combine technical ability in IoT with an understanding of business principles. At least not in the United States.
It is an issue that ETwater has been dealing with for years. The company, based in Novato, Calif., designs cloud-based IoT smart lawn irrigation systems for the consumer, industrial, and commercial sectors. It builds Wi-Fi hardware controllers that manage the multiple zones of a typical lawn sprinkler system. It also does billing of customers and provides a breadth of reports about water usage and savings. In the middle is an integration analytics engine that calculates when and how much to water, based on dozens of factors pulled in via APIs from a variety of sources. These include weather forecasts, humidity levels, what type of plantings are in each sprinkler zone, time of year and day, sun and wind conditions, sensor readings of soil moisture levels, and a whole lot more. It’s not the kind of application that comes to mind when I think IoT, but, when you look at all the pieces, it’s an exquisite blend of data that results in specific actions.
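That blend can be sketched in a few lines. To be clear, this is not ETwater’s actual engine; the factors, thresholds, and function names below are illustrative assumptions about how such a decision might look.

```python
# Sketch of an irrigation decision for one sprinkler zone. This is NOT
# ETwater's actual algorithm -- all thresholds and factors here are
# hypothetical, for illustration only.
def watering_minutes(soil_moisture_pct, rain_forecast_in, base_minutes, plant_factor):
    """Return minutes to water a zone this cycle, or 0 to skip it."""
    if soil_moisture_pct >= 60:
        return 0                                   # soil sensor says: wet enough
    if rain_forecast_in >= 0.25:
        return 0                                   # meaningful rain coming; let nature do it
    deficit = (60 - soil_moisture_pct) / 60.0      # how dry the zone is, 0..1
    return round(base_minutes * plant_factor * deficit)
```

A zone of thirsty plantings (higher `plant_factor`) on dry soil waters longer; a wet forecast or a moist soil reading cancels the cycle entirely, which is where the water savings come from.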
But, there’s a problem, according to CEO Lee Williams. He can’t find enough qualified developers with expertise in IoT. Call it an IoT talent shortage or gap. The company develops its hardware, software, and analytics with a distributed technical staff: a primary engineering team in Ukraine, two development teams in India, and a group of architects and user experience designers in the San Francisco area.
Williams told me, “It is difficult to find talent in the U.S. that is as sophisticated and capable as what some of the European teams can do in radio and wireless technology in particular.” And he was even more blunt about developers specializing in cloud-based mobile apps. “I would not say good senior mobile developers are widely available in the U.S. where I do feel they are available elsewhere.”
I wouldn’t go so far as to label this an indictment of how we grow our talent on these shores, but it should serve as something of an alarm. The U.S. is not alone; the talent gap exists in Europe, too.
What is your experience in finding qualified developer talent to work on your company’s IoT, mobile, or cloud-based projects? Are you able to fill your open positions? Are you forced to hire expensive outside contractors for temporary help? Or are you turning to offshore technical expertise to get the job done? Share your opinions and experiences; we’d like to hear from you.
You’ve built all kinds of apps for cloud and mobile — retail, medical, financial, navigational, IoT and more. Most have sign-ons with security and authentication. Almost all integrate data from numerous disparate sources and combine to create something entirely new. You’ve designed user interfaces. You’ve streamed music and video. You’ve built user experiences.
But, have you built a game for the cloud? Way back in 2010, it was a concept big enough to be covered by CNN. Even Forbes magazine said cloud gaming would be a “game changer,” and the Wall Street Journal called gaming the killer app of cloud computing. Would you believe that IBM is “creating a business infrastructure for games”? Serious stuff, this game playing is.
Graphics card maker NVIDIA has a section of its website devoted to GaaS. Called NVIDIA GRID, it promises the ability to stream video games like any other streaming media. It “renders 3D games in cloud servers, encodes each frame instantly and streams the result to any device with a wired or wireless broadband connection.” The company already lists nine middleware suppliers and four IaaS providers that are playing along. An SDK is available, if you’re ready to get into building cloud games.
GamingAnywhere describes itself as an open-source cloud gaming platform designed to be extensible, portable, and reconfigurable. In this environment, games run on cloud servers while players interact via networked thin clients. The biggest challenge may not even be technical: The site notes that gamers are hard to please. They demand high responsiveness and high video quality, “but do not want to pay too much.” Truer words may never have been spoken.
The giant on the block, Amazon Web Services, is a player, too. To quote from the AWS gaming website, “Amazon offers a comprehensive suite of services and products for game developers in any games industry vertical, for every major platform: Mobile, Social/Online and PC/Console.”
Sure, you’re building great cloud apps for the bank or insurance company you work for. But, inside of many of us lurks a cape-wearing superhero who wants to save the world. If that’s you, share your experiences about building games for the cloud. We’d like to hear from you.
Not even two years. HP’s Helion public cloud service launched on May 6, 2014. It’s being euthanized on Jan. 31, 2016. If your organization has workload and data assets in the HP cloud, they must be moved elsewhere by that cutoff date.
Not even one year. In reality, HP pretty much threw in the towel after less than a year. Just 11 months after its launch, in an April 7, 2015 New York Times story, Bill Hilf, HP’s senior vice president of cloud products and services, sounded the death knell: “We thought people would rent or buy computing from us. It turns out that it makes no sense for us to go head-to-head.” That’s head-to-head with the big three: Amazon Web Services, Microsoft Azure, and Google Cloud.
Selling public cloud services was a logical step. HP, the company with the largest share of the worldwide server market, would sell you compute services instead. According to figures published on Aug. 25, 2015 by researcher IDC, HP had a 25.4% share of server revenue worldwide. Dell came in second with 17.5%, followed by IBM at 14.8%.
According to one published report from Sept. 2015, AWS alone operates between 1.5 million and more than 5 million servers of its own custom designs, optimized for specific workload types, with no more than 100,000 in any one data center. Amazon itself says that “the AWS Cloud operates in 30 Availability Zones within 11 geographic regions around the world, with more coming in 2016.” HP simply wasn’t equipped to compete on this scale.
Take a step back and gain a good understanding of where, precisely, your cloud services are running. If anything is currently on HP Helion, there’s precious little time remaining to find another home, then migrate and test data and applications. Indeed, you will have been to Helion and back.
Even the most cursory glance at cloud service offerings makes it immediately clear why nearly every company is there today. It’s cheap, it’s fast, and at the larger providers like AWS, it’s even very secure. (AWS always gets high marks for security.) Who wouldn’t be seduced by all those computers, bought, paid for, and managed by someone else, available for 2 cents a minute, or less?
Seduced may, in fact, be exactly the right word for it. I recently spoke at length with an IT manager of a 500-person company in the food service industry. (To protect her job, I’m not naming her or her employer.) She’s been the manager there for just over a year and inherited what some might call a total mess. The previous boss bought best of breed, but unfortunately many of those BoB systems don’t communicate with each other. The net result: she has 25 different systems in a company that is not huge, and a very, very small IT staff.
Her boss felt the cloud had to be the answer…it’s fast, cheap, etc. So she did her homework, and the results were eye-opening. Her research showed it would cost $12 per person per month, or $72,000 per year, and that cost did not include basic services such as email, SharePoint, etc. “We’re too big for the cloud to be cost effective and too small to have money to spend on it,” she said. Cloud providers make the case that companies can “get rid of their programmers,” she said, but key services like email still have to be in-house, and someone has to support them.
And she looked at the 10-year ROI, which she described as horrible. “It’s like the difference between renting a house and buying a house,” she said. “In the food service industry you just can’t waste money like that.”
So instead of the cloud, she’s planning to slowly knit her disparate solutions together by phasing in an ERP system. With user-defined fields and a lot of flexibility, she thinks it will ultimately eliminate the need for serious programming skills, either internal or external. More important, she’s predicting the company will see a return on investment starting at the 2.5-year mark.
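Her rent-vs-buy math is easy to sketch. The $12-per-user-per-month quote and the 500-person headcount come from her story; the on-premises figures below are hypothetical placeholders, since she didn’t share those numbers.

```python
# Sketch of the rent-vs-buy math. The $12/user/month price and 500-person
# headcount come from the story; the on-premises upfront and support
# figures are hypothetical placeholders for illustration only.
USERS = 500
CLOUD_PER_USER_MONTHLY = 12   # quoted cloud price, dollars

def cloud_cost(years):
    """Cumulative subscription cost: pure rent, no asset at the end."""
    return USERS * CLOUD_PER_USER_MONTHLY * 12 * years

def onprem_cost(years, upfront=150_000, annual_support=20_000):
    """Cumulative buy-and-run cost: big outlay up front, cheaper to carry."""
    return upfront + annual_support * years

# The quoted $72,000/year checks out: 500 users x $12 x 12 months.
assert cloud_cost(1) == 72_000
```

With these placeholder on-prem numbers, renting is cheaper for the first two years and owning wins from year three on, which is roughly the shape of her 2.5-year break-even prediction.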
So I share this cautionary tale about jumping on the cloud bandwagon not to be a Luddite, but to remind us all that doing what everyone else is doing, whether it’s cloud, test automation, or DevOps, may not always be right for every company or every situation.
Microsoft today launched Microsoft PowerApps, a major new platform that lets anyone build cross-platform mobile applications. Anyone. And for now, it’s free.
Love it or hate it, this latest salvo in the no-code tool wars certainly provides the validation that other makers of similar tools have been seeking. I examined no-code tools a couple of months ago, pondering whether giving line-of-business departmental employees the power to build apps was a good thing for IT or not. On that question, the jury is still out. Regardless, the launch of PowerApps renders the question moot. No-code is here to stay.
In a blog entry posted today, Bill Staples, Microsoft’s corporate vice president of application platforms, makes some entirely valid points. “The mobile revolution, together with nearly limitless compute and data in the cloud, has transformed our professional experience,” he writes. Can’t argue with that. He goes on to say that “the apps we use to do business have been slow to keep pace with employee demand.” On that point, I see some room for debate.
Staples’ point is that while SaaS has become the platform of choice for CRM, expense reporting, and other functions, the majority of business applications still run on premises, “dependent on corporate connected PCs.” Is he kicking Windows in the teeth? Not in so many words, but he’s not coming to its defense, either.
The problem, Staples believes, is a lack of mobile developers, proliferation of data that spans on-premises and cloud, and problems with app distribution.
What does PowerApps promise? Plenty. There’s the ability to create personalized apps that unite popular services, including Microsoft’s own Office 365, Dynamics CRM, and OneDrive, along with Salesforce, Dropbox, and more.
Here’s the key point for me, directly quoted from the PowerApps website: “Employees can use their Office skills to create business apps tailored to their needs. Pro developers can use Azure App Service to build apps and connections faster than ever.” (That means pros get to use the APIs and others do not.)
You’re bound to be asked about this, so spending a few minutes checking out the resources and signing up for the beta would seem to be a good idea. After you’ve done that, come back here and share your opinions. Is this something you or other executives would allow inside the company? Let us know what you think.
One of Gartner’s predictions for 2016 is that by 2020, smart agents “will facilitate 40 percent of mobile interactions, and the post-app era will begin to dominate.”
In its annual list of looking ahead, Gartner says that smart agent technologies, which it characterizes as virtual personal assistants (VPAs) or other agents, “will monitor user content and behavior in conjunction with cloud-hosted neural networks to build and maintain data models from which the technology will draw inferences about people, content and contexts.”
In other words, the apps being written over the next few years are going to get smarter about dealing with data. That’s undoubtedly the result of underlying algorithms growing increasingly sophisticated. That’s certainly one reason why the job market for people with analytics expertise is exploding.
Gartner goes on to say that VPAs will be able to predict users’ needs, become trusted, and eventually act autonomously on the user’s behalf.
That’s powerful stuff. Whether we’re ready for this from a societal perspective certainly is fodder for profound discussion. Have at it.
Aside from the impact on civilization, I’m interested in what this means for the development community. This concept goes well beyond traditional coding we’ve all done — transaction processing, database queries, report generation, etc. It’s also very different from the coding that drives applications, such as data communications, packet processing, validation, authentication, load balancing, failover switching, and dozens of others.
As a software developer, do you see smart agents advancing to a degree that they become predictive and able to act on a user’s behalf? Share your opinions; we’d like to hear from you.
Looking back at Oracle OpenWorld 2015, all one can see is clouds. “Cloud, cloud, cloud” was the conference summation by 451 Research analyst Alan Pelz-Sharpe. The interesting thing about the cloud theme, he added, was that no one, not even Oracle chairman Larry Ellison, was saying “jump on the cloud now.” Instead, Oracle came to town with a cloud transition message.
Sure, Oracle was pushing its cloud technologies, but OOW 2015 keynote speakers largely talked about cloud strategies, said Pelz-Sharpe, 451 Research’s research director for social business. This stance was refreshingly realistic and rare in the realm of vendor conferences.
Pelz-Sharpe opines on Oracle’s cloud and Java strategies in this video interview with Jan Stafford, SearchCloudApps executive editor.
TechTarget’s coverage of OOW 2015 supports Pelz-Sharpe’s description of Oracle’s cloud and cloud strategy focus there. For example, Ellison touted Oracle’s cloud services in his keynote, but he also stressed that Oracle and its customers are just getting started in adopting cloud services.
To help customers ease legacy applications and development platforms into the hybrid cloud, Oracle touted its Java Standard Edition (SE) Cloud Service and Oracle Integration Cloud Service (ICS). The new SE Cloud Service is a cloud-based platform for Java development that provides the means to move Java SE 7 and 8 applications onto the Oracle cloud platform. Oracle ICS is an application integration PaaS that facilitates point-and-click usage of several Oracle integration suites, including SOA and API management cloud services.
Oracle users do need help with adopting cloud, because cloud will bring substantive changes to their entire enterprise environment, said Melissa English, president of the Oracle Applications Users Group (OAUG), in an OOW 2015 interview with SearchOracle reporter Jessica Sirkin. Indeed, English said, many Oracle users are asking if they have to implement cloud at all. She appreciated OOW speakers’ focus on the benefits of the cloud, which pieces of the cloud to start with, and other how-to, why-to, when-to-adopt-cloud topics.
What OOW 2015 didn’t bring was big news about Java strategies, said Pelz-Sharpe. Check out this video to hear his views on why, and why that doesn’t diminish Java’s importance.
We’re seeing an increasing number of cloud and mobile apps turn to Bluetooth for beacon-based location services and other peripheral device connectivity. The number of devices using Bluetooth continues to explode, outstripping the technology’s earlier vision and capabilities. Are you ready to leverage the more-powerful Bluetooth that’s coming your way?
Earlier this week, the Bluetooth Special Interest Group, keeper of the technology’s specs, trademark, and licensing, announced a roadmap for 2016 that encompasses longer range, faster speed, and standardized mesh networking. The key driver for this? IoT, of course.
It’s all summed up in one sentence from Mark Powell, executive director of the Bluetooth SIG:
“The new functionality we will soon be adding will further solidify Bluetooth as the backbone of IoT technology.”
And there you have it.
According to Statista, 2012 saw 3.5 billion BT-enabled devices installed worldwide. The market researcher is projecting the number will soar to 10 billion in 2018. I think that may be low, although slowing growth in the tablet market may have an impact.
There are several interesting aspects to this. The BT SIG plans to:
- Quadruple the range of Bluetooth Smart. That will give a huge boost to smart home and infrastructure applications, allowing them to deliver an extended, more-robust connection for full-home or outdoor use cases.
- Double the speed. Increasing BT speed by 100 percent, without any increase in energy consumption (think battery drain), will enable faster data transfers in critical applications, such as medical devices. You get better responsiveness and lower latency.
- Standardize mesh networking, enabling Bluetooth devices to connect in networks that can cover an entire building or home and opening up home and industrial automation applications.
The mesh aspect allows device-to-device communications, eliminating the need for everything to pass through a central air-traffic control tower. While it’s still too early to know where this roadmap will eventually lead, the people who dream up new apps must be salivating at the possibilities.
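The hub-versus-mesh distinction can be shown with a toy flood relay; this is a sketch of the general idea, not the actual Bluetooth mesh specification, and the device graph below is invented.

```python
# Toy flood-relay over a device graph -- a sketch of the mesh concept,
# NOT the Bluetooth mesh protocol itself. Each node rebroadcasts a
# message once, so it can travel far beyond any single radio's range.
from collections import deque

def reachable(links, source):
    """Return the set of devices a flooded message reaches from `source`."""
    seen = {source}
    queue = deque([source])
    while queue:
        node = queue.popleft()
        for neighbor in links.get(node, []):
            if neighbor not in seen:   # relay each message only once
                seen.add(neighbor)
                queue.append(neighbor)
    return seen
```

In a chain of four devices where each radio only hears its immediate neighbor, a central hub at one end would reach a single device directly, but flooding hops the message down the whole chain.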
Look no further than Apple’s push into its HealthKit platform, and we can surmise the market for BT-based medical devices is about to explode. Personally, I’d like to see my car communicate and log its own status and health information in detail to my iPhone; that’s a “healthcare” app, too.
Think of this as industrial IoT-based BT, rather than consumer-oriented BT headphones and other similar toys. After all, IoT now represents an enormous market. How big? Toby Nixon, chairman of the Bluetooth SIG Board of Directors, has the answer. “Current projections put the market potential for IoT between $2 trillion and $11.1 trillion by 2025. The technical updates planned for Bluetooth technology in 2016 will help make these expectations a reality and accelerate growth in IoT.” (And yes, that’s quite a “between” fudge factor of $9.1 trillion.)
The addition of mesh networking will strengthen Bluetooth connections by allowing them to bridge from device to device, rather than routing each product through a central hub. These upgrades are just a “technical roadmap” for now, but the SIG says we’ll hear about “additional features” and more details in the coming months.
What are your organization’s plans for leveraging Bluetooth? What apps and products are you going to build to beat up your competitors and make the world a better place? Share your thoughts; we’d like to hear from you.