IoT is the latest thing in embedded systems technology, and it’s not a fad. IT research firm Gartner Inc. projects that 26 billion IoT units will be installed by 2020, generating $1.8 trillion in revenue. Nor is IoT a consumer-only technology. Enterprise architects should plan now for the entrance of connected device technologies into their IT environments, according to experts quoted in SearchSOA’s feature, “The enterprise IoT wave rolls in: How to prepare”. Those who hesitate will soon be behind the curve. “It’s not a matter of where IoT is entering but where isn’t IoT going to push into the enterprise,” Gartner analyst Mike Walker said in that article.
In recent articles, SearchSOA contributors have provided a wealth of security tips and tricks for IoT development. This post links to those pieces, putting experts’ and users’ advice at your fingertips.
Developing IoT applications will require rethinking quality standards. For example, enterprise executives release flawed software daily, a developer told me recently, after weighing the cost of defects against the cost of delaying release. That luxury isn’t an option in embedded and IoT releases, where failures and security breaches can put user safety at risk.
Failure is much less acceptable when it comes to embedded software. Unfortunately, people expect software and networks to fail or get hacked. Their expectations for mechanical devices are much higher, however. “You shouldn’t have to worry about a blue screen of death on your toaster oven,” said Carnegie Mellon University’s Philip Koopman, in the lead story of SearchSOA’s three-article handbook, “Embedded software, IOT development demand careful scrutiny”.
Likewise, people shouldn’t have to worry about a hacker turning off a car’s ignition system. Billions of connected devices present a big target, so we’ve gathered advice on creating hack-resistant IoT approaches in this article, “How to cook up the right IoT security strategy for your enterprise”. A huge challenge in beating hackers is that IoT devices bypass firewalls and create ongoing connections to third-party services, reported Mark Stanislav, senior security consultant at Rapid7. In this tip, he and other experts give advice on how to create outside-the-firewall security strategies.
Our sister site, SearchCloudApps, digs into the cloud side of IoT development. Asked how IoT will alter developers’ application strategies, resident expert Chris Moyer pointed to integration as a challenge. A sample of Moyer’s advice here is: “If you create APIs and integrate with popular IoT integration services like IFTTT, you’ll be better able to take advantage of all the devices in a user’s life.”
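Moyer’s point about exposing APIs can be made concrete with a small sketch. The plain-Python snippet below routes an incoming webhook payload of the kind an integration service like IFTTT might POST to an endpoint you expose; the action names and handler here are hypothetical, not part of any real IFTTT channel.

```python
import json

# Hypothetical device handler: turns a named device on. Anything an
# IoT integration service triggers ultimately lands in code like this.
def turn_on_lights(args):
    return {"device": "lights", "state": "on", **args}

# Registry mapping action names in the payload to handlers.
HANDLERS = {"lights_on": turn_on_lights}

def handle_webhook(body: str) -> dict:
    """Parse a JSON payload like {"action": "...", "args": {...}}
    and dispatch it to the matching device handler."""
    payload = json.loads(body)
    handler = HANDLERS.get(payload.get("action"))
    if handler is None:
        return {"error": "unknown action"}
    return handler(payload.get("args", {}))
```

The design point is the thin, documented contract: an integration service only needs to know the payload shape, not the devices behind it.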
Watch SearchCloudApps for updates on cloud services and tools for IoT development and deployment, too. A recent report describes a new mobile and IoT application development toolset from Embarcadero, called RAD Studio XE8.
What do you need to know about IoT? Let us know. Our resident experts will address your questions.
Has DevOps adoption hit the mainstream? Well, IT analyst firm Gartner Inc. predicts that 66% of enterprises – and 25% of Global 2000 companies – will be using DevOps tools and practices in the cloud by 2016. Gartner also expects sales of DevOps tools to reach $2.3 billion this year. That doesn’t look like a niche market.
First posited in 2009, the concept entails merging organizations’ software development, QA and IT operations groups. Cloud computing is driving DevOps adoption by enabling faster development and deployment of applications and by complicating what was once a simple, on-premises IT environment, according to Gartner and other research firms, such as 451 Research in its 2014 study.
Need drives the decision to find a solution, but evaluation leads to the purchase. To help IT and Ops managers make DevOps decisions, TechTarget’s app dev sites have published a slew of expert advice articles on that subject. Here are just a few:
- Cloud consultant Tom Nolle explores how cloud management tools are both influencing and incorporating traditional DevOps features in “The evolution of DevOps in the cloud”. In this advice article, he surveys some current cloud DevOps tools, such as Puppet, and explains the benefits of using frameworks and specifications. In the latter category, he recommends evaluating TOSCA (Topology and Orchestration Specification for Cloud Applications), an open standard which facilitates describing complex application structures.
- Let’s not forget mobile development. The SearchSOA article “Integration tools that bridge the mobile DevOps gap” examines how enterprises are using mobile DevOps integration services, such as PagerDuty, BigPanda and VictorOps. Here, you’ll find out how eHarmony used PagerDuty to streamline IaaS alerts.
- SearchAWS offers a handbook on DevOps called “Survive and thrive in cloud DevOps”. Even though DevOps adoption is on the rise, managers of siloed IT, operations and business departments may be hard to sell on the concept. This collection includes:
- Advice on pitching DevOps to department managers in the article “Gaining acceptance for DevOps in the cloud”.
- Various approaches for using AWS OpsWorks, a DevOps automation and management toolset, to enhance application security in “Using OpsWorks’ configuration automation”. Here, expert Dan Sullivan examines the toolset’s uses for improving app security and reducing app policy and procedure missteps. The key is using OpsWorks’ standardized configuration options. “Automating configuration and update operations with OpsWorks can eliminate inconsistencies in application policies and procedures,” he writes.
- An explanation of why and how to implement security operations management in AWS by contributor George Lawton. The SecOps approach calls for continuous threat testing and monitoring in a secure software development lifecycle practice. The article describes several tools that automate those functions in AWS environments and shares steps for removing vulnerabilities during the application design phase.
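For readers curious what the TOSCA specification Nolle recommends looks like in practice, here is a minimal, illustrative sketch of a TOSCA Simple Profile template in YAML; the node name and property values are hypothetical, and real templates carry considerably more detail.

```yaml
tosca_definitions_version: tosca_simple_yaml_1_0

description: Minimal illustrative template for a single compute node.

topology_template:
  node_templates:
    app_server:                    # hypothetical node name
      type: tosca.nodes.Compute
      capabilities:
        host:
          properties:
            num_cpus: 2
            mem_size: 4 GB
```

The value of the standard is that orchestrators, not humans, consume descriptions like this to deploy and relocate complex application structures.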
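Sullivan’s point about automation eliminating configuration inconsistencies boils down to comparing a desired configuration against what is actually deployed. This is a minimal, tool-agnostic sketch of that idea in Python; it is not the OpsWorks API, and the keys and values are made up.

```python
def config_drift(desired: dict, actual: dict) -> dict:
    """Report keys whose deployed value differs from, or is missing
    relative to, the desired configuration."""
    return {
        key: {"desired": value, "actual": actual.get(key)}
        for key, value in desired.items()
        if actual.get(key) != value
    }

# Hypothetical application-layer settings.
desired = {"app_port": 8080, "tls": True, "log_level": "info"}
actual = {"app_port": 8080, "tls": False}

drift = config_drift(desired, actual)
```

A real toolset then remediates the drift automatically; the detection step above is what makes policies and procedures consistent across instances.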
Stay tuned – there are more DevOps tips to come! Meanwhile, let us know if there are DevOps challenges you’d like addressed by SearchSOA’s resident experts.
A photograph from a Mars rover may be breathtaking, but it will not deliver the complex data space scientists seek. Scientists like Washington University in St. Louis computer systems manager Thomas Stein need broader sets of data in formats that work with modern data analysis software. Stein helped create Analyst’s Notebook, a tool that documents geological findings from space missions and organizes that data in an online offering accessible to scientists and the public.
Look at the data coming from the instruments on just one mission, say, the Mars rover Opportunity. Some scientists focus on one type of data from a given instrument, some on others. Meanwhile, said Stein, the general science community may want broader data from that instrument for research in other disciplines. In addition, many scientists run cross-instrument and cross-mission searches and correlations to study a variety of topics.
“Today’s scientists cannot simply convert an image to a .JPG and use it, because you lose so much of the science quality of the data,” said Stein, who works in the University’s Department of Earth and Planetary Sciences. Analyst’s Notebook helps replay and archive mission images and data, but that information must still be archived in formats accessible to scientists using many different software applications and devices.
Stein’s group works with NASA (National Aeronautics and Space Administration) to archive planetary data for the long term – as in the next 50 to 100 years. “We wanted to develop a value-added tool that helps scientists bind this data in a meaningful way,” said Stein. “By giving them data previews, we’d help them understand what they’re getting before they actually hit the download button.”
Developing software for geological studies of space rocks wasn’t Stein’s intention when he got an after-college job in the Smithsonian Institution’s Mineral Sciences Department. Yet it was there that he was asked to develop software for a traveling exhibit on volcanoes. The success of the three applications he delivered led to more projects for the Smithsonian.
After those geological software successes, Washington University contacted Stein about programming software for scientists studying “space rock” data from the Giant Magellan Telescope. The immediate problem Stein addressed was a flaw in the way scientists were doing field testing. “Nobody was taking notes about the decision-making process,” he said. “After a week of field tests, they realized, ‘Hey, we don’t even remember why we decided to look at this rock instead of that rock.’”
Of the many challenges for building scientific applications, two in particular really perplexed Stein and the NASA team: unpredictability of data from Rovers and feature glut.
For an orbital mission, an obvious objective is to map the planet systematically, but the Rovers don’t make this process easy, because they, well, rove. “Scientists often don’t know where the rover will drive and what it’s going to find,” said Stein. Another goal is determining the characteristics of natural objects, such as rocks. The scientists need to know where a finding occurred and in what context, which is hard to tell from a single image. To deal with this problem, the development team used Microsoft Image Composite Editor, which was built on Microsoft SQL Server, to create mosaic images that show a finding’s surroundings in context.
The feature glut issue comes from the length of today’s space missions. “Keeping up with what our users need over 10-15 years is unbelievably hard,” Stein said. “Think of how different the expectations of software users were 15 years ago – nobody asked for one-click ordering online.”
The development team, focused on Opportunity and other NASA Rovers, sought an automated development platform that set up the back end so they could devote more time to building value-added tools specific to planetary data coming from Rovers. “We shouldn’t be building basic code, laboring over documentation and doing cross-platform testing,” he said. Telerik Platform, a cross-platform development suite, was chosen to help the software teams focus on high-level challenges and bypass earlier phases of software development.
A web-based application running on the Microsoft ASP.NET platform, Telerik Platform provides a user interface (UI) that NASA uses for framework controls. In addition, Telerik’s automated test and quality assurance tools reduce the time needed to build a feature. An example is a documentation feature Stein’s team built that enables rapid online searches. “Documentation becomes very difficult when doing rapid application development and dealing with such huge sets of data,” he said. Telerik’s toolset helped him build a feature that enables a user looking for images of a certain target to find them quickly online “at the push of a button, instead of the user having to do literature searches.”
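The target-search feature Stein describes is, at its core, an inverted index: map each term to the items that contain it, then intersect the results for a multi-word query. This Python sketch shows the idea; the image IDs and captions are invented, not mission data.

```python
from collections import defaultdict

def build_index(docs: dict) -> dict:
    """Map each lowercase word to the set of document IDs containing it."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for word in text.lower().split():
            index[word].add(doc_id)
    return index

def search(index: dict, query: str) -> set:
    """Return IDs of documents that contain every word in the query."""
    words = query.lower().split()
    if not words:
        return set()
    results = set(index.get(words[0], set()))
    for word in words[1:]:
        results &= index.get(word, set())
    return results

# Invented sample catalog entries.
docs = {
    "img-001": "mosaic of target rock near Endurance crater",
    "img-002": "panorama of crater rim",
    "img-003": "target rock close-up",
}
idx = build_index(docs)
```

Building the index once at ingest time is what turns a literature search into a push-button lookup.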
Being able to react quickly to user needs is a necessity today, one that automated test and development platforms make possible. “In reality, I’m still not a computer scientist, I’m a geologist,” he said. “A foundation development tool really helps me not worry so much about the computer science side and focus on the science side.”
Any organization NOT doing an application modernization project this year is in the minority. Over 70% of businesses worldwide are modernizing their application environments to handle mobile, cloud and other emerging digital platforms in 2015, according to a recent survey.
Not only are most businesses engaging in application modernization (also called legacy application modernization) efforts, at least 61% consider it a strategic asset to drive business forward and a tool to help maintain market position and even survival. This is according to a recent survey of CIOs by app modernization vendor CSC.
Application modernization projects will be a higher priority for businesses in Europe (80%) and Asia (73%) in 2015, largely because the technology refresh cycles have been slower in those areas than in North America, where 55% will invest this year.
A challenge in tallying application modernization projects is that they can be part of larger projects, such as cloud modernization and portfolio modernization, both of which focus largely on app modernization. In some minds, the latter sounds old school, identified only with mainframe-to-server migrations. Those migrations are hardly irrelevant, though, as Moorcroft Debt Recovery Group’s recent project affirms.
Of course, the rapid rise in business use of cloud services is driving many businesses worldwide to modernize their app portfolios; that, and the fact that their competitors are doing the same. Likewise, workers’ use of enterprise apps on mobile devices is growing faster than IT’s ability to support it.
This cloud rush encourages a “fools rush in” modernization mentality. Sober but savvy enterprise architects approach modernization from more than the cloud apps angle. By examining their app portfolios and on-premises IT infrastructure, they’re weighing which applications should be re-platformed to hybrid or public cloud models. Which applications should be replaced by software as a service? Which strategic applications should be rewritten to take advantage of microservices architecture and platform as a service?
The complexity of modernizing applications, and the importance and risk of these projects to business success, is mind-boggling. From my observations, the most successful projects started with intense application portfolio assessments. They chose app modernization products and services designed to be used in a systematic fashion and flexible enough to accommodate each business’s distinct needs, app portfolio and infrastructure.
There are many approaches to modernizing software, more than I can cover in this post. Fortunately, we’ve put together a library of articles on app modernization. Enjoy!
With 2014 coming to an end, we close the door on an eventful year. Technologies such as Hadoop, Docker and microservices made their way into our everyday lexicon. Some tried and true predictions held on for another year.
For example, in an interview at the end of 2013, Forrester principal analyst Brian Hopkins asserted that mobile applications would make waves in 2014. “Everybody is suddenly realizing that the platform is not the differentiator, it’s the apps,” he said.
In many ways, Hopkins’ prediction was correct. Over the course of the year, mobile application development proved to be important, especially in corporate environments. Several enterprise architects and developers shared with SearchSOA.com how they selected tools to help them gain an edge in the mobile sphere.
Indeed, it appears that mobile technology, big data and the Internet of Things took the spotlight in 2014 – a position they will likely hold for quite some time. Some experts assert that SOA should have been right up there in the limelight too, but it ended up in the wings. In an interview with Christine Parizo, 451 Research’s Carl Lehman said, “SOA is a wallflower … It was brought to the dance, but it’s not on the dance floor.”
Poor SOA. Will 2015 be the year SOA is crowned king?
Given what some industry insiders recently said, it doesn’t sound like SOA per se is going to be voted most popular — yet again. It appears that flashy mobile technology and the cloud will be most thought of, even though SOA may be the true underpinning of how everything is synched together.
Although the term microservices may be relatively new, some experts, like Gartner vice president and senior analyst Anne Thomas, believe the style is going to become increasingly prevalent. “I think that a small number of people, maybe 10% of organizations, will start trying to play with microservices and bounded context during the next year,” she said.
In fact, Thomas said she feels that microservices is really just SOA under a different name. So maybe SOA will edge upward in popularity after all, just under a different guise.
What are your SOA trend predictions for 2015? What do you think should have happened in 2014 that didn’t?
Integrating applications deployed in traditional enterprises or data centers with those in the cloud is a common headache for enterprise architects. Red Hat recently released OpenShift Enterprise 2.2 and new cloud services, JBoss Fuse for xPaaS (integration) and JBoss A-MQ for xPaaS (messaging), to make it easier for developers to update applications and integration platforms.
The cloud-based messaging tools aim to speed up application development, particularly in organizations with a hybrid IT architecture. With them, enterprise customers can use PaaS for applications running in their own data centers and private clouds, according to Joe Fernandes, OpenShift director of product management.
We’ve all heard the terms iPaaS, IaaS, and SaaS, but what the heck is xPaaS? In short, xPaaS is a term Red Hat coined for uniting various integration and application-centric tools under one offering. “A lot of traditional middleware solutions are becoming available as a service,” noted Fernandes.
The new OpenShift Enterprise 2.2 release, with the addition of a private iPaaS, is meant to help organizations plan for future development. “It’s not traditional 2005 architecture,” said Pierre Fricke, director of product marketing for Red Hat JBoss Middleware. “It’s a 2015 type of architecture for microservices with a centerpiece around Apache Camel.”
Microservices are small, highly distributed applications composed of logic and services that have to be connected and wired together, said Fernandes. “In many ways it’s the new SOA.”
Microservices were a hot topic at JavaOne 2014. During that event, Java Champion and consultant Jeff Genender and developer Rob Terpilowski said that microservices offered a streamlined means of integrating cloud services.
Apache Camel brings standardized integration to the xPaaS offerings by implementing the patterns catalogued in the Enterprise Integration Patterns book. “Camel actually implements the book that everyone uses; that makes it the closest thing to a standard for integration,” said Fricke. “It’s more a de facto emerging standard for integration than anything else.”
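Camel’s route DSL is Java, but the core idea of one of the book’s most-used patterns, the content-based router, fits in a few lines of plain Python; the message types and queue names below are illustrative, not Camel API.

```python
# Content-based router: inspect each message and pick its destination.
# Predicates are checked in order; unmatched messages go to a dead-letter
# queue rather than being silently dropped.
ROUTES = [
    (lambda m: m.get("type") == "order",   "jms:queue:orders"),
    (lambda m: m.get("type") == "invoice", "jms:queue:invoices"),
]
DEAD_LETTER = "jms:queue:dead-letter"

def route(message: dict) -> str:
    """Return the destination endpoint for a message, by content."""
    for predicate, destination in ROUTES:
        if predicate(message):
            return destination
    return DEAD_LETTER
```

Camel expresses the same logic declaratively (roughly, from a source endpoint, choose a destination when a predicate matches), which is why a shared catalog of such patterns works as a near-standard.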
Bridging the gap between development and operations to support applications is really where xPaaS comes into play, according to Fernandes. “As you get into enterprise applications, inherently they tend to be more complex than some of the applications you see on the consumer side running in the public cloud today,” he said.
Some Fortune 1000 companies, such as analytics software provider Fair Isaac Corporation (FICO), have already leveraged this Red Hat technology, Fernandes said. He noted, however, that xPaaS can also be used by SMBs that need to reduce the time it takes to develop and deploy applications.
IT and business leaders commonly want to reap the benefits of having employees collaborate with one another, but they don’t always take the same view of sharing technology resources. Those decision-makers need an attitude adjustment when it comes to shared workloads, according to Susan Eustis, president, CEO and co-founder of WinterGreen Research.
It may seem like common sense, but not everyone seems to get it. “It’s far more efficient to share a resource than it is to build and not use it all the time,” Eustis said. “It’s a message people don’t want to hear, but in fact, people who invested in shared workloads are the leaders in their industry segment.” Such organizations include Wal-Mart and Travelers Insurance, she noted.
In an era when organizations are trying to stay afloat, or simply get off the ground, looking toward the cloud can seem like a logical move. By sharing workloads in the cloud, organizations can edge costs downward. That can become a competitive advantage, because organizations that adopt such a model can afford to offer products and services at a lower price point.
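Back-of-the-envelope arithmetic shows why shared workloads edge costs down. The numbers below are purely hypothetical; the point is that a shared pool can be sized for aggregate utilization plus headroom, rather than for each department’s individual peak.

```python
import math

SERVER_COST = 3000      # hypothetical monthly cost per server, in dollars
DEPARTMENTS = 10
SERVERS_PER_DEPT = 4    # each department sized for its own peak load
AVG_UTILIZATION = 0.20  # servers sit idle 80% of the time on average
HEADROOM = 1.25         # 25% buffer over aggregate demand

# Dedicated model: every department pays for its own servers.
dedicated = DEPARTMENTS * SERVERS_PER_DEPT * SERVER_COST

# Shared model: one pool sized for aggregate utilization plus headroom.
shared_servers = math.ceil(
    DEPARTMENTS * SERVERS_PER_DEPT * AVG_UTILIZATION * HEADROOM
)
shared = shared_servers * SERVER_COST
```

With these made-up figures, 40 dedicated servers shrink to a 10-server shared pool, cutting the monthly bill from $120,000 to $30,000; the catch, as Eustis notes, is the organizational willingness to share.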
In a traditional setup, every department within an organization has its own set of servers and thus pays for them individually. That, however, is starting to change. Now, some companies pay only for the portion of servers that are in use. “The virtualization workload moves on and off the cloud in the way it hasn’t happened before,” Eustis said.
The cloud isn’t the only area cost savings can be found. IT leaders might be surprised to learn that mainframes may not be the money-draining resource they have a reputation for being. “I’ve done a lot of work over the years and I’m showing the mainframe is 10 times cheaper than the servers,” Eustis said.
The message from Eustis is clear – archaic thinking isn’t going to get an organization ahead. “People have to stop being afraid of losing their job and start looking at what the reality is,” she said.
Have you seen stubborn thinking and practices hinder an organization’s ability to succeed? What are some common pitfalls you’ve seen leaders take when it comes to making IT decisions?
A growing number of organizations are jumping on the big data bandwagon, according to research from Gartner. Within the next two years, more than 70% of just over 300 survey respondents said they will invest in the technology. The figure represents a nearly 10% upswing from last year.
While the notorious three Vs – volume, variety and velocity – have plagued most who attempt to wrangle loads of data, survey respondents paid the most attention to volume: the sheer amount of data.
Even though certain types of information have been gathered for quite some time, the quantity of that data has rapidly risen. If not properly managed, the structured or unstructured information that was once a profit point could turn into a costly, headache-inducing problem.
The research points out that data variety, the different types of information, can be one of the more problematic areas of big data to manage. With the upswing of social media, for example, a new set of skills and tools, plus expanded storage, is needed to make use of the information.
That may explain why more organizations aren’t attempting to get information from log data, often derived from social media. The survey revealed that the number of organizations attempting to glean insights from profiles and interactions dipped 2%. Gartner believes issues integrating social media with other data may be the root of the trend.
Figuring out what to do with, and how to manage big data from social media, isn’t the only problem IT professionals are facing. Mobile devices are also a pain point for developers as needs and goals can vary depending on the application’s target audience.
Despite the problems big data can present, the opportunities to extract valuable information cannot be overlooked. Given the uptick in organizations planning to deploy a big data project in the near future, it seems business leaders are getting the picture. Now it’s up to IT professionals to figure out how to deal with big data in a cost-effective and timely fashion.
Has your organization struggled to integrate information gathered from new sources, such as social media, with more traditional big data sources? How have you gone about overcoming the obstacle?
Public cloud spending is on the rise, according to research from International Data Corporation. Software as a service accounts for more than 70% of the market, the Worldwide Semiannual Public Cloud Services Tracker revealed. Platform as a service and Infrastructure as a service round out the other product groups.
One of the main factors driving the uptick in spending is developer migration to the cloud, according to IDC Chief Analyst and SVP Frank Gens. “Over the next few years, anyone looking for the best enterprise apps will almost certainly be adopting them as cloud services,” he predicted. The second major force, Gens said, is increased comfort that cloud services are “enterprise ready.”
With the increased funds going towards cloud services, more vendors are entering the space, which can be a good thing for developers and enterprises as a whole. “We’ll continue to see a rapid growth in the diversity of options IT shops can find in the public cloud world,” Gens said.
Some options will include on-demand private clouds, more specialized cloud instances, improved security options and new developer services, Gens said. The biggest trend on the horizon, however, will combine the cloud and big data. “We see most of these having a very industry- and/or role-specific focus,” he noted.
What improved cloud service options would you like to have available?
When people talk about “going to the cloud,” it’s often framed in terms of what it means for an organization’s workflow and costs. Anjoy Willy, a director at business transformation product provider Trace3, sees the cloud in a different light. “This is really a push for much greater democracy,” he said.
Radio and television are often cited as revolutionary technologies that transformed the way we transmit ideas. While the platforms made it possible to reach large audiences, both are designed for one-way communication. Furthermore, creating content for radio or television initially involved knowing the right people and having the proper resources. That all changed when access to the Internet became mainstream.
The Internet not only made it possible for people to receive messages; it also created an opportunity for a broad range of people to participate in content development and dissemination. Now, anyone with access to a computer or wireless device can create a blog or upload a video to YouTube.
“That is what is incredible about the Internet and the cloud in general,” Willy said. “Take that computing power and give it to somebody for a fraction of the cost of what it used to take.”
As access to the cloud becomes easier and more cost-efficient, the road is being paved for more innovation. “We have more startups now than we did in the dot-com boom,” Willy said.
Do you see the cloud as having a democratizing effect? How have you seen the cloud change the enterprise landscape?