Has DevOps adoption hit the mainstream? IT analyst firm Gartner Inc. predicts that by 2016, 66% of enterprises – and 25% of Global 2000 companies – will be using DevOps tools and practices in the cloud. Gartner also expects sales of DevOps tools to reach $2.3 billion this year. That doesn’t look like a niche market.
A photograph from a Mars rover may be breathtaking, but it will not deliver the complex data space scientists seek. Scientists like Thomas Stein, a computer systems manager at Washington University in St. Louis, need broader sets of data in formats that work with modern data analysis software. Stein helped create Analyst’s Notebook, a tool that documents geological findings from space missions and organizes that data in an online offering accessible to scientists and the public.
Any organization NOT doing an application modernization project this year is in the minority. More than 70% of businesses worldwide are modernizing their application environments in 2015 to handle mobile, cloud and other emerging digital platforms, according to a recent survey.
With 2014 coming to an end, we close the door on an eventful year. Technologies such as Hadoop, Docker and microservices made their way into our everyday lexicon, and some tried-and-true predictions held on for another year.
Integrating applications deployed in traditional enterprises or data centers with those in the cloud is a common headache enterprise architects face. Red Hat recently released OpenShift Enterprise 2.2 and new cloud services, JBoss Fuse for xPaaS (integration) and JBoss A-MQ for xPaaS (messaging), to make it easier for developers to update applications and integration platforms.
The cloud-based messaging tools aim to speed up application development, particularly in organizations with a hybrid IT architecture. With these offerings, enterprise customers can use PaaS for applications running in their own data centers and private clouds, according to Joe Fernandes, OpenShift director of product management.
We’ve all heard the terms iPaaS, IaaS, and SaaS, but what the heck is xPaaS? In short, xPaaS is a term Red Hat coined for uniting various integration and application-centric tools under one offering. “A lot of traditional middleware solutions are becoming available as a service,” noted Fernandes.
The new release of OpenShift Enterprise 2.2 adds a private iPaaS to help organizations build with future development in mind. “It’s not traditional 2005 architecture,” said Pierre Fricke, director of product marketing for Red Hat JBoss Middleware. “It’s a 2015 type of architecture for microservices, with a centerpiece around Apache Camel.”
Microservices are small, highly distributed applications composed of logic and services that have to be connected and wired together, said Fernandes. “In many ways it’s the new SOA.”
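That “connected and wired together” idea can be sketched in a few lines. The following is an illustrative Python example, not anything from Red Hat’s stack: one tiny single-purpose service exposes its logic over HTTP, and a consuming service calls its endpoint. The service name, port and sample data are all assumptions made up for the sketch.

```python
# A minimal sketch of the microservice idea: each service owns one narrow
# piece of logic and exposes it over HTTP; services are wired together by
# calling each other's endpoints. All names and data here are illustrative.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class PriceService(BaseHTTPRequestHandler):
    """A single-purpose service: quote a price for a SKU."""
    PRICES = {"widget": 9.99, "gadget": 24.50}  # stand-in data store

    def do_GET(self):
        sku = self.path.lstrip("/")
        body = json.dumps({"sku": sku, "price": self.PRICES.get(sku)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

def start_service(port=8765):
    """Run the service in a background thread and return the server handle."""
    server = HTTPServer(("127.0.0.1", port), PriceService)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

def fetch_price(sku, port=8765):
    """What a *consuming* service would do: call the other service's API."""
    with urllib.request.urlopen(f"http://127.0.0.1:{port}/{sku}") as resp:
        return json.loads(resp.read())

if __name__ == "__main__":
    server = start_service()
    print(fetch_price("widget"))  # {'sku': 'widget', 'price': 9.99}
    server.shutdown()
```

In a real deployment each service would run in its own process or container, which is exactly the operational complexity the article says has to be managed.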
Microservices were a hot topic at JavaOne 2014. During that event, Java Champion and consultant Jeff Genender and developer Rob Terpilowski said that microservices offer a streamlined means of integrating cloud services.
Apache Camel brings standardized integration to the xPaaS offerings. “Camel actually implements the book that everyone uses” – the Enterprise Integration Patterns book – “and that makes it the closest thing to a standard for integration,” said Fricke. “It’s more the de facto emerging standard for integration than anything else.”
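One of the best-known patterns from that book is the content-based router, which Camel expresses in its own Java DSL. As a language-neutral illustration of the pattern itself (not of Camel’s API), here is a minimal Python sketch: messages are inspected and sent to different destination queues based on their content, with unroutable messages diverted to a dead-letter queue. The message fields and queue names are assumptions for the example.

```python
# The "content-based router" is one of the Enterprise Integration Patterns
# that Camel implements. This is a plain-Python illustration of the pattern,
# not Camel's API: inspect each message and route it by content.
from collections import defaultdict

queues = defaultdict(list)  # destination name -> delivered messages

def route(message):
    """Inspect the message and pick a destination, EIP-router style."""
    if message.get("type") == "order":
        dest = "orders"
    elif message.get("type") == "invoice":
        dest = "invoices"
    else:
        dest = "dead-letter"  # unroutable messages go to a dead-letter queue
    queues[dest].append(message)
    return dest

route({"type": "order", "id": 1})
route({"type": "invoice", "id": 2})
route({"type": "unknown"})
```

A Camel route does the same thing declaratively – roughly `from(...).choice().when(...).to(...)` – which is why a common library of patterns reads as a de facto standard across integrations.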
Bridging the gap between development and operations to support applications is really where xPaaS comes into play, according to Fernandes. “As you get into enterprise applications, they inherently tend to be more complex than some of the applications you see on the consumer side running in the public cloud today,” he said.
Some Fortune 1000 companies, such as analytics provider Fair Isaac Corporation (FICO), have already leveraged this Red Hat technology, Fernandes said. He noted, however, that xPaaS can also be used by SMBs that need to reduce the time it takes to develop and deploy applications.
It’s not uncommon for IT and business leaders to want the benefits of employees collaborating with one another, but the same perspective isn’t always applied when it comes to sharing technology resources. Those decision-makers need an attitude adjustment when it comes to shared workloads, according to Susan Eustis, president, CEO and co-founder of WinterGreen Research.
It may seem like common sense, but not everyone seems to get it. “It’s far more efficient to share a resource than it is to build and not use it all the time,” Eustis said. “It’s a message people don’t want to hear, but in fact, people who invested in shared workloads are the leaders in their industry segment.” Such organizations include Wal-Mart and Travelers Insurance, she noted.
In an era when organizations are trying to stay afloat, or simply get off the ground, looking to the cloud can seem like a logical move. Sharing workloads in the cloud can push costs down, which can create a competitive advantage: organizations that adopt such a model can afford to offer products and services at a lower price point.
In a traditional setup, every department within an organization would have its own set of servers and pay for them individually. That, however, is starting to change. Now, some companies pay only for the portion of server capacity that is in use. “The virtualization workload moves on and off the cloud in the way it hasn’t happened before,” Eustis said.
The cloud isn’t the only area cost savings can be found. IT leaders might be surprised to learn that mainframes may not be the money-draining resource they have a reputation for being. “I’ve done a lot of work over the years and I’m showing the mainframe is 10 times cheaper than the servers,” Eustis said.
The message from Eustis is clear – archaic thinking isn’t going to get an organization ahead. “People have to stop being afraid of losing their job and start looking at what the reality is,” she said.
Have you seen stubborn thinking and practices hinder an organization’s ability to succeed? What are some common pitfalls you’ve seen leaders take when it comes to making IT decisions?
A growing number of organizations are jumping on the big data bandwagon, according to research from Gartner. Within the next two years, more than 70% of just over 300 survey respondents said they will invest in the technology. The figure represents a nearly 10% upswing from last year.
While the notorious three Vs – volume, variety and velocity – have plagued most who attempt to wrangle large stores of data, survey respondents paid the most attention to volume: the sheer amount of data.
Even though certain types of information have been gathered for quite some time, the quantity of that data has rapidly risen. If not properly managed, the structured or unstructured information that was once a profit point could turn into a costly, headache-inducing problem.
The research points out that data variety, the different types of information, can be one of the more problematic areas of big data to manage. With the upswing of social media, for example, a new set of skills and tools, plus expanded storage, is needed to make use of the information.
That may explain why more organizations aren’t attempting to get information from log data, often derived from social media. The survey revealed that the number of organizations attempting to glean insights from profiles and interactions dipped 2%. Gartner believes issues integrating social media with other data may be the root of the trend.
Figuring out what to do with, and how to manage big data from social media, isn’t the only problem IT professionals are facing. Mobile devices are also a pain point for developers as needs and goals can vary depending on the application’s target audience.
Despite the problems big data can present, the opportunities to extract valuable information cannot be overlooked. Given the uptick in organizations planning to deploy a big data project in the near future, it seems business leaders are getting the picture. Now it’s up to IT professionals to figure out how to deal with big data in a cost-effective and timely fashion.
Has your organization struggled to integrate information gathered from new sources, such as social media, with more traditional big data sources? How have you gone about overcoming the obstacle?
Public cloud spending is on the rise, according to research from International Data Corporation (IDC). Software as a service (SaaS) accounts for more than 70% of the market, the Worldwide Semiannual Public Cloud Services Tracker revealed, with platform as a service (PaaS) and infrastructure as a service (IaaS) rounding out the rest.
One of the main factors driving the uptick in spending is developer migration to the cloud, according to IDC Chief Analyst and SVP Frank Gens. “Over the next few years, anyone looking for the best enterprise apps will almost certainly be adopting them as cloud services,” he predicted. The second major force, Gens said, is increased confidence that cloud services are “enterprise ready.”
With the increased funds going towards cloud services, more vendors are entering the space, which can be a good thing for developers and enterprises as a whole. “We’ll continue to see a rapid growth in the diversity of options IT shops can find in the public cloud world,” Gens said.
Some options will include on-demand private clouds, more specialized cloud instances, improved security options, and new developer services, Gens said. The biggest trend on the horizon, however, will combine the cloud and big data. “We see most of these having a very industry- and/or role-specific focus,” he noted.
What improved cloud service options would you like to have available?
When people talk about “going to the cloud,” it’s often viewed within the context of what it means for an organization in terms of workflow and costs. Anjoy Willy, a director at business transformation product provider Trace3, sees the cloud in a different light. “This is really a push for much greater democracy,” he said.
Radio and television are often cited as revolutionary technologies that transformed the way we transmit ideas. While the platforms made it possible to reach large audiences, both are designed for one-way communication. Furthermore, creating content for radio or television initially involved knowing the right people and having the proper resources. That all changed when access to the Internet became mainstream.
The Internet not only made it possible for people to receive messages, but also created an opportunity for a broad range of people to participate in content development and dissemination. Now it’s simple for anyone with access to a computer or wireless device to create a blog or upload their own video to YouTube.
“That is what is incredible about the Internet and the cloud in general,” Willy said. “Take that computing power and give it to somebody for a fraction of the cost of what it used to take.”
As access to the cloud becomes easier and more cost-efficient, the road is being paved for more innovation. “We have more startups now than we did in the dot-com boom,” Willy said.
Do you see the cloud as having a democratizing effect? How have you seen the cloud change the enterprise landscape?
By: Jan Stafford
More and more, enterprise architects are building environments for large-scale middleware testing in the cloud. Why? On-premises systems can accommodate only so many test and development environments, said Steve Millidge, director of C2B2 Consulting in Malvern, UK. For example, many businesses don’t have the physical resources to create a 32-node cluster just to test middleware.
“If an architect suddenly wants, say, 20 servers to run middleware tests, allocating all those servers would be a burden and could reduce performance of core applications,” Millidge said. He’s helped organizations set up middleware testing in AWS, where a test environment can be set up in a single morning, run during the day and torn down in the evening. “Large quantities of tests can be done completely on-demand, which is a fantastic value case,” he said.
C2B2 architects who routinely build and publish images on AWS have found many opportunities for reuse. “If some others need to use an instance, they can easily clone one for a few days, then shut it down again,” said Millidge. Fixed, on-premises servers can’t be built up and torn down so rapidly.
Looking for more advice and info on middleware? Check out Millidge’s report on a UK higher education services project, which enabled uptime and performance for a huge one-day, online event. Expert George Lawton explains how to solve operational business intelligence problems using SOA and middleware. Operational BI is more event-driven than traditional BI, Lawton says, and focuses on using BI for process improvements.
Jan Stafford plans and oversees strategy and operations for TechTarget’s Application Development Media Group. She has covered the computer industry for the last 20-plus years, writing about everything from personal computers to operating systems to server virtualization to application development. E-mail her at firstname.lastname@example.org.