One of Gartner’s predictions for 2016 is that by 2020, smart agents “will facilitate 40 percent of mobile interactions, and the post-app era will begin to dominate.”
In its annual list of looking ahead, Gartner says that smart agent technologies, which it characterizes as virtual personal assistants (VPAs) or other agents, “will monitor user content and behavior in conjunction with cloud-hosted neural networks to build and maintain data models from which the technology will draw inferences about people, content and contexts.”
In other words, the apps being written over the next few years are going to get smarter about dealing with data. That’s largely the result of underlying algorithms growing increasingly sophisticated, and it’s certainly one reason why the job market for people with analytics expertise is exploding.
Gartner goes on to say that VPAs will be able to predict users’ needs, become trusted, and eventually act autonomously on the user’s behalf.
That’s powerful stuff. Whether we’re ready for this from a societal perspective certainly is fodder for profound discussion. Have at it.
Aside from the impact on civilization, I’m interested in what this means for the development community. This concept goes well beyond traditional coding we’ve all done — transaction processing, database queries, report generation, etc. It’s also very different from the coding that drives applications, such as data communications, packet processing, validation, authentication, load balancing, failover switching, and dozens of others.
As a software developer, do you see smart agents advancing to a degree that they become predictive and able to act on a user’s behalf? Share your opinions; we’d like to hear from you.
Looking back at Oracle OpenWorld 2015, all one can see is clouds. “Cloud, cloud, cloud” was the conference summation by 451 Research analyst Alan Pelz-Sharpe. The interesting thing about the cloud theme, he added, was that no one, not even Oracle chairman Larry Ellison, was saying “jump on the cloud now.” Instead, Oracle came to town with a cloud transition message.
Sure, Oracle was pushing its cloud technologies, but OOW 2015 keynote speakers largely talked about cloud strategies, said Pelz-Sharpe, research director for social business at 451 Research. This stance was refreshingly realistic and rare in the realm of vendor conferences.
Pelz-Sharpe opines on Oracle’s cloud and Java strategies in this video interview with Jan Stafford, SearchCloudApps executive editor.
TechTarget’s coverage of OOW 2015 supports Pelz-Sharpe’s description of Oracle’s cloud and cloud strategy focus there. For example, Ellison touted Oracle’s cloud services in his keynote, but he also stressed that Oracle and its customers are just getting started in adopting cloud services.
To help customers ease legacy applications and development platforms into the hybrid cloud, Oracle touted its Java Standard Edition (SE) Cloud Service and Oracle Integration Cloud Service (ICS). The new SE Cloud Service is a cloud-based platform for Java development that provides the means to move Java SE 7 and 8 applications onto the Oracle cloud platform. Oracle ICS is an application integration PaaS that facilitates point-and-click usage of several Oracle integration suites, including SOA and API management cloud services.
Oracle users do need help with adopting cloud, because cloud will bring substantive changes to their entire enterprise environment, said Melissa English, president of the Oracle Applications Users Group (OAUG), in an OOW 2015 interview with SearchOracle reporter Jessica Sirkin. Indeed, English said, many Oracle users are asking if they have to implement cloud at all. She appreciated OOW speakers’ focus on the benefits of the cloud, which pieces of it to start with, and other how-to, why-to and when-to aspects of cloud adoption.
What OOW 2015 didn’t bring was big news about Java strategies, said Pelz-Sharpe. Check out the video to hear his views on why, and why that absence doesn’t diminish Java’s importance.
We’re seeing an increasing number of cloud and mobile apps turn to Bluetooth for beacon-based location services and other peripheral device connectivity. The number of devices using Bluetooth continues to explode, outstripping the technology’s earlier vision and capabilities. Are you ready to leverage the more-powerful Bluetooth that’s coming your way?
Earlier this week, the Bluetooth Special Interest Group, keeper of the technology’s specs, trademark, and licensing, announced a roadmap for 2016 that encompasses longer range, faster speed, and standardized mesh networking. The key driver for this? IoT, of course.
It’s all summed up in one sentence from Mark Powell, executive director of the Bluetooth SIG:
“The new functionality we will soon be adding will further solidify Bluetooth as the backbone of IoT technology.”
And there you have it.
According to Statista, 2012 saw 3.5 billion BT-enabled devices installed worldwide. The market researcher is projecting the number will soar to 10 billion in 2018. I think that may be low, although slowing growth in the tablet market may have an impact.
There are several interesting aspects to this. The BT SIG plans to:
- Quadruple the range of Bluetooth Smart. That will give a huge boost to smart home and infrastructure applications, allowing them to deliver an extended, more-robust connection for full-home or outdoor use cases.
- Double the speed. Increasing BT speed by 100 percent, without any increase in energy consumption (think battery drain), will enable faster data transfers in critical applications, such as medical devices. You get better responsiveness and lower latency.
- Standardize mesh networking. Mesh will enable Bluetooth devices to connect together in networks that can cover an entire building or home, opening up home and industrial automation applications.
The mesh aspect allows device-to-device communications, eliminating the need for everything to pass through a central air-traffic control tower. While it’s still too early to know where this roadmap will eventually lead, the people who dream up new apps must be salivating at the possibilities.
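The difference between hub-routed and mesh-relayed traffic is easy to picture in code. Here is a toy flood-relay simulation — not the actual Bluetooth mesh protocol, whose details the SIG had not yet published — in which each device rebroadcasts a message to its radio neighbors until a hop limit (TTL) runs out:

```python
from collections import deque

def flood(adjacency, source, ttl):
    """Breadth-first relay of a message through a mesh, hop-limited by TTL."""
    reached = {source}
    frontier = deque([(source, ttl)])
    while frontier:
        node, hops = frontier.popleft()
        if hops == 0:
            continue  # message dies here; no hops left to relay
        for neighbor in adjacency[node]:
            if neighbor not in reached:
                reached.add(neighbor)
                frontier.append((neighbor, hops - 1))
    return reached

# A hallway of ten smart bulbs, each in radio range only of its neighbors.
bulbs = {i: [j for j in (i - 1, i + 1) if 0 <= j <= 9] for i in range(10)}
print(sorted(flood(bulbs, 0, ttl=9)))  # relays carry the message to every bulb
print(sorted(flood(bulbs, 0, ttl=3)))  # a low TTL stops after a few hops
```

With a central hub, bulb 0 could only reach devices in the hub’s radio range; with relaying, coverage grows with every device added.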
Look no further than Apple’s push into its HealthKit platform to surmise that the market for BT-based medical devices is about to explode. Personally, I’d like to see my car communicate and log its own status and health information in detail to my iPhone; that’s a “healthcare” app, too.
Think of this as industrial IoT-based BT, rather than consumer-oriented BT headphones and other similar toys. After all, IoT now represents an enormous market. How big? Toby Nixon, chairman of the Bluetooth SIG Board of Directors, has the answer. “Current projections put the market potential for IoT between $2 trillion and $11.1 trillion by 2025. The technical updates planned for Bluetooth technology in 2016 will help make these expectations a reality and accelerate growth in IoT.” (And yes, that’s quite a “between” fudge factor of $9.1 trillion.)
The addition of mesh networking will strengthen Bluetooth connections by allowing them to bridge from device to device, rather than routing each product through a central hub. These upgrades are just a “technical roadmap” for now, but the SIG says we’ll hear about “additional features” and more details in the coming months.
What are your organization’s plans for leveraging Bluetooth? What apps and products are you going to build to beat up your competitors and make the world a better place? Share your thoughts; we’d like to hear from you.
It’s easy to understand why the CFO likes cloud computing. It’s almost certainly less expensive than running a traditional IT organization and infrastructure.
Perhaps the HR manager likes the cloud, too. It means cutting payroll headcount and issuing fewer ID badges. The facilities manager probably adores the cloud. Fewer employees and a dismantled server infrastructure means lower electricity, lighting, and HVAC consumption. But, what to do with the recovered office space?
And what about end users? Should they like the cloud? I can argue they should not have an opinion either way, because an app and its data should simply work, and whatever lies under the hood should not be visible or identifiable. When I log into my bank or a retailer, I haven’t the slightest clue what’s in the cloud, whether it’s running on AWS or Azure, what’s on-premises, or anything else.
Developers, of course, are nearly universally for the cloud, though I’ve been told the top reason is one that might be a surprise. In a conversation yesterday with Dave Bartoletti, principal analyst at Forrester, he said the top reason is that the cloud lets developers develop faster. All the pieces are in place, and there’s no internal red tape to conquer when asking for resources. “It’s the fastest way to get their job done,” he told me. There’s also the fact that, with cloud expertise in uber-high demand, developer and architect salaries are being driven up.
CIOs lie somewhere in the middle. While they’re certain to be won over by lower costs and faster development cycles, there’s the opposing loss of control when their beloved on-premises infrastructure of long standing gets dismantled.
What about you? What is the one thing about the universe of cloud computing that you like above all others? What’s the one thing you dislike most? Share your thoughts; we’d like to hear from you.
With Oracle using this week’s JavaOne conference to reaffirm its commitment to the Java platform, third-party tool maker Azul Systems debuted an early access program for Java 9 support in Zulu, its open-source build of the Open Java Development Kit (OpenJDK).
Pre-release builds of Zulu supporting Java 9 are available now for the Windows, Linux, and Mac OS platforms. Azul, based in Sunnyvale, Calif., plans to introduce additional pre-release builds throughout the next year, keeping in cadence with OpenJDK Java 9 project advances.
Though the target launch date set by Oracle for Java 9 is still nearly a year away (Sept. 22, 2016), many key aspects are already public. A new modular architecture will let developers select only those features needed for a specific deployment, helpful when scaling down to small devices. Java 9 is also slated to provide improved run-time efficiency for stored class and resource files, faster graphics, a new HTTP client API, and improved keystore security.
In an earlier IT Knowledge Exchange blog post, I wrote about how Adobe, in an astonishingly poor effort to simplify, completely changed the import procedure in its Lightroom application, widely used by photographers. Photography forums went nuclear. In response, Adobe, in its Lightroom Journal advice blog, went so far as to issue an apology on Oct. 9, along with a promise to do better. It has.
Just one week later, on Oct. 16, in the same Lightroom Journal, Adobe’s Tom Hogarty said the company will restore the previous import experience in the next dot-release update. The blog post even provides a link to instructions for rolling back to a prior release for people who cannot wait.
Here’s why this is important. Lightroom does two primary things: non-destructive photo editing that logs every step in a permanent history, allowing users to roll back to any point; and digital asset management to keep track of every image. To do either, you first import the images into the Lightroom catalog, which is actually an SQLite database. What gets imported is not the actual image, but all of its metadata, including the physical disk location. There are also numerous options that can be selected at import time, many of which were eliminated in the simplification.

The catalog is where the step-by-step editing history for each image resides, which can easily balloon the database to gigantic proportions. My own Lightroom catalog file currently encompasses 59,205 images of various file types (RAW camera, PSD Photoshop, TIFF, JPG, PNG, etc.) and is 5.3 GB in size. Though I keep several backups on various network drives and two NAS systems, the live working catalog must be stored locally. The actual image files can live anywhere; Lightroom dutifully keeps track.
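Because the catalog is plain SQLite, it can be inspected with standard tools. A minimal Python sketch — the file name here is hypothetical, and table names vary by Lightroom version, so the safe first step is simply enumerating what the catalog contains:

```python
import sqlite3

def list_tables(catalog_path):
    """Enumerate the tables inside a Lightroom catalog (an SQLite file)."""
    with sqlite3.connect(catalog_path) as conn:
        return [row[0] for row in conn.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name")]

# "MyPhotos.lrcat" is a hypothetical path. Work on a copy, or close Lightroom
# first: the application holds a lock on the live catalog while running.
print(list_tables("MyPhotos.lrcat"))
```

From there, ordinary SQL queries can report image counts, metadata, and history sizes without touching the image files themselves.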
Acknowledging that the ill-fated “simplified” (and I might add, positively awful) import dialog was sprung on users as a surprise, Mr. Hogarty writes, “We will continue to investigate ways to improve the ease of use of our photography products and will do so via an open dialog, with both existing and new customers.”
It’s good to see one of the world’s premier software companies admit it made a mistake. It’s even better to know that Adobe plans to fix it.
Has your company ever rolled out a software update or product that contained features reviled and rejected by users? What was done about it? Perhaps a rollback or maybe a decision to simply ignore the blowback and plow ahead. Share your thoughts, we’d like to hear from you.
Visit any of the TechTarget family of websites or e-publications and Big Data is everywhere. It’s unavoidable. Inescapable. IT isn’t about applications, it’s about the data. Think of all the things we do with big data: collect, validate, authenticate, store, de-dupe, access, secure, encrypt, back up, analyze, query, display, aggregate, report, purge, and integrate into applications. Data is so vital, countries and industrial spies steal it, and corporations sell it.
Sell it? Sure. Data has value. Big value. As a developer, the on-premises, cloud, and mobile apps you build are essentially refineries that transform the crude-oil-like raw ones and zeros of big data into information. Information, in turn, becomes knowledge. Without developers, big data is like a tanker full of crude oil — pretty much useless on its own, but packed with enormous economic value. And as we all know, economic value is power.
Who says so? IBM does. That’s why Big Blue is getting deeper into big data by agreeing to buy the digital business and data assets of Weather Co., corporate parent of the Weather Channel cable network and apps. According to the Wall Street Journal, the deal, which encompasses Weather’s website and app, digital intellectual property, infrastructure, and data, could be valued north of $2 billion. Weather information can be packaged and resold to a wide swath of industries, including airlines, truckers, agribusiness, insurance, electric utilities, pharmaceutical manufacturers, and more. The television network is not part of the package and will remain owned by Comcast’s NBCUniversal, Blackstone Group, and Bain Capital, according to the Journal.
How big is Weather’s big data? According to CNBC, the Weather app ranks fourth on the list of mobile apps used daily in the United States. Weather fields an astounding 26 billion inquiries to its cloud-based services every day. And you think you’ve got performance and response-time issues?
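A quick back-of-the-envelope conversion puts that load in perspective, assuming — generously — that the 26 billion daily inquiries arrive uniformly over the day:

```python
# 26 billion daily inquiries, averaged over the 86,400 seconds in a day.
daily = 26_000_000_000
per_second = daily / (24 * 60 * 60)
print(f"{per_second:,.0f} queries/sec on average")
```

That works out to roughly 300,000 queries per second before accounting for peaks, which real traffic always has.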
There’s a link at the bottom of weather.com for the company’s API and documentation, which it calls “a Weather API designed for developers.” (For whom else would an API be designed?) Pricing is free for a maximum of 500 calls per day, and $200 monthly for up to 100,000 daily calls. Need to do millions of calls? There’s special pricing for that.
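With a hard daily cap on the free tier, a client needs to track its own call budget. Here is a minimal sketch of such a guard — the 500-call limit comes from the pricing described above; the class and everything else in it are illustrative, not part of Weather’s API:

```python
import datetime

class DailyBudget:
    """Client-side guard for an API plan capped at N calls per day."""
    def __init__(self, limit_per_day):
        self.limit = limit_per_day
        self.day = None
        self.used = 0

    def try_acquire(self, today=None):
        today = today or datetime.date.today()
        if today != self.day:        # new day: reset the counter
            self.day, self.used = today, 0
        if self.used >= self.limit:  # over plan: back off or serve cached data
            return False
        self.used += 1
        return True

budget = DailyBudget(limit_per_day=500)  # free-tier cap
if budget.try_acquire():
    pass  # safe to issue the API call here
```

A production version would persist the counter and handle the provider’s own rate-limit responses, but the principle — meter before you call — is the same.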
Clearly, big data isn’t worthless; it’s very valuable. But without application developers, it sure is useless. That’s why all of Weather’s intellectual assets, which no doubt include developers and the rest of IT, are part of the deal.
Finally, there’s that lingering question: Can your business’s databases handle 26 billion queries every day?
We all know that “things” of all kinds, limited only by your imagination, are already demanding all connectivity all the time. But, you ain’t seen nothin’ yet, if the predictions of those in the predicting business are accurate. Are your applications and the platforms they run on robust enough to handle the coming tsunami?
Research firm Gartner two weeks ago published its list of predictions for 2016, among them: “By 2018, six billion connected things will be requesting support.” This is about tech support and also the need to think of devices as your customers. That means a steady flood of automated requests for data and the opposite — devices reporting their status. This isn’t just the “you’re gonna need a bigger boat” stuff of the movie Jaws; it’s about needing a battleship, and a very agile one at that.
Juniper Research is even more aggressive with its IoT device projection. In a July 2015 report, the firm says “the number of IoT connected devices will number 38.5 billion in 2020, up from 13.4 billion in 2015.” That’s nearly a threefold jump. The report goes on to say that connected devices already far outnumber the earth’s population. Yet, “for most enterprises, simply connecting their systems and devices remains the first priority.” One problem cited is conflicting standards.
It appears not to matter if the IoT devices your apps communicate with are consumer oriented (home automation, connected vehicles, healthcare, banking, etc.) or industrial (manufacturing floor, agriculture, power grids, smart office buildings, etc.). It’s a mashup we might as well call “IoT Big Data.”
Even if your applications can handle the huge volumes and velocities of device-driven data, you still need powerful back ends to transform those ones and zeros into something useful. The Juniper report states the case well: “Mere connections create data; however, this does not become information until it is gathered, analyzed and understood.” In other words, it’s not just device communications pipelines that need to be beefed up; it’s also the analytics standing in the background. Data, once it’s understood, is not just information; it’s knowledge.
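Juniper’s gathered/analyzed/understood distinction maps directly onto code. A minimal sketch, with made-up sensor names and thresholds: raw readings (data) are gathered per device, averaged (information), and checked against an alarm threshold (something a person or system can act on):

```python
from collections import defaultdict
from statistics import mean

# Raw "ones and zeros": (device_id, temperature) readings as they arrive.
readings = [("sensor-a", 21.0), ("sensor-b", 40.2), ("sensor-a", 21.4),
            ("sensor-b", 41.0), ("sensor-a", 20.9)]

def summarize(readings, alarm_above=35.0):
    """Gather readings per device, compute averages, and flag what needs attention."""
    by_device = defaultdict(list)
    for device, value in readings:      # gather
        by_device[device].append(value)
    return {device: {"avg": round(mean(values), 1),          # analyze
                     "alarm": mean(values) > alarm_above}    # understand
            for device, values in by_device.items()}

print(summarize(readings))
```

At 38.5 billion devices, the same shape of pipeline simply runs on streaming infrastructure instead of a list, which is exactly the back-end beefing-up the report is talking about.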
Are your applications, storage systems, and analytics ready to handle the coming meteoric rise in connected devices? What is your organization doing to ensure capacity stays ahead of demand? Share your thoughts with your peers. We’d like to hear from you.
Application development isn’t easy. That’s why the profession of software engineering is so valued and why talented cloud and mobile app developers — like you — are continually sought after. That said, how do you feel after pouring your soul into a project, only to see it flop and get yanked mere months after its rollout?
The latest case is the shutdown this week of Amazon Destinations, the retail behemoth’s six-month-old venture into hotel booking. A terse statement on the site says simply, “Effective October 13, Amazon Destinations stopped selling reservations on travel.amazon.com and the Amazon Local app. If you have a reservation, your reservation is valid and will be honored by the hotel. No action is required on your part.” That’s the entire statement.
Think about this for a moment. Technically, Amazon Destinations got the best of everything. It’s a darn good bet this product was running on Amazon Web Services and was developed, tested, and deployed using the very best of AWS’s own tools, perhaps even ones not available to typical AWS subscribers. You know the product was given oodles of compute resources and data-storage space. QA was, no doubt, top notch. Still, work needed to be done. Databases had to be built. External resources, such as databases of hotels and room availability, had to be accessed and integrated. Queries, credit card transaction processing, logging, exception handlers, and user interfaces all had to be designed and built.
So, what went wrong? Perhaps Amazon’s effort was simply far too late to have any impact. There’s already tons of competition from dozens of dot-coms, including Expedia, Hotels, Airbnb, Booking, Trivago, Kayak, Priceline, Hotwire, BookingBuddy, HotelBooking, and many more. I had heard almost nothing of Amazon Destinations, and I’m in the business of writing about such matters. According to Bloomberg Business, Amazon Destinations was launched in April 2015 in an effort to broaden the reach of its Amazon Local platform, which finds discounted, close-to-home shopping deals. Did you know that? I didn’t.
There’s also the growing problem of rogue booking websites and mobile apps that are out-and-out scams, according to a story published this week by NBC News. Amazon Destinations, of course, was most definitely not a scam, but it’s likely that once people find a travel site or app they trust, they’re going to stick with it. In other words, if you’re happy using Expedia, why switch?
As I’ve noted before, migrate a bad application from the on-premises data center into the cloud and you still have a bad application. That’s purely a technical design and coding failure. But, here, the problem is very different: starting out with a bad business concept. Even the best of developers would have no way of knowing. You get the specs and you code your heart out.
Has this happened to you, seeing an app withdrawn or service shut down soon after its launch? How did this impact you, after putting in a huge amount of work and many late nights? Share your experiences; we’d like to hear from you.
Today, let’s do something different. Instead of offering up my own opinions, I’ll let others hang themselves with their own words. Suffice it to say, test your software fully before shipping it out. These aren’t kids cloistered away in their bedrooms building apps for fun; these are commercial developers. Have comments on this? Well, sure you do — you’re a software developer, too. What do you think? Join the discussion; we’d like to hear from you.
Adobe royally messes up this week’s updates to Lightroom (desktop) and Lightroom CC (cloud).
In an effort to simplify the process of importing images from a camera into the Lightroom catalog, Adobe broke it. Broke it so badly that the various photography forums are ablaze with anger. Today (Fri., Oct. 9, 2015), in the Adobe Lightroom Journal, branded as “tips and advice straight from the Lightroom team,” Adobe’s Tom Hogarty, writing on behalf of the Lightroom Management Team, falls on his sword:
Lightroom 6.2 Release Update and Apology
I’d like to personally apologize for the quality of the Lightroom 6.2 release we shipped on Monday. The team cares passionately about our product and our customers and we failed on multiple fronts with this release. In our efforts to simplify the import experience we introduced instability that resulted in a significant crashing bug. The scope of that bug was unclear and we made the incorrect decision to ship with the bug while we continued to search for a reproducible case. (Reproducible cases are essential for allowing an engineer to solve a problem.) The bug has been fixed and today’s update addresses the stability of Lightroom 6.
The simplification of the import experience was also handled poorly. Our customers, educators and research team have been clear on this topic: The import experience in Lightroom is daunting. It’s a step that every customer must successfully take in order to use the product and overwhelming customers with every option in a single screen was not a tenable path forward. We made decisions on sensible defaults and placed many of the controls behind a settings panel. At the same time we removed some of our very low usage features to further reduce complexity and improve quality. These changes were not communicated properly or openly before launch. Lightroom was created in 2006 via a 14 month public beta in a dialog with the photography community. In making these changes without a broader dialog I’ve failed the original core values of the product and the team.
The team will continue to work hard to earn your trust back in subsequent releases and I look forward to reigniting the type of dialog we started in 2006.
Epicurious, the foodie website, messes up its mobile apps beyond repair. The apps have to be completely withdrawn. Really.
Read portions of what Eric Gillin, executive director of Epicurious, writes on the Epicurious website on Tues., Oct. 6, 2015 (excerpted):
Important Announcement for All Epicurious App Users
If you’re using our app on Apple or Windows, it’s critical you upgrade it immediately. If you’re using it on another platform, sadly, we no longer support those apps. Read on for more information.
Effective immediately, our apps for Apple and Windows will no longer work unless you update them. On that same date, our apps for Android, Nook, and Kindle will stop working properly.
IF YOU USE THE EPICURIOUS APP ON APPLE OR WINDOWS TABLETS OR PHONES
You must update the Epicurious app. Upgrading your app is critical. It will no longer work properly if you remain on an older version. If you’re unable to install the update for some reason, you can still view our recipes — and your recipe box — right here on our website.
IF YOU’VE BEEN USING THE EPICURIOUS APP ON ANDROID, KINDLE, OR NOOK DEVICES
These apps will no longer work properly. Because we can no longer support these apps, we will be removing them from stores. We’ll be working to create new versions that include all of the functionality you’ve come to expect from us.
We’re deeply sorry if this causes frustration. We’re doing this because we want to make sure that we offer the best possible experience and sadly, it’s no longer possible for us to support older devices and every single platform.
Do I need to say anything else? What do you have to say?