A new project spearheaded by a researcher at Carnegie Mellon University is aiming to use AI to spot examples of fake news. But what at first may sound like a promising and relatively straightforward undertaking may in fact be a monumental lift.
The Fake News Challenge (#FakeNewsChallenge) is offering $2,000 to any programmer who is able to develop an AI algorithm that makes progress toward spotting fake news. But the challenges related to the project get at the very core of what makes true AI so hard.
Take a listen to this podcast to learn more about the Fake News Challenge, how researchers are trying to use AI to address the problem and why this task is so difficult.
In the year just past, data warehousing took a noticeable shift toward the cloud, as Microsoft, Oracle, Snowflake and others played catch-up in pursuit of cloud leader Amazon. Just as remarkable, Hadoop distribution providers found themselves spinning up new versions of their offerings especially tailored to compete with Amazon's style of Hadoop instances. These and other issues are under inspection in an edition of the Talking Data Podcast that looks back at 2016 in earnest and forward to 2017 in anticipation.
The year just past would not have been the same without machine learning, deep learning and artificial intelligence. Trade and popular press alike zeroed in on these analytics trends with great fervor. In this edition of the Talking Data Podcast, TechTarget editors mull the meaning of it all. One question asked: Are we at the point where we can see adoption panning out at scale for machine learning and the like? To find out more, listen here.
Ed Burns looks back at the World of Watson, an IBM event held recently in Las Vegas, and finds much of interest to followers of AI. At WoW, GM said it would work with IBM to improve drivers' experiences in traffic and at the gas pump. Staples rolled out a new version of its "Easy Button" that was hooked up to Watson Natural Language Processing and connected to the Staples order system. These and other use cases show IBM can read the tea leaves. Apple's Siri and Amazon's Alexa are often front and center in AI discussions these days, and it appears the consumer angle is not lost on IBM, as GM, Staples and other Watson use cases indicate. "Household names seem to be out ahead in this," said Burns. Hear more by clicking below to access this edition of the Talking Data podcast.
In some ways, big data these days is very much a conglomeration of open source application frameworks. It started with Apache Hadoop, but it has come to include Apache Spark, Apache Flink, Apache Kudu and others, with more in the wings. In this edition of the Talking Data podcast, Craig Stedman outlines the pros and cons of such bounty. For IT teams, he said, putting together the pieces can be a daunting task. Also discussed is Hadoop on the cloud. Surprisingly, perhaps, Hadoop configurations by and large evolved in on-premises implementations, rather than on the cloud. That is changing quickly as users opt for easy-to-spin-up, pay-as-you-go setups. Learn more. Listen to the podcast.
Oracle Database 12c Release 2 carries forward the company's expectations to compete aggressively with Amazon and its Aurora and Redshift databases in the cloud. While it has been used around the industry for some time, sharding appears as a new feature in the database. This and other Oracle advances, including better in-memory and JSON support and improved multitenancy, are discussed in this podcast, as are the engineered system efforts that set the stage for Oracle's cloud incursions.
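For readers new to the term, sharding splits a large table across independent databases, with each row routed to a shard by a hash of its key. The sketch below illustrates the general hash-ring routing idea only; it is not Oracle's implementation or API, and the shard names are made up for the example.

```python
import hashlib
from bisect import bisect

class ShardRouter:
    """Routes a key to one of several shards via consistent hashing.

    A general illustration of the sharding concept, not any vendor's API.
    """

    def __init__(self, shards, replicas=64):
        # Place several virtual points per shard on a hash ring so keys
        # spread evenly and move minimally when shards are added/removed.
        self.ring = sorted(
            (self._hash(f"{shard}:{i}"), shard)
            for shard in shards
            for i in range(replicas)
        )
        self.keys = [h for h, _ in self.ring]

    @staticmethod
    def _hash(value):
        # Any stable hash works; MD5 is used here only for even spread.
        return int(hashlib.md5(value.encode()).hexdigest(), 16)

    def shard_for(self, key):
        # Walk clockwise on the ring to the first point at or past the key.
        idx = bisect(self.keys, self._hash(key)) % len(self.ring)
        return self.ring[idx][1]

router = ShardRouter(["shard_a", "shard_b", "shard_c"])
print(router.shard_for("customer:1042"))  # always maps to the same shard
```

The payoff of the consistent-hash approach is that adding a fourth shard relocates only a fraction of the keys, rather than reshuffling everything.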
While there is much focus on building out new data-oriented applications using open source data frameworks, some development managers still opt for something more turnkey. This podcast looks at the new tenor of the age-old build vs. buy conundrum from both sides. Actually, there are more than two sides, because 'rent' is becoming part of the equation. Just because you can do it doesn't mean you should, right?
A visit to the MIT Chief Data Officer and Information Quality Symposium was a springboard for musing upon a leadership role that emerged in the wake of the 2008 Wall Street debacle. Back then, the question was ‘where is your data?’ The question now is ‘where are you going with your data, and can a CDO take you there?’ Listen to this podcast.
Many businesses have yet to fully capitalize on their investments in big data and analytics technologies.
That’s one of the main takeaways from a new report from the Economist Intelligence Unit and ZS Associates. Report authors surveyed businesses across the country to gauge their success with analytics technologies and many of the responses were surprising.
Analytics tools are now seen as almost a no-brainer, with value that seems fairly obvious. To be sure, when implemented and used wisely, the tools can be game changers for enterprises. But the report suggests that few are reaping the full promised benefits. Take a listen to this interview with ZS associate principal Dan Wetherill to hear more about why some businesses are struggling.
The opportunity to celebrate 50 years of publication at Computer Weekly also proved the occasion for some musings on the big picture of British computing over the years, when CW's Brian McKenna joined Ed Burns and Jack Vaughan for the Talking Data podcast. The discussion turned to the role of new technologies like Hadoop in London's frenetic financial services industry. It included a consideration of the influence that the vaunted Bletchley Park development center had on the course of U.K. technology.