Fuzz testing, or fuzzing, is a software testing technique used to discover coding errors and security loopholes in software, operating systems or networks by inputting massive amounts of random data, called fuzz, to the system in an attempt to make it crash. If a vulnerability is found, a tool called a fuzz tester (or fuzzer) indicates potential causes. Fuzz testing was originally developed by Barton Miller at the University of Wisconsin in 1989. Continued…
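The core idea can be sketched in a few lines: generate random inputs, feed them to the target function, and record any input that makes it fail. This is a minimal toy sketch, not a production fuzzer; `parse_age` is a hypothetical function with a deliberate latent bug.

```python
import random
import string

def fuzz(target, iterations=1000, max_len=100, seed=0):
    """Feed random strings to `target` and collect inputs that crash it."""
    rng = random.Random(seed)
    crashes = []
    for _ in range(iterations):
        data = "".join(rng.choice(string.printable)
                       for _ in range(rng.randint(0, max_len)))
        try:
            target(data)
        except Exception:
            crashes.append(data)  # record the input that triggered a failure
    return crashes

def parse_age(text):
    """Toy parser with a latent bug: crashes on any non-numeric input."""
    return int(text.strip())

crashing_inputs = fuzz(parse_age)
```

Real fuzzers such as AFL or libFuzzer add coverage feedback and input mutation, but the loop above captures the essential crash-hunting technique.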
Quote of the Day
“Fuzz testing is most useful for software that accepts input documents, images, videos or files that can carry harmful content. These are the serious bugs that it’s worth investing to prevent.” – David Molnar
Microsoft previews Project Springfield, Azure-based fuzz testing
Microsoft previews Project Springfield, Azure cloud-based fuzz testing service, at Ignite 2016.
What is fuzz testing? What are some ways to use fuzz testing?
Fuzz testing is a form of black box testing where large amounts of data in varying formats are sent to the inputs of a program such as a Web application.
The benefits of remote debugging techniques in the cloud
Using remote debugging requires IT professionals to have refined debugging techniques so the benefits of using these tools are maximized.
Can a new encryption trick prevent reverse engineering?
Learn about Hardened Anti-Reverse Engineering System, or HARES, which cannot exactly prevent reverse engineering, but makes it a little tougher for attackers to complete.
Inside the changing DDoS threat and how to mitigate it
Attackers have devised new types of distributed denial-of-service attacks. Here’s how the DDoS threat has evolved.
Real-time analytics is the use of, or the capacity to use, data and related resources as soon as the data enters the system. The adjective real-time refers to a level of computer responsiveness that a user senses as immediate or nearly immediate. Continued…
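One common real-time analytics building block is a sliding-window aggregate that updates in constant time as each new event arrives, rather than re-scanning stored data. A minimal sketch, with illustrative sensor readings as sample data:

```python
from collections import deque

class SlidingWindowAverage:
    """Maintain a rolling average over the most recent `size` events,
    updated in O(1) as each new data point arrives."""
    def __init__(self, size):
        self.size = size
        self.window = deque()
        self.total = 0.0

    def update(self, value):
        self.window.append(value)
        self.total += value
        if len(self.window) > self.size:
            self.total -= self.window.popleft()  # evict the oldest reading
        return self.total / len(self.window)

avg = SlidingWindowAverage(size=3)
readings = [10, 20, 30, 40]
results = [avg.update(r) for r in readings]
# results -> [10.0, 15.0, 20.0, 30.0]
```

Stream processing engines such as Apache Flink generalize this pattern to distributed, fault-tolerant windows over event streams.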
Quote of the Day
“Any large company has a number of applications in which their business processes would be smarter and more effective if they were using real-time analytics.” – Roy Schulte
Users look to real-time streaming to speed up big data analytics
Real-time streaming and analytics systems are becoming more common in IT architectures as organizations try to accelerate their big data analytics efforts.
Streaming data systems take big data analytics into real-time realm
Stream processing is a niche application, but streaming data platforms tied to Hadoop clusters can help users process large amounts of information for big data analytics uses.
Process real-time data analytics options
The amount of data has hit an all-time high in enterprise data centers. Real-time data analytics allows you to process important data as fast as you need. Which options should you implement to analyze data in real time?
Real-time data analytics on IoT info starts with solid architecture
IoT opens up new avenues for real-time data analytics. Consultant David Loshin has tips on planning the required IoT data architecture.
Streaming analytics accepts no delays in examining big data
Streaming analytics favors Apache Flink over Apache Spark to eliminate microbatching of data and analyze events in real time.
A T-shaped employee, in the context of human resources, is an individual who has deep knowledge and skills in a particular area of specialization, along with the desire and ability to make connections across disciplines. Continued…
Quote of the Day
“Who you want is known as a T-shaped person. Yes, you want people who are experts in their area, but they also need to have broad expertise. They need to be able to pick up any work that needs to get done and help move the work along.” – Joseph Flahiff
How to hire developers without really trying too hard
When you’re wondering how to hire developers, there’s no one better to ask than the CIO of a human capital management firm. Here are some insider secrets.
How do you fill DevOps jobs? Hire squirrels
Enterprises that have successfully filled DevOps jobs learned to focus on attitude and mindset over specific skills.
How organizational agility will save and destroy your company
Achieving organizational agility will require your company to change six ways it is currently doing business, explains organizational expert Joseph Flahiff.
For IT security and compliance jobs, experience a must
IT security and compliance job candidates are in high demand, forcing companies to closely examine potential employees’ past experience to offset a potential GRC skills shortage.
Compose an enduring DevOps organization structure
Learn who’s who in a DevOps organizational structure, the skills you need to join it, and helpful tips for engaging multidiscipline groups on one goal.
An IT control is a procedure or policy that provides a reasonable assurance that the information technology (IT) used by an organization operates as intended, that data is reliable and that the organization is in compliance with applicable laws and regulations. IT controls can be categorized as either general controls (ITGC) or application controls (ITAC). Continued…
Quote of the Day
“Results of the ITGC audit also provide an effective assessment of the risk level to the infrastructure. They identify areas where improvement is needed, which can help reduce risk.” – Paul Kirvan
Six ITGC audit controls to improve business continuity
A proper ITGC audit analyzes security issues, management, and backup and recovery. Explore six controls to audit and steps for how to complete the process.
IT controls help compliance programs deal with changing regulations
Cataloging regulations, framework objectives and IT controls can simplify compliance programs and help compliance officers better deal with changing regulations.
IT risk assessment methodology guide for disaster recovery planners
In our IT risk assessment methodology guide, learn about how to perform a risk assessment, and why it’s an important part of disaster recovery planning.
Best practices for an information security assessment
An information security assessment is a good way to measure the security risk present in your organization. Find out how to yield effective results.
A ‘cloud first’ strategy calls for strong security: Five tips to get there
How to think ‘cloud first,’ the power of next-gen firewalls, why DDoS attacks are on the rise and more: The Data Mill reports.
Shift left testing is an approach used to speed software testing and facilitate development by moving the testing process to an earlier point in the development cycle. Shifting left is a reference to moving testing to the left on a timeline. Continued…
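In practice, shifting left often means developers write tests alongside the code and run them on every commit, instead of deferring verification to a separate QA phase. A minimal sketch with a hypothetical `apply_discount` function:

```python
def apply_discount(price, percent):
    """Business logic under development."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

# Shift-left: the test lives next to the code and runs early and often,
# rather than at the end of the development cycle.
def test_apply_discount():
    assert apply_discount(100.0, 25) == 75.0
    assert apply_discount(19.99, 0) == 19.99
    try:
        apply_discount(10.0, 150)
    except ValueError:
        pass  # invalid input is rejected, as expected
    else:
        raise AssertionError("expected ValueError for out-of-range percent")

test_apply_discount()
```

In a real pipeline, a test runner such as pytest would discover and execute these tests automatically on each commit.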
Quote of the Day
“Testing in short Agile iterations often necessitates a ‘shift left’ model, an approach in which testing starts much earlier in the application lifecycle.” – Adrian Bridgwater
SmartBear puts ‘earlier’ testing inside IDEs
SmartBear’s new developer-focused test automation tool is called TestLeft.
How to do DevOps so you can avoid disaster and disappointment
If you want to do DevOps correctly, you need to think about the team as a whole. Expert Bert Jan Schrijver offers firsthand advice.
When it comes to DevOps and testing, change is certain
When DevOps and testing work together, the only thing that is certain is that change will happen. Expert Stephen Elliot explains.
How should organizations approach the DevOps movement?
Is DevOps just a buzzword or will it have long-term consequences for IT operations?
The future of software testing: It’s a brave new world
From mobile to the cloud and IoT, the future of software testing is going to look nothing like today. Let expert Gerie Owen guide you in the first of her 11-part series.
Information governance is a holistic approach to managing corporate information by implementing processes, roles, controls and metrics that treat information as a valuable business asset. Continued…
Quote of the Day
“Information governance strategies are critical to risk management. Effective information governance helps deliver the objective, authoritative data that risk management requires to be effective.” – Jeffrey Ritter
Examining the top data governance tools on the market
By examining the top data governance tools and the key criteria presented here, you can select the product that best fits your organization’s needs.
New risk management needs challenge information governance processes
Information governance is complicated by businesses’ risk abatement needs, but data management expert Jeffrey Ritter says they could be mutually beneficial with the right approach.
Protecting and managing your company’s proprietary information
In this excerpt of Information Governance and Security, authors John G. Iannarelli and Michael O’Shaughnessy offer tips for establishing guidelines for all departments or sectors of a business.
Can aligning compliance and information governance create new revenue?
Aligning companies’ compliance and information governance processes produces several business benefits, including potential new sources of revenue.
Merge big data governance with data validation to create new revenue
Learn how big data governance and data validation can be used to generate revenue for businesses, explained by information governance expert Jeffrey Ritter.
A quant (quantitative analyst) is a financial industry professional whose qualifications also include advanced mathematics and computer skills. Continued…
Quote of the Day
“Technology shifts have made it a lot simpler to harvest the value of data. Quants are being replaced by a new population: the data scientists.” – Yves de Montcheuil
Data Scientist: the New Quant
Today, the IT industry is only scratching the surface of the potential of big data. Data scientists hold the keys to that potential and are the new statisticians, or quants.
Case study: Hedge fund AHL Man Group uses MongoDB to feed quants with data
Hedge fund AHL Man Group has replaced a range of disparate relational databases with a single data platform built on NoSQL database MongoDB for financial market data.
Tom Davenport on using analytics to influence business decision-makers
Analytics guru Tom Davenport gives a peek into his latest book, ‘Keeping Up with the Quants: Your Guide to Understanding and Using Analytics.’
Understanding big data analytics
Big data analytics is an emerging term in the storage industry. We identify the characteristics common to the technologies identified with big data analytics and explore the four major developmental segments that underlie the diversity found within big data analytics.
Building data science teams takes skills mix, business focus
Analytics managers offered advice on building data science teams as part of a Strata + Hadoop World 2016 panel discussion on finding and retaining data scientists.
A Hadoop cluster is a special type of computational cluster designed specifically for storing and analyzing huge amounts of unstructured data in a distributed computing environment. Continued…
Quote of the Day
“A Hadoop cluster based on commodity hardware is better positioned to scale incrementally, and the required investment in a few more compute nodes and storage devices should be relatively small.” – David Loshin
Get your analytics architecture right to support scalable tools
Analytics teams can’t overlook the importance of setting up a scalable analytics architecture before deploying data tools and models to lines of business.
New tools offer a better view into managing Hadoop clusters
When Hadoop clusters move from pilots to production, big data management tools from LinkedIn, BlueData and Hortonworks may help ease the transition.
Make the right choice between Hadoop clusters and a data warehouse
A comparison of specific criteria and variables can help organizations decide whether their data processing needs are best met by Hadoop clusters or an enterprise data warehouse.
Managing Hadoop projects: What you need to know to succeed
Many companies are implementing Hadoop projects to manage pools of big data. This guide offers different perspectives on the Hadoop framework, explaining what Hadoop can and can’t do.
How big data and Hadoop will change storage management
Big data analytics creates some challenges for storage managers, but effective integration of big data and Hadoop into a storage environment can make them less daunting.
Sharding is a type of database partitioning that separates very large databases into smaller, faster, more easily managed parts called data shards. The word shard means a small part of a whole. Continued…
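The most common routing scheme is hash-based sharding: a stable hash of the record's key determines which shard stores it, so any node can locate a row without a central lookup. A minimal sketch, assuming four shards and illustrative customer IDs:

```python
import hashlib

def shard_for(key, num_shards):
    """Map a record key to a shard deterministically via a stable hash.

    MD5 is used here only because it is stable across processes, unlike
    Python's built-in hash(); it is not being used for security.
    """
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return int(digest, 16) % num_shards

# Route each customer row to one of four shards.
shards = {i: [] for i in range(4)}
for customer_id in ["alice", "bob", "carol", "dave"]:
    shards[shard_for(customer_id, 4)].append(customer_id)
```

Production systems often use consistent hashing instead, so that adding a shard relocates only a small fraction of the keys rather than nearly all of them.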
Quote of the Day
“Database sharding isn’t anything like clustering database servers, virtualizing datastores or partitioning tables. It goes far beyond all of that.” – Jason Tee
Shards of Oracle: Distributed performance improved in Oracle 12c Release 2
Oracle sharding takes a page from playbooks of Amazon and others. With Oracle 12c Release 2, it joins Real Application Clusters in the company’s toolkit.
RAID systems yield to erasure coding for data protection
RAID systems have been the pre-eminent data protection choice for decades, but data growth and high-capacity systems have led to a rise in popularity of erasure coding.
Sharding in the Cloud
I/O reads to non-virtualized datastores can be one of the biggest bottlenecks in applications that have been deployed to the cloud. One of the easiest ways to avoid this problem is to simply shard your database.
Like Americanos? Then you’ll love distributed storage
Read replication, sharding and consistent hashing made simple. The Data Mill reports.
Sharding relational databases in the cloud
Discover why best practices in relational database sharding for big data in cloud computing databases are an essential part of the data management process.
Social media analytics is the practice of gathering data from blogs and social media websites and analyzing that data to make business decisions. The most common use of social media analytics is to mine customer sentiment in order to support marketing and customer service activities. Continued…
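At its simplest, mining customer sentiment means scoring each post against lexicons of positive and negative words. This is a deliberately tiny sketch; real systems use trained models or lexicons with thousands of entries.

```python
# Toy lexicons; a hypothetical, illustrative word list.
POSITIVE = {"love", "great", "excellent", "happy"}
NEGATIVE = {"hate", "terrible", "broken", "slow"}

def sentiment(post):
    """Score a post: positive (> 0), negative (< 0), or neutral (0)."""
    words = post.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

posts = ["I love this product, great support!",
         "App is slow and the checkout is broken"]
scores = [sentiment(p) for p in posts]
# scores -> [2, -2]
```

A marketing or customer service team would aggregate such scores over time and by topic to spot emerging praise or complaints.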
Quote of the Day
“Social media analytics also differs from traditional analytics in its complete reliance on online data as its data source.” – Reda Chouffani
How social media and analytics have redefined politics
Social media and analytics have had a major effect on electoral politics, as well as enterprise operations.
Tapping the potential of social media analytics tools
Social media analytics tools can help companies find and make sense of valuable customer data. This guide examines the potential and the challenges that come with social media analytics.
Social media monitoring tools rarely used to full potential
Social media monitoring tools give companies the data they need to build targeted marketing strategies, if they successfully analyze social data.
Social media monitoring tools: What they can and can’t do
Social media analytics has the potential to give companies valuable insight into customer data in new ways. Test your understanding of social media monitoring tools by taking this quiz.
Five key questions about social media monitoring and analytics
For enterprises looking to derive business value from social media monitoring and analytics, our expert has answers.