SWOT analysis (strengths, weaknesses, opportunities and threats analysis) is a brainstorming exercise that helps identify the internal and external factors that will affect the ability of a project, product, place or person to succeed. Continued…
Quote of the Day
“Successful completion of the strategic planning process means the newly implemented strategies must be periodically evaluated and updated as needed.” – Paul Kirvan
Your business continuity process can enhance strategic planning
Learn how activities performed in your business continuity process can foster long-term strategic planning. We note the five areas where a business impact analysis can add value.
What-if business planning simulation at its predictive best
As analytics grows more advanced, large-scale what-if planning simulations are becoming easier for businesses to implement.
Enterprise digital transformation: How CIOs can drive business growth
Stephen Hamrick, vice president of product management for SAP Jam, explains how enterprise digital transformation is changing the role of the CIO in business process development.
CIO playbook for 2018: Leading analysts break down the trends
What’s in the 2018 CIO playbook? Ten leading analysts lay out the top issues: Read what they say about cloud consolidation, networks, API management and more.
CMS analytics arms businesses with a strategic planning edge
CMS analytics helps large enterprises get business value from unstructured data in their networks, delivering content to the people who need it.
Digital transformation is tied to the broader trend of business transformation and takes ______ to take hold.
Continuous deployment is a software release strategy in which any code commit that passes automated testing is immediately released into the production environment. Continued…
Quote of the Day
“Continuous deployment skips the operations oversight step in software production. So, automated tools must ensure code success in administrators’ stead — before mistakes go live.” – Adam Bertram
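The automated gate the quote describes can be reduced to a simple rule: a commit ships to production only if the test suite passes, with no manual sign-off in between. Below is a minimal sketch of that gate; the `run_tests` and `deploy` hooks are hypothetical stand-ins for whatever a real pipeline wires in, not any CI tool's actual API.

```python
def deploy_if_tests_pass(commit, run_tests, deploy):
    """Continuous deployment gate: a commit that passes automated
    testing is released immediately; a failing commit never ships."""
    if run_tests(commit):
        deploy(commit)
        return "deployed"
    return "blocked"

# Illustration with stand-in hooks (hypothetical, not a real CI API):
released = []
print(deploy_if_tests_pass("abc123", run_tests=lambda c: True,
                           deploy=released.append))   # deployed
print(deploy_if_tests_pass("def456", run_tests=lambda c: False,
                           deploy=released.append))   # blocked
print(released)  # only the passing commit reached production
```

The point of the sketch is that the decision is mechanical: whatever quality checks would normally be performed by operations staff must be encoded in `run_tests`, because nothing downstream of it blocks the release.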
Grow rapidly into a continuous delivery pipeline
For one organization modernizing applications, the logical next step was to update operations as well — first with a continuous delivery pipeline, then eventually full continuous deployment.
Understand continuous delivery and continuous deployment
To the average software user, continuous delivery and continuous deployment mean the same thing. However, there are significant differences you should know.
Deploy patches safely and sanely in a CI/CD workflow
Deploy patches to a CI/CD DevOps workflow without causing disruptions or failures by automating where possible.
Automated continuous deployment tools produce successful code
Continuous deployment tools that employ automation capabilities help ensure that the last phase of the pipeline goes off without a hitch.
What the CI process can and can’t do for software delivery
Critics initially scoffed at some DevOps strategies, including continuous integration. But, today, the CI process is a crucial part of faster app deployment pipelines.
Treating testing as a ________ rather than something that should be done continuously throughout software development is a recipe for defective software.
Infrastructure as code (IaC) is an approach to software development that treats physical compute, storage and network fabric resources as web services and allows apps to run where they are best suited, based on cost and performance data.
Essentially, IaC negates the need for software engineers to be concerned with the physical location of infrastructure components. Instead, when a software application requests infrastructure to run, available services are located through an automated discovery process and resources are allocated on demand. When an infrastructure resource is no longer required, it is re-appropriated so it can be allocated to another application that needs it.
Examples of IaC tools include AWS CloudFormation, Red Hat Ansible, Chef, Puppet, SaltStack and HashiCorp Terraform. Each of these tools has its own way of defining infrastructure, and each allows an administrator to define a service without having to configure a physical infrastructure. These tools are also able to roll back changes to the code, should an unexpected problem arise when new code is released.
Some IaC tools rely on a domain-specific language (DSL), while others use a standard template format, such as YAML or JSON. When selecting an IaC tool, organizations should consider the target deployment. For example, AWS CloudFormation is designed to provision and manage infrastructure on AWS and works well with other AWS offerings. Alternatively, Chef works with on-premises servers and multiple cloud provider IaC offerings.
IaC can be managed through the same version control and automated testing procedures that developers use to maintain quality assurance (QA) in their continuous integration and continuous delivery (CI/CD) pipelines. As of this writing, there are no agreed-upon standards for implementing IaC and the concept is known by several other names, including composable infrastructure, programmable infrastructure and software-defined infrastructure. Continued…
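The common idea behind the declarative tools named above is reconciliation: the administrator defines the desired infrastructure, the tool diffs that definition against the actual deployed state, and it creates, updates or destroys resources to close the gap. Here is a minimal sketch of that diffing step; the resource names and attributes are illustrative, not any tool's real schema.

```python
def plan(desired: dict, actual: dict) -> dict:
    """Diff a declared infrastructure spec against observed state --
    the core reconciliation step in declarative IaC tools."""
    create = {k: v for k, v in desired.items() if k not in actual}
    update = {k: v for k, v in desired.items()
              if k in actual and actual[k] != v}
    destroy = [k for k in actual if k not in desired]
    return {"create": create, "update": update, "destroy": destroy}

# Hypothetical resources for illustration:
desired = {"web": {"size": "m5.large", "count": 3},
           "db": {"size": "db.r5.xlarge", "count": 1}}
actual = {"web": {"size": "m5.large", "count": 2},
          "cache": {"size": "t3.small", "count": 1}}
print(plan(desired, actual))
```

Because the plan is computed from the declared state rather than from a sequence of manual steps, rerunning it is idempotent, and rolling back means reapplying an earlier version of the definition from version control.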
Quote of the Day
“While IT organizations are catching on to the benefits of infrastructure as code, the majority haven’t achieved compliance automation, despite a swath of available tools for the job.” – Kurt Marko
Compliance automation prevents regulation audit snafus
Make compliance automation part of the move to infrastructure as code. The tools available from Chef, Ansible and other vendors fit the bill.
How infrastructure as code benefits and eases provisioning
With infrastructure as code, organizations can automate provisioning. Among the benefits: IT teams can use scripts to deploy infrastructure more efficiently, and organizations can avoid vendor lock-in.
Benefits of infrastructure as code range from speed to scaling
IT organizations must learn new skills, and possibly new tools, to reap the benefits of infrastructure as code. The effort pays off with enhanced automation and consistency, among other rewards.
Infrastructure governance keeps systems in compliance
Software-defined cloud infrastructure configurations may drift out of compliance over time. Automated infrastructure governance is always on guard.
Become an infrastructure developer to get code to production faster
The infrastructure developer job role combines skills from software programming with knowledge of the base layers of the IT stack to enable fast, stable code rollouts.
Successfully _______ a DevOps culture in a data center isn’t easy, but it brings great rewards.
Value-based healthcare is a business model that seeks to reduce the cost of healthcare services by providing patients with the right care from the right provider at the right time. Continued…
Quote of the Day
“As value-based healthcare becomes more established, payer-provider partnerships can be advantageous because their collaboration can yield a truer view of patient data through the combined use of analytics, strategies, workflows and IT solutions.” – Megan Charles
Payer-provider partnerships spurred by data, value-based care
Payer-provider partnerships are becoming increasingly common in healthcare in response to the transition to value-based care and greater adoption of EHRs. But trust between the organizations must be established first to make the collaboration work.
March to value-based healthcare results in ‘payviders’
As value-based healthcare presses providers and insurance companies to explore options for measuring care quality, some healthcare organizations have opted to create their own health plans — becoming payviders.
Medical group sees the future in value-based healthcare
The Hatfield Medical Group in Arizona is going all out to become a full-time provider of value-based healthcare. But the move takes time — not all payers are on board — and it requires passion and investment in technology and staff.
Can value-based care models cut costs? Yes, they can
Value-based care models have come into their own, according to a new study by Change Healthcare. The cost savings are clear, but the challenges remain.
Hospitals battle duplicate medical records with technology
As the healthcare industry edges toward value-based care, duplicate medical records are not only a burr in the side of health IT professionals, but they also raise questions about the quality of patient care. Learn what to do when patient information doesn’t match up.
The FDA reclassified sutures as Class II medical devices in the early ______.
Cloud automation is a broad term that refers to the processes and tools an organization uses to reduce the manual efforts associated with provisioning and managing cloud computing workloads. Continued…
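One common manual effort that cloud automation replaces is capacity management: rather than an administrator adding and removing instances by hand, a policy decides when to scale. The sketch below shows a toy version of such a rule; the thresholds and counts are illustrative, not any provider's defaults.

```python
def scale_decision(cpu_utilization, current_count, low=0.30, high=0.75,
                   min_count=1, max_count=10):
    """Toy autoscaling rule of the kind cloud automation encodes:
    add capacity under load, reclaim it when idle. Thresholds are
    illustrative assumptions, not any cloud provider's defaults."""
    if cpu_utilization > high and current_count < max_count:
        return current_count + 1   # scale out
    if cpu_utilization < low and current_count > min_count:
        return current_count - 1   # scale in
    return current_count           # hold steady

print(scale_decision(0.90, 3))  # 4: scale out under load
print(scale_decision(0.10, 3))  # 2: reclaim idle capacity
print(scale_decision(0.50, 3))  # 3: within band, no change
```

A real automation tool layers orchestration on top of rules like this, so the scale-out step also handles provisioning, configuration and load-balancer registration in one workflow.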
Quote of the Day
“Together, cloud automation and orchestration provide more consistent and predictable workflows, something that’s especially important in hybrid environments.” – Kathleen Casey
Here’s what it takes to become a cloud automation engineer
Interested in becoming a cloud automation engineer? The IT role often requires years of experience in both development and operations, using a wide range of tools and platforms. Review the requirements and skill sets that prospective employers look for in a candidate.
Brush up your cloud automation and orchestration skills
Cloud automation and orchestration are critical technologies that save IT shops time and money. Use these five tips to implement them correctly.
Automation poses risk and reward for cloud operations teams
Enterprises increasingly automate cloud computing tasks, such as disaster recovery and resource management. While this shift might worry some IT staff, there are ways cloud operations teams can maintain relevancy.
Embrace automation in a hybrid cloud deployment: Free chapter
A hybrid cloud deployment introduces a lot of IT complexity — but automation, when properly implemented, helps simplify management. Use this free chapter from The Evolution of Cloud Computing by Clive Longbottom to keep your hybrid environment running smoothly.
Compare two types of cloud automation and orchestration tools
Cloud automation and orchestration get complicated, especially with multiple platforms in play. Assess whether native or third-party tools meet your needs.
Brokers must work closely with cloud providers, while also _________ their services to appeal to customers.
BeeMe is a web-based social experiment that will be hosted by the Massachusetts Institute of Technology Media Lab at 11pm on October 31, 2018. Continued…
Quote of the Day
“Crowdsourcing relies on the principle that many brains are better than one and, when brought together, can innovate or problem-solve far more effectively than when people work and brainstorm on their own.” – Lauren Horwitz
Humans and AI tools go hand in hand in analytics applications
Organizations are pairing up humans and AI tools in analytics applications in an effort to ensure that the output of machine learning algorithms and other forms of artificial intelligence is accurate and relevant.
Robot social engineering works because people personify robots
Robot social engineering could be a viable attack vector in the future, according to Brittany ‘Straithe’ Postnikoff, both because of the various social abilities that robots can use and because robot manufacturers don’t focus on security.
Crowdsourcing gets street cred with cognitive computing software
Cognitive computing software is starting to have a real-world impact on enterprise processes and is fueling innovation in approaches like crowdsourcing.
Can crowdsourcing ideas boost customer experience?
Crowdsourcing ideas and product innovation have become more than just novel experiments in several industries.
AI chatbots can provide business value when used wisely
Businesses find value in AI chatbots when humans remain on the customer support team to provide empathy and address complex issues.
Some people join social networking sites but don’t ________ them often.
Red Hat is a software company that assembles open source components for the Linux operating system and related programs into distribution packages that can easily be ordered and implemented. Continued…
Quote of the Day
“IBM and Red Hat plan to build a raft of complementary open source products to capture a huge hunk of the hybrid cloud market, but not offend their respective partners.” – Ed Scannell
IBM acquires Red Hat in $34bn hybrid cloud push
IBM and Red Hat senior leadership teams confirm imminent merger, which they claim will help accelerate migration of enterprise workloads to the hybrid cloud.
IBM, Red Hat customers should watch acquisition closely
Red Hat customers should expect the company to remain autonomous after IBM completes its landmark $34 billion deal, analysts say.
IBM-Red Hat juggernaut targets hybrid clouds
The proposed IBM-Red Hat combination aims to pair the two companies’ strengths in open source technologies to establish a beachhead in the hybrid cloud market. But the companies must sort out overlaps between their offerings and their partnerships with competitors.
Ansible roadmap steers toward network, security via integrations
Ansible packed its roadmap with developments largely aimed at networking and security specialists who need to speak the same automation language as infrastructure and application teams.
Your Red Hat Hyperconverged Infrastructure questions answered
The Red Hat Hyperconverged Infrastructure platform hit the market almost exactly one year ago. The company combined existing open source software offerings already in its portfolio into a suite for organizations looking for a DIY HCI system that can be deployed on existing hardware.
In a hybrid cloud, sensitive data and computing resources may be maintained in two _________ environments.
A mission-critical application is a software program or suite of related programs that must function continuously in order for a business or segment of a business to be successful. Continued…
Quote of the Day
“When deploying a mission-critical application, we need to test and validate both the functionality of the application and the capabilities of the infrastructure.” – Niel Nickolaisen
Cloud and containers rewrite monitoring and management rulebooks
Modern IT is undeniably more flexible with the abstraction layers of cloud and containers — but is it still hitting performance goals? Only if ops teams change tactics.
Mission-critical IoT: What every business needs to know to prosper
Keysight Technologies’ Cheryl Ajluni offers three things any business or product maker can do starting today to improve their chance of mission-critical IoT success.
Dell EMC: Time for hyper-converged, but not for mission critical
Hyper-converged vs converged infrastructure: Dell EMC says hyper-converged isn’t ready for mission-critical applications, but may be by the end of the year.
Evolving data protection technologies require attention
There are lots of threats out there, but these are ‘exciting times’ for data protection technologies, according to Arcserve’s vice president of products.
Puppet on why people, process and technology are mission critical to DevOps success
At the PuppetConf user and partner summit in San Francisco, Puppet and its customers shared insights on how best to address the people, process and technology sides of the DevOps success equation.
An operating system (OS) is the ________ that, after being initially loaded into the computer by a boot program, manages all the other programs in a computer.
Net Promoter Score (NPS) is a metric for assessing customer loyalty for a company’s brand, products or services. Continued…
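The NPS calculation itself is standard: respondents rate how likely they are to recommend the company on a 0-10 scale, those scoring 9-10 are promoters, 0-6 are detractors, and 7-8 are passives. The score is the percentage of promoters minus the percentage of detractors, giving a range of -100 to +100. A minimal sketch:

```python
def net_promoter_score(ratings):
    """NPS = % promoters (scores 9-10) minus % detractors (scores 0-6).
    Passives (7-8) count toward the total but toward neither bucket.
    The result ranges from -100 to +100."""
    n = len(ratings)
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / n)

# 3 promoters, 1 detractor, 2 passives out of 6 responses:
print(net_promoter_score([10, 9, 8, 7, 6, 10]))  # 33
```

Because passives dilute both percentages without entering either bucket, two survey pools with the same promoter share can still produce different scores.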
Quote of the Day
“NPS can help CIOs continually rate and iterate on how they’re serving the customer, which could be especially useful when introducing change to the organization.” – Nicole Laskowski
Net promoter score a useful IT tool for honing customer-centric design
A net promoter score could provide IT with a mechanism for changing how it delivers products and services.
NEC launches InfinityBoard collaboration hardware for meeting rooms
NEC Display launches its new InfinityBoard collaboration hardware, an outdoor equipment manufacturer deploys NewVoiceMedia to improve contact center performance and Fuze announces mobile work upgrades and an upcoming analytics tool.
Three ways the employee Net Promoter Score is an HR supertool
To support worker retention efforts, you need to know how loyal your staff is. The employee Net Promoter Score can help you find out.
KPIs: Monitoring VAR business metrics for a better bottom line
Get advice on and find out how your peers are monitoring VAR business metrics, identifying trends and taking action to improve the bottom line.
IT managed service providers may find it pays to diversify
IT managed service providers continue to expand, but the fastest growing firms are those offering services in the cloud and other emerging fields.
Our data shows that a vast majority of customers prefer Brand A ____ Brand B.
A streaming data architecture is an information technology framework that puts the focus on processing data in motion and treats extract-transform-load (ETL) batch processing as just one more event in a continuous stream of events. Continued…
Quote of the Day
“The rise of streaming data architectures is connected with a larger change that is happening: the enterprise is becoming more real time. With streaming data architectures, data can be processed on the fly and no longer needs to be collected in a data store so that queries can be run against it.” – Kostas Tzoumas
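The contrast the quote draws can be made concrete: instead of landing events in a data store and querying it later, a streaming pipeline updates its results as each event arrives. Below is a minimal sketch of on-the-fly processing, a rolling average computed incrementally over a stream; the readings are made-up sample data.

```python
from collections import deque

def rolling_average(events, window=3):
    """Process events in motion: each incoming reading updates a
    rolling average immediately, with no intermediate data store."""
    buf = deque(maxlen=window)  # only the current window is retained
    for value in events:
        buf.append(value)
        yield sum(buf) / len(buf)

# A stream can be any iterator, including an unbounded one.
readings = iter([10, 20, 30, 40])
print(list(rolling_average(readings)))  # [10.0, 15.0, 20.0, 30.0]
```

The generator holds only the current window in memory, which is the key architectural difference from batch ETL: the result is continuously available, and there is no bounded data set to run queries against.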
Big data platform broadens place in analytics architecture
Structured data and streaming analytics are broadening the role of big data platform technologies if the 2018 Strata Data Conference in New York is any indication. This podcast sorts through the signs for users looking to add big data systems to their analytics architectures.
Information architecture applied to big data streaming, AI
Data management expert William McKnight looks at big data streaming, AI and GDPR in an interview. While these issues challenge data professionals, a look at their basic composition can provide a guide to their future status as part of the enterprise information architecture.
Big data tooling rolls with the changing seas of analytics
On the eve of the Strata conference in New York, big data tooling continues to morph. This news story tracks some recent product activity of noted Hadoop vendors, uncovering the paths they’re taking from alternative data warehousing to full-fledged big data analytics systems.
Streaming data analytics puts real-time pressure on project teams
Streaming data analytics systems give companies useful information in real time, but a plethora of technology options complicates efforts to build them.
5 trends driving the big data evolution
Big data evolution stems from factors like the convergence of structured and unstructured data platforms, practical machine learning and cheaper compute resources that have brought big data use into the mainstream. It’s time to pay attention to big data trends and put the technologies to use.
Although big data is getting bigger all the time, much of the data being collected ___ useless.