The average manager spends 20% of their week putting out IT fires. What does that mean for the business? As we design modern network architecture, availability is a serious problem, not just in terms of technology but also in terms of workplace productivity.
Imagine the owner walks in the door…
Hey Boss… Suzan over in payroll just called. Her computer is down, and until it’s back up, we don’t get paid.
The owner of the company has a legal responsibility to make payroll, and faces fines or even imprisonment if the problem is ignored. Employees who aren’t paid reliably don’t stick around, either. So payroll is a big problem, and the manager will drop everything to discover the status of the payroll system.
The reality is that IT touches every department, so when any system is down, a manager is following up on the problem. Let’s put some real numbers around it. If a manager makes $100,000 per year, the company is losing $20,000 per year in salary to fix problems that probably shouldn’t be happening. A company with a CEO and four managers could be losing $100,000 each year in lost wages alone.
Lost salaries are easy to see, but there’s more: what about lost productivity? We don’t hire a manager to break even. We expect a manager making $100,000 to produce 2.5 times that number (or more) in value. So to justify the manager’s salary, we must see $250,000 in real workplace productivity. If 20% of that productivity is lost, we lose not $20,000 but $50,000 per manager to failed computer systems. Across five managers, the loss in productivity is $250,000, not just $100,000.
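The arithmetic above can be sketched as a small back-of-the-envelope model. The figures (a $100,000 salary, a 20% firefighting share, a 2.5× productivity multiple) are the illustrative assumptions from this article, not measured data:

```python
def downtime_cost(salary, firefighting_share=0.20, productivity_multiple=2.5):
    """Return (lost_salary, lost_productivity) per manager per year.

    lost_salary: wages paid while the manager chases IT problems.
    lost_productivity: the expected business output lost in that same time,
    assuming the manager should produce `productivity_multiple` times salary.
    """
    lost_salary = salary * firefighting_share
    lost_productivity = salary * productivity_multiple * firefighting_share
    return lost_salary, lost_productivity

salary = 100_000
lost_salary, lost_productivity = downtime_cost(salary)
print(lost_salary)        # 20000.0 per manager, in salary alone
print(lost_productivity)  # 50000.0 per manager, in expected output

# A CEO plus four managers at the same salary:
team = 5
print(team * lost_salary)        # 100000.0 in lost wages
print(team * lost_productivity)  # 250000.0 in lost productivity
```

Adjusting the firefighting share is the lever reliability work pulls: cutting it from 20% to 5% in this model recovers three quarters of both losses.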
Employees seldom think about this, but your CEO will be thinking about it when you propose a new technology for the company. By understanding this idea, IT consultants can win approval for more IT projects by tying them to improved reliability and regained productivity for the organization.