Cliff Saran’s Enterprise blog


March 20, 2013  12:18 PM

Recovering and deleting data from SSDs

Cliff Saran Profile: Cliff Saran
The next big thing

The latest generation of laptops and hybrid devices use solid state disks to boost performance and speed up the time it takes for the operating system to boot. In this guest blog post, Robert Winter from Kroll Ontrack writes about some of the challenges of attempting to recover data from a damaged SSD.


When choosing a storage media type, companies should understand how this decision can affect the ease of retrieving data after a data loss.

A lot of businesses are investing in Solid State Drives (SSDs) to leverage their numerous benefits, but users beware. Although SSDs are more robust than traditional hard disk drives (HDDs), data loss can still occur, and when it does, the data is also more complex to recover.

Unlike HDDs, SSDs store data in memory chips which have no moving parts, eliminating hardware damage like head crashes or motor defects. Yet, data loss can occur with SSD storage devices because the flash chips are susceptible to physical damage and the way data is stored is complex. SSDs are also exposed to the usual traditional data loss events such as human error, computer viruses, natural disasters, and software/programme corruption.

Recovering data from the common sources of SSD failure requires expertise in overcoming technical challenges that are unique to SSD and flash technology, such as decoding complex SSD data structures, specialised controller chips and numerous other SSD-specific issues. Data is stored on SSDs dynamically, and this complexity makes data recovery highly specialised and time consuming. Also, a single SSD memory structure can be as complex as an enterprise RAID (redundant array of independent disks) with eight, 16 or even 32 drives!

Only a handful of data recovery experts have data reconstruction programmes in place to identify, separate and reassemble SSD memory so that data can be extracted and high-quality results achieved. At Kroll Ontrack the recovery process involves the following actions:

  • Accessing and reading the data at chip level
  • Overcoming any encryption
  • Rebuilding data striping (much like RAID)
  • Overcoming any file system problems such as corruption or parts missing

The time it takes for Kroll Ontrack to recover data from an SSD is difficult to determine, because the recovery time depends on factors including the extent of data loss and the effort required to decode the data from the particular SSD in the device, which is the biggest challenge in the SSD recovery process. The way data is configured also varies greatly between manufacturers and models of SSD. Each model requires Kroll Ontrack to work out the configuration before data decoding can begin. In most cases this is done with no help from the manufacturers.

Performing secure disk sanitisation on SSDs is equally tricky, since it is difficult to specify the exact location where the data is stored in order to overwrite it. Therefore, the best way to permanently destroy the data is through physical media destruction. This typically involves shredding the media into small pieces so that not a single chip escapes destruction. If the shredding process misses a chip, it is still possible to recover data from it, so care needs to be taken to destroy everything.

SSDs are durable, and it is difficult to assess their lifespan because it varies depending on the manufacturer. To get an idea of how long a solid-state drive will last in a given application, a simple calculation, sketched below, can be used to estimate its lifespan.

It should be noted that such calculations are valid only for products that use either dynamic or static wear levelling. Use the solid-state memory component specifications for products that do not use wear levelling.
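As a rough illustration only, a commonly used rule of thumb multiplies the drive capacity by the rated programme/erase (P/E) cycle count of the flash, then divides by the write amplification factor and the daily write volume. The figures in the sketch below are hypothetical examples, not manufacturer specifications.

    # Rough, illustrative SSD lifespan estimate (assumes effective wear levelling).
    # All figures are hypothetical examples, not manufacturer data.
    capacity_gb = 256          # usable drive capacity in GB
    pe_cycles = 3000           # rated programme/erase cycles per flash cell
    write_amplification = 1.5  # internal writes generated per host write
    daily_writes_gb = 20       # average data written by the host per day

    total_writable_gb = capacity_gb * pe_cycles
    lifespan_days = total_writable_gb / (daily_writes_gb * write_amplification)
    print(f"Estimated lifespan: {lifespan_days / 365:.1f} years")

With these example figures the estimate comes out at several decades, which is why wear-out is rarely the limiting factor for lightly written drives; heavier write workloads or lower-endurance flash shorten the result accordingly.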

There are various things a user can do to maximise an SSD's lifespan. The best way to find the right methods is to look at reliable user forums and manufacturer recommendations.

SSD is a new technology, and very few people have learned enough about it to expertly navigate its RAID-like and flash memory layers and successfully find data when there is a failure. Best practice is to contact a data recovery specialist before choosing to use it, for more information about the impact on data recovery for the specific environment and technologies you are investigating.

Robert Winter is responsible for all operations within the area of disaster recovery in the Kroll Ontrack labs, based at the UK headquarters in Epsom. 

February 15, 2013  3:39 PM

Oracle should heed warnings from the trends in enterprise

Cliff Saran Profile: Cliff Saran
Enterprise software, Oracle, Oracle VM, Software as a Service

The findings from Forrester’s latest research on Oracle point to a worrying trend in the enterprise software landscape. Businesses are not generally doing large, transformational IT projects built around traditional enterprise resource planning (ERP).

The key suppliers are adapting their enterprise software portfolios in a bid to drive more sales. But the CIOs Forrester spoke to are not convinced it is a strategy that is working for Oracle.

In Forrester's Oracle's Dilemma: Applications Unlimited report, many people are happy with the software they are running and have no real plans to migrate onto Oracle's future enterprise platform. Since Oracle is a strategic supplier to many, there is little interest among CIOs in migrating away. There are concerns that Oracle may turn some of the products they have deployed into cash cows, potentially with high annual maintenance fees and licensing costs.

Members of the IT director’s group, the Corporate IT Forum, are angered by the changes to Oracle licensing. Head of research Ollie Ross told Computer Weekly that members were being pushed into taking certain technical directions like OVM (Oracle VM), rather than VMware. The forum’s executive director, David Roberts, believes many CIOs are reacting negatively to Oracle’s exceptionally high-pressured sales techniques. This is reflected in the supplier’s poor software licence revenue when compared with its nearest rival, SAP. If businesses are not upgrading at a rate that looks good on the company’s balance sheet, Oracle will need to take a different approach.

Newham Borough CIO Geoff Connell is concerned that Oracle (and other top-tier vendors) will increase licensing costs, because their customers are "locked into" their products due to historical investments. He argues that many software suppliers appear to be ignoring the financial climate and are attempting to make up for reduced sales volumes with higher unit costs.

Coercing customers to buy more software is not the right way to go. But Oracle executives have not shown much willingness to go wholeheartedly down the software as a service (SaaS) route, or even to offer a roadmap for integrating SaaS and on-premise enterprise IT. Nor has Oracle been willing to adapt software licensing to make it more virtual machine friendly. The research shows customers are unhappy, and the time for Oracle to make some tough decisions is long overdue.

Connell believes if Oracle and other leading suppliers continue to hike prices, users will abandon commercial enterprise software for open source alternatives.



January 21, 2013  12:19 PM

Virtual Modelling: A new IT optimisation tool

Cliff Saran Profile: Cliff Saran
Business, Cloud Computing, Platform as a Service, Project management, Software as a Service

A few weeks ago I interviewed Paul Michaels, CEO of business technology consultancy ImprovIT, about a methodology for modelling decision-making. In this guest blog post Robert Saxby, consulting director at ImprovIT, explains a bit more about how the methodology, called Virtual Modelling, works, and the business benefits.


When it comes to re-engineering IT environments to save money or achieve best practice, a trial-and-error approach can be both complicated and costly. Virtual Modelling is a new business tool that uses 'what if?' scenarios to simulate real-world outcomes and identify efficiencies, future strategies and best sourcing options without chopping, changing or disrupting ongoing operations.

The challenge

CIOs today are caught between a rock and a hard place: having to slash IT costs while retaining productivity and service quality, often due to government mandate. Of course cost-cutting pressures are nothing new, and for many there is little blood left in the stone. The question now is: "How and where can we make further reductions without knee-capping the entire operation?" There are plenty of apocryphal tales about organisations axing staff and abandoning efficiency-enabling technology projects only to discover their actions have mortally wounded deliverables and reputation. The result: a panicked and costly rehiring and/or re-purchasing exercise to redress the balance.

Finding the cost/quality balance

Wouldn't it be great if you could work out the exact cost and productivity balance without the cost and disruption of making changes on a trial-and-error basis? Virtual Modelling creates scenarios, based on real, current and accurate data mined from your own ICT operation, that can predict real-world outcomes without impacting current operations. But it can only do this based on available KPI data, and if that data does not already exist it must be generated via benchmarking studies. For as Lord Kelvin, the 19th-century physicist, once said: 'If you cannot measure it, you cannot improve it.'

Measure it first

Once created, this baseline data provides the tools to compare performance against other public service (and commercial) entities of a similar size and complexity in terms of things like value for money, quality of service, best practice and competitive pricing.  Digging a bit deeper, you can also find out where your organisation stands in relation to best practice standards for staffing (quality and quantity), process complexity, outsourcers (scope & service levels) and IT governance. 

All of this information is then used to create 'what if' scenarios, typically dealing with areas such as: Cost/Price, Volumes, Staffing, Quality & Service Levels, Service Scope, Complexity, Project Efficiency and Process Maturity. Provided the model has been mapped with well-researched data, the outcomes offer accurate indicators that can be used to make decisions about outsourcing, staffing, process re-engineering, cloud migration or anything else.

In the video below, Paul Michaels, CEO of ImprovIT, discusses how our Virtual Modelling methodology helps decision-making within a project.

Building up the model

Virtual modelling is designed to pinpoint the impact of one or more project parameters upon all the others. For example: if I change 'Service Quality' (SLAs) and/or 'Service Scope', what effect will this have on 'Cost'? Or: if I reduce 'Complexity', what effect will this have on 'Processes'? It also shows the changing balance of the whole picture when one or more parameters are altered. For example: if I want to increase 'Volumes' or 'Service Quality', what changes do I need to make to all the other segments, and how will this impact the enterprise as a whole?
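To make the idea concrete, the toy Python sketch below shows how such a 'what if?' scenario might be expressed. The parameter names, weightings and figures are hypothetical illustrations, not ImprovIT's actual model, which is built from benchmarked KPI data.

    # A toy 'what if?' cost model. The weightings below are hypothetical;
    # a real Virtual Modelling exercise would derive them from benchmarked KPI data.
    baseline = {"volumes": 1.0, "service_quality": 1.0, "complexity": 1.0, "staffing": 1.0}

    # Hypothetical sensitivities: how strongly each parameter drives overall cost
    weights = {"volumes": 0.3, "service_quality": 0.25, "complexity": 0.25, "staffing": 0.2}

    def relative_cost(scenario):
        """Cost relative to the baseline (the baseline itself scores 1.0)."""
        return sum(weights[k] * scenario.get(k, baseline[k]) for k in weights)

    # Scenario: cut complexity by 20% while raising service quality by 10%
    scenario = dict(baseline, complexity=0.8, service_quality=1.1)
    print(f"Relative cost vs baseline: {relative_cost(scenario):.2f}")

A real model would of course use far richer data and non-linear relationships, but the principle is the same: alter one or two parameters and observe the knock-on effect on the rest of the picture.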

Virtual Modelling System

So, to find the Goldilocks balance between IT cost and service quality let’s start by feeding staffing metrics into the simulation model, given the high impact of staffing on cost.  But this isn’t just about a straightforward set of numbers: it also has to allow for a range of ‘soft’ factors such as varying levels of knowledge, skill sets and the specialist expertise that can make an individual or team difficult to replace.

Next let's look at complexity – typically the highest contributor to an IT department's spend after staffing. This involves anything and everything from security and data confidentiality to high-availability requirements, legacy system integration and the number of nodes in the enterprise network. Rule of thumb: the greater the complexity, the higher the cost. A virtual modelling analysis determines where simplifications can be made without jeopardising mission-criticality. Once it is established that these changes are advisable, modelling can also provide an accurate estimate of cost, timelines and impact on staffing and service levels.

Then there is the question of outsourcing. Will it save money? What services should be outsourced? And if we are to outsource, what kind of service – a traditional provider or a cloud-based service? And what business model: IaaS, SaaS or PaaS? Data fed into a simulation model can provide an accurate estimate of the likely ROI and TCO – with timescales – of each option.

Process maturity also impacts the cost/performance balance. There are industry standards which provide best practice guidelines, such as ITIL (IT Infrastructure Library), 'Agile' and 'Lean' (a production practice that looks to reduce resource expenditure down to the minimum required to deliver value to the customer). Comparisons with these guidelines can indicate where improvements can be made, but virtual modelling can determine what they will cost and whether they are worth the disruption to operations. It is also worth noting that achieving process maturity is rarely a quick win: it takes time and requires clear, unequivocal goals and plans led from the top.

Cloud migration

Given the chequered history of public sector IT projects, ICT departments face difficult decisions about things like whether, when and how to migrate to the cloud, and how to optimise resources on an ever-diminishing budget. Using Virtual Modelling to run scenarios on all the available options provides new decision-making tools that help to identify the best roadmap ahead while avoiding wrong roads and dead ends.

Robert Saxby is consulting director, ImprovIT



January 17, 2013  3:54 PM

Software licence audits: Confidence in Your Choices

Cliff Saran Profile: Cliff Saran
CIO, Guest Blog, Legal: The Technology Counsel, rants, Software Choices

Over the last few weeks Computer Weekly has written about software licensing and how suppliers are demanding IT departments run costly software audits. At the same time, we have started looking at the complexities of licensing, such as in a virtualised environment.

In this guest blog post, Martin Thompson, a SAM consultant and founder of The ITAM Review and The ITSM Review, provides some top tips on what to do when you receive an audit letter:


Payment Protection Insurance (PPI) spam is in vogue.

You may have received one or two of these recently:

“You are entitled to £2,648 in compensation from mis-sold PPI on credit cards or loans.”

PPI claims and other spam solicitations are the bane of our inboxes. The vast majority of us know to simply ignore them. Unfortunately the handful of those who do respond justifies the exercise to the spammers. 

This mass-marketing technique is used in exactly the same fashion by trade bodies such as the BSA and FAST to push their agenda and start software audit activity.

Supplier audits are a fact of life. Some software audit requests are serious and expensive, while others are merely spoof marketing campaigns – so how can IT professionals distinguish between the two?

Whilst I'm not a legal expert, fifteen years in this industry has taught me that there are instances when you should respond to an audit request and instances when you should simply walk away.

When to Take Software Audit Requests Seriously

In my opinion there are two instances when you should take software audits seriously:

  1. When you are approached by a software publisher directly with reference to a signed contract
  2. When you are approached by an organisation with real proof of a breach of intellectual property law.

Contracts with software publishers contain 'audit clauses', giving the publisher the right to come and audit you periodically at your own cost. Your company either signed this and agreed to it, or will need to fight against it. Smart companies negotiate it out of the contract by demonstrating maturity in their internal processes.

Breaches of intellectual property supported by evidence are a legal dispute and should be treated as such – by passing the issue over to your legal team in the first instance.

When to Ignore Software Audit Requests

Requests for ‘Self-Audit’ or other direct mail fishing exercises can be ignored.

Trade bodies such as BSA and FAST commonly write letters to companies requesting them to ‘Self-Audit’ or declare a ‘Software Amnesty’.

These organisations are masters at crafting well-written, legal-sounding letters, but they have no legal authority whatsoever. Nor do they have the resources to follow up on every letter sent.

Just like any other complaint made to your business, such a letter should only be taken seriously if there is firm evidence or the organisation issuing the dispute is backed by the appropriate government agency. For example, the Federation Against Software Theft (FAST) has no teeth whatsoever unless accompanied by HM Customs and Excise.

Confidence in Your Choices

IT departments with the appropriate Software Asset Management (SAM) processes in place have both the confidence and the supporting data to discriminate between bogus claims and genuine supplier audit requests.

Whilst much noise is made in the industry about senior management being sent to prison or the company name being dragged through the gutter, the real and compelling downside to a lack of software management is UNBUDGETED cost and DISRUPTION: surprise licence costs and massive disruption whilst IT staff are diverted from key projects to attend to an audit or hunt down the appropriate data.

An unexpected software audit can be good for your health in the longer term if it allows the organisation to realise it is out of control.

SAM is so much more than compliance and counting licenses. Organisations with a solid SAM practice are more nimble, competitive and dynamic. No more stalling on that virtualisation project because we’re unsure of the licensing costs, no more uncertainty about moving to the cloud because we don’t know how that leaves us contractually. SAM provides the business intelligence to innovate and take action.

Martin is an independent software industry analyst, SAM consultant and founder of The ITAM Review and The ITSM Review. Learn more about him here and connect with him on Twitter or LinkedIn.


December 6, 2012  11:12 AM

MS user CAL fails BYOD

Cliff Saran Profile: Cliff Saran
Windows 7

The changes Microsoft has made to client access licences (CALs) reflect a change in how people use the company's software. Today, people expect to have access to the MS Exchange Server via their Android or iOS device. This is not added value. Email access from any device is essential to enable people to use their own devices at work. So why does Microsoft want to charge extra?

The problem Microsoft faces is that its traditional business model, where people would run out and buy a new Windows PC every time it released a new OS, is broken. Windows 8 is a massive departure from previous OSes, and it will take a long time before people feel the need to upgrade. In the meantime, Microsoft is losing out, because Apple and Google devices are able to connect to Microsoft servers.

It still makes money: the users who connect to Microsoft systems have to pay a Microsoft client access user licence. But just because a user may have more than one device does not give Microsoft the right to charge more. After all, most of the time they will only ever use one device at a time to access a Microsoft system. How often will someone want to access email simultaneously on a PC, a tablet and a smartphone? Come on Microsoft, we only have two eyes, two hands and one brain.

Forrester analyst Duncan Jones says per-device licensing for software is obsolete in the mobile and virtual world.

So rather than charging a premium for user-based CALs, Microsoft should make device CALs cheaper, since they are more restrictive.


November 26, 2012  1:22 PM

Home working alert

Cliff Saran Profile: Cliff Saran
Security, an afterthought

A couple of weeks ago I received a telephone call at home from someone claiming to be from the Windows Support team. The lady on the phone asked me if my PC was running slow (which it was!) and put me through to a tech lead.

“How did you get my number?” I asked.

The tech support man said he worked for a company that had been approved by Microsoft to provide customer support. He then asked me to open the Windows Event Viewer. “Your PC has been infected,” he said, when I told him what the Event Log was showing.

I guessed his next question would be to ask me to grant him remote access to the PC... The penny dropped: this was a phishing scam. Had I agreed, the caller would probably have been able to install rogue software on my PC.

Okay so he very nearly got me. Lesson learnt.

But it is worrying how easily we can be tricked. And with more of us using our own computers for work, there is a very real risk that hackers will target us at home claiming they are tech support.


November 20, 2012  12:47 PM

Emirates in-flight wi-fi and telemetry

Cliff Saran Profile: Cliff Saran
Evolutionary IT


I have been a guest of Emirates today at its network control centre in Dubai, and I am currently on an Airbus A380 flying back to the UK.

There is Wi-Fi on this flight, connecting via satellite internet. Emirates charges just $5 for 20 MB, which is not bad. I have been able to access Skype messaging and connect to the corporate VPN. The connection works, but it is not fast enough (Broadband Speed Checker failed to start) and is intermittent. However, I have been able to connect to our Exchange Server and write this blog.

The network control centre manages the fleet of aircraft and the crew. The engineers have remote access to the aircraft in flight, allowing them to access telemetry data, reinitialise systems, or schedule replacement parts to be available at the destination airport.

Data from the flight is shared with suppliers, and fed into maintenance systems that use predictive analytics and forecasting to ensure the aircraft is kept operational and passengers are not delayed.


November 2, 2012  6:18 PM

Video review: Microsoft Surface tablet

Cliff Saran Profile: Cliff Saran
Netbooks

The £399 Surface is Microsoft's first foray into PC hardware. The tablet runs Windows RT, which means it is unable to run x86 applications. This is a bit limiting if you would like to choose your own browser, since the only one available at the moment is Internet Explorer 10, which ships with Windows RT.

However, you can download applications from the Windows Store: Skype, eBay and the Kindle app are there, but sadly, no BBC iPlayer. There are several tools to improve the tablet, but there does not appear to be much in the way of enterprise software in the Windows Store.

Still, the relatively limited functionality means the Surface tablet could be deployed where people simply need email access, tasks, calendar and basic Word, Excel and PowerPoint functionality. It is probably suitable as a device for education and for settings where a relatively locked-down environment is preferred.

I personally like Windows 8 Professional, which is only available on x86-based tablets and hybrid devices, as I prefer to run my own applications, rather than be restricted by the choice in the Windows Store.


November 1, 2012  1:11 PM

Windows 8: worth the wait?

Cliff Saran Profile: Cliff Saran
CIO

Windows 8 is out, the Microsoft Store in New York is open and Surface has surfaced. Times are certainly changing for Microsoft as it takes aim at Apple with a consumer-friendly device and OS. Microsoft needs to win hearts and minds. But the winning formula that has given it a licence to print money with Windows licences is no longer compelling.

Windows 8 touch-screen UI

One CIO I spoke to questions the need to pay £25 per desktop to upgrade the PC OS, and he is certainly not enamoured by the Microsoft Enterprise Agreement for MS Office, given that only a small proportion of employees use PowerPoint and Excel.

Certainly, it seems there is no compelling reason to switch over from Windows 7 to Windows 8. But Windows 8 devices and Microsoft’s own Surface tablets will find their way into the enterprise, thanks to the touch UI.

Personally, I'm not convinced Windows 8 makes a good desktop OS for a non-touchscreen PC, but Microsoft says it is 30% faster than Windows 7.


October 18, 2012  10:36 AM

Oracle pushes proprietary, but can it keep ahead in hardware?

Cliff Saran Profile: Cliff Saran
CIO

Oracle is well and truly pushing engineered (ie proprietary) systems. Speaking yesterday in London, Oracle president Mark Hurd claimed that Oracle's vertically integrated stack combining hardware, middleware, database and enterprise applications has been pre-integrated, so customers do not require expensive IT consultants to connect the system together.

This may be true of the bits within the Oracle stack, but most businesses connect systems across complex, heterogeneous environments.

Hurd also stated that Oracle will spend $5 bn this year on R&D. Now that it is playing in the hardware race, how far will that go? After all, Intel is expected to spend over $18 bn on R&D.

