During the Build developer’s conference in April, Microsoft is expected to reveal more details of a future version of Windows codenamed Threshold.
Rumours on the web suggest that Threshold could become Windows 9. The OS is set to bring together the Windows Phone, Windows 8 and Xbox One operating systems.
Microsoft’s previous attempts at simplifying its various operating systems have had varying degrees of success.
The Windows 2000 and Windows Millennium Edition lines merged into Windows XP, with a single kernel for home and professional users.
On the mobile side, Microsoft attempted to provide a common look and feel with Windows CE, and cross-platform development, but at the time, a desktop-like GUI on a smartphone did not gain acceptance.
With the evolution of the Windows Phone OS, Microsoft introduced a touch UI with tiles that has made its way onto Windows 8. This time, however, the touchscreen UI has not sat well in the corporate market, and home users have generally preferred cheaper Android-powered tablets over Windows 8-powered ones.
The fact that Microsoft is looking to rebrand Windows RT, its low-end ARM-powered Windows operating system, suggests the company is moving towards a single Windows OS across all devices. Interestingly, the Xbox One also runs a version of Windows.
A single core OS would greatly simplify application development and integration. For Microsoft, it would mean core services like Skype, Windows Live and Office 365 working seamlessly across the Xbox One, Windows-powered tablets and smartphones, and traditional PCs.
The wider Microsoft ecosystem would also benefit: in theory, B2C companies could develop services once and target customers across all three platforms.
All will be revealed at Build 2014, but Microsoft has some big changes to make this year – not least hiring a new CEO to take over from Steve Ballmer.
It has been a while since my last post. Today I spent the last few hours updating Windows 8 and I thought I’d share my experiences.
It’s a 3 GB download and will reboot your PC a few times and configure the system before booting the new OS. A network connection is required after the download for user authentication.
On my configuration I needed to enter my Windows Live login and a code that Microsoft texted to my registered mobile phone.
Finally, the machine boots up and… there is a Start button. Not quite: my machine still boots into the Windows 8 Start screen with its sliding tiles of apps. The desktop tile does indeed bring up a Start button, but don't hold your breath. It's no Windows 7 Start menu; it is simply a button that takes you back to the Start screen. Amazing.
For anyone like me who really wants the Start menu, download Classic Start Menu. It's free and still works on Windows 8.1.
Earlier this week Mark Shuttleworth, the founder of Ubuntu, unveiled a crowdfunding model to get people to buy into his concept of a converged PC/smartphone device.
The appeal hit its first million within five hours and became the fastest crowdfunding campaign ever to reach $2 million, hitting that milestone in 7 hours, 59 minutes and 58 seconds. It has also already beaten Indiegogo's previous highest-ever campaign, which stood at $1,665,380.
In fact, at the time of writing (3pm GMT, July 24 2013), all 5,000 of the $500 devices have been "pre-ordered". I'm being quite deliberate with the words I have used here: yes, I think Shuttleworth's Ubuntu Edge campaign is a bit like pre-ordering a new branded product on Amazon that isn't shipping yet (like the Xbox One or PlayStation 4).
OK, both those products do exist – unlike the Ubuntu Edge. But Shuttleworth is not really selling a product. He is selling an idea, allowing people to get in at an early stage and become early adopters.
The campaign aims to raise $32 million over 30 days, for a limited production run of 40,000 devices.
It’s been a while since I raved about losing a weekend getting to grips with Linux on a home project to build a music server.
It took a weekend to get the open source Squeezebox music server software to run on Ubuntu – that was two years ago.
Now thanks to the Raspberry Pi, it is entirely possible to get going in under an hour – if you can Google instructions and are comfortable copying and pasting bits of code.
All you need is a £30 Raspberry Pi (the Model B with 512 MB of memory), a micro USB power supply (a standard Android phone charger should work), a 4 GB or bigger SD card (I used a £12 8 GB class 10 card from Maplin), and a network cable to plug the Pi into your router.
There are plenty of places to find out how to get started; I'll link to the stuff I found useful. You'll need to decide which Raspberry Pi Linux distribution to use, download it from a standard PC, then copy the "image" file onto the SD card.
You will need a special program to copy an image of the Raspberry Pi Linux distribution to the SD card, as standard Windows file copy will not work. Again a Google search will show a number of applications that will work: I used Win32 Disk Imager.
Unplug the SD card and plug it into your Pi, connect the network cable to your router and boot up!
This gets you going: the Pi will light up and boot, but you won't be able to see anything, as no monitor or keyboard is connected.
My preferred way to work with a Linux system is by remote access, using the excellent PuTTY tool to connect over SSH.
To use PuTTY you will need to find the IP address of your Pi. This is found through your router's management console. On my D-Link router I log into http://192.168.1.1 and click on the Network tab, which lists all the devices connected to my home network. It detected the Raspberry Pi – mine had a machine name of RaspPi – and I was able to reserve the IP address. This way, I can use PuTTY with the same IP address to log into the Pi.
Once PuTTY is running, you will need to log in using the default user name (pi) and password (raspberry).
The Linux distribution I chose was SqueezePlug. This allows you to set up a SqueezeBox music server and/or a player to access your existing server. The player can be controlled wirelessly from an Android or iOS device (such as the free Logitech Squeezebox controller in the Google Play store).
That’s it – and it can be installed and running in under an hour.
Going forward, try the Raspbmc Linux distribution, which allows you to run a media centre on the Raspberry Pi. I have been able to use the Pi to stream Freeview, BBC iPlayer and ITV Player to my Nexus 7 Android tablet using the TVHGuide client app. It is also possible to access these streams from any device running an XBMC client.
This time you need to plug the Pi into a display, and you'll also need a powered USB hub, a compatible USB digital TV receiver and a Wi-Fi adapter. There are quite a few steps:
I recently spoke to Kim Stevenson, CIO at Intel. When she started in IT, people used to try to issue one device per employee. She says Intel has been running BYOD for three years. “People like choice and they pick devices for the work they wish to accomplish.” At Intel, this means becoming a more productive employee. So as an IT professional, she says it is important to understand this driver, rather than try to resist the change and loss of control that occurs through BYOD.
She says: “We are in an era of business productivity. It is perfectly reasonable for employees to have seven devices. Through our BYOD programme we have documented a gain of 57 minutes in productivity per employee.”
For IT professionals, she says the number one issue they face is "velocity". Business unit managers can buy a service directly from a service provider, and the consumer IT experience is better than enterprise IT. She says IT must address how to deliver the consumer IT experience within the confines of the enterprise.
In this guest post Martin Prendergast, CEO and co-founder of Concorde Solutions and board member of the Cloud Industry Forum, writes about issues to consider when licensing IBM software.
Enterprise software can represent as much as 30% of an organisation’s IT spend, so at a time when budgets are still being squeezed like never before, CIOs are understandably being careful to ensure that their investment in software represents value for money.
However, software licensing costs can be a real bugbear for CIOs, with the potential to quickly ratchet up the overall price through painful non-compliance fines, unwittingly incurred as a result of software vendors’ complex and convoluted terms.
The challenge is exacerbated as each software vendor has its own unique brand of complexity, which makes the jobs of the IT asset manager, the CIO and the CFO even more taxing. In this article, we examine some of the key challenges and solutions for dealing with IBM's software licensing.

The problematic portfolio position
IBM has over 1,500 products on offer, available on around 30 licensing metrics; each metric may differ only very slightly, but can still have a significant impact on licensing requirements and position. Historically, the picture has been further complicated by IBM's well-known practice of acquisitions, which expands the product portfolio and licensing metrics even further. IBM may retain the licensing metrics of an acquired company, or may choose to replace them with its own.
For customers this can be incredibly difficult to track, and without careful management and analysis of their IT estate, businesses can find themselves operating under altered metrics and contracts without realising. It goes without saying that non-compliance fines can often be the result – and large software vendors, as we know, have found a lucrative income stream in such levies.

It is relatively widely known that IBM tends to be one of the most aggressive vendors on the market when it comes to non-compliance. IBM's fines, which can include a two-year back penalty on maintenance clauses in addition to the costs of 'missing' licences, are considered harsh even in comparison to other large vendors.
Indeed, just a few years ago, IBM sought to audit all of its corporate customers without warning and with huge audit teams, which netted them a considerable amount of income. Of course, IBM isn’t the only vendor that is a fan of the surprise audit and there are a couple of things that businesses can do to ensure that if an audit arrives, they’re not caught unaware.
1. Preparation is the first line of defence – ideally businesses should seek independent third-party confirmation of their licensing position both pre- and post-audit.
2. IBM has now lengthened the list of products that are eligible for its sub-capacity licensing, so check which of your deployments qualify.
3. Dealing with sub-capacity licensing – irrespective of how enterprises partition their machines, without a sub-capacity licence in place they may still be charged for the full capacity of the machine.
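To make the full-capacity versus sub-capacity distinction concrete, here is a minimal sketch in Python. Many IBM licences are measured in Processor Value Units (PVUs); the 70-PVU-per-core figure below is an illustrative assumption for the example, not a lookup from IBM's actual PVU table, which varies by processor family and model.

```python
# Illustrative sketch of PVU-style licensing. The PVU-per-core figure
# is a hypothetical example, not IBM's published value for any chip.

def pvus_required(cores_licensed: int, pvu_per_core: int = 70) -> int:
    """Return the PVUs needed to license the given number of cores."""
    return cores_licensed * pvu_per_core

# A 32-core machine running the product in a 4-core partition:
full_capacity = pvus_required(32)  # without a sub-capacity licence, all cores count
sub_capacity = pvus_required(4)    # with one, only the partition's cores count

print(full_capacity, sub_capacity)  # 2240 280
```

The gap between the two figures is why item 3 above matters: the partitioning alone does not reduce the bill unless a sub-capacity licence is actually in place.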
Denali is the holding company through which Michael Dell hopes to reinvent Dell. A US Securities and Exchange Commission (SEC) filing at the end of March shows the company will actively move away from PCs and high-volume servers.
- Denali hired John Swainson, former CEO of CA, to run the software business, which will look at expanding into areas like BI and storage software. These present huge opportunities, although software as a service will eat away at revenue.
- While Denali has benefited from the trend to migrate workloads from expensive Unix systems to commodity x86 servers, Gartner notes that this is potentially short term. According to Gartner, the move to virtualisation and server consolidation will enable businesses to defer server purchases.
- A section of the SEC filing prepared by J P Morgan reflects this challenge. The investment bank highlighted Denali management's plans to reduce margins on end user computing devices, servers and storage to reflect increasingly aggressive competition and buyers spending less.
- The acquisition of Force 10 in 2011 will help drive networking sales; IDC expects the networking business to grow 7.3%. Gartner expects its software revenue to grow 7.7%, due to the acquisition of Quest. Storage is expected to suffer as a result of the decline in Dell's long-standing relationship with EMC.
- On the services side, J P Morgan notes that Denali should see modest growth in its PC maintenance business, but competitive pricing will put pressure on traditional outsourcing.
The latest generation of laptops and hybrid devices use solid state disks to boost performance and speed up the time it takes for the operating system to boot. In this guest blog post, Robert Winter from Kroll Ontrack writes about some of the challenges of attempting to recover data from a damaged SSD.
When choosing a storage media type, companies should understand how the decision can affect the ease of retrieving data when a data loss occurs.
A lot of businesses are investing in solid state drives (SSDs) to leverage their numerous benefits, but users beware. Although SSDs are more robust than traditional hard disk drives (HDDs), data loss can still occur – and when it does, the data is also more complex to recover.
Unlike HDDs, SSDs store data in memory chips which have no moving parts, eliminating hardware damage like head crashes or motor defects. Yet, data loss can occur with SSD storage devices because the flash chips are susceptible to physical damage and the way data is stored is complex. SSDs are also exposed to the usual traditional data loss events such as human error, computer viruses, natural disasters, and software/programme corruption.
Recovering data from the common sources of SSD failures requires expertise in overcoming technical challenges that are unique to SSD and flash technology, such as decoding complex SSD data structures, specialised controller chips and numerous other SSD-specific issues. Data is stored on SSDs dynamically, and this complexity makes data recovery highly specialised and time consuming. Also, a single SSD memory structure can be as complex as an enterprise RAID (redundant array of independent disks) with eight, 16 or even 32 drives!
Only a handful of data recovery experts have data reconstruction programmes in place to identify, separate and reassemble SSD memory so that data can be extracted with high-quality results. At Kroll Ontrack the recovery process involves the following actions:
- Accessing and reading the data at chip level
- Overcoming any encryption
- Rebuilding data striping (much like RAID)
- Overcoming any file system problems such as corruption or parts missing
The time it takes Kroll Ontrack to recover data from an SSD is difficult to determine, because the recovery time depends on factors including the extent of the data loss and the effort required to decode the data from the particular SSD in the device – which is one of the biggest challenges in the recovery process for SSDs. The way data is configured also varies greatly between manufacturers and models of SSD. Each model requires Kroll Ontrack to work out the configuration before data decoding can begin. In most cases this is done with no help from the manufacturers.
Performing secure data disk sanitisation techniques on SSDs is equally tricky since it’s difficult to specify the exact location of where the data is stored to overwrite it. Therefore, the best way to permanently destroy the data is through physical media destruction. This typically involves shredding the media into small pieces so not a single chip escapes destruction. If the shredding process misses a chip, it’s still possible to recover data from it, so care needs to be taken to destroy everything.
SSDs are durable, but it's difficult to assess their lifespan because it varies depending on the manufacturer. To get an idea of how long a solid-state drive will last in an application, the following calculations can be used to determine its lifespan:
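As a rough illustration of the kind of calculation involved, here is a simple wear-levelled endurance estimate in Python. The formula and every figure in it are generic assumptions for illustration, not Kroll Ontrack's method or any manufacturer's specification:

```python
# Illustrative SSD lifespan estimate, assuming wear levelling spreads
# writes evenly across the flash. All figures are example values only.

def ssd_lifespan_years(capacity_gb: float, pe_cycles: float,
                       writes_gb_per_day: float,
                       write_amplification: float = 2.0) -> float:
    """Estimated years until the rated programme/erase cycles are exhausted."""
    total_endurance_gb = capacity_gb * pe_cycles            # total GB writable
    effective_daily_gb = writes_gb_per_day * write_amplification
    return total_endurance_gb / effective_daily_gb / 365

# Example: 256 GB drive, 3,000 P/E cycles, 20 GB written per day.
years = ssd_lifespan_years(256, 3000, 20)
print(round(years, 1))
```

The write amplification factor reflects that the controller typically writes more to flash than the host requests; halving it doubles the estimate, which is one reason real-world figures vary so much between models.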
It should be noted that these calculations are valid only for products that use either dynamic or static wearing levelling. Use the solid-state memory component specifications for products that do not use wear levelling.
There are various things a user can do to maximise an SSD's lifespan. The best way to find the right methods is to look at reliable forums and manufacturer recommendations.
SSDs are a relatively new technology, and very few people have learned enough about them to expertly navigate the RAID-like and SSD layers and successfully find data when there is a failure. Best practice is to contact a data recovery specialist before choosing the technology, for more information about the impact on data recovery for the specific environment and technologies you are investigating.
Robert Winter is responsible for all operations within the area of disaster recovery in the Kroll Ontrack labs, based at the UK headquarters in Epsom.
The findings from Forrester’s latest research on Oracle point to a worrying trend in the enterprise software landscape. Businesses are not generally doing large, transformational IT projects built around traditional enterprise resource planning (ERP).
The key suppliers are adapting their enterprise software portfolios in a bid to drive more sales. But the CIOs Forrester spoke to are not convinced it is a strategy that is working for Oracle.
In Forrester's Oracle's Dilemma: Applications Unlimited report, many people are happy with the software they are running and have no real plans to migrate onto Oracle's future enterprise platform. Since Oracle is a strategic supplier to many, there is little interest among CIOs in migrating away. There are concerns that Oracle may turn some of the products they have deployed into cash cows, with potentially high annual maintenance fees and licensing costs.
Members of the IT director’s group, the Corporate IT Forum, are angered by the changes to Oracle licensing. Head of research Ollie Ross told Computer Weekly that members were being pushed into taking certain technical directions like OVM (Oracle VM), rather than VMware. The forum’s executive director, David Roberts, believes many CIOs are reacting negatively to Oracle’s exceptionally high-pressured sales techniques. This is reflected in the supplier’s poor software licence revenue when compared with its nearest rival, SAP. If businesses are not upgrading at a rate that looks good on the company’s balance sheet, Oracle will need to take a different approach.
Newham Borough CIO Geoff Connell is concerned that Oracle (and other top-tier vendors) will increase licensing costs, because their customers are "locked into" their products due to historical investments. He argues that many software suppliers appear to be ignoring the financial climate and are attempting to make up for reduced sales volumes with higher unit costs.
Coercing customers to buy more software is not the right way to go. But Oracle executives have not shown much willingness to go wholeheartedly down the software as a service (SaaS) route, or even to offer a roadmap for integrating SaaS and on-premise enterprise IT. Nor has Oracle been willing to adapt software licensing to make it more virtual machine-friendly. The research shows customers are unhappy, and the time for Oracle to make some tough decisions is long overdue.
Connell believes if Oracle and other leading suppliers continue to hike prices, users will abandon commercial enterprise software for open source alternatives.
A few weeks ago I interviewed Paul Michaels, CEO of business technology consultancy ImprovIT, about a methodology for modelling decision-making. In this guest blog post Robert Saxby, consulting director at ImprovIT, explains a bit more about how the methodology, called Virtual Modelling, works, and the business benefits.
When it comes to re-engineering IT environments to save money or achieve best practice, a trial-and-error approach can be both complicated and costly. Virtual Modelling is a new business tool that uses 'what if?' scenarios to simulate real-world outcomes and identify efficiencies, future strategies and best sourcing options without chopping and changing, or disrupting ongoing operations.
CIOs today are caught between a rock and a hard place: having to slash IT costs while retaining productivity and service quality – often due to government mandate. Of course, cost-cutting pressures are nothing new, and for many there is little blood left in the stone. The question now is: "How and where can we make further reductions without knee-capping the entire operation?" There are plenty of apocryphal tales about organisations axing staff and abandoning efficiency-enabling technology projects, only to discover their actions have mortally wounded deliverables and reputation. The result: a panicked and costly rehiring and/or repurchasing exercise to redress the balance.
Finding the cost/quality balance
Wouldn't it be great if you could work out the exact cost and productivity balance without the cost and disruption of making changes on a trial-and-error basis? Virtual Modelling creates scenarios, based on real, current and accurate data mined from your own ICT operation, that can predict real-world outcomes without impacting current operations. But it can only do this based on available KPI data, and if that data doesn't already exist it must be generated via benchmarking studies. As Lord Kelvin, the 19th-century physicist, once said: 'If you cannot measure it, you cannot improve it.'
Measure it first
All of this information is then used to create ‘what if’ scenarios, typically dealing with areas such as: Cost/Price, Volumes, Staffing, Quality & Service Levels, Service Scope, Complexity, Project Efficiency and Process Maturity. Providing the model has been mapped with well-researched data, the outcomes obtained offer accurate indicators that can be used to make decisions about outsourcing, staffing, process re-engineering, cloud migration or anything else.
In the video below, Paul Michaels, CEO of ImprovIT, discusses how our Virtual Modelling methodology helps decision-making within a project.
Building up the model
Virtual Modelling is designed to pinpoint the impact of one or more project parameters upon all the others. For example: if I change 'Service Quality' (SLAs) and/or 'Service Scope', what effect will this have on 'Cost'? Or: if I reduce 'Complexity', what effect will this have on 'Processes'? It also shows the changing balance of the whole picture when one or more parameters are altered. For example: if I want to increase 'Volumes' or 'Service Quality', what changes do I need to make to all the other segments, and how will this impact the enterprise as a whole?
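A toy sketch of this kind of what-if propagation, in Python. The cost function and its coefficients are invented purely for illustration – a real Virtual Modelling exercise would calibrate such relationships against benchmarked KPI data rather than use a simple formula like this:

```python
# Toy what-if model: cost as a simple function of a few parameters.
# Every coefficient here is an invented illustration, not a real figure.

def annual_cost(staff: int, service_quality: float, complexity: int) -> float:
    """Hypothetical cost model: staff drive the base cost, which is then
    scaled up by the target service quality and by environment complexity."""
    base = staff * 60_000                      # assumed fully loaded cost per head
    return base * service_quality * (1 + 0.1 * complexity)

baseline = annual_cost(staff=50, service_quality=1.0, complexity=3)
scenario = annual_cost(staff=50, service_quality=1.2, complexity=3)  # raise SLAs 20%

print(baseline, scenario)  # raising service quality ripples directly into cost
```

Even in a model this crude, the point of the exercise shows through: changing one segment ('Service Quality') immediately re-prices the whole picture, without touching the live operation.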
Virtual Modelling System
So, to find the Goldilocks balance between IT cost and service quality let’s start by feeding staffing metrics into the simulation model, given the high impact of staffing on cost. But this isn’t just about a straightforward set of numbers: it also has to allow for a range of ‘soft’ factors such as varying levels of knowledge, skill sets and the specialist expertise that can make an individual or team difficult to replace.
Process maturity also impacts the cost/performance balance. There are industry standards which provide best-practice guidelines, such as ITIL (the IT Infrastructure Library), 'Agile' and 'Lean' (a production practice that looks to reduce resource expenditure to the minimum required to deliver value to the customer). Comparisons with these guidelines can indicate where improvements can be made, but Virtual Modelling can determine what they will cost and whether they are worth the disruption to operations. It's also worth noting that achieving process maturity is rarely a quick win: it takes time and requires clear, unequivocal goals and plans led from the top.
Given the chequered history of public sector IT projects, and the challenges so many ICT departments face in deciding whether, when and how to migrate to the cloud, and how to optimise resources on an ever-diminishing budget, using Virtual Modelling to run scenarios on all the available options provides new decision-making tools that help identify the best roadmap ahead while avoiding wrong roads and dead-ends.
Robert Saxby is consulting director, ImprovIT