Will IT shops soon be using System Center and Windows Server to control and update enterprise applications on employee smartphones and tablets? One analyst thinks that could be the case in the near future.
In a discussion of Microsoft’s enterprise software roadmap for 2012 and beyond last week, Rob Helm, research vice president for analyst firm Directions on Microsoft, suggested that mobile device management would be a particular area of growth in the coming years. This should come as no surprise given the proliferation of “Bring Your Own Device” initiatives within the enterprise that have employees using all manner of mobile devices based on various operating systems.
We can already see some evidence of this growth: System Center Configuration Manager 2012, currently a release candidate, builds on Microsoft’s previous device management offerings (including SCCM 2007 and the discontinued Mobile Device Manager). SCCM 2012 supports Android-, iOS-, Windows Phone- and Symbian-based devices (all of which connect via the Exchange ActiveSync protocol) and has a “user-centric” application delivery model that leverages virtual desktop technology to provide access to the same application on multiple devices, whether or not they support it natively.
Helm predicted that this is just the beginning, with System Center eventually being able to update applications directly on devices. Windows Server will also play a role in security. “I think Windows Server in the near future – between now and 2014 – will gain the ability to control encryption of data on mobile devices,” said Helm. This could potentially be done via something like Active Directory Rights Management Services.
Though these updates may ease integration of mobile devices in the enterprise, companies will need to consider how such initiatives affect their bottom line. Companies supporting BYOD may actually be in violation of current Microsoft licensing policies, and in any case may incur substantial per-device fees that make enabling mobile access a costly proposition. “You will have to look more closely at what rights you get for remote access to Microsoft products from mobile devices,” said Helm, advising that per-user licensing (for example, through Office 365) may be the cheapest option for many organizations.
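To see why per-user licensing can come out ahead under BYOD, it helps to run the arithmetic: once each user carries more than one device, per-device fees multiply while a per-user fee stays flat. Here is a back-of-the-envelope sketch; all prices are hypothetical illustration values, not Microsoft's actual licensing terms.

```python
# Hypothetical comparison of per-device vs. per-user licensing under BYOD.
# The fee amounts below are made-up illustration values, not real
# Microsoft pricing.

def per_device_cost(users, devices_per_user, fee_per_device):
    """Total annual cost when every device needs its own license."""
    return users * devices_per_user * fee_per_device

def per_user_cost(users, fee_per_user):
    """Total annual cost when each user is licensed regardless of device count."""
    return users * fee_per_user

users = 500
print(per_device_cost(users, devices_per_user=3, fee_per_device=50))  # 75000
print(per_user_cost(users, fee_per_user=100))                         # 50000
```

With one device per user, per-device licensing is cheaper in this toy model; at three devices per user, the per-user model wins, which is the scenario BYOD tends to produce.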
Do you use System Center Configuration Manager for mobile device management? What would you like to see improve? Let us know in the comments, or via Twitter @WindowsTT.
ReFS, a new file system that Microsoft calls “the foundation of storage on Windows for the next decade or more,” will be coming to Windows Server 8, the company said this week.
According to a Building Windows blog post penned by Surendra Verma, a development manager with Microsoft’s storage and file system team, ReFS (Resilient File System) is built on the foundation of NTFS and maintains a high level of compatibility with it.
ReFS is built for scale and consistent uptime. Verma wrote that the goal with ReFS is to “never take the file system offline. Assume that in the event of corruptions, it is advantageous to isolate the fault while allowing access to the rest of the volume. This is done while salvaging the maximum amount of data possible, all done live.”
But a big part of ReFS is that it also works with the new Storage Spaces system; in fact, the two were developed at the same time.
Storage Spaces, detailed in another Building Windows 8 blog post, is designed to create a pool of different storage devices (hardware and virtual) and ensure protection of the data on that system, similar to a RAID array.
According to the post, ReFS also combats “bit rot” — when accessed bits of a file system become corrupt over time — through disk scrubbing tasks that read and validate data.
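The scrubbing idea is simple to sketch: keep a checksum alongside each block of data at write time, then have a background task periodically re-read every block and flag any whose contents no longer match. The following Python sketch is a conceptual illustration of that loop only, not ReFS's actual on-disk implementation.

```python
import hashlib

# Conceptual sketch of a disk-scrubbing pass: each data block is stored with
# a checksum taken at write time; a background task later re-reads every
# block and flags any whose contents no longer match ("bit rot").
# Illustration only -- not how ReFS is implemented internally.

def checksum(block: bytes) -> str:
    return hashlib.sha256(block).hexdigest()

def write_block(store, key, block):
    store[key] = (block, checksum(block))

def scrub(store):
    """Re-read and validate every block; return keys of corrupted blocks."""
    return [key for key, (block, digest) in store.items()
            if checksum(block) != digest]

store = {}
write_block(store, "a", b"hello")
write_block(store, "b", b"world")
store["b"] = (b"w0rld", store["b"][1])  # simulate silent corruption
print(scrub(store))  # ['b']
```

The point of running this proactively, rather than waiting for a read, is that silent corruption is caught while a good copy (for example, a Storage Spaces mirror) still exists to repair from.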
While Microsoft says ReFS will be production-ready by the time Windows 8 ships, it will only be available on Windows Server 8 right out of the gate. Even then, ReFS won’t be usable as a boot disk; instead, Microsoft says it will take a “conservative” approach and start with ReFS as a Storage Spaces-only option. After that, it will roll out to client versions of Windows for use with Storage Spaces, and finally become a boot option.
There are additional caveats; ReFS isn’t supported on removable media and NTFS data can’t be converted to ReFS data. Plus, ReFS doesn’t natively offer deduplication, but Microsoft reassured users that third-party dedupe software would continue to work as normal.
New screenshots leaked from a pre-beta build of Windows Server 8 show some aesthetic differences from the Developer Preview release.
Build 8180 uses a darker gray theme compared with the sky blue look of the windows in the Developer Preview. In addition, a new login screen has some Metro stylings. WinUnleaked also caught the first glimpse of the new Storage Spaces feature in Windows Server 8 that works with the new filesystem ReFS (Winrumors covered the server-only filesystem back in December).
Many Neowin commenters either trashed the UI design or questioned its utility. After all, does a server operating system really need to look pretty? And, if not, why waste precious resources on something that will have minimal impact? Who’s going to be looking at a GUI, anyway (given Microsoft is hoping to push admins to the revitalized, character-based PowerShell)?
There are some Microsoft defenders out there, though, who rationalized some of the decisions made. Poster “dtboos” said the changes to the server operating system make sense because it points to the “unification of their products, and consistency among them. They need a common element of design, and that’s what they are moving towards. Very quickly I may add. I love it.”
Another poster, Kushan, added “I don’t know why people are getting so upset – the server UI’s in windows have always looked horrible. But who cares? It’s a server, if you have to look at it every day, something isn’t right. Once its set up, you should only have to remote in once in a blue moon.”
Windows Server 8 is still expected to hit sometime later in 2012 with the public beta coming as early as February, according to Tim Schiesser at Neowin.
When it comes to Windows Server 8, small is the new big.
In a recent post on the Microsoft Server and Cloud Platform blog, Windows Server Director of Program Management David Cross (not to be confused with this guy) explained how the latest version of the server OS is all about giving system administrators more control of what components are installed, so that they have “just enough” to do what they need, and not a feature more.
You probably know where this is going: Server Core and PowerShell. As we’ve been stressing for a while now, Microsoft is pushing admins toward the command line, adding over 2,300 cmdlets that enable management of server roles; Cross notes that admins can also install and remove GUI elements directly from the command line.
For those who are a bit apprehensive about losing the GUI altogether (PowerShell does have a great GUI, actually), Microsoft has introduced what it’s calling the Minimal Server Interface, an intermediate state that is somewhere between Server Core and the full Server Graphical Shell. It includes some graphical elements but not the desktop, Windows Explorer, Internet Explorer, or Metro-style application support. It’s not just a safety net for the command-line-phobic, though; it also helps ensure compatibility for applications, such as Microsoft Management Console, that won’t run on Server Core alone, while still offering the reduced footprint and other benefits of that installation. While Server Core is the “preferred solution,” admins will be able to switch between server installations with a single command and reboot in order to test applications.
It should be interesting to see how admins respond to this additional option. Those who have not yet learned PowerShell may feel more confident in the short term, but it’s clear that the overall focus is still on slimming things down and moving to the command line. How does it affect your plans in preparing for Server 8?
In a move that could spell change for Microsoft’s server marketing strategy, longtime Redmond employee Robert Wahbe, corporate vice president of marketing for the Server and Tools group, will be leaving the company in February, according to his official biography (ZDNet’s Mary Jo Foley first reported this news over the weekend).
Wahbe has been at Microsoft since 1996, serving in a number of product development positions. In his current role, he oversees product planning, pricing, packaging, branding and advertising for Windows Server, SQL Server, System Center, Visual Studio, and Forefront.
Wahbe will be replaced by Takeshi Numoto, currently the head of product management for Microsoft Office and its various desktop applications. According to his corporate bio, Numoto joined the company in 1997, and has previously worked in business development for Windows NT, Windows Embedded and Windows Mobile. He played a role in the launch of Office 365, and delivered a keynote at GITEX 2011 in October, addressing common IT admin concerns about the cloud application service (read the full transcript on Microsoft’s site).
A Huffington Post column Numoto wrote in July 2010 offers a few clues as to his product philosophy. Describing his role with the Office team, he wrote, “I take great pride in listening to customers’ desires and expectations regarding the software tools they use to get work done. We receive tons of customer feedback every month, and study thousands of hours of customer videos for every product release.” He also writes somewhat regularly on product features and trends for the Office Exec blog.
The transition occurs on February 15.
Do you think this change represents a shift in direction for Microsoft’s server products? Let us know in the comments, or via Twitter @WindowsTT.
Microsoft’s private cloud plans will soon go public.
On January 17, the company will host a webcast, titled “Transforming IT with Microsoft Private Cloud,” featuring Server and Tools Division president Satya Nadella and Brad Anderson, corporate vice president of Management and Security. The two will discuss private cloud “insights and news,” and Anderson will also host an executive panel and Q&A session regarding best practices for private cloud implementation. The two-hour event (beginning at 8:30 a.m. PST/11:30 a.m. EST) will conclude with a “scenario-based demonstration” from the Microsoft Technology Center in Redmond. Registration is free and open to the public.
Though it’s not expressly mentioned in the event announcement, System Center is likely to be a major part of any Microsoft private cloud discussion. The systems management suite includes several tools for launching and managing private cloud infrastructures in conjunction with Windows Server and Hyper-V. In particular, Operations Manager, Virtual Machine Manager, and App Controller enable private cloud deployment, monitoring and application management. Some, including ZDNet’s Mary Jo Foley, have speculated that this event will mark the release to manufacturing of System Center 2012 products, most of which are currently available in beta or RC versions.
At last May’s TechEd event, Microsoft laid out a roadmap for System Center that had all next-generation products reaching the RTM stage by the second half of 2011, but that didn’t happen. If the company hopes to release the final product in time for the Microsoft Management Summit in April, this would be a good time to hit the next milestone.
Redmond Mag notes that Microsoft may have already offered a sneak preview of what’s in store during a briefing with journalists last month in San Francisco. In a subsequent Q&A, Nadella noted that “hybrid cloud solutions” – a mixture of cloud and on-premises technologies – are “where the most value lies for businesses.” Expect to hear more about how the breadth of Microsoft’s product offerings – both private and public – can be applied to a variety of enterprise scenarios during the webcast.
What do you hope to learn about during this event? Have you implemented a Microsoft private or hybrid cloud? Let us know in the comments, or find us on Twitter @WindowsTT.
When chip-maker AMD released its latest Opteron server chips, based on the Bulldozer architecture, hopes were high that the powerful processor would launch the company into direct competition with Intel. Early reviews were less than laudatory, though; the chips’ poor performance can be traced in part to the fact that current versions of Windows were not configured to take advantage of the Bulldozer’s multi-threaded TurboCore features.
AnandTech offers a visual representation of the problem, which is essentially that the task scheduler spreads threads across multiple modules rather than exploiting Bulldozer’s ability to run two threads within a single module. This means Windows views each dual-core block as a single core, negating Bulldozer’s competitive advantage.
In response, Microsoft announced a manual hotfix download for Windows 7 and Windows Server 2008 R2 last week, which promised to fix the task scheduler issue. That patch has since been withdrawn; Microsoft said it was released prematurely, before a second component was ready to be pushed live. This makes sense, considering that some users reported that the fix actually decreased Bulldozer performance instead of improving it (a 2-7% increase had been touted).
Microsoft is reportedly working on an updated version – and the issue is already addressed in Windows 8 and Windows Server 8. The effort to support Bulldozer shows that the company has not given up on AMD just yet; but it remains to be seen how the rest of the market will respond.
Do you think AMD can compete in the Windows Server chip market – or is Bulldozer dead on arrival, with or without this fix? Tell us in the comments, or on Twitter @WindowsTT.
Windows (Server) 8. Hyper-V 3.0. PowerShell. Cloud.
It’s not too hard to predict what the hot topics will be for Windows admins in 2012 (they have much in common with what we thought would be big in 2011). What is tricky, though, is determining how these new and updated technologies will impact how jobs get done – both on a daily basis and in the long term. That’s when we turn to the experts.
Over the next few weeks, we’ll be talking to our esteemed group of contributors to find out what challenges IT pros will face next year – but we want to hear from you, too. What skills are you learning now? What upgrades do you plan to make? What’s on the horizon that makes you excited…or nervous?
It’s no secret that technology companies will go to great lengths to snag top talent, offering extras that range from free on-site meals to gifts of skinny jeans and bicycles. Last week, CNET reported that social web startup Scopely (no one even knows what the company plans to do yet) has the biggest perk package of them all, with new hires earning a pile of goodies including a year’s supply of Dos Equis, an oil self-portrait, a tuxedo, and $11,000 in cold, bacon-wrapped cash.
Microsoft isn’t about to get left behind in the latest recruitment trend. In an effort to attract engineers for its Kinect for Windows team, the company is peddling a bacon wagon around Seattle’s tech hot spots, offering up free pork strips (topped with chocolate sauce, if you like) and hoping to tempt a few potential development whizzes in the process. It remains to be seen whether the campaign, dubbed “Wake Up and Smell the Future,” attracts more than those who forgot to bring their lunch, but it’s at least helping to get the word out that Microsoft is on the hunt. A more effective effort may be the recently announced Kinect Accelerator program, which will give 10 qualified development teams a $20,000 investment and free office space in which to work with the Kinect SDK for three months. They can probably get some free bacon, too, if they ask nicely.
We have seen it more than a few times before: Microsoft jumping into a new market already forged by many of its enterprise competitors, hoping to deliver lower-cost, alternative solutions. This time, it is Microsoft making a big deal out of big data.
At its PASS Summit this week, Microsoft said it would deliver an implementation of Apache Hadoop for Windows Server and Azure sometime in 2012, with the help of Hortonworks, a Yahoo spinoff. In concert with this plan, the company said it would wire up SQL Server 2012 (“Denali”) to work with Hadoop as well. Hadoop is an open source framework for the distributed processing of large data sets across clusters of computers.
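To make “distributed processing of large data sets” concrete, Hadoop’s canonical example is a word count: a map step emits (word, 1) pairs, Hadoop shuffles the pairs by key across the cluster, and a reduce step sums them per word. The sketch below simulates that map/reduce logic in-process in Python; on a real cluster, Hadoop would run the map step in parallel over input splits on many machines.

```python
from collections import defaultdict

# Minimal sketch of Hadoop's canonical MapReduce example, word count.
# Hadoop runs the map step on many machines in parallel (one per input
# split), shuffles the (word, 1) pairs by key, and runs the reduce step
# per key; here the whole pipeline is simulated in a single process.

def map_step(line):
    for word in line.split():
        yield word.lower(), 1

def reduce_step(pairs):
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

lines = ["big data big deal", "Microsoft and big data"]
pairs = [pair for line in lines for pair in map_step(line)]
print(reduce_step(pairs))  # {'big': 3, 'data': 2, 'deal': 1, 'microsoft': 1, 'and': 1}
```

The appeal for “big data” workloads is that neither step needs the whole data set in one place: maps see one line at a time, and each reducer only sees the pairs for its own keys.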
The moves make sense given Microsoft’s renewed strategic ambitions to go after the upper reaches of corporate IT. Last month the company showed off Windows Server 8, which is chock full of enterprise-class cloud, virtualization, clustering, storage and systems management capabilities that handle dozens of servers as if they were one system.
To me, Windows Server 8 looks like a pretty solid foundation on which to build enterprise-class infrastructure and applications that let Fortune 500 shops do sophisticated analytics on massive amounts of structured and, more importantly, unstructured data. Microsoft, this time, seems intent on lining up the pieces needed to play with big data’s big boys.
Those big boys, IBM, Oracle, EMC, SAP and even Hewlett Packard, have already started delivering individual products and/or solutions to help corporate shops better control and extract meaning from the flood of information contained in social media feeds, e-mails and documents.
These are companies that have long catered to larger IT shops that have in turn sunk billions into buying products and services they rely on to be successful. These shops will be rather reluctant to toss out even smaller technology pieces dedicated to handling big data in favor of technologies from Microsoft, a company many IT shops wouldn’t mention in the same breath as the words “enterprise class.”
But then, this is Microsoft we are talking about: a company that has succeeded in markets where it initially came from way back in the pack, including CRM, SharePoint, Exchange and the Xbox (although not too many enterprise accounts care about the latter – not during office hours, anyway).
It’s long established that it takes Microsoft three tries to succeed with many of the core products that flourish today. I am not sure Microsoft has the luxury of failing a couple of times before it can break into the big data market, but that depends on how effectively the other big data players can maintain their technology lead.
Another issue, particularly among Microsoft’s loyal users in smaller shops: why spend so much sweat and treasure pursuing yet-to-be-realized opportunities in new, higher-end markets when the company could be better serving core SMB customers with more effective solutions?
That’s a legitimate question, and one Microsoft should answer. Do you think Microsoft should be aggressively pursuing big data opportunities, or should it remain focused on better serving the needs of smaller enterprises? Let me know: firstname.lastname@example.org.