Enterprise Linux Log


December 5, 2008  3:45 PM

SUSE Linux is growth engine for Novell



Posted by: Dkr
Linux, Linux blogs and news, Open source applications, SUSE/Novell, TechTarget Blogs

Linux is alive and well at Novell Inc.

In its year-end fiscal report yesterday, the Waltham, Mass.-based company said revenue from open source products, primarily Linux, rose $34.8 million, or 37%, to $128.8 million in fiscal 2008. Fourth-quarter open source revenue rose just a whisker less, 36.1%, to $35.7 million.

The increases far outstripped Novell’s other three product divisions, which are identity and security management, systems and resource management, and its workgroup division.

For the 2008 fiscal year, Novell’s total revenues were $957 million, compared with $932 million in 2007, and its net annual loss was $9 million in 2008, compared with a net loss of $44 million the previous year.

Thanks to aggressive pricing and key partnerships with companies such as Microsoft and SAP, the company’s Linux growth rates far exceeded the 22% increase in the overall Linux market, said Novell CEO Ron Hovsepian. SUSE Linux Enterprise, which grew its market share by an additional 3% in 2007, added 3,000 new Linux customers this year, including many large enterprises, he said.

Hovsepian added that Microsoft has sold $195 million of the $240 million in SUSE Linux certificates it bought as part of the 2006 pact between the two companies and has purchased an additional $25 million so far this year. In 2008, Novell also significantly increased its independent software vendor agreements, he added.

Discussing the company as a whole, Hovsepian said Novell has made great progress over the past two years, achieved all its milestones for 2008 and, in turn, stabilized and strengthened the company. While acknowledging that the current uncertain economic climate doesn’t lend itself to detailed forecasts, Hovsepian said Novell will continue to strive for operational improvement and long-term profitability.

December 1, 2008  8:07 PM

Red Hat donates holiday party money to feed the poor



Posted by: Dkr
Linux, Linux blogs and news, Open source applications, Red Hat, TechTarget Blogs

Despite a year of enviable revenue growth, Red Hat Inc. will not be throwing its customary all-out holiday bash this year. Due to concerns about ostentatious spending during an economic downturn, the Raleigh, N.C.-based software company decided to host a low-key office gathering instead of the lavish affair originally scheduled at the Raleigh Convention Center. As part of North Carolina’s high-tech Research Triangle, Raleigh appears to be faring better than many regions, but, according to news accounts, the city still has residents who are hurting financially due to the national fiscal turmoil.

Therefore, Red Hat decided to donate the money it would have spent on its annual party to charity. Following the results of an employee poll, Red Hat will send the funds to Chicago, Ill.-based Feeding America, a national organization that funds more than 200 food banks, including several in North Carolina.

Although Red Hat declined to reveal the amount of its donation, the funds are enough to pay for 800,000 meals through Feeding America, or 1 million pounds of food and grocery products, a Red Hat spokeswoman said.

In addition, Red Hat offices in Raleigh as well as Westford, Mass., are organizing canned food drives for the hungry; Raleigh executives have pledged a donation for every 500 cans collected at the company’s headquarters.

“We just felt it was the wrong time to be spending a lot of money on ourselves,” DeLisa Alexander, Red Hat’s senior vice president for people and brand, told the Raleigh News & Observer. “I don’t see us going back [to big parties]. People want to work for a company that is socially responsible.”


November 25, 2008  11:54 PM

Novell hurts itself with Red Hat swipe



Posted by: Dkr
Linux, Linux blogs and news, Open source applications, Red Hat, SUSE/Novell, TechTarget Blogs

In an obvious swipe at its larger rival, Waltham, Mass.-based Novell Inc. crowed this week that SUSE Linux Enterprise has surpassed all other Linux distros with 2,500 vendor-certified applications running on top of its operating system. If true, this would be quite astonishing since Raleigh, N.C.-based Red Hat Inc. has a far larger Linux market share.

When I queried Novell’s PR agency, Andover, Mass.-based Pan Communications Inc., it responded with a link to the Red Hat Software Catalog, which lists 2,166 certified software vendors, not applications, so Novell isn’t comparing apples to apples.

A Red Hat spokeswoman said, however, that there are 3,400 vendor applications certified to run on Red Hat Enterprise Linux.

This is silliness. I feel like a mom telling two toddlers to go back to their sandboxes. (Red Hat is not really at fault here since it didn’t rush forth with a counterpunch.) I’m probably more fascinated by the competition between these two companies and their strategies than anyone else I know. But this little self-promotion was a waste of time. And Novell, with its carefully worded misleading statements, has a lot to lose: its credibility. Let’s declare the “I’m bigger than you are” nonsense over. Let’s all get back to work.


November 18, 2008  7:32 PM

LAMP stack story overlooks impact of cloud, reader says



Posted by: Dkr
Apache, Cloud computing, Development, Java, Linux, Linux blogs and news, Open source applications, scripts, TechTarget Blogs

My recent story on the dimming of the LAMP stack sparked a thoughtful reader response from John Locke, the manager of Seattle-based Freelock Computing. The story concluded that while an all-open source stack is still a valid concept, there are now so many open source options that LAMP (Linux, Apache, MySQL and Perl, Python or PHP) is largely irrelevant as a defined stack. I made a single exception for Apache, the popular Web server.

Locke argued, however, that even Apache has a growing array of alternatives, such as the Lighttpd Web server, the FastCGI Web interface, the Nginx proxy server and others.

But what undercuts the LAMP stack more than the advent of additional open source options is the emergence of cloud frameworks, Locke said.

Initially, cloud computing meant renting compute power on demand from the likes of Amazon Elastic Compute Cloud (EC2). This meant renting a host virtual machine, programming the top layer, adding libraries and then, when it was all done, managing both the host and the virtual application, Locke said.

The problem with this model is that the customer’s data center staff is still responsible for scaling the application up or down in response to changing volume requirements, he said. To solve this problem, Google, as well as Microsoft with its recently announced Azure platform, goes beyond computing on demand and manages the entire process with frameworks. All you do is write the application code (yes, you still need the P in LAMP) and put it atop an application framework, and the framework will scale the application up and down as needed. No further involvement required. No LAMP stack required, either.
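
To make that concrete, here is a minimal sketch of what “just write the application code” looked like on Google App Engine, circa 2008, using its Python webapp framework. This is my illustration of the model, not something from Locke’s note; the handler name and response text are invented.

    # A complete App Engine application (Python webapp framework, ca. 2008).
    # There is no Apache to configure, no MySQL to provision and no host VM
    # to manage; the framework routes requests and scales instances for you.
    from google.appengine.ext import webapp
    from google.appengine.ext.webapp.util import run_wsgi_app

    class MainPage(webapp.RequestHandler):
        def get(self):
            self.response.headers['Content-Type'] = 'text/plain'
            self.response.out.write('Hello from the cloud framework')

    application = webapp.WSGIApplication([('/', MainPage)], debug=True)

    def main():
        run_wsgi_app(application)

    if __name__ == '__main__':
        main()

Deploying that one file (plus a short configuration file) is essentially the whole job; provisioning, load balancing and scaling happen behind the curtain.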

Two successful examples of cloud frameworks are Salesforce.com and Facebook, he said.

The downside of frameworks, however, is loss of control and potential vendor lock-in, Locke said. The risk is lower with Amazon EC2 since its controls are far more limited, he said. When writing an application for a specific vendor’s framework, however, a customer can lose portability because the provisioning and scaling mechanisms are behind the scenes and the source code and licensing are not necessarily readily available, he said.

The biggest challenge to LAMP, as well as to the Java and .NET stacks, therefore, is not the growth of additional choices but the cloud frameworks, which may make all the stacks irrelevant. While handing over management and control is convenient, it also has its downside: you have to live by someone else’s rules, Locke said. Just like in a condo or regulated housing community, you’ve delegated the work, but you’ve also given up some freedom. Time will tell if you’ve made a good bet.


November 18, 2008  3:58 PM

Red Hat scoops up accolades



Posted by: Dkr
Linux, Linux blogs and news, Open source applications, Red Hat

Recently, Red Hat Inc. has scooped up some impressive awards. Last Saturday, Matthew Szulik, the chairman of the Raleigh, N.C.-based company, was named the Ernst & Young Entrepreneur of the Year national winner in technology for growing the open source company into a successful business.

In presenting the award, James S. Turley, the chairman and CEO of Ernst & Young, said that “Szulik follows a proud tradition of pioneering entrepreneurs who overcame skeptics and brought a novel, seemingly improbable business idea to market successfully.”

Joining Red Hat as president and CEO in 1998, five years after the company was founded, Szulik headed day-to-day operations during almost a decade of expansion. He was succeeded as president and CEO nearly a year ago by Jim Whitehurst and now serves as chairman.

Three days after Szulik received his award, Red Hat itself was named public company of the year by the North Carolina Technology Association. The award was given on the basis of sustained company growth, solid operating results and stock price appreciation.

In these turbulent times, accolades are hard to come by.


November 12, 2008  10:48 PM

Don’t want to go to the cloud? Cassatt says to build your own



Posted by: Leah Rosin
Administration, interoperability and integration, Cloud computing, Data center physical infrastructure, Hardware issues

You may be one of those data center administrators who have heard the buzz about the cloud and decided that it’s just hype. But the advantages of the cloud infrastructure are clear:

  • Extremely low operating costs (as low as one-tenth of traditional IT costs)
  • Extremely high energy efficiency
  • Extremely low levels of complexity
  • High economies-of-scale
  • Metered billing (based on use); transparent costs
  • Other benefits of large data centers without owning capital

There is also a bundle of fears and disadvantages that is enough to keep many from jumping on board. Here are a few of Cassatt’s observations on what these are:

  • Security: Infrastructure sits outside the enterprise’s walls
  • Service levels: Nascent model; agreements and levels unproven
  • Performance
  • Auditability and logging/traceability issues
  • Potential for cloud platform architecture “lock in”
  • Does not lower existing cost of installed capital or operations

In preparation for Cassatt’s internal cloud release this week, I talked to Ken Oestreich, director of product strategy, Steve Oberlin, chief scientist, and Jay Fry, VP of marketing at Cassatt. Oestreich explained how a product that the company had been developing for five years (initially referred to as its “utility computing product”) aligns with the cloud computing model. Essentially, Cassatt’s Active Response 5.2 allows you to turn your entire data center into a “cloud,” but without the disadvantages above.

“People who haven’t outsourced because of regulatory or security [concerns] are not going to change,” said Oestreich. “They’re not going to a cloud.”

Seeing this opportunity, and realizing that many CIOs would love to take advantage of the efficiencies of the cloud but can’t afford the risk, Cassatt built a product that lets users get pretty close. Additionally, the “internal cloud” model has some advantages of its own, including no platform-dependency issues and no “lock in” to an external cloud provider. Active Response 5.2 provides multi-platform support for Linux, Solaris, Windows and AIX; virtual machine support for VMware, Citrix (Xen) and Microsoft Hyper-V as adoption warrants; and networking support for Cisco, F5 and Force10.

In addition to Active Response 5.2, Cassatt has introduced its Active Profiling Service.

“Before you can embark on creating an internal cloud and merging application groups into pools that can share these resources, you have to know what you’ve got,” said Oberlin. “That’s what enables you to create a management strategy.”

Some consolidation planning software exists on the market, but Cassatt’s team thinks that it misses the mark and doesn’t provide users with all of the information they need. Cassatt points out that existing inventory tools don’t look at usage patterns, application dependencies or workload dynamics, and consolidation tools don’t consider workload management and server repurposing.

“A lot of companies today buy consolidation planning software if they’re doing virtualization. What this software doesn’t do is what all of our customers ask us about — provide a picture of the dynamics of the data center,” explained Oestreich. In order to manage a virtualized environment, it’s helpful to have an idea of “… which apps are quiescent and when, where the orphan servers are, where is virtualization appropriate and not, where is power management appropriate, and where is the internal cloud computing appropriate and not.”

Oberlin shared that the actual setup and implementation of the software can be rather rapid (a day or two). “The longest period of time is recording performance and utilization data to capture a reasonable business cycle to get a decent utilization profile of the applications over time,” said Oberlin.

The team envisions its internal cloud offering as something that users can gradually work into. I imagine dipping a toe in and then easing into the hot tub and relaxing while the data center is efficiently managed. Cassatt lays out the progression in seven steps:

  1. Analyze infrastructure and opportunities using Active Profiling Service
  2. Get started using policy management
  3. Take advantage of the power-management infrastructure and achieve increased energy efficiency
  4. Manage virtualization across multiple vendors, simplifying and automating virtual infrastructure
  5. Implement application availability across platforms and achieve greater operational efficiencies
  6. Implement resource repurposing across physical and virtual platforms and achieve greater capital efficiencies
  7. Meter infrastructure use, regardless of physical or virtual

In a time of budget cuts and reduced staffing in the data center, there’s no question that improved efficiency in physical and virtual machine management is beneficial. This type of move can help any data center prepare for future increased utility costs and trim down on new equipment provisioning. And who knows, maybe one day you’ll consider joining a community cloud.


November 10, 2008  9:43 PM

Microsoft’s embrace of open source could signal turnaround



Posted by: Dkr
Linux, Linux blogs and news, Linux versus Windows, Microsoft Windows, Open source applications, open standards, TechTarget Blogs

Microsoft used last week’s ApacheCon as a platform to reach out to the open source community in a public way.

In his keynote last Friday, Sam Ramji, Microsoft’s senior director of platform strategy, told the Apache faithful that Microsoft is serious about partnering with the open source community to create open standards and interoperability. Collaboration on the fundamentals, he said, will promote healthy growth, competition and a new round of innovation, and will let customers allocate IT dollars to constructive uses instead of to overcoming infrastructure bottlenecks.

The latest example of Microsoft’s collaboration with its open source counterparts was its decision to join the Advanced Message Queuing Protocol (AMQP) Working Group, which aims to improve message interoperability at the application level, something that is currently very difficult without expensive proprietary solutions. But Microsoft is also boosting interoperability with open source in Web services, security, databases and network monitoring, Ramji said. This past spring, Microsoft reached an interoperability “milestone” between SOAP and Apache’s Axis Web services protocols, he added.
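
For readers who haven’t met AMQP, the idea is broker-mediated messaging that works the same across vendors and languages. As a rough illustration (not drawn from Ramji’s talk), here is how a client might publish a message to an AMQP broker using the open source Python library pika; the broker host and queue name are hypothetical.

    # Illustrative sketch: publish one message to an AMQP broker with the
    # open source pika client. Host and queue name are made up.
    import pika

    connection = pika.BlockingConnection(
        pika.ConnectionParameters(host='broker.example.com'))
    channel = connection.channel()
    channel.queue_declare(queue='orders')      # declare is idempotent
    channel.basic_publish(exchange='',         # default direct exchange
                          routing_key='orders',
                          body='{"order_id": 42}')
    connection.close()

Because the wire protocol is standardized, a consumer written in another language against another vendor’s broker could, in principle, pick that message up, which is exactly the interoperability the working group is after.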

“I’m an eternal optimist. I’d have to be after three years of leading open source at Microsoft,” Ramji said. “There’s been a big shift in a short period of time,” involving hundreds of steps in a company with 93,000 employees, he said.

Ramji’s embrace of open source was echoed, if somewhat less strongly, in a speech last Friday in Australia by Microsoft CEO Steve Ballmer and reported in a CNET blog by Matt Asay. In response to a question about its Internet Explorer browser, Ballmer said Microsoft is unlikely to make Explorer open source because of its proprietary extensions, but he didn’t reject the suggestion out of hand. The measured tone of Ballmer’s response, Asay wrote, “could well be the most rational, pragmatic, open-source-related comment from Ballmer that I’ve ever read.”

Ballmer’s comment suggests that Microsoft has finally recognized that open source can be a useful component of its overall software strategy, Asay concluded.

In other words, Microsoft may finally have decided to stop fighting open source and instead begin to find areas where the two communities can help each other. And that’s a good thing.


November 6, 2008  4:49 PM

Free the Penguin virtualization desktop giveaway offer extended



Posted by: Dkr
Linux

Due to popular demand, Waltham, Mass.-based Novell Inc. and two Canadian firms have extended their Free the Penguin desktop virtualization giveaway program that organizers hope will promote Linux in schools. Originally scheduled to end Nov. 30, the offer has been extended to Dec. 31.

Alberta, Canada-based Userful Corp. and its systems integrator, Omni Technology Solutions Inc. of Edmonton, Canada, will continue to give any nonprofit school or university up to three Linux-based Multiplier virtualization systems for free. Each Multiplier system subdivides the hard drive of a single PC, enabling up to 10 students to use individual workspaces on the same computer simply by connecting additional keyboards and monitors to it.

Novell, too, will again give away free SUSE Linux Enterprise Desktop licenses to the first 30 respondents, as it did when the program was launched several months ago.

Each lab, which runs on any flavor of Linux, uses slightly more energy than a single PC and avoids the discharge of tons of carbon emissions into the atmosphere.

The only limitation of the Multiplier system is distance. The keyboards and monitors must be within 21 feet of the PC to avoid signal degradation over the cable.

To date, Novell and the Canadian firms are pleased with the campaign, which Userful says has resulted in the distribution of more than 900 licenses to 50 customers since the special offer began.

Grant Ho, Novell’s senior product marketing manager for SUSE Linux Enterprise Server (SLES), welcomed the giveaway extension, which is a boost to SLES’ modest but growing adoption in U.S. and European school systems.

“SLES is helping schools save dramatically on costs while gaining a user-friendly desktop experience,” Ho said.

The giveaway program does not include hardware. Additional information is available on the Omni website.


November 6, 2008  12:33 PM

Red Hat debuts Fedora 10 preview



Posted by: Dkr
Fedora Linux, Linux, Linux blogs and news, Red Hat

A preview of Fedora 10 is now available for download from the Fedora website, with improved virtualization, package management and policy controls, and faster startups.

The latest version, which we previewed on SearchEnterpriseLinux.com nearly two months ago in “Fedora foreshadows Red Hat Enterprise Linux,” introduces new features that Raleigh, N.C.-based Red Hat Inc. will further test and refine and then ultimately add to the next version of Red Hat Enterprise Linux, due out in 2010.

These include the ability to perform virtual machine installs, provisioning and storage management, all remotely, via libvirt. Libvirt is a Red Hat-initiated virtualization interface that smooths over the differences among hypervisors, including Xen and KVM, so that the same commands work regardless of which one is underneath.
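
As a small sketch of what that uniformity buys an administrator (the host name below is hypothetical, and this is my example rather than anything Red Hat has published for Fedora 10), libvirt’s Python bindings can enumerate guests on a remote machine without caring whether Xen or KVM is running there:

    # Sketch: list guests on a remote host via libvirt's Python bindings.
    # The same calls work for Xen, KVM and other supported hypervisors.
    import libvirt

    conn = libvirt.openReadOnly('qemu+ssh://host.example.com/system')
    for dom_id in conn.listDomainsID():        # IDs of running guests
        dom = conn.lookupByID(dom_id)
        print(dom.name(), dom.info())          # name plus state/memory/CPU
    for name in conn.listDefinedDomains():     # defined but not running
        print(name, '(inactive)')
    conn.close()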

Fedora 10 also promises faster startup times, courtesy of the new Plymouth graphical boot system, and adds the ability to set up ad hoc Wi-Fi networks to share an Internet connection among multiple computers. In addition, Fedora 10 simplifies software package management with PackageKit, which bundles together all the components required for a software install, and improves security with PolicyKit, which refines user authentication privileges. Fedora 10 also includes a First Aid Kit for detecting and repairing problems automatically, along with numerous other features.

The final version of Fedora 10 will be released later this month.



October 29, 2008  8:31 PM

Centrify streamlines administrator tasks in mixed environments



Posted by: Suzanne Wheeler
Administration, interoperability and integration, authentication, Data center physical infrastructure, Enterprise applications for Linux, HP, Linux, Microsoft Windows, Security

On Oct. 21, Mountain View, Calif.-based Centrify Corp. added DirectAuthorize to its suite of products for integrating Active Directory into mixed Linux and Windows environments. DirectAuthorize streamlines user access rights management so that administrators no longer have to configure rights separately on Windows servers and then again on non-Windows servers. By consolidating the information in a central location, DirectAuthorize eliminates that redundant work.

DirectAuthorize arrives as the third member of a line of products created to ease the task of managing mixed environments with Active Directory. The other two products, DirectControl and DirectAudit, perform centralized authentication and auditing.  

“Typically we serve customers who are looking to introduce Linux, Hewlett-Packard, AIX, or Unix into their environments, and also often VMware,” Centrify CEO Tom Kemp said. “In terms of access rights and password management, that ends up being a lot of sticky notes next to your screen.” DirectAuthorize replaces non-Windows systems’ authorization infrastructure with that of Active Directory, which allows admins to move all user authorization information to a central location and manage it from there.
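
Centrify’s own interfaces are proprietary, but the underlying idea, treating Active Directory as the single directory that Linux and Unix machines consult, can be sketched with the open source python-ldap module. Everything below (domain controller, service account, search base, user) is hypothetical, and it illustrates the general approach rather than Centrify’s implementation.

    # Hypothetical sketch: look up a user's group memberships in Active
    # Directory over LDAP, the kind of centralized query a non-Windows
    # box can make once AD is the single authorization source.
    import ldap

    conn = ldap.initialize('ldap://dc.example.com')
    conn.simple_bind_s('svc_lookup@example.com', 'secret')   # service account
    results = conn.search_s(
        'dc=example,dc=com', ldap.SCOPE_SUBTREE,
        '(sAMAccountName=jdoe)', ['memberOf'])   # groups that grant rights
    for dn, attrs in results:
        print(dn, attrs.get('memberOf', []))
    conn.unbind_s()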

