Application security company Veracode is demonstrating to developers how easy it is to test and identify vulnerabilities in their applications by granting free access to one of its services. Veracode’s offerings include automated binary analysis in the cloud, and as of today, developers can register to upload one application to the cloud and test it for cross-site scripting (XSS) vulnerabilities at no cost. XSS, a common exploit in which attackers inject malicious script into a web page or link so that it executes in a victim’s browser, is a veteran problem in application development and has been responsible for major security breaches.
Veracode hopes to demonstrate how avoidable XSS vulnerabilities are while highlighting its application security testing offerings, boasting its ability to serve both SMBs and large organizations. Most development oversights are minor but can have major repercussions, which is why Veracode is doing its part to aid in the “long road to eliminating XSS.” In a recent blog post, Veracode application security researcher Chris Eng likens fixing XSS vulnerabilities to squashing ants, but a simple solution doesn’t make the problem any less serious:
At Veracode, we see thousands — sometimes tens of thousands — of XSS vulnerabilities a week. Many are of the previously described trivial variety that can be fixed with a single line of code. Some of our customers upload a new build the following day; others never do. Motivation is clearly a factor. Think about the XSS vulnerabilities that hit highly visible websites such as Facebook, Twitter, MySpace, and others. Sometimes those companies push XSS fixes to production in a matter of hours! Are their developers really that much better? Of course not. The difference is how seriously the business takes it. When they believe it’s important, you can bet it gets fixed.
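Eng’s point about one-line fixes is worth making concrete. Here’s a minimal Python sketch of the idea (the `render_greeting` function and its inputs are hypothetical, not Veracode’s code): escaping user-controlled output before it reaches the page is often all it takes to close a reflected XSS hole.

```python
import html

def render_greeting(user_input: str) -> str:
    # Unsafe version: interpolating raw user input into HTML enables
    # reflected XSS, e.g. user_input = '<script>alert(1)</script>'
    # return "<p>Hello, %s!</p>" % user_input

    # The one-line fix: escape user-controlled data before rendering it.
    return "<p>Hello, %s!</p>" % html.escape(user_input)

print(render_greeting("<script>alert(1)</script>"))
# prints: <p>Hello, &lt;script&gt;alert(1)&lt;/script&gt;!</p>
```

The browser now displays the attacker’s payload as inert text instead of executing it, which is exactly the kind of trivial fix the blog post describes.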
In a climate that’s teeming with new security threats every hour, a company’s security priority list can be the difference between a close call and a major setback. Proactivity is key. There’s no such thing as a free lunch, but when a company is offering free security testing, it makes reprioritizing not only appealing but affordable. What does your company have at the top of its security priority list this year? Do you anticipate taking application security testing in the cloud for a spin? Let us know in the comments or send me an email at Melanie@ITKnowledgeExchange.com.
It’s been a pretty busy month around IT Knowledge Exchange, and we’ve learned a lot about desktop virtualization. We’ve compiled some of the highlights to serve as your go-to list for some of the top considerations from planning to deployment.
Blogging About Virtual Desktops
- Desktop virtualization is so much more than VDI, so get your questions answered before virtualizing.
- Guest blogger and author Michael Fox stopped by to share his desktop virtualization wish list for 2011. We asked you to share your own for a chance to win his book, DeMystifying the Virtual Desktop; does yours include VDI-in-a-box?
- Jason Tramer takes a break from ranting and reviews XenDesktop 5.
- Heather Clancy and the CDW end-user survey revisit the state of client virtualization.
One Stop Shop: Desktop Virtualization Experts
Twitter is bursting at the seams with virtual desktop experts and enthusiasts. Take advantage of your chance to interact and ask questions. If you’re still stumped, head over to IT Answers, and be sure to tag your question “desktop virtualization.”
Not a fan of Twitter? Don’t worry, we’ve got an impressive list of virtual desktop blogs for you, too.
We’ve always got some good discussions happening in the forums. Here are some of the top discussions and questions from desktop virtualization month:
- Virtual Apps vs. Virtual Desktops: Member Juttej is exploring options for a 600 PC network project. Rechil and KFaganJr shared some of their own experiences with the two options while Pjb0222 gave some questions to consider during planning.
- Open IT Forum: Who is deploying desktop virtualization?: Mortimer1, Batye, Carlosdlg, and Rechil discuss what their companies are exploring as far as virtualized desktops go. It’s a great look into the inner workings and conversations that happen when an enterprise is considering desktop virtualization.
- Which is better: Microsoft App-V or Citrix XenApp?: Mortimer1 gives a good rundown of what situations are ideal for each product.
- Suggestions for desktop virtualization resources: From books to hardware, check out some of the community’s suggestions.
Brian Madden, Gabe Knuth, and SearchVirtualDesktop.com editor Bridget Botelho sit down and discuss their hopes and predictions for desktop virtualization in 2011.
VDI is no small undertaking, so be sure to avoid some of the common VDI deployment problems.
Amazon’s cloud empire floated a little higher yesterday with the announcement that the web giant is adding bulk messaging to its cloud services. From the announcement:
We’re excited to announce the beta release of Amazon Simple Email Service (Amazon SES), a highly scalable and cost-effective bulk and transactional email-sending service for businesses and developers. Amazon SES eliminates the complexity and expense of building an in-house email solution or licensing, installing, and operating a third-party email service. The service integrates with other AWS services, making it easy to send emails from applications being hosted on services such as Amazon EC2. With Amazon SES there is no long-term commitment, minimum spend or negotiation required – businesses can utilize a free usage tier, and after that, enjoy low fees for the number of emails sent plus data transfer.
The horizontal play isn’t particularly surprising. While e-mail is something Amazon has been supporting via EC2 instances, the results aren’t always pretty. The dynamic IPs, for example, often get Amazon-powered e-mail flagged as spam. A dedicated service will help push past these problems, particularly for businesses where e-mail is an important tool but not necessarily the prime competitive advantage, leaving one less thing for your average IT department to puzzle through.
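For developers curious what sending through SES looks like in practice, here’s a rough sketch. It assumes the boto3 Python SDK (a modern AWS client, not the tooling available at the beta launch), configured AWS credentials, and SES-verified sender and recipient addresses (SES requires verification while an account is in sandbox mode); the addresses and subject are illustrative only.

```python
def build_ses_request(sender, recipient, subject, body_text):
    """Build the keyword arguments for an SES send_email call."""
    return {
        "Source": sender,
        "Destination": {"ToAddresses": [recipient]},
        "Message": {
            "Subject": {"Data": subject},
            "Body": {"Text": {"Data": body_text}},
        },
    }

if __name__ == "__main__":
    import boto3  # requires AWS credentials to be configured locally

    ses = boto3.client("ses", region_name="us-east-1")
    params = build_ses_request(
        "sender@example.com", "recipient@example.com",
        "Order confirmation", "Thanks for your order!",
    )
    response = ses.send_email(**params)  # returns a MessageId on success
```

The appeal for an IT shop is in what’s absent: no mail server to run, no IP reputation to nurse, just an API call and a bill for what you send.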
Who says the big guys never make room for the little guys? Whoever they are, IBM is making them think again with its latest offering suited for small- to medium-sized businesses: IBM Virtual Desktop for Smart Business. Using the Virtual Enterprise Remote Desktop Environment (VERDE) software from Virtual Bridges, IBM’s offering provides connection to Windows or Linux desktops for numerous devices. Enterprises have several options with Virtual Desktop for Smart Business: a Windows XP, Windows 7, or Linux (Ubuntu, Red Hat, Novell) operating system, and deployment on existing infrastructure or through an IBM reseller host.
The requirement for devices to access the desktop? If your device is capable of connecting to the server via a browser, you’re in. VDI, IT’s latest buzz acronym, appears to be stumping enterprises big and small. Aimed at companies with 100 to 1,000 end users, IBM’s virtual desktop offering provides VDI, without the complexities SMBs can’t afford to navigate, for just $150 per user, per year. The service is available through IBM’s approximately 100 resellers and integrators around the world.
Not quite as simple as Kaviza’s VDI-in-a-box offering, IBM’s virtual desktop requires configuration, design, and installation. The VDI package runs on an IBM System x server running SUSE Linux and VERDE, with capacity for up to 200 desktops per server.
The offering is exciting for the desktop virtualization market for a couple of reasons.
Virtualization hulk VMware closed out 2010 with a bang, reporting a 37.4 percent increase in revenue, to $835.7M, and a 212.5 percent increase in net income, to $119.9M.
Desktop products such as VMware’s View and the Workstation hypervisor accounted for 10 percent of revenues for the full year in the United States, a possibly deceptive figure in the face of the prevailing hope that 2011 will be a big year for desktop virtualization, particularly virtual desktop infrastructure. VMware’s chief financial officer, Mark Peek, didn’t make any predictions in that arena, but TechTarget’s own virtual desktop experts Brian Madden and Gabe Knuth took a stab at foretelling desktop virtualization trends in the coming year, marking VMware’s Horizon as a major competitor to Citrix’s OpenCloud Access.
VMware as a whole is projecting modest growth in 2011, with between $800M and $820M expected for Q1. Acquisitions were a major part of VMware’s 2010 strategy, with seven in total adding upwards of 600 employees. VMware finished the year with around 9,000 employees worldwide, up from 7,000 at the start of the year.
With pre-installation of vSphere on servers from big name hardware makers such as HP, Dell, and IBM, VMware is in good standing for a strong 2011.
What are your predictions for desktop virtualization in 2011? Do you think VMware will continue to reap the benefits?
Today’s guest post comes from Michael Fox, author of DeMystifying the Virtual Desktop. Michael Fox is a practicing expert working in the field of desktop virtualization. He has worked in IT for more than a decade, holds numerous technical certifications, speaks regularly at technical conferences and currently works as a Senior Architect in virtualization for EMC Consulting, the professional services division of EMC.
My hope for 2011 is that I have fewer meetings introducing people to the basics of desktop virtualization and more meetings about advanced topics in the field. I want to have more meetings on the kinds of desktop virtualization topics that allow us consultants to flex our muscles, bringing the considerable weight of our practice’s talent to solve an organization’s challenges. I want to have CIOs ask me how we can baseline the existing compute experience and then, using desktop virtualization, improve productivity for employee workflows. I want to have VPs considering acquisitions asking me questions about how desktop virtualization can accelerate the time it takes to complete integration.
If everyone read my book, that might happen. I wrote DeMystifying the Virtual Desktop to be a resource that gives decision makers an introduction to the use and application of the technology. It covers the basics of desktop virtualization like putting the user first, communicating across teams, understanding foundational technologies, doing assessments and analyzing designs to help understand costs. It was peer-reviewed and written in a completely agnostic way that does not mention any specific technology or vendor.
According to a number of studies, many organizations have only just begun looking at this powerful technology. The good news for those of you just beginning your virtual desktop journey is that the products and reference architectures are getting better all the time. Density inside the data center is increasing, and the competition between platforms is quite healthy. There were so many improvements in 2010 that they are difficult to count. Protocol improvements enabled better access to virtual desktops hosted inside the data center. A number of new features, both hardware and software, are eliminating the storage challenges of centralized deployments. Lastly, the promise of the client-side hypervisor, with its centralized administration and decentralized execution, is beginning to come true.
Watch for a number of exciting new desktop virtualization developments in the coming quarters. I can’t say anything because of those pesky non-disclosure agreements, but there are a few real gems out there that I am really looking forward to bringing up with customers – the kind of developments that enhance the usability and value of desktop virtualization technologies for both end users and administrators. Density will continue to do what it does best – increase. Application performance will continue to advance significantly; architectures that support sub-second application startups, for example, will become more and more common. There is even progress on desktop certifications: AppSense has two new certification tests coming to a testing center near you in February.
What we all know for certain is that 2011 will continue to be a year of conquering virtual desktop implementation challenges. Enterprises that implemented the technology early on will be considering ways to take advantage of new storage and application virtualization techniques. Companies performing Windows 7 migrations will continue to consider desktop virtualization technologies at the same time. A change from 2010 will be the slew of tablets that are going to be invading our enterprises. This may in fact be a boon to the entire desktop virtualization space. If you think about it, the tablet and the virtual desktop may be a match made in heaven, giving users the ability to access any kind of application, or an entire corporate desktop, from these convenient devices.
For a chance to win a signed copy of Michael Fox’s book, share your own desktop virtualization wishlist for 2011 in the forums!
In the hubbub of new year’s predictions, Security Week expects virtualization to take a step forward in 2011 despite general security concerns:
Many companies have cited security concerns as the main blocker to virtualization and private cloud adoption. Paradoxically, virtual machines can be more secure than the physical servers they replace. Because virtual machines are purpose-built, virtualization security software can offer levels of dynamic and automated security that are unequalled in the physical security realm. As organizations become more familiar with hypervisor-based security and VM Introspection, the apprehension that may have stymied virtualization of critical workloads will be appeased.
Rather than relying on numerous endpoint devices that increase the possibility of a rogue user or machine, aspects of desktop virtualization such as zero clients allow for a major decrease in the threat of human error. As security fears surrounding the technology lessen, IDC analysts predict desktop virtualization market revenue of $1.7 billion by 2011, and Gartner similarly predicts an increase in virtualized desktops from 500,000 to 49 million by 2013. Now that desktop virtualization is becoming a more accepted technology, you should know your options by asking yourself some questions.
A while back, Michael Morisy posed the question: Are iPads desktop virtualization’s greatest threat? When we posed the same question to the Cloud Computing, VMware, Virtualization and Enterprise 2.0 Group on LinkedIn, we got a slew of great responses. Here’s what we learned is going on in the minds of IT decision-makers when it comes to the future of virtualization:
- It’s not necessarily an either/or scenario. Because tablet and mobile computing are still in their infancy, the enterprise should take this opportunity to prepare its mobile strategy now.
- With many vendors offering access to VDI via RDP, VNC, or ICA, these services should be seen as complementary rather than competitive.
- Tablets are more of a threat to PCs, laptops, and Windows OS, but not so much to the practice and development of virtualization technology. On the other hand, without aggressive development for virtualization-friendly hardware, it’ll remain status quo for a while, with the PC industry remaining healthy for the time-being.
- One member of the group with experience in IT management said that the “security concern” is another major reason tablets are complementary to VDI rather than competitive. He posed the question: “Are companies prepared to accept VDI as a replacement for traditional desktop computing (regardless of client access device) and does the value add of VDI equate to real ROI?”
- A VMware technical trainer in the group cares about the application doing the work for him (e.g., email, documents), rather than the device running the app. His ideal endpoint is one that allows access to those applications without worrying about the in-betweens.
One topic that got a lot of attention in the discussion was phone virtualization. Will this emerging technology improve or detract from existing enterprise mobile policies? Perhaps the evolution in the device market will push innovation and customization in the desktop virtualization market as well. Either way, it seems IT pros are looking forward to the changes coming in 2011.
How do you think tablets will affect desktop virtualization? Let us know in the comments section or send me an email.
QR codes are finally coming to America with a caffeinated jolt, thanks to Starbucks’ new mobile payment system that lets you scan and pay for your drinks with an iPhone or BlackBerry pre-loaded with Starbucks Rewards account information.
And with Starbucks’ incredible brand loyalty stats, the program has a huge opportunity for success. As the Seattle Times reports:
One in five Starbucks transactions is now made with the store cards, and mobile payments “will extend the way our customers experience and use their Starbucks Card,” Brady Brewer, vice president of card and brand loyalty, said in a release. “With mobile payment, the Starbucks Card platform further elevates the customer experience by delivering convenience, rewarding loyalty and continuing to build an emotional connection with our customers.”
But as Starbucks paves the way into a brave new world (for the US, at least) of QR payments, I get the sinking feeling that we’re bound to run into a “teachable moment” security lapse very soon.
We received a great response to our call to the community for advice on ensuring that a blizzard doesn’t become a perfect storm for your IT department. New member KFaganJr used the question as an opportunity to document everything his department did to prepare, producing a fantastic guide for others facing a similar problem:
Most of the tasks were a verification that existing systems were fully functional before the storm hit. These included ensuring all off-site backups ran successfully, that a live copy of any important documentation was hosted off-site and up to date, and that temperature sensors and reporting tools were functional and tweaked to allow more time for action due to the extra travel time needed.
Being a small department we were able to just discuss things such as who can be where and do what if a major problem hits, but in a larger organization I would have documented it.
It’s important to walk through the process of redirecting traffic if a location goes down. We have four main sites that usually funnel all traffic through the main office; if one goes down, then that traffic has to be sent elsewhere to keep everything functional. Things to note… how much time do you have to complete the job if need be, is the alternate connection you rely on functioning properly, and are touch services needed to make the switch?
Also, for anyone working with end users: any employees with passwords about to expire were sent additional emails to prevent extra work later; reminders of phone system features that would help users work from home seamlessly were sent out; and users were emailed reminders to speak to IT before the storm hit if they had VPN issues. Additional laptops were available to lend out to critical personnel.
The most important task, in my mind, is making sure everything is running smoothly before the storm, though. You don’t want to be so focused on preparing for disaster that you neglect the critical server that has been crashing, have an issue with that server when touch services aren’t available, and find all your other planning to avoid disaster was in vain because of an unrelated issue.
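KFaganJr’s first item, verifying that off-site backups actually ran, is easy to automate. Here’s a minimal sketch, assuming backups land as files in a single directory; the function name and the 24-hour freshness window are hypothetical, so adjust both to your own backup layout.

```python
import os
import time

def stale_backups(backup_dir, max_age_hours=24.0):
    """Return the backup files in backup_dir that have not been
    refreshed within the freshness window (default: 24 hours)."""
    cutoff = time.time() - max_age_hours * 3600
    stale = []
    for name in sorted(os.listdir(backup_dir)):
        path = os.path.join(backup_dir, name)
        # A file older than the cutoff means that backup job didn't run.
        if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            stale.append(name)
    return stale
```

Run from cron, a check like this turns “I think the backups ran” into an alert you can act on before the snow starts falling.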
Spadasoe suggested that while virtualization could help absorb some effects of traffic spikes if the number of remote users suddenly jumped, “Our fabric is our fabric so we are limited in what resources we can add.” Planning ahead and mapping out worst-case scenarios were the best tips for staying ahead of outages.
MrDenny shared that view, adding that using multiple sites and duplicated data was a worthwhile investment:
Having user data replicated between sites means that when users VPN into another site because the network link at their office is down, they can still access their data. The setup and recurring costs for a little extra bandwidth are minimal compared to the loss of work from one bad 2+ day snow/ice storm.
One commenting wag suggested simply going with the flow, “cutting Internet access to the office? Then blame it on the provider.” While we can’t officially endorse that approach, we can understand the temptation. Any more tips? Leave them in the comments below or, better yet, add them to the community forum!