CW Developer Network


October 11, 2010  11:06 AM

What to expect from Qt Developer Days 2010

Adrian Bridgwater
Cross Platform, Developer, GUI, Qt

It’s hard work coming to Munich and bypassing the city centre, the bierkellers and the Gothic spires of the Marienplatz to head straight for an out-of-town ‘conference hotel’ where the Bavarian flavour is limited to the ‘Oompah-Band-Burger’ on room service. But such is life and such is Qt Developer Days 2010, which I am secretly pleased to attend, as I have been at the last four years’ events and know the team to be straight-talking, developer-focused ‘Trolls’, as they like to call themselves.

The ‘Troll’ reference is a throwback to the days of Trolltech and the company’s initial iteration in its pre-Nokia acquisition form. Speaking to developer evangelists last night, I gathered that Nokia has been mindful of brand and culture awareness with Qt and has allowed the company to retain much of its original identity and approach to cross-platform application framework development – situated as it is in its Oslo headquarters.

So what to expect from the week ahead?


Photo credit: David James Stone

Well, it’s a refreshing start this morning. Rather than kicking off with a corporate keynote, we’re straight into a training day to match the newly announced Nokia Qt ‘Specialist’ certification. There are 50 technical tracks, almost all of which are being presented by Qt’s own developers, making up a total of 62 hours of training. Qt hosts video sessions from last year’s event as well as the new training content it creates throughout the year, and the last twelve months have seen 35,000 hours of e-learning clocked up.

Speaker Line Up:

Up on the podium this week once again is Sebastian Nyström, VP of application service frameworks for Nokia, Qt Development Frameworks – and he’ll be joined by Qt director of R&D Lars Knoll; between them they will ‘tag team’ the roadmap for the next twelve to eighteen months.

Nokia has brought out the big guns for the week ahead and we do get to hear from (and meet) Rich Green, who is senior VP and CTO of Nokia itself. It won’t quite be a press one-on-one session, rather more of a seven-on-one, but it’ll be interesting to hear what the big man has to say one day after the launch of Windows Phone 7.

As well as being used to build the GUI and the application structure for the air traffic control system at Munich airport, Qt is also being used by DreamWorks Animation for a new application and lighting system.

As for more, I’ll keep a few things for other blogs. Suffice to say for now that this year looks completely different to previous years: substantially bigger, with many new attendees showing fresh interest in Qt – and a lot more dedicated developer training.

Here’s a link to some photos from last year that I took myself – they have been illegally hosted on this Chinese website, so feel free to click the link and have a look if you wish.

October 8, 2010  12:12 PM

Developers, could the future of online apps be, well, offline?

Adrian Bridgwater
Developers, map, OFFLINE, online

The Holy Grail of application development is online apps that work over great WiFi connections, right? There’s nothing like being able to download all your favourite data and surf the web, especially on a mobile device, right?

The thing is, national city-wide WiFi zones haven’t quite arrived yet, have they? Also, what about all those times when coverage is hard to get or prohibitively expensive – think hotels and planes as the prime examples.

So what we really need are downloadable apps for, say, iPhone and iPad users that work in full when not connected – yet emulate the full functionality of an app we would normally expect to use only online, right? A street map would be a great example here.

A street map of England perhaps, or even better the whole of the UK – or even better than that, the whole of Europe. As a download, with a one off payment fee, fully searchable and touchscreen enabled.

It’s not as fanciful as it sounds: skobbler has just released ForeverMap Europe for the iPhone and iPad (an Android version is coming soon), using data from the open source OpenStreetMap project.


Berlin-based skobbler has been independently developing navigation software for mobile phone platforms since 2008. Since its UK launch in June 2010, skobbler says it has topped both the free and paid navigation category in the UK Apple App Store for four consecutive months with UK downloads exceeding 200,000 to date.

“It’s something that hasn’t been done before (at least not for an entire continent) and we believe that there’s a real need for people who either travel a lot or who don’t have a mobile data connection in their device (i.e. all iPod and non-3G-iPad users). So we developed ForeverMap from scratch, using our own map compression and routing algorithms,” said Marcus Thielking, co-founder of skobbler. “We want to create an array of ‘lighthouse’ products that show what’s possible based on the OpenStreetMap in all sorts of categories on various mobile device platforms.”

ForeverMap Europe costs £3.49, while TomTom Europe will set you back around £60.

October 7, 2010  12:48 PM

Online dictionary provides new word power for developers

Adrian Bridgwater
API, Developers, Programmer

Although undeniably American in its general influence, the site is attempting to extend its global reach by launching its new API Development Centre to extend its Application Programming Interface (API) to software developers everywhere.


Text-based applications and, to be honest, any application featuring the option for a user to work with and manipulate text, can now be built using the site’s word definition power which also extends to a thesaurus, quotes, an encyclopedia and a translation tool.

There are also etymologies, pronunciations, slang and word of the day.
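As a sketch of how an app might bake that word-definition power in, consider a function that formats an API lookup result for display inside a game or eReader. The response shape here is invented for illustration – the site’s actual API fields may well differ:

```javascript
// Format a word-definition API response for in-app display.
// The { definitions: [...] } response shape is an assumption,
// not the site's documented format.
function formatDefinition(word, apiResponse) {
  if (!apiResponse || !apiResponse.definitions || apiResponse.definitions.length === 0) {
    return 'No definition found for "' + word + '"';
  }
  // show the word with its first definition, as an in-app tooltip might
  return word + ': ' + apiResponse.definitions[0];
}

// e.g. a word game showing a definition without opening a browser:
formatDefinition('ludic', { definitions: ['of or relating to play'] });
// → 'ludic: of or relating to play'
```

The point of baking the lookup into the app, as the post notes, is that the user never has to leave for a browser at all.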

“Our robust API enables developers to leverage the site’s comprehensive offerings to enhance word games, create language learning applications and other word-related apps for online, mobile, eReaders and other connected environments. The API will empower the developer community to deliver more exciting content and experiences to their users,” says the site. The site had previously opened the API to its preferred partners last year. This latest move, should developers wish to adopt the site’s offerings, will allow apps to be built with word-definition power baked in – sidestepping the need for the user to open a browser at all.

The API is offered in a selection of free options for non-commercial apps and paid options, which can be based on a revenue-sharing deal or a license fee.

October 6, 2010  12:32 PM

HTML guru: canvas & CSS3 advances deliver sophisticated animation & drawing power

Adrian Bridgwater
canvas, CSS3, Developers, HTML5, Web development

This is the third guest post by Mat Diss who is founder of bemoko, a British mobile Internet software company that aims to pioneer new ways for web designers to quickly construct better websites that can be delivered across all platforms from desktop to mobile.

Here Mat continues his series of posts that demonstrate the advantages of designing with HTML5. In this blog, Mat looks at the concept of Canvas for creation of graphics…

HTML5 provides a whole new canvas concept which allows you to draw graphics on your web page. CSS3 also brings in some great new features such as transitions, which allow CSS properties to be changed smoothly from one value to another – for example, rotating an image when the user clicks on it.
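Under the hood, a transition simply interpolates a property value over a duration. A rough sketch of that interpolation (linear easing assumed here; browsers actually default to an “ease” timing function) for the rotate-on-click example:

```javascript
// Sketch of the interpolation behind a CSS transition: given start and
// end values for a property (say, rotation in degrees), a duration and
// an elapsed time, return the intermediate value the browser would show.
function transitionValue(from, to, durationMs, elapsedMs) {
  var t = Math.min(Math.max(elapsedMs / durationMs, 0), 1); // clamp progress to [0, 1]
  return from + (to - from) * t;
}

// Halfway through a 400 ms rotation from 0 to 90 degrees:
transitionValue(0, 90, 400, 200); // → 45
```

In real CSS you would just declare `transition: transform 400ms;` and let the browser do this work for you.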

With the combination of both the canvas and CSS3 advances, an HTML developer has a lot more power at their fingertips. This can include drawing of a button (with text generated on the fly), sophisticated animation and interactive gaming. A lot of UIs that were once the realm of platforms like Flash are now accessible via native HTML.

The canvas example draws a few gradient-shaded circles when the user clicks a button. This is done with the following JavaScript and HTML snippet:


<script type="text/javascript">
  function draw() {
    var ctx = document.getElementById('whiteboard').getContext('2d');

    // Create gradients
    var radgrad = ctx.createRadialGradient(45,45,10,52,50,30);
    radgrad.addColorStop(0, '#A7D30C');
    radgrad.addColorStop(0.9, '#019F62');
    radgrad.addColorStop(1, 'rgba(1,159,98,0)');

    var lingrad = ctx.createLinearGradient(0,0,0,150);
    lingrad.addColorStop(0, '#00ABEB');
    lingrad.addColorStop(0.5, '#fff');

    var radgrad4 = ctx.createRadialGradient(0,150,50,0,140,90);
    radgrad4.addColorStop(0, '#F4F201');
    radgrad4.addColorStop(0.8, '#E4C700');
    radgrad4.addColorStop(1, 'rgba(228,199,0,0)');

    // Draw shapes: fill the same square three times,
    // layering the gradients on top of one another
    ctx.fillStyle = lingrad;
    ctx.fillRect(0,0,150,150);
    ctx.fillStyle = radgrad4;
    ctx.fillRect(0,0,150,150);
    ctx.fillStyle = radgrad;
    ctx.fillRect(0,0,150,150);
  }
</script>

<div class="content">
  <div><button type="button" onclick="draw();">Draw Something</button></div>
  <canvas id="whiteboard" width="150" height="150"></canvas>
</div>

Visit the demo site for quite an impressive experience built using the HTML5 canvas.

October 4, 2010  10:43 AM

Pulling together: interoperability and the user at the heart of business computing

Adrian Bridgwater
Applications, Enterprise, Interoperability, Users

This is a guest blog written by Phil Lewis who is a business consulting director with Alpharetta, Georgia headquartered enterprise software company Infor.

Without promoting his own company’s brand in any blatant way, Lewis gives us a relatively impartial view of how interoperability and connecting applications to execute cohesive business processes should always stem from user perception of business needs and real user control.


It is incredible to think that despite over two decades of business computing and the efforts of the multi-billion dollar software industry, companies in 2010 still struggle with the same issues they started tackling when they adopted their first computer:

• Complex integration projects with expensive and unfamiliar technology
• Rigid business processes which are difficult to re-configure
• Proprietary technology creating barriers to interoperability
• Poor visibility of business problems, forcing a reactive approach instead of proactive improvement
• Islands of isolated information and data causing ineffective decision making and complex reporting

Alongside these “traditional” issues we can now add new challenges such as the ongoing debate between on-premise vs. cloud applications, where businesses not only face the need to make a choice, but a new integration issue once they have made their selection.

The key to solving this history of challenges is ensuring that the technology that is implemented is done so in a way that does not prohibit or restrict future options.

For example, connecting applications to execute cohesive business processes, with a lightweight, standards-based, technology architecture, not only pulls together related business activities such as production and stock, but keeps the door to the future open with a set of connectors capable of integrating with third party applications and cloud services.

To do this without drowning in new technology investment, businesses need to align business processes to put the user in control. That is not to say that the technology should be entirely passive – it should help guide users to execute critical tasks, which support overarching company goals.

This eye on the future and end user focus mean business network reporting will become critical, whereby a repository subscribes to messages published by all connected applications. This provides a single data source representing data from all business systems, giving a view of the entire business as opposed to an individual application.
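That reporting repository can be sketched as a minimal publish/subscribe store; the application names and message shapes below are purely illustrative:

```javascript
// Minimal sketch of a reporting repository that receives messages
// published by connected business applications and reports across
// all of them. Names are invented for illustration.
function createRepository() {
  var records = [];
  return {
    // each connected application publishes its events through here
    receive: function (sourceApp, message) {
      records.push({ source: sourceApp, message: message });
    },
    // a report spans every connected system, not one application
    report: function () {
      return records.map(function (r) { return r.source + ': ' + r.message; });
    }
  };
}

var repo = createRepository();
repo.receive('production', 'batch 42 complete');
repo.receive('stock', 'SKU 9 below reorder level');
repo.report(); // one view across both systems
```

The design point is that each application only needs to know how to publish; the whole-business view falls out of the single subscribing store.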

Of course, finding applications that are delivered with this level of interoperability built-in is a whole new challenge…

October 1, 2010  1:35 PM

Explaining “Intelligent Workload Management” in simple terms

Adrian Bridgwater
Compliance, Data, Management, middleware, Virtualisation

This is a guest post by Mark Oldroyd, a senior technology identity and security specialist with Novell.

In this piece Oldroyd discusses the risks and challenges of computing across multiple environments. Despite the hurdles and obstacles ahead he says, companies can achieve their aims if they strike a balance between flexibility and control.


If a business adopts a highly flexible approach by increasing its adoption of virtualisation and allowing employees to work remotely – the corresponding risk to confidential data rises. But try to control application usage and the movement of data too closely – and IT managers risk damaging the competitive advantage of their organisation and its ability to respond quickly to market changes.

According to IDC, a new approach to managing IT environments is required – that approach is Intelligent Workload Management. A workload is an integrated stack of applications, middleware and operating system. These workloads can be made “intelligent” by incorporating security, identity-awareness and demonstrable compliance. Once made intelligent, workloads are able to:

  • Understand security protocols and processing requirements
  • Recognise when they are at capacity
  • Maintain security by ensuring access controls move with the workload between environments
  • Work with existing and new management frameworks

Intelligent workloads offer organisations multiple benefits. They ensure security and compliance; reduce IT labour costs in configuration; improve capital utilisation; reduce provisioning cycle times and reduce end user IT support costs.
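The capabilities in the list above can be sketched in code as a workload that carries its own policy with it between environments. The environment names and policy shape here are invented for illustration, not drawn from any Novell product:

```javascript
// Sketch of an "intelligent" workload: the stack carries its own
// security policy, so access controls move with it between
// environments, and it recognises when it is at capacity.
function createWorkload(name, allowedEnvironments, capacityLimit) {
  var load = 0;
  return {
    name: name,
    // the workload knows where its policy allows it to run...
    canMoveTo: function (environment) {
      return allowedEnvironments.indexOf(environment) !== -1;
    },
    // ...and when it has hit its processing limit
    atCapacity: function () { return load >= capacityLimit; },
    addLoad: function (units) { load += units; }
  };
}

var payroll = createWorkload('payroll', ['on-premise', 'private-cloud'], 100);
payroll.canMoveTo('public-cloud'); // → false: the policy travels with the workload
```

The key idea is simply that the policy is part of the workload itself, rather than living in whichever environment happens to host it.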

Adopting an Intelligent Workload Management approach to IT isn’t just a nice thing to do – it’s an essential one in today’s complex IT landscape. Organisations must deliver seamless computing across any environment, whether physical, virtual or cloud, if they are to successfully achieve a balance between security and control.


September 29, 2010  7:48 AM

AVG launches Internet Security Suite 2011, but what about the Bayesian probability factor?

Adrian Bridgwater
Virus, analysis, Developers, Kernels, labs, Security

So last night I was able to meet and share canapés with AVG CEO J.R. Smith and the company’s CTO Karel Obluk to celebrate the launch of the new AVG 2011 Internet security suite. What started out looking like a back-slapping meet-and-greet in fact turned into a deep-dive technology update and a mathematics lesson in Bayesian probability.

Amidst the crab balls on spoons, miniature vegetarian tacos served by effeminate waiters and the glitzy gloss of a product launch you can sometimes, if you are lucky, get down to the nitty gritty of why a company has gone to market with a new product version. This was my mission…

AVG says its 2011 iteration features enhancements based on feedback from the company’s global community of more than 110 million users and now includes enhanced web and social network protection.

But what kind of feedback is this? What kind of enhancements have been made — and is AVG’s so-called People-Powered Protection technology and approach anything more than marketing puff?

I had a similar problem when I attended the launch of Adobe Photoshop Elements last month. I asked the speaker how the company had re-engineered and re-architected the product to extract and simplify it from the total Creative Suite 5 offering. I got a “we’ll get back to you on that” – and I’m still waiting.

This was not so much the case last night. AVG does seem to take the back end seriously enough to bring its CTO along to product launches, and this guy is a programmer of the old school. Karel Obluk took me through software kernels, re-architecting modules to handle zero-day attacks, how automatic updates are engineered at the back end, how the analysis labs work inside an anti-virus company and where I should look to get the best beer in his home city of Prague.

The concept is simple, or at least it ought to be. If you want developers and programmers to adopt your product and actually use it, then they are going to need to know more than whether or not it comes in a shiny new box.


AVG’s People-Powered Protection appears to be a system whereby users opt in or out of sharing data about the websites (both safe and potentially malicious) that they visit. By aggregating this data and feeding it into analysis engines fuelled by (among other things) Bayesian probability logic – where reasoning can be theorised on the basis of ‘uncertain’ statements – the company says it builds new Internet security power.
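AVG’s actual engines are doubtless far more involved, but the basic Bayesian idea can be sketched in a few lines: given counts of how often a feature (say, a suspicious URL pattern) appears in sites the community reported as safe versus malicious, estimate the probability that a new site showing that feature is malicious. The counts below are made up for illustration:

```javascript
// Naive Bayesian sketch:
// P(malicious | feature) =
//   P(feature | malicious) * P(malicious) / P(feature)
function probMalicious(featureInMalicious, totalMalicious, featureInSafe, totalSafe) {
  var pMal = totalMalicious / (totalMalicious + totalSafe);   // prior from reported sites
  var pSafe = 1 - pMal;
  var pFeatGivenMal = featureInMalicious / totalMalicious;    // likelihoods
  var pFeatGivenSafe = featureInSafe / totalSafe;
  var pFeat = pFeatGivenMal * pMal + pFeatGivenSafe * pSafe;  // total evidence
  return (pFeatGivenMal * pMal) / pFeat;
}

// A feature seen in 90 of 100 reported-malicious sites but only
// 10 of 900 reported-safe ones:
probMalicious(90, 100, 10, 900); // ≈ 0.9 – strong evidence of a bad site
```

This is exactly the sense in which reasoning proceeds from ‘uncertain’ statements: no single report proves anything, but the aggregated evidence shifts the probability.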

Without going into any further analysis of Bayesian probability and the state-of-knowledge concepts of objectivist views versus subjectivist views, there is still a lesson here I believe.

Of course, AVG does have its free to download LinkScanner technology which analyses website content before a user’s browser is directed onward – and it also has its AVG Free model, which, arguably, also brings in a good quotient of faithful converts to the AVG way.

But if you want to sell your product to families and non-techies, then fine – keep your messages pretty much as they are. If you want to seed power users from the top down who will understand why your product is better (if it indeed is), then drop in some Bayesian probability theory and a CTO briefing. Otherwise, those canapés had better be really good.

September 27, 2010  9:23 AM

The “Offline” Web Application

Adrian Bridgwater
APIs, Developers, HTML, OFFLINE, Support

This is the third blog post by Mat Diss, founder of bemoko, a UK mobile Internet software company that is focused on developing new ways for web designers to construct better websites that can be delivered across all platforms.

In this blog, Mat looks at the offline support HTML5 delivers for web applications…

A key benefit of native applications (whether desktop or mobile) is the ability to interact with the application whilst you are offline. The HTML5 offline support allows web applications to achieve this. Google has demonstrated its backing for the HTML5 offline support by announcing that it is no longer supporting Google Gears – an early solution for offline web access – and is backing the HTML5 offline APIs.

So along with strong support from the major browsers, this is an indication that this API will mature and become an essential foundation for the web. Offline storage will be an essential ingredient of any web application that requests information from a user and delivers essential information that a user would want to access anywhere, anytime.

For example, when a user fills in a form or edits some data, it is an important aspect of the user experience that the information entered is not lost. With HTML5, changes can be stored locally in the browser and synced with the main server when a connection is re-established.

Information applications – such as travel guides – also provide much more value if you can access information quickly and reliably, even if you are in the tube, on a plane, or in a foreign country on an expensive data plan.

The example below implements a simple local ‘to do’ list manager:

function storeData() {
  var newDate = new Date();
  var itemId = newDate.getTime(); // creates a unique id from the milliseconds since January 1, 1970
  var todo = document.getElementById('todo').value;
  try {
    localStorage.setItem(itemId, todo);
  } catch (e) {
    alert('Quota exceeded!');
  }
  document.getElementById('todo').value = "";
  getData();
}

function getData() {
  var todoLog = "";
  var logLength = localStorage.length - 1;
  // now we are going to loop through each item in the database
  for (var i = 0; i <= logLength; i++) {
    // let's set up some variables for the key and value
    var itemKey = localStorage.key(i);
    var todo = localStorage.getItem(itemKey);
    // now that we have the item, let's add it as a list item
    todoLog += '<li>' + todo + ' <a href="#" onclick="removeItem(' + itemKey + ');">[delete]</a></li>';
  }
  // if there were no items in the database
  if (todoLog == "") {
    todoLog = '<li>Log Currently Empty</li>';
  }
  document.getElementById('todoList').innerHTML = todoLog;
}

function removeItem(itemKey) {
  localStorage.removeItem(itemKey);
  getData();
}

function clearData() {
  localStorage.clear();
  getData();
}

window.onload = function() {
  getData();
};
Which can be used from the following page:

<h2>I've got to ...</h2>
<input type="text" id="todo" placeholder="Enter something you have to get done">
<button onclick="storeData();">Add</button>
<button onclick="clearData();">Clear all</button>

<h2>My to do list...</h2>
<ul id="todoList">
  <li>Nothing to do</li>
</ul>

Note that once this application has been downloaded by the user it can be used offline (i.e. without a network connection). In a full application you would probably want to persist these changes to a back-end server when the user comes back online, which can be done by hooking into the onstorage event handler.
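One way to handle that back-end persistence is to queue each local edit while offline and flush the queue when the connection returns. A minimal sketch, with the transport function injected so it isn’t tied to any particular browser API:

```javascript
// Sketch of an offline edit queue: changes are held locally while the
// user is offline and flushed to the back end when a connection
// returns. The sendFn is injected, so the transport (XHR, fetch, ...)
// is left open as an assumption.
function createSyncQueue(sendFn) {
  var pending = [];
  return {
    record: function (change) { pending.push(change); },
    pendingCount: function () { return pending.length; },
    // call this from an "online again" handler
    flush: function () {
      var sent = pending;
      pending = [];
      sent.forEach(sendFn);
      return sent.length;
    }
  };
}

var queue = createSyncQueue(function (change) { /* POST change to the server here */ });
queue.record({ id: 1, todo: 'buy milk' });
queue.record({ id: 2, todo: 'write blog' });
queue.flush(); // → 2 changes pushed to the back end
```

In the browser you would wire `flush()` to the online/onstorage events; the queue itself could of course live in localStorage too, so it survives a page reload.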


September 22, 2010  5:17 PM

Michael Dell talks ‘Zettabytes’ at Oracle OpenWorld

Adrian Bridgwater
Data, data-centre, Databases, Dell, Networking, Services, Zettabytes

This enormous IT show started off today with a presentation from Michael Dell. The man himself kicked off with an overview of the partnership that has existed between his company and Oracle since 1995. This week has seen the launch of a new Oracle consulting and implementation practice at Dell, so it’s fairly safe to say that the two companies are snuggling up closer than ever.

Dell went on to detail a comparatively new measure of data in terms of our general awareness, the Zettabyte. A Zettabyte is 1.1 Trillion Gigabytes and studies suggest that by 2020, we’ll all be creating about 20 Zettabytes per year and about one third of that data will pass through a cloud.
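The arithmetic behind those figures can be checked in a few lines. The “1.1 trillion gigabytes” wording assumes binary units (2^70 bytes per zettabyte); in decimal units a zettabyte is an even trillion gigabytes:

```javascript
// Putting the zettabyte figures into perspective.
// Decimal units: 1 ZB = 1e21 bytes, 1 GB = 1e9 bytes.
var ZB = 1e21;
var GB = 1e9;

var gigabytesPerZettabyte = ZB / GB;                  // 1e12 – a trillion gigabytes
var binaryGBperZB = Math.pow(2, 70) / Math.pow(2, 30); // 1099511627776 – the "1.1 trillion" figure

var yearlyBytes = 20 * ZB;          // the 2020 prediction of 20 ZB per year
var throughCloud = yearlyBytes / 3; // roughly a third passing through a cloud
var perDayGB = yearlyBytes / 365 / GB; // ≈ 5.5e10 – some 55 billion GB every day
```

Whichever convention you use, the daily volume alone explains Dell’s interest in next-generation data centres.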

Dell went on to talk about servers, storage, networks and services. He leant towards talking about how Dell works at the application level, detailing several large implementations of Dell hardware and services and bringing out customers including Zynga Games (the people behind FarmVille) and FedEx.

Dell is pointing to ‘next-generation’ data centres that will power the new ways we handle data over the next couple of decades – clearly he wants to make sure that his company’s servers are well positioned to drive these data banks – and it would be hard to argue that his relationship with Oracle will harm that objective.

The newly enhanced Dell Services Oracle Practice is designed to drive cost and complexity out of IT infrastructures. The company says that, “Dell Services also helps companies simplify data center operations with assessment, design and implementation services that focus on Oracle Database, including Real Application Clusters, in a standards-based x86 environment. These services are designed to enhance data availability while lowering the total cost of ownership.”


Given Oracle’s own prowess in database hardware and the newly launched Oracle Exalogic Elastic Cloud, it’s kind of hard to know where Oracle stands on hardware from a corporate policy perspective. On the one hand, Oracle says it buys a lot of Dell kit. On the other hand, the Exalogic box is described as the world’s first and only integrated middleware machine – or to use Oracle’s own words, “A combined hardware and software offering designed to revolutionise data center consolidation.”

Either way, if Dell’s zettabyte predictions are true then we’re going to see a lot of software application developers and database administrators being kept very busy in the months and years ahead.

September 21, 2010  12:14 AM

Larry Ellison: how should we define the cloud?

Adrian Bridgwater
Cloud Computing, middleware

If you’ve been following the news from Oracle OpenWorld and JavaOne 2010 so far, you’ll have seen a whole heap of products unveiled. The company has all but carpet-bombed the newswires with some pretty meaty announcements. There is therefore, I hope, room for some slightly more tangential comments on the look and feel of the show.

Larry Ellison himself used the first keynote (unusually hosted on a Sunday night) of this event to detail his all-singing, all-dancing integrated hardware and software box — the Exalogic Elastic Cloud. The concept behind this product is that it is all the hardware, middleware, processing power and software needed to create a cloud infrastructure.

Under the new tagline ‘Hardware and Software Engineered to work Together’, Oracle is rolling out what it claims to be a total cloud solution – but is it simply, as some commentators have suggested, nothing more than an ‘intelligent server’?

Is this announcement no more than “cloudwashing”, i.e. putting the term “cloud” in front of some otherwise standard piece of kit – or is it true innovation?

Could this really be the cloud in a box – delivered?


Looking at the hardware on offer here, we can see how Larry defines the cloud.

“The whole idea of cloud computing is to have a pool of resources that is shared by lots of different applications within your company – at end-of-month accounts, the resources are directed towards the finance department; when it’s time for a sales drive, a different department gets what it needs and releases the power that it doesn’t need,” said Ellison.

If the Exalogic Elastic Compute Cloud is indeed a cloud in box, then this is what it takes to build one:

  • 30 servers in a box – 360 cores
  • An InfiniBand network, so the servers can all talk to each other (with a lot of bespoke additional work from Oracle)
  • A storage device
  • A VM with two guest operating systems – Solaris for the high end, and Linux
  • A strong middleware component

… and the secret sauce? Well, that’s the “coherence software” that conjoins the memories in all of the servers so that they appear as one complete unit.

OK, so every tech vendor worth its salt has its own spin on what cloud computing really is – and Larry clearly had a riot of a good time running down the efforts of rivals including Red Hat. As he specified Oracle’s definition of the cloud, he conceded that if he ABSOLUTELY HAD to pick another company’s, it is probably Amazon’s EC2 – the Elastic Compute Cloud.

For Larry, the cloud must be a “platform” upon which you build other applications – one that includes both hardware and software, and is all standards-based. The on-demand element is especially important here; so much so that Amazon put the term ELASTIC into the name itself (which Larry actually quite likes) – after all, it was Amazon that really popularised the term.

Is your head still in the clouds? Spare a thought for the attendees that had to sit through a 10 minute gung-ho boys at sea video of Larry winning the America’s Cup in his massive sailing catamaran.

Tonight, all the UK press are going out for a few beers with him while he asks us to tell him what we like most about open source computing and the community contribution model of software application development, before we head on out for burgers.

OK – so that last bit didn’t happen, but all the above did!

