This is a guest post from Ivan Horrocks, senior lecturer in technology management at The Open University
It’s a fairly common situation in IT: You’ve spent the last ten years becoming an expert in your field, impressing everyone along the way, and now your boss tells you they want you to go down the management route.
To some this is welcome news. They’ve done their time doing the hands-on work, and their reward is a move towards senior management. But to others, nothing is more depressing than abandoning the technology they know and love to sit in board meetings and negotiate deals.
Most people want to feel a sense of progression in their careers, but reaching senior management doesn’t have to mean changing tack into business management. Although relatively unrecognised, there is an important difference between mainstream business management and technology management. For those wanting to progress their careers while staying in touch with the world of technology, the latter may well be the preferable route.
Ultimately technology management focuses on the relationship between the management of technology and innovation, and how these relate to other areas of management, including operations, finance, supply chain and logistics and strategy.
This means looking at how existing technologies can work together to enhance an organisation’s processes and products or services. It requires a broad perspective on what’s available and an understanding of what works best, when, where and why. Technology managers appreciate how technology can be integrated; how the skills they and their team have can be employed to improve business operation and deliver value, both internally and externally, and where and when they need to bring new skills in to do so.
Technology managers are also technology strategists: always looking at ways to foster technological innovation. But implementation and application are also key concerns: giving the teams you work with the time and space to experiment with different technologies, explore how they might work best, and develop or customise these for the benefit of your organisation and its customers.
As a strategist you will also need to guide this experimentation, create room for it to fail, and find ways to spot and develop what works. And you will need to model the consequences of integrating technology – understanding the impact it will have on staff and customers and ensuring there is appropriate training and support to make the new ideas a success.
What does success look like?
It is now widely accepted that IT no longer just provides the infrastructure for people to do business. We hear ever more about how strategic use of IT has transformed organisations and people’s lives in exciting ways.
Increased connectivity has created new delivery models – from on demand video to legal and financial services delivered online – disrupting long established industries.
Sensors in wearables are transforming industries like healthcare, where clinicians can monitor for warning signs without keeping people in hospital beds, saving money and freeing up time for other patients.
Harnessing data is allowing better targeting and streamlining. Transport operators – running vehicles, fleets, trains and planes – collect huge amounts of sensor data, which they are now using to model fuel use, reduce delays and predict problems.
These are all examples of the transformational uses of technology, particularly in business. They would all have been developed with the support of senior management and driven by the organisation’s strategy, while requiring a detailed knowledge of technology development, selection, customisation and integration.
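The “predict problems” use of sensor data mentioned above can be sketched in a few lines. This is a minimal illustrative example only – the fuel-flow readings, window size and threshold are hypothetical, and a real fleet system would work with far richer data:

```python
# Illustrative sketch: flagging anomalous sensor readings against a
# rolling average, so a fault can be investigated before it causes delay.
from statistics import mean, stdev

def flag_anomalies(readings, window=5, z_threshold=3.0):
    """Return indices of readings that deviate sharply from the recent average."""
    anomalies = []
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        if sigma > 0 and abs(readings[i] - mu) / sigma > z_threshold:
            anomalies.append(i)
    return anomalies

# Simulated fuel-flow values with one suspicious spike at index 8
fuel_flow = [10.1, 10.0, 10.2, 9.9, 10.1, 10.0, 10.2, 10.1, 25.0, 10.0]
print(flag_anomalies(fuel_flow))  # → [8]
```

In practice the threshold and window would be tuned per sensor, but the principle – compare each new reading with recent history and escalate the outliers – is the same one fleet operators apply at scale.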
Becoming a technology manager
Technology has the capability to make a significant contribution to organisational performance, economic growth and social well-being. As a result, there is increasing demand across the public and private sectors for people with both the operational and strategic capability to plan, develop and manage technology and technological innovation.
To pursue this route, you will need to build on your technical expertise by developing the knowledge and skills to make the right decisions about technology acquisition, exploitation, implementation and innovation. These skills are rarely taught in undergraduate or vocational IT and computing courses, but they build naturally on the education and experience of someone whose career has primarily focused on developing and delivering successful IT projects.
None of this means you shouldn’t go the pure management route into senior management, if that’s your preference. Technology specialists develop many core business skills and can make a huge contribution to a more general management role. It’s just that they don’t have to – now more than ever there is a demand for managers who have a deep knowledge and experience of technology. General management principles are necessary to progress, of course, but this can mean building on a specialism, not abandoning it.
Dr Ivan Horrocks is qualification director for technology management (TM) at The Open University. The TM programme is aimed at helping technology professionals and their organisations advance by using technology strategically to deliver innovation and drive the business forward.
This is a guest post from Evanna Kearins, director of analytics, TIBCO Analytics
Harvard Business Review called data science the “sexiest” job of the 21st century, yet a survey conducted by the CBI warned that 39 percent of companies cannot find staff with the required skills and knowledge in science, technology, engineering and maths.
With the number of GCSE students studying IT-related subjects increasing, it is surprising to hear that student numbers for A-levels in computing and ICT have decreased. But while ICT is a subject that helps to develop the skills data explorers need, other core subjects like Maths, English, History, Science and Business Studies are also indirectly helping to develop the next wave of data scientists and data explorers. The following core subjects being taught in schools are indirectly teaching students the skills they need to help fill the current skills gap:
ICT – The next generation is growing increasingly comfortable using an array of devices and platforms. With the Internet of Things connecting devices, the volumes of data available are growing exponentially. In teaching children how to use platforms and understand the nuances of each, schools are also teaching fundamental skills needed by any data explorer.
Maths is another subject that can help students begin their career as a data scientist or explorer. Pupils are taught how to read charts and gain a better understanding of comparing different patterns. These skills are needed not only in the classroom, but by data explorers making valuable business decisions as they seek to understand the data flowing through the organisation.
English – Being able to crunch numbers is important for a data explorer, but so too is having the ability to communicate the true value of this to the other board members. Communication is key in all walks of life so being taught English helps to translate numbers and data into meaningful insights that can transform the business. This is also crucial to bridge the gap between the business and IT functions.
History – Having an understanding of the past and how this will impact the future is also an important aspect. Just as students must analyse events in History, so must data scientists and data explorers. After all, past data trends can help shape the future through predictive analysis.
Science – Just like in an experiment whereby you have a hypothesis, methodology and then actively test for results, Science can help to identify what is fact vs. what is fiction. It is important to not just base data analysis on predictions. Understanding the difference between what the data says and what it actually means is a bit like a science experiment in itself!
Business studies – Interpreting the data is vital for any data scientist. But what happens to the data once it has been analysed? Through exploring the data and understanding the different areas of the business, data explorers should be able to identify how data analytics can impact every part of the business.
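The link between school maths and the predictive analysis mentioned under History can be made concrete with a small sketch. The quarterly sales figures here are entirely hypothetical – the point is only that fitting a trend line to past data, a technique that builds directly on school-level maths, is one of the basic tools of a data explorer:

```python
# Minimal sketch: fit a least-squares straight line to past observations
# and project the trend one step ahead.
def linear_trend(values):
    """Return (slope, intercept) of the best-fit line for evenly spaced data."""
    n = len(values)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(values) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, values))
             / sum((x - x_mean) ** 2 for x in xs))
    return slope, y_mean - slope * x_mean

quarterly_sales = [100, 110, 121, 128, 142, 149]  # hypothetical figures
slope, intercept = linear_trend(quarterly_sales)
next_quarter = slope * len(quarterly_sales) + intercept
print(round(next_quarter, 1))  # → 159.8
```

Real predictive analytics goes far beyond a straight line, of course, but the habit of asking “what does the past data suggest about the future?” starts here.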
We are all now an integral part of a new ICT ecosystem, one built on big data, apps and industry innovation. This is something that the graduates of today know better than anybody else. Making sure young people have the right skills may well be only half the battle, but there is a greater opportunity than ever before to build upon student interest, encourage the training of a wider skillset and help to find the innovators of the future who will play a valuable role in bridging the data scientist skills gap.
This is a guest blog by Dr Arosha K. Bandara, senior lecturer in Computing at The Open University
A criticism often levelled at IT education is that by the time you come to apply the skills, they might be out of date. Why learn technology skills when that technology might not be in use in a couple of years?
IT does change fast, but the fundamentals of how we design and build systems change at a slower pace. As long as we learn about today’s technology in the context of how it relates to the business world and how it is likely to evolve, then we will be in a much better position to respond intelligently to the changing world.
But this is often overlooked by both formal and in-house training programmes, which have favoured skills which address very specific challenges. In order to be adequately prepared to tackle tomorrow’s technology challenges, we need to move from a mindset of knowing how to apply technology to well understood situations, to one of being able to think critically about problems, and identify solutions to unknown as well as familiar technology issues.
To prepare IT professionals for the rapidly changing world of technology, we need to instil an approach based on critical thinking. I’ll look at how we might do this, before putting this approach in context.
The organisation you work in is complex. It is shaped by the nature of individual thinking processes as well as existing technology and business pressures. Any changes will have causes and consequences that may have a much wider impact. Solving a problem will change things, which could lead to other problems.
Different people see different priorities. There is sometimes no obvious answer, or many different reasonable answers. But there are also wrong answers, which can be pursued, sometimes at great cost. These often result from a very narrow focus on the problem out of context.
Interconnections are too often ignored, a single cause presumed, or an individual quickly blamed. This is not exclusive to IT; we see it in wider society all the time – it’s easier to blame crime on individual criminals than to deal with the many complex societal factors that lead some to criminality. The other mistake is a narrow focus on outcomes – i.e. how many criminals we can arrest rather than how many crimes we can prevent.
To avoid these mistakes, problems should be approached by thinking about the systems that affect the challenge or opportunity. This is more difficult than isolating and addressing a problem, but ultimately more likely to produce a better solution.
Thinking about systems
As well as looking at how technology works, it is necessary to think about how people will react to it. Is a great new technology too hard to learn? Will tough new security procedures incentivise people to circumvent them? We need to understand the systems in which new technology operates.
Cognitive mapping is a technique for understanding and shaping the mental models your stakeholders use to perceive, contextualise, simplify, and make sense of otherwise complex problems. Thinking through these will help ensure new technologies and programmes have the results they are supposed to.
However good your plan is, you won’t foresee everything, so it is also critical to continuously test and review, and feed that learning into your ever evolving plans. Throughout the life cycle of any project, topics such as stakeholders, finance, risk, people, project administration and quality must be constantly reviewed in the context of the project.
The world of the future will require more understanding of flexible management. We will have to place more emphasis on learning as we go and making sure that learning changes our practice and organisations. We need to get used to this.
Critical thinking in context
Two core skills of any modern IT professional are cyber security and software engineering. Both relate to complex real-world challenges that can only be dealt with effectively by thinking critically.
Firstly, cyber security. Any IT professional needs to fully explore the available security technologies and stay up to date with them. But they also need to think through the risks that may arise in all relevant aspects of an organisation’s operations which may impact security, including human factors, web services and system upgrades.
You also need to be able to plan for when things do go wrong. Again, this needs an understanding of attackers’ motivations and employee weaknesses, as well as of the technologies available to circumvent your defences, and a sense of how these could evolve. It also requires an understanding of the legal frameworks and technologies relevant to digital forensics, which are essential when responding to cyber security incidents. Only then can effective plans be made.
Teaching all this must be put in a real world context. In our own post-graduate courses, most students learn these techniques by crafting a fit-for-purpose Information Security Management System for the organisation where they work.
Secondly, software engineering. Contact between the business and the external world is often mediated by software, and the business has a responsibility to its wider community that may be served, or jeopardised, by this software.
Skilled software engineers can add a lot of value by creating or adapting software, from managing projects and sales to analysing performance and customer data and automating tasks. All of this happens in a complex real world, where humans react to change in different ways. Anyone designing a new system must understand how users or customers will respond to it.
The skill is not one of knowing how to do this, it is one of knowing how to model the relationships between the software, the organisation it serves, and its wider environment. This approach must be used in development, roll out, updates and maintenance – it is an evolving process.
Critical thinking doesn’t mean ignoring technology, of course. The process can be taken further with an understanding of the different software engineering tools that can help engineers simulate, manage and monitor. Using these effectively is part of the skill of good IT planning.
A critical approach allows you to plan effectively
IT is critical to business and will become ever more so. It exists in an increasingly networked and interconnected world, where groups, teams, organisations and even nations will have to be smarter in their ways of working together.
IT professionals therefore need to be able to think in ways that reflect these challenges. IT education at all levels must teach how to take a critical approach which relates technical competencies to complex technological, human and business issues.
Dr Bandara teaches Postgraduate Computing courses at The Open University aimed at helping IT professionals advance by using technology strategically to drive the business forward.
Software consultancy Scott Logic has published a series of ‘A Day in the Life’ employee stories
Those interested in working at Scott Logic, or just interested in hearing about different IT roles, can learn about jobs such as graduate developer, developer, senior developer, lead developer, technical architect, head of development, test engineer and lead test engineer.
John Wright, recruitment manager at Scott Logic, said: “We’ve grown steadily over the past 10 years and as most companies often experience, the real people and their stories can be lost behind traditional corporate content used on company websites.
“The response we received from our staff was amazing and their stories clearly communicate what it’s really like to work at Scott Logic.”
Scott Logic has offices in Newcastle, Edinburgh, Bristol and London and plans to grow its workforce by 50% in 2015.
This is a guest blog from Martin Baker and Mike Glanville, former chief officers of Dorset Police and now directors of One Team Logic, providers of MyConcern safeguarding software for schools.
Question: Which Google search returns 366 million hits? Answer: ‘Lessons Will Be Learned’. OK, not all of these hits refer to safeguarding but you get the point; this hackneyed phrase has become the media statement of choice following every instance of incompetence, negligence, malfeasance or tragedy. But when the media interest wanes are those lessons really learned? And what does that mean in the context of safeguarding in schools?
Firstly, some background. The 1973 public inquiry into the death of seven-year-old Maria Colwell laid the foundations for the UK’s contemporary child protection procedures. Since that time a litany of tragic incidents has resulted in further fundamental changes in legislation. These developments have been accompanied by a plethora of Government guidance and recommendations from innumerable Serious Case Reviews (SCRs) following the death of or serious injury to a child. So, there is no shortage of ‘lessons to be learned’ in relation to safeguarding, not least in education.
Yet the processes underpinning one of a school’s most fundamental duties – to safeguard its pupils – continue to operate like a 1950s bureaucracy. In 2015 our ‘digital natives’ are being safeguarded by a regime steeped in paper, brown manila folders, four-ring binders and filing cabinets. In UK schools today you will find a huge range of information and technology to support almost every aspect of education – but not safeguarding. And this at a time when safeguarding has never been more complex, nor the legal duties on schools more stringent. The ever-present risk of abuse, neglect, the contemporary challenges of child mental health, e-safety, child sexual exploitation, female genital mutilation, extremism and radicalisation, the multiple issues from home and community that can affect child development and wellbeing – a whole world of risk passing through our school gates on a daily basis. And all predominantly managed on paper and email. Add to this the pressure on schools to pursue targets, the limited time available for training and the austerity-driven reductions in local authority (LA) support and it becomes clear that ‘learning lessons’ isn’t straightforward.
So who is accountable? Ultimately, it is the responsibility of headteachers, governing bodies and academy sponsors to ensure that safeguarding practices in schools are effective. But how do they know? In the age of ‘big data’ it is startling that, because of the paper-driven nature of the safeguarding systems in schools, there is little-to-no data of any practical use to assist schools, their LAs or their Local Safeguarding Children’s Boards to track threats and predict trends in order to protect children. (By law, schools must provide an annual safeguarding report to their governing bodies; this is often very short, containing only a handful of manually compiled statistics). ‘After the fact’, Ofsted inspects school safeguarding arrangements and allocates a grade – if your safeguarding is ‘Inadequate’ so is your school. But by then the damage could have been done.
During our policing careers we saw the tragic consequences for victims and families when safeguarding failed, and as school governors we have observed the endless paper trail that accompanies safeguarding in education. We have examined in detail every piece of legislation, policy and guidance and every relevant Case Review. This resulted in us designing an integrated approach to safeguarding that seeks to incorporate all of the ‘lessons learned’ in relation to: governance, leadership and management; preventing harm; recording concerns; case management; information sharing within schools and with other agencies; recruitment, vetting and training; allegations of abuse against staff; data protection and subject access; information security; records transfer between schools; the retention of records (25 years in respect of child protection – good luck with the paper!) and the vitally important issue of learning from the data.
Our schools are full of committed members of staff who succeed in safeguarding through their own skills and determination, despite the very poor systems and tools at their disposal. We are now able to provide not only the information but also the technology that they need and deserve in order to deliver on their safeguarding responsibilities. We’ve recently joined the E2BN ThinkIT framework, designed to make IT simple for schools, because we believe that schools should be able to access good support from trusted organisations, and that lessons should be learned!
England, Wales, Scotland and Northern Ireland all have their own child protection legislation, albeit with many similarities; this article focuses on the current arrangements in England.
SCRs are held following the death of or serious injury to a child where abuse or neglect is thought to be involved; in 2014 in England alone there were 58 SCRs, the majority of which related to child deaths.
This is a guest blog from Stone Group and Advanced Security Consulting.
Stone Group’s Simon Harbridge and security consultant Jay Abbott of Advanced Security Consulting recently got together to discuss the worrying issue of security in schools and how technology helps and hinders its progress. What they talked about may surprise you.
Simon Harbridge: Jay, you and I recently attended the same debate on digital safeguarding, and we found a lot of common ground. I was quite surprised that the conversation seems to be still revolving around simply educating kids about the dangers of online conversations. Were you?
Jay Abbott: I wasn’t taken aback, but I was a little concerned by it, as were you I think! We were talking with highly influential people, from NASUWT, Childnet, ParentZone, as well as heads of school. I felt that there was a fixation on the damage that being online can cause, and the knock-on effects on teachers and pupils, rather than a need to solve the root issues.
Simon Harbridge: Agreed. There were several moments of clarity, one being a comment that kids don’t respect or use the term ‘e-safety’, so we shouldn’t either, and another being that kids don’t distinguish between on and offline conversations or relationships – they are all part of their social mix. I can relate to that, because we’re spending a lot of time with schools who want to foster an environment of location independent learning – bringing education to life with lessons outside the classroom that use elements such as Augmented Reality to bring things online into the offline world. BYOD and one to one device schemes are driven by this change. It’s kind of exciting, seeing technology be such an integral part of day to day life in schools, especially as it’s matching children’s expectations about how life ‘should’ be.
Jay Abbott: Precisely, but from my experience, the focus needs to also be on the ‘back office’ parts of a school’s technology, for the roots of digital safeguarding strategy to really take hold. No one to one device scheme, or digital policy is going to weather the demands on it, or the attacks on its security, without particular attention to the technology, and the people managing the devices.
Simon Harbridge: Of course. We’re working with a lot of schools at the moment to replace their obsolete Windows Server 2003 technology. Much of that is driven by the security threats that education faces by continuing to use it beyond the end-of-life date Microsoft has set. We think about one in five schools will be left vulnerable. What kind of problems do you think sticking with obsolete technology like Windows Server 2003 can lead to?
Jay Abbott: Well, in the context of a school, where an “us vs them” culture exists between the general user base and the supporting infrastructure, maintaining strong internal defences is essential. The ability to attack and exploit known vulnerabilities has literally become child’s play and can even be executed from mobile phones and tablets. Thanks to a combination of free access to the required tools, simple user interfaces, readily available information and video tutorials on how to use the tools, and a general teenage desire to “mess around”, any unpatched, out-of-date system accessible from a network that students are attached to is a recipe for disaster.
Simon Harbridge: Yes. I wonder if enough schools consider that these sorts of attacks can come from within? There’s a lot of focus still on the safeguarding issues that sites such as ratemyteacher bring into play, but more needs to be understood about the basics – such as the fact that without support for obsolete products, you are also without security. The bottom line is that everybody in the school, and that school’s data, is vulnerable, regardless of the policies, internet management software or pupil education schemes you have in place.
Jay Abbott: Ofsted focuses on digital safeguarding and the penalties for failing to meet the standards are heavy, and lots of schools understand that. But more needs to be done to promote understanding that technology’s role in your Ofsted rating doesn’t begin and end with the device in the child’s hand. I would urge Ofsted itself to speak more about this and offer clearer guidance.
Simon Harbridge: We met with David Brown, the ICT lead at Ofsted, and had a very interesting conversation about data protection and the lack of awareness in schools of its importance. The Information Commissioner’s Office (ICO) can, and will, turn its attention to education soon – the NHS has recently been audited and the public sector must be held accountable for the information it safeguards. Schools should be thinking about the safety and security of their pupil and teacher data as a matter of course, before any increased scrutiny begins.
Jay Abbott: Yes, and again, data compliance and security is a ‘back office’ issue. Education really needs to continue to get its entire house in order, not just the front line of technology.
This is a guest blog by professor Steve Ross-Talbot, senior director and venture leader, Cognizant
If you set a group of year nine pupils a challenge, it is striking how creative they can be. In a brainstorm, it always impresses me how they are able to immediately think laterally and intuitively, pulling in reference points from their friends, family, their environment and their use of modern technologies like social media. Having run similar sessions with groups of experienced executives, it is fascinating to see that these students most accurately reflect the role of the CEO, who has to approach all new ideas with an open mind.
This was one of the discussion points at the latest ‘Insight Day’ held at the Cognizant London offices earlier this month, in conjunction with Teach First and attended by students from five schools in the Hounslow area. As part of Cognizant’s STEM/STEAM (Science, Technology, Engineering, Arts and Maths) initiative designed to get children enthused about science and technology subjects, the pupils brainstormed wearable technology ideas, coming up with genuinely perceptive innovations. These included a football shin-pad with a chip to highlight diving or injuries, and a hat for explorers with GPS and a panic button synced with rescue operations. As part of the panel at the event, I was astonished at the creativity exhibited by the pupils and the energy and enthusiasm they showed in brainstorming and presenting business ideas.
This type of initiative is hugely important if we are to actively tackle the STEM/STEAM skills shortage in the UK. Figures from the 2013 Digital Skills for Tomorrow report revealed that around 745,000 additional workers with digital skills will be needed to meet rising demand in the UK between 2013 and 2017, with 900,000 vacancies across Europe by 2020 according to European Commission figures.
In order to meet this shortfall, it is essential that businesses and schools work together to engage students at an early age by demonstrating the variety and range of career options available to them. Brainstorming STEM/STEAM ideas with relevance to the students’ everyday lives certainly helps, but in fact just travelling to Canary Wharf and getting a taste of office life – even if just for one day – can be a real eye opener. Running the session outside of the school environment also helps it to be memorable. Since the Insight Day, the teachers involved have told us that the students remained engaged with the topic and even wanted to continue researching and brainstorming more ideas.
Something else that was particularly encouraging was the role the girls played in their teams. With daily reports that the STEM/STEAM skills shortfall is particularly acute among women who, despite making up 46% of the UK’s workforce, fill only 16% of IT and telecoms professional occupations, it was fantastic to see the girls at the event actively engaging with the discussions and often leading the presentations. With news also showing that fewer than 5% of girls in OECD countries contemplate pursuing a STEM/STEAM career, events which engage girls at school age are crucial if we are to achieve greater balance in the future.
Crucially, with over 75% of new jobs in the next five years requiring STEM/STEAM expertise, the industry must recognise the need to increase the number and diversity of students in these subjects, working alongside the education system to encourage them to enter this field.
Days like these, where we come together to give pupils an opportunity to air their ideas in a setting like Cognizant, show them how their imagination can lead to significant results; as an outcome of this event, it is highly likely we will have seen at least one future entrepreneurial millionaire. Sparking new ideas is hugely rewarding and a proven way to encourage school children: one that should also inspire experienced business people to think a bit more like their inner 14-year-old.
This is a guest blog by Steve Gandy, CEO at MeetingZone.
No one should be in any doubt that technology can enable people to work together any time, any place and on any device. But although the UK’s flexible working policy has been in effect since 2014, it’s clear from recent research by Censuswide and Unify that organisations are still failing to enable employees to benefit from working remotely.
With today’s employees being able to email, access networks and manage tasks on the move – why does everyone need to be in the office all the time, at the same time? And what’s really blocking the adoption of flexible working practices?
Enabling cultural change
One of the key reasons is poor leadership from many of the people who head up UK businesses. Many company leaders continue to fight against new ways of working, happily burying their heads in the sand and taking the “if I can’t see ’em working, they can’t be working” approach. But take this decision at your peril – increasingly, your competitors aren’t taking the same laid-back approach and you may find that you get left behind!
The reality is cultural change has to happen within an organisation to make flexibility work, and that change needs to kick in from the top. When leaders buy into the concept of flexible working, it doesn’t take long for the domino effect to cascade swiftly throughout the rest of the organisation. To engender this change, senior management need to show a real desire to use technology, train employees on it and then trust them to get on with the job. Unless they do, all the investment in technology in the world won’t have any impact.
Tech at your fingertips
Of course, we all know you can’t always replace traditional ‘face-to-face’ interaction. And I’m not suggesting that remote working is the only way – it won’t work for all businesses – but there are a lot of times when it’s just not necessary to have everyone in one place.
It’s about utilising the array of unified communications solutions that are already available. Collaboration technologies such as WebEx or Microsoft Lync offer a variety of options for organisations, including desktop telephony, videoconferencing, ‘presence’ (the ability to see when someone is online and available), instant messaging, screen-sharing and interactive whiteboarding. Most of these have been around for some years and can easily replicate face-to-face communications.
If embraced properly, technology can be an enabler of business change and has the potential to deliver a substantial boost to UK plc. We’ve seen companies in sectors as diverse as legal, construction and not-for-profit see immediate benefits, with decisions being made on the fly and meetings taking place in real time and on the go. Many employers see a real boost in business agility, not to mention a significant reduction in the cost of travel and subsistence.
And it’s all easily achievable, and it will make companies more effective and efficient in the long term. So come on, UK leaders – it’s time to embrace change in the way we do business, now and in the future.
This is a guest blog from Phil Tee, chairman and CEO of Moogsoft. This is the first instalment of a three-part series on navigating the challenges tech startups face in working with universities to accelerate innovation and business goals.
1. How do tech startups consider partnership with universities and when is it valuable?
There has been a long-standing tradition of partnering between established companies and academic institutions to drive innovation. Today, this type of collaboration is booming in the US and the UK, due to policies being created in both regions by higher education authorities to foster these relationships.
However, when it comes to tech startups, the collaboration could go deeper and be more meaningful. One obvious reason for the cultural disconnect is that startups (venture-backed, Silicon Valley entities especially) have commercial motives for pursuing academic research. Universities, conversely, are bastions of domain-specific knowledge, looking to answer big questions and make great leaps in their respective fields. The major differentiator here is time: in the case of startups, VCs are funding 18 months of operational costs, but for researchers, that timeline barely covers the pursuit of an advanced degree.
So, once this differentiation is accepted, how can these two spheres be aligned enough to make the end result resonate?
Clearly, corporate and academic partnerships can be a win-win proposition for both sectors. For tech startups, these collaborations can be a strategic addition of resources into R&D projects that they may not have bandwidth or talent internally to complete. In addition, they can add a robust recruiting pool for future staffing needs. For universities, these partnerships create an ongoing opportunity for testing the rigor of theories, or leveraging existing patents towards real-world applications. The resulting synergy is that of an engineering/design model tailored to the dynamics of a specific business use case.
But how do tech startups identify the most opportune times to partner with universities? Based on my experience gained over the past 20 years and across three startups, these are the key areas for consideration:
- Exploratory Phase: As a startup evaluates a market opportunity in a new technical or business area, it may seek additional input and a venue for testing key concepts. University researchers can often help to support these needs and validate differentiators for the initial idea or technology.
- Data Analysis: Especially in the case of a startup working with large quantities of data, university researchers can support computational analysis. Since graduate labs are often working with data sets in novel ways, academic researchers can act as a fresh set of eyes for a business problem that has a data component. Such was the case here at Moogsoft, as we wanted to apply a data-driven approach to solving important IT operational management problems.
- Technical Challenges: If a company encounters a technical logjam, university partners can add value by taking on a piece of that challenge and addressing it part by part in collaboration with the startup’s R&D team. Left lingering too long, these challenges can monopolise the startup team’s time and impede technical and business progress.
At Moogsoft, we’ve built an ongoing relationship with those pursuing an advanced level of data science at a leading UK university. This collaboration has helped develop our machine learning technology, creating a significant competitive advantage. For a tech startup like ours, this type of partnership has produced a real impact on accelerating our differentiation and has allowed us to accomplish more as a startup.
Phil Tee is Chairman and CEO of Moogsoft. The second instalment of this series will focus on managing intellectual property rights to move product innovations from lab to market.
This is a guest blog from Geoff Smith, Managing Director of Experis, an IT recruitment specialist
Despite the capital’s prominence as the home of Silicon Roundabout start-ups, Shoreditch design houses, and global banks and insurance companies, the UK’s boom in IT job creation is more than just a London success story. The government’s recent Tech Nation report and our own Tech Cities Job Watch (TCJW) both reveal a fierce hunt for top IT talent across all corners of Britain. The reality is that job seekers are firmly back in the driving seat, and employers need to make themselves aware of market trends to secure an important competitive edge in the battle for talent.
In the midst of a protracted skills shortage, there’s little room for employers to be complacent when it comes to finding people with the right experience to fulfil their IT and digital needs. The Tech Nation report revealed that 74% of digital companies in the UK now operate outside of London. The knock-on effect is that clusters of experienced IT professionals with sought-after skills are moving outside the obvious hiring grounds of London, leaving our capital’s businesses with no choice but to extend their recruitment search across the UK for the skills they require.
Our first TCJW report showed that ‘Tech Cities’ such as Manchester, Birmingham, Bristol, Cambridge, Newcastle and Edinburgh have emerged as areas where multiple companies are looking to find talented IT professionals with in-demand skills. Whilst many of the emerging cities have great transport links, London-based companies that wish to access talent based in these areas should consider reviewing their flexible or remote working policies to attract some of the talent back.
Another factor employers need to consider when looking for talent is that soft skills can be just as vital as technical skills. We regularly speak to companies that have employed people solely for their tech prowess, only to realise later that they also need people who can share their knowledge with others and communicate the value of new developments, such as emerging programming languages, to management. They need to be able to interrogate data and present it to varying audiences in plain English.
The expansion of opportunities outside the capital has big implications for candidates too, putting them in the driving seat with greater control over their careers and multiple options for location, salary and work life balance. Skilled IT candidates who have developed their experience and skillset over several years are in a strong position to find excellent job opportunities. Those who are willing to travel for work can explore lucrative contractor roles, commanding day rates averaging £521 for Cloud positions in Bristol and £600 for Big Data positions in Cambridge.
The TCJW report showed that Mobile Development skills were in massive demand across the country. This area has grown rapidly in recent years, with particular demand for Android, iOS and Ruby on Rails development skills. Web developers are also in high demand, particularly those who can work confidently with back-end .Net, Java and SQL skills and front-end UI, UX and visual design skills.
There is a growing need for IT Security skills as employers look to improve their defence against the increasing number of cyber threats. They will be looking very favourably on candidates with CLAS, Cyber Security and Information Security skills.
These highly sought-after skills tend to gather where there are opportunities. For example, Cambridge is leading the cities outside London for permanent Mobile Development and Web Development pay, with average salaries reaching £41,032 and £39,750 respectively. Meanwhile, Glasgow is leading on Big Data (£45,300) and IT Security (£50,804), and Birmingham on Cloud salaries (£52,684).
We intend to release our TCJW report on a quarterly basis to keep businesses updated on the trends shaping IT recruitment and enable candidates to make informed decisions based on the evolving opportunities across the country.
As demand for IT talent continues to outstrip supply, it will be increasingly important for employers to broaden their talent search in order to meet their current and future skills needs. Similarly, candidates will make themselves more attractive to employers if they ensure they are equipped with the latest technical and soft skills, and those who are flexible enough to work remotely or travel cross-country will be well positioned to secure the best jobs. As the battle of the Tech Cities rages on, a buoyant market for new jobs and opportunities will continue. IT pros and employers who embrace it will be the ones to watch.