This is a guest blog by Mark Chambers, chief executive of Naace, who warns that coding has been given too much emphasis in the new curriculum.
There are currently many exciting projects around coding in schools, and it’s been great watching children getting involved with computing both within the classroom and in extra-curricular groups. It’s quite evident that a number of teachers and students in the classroom are motivated and enthused by the creation of gaming projects, application development and even the solving of coding conundrums for their own sake – just because we can.
However, the number of teachers and students engaged with computing is still small and significantly smaller than those who used to engage with digitally creative projects involving the broader ideas of ‘Information Technology’. It is my argument that we have “thrown the baby out with the bath water”.
The direction of thought that branded ICT as “bad” is well documented, with anecdotal evidence such as “my child was subject to endlessly tiresome Microsoft Office activities” dominating any discussion that politicians and industrialists had with the education community. The subsequent Computing Curriculum was specifically intended to redress this situation, in the strong belief that introducing more explicit Computer Science would put all to rights. This decision and its political endorsement were followed by an injection of millions of pounds of public money to upskill our teachers in Computer Science.
Yet the reality is that there are huge gaps in the provision of the high-quality professional development that would improve the experience of young students. Indeed, if Information Technology was taught badly (the original argument for changing the curriculum), then nothing has been done to alter this fact apart from “moving the goal posts”. There is no national commitment to improving the teaching of Information Technology or Digital Literacy, and the private-sector provision which used to exist is being destroyed in the face of publicly subsidised but narrowly focused Computer Science training.
Unsurprisingly, schools have picked up on the importance of Computer Science and are understandably leaping at the opportunity of having volunteers from the parental community or from industry delivering the subject. However, many are also demonstrating an attitude that suggests they feel this provision is enough and that consequently the timetabled Computing Curriculum can be made a lower priority. Two thirds of the Computing Curriculum is in danger of both poor teaching and perhaps even disappearing from our schools. This abdication from developing digital creativity is shocking and will significantly impact our competitiveness as a global economy, with UK plc rapidly falling behind in the creative industries.
My argument is not that Computer Science, and specifically the glamorous “coding”, is unimportant, but that there is far more that must be considered. There are a number of skills and competencies enshrined within the Computing Curriculum that can provide young people with a varied, imaginative and creative learning experience, equipping them for higher education and employment in a much more balanced way than simply being able to repeat code.
What unites all three strands of the Computing Curriculum is a focus on “Making”: solving problems to a purpose and for an audience. For a couple of years, Naace has been developing the Third Millennium Learning Award and some associated tools to support schools in developing their pedagogical approaches when making effective use of technology. This project has shown us that when students are offered the opportunity to experience problem-solving using digital tools in circumstances of increasing complexity, they can achieve amazing progress and outcomes.
There is a significant danger that too much of the touted practice in Computer Science is artificial, abstract and removed from the involvement of real problems and real people. Indeed, I recall with distinct clarity one strategic meeting on the development of the Computing Curriculum where the participants remembered with great fondness their days of copying code from books and magazines. For me this recollection was a personal nightmare; what I recalled was the opportunity to use building blocks, short cuts and libraries to design and make the solutions I was looking for. Why reinvent the wheel when what I need to know is how to select the right one and fit it correctly to the other components of the car to get it moving safely and effectively from A to B?
The problem with the previous approach to the Computing subject was as much the way it was taught as the skills pupils learned, which are now being neglected. These skills, such as e-safety, using search engines effectively and online communication, may lack coding’s glamour but are an essential part of learning to use technology. Coding is important, but not everything under the sun is coding. It takes the development of a variety of digital skills and knowledge if we want students to use technology, understand how it works, solve problems effectively and communicate those solutions to clients. Students who can do this will be in high demand in employment and will drive the future success of our economy.
This is a guest post by John Yurkschatt, director of the IT Services Practice Area for Direct Consulting Associates (DCA), an IT consulting and staffing firm.
It is often in conversations with leaders in the IT industry that I hear references to past mentors and how someone else played a significant role in their success. The more successful people I meet, the more I hear individuals speaking of their mentors or those they are mentoring.
I am a huge believer in a strong mentorship program. To this day, I make sure that I am continuously mentoring as well as being mentored. Although I have experienced success in my career, I am humble enough to realize that luck and good circumstances can often play a bit of a role. Now, if I can identify where I need to get better, improve, and develop additional tools, I can benefit from being on both sides of a mentoring program.
The opportunity to share success stories, horror stories, and times where we’ve hit the proverbial brick wall can benefit everyone, especially the mentee. Mentorship programs require humility, an eagerness to learn, and an eagerness to grow. A successful mentorship program requires both parties to share openly, freely, and put knowledge/insight into practice.
There are four types of IT mentorship programs:
• Career Mentoring: Assists mentees in working towards a predetermined career path. This type of mentoring is designed to allow individuals to move from point A to point B with the mentor assisting with that development.
• Networking Mentoring: Enables individuals to meet others in the marketplace to allow for the sharing of ideas and names. Often, this type of informal mentoring leads to insight and knowledge transfer.
• Orientating Mentorships: This type of mentoring usually begins within the first few weeks that a new employee is at an organisation. Orientating mentoring allows newer employees to become acclimatised to a new climate, culture and work environment much more quickly.
• Untapped Potential Mentorship: My favorite type, and often the most challenging. Untapped potential mentoring takes place when an average or underperforming employee has a fantastic skill set or real potential, but other aspects of their makeup prevent them from reaching it.
Without question, there are numerous types of mentorship programs, but nearly all try to accomplish the following:
– Foster personal and professional growth
– Create and/or develop a sense of career awareness
– Generate a thirst for knowledge
– Instill a desire to be great
In today’s IT marketplace, mentoring programs are often put on the backburner due to a shortage of time and availability. However, mentorship programs should not be taken lightly. In fact, a study by the Sun Microsystems University Mentoring Program followed the career progress of mentees over a five-year period and found that mentees were 20% more likely to get a raise sooner than other employees, and were promoted five times more often than those who did not have a mentor.
Here are 4 ways mentors help mentees get ahead faster:
1. Talent Development: Everyone wants top talent. A mentoring program can allow an organization to use its top performers to help others grow into top performers.
2. Knowledge and Contacts: Mentoring programs often lead to introductions, networking events, and knowledge transfer. As top performers progress in their careers, it is often more about who you know than what you know, to help get to the top.
3. Wisdom/Insight: One of the most important aspects of a mentoring program is preventing the mentee from making the same mistakes the mentor has made. If it took the mentor 10 mistakes to earn their promotion, sharing that insight could lead the mentee down a much quicker path to reaching new heights.
4. Improved Performance: A good mentor will provide you with valuable feedback or make suggestions that will enable you to improve your skills or to experience personal growth, ultimately leading to your improved performance.
As an organisation, if a mentoring program can help our new hires and our average, struggling, or even most successful employees continue to grow, it will, in turn, allow our business to grow and continue to push that ceiling even higher. About 70% of Fortune 500 companies have mentoring programs because the results give them a competitive advantage. If an organisation values not only the success of its business but also the relationships and long-term growth of its employees, it will likely better retain top employees.
To conclude, mentorships can be invaluable. The wisdom, insight, or development that employees gain today may lead to a career track that might never otherwise have been possible.
This is a guest post from David Wright, director of the UK Safer Internet Centre at SWGfL, who explores the dangers to children’s wellbeing posed by political extremists on social media and outlines the steps schools can take to protect students from indoctrination online.
According to estimates, approximately 700 UK citizens, including many teenagers, have travelled overseas to join conflicts. They include three teenage girls from Bethnal Green in East London who were thought to have travelled to Syria.
The BBC reported that the families of the three 15-year-olds have appealed for them to return and said that there were no signs they were planning to go to Syria.
Prevent, the government’s counter-terrorism strategy, was written in 2003 with prevention as a key priority. It focuses effort and resources on ‘priority areas’. In these areas, those working with children typically have a general understanding of the threats posed by radicalisation.
However, the increasing use of online tactics and social media by extremists has changed how these ideas spread. These technologies don’t recognise ‘priority areas’; they extend across the entire country.
The threats we are seeing take many forms. There are the high-profile incidents of young people travelling to countries such as Syria and Iraq to fight, but there are also less obvious and more wide-ranging risks. The Internet, and in particular social media, is being used as a channel not only to promote and engage but also, as suggested by Robert Hannigan (Director of GCHQ), as a command structure.
In response, towards the end of 2014, the UK Safer Internet Centre issued a briefing to all local safeguarding children boards to highlight this issue.
Clearly, everyone has a responsibility to report a concern about a child but the Counter-Terrorism and Security Act 2015 obliges schools and other authorities to prevent people from being drawn into terrorism. The guidance lists five key areas:
– Risk assessment: Schools must assess the risk of their children being drawn into terrorism, including support for the extremist ideas that form part of terrorist ideology. They should have robust safeguarding policies to identify children at risk, and appropriate intervention and referral options. The policy should also cover the suitability of visiting speakers.
– Working in partnership: Schools in England are required to ensure that their safeguarding arrangements take into account policies and procedures of their local safeguarding children board.
– Staff training: Schools should ensure that their staff are equipped to identify children at risk of being drawn into terrorism, as well as challenge extremist ideas. They should know how to refer children and young people for further help. The Home Office’s free training product about radicalisation awareness, Workshop to Raise Awareness of Prevent (WRAP), may be a suitable option.
– IT policies: Schools are expected to ensure that children are safe from online terrorist and extremist material in school, typically using filtering systems.
– Monitoring and enforcement: Ofsted inspectors will assess a school’s approach to keeping children safe from the dangers of radicalisation and extremism, and what is done when the school suspects that pupils are vulnerable to these threats. If a school is considered to be failing in this regard, or if the safety of its staff or children is threatened, maintained schools will be subject to intervention and academies or free schools may have their funding terminated. Independent schools in England or Wales must remedy any failing or be subject to regulatory action. Early years settings are also covered by this monitoring provision.
The power of the Internet and social media has highlighted a need for an open and ongoing dialogue in our communities – among children, young people, parents, carers, schools and wider – to ensure that young people have the skills to be critical thinkers online and are resilient to online extremism, whether from groups like Islamic State or from others.
To support schools with e-safety, South West Grid for Learning (SWGfL) has created 360 degree safe, a free online self-review tool, and Boost, a comprehensive online safety toolkit.
This is a guest post by Ben Dowd, business director at O2, who shares why connectivity should lie at the heart of every startup operation.
While it is true that access to broadband and transport links are essential in enabling the growth of digital start-ups, it is even more important that businesses of all sizes understand the true benefits that connectivity can bring.
It’s no secret that one of the major advantages UK small businesses have over larger enterprises is their agility and ability to respond quickly to an ever-changing landscape. The rate at which the UK creates and adopts new technologies is increasing all the time, and it is the businesses that are quickest to react that will see the biggest benefits. Small businesses simply avoid many of the challenges that come with scale, whether that’s coordinating the work of hundreds of employees across multiple locations or sharing knowledge across disparate functional groups. As a result of this greater flexibility, small businesses are naturally better able to respond to new challenges and opportunities – but businesses of all sizes can do this, and the secret to success lies in better connectivity.
Many larger businesses have already realised this and are embedding digital at the heart of their strategies. Research recently commissioned by O2 showed that digital interaction in businesses leads to increased job efficiency and business productivity.
In fact, more than a third (36%) of senior management and almost a quarter (24%) of employees in Britain’s largest organisations believe that using more business technology for customer and employee interaction will lead to greater business productivity, and more than 70% of senior managers think the use of technology for customer interaction has already had a positive effect on their organisation.
I would urge start-ups to take heed of these statistics and ensure connectivity lies at the heart of their operation from the outset. I also have three key pieces of advice for start-ups wanting to make the most out of technology for their business.
Firstly, employees must be equipped to work anywhere. Research that we conducted with CEBR last year found that 80% of businesses report that staff do not have full access to the key business systems that would make their working lives easier and more efficient, including the technology to work away from the office. We can easily overcome this problem by something as simple as ensuring that every employee has a company laptop and mobile.
Secondly, business owners need to consider the benefits to their customer service of accessing the fast and reliable connectivity of 4G, which allows them to respond to orders and enquiries quickly and efficiently, from any location with any mobile device. The cost of connectivity equipment isn’t prohibitive. Laptops, smartphones and 4G dongles are all easily available to businesses of any size.
Finally, businesses of all sizes spend too much time and money on travel that isn’t always necessary. Whether it’s between home and the office or between multiple locations around the UK, employers should make journeys more efficient. Our research shows that measures like investing in remote working technology, from webmail to 4G connectivity for laptops, could save employees 127 hours per year.
Every business must understand its own connectivity deficit and take measures to close it. Even small improvements, such as using smartphones, mobile apps, and cloud computing services, will help drive business productivity and restore the UK’s competitiveness within Europe and beyond.
Codio has released an Ask the Experts video which investigates the best ways of teaching computer science through coding in schools.
The video gives advice to teachers who are considering taking the computer science accreditation.
Throughout, Sue Sentance, National Academic Coordinator at Computing at School (CAS) and Senior Lecturer in Computer Science Education at King’s College London, speaks to Codio about the structure of the certificate and the benefits for teachers.
BCS e-assessors Ghita Kouadri Mostefaoui and Fintan Culwin also provide their insight.
Phillip Snalune, co-founder of Codio, said, “Our collaboration with BCS underlines Codio’s commitment to helping teachers reach the level of proficiency required to deliver excellence in the teaching of Computer Science.
“We hope that this video helps teachers understand what the certificate entails and how it could help them develop both their confidence and technical skills.”
You can watch the video here: https://www.youtube.com/watch?v=a7-DaG19yC0
Simon Harrison, CIO, Kingston University
As a leading UK university, we want to make sure that our students get as much as possible from their time with us, whether in their recreational time or during their studies. We know that technology can give us the upper hand here, because students have come to expect the same level of technology in their studies as they enjoy in their personal lives. They have been using laptops, smartphones and tablets on a daily basis and have constant access to hundreds of applications, from social media and email to online banking and retail. Why should they not have the same supporting their education?
Matching millennial expectations
Month on month, we at Kingston are seeing more and more connected devices being used across the university, including a Bring Your Own Device (BYOD) community of 20,000 students. Not only do they expect their tech to work when they come to university; they expect to use their own devices to do their academic work.
Research by one of our key partners and suppliers – cloud computing and virtualisation company VMware – showed just how high technology sits in a student’s hierarchy of needs: it’s second only to food and shelter. Almost half (46%) of the 1,000 students questioned said that they considered the level of IT on offer when choosing their university. Yet more than a quarter (26%) said they didn’t feel the technology provision at their university was consistent with the tuition fees they are paying. And of course, with those fees higher than ever, so are expectations around technology. So, what can IT departments do to make sure they are meeting the needs of their students?
Using technology to enhance learning
From our experience, technology is absolutely crucial to enhancing the learning experience, from infrastructure and networking to the applications made available and how they are accessed. It’s also hugely important to be progressive and forward-looking, given that most of our revenues come from government funding or students. For us, that meant investing from the ground up: providing high-speed web and data access across the entire campus, and replacing our servers, storage and back-up hardware. Expanding on the existing infrastructure to create a ‘university without walls’, we then installed a virtual desktop solution with VMware. This means all students can work from anywhere on any device, with access to the applications they need to study in a secure and efficiently managed environment. Given that we have around 600 teaching applications, ranging from design and geoengineering through to statistical packages, access to these from any location is absolutely critical to support academic learning for our students.
Ultimately, universities need to realise how important technology is to students; 92% claim that having IT facilities can help them work in a more flexible way and enhance their studies. IT departments need to ensure students have a positive experience and that they can easily and reliably access the university systems and information to support their learning from any location and device – whether that is on campus or at home, in the UK or abroad. Supporting a generation of people who have grown up with the most advanced technologies at their fingertips is something we should all aspire to do.
This is a guest post from Ivan Horrocks, senior lecturer in technology management at The Open University
It’s a fairly common situation in IT: You’ve spent the last ten years becoming an expert in your field, impressing everyone along the way, and now your boss tells you they want you to go down the management route.
To some this is welcome news. They’ve done their time doing the hands-on work, and their reward is a move towards senior management. But to others, nothing is more depressing than abandoning working with the technology they know and love to sit in board meetings and negotiate deals.
Most people want to feel a sense of progression in their careers, but this doesn’t have to mean changing tack into business management to reach a senior level. Although relatively unrecognised, there is an important difference between mainstream business management and technology management. For those wanting to progress their careers while remaining in touch with the world of technology, the latter may well be the preferable route.
Ultimately, technology management focuses on the relationship between the management of technology and innovation, and how these relate to other areas of management, including operations, finance, supply chain and logistics, and strategy.
This means looking at how existing technologies can work together to enhance an organisation’s processes and products or services. It requires a broad perspective on what’s available and an understanding of what works best, when, where and why. Technology managers appreciate how technology can be integrated; how the skills they and their team have can be employed to improve business operation and deliver value, both internally and externally, and where and when they need to bring new skills in to do so.
Technology managers are also technology strategists, always looking for ways to foster technological innovation. But implementation and application are also key concerns: giving the teams you work with the time and space to experiment with different technologies, explore how they might work best, and develop or customise them for the benefit of your organisation and its customers.
As a strategist you will also need to guide this experimentation, create room for it to fail, and find ways to spot and develop what works. And you will need to model the consequences of integrating technology – understanding the impact it will have on staff and customers and ensuring there is appropriate training and support to make the new ideas a success.
What does success look like?
It is now widely accepted that IT no longer just provides the infrastructure for people to do business. We hear ever more about how strategic use of IT has transformed organisations and peoples’ lives in exciting ways.
Increased connectivity has created new delivery models – from on demand video to legal and financial services delivered online – disrupting long established industries.
Sensors in wearables are transforming industries like healthcare, where clinicians can monitor patients for warning signs without having to keep them in hospital beds, saving money and freeing up time for other patients.
Harnessing data is allowing better targeting and streamlining. Transport operators – running vehicles, fleets, trains and planes – collect huge amounts of data from sensors, which they are now using to model fuel use, reduce delays and predict problems.
These are all examples of the transformational uses of technology, particularly in business. They would all have been developed with the support of senior management and driven by the organisation’s strategy, but requiring a detailed knowledge of technology development, selection, customisation and integration.
Becoming a technology manager
Technology has the capability to make a significant contribution to organisational performance, economic growth and social well-being. As a result, there is increasing demand across the public and private sectors for people with both the operational and strategic capability to plan, develop and manage technology and technological innovation.
To pursue this route, you will need to build on your technical expertise by developing the knowledge and skills to make the right decisions about technology acquisition, exploitation, implementation and innovation. These are rarely taught in undergraduate or vocational IT and computing courses, but they build naturally on the education and experience of someone whose career has primarily focused on developing and delivering successful IT projects.
None of this means you shouldn’t go the pure management route into senior management, if that’s your preference. Technology specialists develop many core business skills and can make a huge contribution to a more general management role. It’s just that they don’t have to – now more than ever there is a demand for managers who have a deep knowledge and experience of technology. General management principles are necessary to progress, of course, but this can mean building on a specialism, not abandoning it.
Dr Ivan Horrocks is qualification director for technology management (TM) at The Open University. The TM programme is aimed at helping technology professionals and their organisations advance by using technology strategically to deliver innovation and drive the business forward.
This is a guest post from Evanna Kearins, director of analytics, TIBCO Analytics
Harvard Business Review called data science the “sexiest” job of the 21st century, yet a survey conducted by the CBI warned that 39 percent of companies cannot find staff with the required skills and knowledge in science, technology, engineering and maths.
With the number of GCSE students studying IT-related subjects increasing, it is surprising that student numbers for A-levels in computing and ICT have decreased. ICT is a subject that helps develop the skills data explorers need, but other core subjects such as Maths, English, History, Science and Business Studies are also helping to develop the next wave of data scientists and data explorers. Each of the following core subjects is indirectly teaching students the skills they need to help fill the current skills gap:
ICT – The next generation is growing increasingly comfortable using an array of devices and platforms. With the Internet of Things connecting devices, the volumes of data available are growing exponentially. In teaching children how to use these platforms and understand the nuances of each, schools are also teaching fundamental skills needed by any data explorer.
Maths – Another subject that can help students begin their career as a data scientist or explorer. Pupils are taught how to read charts and to compare different patterns. These skills are needed not only in the classroom but also by data explorers making valuable business decisions as they seek to understand the data flowing through the organisation.
English – Being able to crunch numbers is important for a data explorer, but so is the ability to communicate the true value of those numbers to board members. Communication is key in all walks of life, so being taught English helps students translate numbers and data into meaningful insights that can transform a business. This is also crucial to bridging the gap between the business and IT functions.
History – Having an understanding of the past and how it will impact the future is also important. Just as students must analyse events in History, so must data scientists and data explorers analyse past events. After all, past data trends can help shape the future through predictive analysis.
Science – Just like an experiment, where you have a hypothesis and a methodology and then actively test for results, Science can help to separate fact from fiction. It is important not to base data analysis on predictions alone. Understanding the difference between what the data says and what it actually means is a bit like a science experiment in itself!
Business studies – Interpreting the data is vital for any data scientist. But what happens to the data once it has been analysed? Through exploring the data and understanding the different areas of the business, data explorers should be able to identify how data analytics can impact every part of the business.
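The History and Science points above can be made concrete. The following is a minimal sketch in Python (the years and sales figures are invented purely for illustration): the simplest form of predictive analysis fits a least-squares line to past values and extrapolates one period ahead.

```python
# Fit a straight line to historical data (least squares) and
# extrapolate one step ahead. All figures here are invented examples.

def fit_line(xs, ys):
    """Return slope and intercept of the least-squares line through (xs, ys)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
            sum((x - mean_x) ** 2 for x in xs)
    intercept = mean_y - slope * mean_x
    return slope, intercept

years = [2011, 2012, 2013, 2014]       # historical periods
sales = [100.0, 110.0, 120.0, 130.0]   # a perfectly linear trend, for clarity

slope, intercept = fit_line(years, sales)
forecast_2015 = slope * 2015 + intercept
print(forecast_2015)  # 140.0 for this exactly linear series
```

Real business data is far noisier than this, of course, which is exactly where the Science point comes in: testing whether an apparent trend is fact or fiction before acting on it.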
We are all now an integral part of a new ICT ecosystem, one built on big data, apps and industry innovation. This is something that the graduates of today know better than anybody else. Making sure young people have the right skills may well be only half the battle, but there is a greater opportunity than ever before to build upon student interest, encourage the training of a wider skillset and help to find the innovators of the future who will play a valuable role in bridging the data scientist skills gap.
This is a guest blog by Dr Arosha K. Bandara, senior lecturer in Computing at The Open University
A criticism often levelled at IT education is that by the time you come to apply the skills, they might be out of date. Why learn technology skills when that technology might not be in use in a couple of years?
IT does change fast, but the fundamentals of how we design and build systems change at a slower pace. As long as we learn about today’s technology in the context of how it relates to the business world and how it is likely to evolve, then we will be in a much better position to respond intelligently to the changing world.
But this is often overlooked by both formal and in-house training programmes, which have favoured skills that address very specific challenges. To be adequately prepared for tomorrow’s technology challenges, we need to move from a mindset of knowing how to apply technology to well-understood situations to one of being able to think critically about problems, and to identify solutions to unknown as well as familiar technology issues.
To prepare IT professionals for the rapidly changing world of technology, we need to instil an approach based on critical thinking. I’ll look at how we might do this, before putting this approach in context.
The organisation you work in is complex. It is shaped by the nature of individual thinking processes as well as existing technology and business pressures. Any changes will have causes and consequences that may have a much wider impact. Solving a problem will change things, which could lead to other problems.
Different people see different priorities. There is sometimes no obvious answer, or many different reasonable answers. But there are also wrong answers, which can be pursued, sometimes at great cost. These often result from a very narrow focus on the problem out of context.
Interconnections are too often ignored, a single cause presumed, or an individual quickly blamed. This is not exclusive to IT; we see it in wider society all the time – it’s easier to blame crime on individual criminals than to deal with the many complex societal factors that lead some to criminality. The other mistake is a focus on the wrong measures – i.e. how many criminals we can arrest rather than how many crimes we can prevent.
To avoid these mistakes, problems should be approached by thinking about the systems that affect the challenge or opportunity. This is more difficult than isolating and addressing a problem, but ultimately more likely to produce a better solution.
Thinking about systems
As well as looking at how technology works, it is necessary to think about how people will react to it. Is a great new technology too hard to learn? Will tough new security procedures incentivise people to circumvent them? We need to understand the systems in which new technology operates.
Cognitive mapping is a technique for understanding and shaping the mental models your stakeholders use to perceive, contextualise, simplify and make sense of otherwise complex problems. Thinking these models through will help ensure new technologies and programmes deliver the results they are supposed to.
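To make the idea concrete, a simple cognitive map can be sketched as a signed, weighted directed graph: concepts are nodes, and each edge records how strongly stakeholders believe one concept influences another. Propagating a change through the map gives a rough sense of knock-on effects. The concepts and weights below are invented for illustration, not drawn from any real project:

```python
# Minimal sketch of a cognitive map as a signed, weighted digraph.
# All concepts and influence weights here are illustrative assumptions.

influences = {
    # (cause, effect): perceived influence, from -1.0 to +1.0
    ("new_security_rules", "login_friction"): 0.8,
    ("login_friction", "workaround_use"): 0.7,
    ("workaround_use", "security"): -0.9,
    ("new_security_rules", "security"): 0.6,
}

def total_influence(cause, effect, seen=frozenset()):
    """Sum influence over all simple paths from cause to effect,
    multiplying the edge weights along each path."""
    total = influences.get((cause, effect), 0.0)
    for (a, b), w in influences.items():
        if a == cause and b != effect and b not in seen:
            total += w * total_influence(b, effect, seen | {a})
    return total

# The direct security gain from new rules is largely offset by the
# workarounds they provoke:
net = total_influence("new_security_rules", "security")
print(round(net, 2))  # → 0.1
```

Even a toy map like this makes the trade-off visible: the direct benefit (+0.6) is nearly cancelled by the indirect path through user workarounds (0.8 × 0.7 × −0.9 ≈ −0.5), which is exactly the kind of second-order effect a purely technical view misses.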
However good your plan is, you won’t foresee everything, so it is also critical to continuously test and review, and feed that learning into your ever evolving plans. Throughout the life cycle of any project, topics such as stakeholders, finance, risk, people, project administration and quality must be constantly reviewed in the context of the project.
The world of the future will require more understanding of flexible management. We will have to place more emphasis on learning as we go and making sure that learning changes our practice and organisations. We need to get used to this.
Critical thinking in context
Two core skills of any modern IT professional are cyber security and software engineering. Both involve complex real-world challenges that can only be dealt with effectively by professionals who think critically.
Firstly, cyber security. Any IT professional needs to explore the available security technologies fully and stay up to date with them. But they also need to think through the risks that may arise across all aspects of an organisation’s operations that could affect security, including human factors, web services and system upgrades.
You also need to be able to plan for when things do go wrong. Again, this needs an understanding of attackers’ motivations and employee weaknesses, as well as of the technologies available to circumvent your defences, and a sense of how these could evolve. It also requires an understanding of the legal frameworks and technologies relevant to digital forensics, which are essential when responding to cyber security incidents. Only then can effective plans be made.
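That kind of planning often starts with a simple risk register, ranking threats by likelihood and impact so mitigation effort goes where exposure is highest. The threats and scores below are invented purely for illustration:

```python
# Toy risk register: rank threats by likelihood x impact (1-5 scales).
# The entries and scores are illustrative assumptions, not a real assessment.

risks = [
    # (threat, likelihood, impact)
    ("phishing of staff credentials", 4, 4),
    ("unpatched web service exploited", 3, 5),
    ("lost unencrypted laptop", 2, 3),
]

def exposure(likelihood, impact):
    """Simple qualitative exposure score: likelihood times impact."""
    return likelihood * impact

# Highest exposure first, so mitigation planning starts at the top.
ranked = sorted(risks, key=lambda r: exposure(r[1], r[2]), reverse=True)
for name, likelihood, impact in ranked:
    print(f"{exposure(likelihood, impact):>2}  {name}")
```

A spreadsheet does the same job; the point is the critical-thinking step of scoring human, technical and process risks on a common scale before deciding where to spend defensive effort.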
Teaching all this must be put in a real world context. In our own post-graduate courses, most students learn these techniques by crafting a fit-for-purpose Information Security Management System for the organisation where they work.
Secondly, software engineering. Contact between the business and the external world is often mediated by software, and the business has a responsibility to its wider community that may be served, or jeopardised, by this software.
Skilled software engineers can add a lot of value by creating or adapting software, from managing projects and sales to analysing performance and customer data and automating tasks. All of this happens in a complex real world, where humans react to change in different ways. Any new system must account for how users or customers will respond to it.
The skill is not just knowing how to do this; it is knowing how to model the relationships between the software, the organisation it serves and its wider environment. This approach must be used throughout development, roll-out, updates and maintenance – it is an evolving process.
Critical thinking doesn’t mean ignoring technology, of course. The process can be refined further with an understanding of the different software engineering tools that help engineers simulate, manage and monitor systems. Using these effectively is part of the skill of good IT planning.
A critical approach allows you to plan effectively
IT is critical to business and will become ever more so. It exists in an increasingly networked and interconnected world, where groups, teams, organisations and even nations will have to be smarter in their ways of working together.
IT professionals therefore need to be able to think in ways that reflect these challenges. IT education at all levels must teach how to take a critical approach which relates technical competencies to complex technological, human and business issues.
Dr Bandara teaches Postgraduate Computing courses at The Open University aimed at helping IT professionals advance by using technology strategically to drive the business forward.