At IDC’s annual FutureScapes event this week, Sandra Ng, the technology analyst firm’s group vice-president for Asia-Pacific, took a shot at predicting the technology trends that will shape 2019.
Her predictions included growing data management and monetisation capabilities, efforts to harness APIs to build developer and partner ecosystems, and the use of key performance indicators to measure the success of digital transformation efforts.
Each of these trends underscores what many of us already know is the crux of digital transformation – to drive change across an organisation’s people and processes, undergirded by technologies such as cloud and mobile computing, artificial intelligence (AI), robotics and data analytics.
While these trends should not surprise anyone in the technology industry, what made Sandra’s presentation different this year was the ‘digitally determined’ APAC companies that she had highlighted.
These companies include Taiwanese bank O-Bank, which has 20% lower customer acquisition costs than its peers; a Hong Kong nightspot operator that uses facial recognition to gather insights about patrons; and Indonesian conglomerate Lippo, which formed a separate digital unit that created the OVO mobile payment app.
But what captivated the audience most was the slate of Chinese companies highlighted during the presentation. From the WeChat super app that even street beggars use to solicit digital donations to insurance giant Ping An that is morphing into a tech supplier, China’s companies are clearly at the bleeding edge of innovation.
Led by China, which is now on more-or-less equal footing with the US in developing frontier technologies like AI, Asian companies are no longer playing second fiddle to their Western counterparts.
In fact, the depth and diversity of talent and ideas have drawn more venture capitalists to Asia, which, according to KPMG, accounted for the majority of global venture capital investments in the third quarter of this year.
The Asian tech century is now upon us, and things will get even more interesting over the next decade.
Earlier this week, all members of Computer Weekly’s APAC CIO advisory panel gathered for the first time to share their thoughts on digital transformation and what the overused term means to them.
Kicking off the lively discussion, which was held at SAP’s office in Singapore, was MyRepublic CIO Eugene Yeo who remarked that digital transformation isn’t just about adopting new technology.
Just as important is the need for employees to embrace a mindset of change, and this is already being demonstrated in how MyRepublic develops new applications with a DevOps mentality, where changes are expected rather than frowned upon.
Manik Narayan Saha, the CIO of SAP Asia-Pacific, and Kwong Yuk Wah, CIO of NTUC, were of the same view that digital transformation isn’t a new undertaking. In fact, digital transformation started at the dawn of computerisation in the 1960s when enterprises started using computers to run some parts of their operations.
The CIOs then went on to share more about how they managed an inter-generational workforce amid their digital transformation efforts, how they have been measuring the success of digital initiatives, and perhaps more importantly, their change management strategy.
In particular, Nigel Lim, a Singapore-based senior IT manager at a Japanese company, highlighted the need for corporate functions such as HR and finance to be better aligned with digital transformation efforts, which isn’t always the case.
Amid the rapid pace of change, these functions would need to relook the way they assess the financial returns of digital projects, fund new digital initiatives and hire talent who are increasingly drawn to more attractive job prospects in high-growth, emerging markets such as China.
We will be filing some stories on some of these discussions – including what the CIOs thought about bi-modal IT – but one thing is clear in the meantime: digital transformation, which may seem like a buzzword at times to some of us, is real and will only accelerate in the years ahead.
This is a guest post by Bhupendra Warathe, chief information officer for corporate and institutional banking, information technology and operations at Standard Chartered Bank
As the world adopts real-time payments, creating massive volumes of instantaneous transfers in seconds, the challenge for banks has evolved from managing liquidity to managing velocity.
Digitisation is driving the growth and future of real-time payments. In Singapore, funds transfers between two local accounts can be done almost instantly. Hong Kong, which launched its near-instant payment scheme this month, may see bank-to-bank transfers completed just as quickly.
Such payments have created the need for funds flows that are not only 24×7 but also higher in frequency. As a result, payments and treasury departments can no longer adhere to batch and daily processes, and the need to move to real-time systems is urgent.
While most of the development in fast payments has focused on domestic transfers between individuals with a capped sum, participants in some jurisdictions have included non-bank businesses such as remittance providers and e-commerce players. At the current pace of implementation, it is only a matter of time before cross-border instant payments become a reality.
Just earlier this year, Swift held exploratory talks with banks from the Asia-Pacific region about the development of a regional cross-border real-time payments system based on the Swift global payment system.
How do banks respond to the challenge?
The demand for instant liquidity, dynamic FX exposure management and the ability to process real-time cash flow and transaction data means that banks have begun to deploy the combined strengths of distributed ledgers, artificial intelligence (AI) and application programming interfaces (APIs) to deliver highly effective, high-performing and value-added banking services to clients.
The speed of real-time payments also makes it vital for banks to perform instant fraud and identity checks before the payment is sent. At Standard Chartered, these systems are supported by as many as 12,000 coders and technologists, and they now account for about 15% of the workforce. The numbers also underline the extent to which banking has become a digital business.
As we move forward, speed and agility are two critical factors driving success. In the past, software upgrades took place once every few months, but with the rapid changes in today’s environment, the development, upgrading and deployment of software need to happen at a much faster pace.
DevOps is one way to deploy software into the production environment more quickly. With this approach, testing and deployment processes are fully automated: new code is released into production while the system running the previous version continues to function, allowing end-users to keep using the service without interruption.
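The zero-downtime release pattern described above is often called a blue-green deployment. The following is a hypothetical, minimal sketch of the idea, not Standard Chartered’s actual pipeline; real systems would switch traffic at a load balancer or orchestrator rather than in-process state.

```python
# Minimal blue-green deployment sketch. Two identical environments exist;
# new code goes to the idle one and traffic is switched only after a
# successful health check, so the old version keeps serving until then.

class BlueGreenRouter:
    """Keeps two identical environments; only one serves live traffic."""

    def __init__(self, initial_version):
        self.environments = {"blue": initial_version, "green": None}
        self.live = "blue"  # environment currently serving users

    def idle(self):
        """The environment not currently serving traffic."""
        return "green" if self.live == "blue" else "blue"

    def deploy(self, version, health_check):
        """Install a new version on the idle environment, verify it,
        then cut traffic over."""
        target = self.idle()
        self.environments[target] = version
        if not health_check(version):
            return False      # failed check: traffic was never moved
        self.live = target    # instant cut-over; old version stays for rollback
        return True

router = BlueGreenRouter("v1.0")
ok = router.deploy("v2.0", health_check=lambda version: True)
```

Because the previous version remains installed on the now-idle environment, a faulty release can be rolled back by simply switching traffic again.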
A rapidly changing environment has also caused banks to turn to partnerships to help them adapt quickly. In recent years, the concept of open APIs has become increasingly prominent in our industry. In the next three to five years, we project a massive integration of service providers’ platforms, with banks leading the charge.
Open API-led transformations will enable banks to accelerate collaborations with outside organisations and third-party developers. Increased co-created systems will allow a bank to redraw the boundaries of the products and services it offers.
Banks need to change the way they operate
With ever-changing consumer needs, Agile ways of working can help banks embrace changing requirements. Agile software development, an approach based on iterative development that brings together small, cross-functional teams to develop solutions within weeks rather than months, allows a product to go live sooner. At the same time, a project that is not on track could “fail fast,” allowing the team to recalibrate and take a different course quickly.
The prevalence of technology in every aspect of our lives also means that IT cannot be a department that sits on its own in a corporation. As banking becomes a seamless digital process, IT professionals are now integrated with every banking department.
At Standard Chartered, besides having IT professionals across our 60 markets, four Centres of Excellence – two in India, one in Malaysia and one in China – support and provide expertise for our global operations. IT teams are now closely integrated with respective product/client solution teams for agile delivery.
Talent and resources are critical for any strategy. Besides having the best talent, there is also a need to be faster and more scalable. There is a progressive shift to cloud-based infrastructures which can connect with multiple platforms such as those of industry-specific clearing houses, e-commerce platforms, large commercial and government institutions.
Without a doubt, real-time payments are redefining the banking landscape. In the next few years, there will be a multi-fold increase in volumes, with clients expecting 24×7 availability and scalability to handle peaks and troughs.
We foresee intense competition for talent and resources, not just within the banking industry, but also with tech firms and telcos. In this survival of the fastest, the organisations that can react to change most quickly will be the true winners.
A Singapore startup that has developed a blockchain-based platform to connect farmers with global markets has turned to SAP’s S/4 Hana Public Cloud for its digital core.
Called CrowdFarmX, the startup will use smart contracts to connect farmers directly with wholesale distributors and retailers, helping them to gain a greater cut of the selling price.
Like many other food supply chain-related blockchain platforms, CrowdFarmX’s platform will provide visibility of the agricultural supply chain to ensure food safety.
The startup will also equip farmers with technology knowhow, including the use of data analytics and internet-of-things (IoT) irrigation monitoring systems, to deliver yields of higher quantity and quality – potentially increasing net yield by up to 10 times and contributing to global food sufficiency.
On the role of S/4 Hana Public Cloud, CrowdFarmX founder and CEO David Tan said the cloud-based suite will deliver “an intelligent and intuitive digital backbone that supports us to have more complete visibility of the supply chain and supports rapid scalability”.
Further, the startup will make use of the software’s enterprise resource planning (ERP) capabilities, from production planning and the management of sales and distribution to procurement and financial control.
The implementation is expected to go live in December 2018.
CrowdFarmX aims to attract 10 million Southeast Asian farmers to its platform in the next 22 years, starting with an initial goal of signing up 1,000 farmers in a pilot phase by 2020.
Its targets are bold, given that many supply chain-related blockchain applications are not yet proven in the market and that technology adoption rates in Southeast Asia’s agriculture industry remain low.
That said, the startup seems to have made some headway, having roped in partners to set up CrowdFarmX Food Cradle farming facilities in Indonesia, Thailand and Cambodia, with Myanmar and Vietnam in the pipeline.
Among the findings of the Committee of Inquiry (COI) that looked into the massive SingHealth data breach was the startling fact that a non-IT staff member was tasked with managing the server that the perpetrators exploited to steal the personal data of 1.5 million people.
Taking the witness stand yesterday was Tan Aik Chin, a senior manager responsible for the cancer service registry at the National Cancer Centre, who admitted that he had limited understanding of IT security and had inherited the server from someone else.
And because the server was not directly managed by SingHealth’s designated IT supplier, Integrated Health Information Systems (IHiS), there was no visibility into its security posture, and whether or not it was patched regularly in accordance with existing security policies.
The server had in fact remained unpatched for 14 months, exposing software vulnerabilities that the perpetrators exploited to install malware and facilitate their data exfiltration efforts.
This is an example of how shadow IT can pose a serious threat to IT security – not by way of employees using their own software and computers to perform their jobs, as the classic definition goes, but through a lack of visibility and control over IT assets operating in the shadows.
Perhaps it is time to rethink the current definition of shadow IT, which limits organisational thinking to unsanctioned systems and software used by employees for work. After all, the security risk posed by a corporate-owned system that operates in the shadows is just as high as that of personal devices.
Instead, the focus should be on improving the visibility over every system, device and application that touches a network, whether they are employee-owned or corporate-sanctioned ones.
Singapore’s National Environment Agency (NEA) is tapping technology from Verily, Alphabet’s life sciences research outfit, to separate male mosquitoes from female ones in an ongoing project to tackle dengue fever.
Dubbed Project Wolbachia, the multi-year project aims to curb the population of the dengue-carrying Aedes aegypti mosquito by releasing male mosquitoes injected with Wolbachia bacteria. The bacteria, transmitted to females during mating, prevent mosquito eggs from hatching.
To control the precise distribution of male mosquitoes in targeted areas, including the corridors of Singapore’s public housing blocks, the NEA will use Verily’s new automated release system, contained within a 1.3m x 1m cart that is light enough to be pushed by one person.
Verily’s release system should help to alleviate a challenge the NEA has faced in releasing male mosquitoes in high-rise, densely built urban environments: mosquitoes moved easily from surrounding areas into the release sites, reducing the suppression effect of Wolbachia there.
Additionally, Project Wolbachia will use Verily’s mosquito sex-sorting technology, which has been successfully used to separate male and female mosquitoes using a computer vision algorithm and artificial intelligence.
This will ensure that only Wolbachia-infected male mosquitoes are released into the wild, preventing the build-up of a female Wolbachia-Aedes population that would otherwise result from releasing fertile females over time.
Not doing so could eventually result in Wolbachia-Aedes taking over as the dominant mosquito strain, hampering the continued use of Wolbachia-Aedes mosquitoes to suppress the Aedes population in those areas, according to the NEA.
A study commissioned by Stripe has revealed that the Singapore economy has the potential to grow by S$1.6bn each year, if companies harness developer resources more effectively.
Conducted by Nielsen, it found that 90% of businesses in Singapore rely on software to launch products, demonstrating that most companies today are increasingly technology companies.
Presumably, this software refers to in-house applications built by internal development teams, given that over two-fifths of senior executives in Singapore said the lack of qualified software engineers and developers was one of their greatest challenges, more so than a lack of capital (34%).
At the same time, Stripe noted that businesses in Singapore are counting the cost of not using existing developer talent effectively enough, noting that “bad code” is costing local companies S$232m annually.
“Crucially, about a third of developers (32%) say that they’re spending at least half of their time reactively tackling ‘bad code’ rather than working on strategic issues,” it said.
Out of all the countries surveyed, Singapore also had the highest proportion of developers (70%) reporting that the amount of time they spend on bad code is excessive.
In a statement on the release of the study, Stripe advocated the use of APIs to automate peripheral business functions, speed up development time and improve developer productivity.
However, some companies like financial services firms may still need to fix the bad code on legacy systems, especially if these are systems of record, before they can implement an API strategy.
Unless a company is ready to rip everything apart and build new applications from scratch, time spent tackling “bad code” is not counterproductive – it is often a necessary first step.
From Huawei’s Kirin 970 system-on-chip (SoC) that packs in a neural processing unit to Google’s new Edge TPU that performs machine learning tasks on IoT devices, the use of dedicated chips to speed up artificial intelligence (AI) tasks has been in vogue in recent years.
But not all chips are built the same way. Google’s Edge TPU, for instance, is a purpose-built ASIC (application-specific integrated circuit) designed for AI inferencing, while GPUs (graphics processing units) – a type of ASIC – are better suited to training AI models, where massive parallel processing is used to run matrix multiplications.
Then there are also FPGAs (field programmable gate arrays), which can be programmed for different use cases but are typically less powerful than ASICs.
The choice of chips depends on the types of AI workloads. For image recognition and analysis, which typically involve high workloads with strict requirements on service quality, Alibaba claims that GPUs are unable to balance low latency and high performance requirements at the same time.
In response, the Chinese cloud supplier has developed an FPGA-based ultra-low latency, high performance deep learning processor (DLP).
Alibaba said its DLP can support sparse convolution and low precision data computing at the same time, while a customised ISA (instruction set architecture) was defined to meet the requirements for flexibility and user experience.
Here’s a look at the thinking behind Alibaba’s DLP design:
The DLP has four types of modules, classified based on their functions:
- Computing: Convolution, batch normalisation, activation and other calculations
- Data path: Data storage, movement and reshaping
- Parameter: Weight and other parameter storage, decoding
- Instruction: Instruction unit and global control
The processing engine (PE) in the DLP can support:
- Int4 data type input
- Int32 data type output
- Int16 quantisation
The PE also offers over 90% efficiency, while the DLP’s weight loading supports CSR decoding and data pre-fetching.
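To make the two ideas above concrete – quantising values to a small signed-integer range, and storing sparse weights in CSR (compressed sparse row) form for the weight loader to decode – here is a minimal Python sketch. The function names and the exact scheme are illustrative assumptions; Alibaba’s DLP implements these in hardware and its precise formats are not public.

```python
# Illustrative low-precision quantisation and CSR sparse-weight storage,
# generic versions of the techniques the DLP uses in hardware.

def quantise(values, bits):
    """Symmetric linear quantisation of floats to signed integers."""
    qmax = 2 ** (bits - 1) - 1                 # e.g. 7 for int4
    scale = max(abs(v) for v in values) / qmax or 1.0
    return [max(-qmax, min(qmax, round(v / scale))) for v in values], scale

def to_csr(rows):
    """Compress a sparse weight matrix into CSR form: non-zero values,
    their column indices, and row pointers into those arrays."""
    data, cols, ptr = [], [], [0]
    for row in rows:
        for j, w in enumerate(row):
            if w != 0:
                data.append(w)
                cols.append(j)
        ptr.append(len(data))                  # end of this row's entries
    return data, cols, ptr

acts, scale = quantise([0.9, -0.45, 0.0, 0.3], bits=4)   # int4 activations
data, cols, ptr = to_csr([[0, 3, 0], [1, 0, 0]])         # sparse weights
```

CSR storage means the hardware only fetches and multiplies non-zero weights, which is what makes sparse convolution pay off in bandwidth and compute.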
Re-training is needed to develop an accurate model. Four main steps are involved in achieving both sparse weights and a low-precision data feature map.
Alibaba used an effective method to train the ResNet-18 model to be both sparse and low-precision (arXiv:1707.09870). The key component of its method is discretisation: it focuses on compressing and accelerating deep models whose network weights are represented by a very small number of bits, referred to as extremely low-bit neural networks, and models the problem as a discretely constrained optimisation problem.
Borrowing the idea from Alternating Direction Method of Multipliers (ADMM), Alibaba decoupled the continuous parameters from the discrete constraints of the network, and cast the original hard problem into several sub-problems. Alibaba solved these sub-problems using extragradient and iterative quantisation algorithms, which led to considerably faster convergence compared to conventional optimisation methods.
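To illustrate the quantisation sub-problem in this decomposition, here is a toy sketch of the discrete projection step for ternary weights, in the spirit of the ADMM-based method above. It alternates between assigning each weight to a level in {-1, 0, +1} and re-fitting a least-squares scale; the full algorithm in the paper additionally maintains dual variables and uses an extragradient update for the continuous weights, which this sketch omits.

```python
# Toy projection of continuous weights onto ternary values a*q, q in
# {-1, 0, +1}, minimising the squared error sum((w - a*q)^2). This is the
# discrete sub-problem in the ADMM decomposition, solved by alternation.

def project_ternary(weights, iters=10):
    """Return scale a and levels q in {-1, 0, 1} approximating weights."""
    a = max(abs(w) for w in weights)   # initial scale guess
    q = [0] * len(weights)
    for _ in range(iters):
        # 1) with a fixed, the best level for each w is the nearest of {-a, 0, a}
        q = [min((-1, 0, 1), key=lambda s: (w - a * s) ** 2) for w in weights]
        # 2) with q fixed, the least-squares scale is <w, q> / <q, q>
        norm = sum(s * s for s in q)
        if norm == 0:
            break
        a = sum(w * s for w, s in zip(weights, q)) / norm
    return a, q

a, q = project_ternary([0.8, -0.9, 0.05, 0.7])
```

Each alternation can only reduce the squared error, so the loop converges quickly; in hardware, the resulting ternary weights need only sign flips and additions rather than full multiplications.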
Extensive experiments on image recognition and object detection showed that Alibaba’s algorithm is more effective than other approaches when working with extremely low-bit neural networks.
Having low latency is not enough for most online services and usage scenarios, since algorithm models change frequently. As FPGA development can take weeks or months, Alibaba designed an instruction set architecture (ISA) and compiler to reduce model upgrade time to just a few minutes.
The software-hardware co-development platform comprises the following items:
- Compiler: model graph analysis and instruction generation
- API/driver: CPU-FPGA DMA, picture reshaping, weight compression
- ISA controller: instruction decoding, task scheduling, multi-thread pipeline management
The DLP was implemented on an Alibaba-designed FPGA card, which has PCIe and DDR4 memory. The DLP, combined with this FPGA card, can benefit applications such as online image searches.
FPGA test results with ResNet-18 showed that Alibaba’s design achieved ultra-low latency while maintaining very high performance with less than 70W of chip power.
Australia’s eCorner has signed a deal with Macquarie Cloud Services to help it use public cloud services to fend off growing competition from larger US providers.
The Sydney-based e-commerce technology and hosting provider was founded in 2003 as the master distributor for ePages, a European e-commerce platform. Realising the e-commerce market in Australia was reasonably immature in those early days, eCorner evolved to become a one-stop-shop for local SMEs that needed guidance on selling online.
With its previous co-location datacentre, eCorner couldn’t scale its offering in the same way as its cloud-dependent competitors and meet the fluctuating demands of its customers. At any given time, data and resource usage could change dramatically, making it difficult to accurately predict what resources were needed to meet merchant and consumer demand.
eCorner made the decision to take a page out of its US rivals’ books and invest in public cloud services, making the company scalable and helping it to stay competitive at a time when Australians are spending more than A$21bn annually online.
The company has found success among Australia’s SME community, providing secure hosting services to fintech start-ups such as Lodex.co, Australia’s first auction-style loans and deposits marketplace, in addition to a range of tailored e-commerce offerings across a variety of industries.
The new cloud environment eCorner and Macquarie provide helps Lodex.co and others manage and simplify areas such as PCI DSS compliance to allow them to handle transactions and compete against big banks and other lenders.
“Large-scale e-commerce owes a lot of its success to the cloud,” said Jae Debrincat, general manager of eCorner. “We needed to adapt a similar model to be flexible and survive in an increasingly-competitive industry.”
“With Macquarie, we know where our technical support and data are, a guarantee larger cloud providers couldn’t give – that’s vital, particularly with GDPR and mandatory breach disclosure. Our customers’ primary data is in Sydney CBD, and it backs up to a datacentre up the road in Macquarie Park. Local hosting also makes our services faster because there’s no reliance on international pipes.”
Macquarie Telecom Group CEO, David Tudehope, sees this as an example of the often-elusive generational start-up ecosystem working together in Australia.
“If you look at Macquarie, eCorner and Lodex.co – it’s three generations of former and current start-ups in Australia working together and challenging large competitors,” said Tudehope.
“Smaller businesses often believe the cloud is made for large enterprises and not something they can use to their advantage. It’s quite the opposite; there might just be a different, local approach to cloud that matches their business goals. eCorner has found that mix and can use the cloud to help start-ups and SMEs tap into the fast-growing eCommerce market in Australia.”
On 20 July, the Singapore government revealed that the non-medical personal details of about 1.5 million patients who had visited SingHealth’s specialist outpatient clinics and polyclinics between 1 May 2015 and 4 July 2018 had been illegally accessed and copied in a deliberate, targeted and well-planned cyber attack.
Data taken included names, national identity card numbers, addresses and dates of birth. Information on the outpatient dispensed medicines of about 160,000 patients was also exfiltrated through an initial breach on a front-end workstation.
The unprecedented attack on Singapore’s public healthcare IT systems has since raised questions about who should bear responsibility in a medical cyber attack.
One might think that IT workers and their suppliers should be blamed, but doctors, at least in Singapore, are also “statutorily responsible for any system instituted within his practice for the management (storage, access and integrity) of medical data.”
This was pointed out by Quan Heng Lim, director of cyber operations at Horangi Cyber Security, who noted that with increasing digitisation of healthcare records and new regulations on contributions to Singapore’s National Electronic Health Record system, doctors should ideally possess baseline knowledge on information security, whether by training or otherwise.
Lim added that the legal entity of the affected clinics and hospitals – or SingHealth in this case – may be liable as well. Although SingHealth operates public healthcare services, it is in fact a corporate entity and hence is bound by the Personal Data Protection Act (PDPA).
Under Section 24 of the PDPA, organisations must protect personal data in their possession or control by making reasonable security arrangements to prevent unauthorised access, collection, use, disclosure, copying, modification, disposal or similar risks. These include technical measures such as network security, strength of access controls, and the regularity and extent of patching and vulnerability fixing.
Lim said more obligations could also apply, depending on the nature of the computer systems in question. Under Singapore’s newly-passed Cybersecurity Act, acute hospital care services and services relating to disease surveillance and response are considered “essential services”.
“If designated as critical information infrastructure by the Cyber Security Agency of Singapore (CSA), computer systems owners would be subject to various obligations, such as bi-annual audits, annual risk assessments, as well as compliance with specific codes and standards.
“It is possible for such owners to be IT vendors, corporate legal entities, or even doctors themselves,” he added.
Faced with liabilities on several fronts, what can these parties do? Even though there is no surefire way to stop all cyber attacks, IT suppliers and software developers, for one, should strive for zero defects in their systems – a discipline that has been lost amid the rush to put out new software releases quickly.
Besides implementing air gap measures, which seem to be the favoured solution of the Singapore government when confronted with threats targeted at critical systems, a holistic approach towards cyber security that involves people, processes and technology is necessary.
“In relation to people, doctors and IT vendors should be aware that they could be targets for phishing and impersonation. The risk of phishing is not unique to the healthcare industry; it affects other industries, such as financial services and legal services, which deal with sensitive data but may have no guidance on dealing with cyber security or may not understand the risks they face,” Lim said.
As for processes and technology, Lim said doctors and IT vendors should strive to build and maintain a secure technology environment. This could include vulnerability and threat assessments and setting up multi-factor authentication – just ask Google, which has not had any of its employees successfully phished on their work e-mail accounts since it mandated the use of Universal 2nd Factor (U2F) security keys in early 2017.