There’s code optimisation, which is, obviously, a process where software developers take an existing code base and work to improve it: reducing its total number of executable lines, its memory footprint and its demands on other system resources, such as the input/output (I/O) operations it needs to perform.
Then there’s also optimisation-as-code, perhaps not quite so obviously.
The latter of these two processes is delivered as an often comparatively small chunk of code designed to give cloud applications, inside any given ‘instance’, a degree of self-awareness of the cloud resources they need, based upon both business and application demands.
Additionally, optimisation-as-code tools allow the application itself to automatically self-optimise its core execution based upon the higher-level ‘health check’ it has received in terms of system resource needs.
Product news to follow on subject — this story will be expanded on 15 October 2018.
Why the clarification?
Because cloud optimisation company Densify has now come forward with its Cloe Aware [it’s deeply backend, so the brand marketeers used a human name] optimisation-as-code product.
The company claims that Cloe Aware allows cloud operations teams to automate application optimisation by simply adding one line of code into their infrastructure-as-code template.
As we know, application demands can fluctuate every day, hour and minute of the week.
Equally and in parallel, cloud offerings such as compute instances can be purchased and configured in millions of different ways, and new technologies are introduced on a monthly basis.
As such, it is reasonable to argue that it is near (if not completely) impossible to align each application’s needs with the right cloud technologies available while optimising each penny spent.
“Densify analyses cloud usage patterns and proactively makes applications self-aware of their resource needs – matching application needs to available cloud resources,” said Gerry Smith, CEO, Densify.
A single line of code entered into infrastructure-as-code templates, such as those used by Terraform, CloudForms and CloudFormation, dynamically calls on Cloe Aware, meaning that the cloud instances on which the application runs can now change automatically and dynamically in the cloud.
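Densify hasn’t published its actual template syntax in this announcement, so the snippet below is a purely hypothetical Python sketch of the general optimisation-as-code pattern being described: the instance type is resolved at deploy time by asking an optimisation service, rather than being hard-coded in the template. Every name here (the recommendations catalogue, `recommend_instance`, the rendered fragment) is invented for illustration and is not Densify’s real API.

```python
# Purely hypothetical sketch of the optimisation-as-code pattern described
# above: the instance type is resolved at deploy time by an optimisation
# service instead of being hard-coded. None of these names are Densify's
# real API.

# Stand-in for the recommendations an optimisation service might return,
# keyed by application workload profile.
RECOMMENDATIONS = {
    "batch-etl": {"instance_type": "m5.2xlarge", "reason": "CPU-bound, steady load"},
    "web-frontend": {"instance_type": "t3.large", "reason": "bursty, low memory"},
}


def recommend_instance(app_name: str, default: str = "m5.large") -> str:
    """Return the recommended instance type for an app, or a safe default."""
    rec = RECOMMENDATIONS.get(app_name)
    return rec["instance_type"] if rec else default


def render_template(app_name: str) -> str:
    """Render a minimal infrastructure-as-code fragment with the
    dynamically chosen instance type substituted in."""
    instance_type = recommend_instance(app_name)
    return (
        f'resource "aws_instance" "{app_name}" {{\n'
        f'  instance_type = "{instance_type}"  # chosen by the optimiser\n'
        f"}}\n"
    )


print(render_template("batch-etl"))
```

The point of the pattern is that the template author writes one call and never revisits the sizing decision; the optimiser re-evaluates it on every deployment.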
Also here, there is automatic generation of both machine-readable and human-readable output, enabling approval processes, flexible business intelligence reports and integration into GitHub or other repositories of choice.
Technology marketplace company Spiceworks has detailed how it thinks new Artificial Intelligence (AI) capabilities can be used to drive a ‘new personalisation factor’ at the connection point between technology buyers and sellers.
The firm estimates the IT industry to be worth as much as US$3 trillion (UK£2.3 trillion) today. With the scale of that huge market in mind, and the advances in AI decision-making that we have today, Spiceworks is suggesting that we shouldn’t rely on human choice alone when making purchasing and deployment decisions about the technology we use.
New AI capabilities enable Spiceworks to connect technology buyers with the people, [software and hardware] tools and information they need.
A moment of union
But it works both ways.
Simultaneously, this same AI and data-driven approach can enable technology brands to identify and engage the right buyers. The resultant beautiful union (if everything works out perfectly) is a point of increased trust between the two parties.
“We’re building the first community-powered marketplace for the IT industry, one that couples first-party data and pervasive intelligence to directly connect technology buyers and sellers with the resources they need in any given moment,” said Jay Hallberg, CEO and co-founder of Spiceworks.
“With the critical mass we’ve achieved [nearly 20 million people come to Spiceworks every quarter], this evolution marks an important milestone in our history – the development of AI technologies that can be leveraged across Spiceworks to reshape how technology buyers and sellers get their jobs done,” added Hallberg.
Of course, you’d imagine that technologists (developers/programmers, database pros and systems administrators) choose the hardware and software they need to run the applications and supporting systems they are tasked with keeping online, wouldn’t you?
In many cases they will, but in many large-scale enterprises there is a defined role in the shape of the technology buyer, or chief IT procurement officer (CITPO, perhaps) or some such job title. This is a tough job; technology changes every week, as we know. This means that being able to buy with confidence is difficult.
Tech buyers (who very often go by the title ‘system architect’) need to have confidence that any given piece of technology has the scope of functionality, ability to scale, capability to integrate and base compatibility with their firm’s existing IT stack before they press ‘buy’ – and that’s a big ask.
These guys (in the gender neutral sense) will need to know user requirements, data workflow levels and a feel for the guts of the network if they are going to make the right decision. Putting AI into that decision making process is what Spiceworks is now focused on doing to reduce the possibility of errors.
According to Spiceworks, “Technology buyers and the businesses they represent find themselves lost in a landscape that’s constantly shifting and expanding. The result: a sense of frustration and lack of confidence in their ability to find the insights, content, tools, and people they can trust to make informed decisions. Simultaneously, technology brands spend over $300 billion each year on marketing and sales, much of which is wasted or spent inefficiently. Technology buyers and sellers need a new approach and partner they can trust to help them make better decisions and drive their businesses forward.”
How to do how-tos
The Spiceworks platform is now employing AI to analyse billions of data signals, including interactions across Spiceworks such as community discussions, technology how-tos, learning modules, product reviews and a suite of IT management tools. The information from these streams is then used to connect technology buyers and sellers as they’re searching for the people or information they need.
Examples of personalised experiences the platform is enabling today include new Fast Answer pages that use AI to group the best answers for frequently-asked technology questions in one place. The company says that Fast Answers allows IT professionals to more quickly find a solution to their technology challenge in any given moment by scanning hundreds of thousands of topic pages to identify the most helpful information across Spiceworks, including IT best practices, the aforementioned how-tos and pages showing product comparisons.
The Spiceworks platform is based on ‘purchase intent’ across 14 technology categories including security, cloud services, backup and recovery, networking and ten others.
As additional AI capabilities are applied to its platform, the company promises it will enable more personalised experiences and human connections that help technology buyers get their jobs done more efficiently while allowing technology vendors to engage buyers in a more ‘meaningful’ (or at least product-relevant) way.
VP of business operations Nicole Tanzillo spoke at the SpiceWorld keynote to explain why she thinks the ‘old way’ we used to buy and sell technology was indirect, inefficient and impersonal.
As the firm now looks to more firmly cement its ‘IT marketplace’ message, Tanzillo says the focus is directed at becoming simple, smart and connected. She used the example of how we used to buy and sell houses in the pre-Zillow era, which involved so much legwork (driving around etc.) – a process that was equally painful for the estate agents (realtors), who had to put up with non-serious buyers who were just tyre kickers.
SVP of product & engineering Manish Dixit underlined Tanzillo’s points and explained how the product announcements tabled this week bring a new AI-driven approach to IT product procurement.
At the Spiceworks SpiceWorld conference itself, one thing (okay, several things, but one cool thing) stands out.
Spiceworks invites ‘bloggers’ to its technology event media programmes alongside the press. Now that’s not unusual in and of itself… but it’s who these bloggers are that matters.
Stop and ask these bloggers what their ‘actual’ job title is and most turn out to be systems architects, IT product managers and system administrators, i.e. real-world practitioners who actually touch enterprise software tools and applications every day. Surely there’s a lesson there for other tech vendors.
Application Programming Interface (API) development company Postman has conducted a state of APIs survey.
Well, an API dev tools company would, wouldn’t it?
Questioning its 5 million community members — but presumably not getting responses from ALL of them — Postman says it wanted to understand the API developers’ workflow, pain points and perspective for where the API space is headed.
Most notably, while microservices remain the most ‘exciting’ technology for API developers, containers and serverless architectures have emerged as the new favourites for this group.
Postman’s co-founder and CEO is bold enough to suggest that the data his team have collated might be able to help technical leads identify current norms within their technologies and where development teams need to focus their time and energy.
The survey results suggest that nearly two in three API developers spend a quarter of their week working with APIs.
Also here, the majority of API knowledge is gained on the job or from online resources, while published API documentation and online communities also contribute heavily.
Additionally, the Postman community was asked how it pronounces API, with an overwhelming 82% saying they use “Aye-Pee-Eye” – while another 12% said “Appie” (as in, rhymes with ‘happy’).
Finally, developers shared a clear preference in the Star Wars vs. Star Trek debate. A whopping 76% of API developers identified as “All Star Wars” or “More Star Wars than Star Trek”.
How many times have you heard people tell you that ‘the deadline for submissions is midnight on Tuesday’ (or insert weekday of your choice)?
Sysadmin guru and technology commentator extraordinaire Bob Plankers thinks it’s confusing, if not downright dumb.
Plankers is the author of The Lone Sysadmin blog, in addition to his role as virtualization architect at the University of Wisconsin-Madison, USA.
Suggesting that midnight is essentially a poor choice of timeslot for scheduling anything, Plankers says that midnight belongs to tomorrow – and that’s not how we humans think, if we’re focused on the task in hand, that is.
His argument centres on the fact that, in the world of the sysadmin, the DBA and many other operations professionals, midnight is a popular time to schedule automated processes.
“I get it, it’s easy. If you run something at midnight you don’t have to do much processing to separate yesterday from today. The problem is that there’s a ton of stuff already running on the hour, and you’re just piling on. Most people try to avoid shopping when it’s crazy busy, why would you want to run your jobs that way? If you ran your job a bit earlier or later chances are it’ll run faster because you’re not competing with everyone else,” wrote Plankers.
If not that, then what?
Plankers makes a great point: in the contemporary world of always-on software application development we work in a cycle of Continuous Integration (CI) and Continuous Delivery (CD)… so slotting in nightly builds at ‘traditional midnight’ probably isn’t possible anyway, and if we do push for that slot it could well cause a logjam.
As we now look to hit equally specific but more widely dispersed deadlines, Plankers asks us to be strict about how we write our deadline times and to use the ISO 8601 date and time format (YYYY-MM-DDThh:mmTZD) to help avoid global formatting issues.
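Plankers’ ‘run it a bit earlier or later’ advice can be made systematic with a deterministic per-host offset, so every machine picks its own stable slot instead of piling onto 00:00. The following is a generic sketch of that idea (not code from Plankers’ post); the two-hour window and the hash choice are arbitrary assumptions.

```python
import hashlib


def nightly_run_time(hostname: str, window_minutes: int = 120) -> str:
    """Derive a stable, per-host start time in a window beginning at 23:00,
    so nightly jobs spread out instead of all firing at midnight.

    The offset is a hash of the hostname, so it is deterministic: the
    same host always gets the same slot."""
    digest = hashlib.sha256(hostname.encode()).digest()
    offset = int.from_bytes(digest[:4], "big") % window_minutes
    hour = (23 + offset // 60) % 24
    minute = offset % 60
    return f"{hour:02d}:{minute:02d}"


# Each host lands on its own minute somewhere in the 23:00-01:00 window:
for host in ("db01", "web02", "batch03"):
    print(host, nightly_run_time(host))
```

The same trick works directly in a crontab generator: derive the minute and hour fields from the hostname once, and the fleet stays evenly spread without any central coordination.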
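For illustration, an unambiguous ISO 8601 deadline with an explicit UTC offset can be produced with Python’s standard library alone; the date and the -05:00 offset below are arbitrary examples, not anything from Plankers’ post.

```python
from datetime import datetime, timezone, timedelta

# A deadline written with an explicit date and UTC offset is unambiguous
# world-wide, unlike "midnight on Tuesday". The offset here (-05:00) is
# just an example.
tz = timezone(timedelta(hours=-5))
deadline = datetime(2018, 10, 23, 21, 30, tzinfo=tz)

print(deadline.isoformat())  # 2018-10-23T21:30:00-05:00

# The string round-trips cleanly, so it can be parsed back anywhere:
parsed = datetime.fromisoformat(deadline.isoformat())
```

Note that 21:30 rather than 00:00 sidesteps the ‘midnight belongs to tomorrow’ ambiguity entirely.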
Thanks Bob, we’ll aim for diversity in every sense of the word.
Low-code application development company Appian has conducted its Future of Work survey through IDG.
Before looking at the ‘findings’, let’s remember the three cynical forewarnings that should (arguably) precede any technology survey:
- Companies often conduct surveys when they have no solid product news.
- Companies often conduct surveys with a contrived set of questions designed to skew results in their favour.
- Companies often conduct surveys with ‘independent’ technology analyst houses that they have content/analysis contracts with.
So then, sceptical naysaying aside, what has Appian uncovered?
The firm says that an average EMEA enterprise generates [approximately] 230 internal requests for new business software applications and major feature enhancements every year.
In Germany that figure is as high as 306, in France it’s 263, and in the U.K. it’s 137 — the European average is well above the U.S. figure of 153 requests.
Despite this disparity, Appian says that global success rates are all ‘equally dismal’.
“Across all surveyed geographies, 50% of all new app development requests end in failure. They are either not delivered at all, or are delivered without meeting the business need,” notes an Appian press statement.
Here comes the money…
Could this research be leading us towards a suggestion that low-code platforms with their componentised efficiencies and automated pre-aligned controls are a better way of building enterprise apps?
Yes indeed, Appian has said that enterprise low-code development platforms accelerate application creation with ‘robust but easy-to-use’ drag-and-drop visual design tools.
In fairness (sarcasm aside just for a moment) Appian’s tooling does offer a route to application builds that are connected and governed, so there is a certain amount of built-in future-proofing there.
“Software is defining the future of how work gets done and all companies must accelerate IT delivery of new applications without sacrificing quality or accumulating more technical debt,” said Appian CEO Matt Calkins.
The Future of Work survey, conducted by IDG, gathered responses between August and September 2018. Respondents comprised 500 IT leaders (50% C-level, all director level or above) at companies with over 1,000 employees.
Yes, we have a lot of software failures. Yes, low-code can help. Yes, automation and visual design play a key role in the future of software.
No, low-code is not a cure-all. No, low-code is not no-code; it’s still quite complex in terms of engineering and application functionality flow. No, technology surveys are not getting that much better.
Yes, we’ve been cynically, sarcastically, acerbically cruel – sorry.
Technology ‘marketplace’ and collaboration platform company Spiceworks held its annual developer, customer, partner and related tech-practitioner conference this month in the Texan capital city of Austin.
Whichever way you cut it, Spiceworks pretty much had to call its convention SpiceWorld.
Given that the 1990s ‘girl power’ pop group whose album shared the name are now ancient history – and given that Spiceworks was founded in 2006 (and that the Spice Girls were probably never that big in a hardcore rhythm & blues town like Spiceworks’ home town of Austin) – Spiceworks SpiceWorld came to be.
Spiceworks itself exists to connect IT industry technology buyers and sellers, hence the firm’s use of the term ‘marketplace’ in its lead descriptor.
The wider market (and technology) proposition that Spiceworks puts forward is not just the sell-to-buyer marketplace element, but also a means of being able to collaborate and adopt, integrate, deploy and manage the software being dealt with.
Key news announcements this year saw Spiceworks announce a new cloud-based Spiceworks Inventory application that integrates with the cloud editions of Spiceworks Help Desk and Spiceworks Remote Support.
The concept here is: software designed to manage technology assets and support end users from a single place.
In the same way that you can now get AI-powered clothes deliveries these days, Spiceworks is using intelligence to provide users with what it calls ‘personalised insights and recommendations’ directly within its cloud-based applications.
“Our integrated cloud-based applications are part of our larger strategy to leverage AI technologies to directly connect IT professionals with the most helpful content, tools and experts they need to drive their businesses forward,” said Manish Dixit, senior vice president of products and engineering at Spiceworks.
Dixit insists that the new AI capabilities applied to Spiceworks allow the company to help users predict the technology challenges the businesses they work in are facing. This, he says, will help them become more strategic about addressing the obstacles to success.
It’s tough to understand what a marketplace platform really is… so Spiceworks breaks it down and says that the total proposition it offers includes:
- Product reviews
- Learning modules
- Technology discussions & collaboration tools
Adding AI to this set of functions will (in theory at least) give users more automated information, more quickly, based on their previous use of applications and data.
If Spiceworks then cross-references its knowledge of one customer with anonymised and obfuscated information relating to other users in specific defined industries and use cases, then we all get our software delivered the way we want it, more efficiently – well, in theory at least.
Technology vendors like to call this a data-driven or data-powered approach to work… but you could just call it AI if you wish.
The company says that looking forward, users can expect Spiceworks applications, including Spiceworks Inventory, to become smarter and more responsive on an individualised basis.
For example, an application could proactively notify an IT professional when a laptop (or laptops, plural) may need replacing and, unprompted, provide a list of the top 10 laptop purchases by technology buyers. Spiceworks could also flag when a business is receiving an abnormal amount of help desk tickets about its anti-virus software and provide a list of alternative solutions that are better suited for their environment.
This product will discover and scan all IP-enabled devices on a network, from laptops and servers to smartphones and IoT devices. It can provide device names and detect all open ports.
“Together, Spiceworks Inventory, Help Desk and Remote Support close the knowledge gap between the hardware and software on corporate networks and the technology challenges businesses are encountering. In the coming months, Spiceworks Connectivity Dashboard will also integrate with the cloud-based applications to help ensure end users stay connected to mission-critical applications,” said the company, in a press statement detailing its latest product news.
Key features and use cases for the cloud-based version of Spiceworks Inventory include the ability to document hardware details for workstations and servers, such as the CPU, memory, disk, network and serial number.
The key functions of Spiceworks Inventory also allow users to document all installed software (including operating systems) inside a firm’s central (and extended) IT stack.
Smarter, more automated, individualised, custom-aligned AI-intelligent services, tools and functions are being brought to bear across every vertical and horizontal aspect of the total IT stack. To think that inventory and asset management (for hardware, software and systems) wouldn’t be a part of that new automation efficiency would be somewhat silly, to say the least – Spiceworks appears to have (very arguably) seen a defined call to action here.
Technology events generally host ‘inspirational’ speakers at some stage, usually on the last day.
After the product keynotes, the technical sessions, the platform updates, the CTO breakdowns and the guest customer speakers, it’s time for the inspirational light relief.
These speakers generally fall into two categories: life heroes, or industry heroes.
Life heroes and heroines include people like Michael J. Fox (who is always a joy to listen to), William Shatner (who is also fabulous), Diana Nyad – the arguably somewhat turgidly eccentric lady who swam from Cuba to Key West – and possibly even speakers from universities and areas of academic research.
Industry (or business) heroes include people like politicians (Obama, Rice and Powell are all doing the rounds), businesspeople (Richard Branson speaks a lot, in case you hadn’t noticed) and technical gurus such as Apple co-founder Steve Wozniak.
Double up, triple up
Now here’s the issue. Many of these speakers really ‘do the circuit’, so it’s not hard to end up seeing them speak more than once in any given twelve-month period.
Michael J Fox is easy to listen to more than once; and, in fact, Richard Branson is always quite inspirational.
So it is that we come to Steve ‘the Woz’ Wozniak, who spoke at the Splunk .conf18 user conference this October in Orlando, Florida.
Wozniak doesn’t tend to present as such; he prefers the so-called ‘fireside chat’ format – an industry term for informal couch-based interviews where the spokesperson is quizzed by an interviewer (sometimes a journalist, sometimes the CEO when there is a real ‘fan factor’ going on). Get used to the term; it even has its own 🔥 fire emoji.
Having seen Wozniak speak twice before in the last two years, how does Wozniak x3 feel?
Wozniak x3
Inevitably, perhaps, Wozniak referred to his time founding Apple. The whole “okay, so one day I said something to Steve Jobs” lines aren’t going to stop featuring in his discussions any time soon.
Wozniak spoke at length about how much fun he had had with differential calculus and jumped back and forth into his feelings about being inspired to create great things ‘even if they weren’t going to end up being part of a company or a commercial success’.
“In the inventor’s hall of fame, a lot of people have come up with great ideas through serendipity … kind of by accident. Sometimes it’s about following through your own ideas and I was always that kind of person. You go to school to learn methods to design things, you don’t go to learn how to copy other people. That’s how I have always worked, but I’ve always also looked at how I could reduce the steps that people would normally take to create something,” said Wozniak.
He suggested that when one person does a job instead of 10 people, it’s easier to shift things around… so Wozniak urged the audience to take control of their own destiny.
“The Apple II computer was a great computer and it brought forward arcade games for the first time in colour and presented in the form of software. A nine-year old kid could use this machine to write a game in one day as opposed to the huge amount of time it used to take software engineers,” said Wozniak.
He also spoke at length about the importance of education and working on non-profit projects. Wozniak has spent a lot of time teaching ‘5th grade’ (10 to 11 year olds) students in California. He wrote every lesson himself and refused to allow press into these sessions.
Spending a lot of time reminiscing about the start of dial up Internet and the pre-WiFi era, Wozniak spun the conversation around to remind us how he thinks about the future of technology.
“Artificial Intelligence is a word, but why do we call it that? The brain works differently from computers and we know that humans can [still] work things out so much faster than computers. If Google can recognise thousands of pictures of dogs, it still can’t work out whether that is a picture of a dog on a wall or a dog just sitting right in front of you — you need a two-year old to tell you that,” said Wozniak.
He uses the above example every time he speaks to explain how he’s not so sold on the state of AI today. Wozniak suggested that perhaps we should be focused not on ‘intelligence’, but on ‘smartness’ instead as a means of measuring how well computers can actually do things.
Amusingly, for a man from Apple, Wozniak said that he thinks open standards (he used the example of WiFi around the globe) are really important for us on the technology road ahead.
So is Steve Wozniak worth listening to three times?
It would be very cruel to suggest that he should be castigated for a few repeated references to Steve Jobs. He’s part of history, part of the geek firmament and part of what we love about technology. If you get a chance to listen to him speak in person, then go for it…
… oh, you’re supposed to go out and buy his book, apparently.
Nice plug Woz.
Machine data intelligence and analytics specialist Splunk has detailed a number of product updates at its annual .conf user and customer conference this month.
The self-styled real time operational intelligence company highlighted new versions of its core Splunk Enterprise (now at version 7.2) and Splunk Cloud products – technologies designed to allow users (developers… and, increasingly, less technical users) to search and ‘ask questions’ of machine data, log files and other core system information.
Splunk president & CEO Doug Merritt asserts that today there are two types of company: those who only record events with data — and those who make things happen with data.
Also new this year is Splunk>Next, a means of applying Splunk to a wider variety of data sources — and by that, the company means more real time data, or you could say data that is ‘in motion’.
Opening the aperture
The Splunk Data Stream Processor is designed to help evaluate, transform and perform analytics on data in motion. The focus now is [data] search at massive scale – in this case meaning trillions of events at millisecond speeds, with ‘federated search’ across multiple Splunk deployments through Splunk Data Fabric Search.
“Splunk is building on our strong heritage to evolve the platform for the future,” said Splunk CTO Tim Tully.
Tully has called this an ‘opening of the aperture’ so that Splunk can be used in more places.
“[We will] help our customers pattern match everything with artificial intelligence and machine learning infused across the entire product portfolio. We’re doing this with streaming data, data at rest, data from any source and on whatever kind of device you want to use to take action,” said Tully.
To detail some more of the functions here, users can interact with Splunk products from a mobile device via Splunk Mobile and Splunk Cloud Gateway.
Users can also employ Augmented Reality (AR) to interact with (and take action from) data through features such as QR codes, scanning for dashboards, UPC scanning and near-field communications.
Also noted here, users can ask questions of Splunk using voice and text and receive responses in natural language.
New data sources
Splunk says that customers can now move any data to and from the Splunk platform regardless of its format, state or location. New capabilities in this regard include Guided Data Onboarding, a new graphical user interface that helps move data into Splunk Cloud or Splunk Enterprise and guides users to the best onboarding method for their specific architecture.
Wider aperture, all round
The look and feel from Splunk .conf overall can be described in one word: bigger.
To clarify, that means bigger in terms of attendees — the Orlando convention centre used for this annual event strained in terms of its air conditioning power and ability to serve what is now some 8,000 attendees.
Splunk is also bigger in terms of employees, product sets and partners – and (the whole point of many of the news announcements made, underlined by the CTO’s comments in this story) Splunk is also bigger in terms of its scope and breadth for data, as it now works with more data sources, from Industrial IoT feeds to core machine data, and brings all that together with new mobile functionality.
Splunk .conf 2019 is logically moving to Vegas, you can bet on that.
Real-time operational intelligence specialist Splunk hosted its annual .conf conference in Orlando this October to detail the state of its platform development, showcase customer use cases and dig into what’s likely to be toothsome and flavoursome in the world of machine data and log file analytics on the road ahead.
Splunk is called Splunk in reference to spelunking (cave, or in this case data, exploration) – and the event itself is amusingly called .conf18 in reference to the .conf file extension, used for files that contain the parameters and initial settings for a software application, server process or operating system.
Into this year’s updates then and it appears Splunk has been busy.
Among the product news tabled at Splunk’s .conf18 conference this year were details of the firm’s Splunk for Industrial IoT (IIoT).
This software is a combination of some of the firm’s core technologies: Splunk Enterprise, Splunk Machine Learning Toolkit, and Splunk Industrial Asset Intelligence (IAI), which are now brought together to provide a view of complex industrial data.
Operational Technology (OT)
Splunk says it sees particular deployment relevance for Operational Technology (OT) teams and organisations in manufacturing, oil and gas, power, transportation, energy and utilities.
Ammar Maraqa, senior vice president and general manager of IoT markets at Splunk says that the software itself gives users data analytics and reporting powered by machine learning, combined with new drag-and-drop capabilities specifically to address common challenges for industrial organisations, such as unplanned downtime across disparate systems.
Splunk for Industrial IoT also includes Security and Compliance for Industrial Control Systems (ICS) to safeguard ICS from emerging and persistent cyber threats. This allows, if you will, OT environments to embrace an analytics-driven approach to security.
According to Splunk’s Maraqa, industrial OT operators are increasingly looking to their sensor and other machine data to monitor and diagnose operational issues from industrial assets such as turbines, pumps and compressors.
“Splunk for Industrial IoT gives customers real-time visibility into the health of these assets, providing monitoring, alerting and diagnostics across multiple data sources. Splunk for Industrial IoT also helps monitor the uptime and availability of ICS, SCADA systems, distributed control systems and process control software,” said Maraqa.
Splunk for Industrial IoT gives customers the ability to apply what the company claims to be ‘proven algorithms’ for prediction, anomaly detection, clustering and forecasting to help identify early warning signs and predict downtime of ICS and critical assets.
Splunk also announced a new version of Splunk IT Service Intelligence (ITSI).
The software itself is driven by Splunk’s own machine learning engineering and is designed to automate incident investigation and workflows across all data sources.
“IT environments are complex and dynamic and IT teams are constantly under pressure to make sense of their data and take action,” said Rick Fitz, senior vice president and general manager of IT markets at Splunk. “The ability to use machine learning and artificial intelligence to predict service degradation and prevent issues allows these teams to focus on driving value for the business. We make smart IT teams smarter with a platform that brings together all the data to provide complete visibility.”
With Splunk ITSI 4.0, customers are now able to use Splunk App for Infrastructure with an interface from Splunk ITSI. This gives users access to a full-scale monitoring platform that can correlate their server data across the organisation.
Splunk App for Infrastructure gives system administrators and site reliability engineers a unified approach to monitoring and troubleshooting. Customers can now also access Splunk App for Infrastructure data directly from Splunk ITSI to get more detail at the server level.
Splunk for IIoT and Splunk IT Service Intelligence (ITSI) will be generally available on October 30, 2018.
Here’s a suggestion for you: software application developers should be watching the technical newswires to gauge where the global interconnection pipes of the web will next be strengthened, in order to target specific international application development.
It sounds far-fetched, but it’s not.
The South Atlantic Cable System (SACS) is now on-stream and open for commercial traffic.
The new digital information highway is the first and fastest link between Africa and the Americas, offering the lowest latency and a more direct routing for internet traffic in the Southern Hemisphere.
SACS has been established by Angola Cables, probably everybody’s favourite Luanda, Angola-based advanced submarine telecommunications systems company.
The new web highway was manufactured and powered by NEC Corporation and, under the management of Angola Cables, it now exists as a commercial operation connecting Angola (Africa) and Brazil (South America).
According to the company, data transfer speeds will be improved (five times faster than existing cable routings), reducing latency from Fortaleza (Brazil) to Luanda (Angola) from 350ms to 63ms.
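The quoted improvement is easy to sanity-check arithmetically: dropping from 350ms to 63ms is roughly a 5.6x reduction, consistent with the ‘five times faster’ claim.

```python
# Sanity-check the claimed latency improvement on the Fortaleza-Luanda route.
before_ms = 350  # latency via existing cable routings
after_ms = 63    # latency via SACS

speedup = before_ms / after_ms
print(f"Latency improvement: {speedup:.1f}x")  # Latency improvement: 5.6x
```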
Luanda, London, Miami
Luanda will also connect to London and Miami with approximately 128 milliseconds of latency. These two major content hubs will position Angola as a strategic point to serve the transatlantic region with low-latency, resilient connections.
António Nunes, CEO of Angola Cables said that the commercialization of the cable is “more than just a game-changer” when it comes to data connectivity and services between the two continents.
“By developing and connecting ecosystems that allows for local IP traffic to be exchanged locally and regionally, the efficiency of networks that are serving the Southern Hemisphere can be vastly improved. As these developments progress, they will have considerable impact for the future growth and configuration of the global Internet,” says Nunes.
The cable will give African internet service providers and users a more direct, secure path to the Americas – without having to pass through Europe. Content service providers in Latin America will also stand to benefit, with the option of using the SACS route to reach markets in Africa and Europe without using the traditional, high-volume Northern Hemisphere internet traffic routings.
SACS is 100% owned and managed by Angola Cables and has been designed with 100Gbps coherent WDM technology.
The global scope of web interconnectivity, and the degree of latency within it, is (very arguably) of huge importance as we now approach real-time application analytics that demand millisecond response times, especially in areas like log file analytics and application orchestration and monitoring.
The Internet has to join the dots and get bigger, stronger, faster and better… so Angolan telco connectivity could be affecting more than you first thought.