SEACOM invests in fibre capacity to support cloud computing

Pan-African service provider SEACOM has announced plans to double the data capacity on its broadband submarine cable system from 1.5 terabits per second to 3 terabits per second. The move is intended to help more businesses on the continent adopt emerging technologies such as cloud computing.

SEACOM CEO Byron Clatterbuck says the decision is informed by the increasing demand for cloud-based data processing by companies with multinational operations across the continent.

“It’s not just about connecting from Africa to Europe and Asia anymore,” Clatterbuck said. “A lot of content and computing power is moving onto the continent, so connectivity requirements are becoming more regional, and specifically interregional. With such a complex environment, greater capacity is essential.”

SEACOM is already providing direct broadband access to corporate customers through its SEACOM Business arm.

As a partner to African business, the undersea broadband cable services provider has already enabled cloud-based operations for a variety of companies through high-speed, secure and reliable connectivity to platforms such as Microsoft Azure and Amazon Web Services.

Going forward, the company says it plans on expanding further inland, widening fibre access across the continent while targeting large and medium corporations with its premium offerings.

“You will see more terrestrial cables being laid, and the quality of those builds will get better,” Clatterbuck explained. He added: “This isn’t to say there aren’t challenges. There is a long way to go in terms of basic infrastructure provision, relating to roads, rail and highways, all of which make it easier and more affordable to deploy fibre-optic networks.”

In April, SEACOM announced the conclusion of the agreement, first announced in November 2018, for the 100% acquisition of FibreCo Telecommunications. FibreCo owns and operates a national open-access dark fibre network, providing infrastructure and connectivity services across South Africa. Acknowledging the deal’s benefits for the South African economy and local citizens, the South African Competition Commission approved the acquisition in March.

The FibreCo acquisition represents another significant step in fulfilling SEACOM’s vision of expanding its footprint in South Africa, and Africa as a whole, during 2019 through the consolidation of fibre assets. SEACOM believes this is necessary for the evolution of the market, particularly with the increased demand for data owing to the growth in fibre-based connectivity and the emergence of technologies such as 5G.

The acquisition of FibreCo further enables SEACOM to scale and upgrade its African Ring by connecting its East and West coast submarine assets with a robust network of trans-South African fibre.

While SEACOM connects South Africa to the east coast of Africa, India and Europe, FibreCo’s network runs along South Africa’s highest-traffic transmission routes and connects over 60 points of presence across the country, including key data centres in major metros such as Johannesburg, Cape Town, Bloemfontein, Durban and East London.

Additional end-to-end fibre connects the SEACOM subsea cable system (which lands at Mtunzini on the east coast of South Africa) to the WACS cable (which lands at Yzerfontein, on the west coast of the country), ensuring fully redundant high-speed ring protection around the African continent.

By expanding its wholesale portfolio to include several national long-distance services and last mile metro connectivity, SEACOM has become the provider of choice to local and international data communications customers.

Lighting up additional fibre across South Africa also creates a platform for SEACOM to deliver affordable, high-speed Internet connectivity and cloud services to traditionally underserved mid-tier cities and towns along the new routes.

www.seacom.mu

[Kenya] VMware, Strathmore University partner to enhance digital skills in Africa

VMware has announced the expansion of the VMware IT Academy: Virtualize Africa programme in partnership with Strathmore University – @iLabAfrica Centre, Kenya.

The overarching goal, according to VMware, is to empower the fast-growing, young African population to enter the digital workforce with confidence and expertise, helping to address the skills gap and supporting innovation and entrepreneurship across the continent.

Through the VMware IT Academy: Virtualize Africa programme, VMware is collaborating with key stakeholders across academia, government and industry to equip African students with the technical skills and certifications required to succeed in the digital economy.

Working with VMware IT Academy: Virtualize Africa, Strathmore University has already begun integrating a range of VMware-developed courses into its curricula, covering topics such as virtualisation, cloud computing, AI and IoT. This is facilitated through subsidised software licences and certification vouchers from VMware.

@iLabAfrica, a Centre of Excellence in Research and Innovation in Information Communication Technology at the University, is spearheading the rollout, with 20 trainers and over 100 students at the University participating. The students will benefit from access to high-quality online learning resources, hands-on lab experiences to develop technical skills, and the opportunity to achieve industry-recognised VMware certification to complement their chosen fields of study.

“We are delighted to be part of VMware IT Academy: Virtualize Africa. It provides a wonderful opportunity for our students to gain technical skills and industry-recognised VMware certifications, helping to jumpstart their careers with the best knowledge and skills of international standards. Our shared goal with VMware is to become the VMware IT Academy regional lead for East Africa, training lecturers and students from Strathmore and other universities plus facilitating their participation in the programme. Increased access to this type of education and training for students is a critical part of Africa realising the potential of its youth and a prosperous Africa,” said Dr. Joseph Sevilla, Director @iLabAfrica, Strathmore University.

“Skills development is recognised as a key component for economic growth and prosperity. VMware IT Academy: Virtualize Africa helps educational institutions align curricula with the skills needed for the labour market, thereby building the right talent for Africa’s jobs of today and tomorrow. Our discussions to form a strategic collaboration with Strathmore University are a significant milestone in this programme, and will bring new skills and opportunities to its students, and in the future to many more young people in East Africa,” said Thomas MacKay, Senior Director for Global Strategic Programs, VMware.

www.vmware.com

[Column] Harish Chib: Seven best practices for securing the public cloud

The simplicity and cost-effectiveness of the public cloud have led more and more organizations to take advantage of Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP). You can spin up a new instance in minutes, scale resources up and down whenever you need while only paying for what you use, and avoid high upfront hardware costs. 

While the public cloud solves many traditional IT resourcing challenges, it does introduce new headaches. The rapid growth of cloud usage has resulted in a fractured distribution of data, with workloads spread across disparate instances and, for some organizations, platforms. As a result, keeping track of the data, workloads, and architecture changes in those environments to keep everything secure is often a highly challenging task.

Public cloud providers are responsible for the security of the cloud (the physical data centers and the separation of customer environments and data). However, the responsibility for securing the workloads and data placed in the cloud lies firmly with the customer. Just as organisations need to secure the data stored in their on-premises networks, so they need to secure their cloud environment. Misunderstanding of this distribution of ownership is widespread, and the resulting security gaps have made cloud-based workloads the new pot of gold for today’s savvy hackers.

Seven Steps to Securing the Public Cloud

The secret to effective cybersecurity in the cloud is improving your overall security posture: ensuring your architecture is secure and configured correctly, that you have the necessary visibility into your architecture, and importantly, into who is accessing it.

Step 1: Learn your responsibilities

This may sound obvious, but security is handled a little differently in the cloud. Public cloud providers such as Amazon Web Services, Microsoft Azure, and Google Cloud Platform run a shared responsibility model – meaning they ensure the security of the cloud, while you are responsible for anything you place in the cloud.

Step 2: Plan for multi-cloud

Multi-cloud is no longer a nice-to-have strategy; it has become a must-have. There are many reasons why you may want to use multiple clouds, such as availability, improved agility, or functionality. When planning your security strategy, start with the assumption that you’ll run multi-cloud – if not now, at some point in the future. In this way you can future-proof your approach.

Step 3: See everything

If you can’t see it, you can’t secure it. That’s why one of the biggest requirements for getting your security posture right is accurate visibility of all your cloud-based infrastructure, configuration settings, API calls, and user access.
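
To make this concrete, here is a minimal sketch of the kind of inventory script that can feed such visibility, assuming an AWS environment with the boto3 SDK and configured credentials; the region names are illustrative and the snippet is deliberately incomplete (it ignores pagination, other clouds and other resource types).

```python
# Minimal visibility sketch (assumption: AWS, boto3, credentials already configured).
# Lists running EC2 instances and security groups per region as a basic inventory.
import boto3

def inventory(regions=("af-south-1", "eu-west-1")):  # illustrative regions
    for region in regions:
        ec2 = boto3.client("ec2", region_name=region)
        running = [
            inst["InstanceId"]
            for reservation in ec2.describe_instances()["Reservations"]
            for inst in reservation["Instances"]
            if inst["State"]["Name"] == "running"
        ]
        groups = [g["GroupId"] for g in ec2.describe_security_groups()["SecurityGroups"]]
        print(f"{region}: {len(running)} running instances, {len(groups)} security groups")

if __name__ == "__main__":
    inventory()
```

An equivalent inventory would need to be repeated for every other platform in use (Azure, GCP), which is exactly why consolidated, tool-based visibility matters.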

Step 4: Integrate compliance into daily processes

The dynamic nature of the public cloud means that continuous monitoring is the only way to ensure compliance with many regulations. The best way to achieve this is to integrate compliance into daily activities, with real-time snapshots of your network topology and real-time alerts to any changes.
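
As a hedged illustration of what one such daily check might look like, the sketch below flags S3 buckets that have no public access block configured, assuming an AWS environment with boto3; it covers only this single control and would normally run on a schedule alongside many others.

```python
# Compliance-check sketch (assumption: AWS with boto3; a single illustrative control).
# Flags S3 buckets with no public access block configured, suitable for a daily job.
import boto3
from botocore.exceptions import ClientError

def buckets_without_public_access_block():
    s3 = boto3.client("s3")
    findings = []
    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        try:
            s3.get_public_access_block(Bucket=name)
        except ClientError as err:
            if err.response["Error"]["Code"] == "NoSuchPublicAccessBlockConfiguration":
                findings.append(name)  # nothing configured: raise an alert
            else:
                raise
    return findings

if __name__ == "__main__":
    for name in buckets_without_public_access_block():
        print(f"ALERT: bucket {name} has no public access block configured")
```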

Step 5: Automate your security controls

Cybercriminals increasingly take advantage of automation in their attacks. Stay ahead of the hackers by automating your defenses, including remediation of vulnerabilities and anomaly reporting.
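
To illustrate, here is a hedged sketch of one automated remediation, again assuming AWS with boto3: it finds security group rules that open SSH (port 22) to 0.0.0.0/0, revokes just that range and reports the change. In practice such actions would sit behind approval workflows and alerting.

```python
# Automated-remediation sketch (assumption: AWS with boto3; SSH open to the world is
# used as the example finding). Revokes only the offending 0.0.0.0/0 range and reports it.
import boto3

def remediate_open_ssh(region="af-south-1"):  # illustrative region
    ec2 = boto3.client("ec2", region_name=region)
    for group in ec2.describe_security_groups()["SecurityGroups"]:
        for perm in group.get("IpPermissions", []):
            world_open = any(r.get("CidrIp") == "0.0.0.0/0" for r in perm.get("IpRanges", []))
            if perm.get("FromPort") == 22 and world_open:
                ec2.revoke_security_group_ingress(
                    GroupId=group["GroupId"],
                    IpPermissions=[{
                        "IpProtocol": perm["IpProtocol"],
                        "FromPort": 22,
                        "ToPort": perm.get("ToPort", 22),
                        "IpRanges": [{"CidrIp": "0.0.0.0/0"}],
                    }],
                )
                print(f"Revoked open SSH rule on {group['GroupId']}")

if __name__ == "__main__":
    remediate_open_ssh()
```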

Step 6: Secure ALL your environments (including dev and QA)

You need a solution that can secure all your environments (production, development, and QA) both reactively and proactively.

Step 7: Apply your on-premises security learnings

On-premises security is the result of decades of experience and research. Use firewalls and server protection to secure your cloud assets against infection and data loss, and keep your endpoint and email security up to date on your devices to prevent unauthorized access to cloud accounts.

Moving from traditional to cloud-based workloads offers huge opportunities for organizations of all sizes. Yet securing the public cloud is imperative if you are to protect your infrastructure and organization from cyberattacks. By following these seven steps, you can maximize the security of your public clouds while also simplifying management and compliance reporting.

Harish Chib is Vice President, Middle East & Africa, at Sophos.

[Column] Kree Govender: Why cloud hasn’t had a big impact on Business Intelligence

Although the notion of network-based computing stems right back to the 1960s, the modern term “cloud computing” arose in the 2000s. Yet, almost two decades later, South Africa still lags in both its adoption and its use for critical functions like business intelligence (BI).

While many believe that this is largely due to a lack of local data centre infrastructure, the landing of the Azure data centres in Africa will drastically change the Cloud landscape across the continent. “This effectively eradicates the fear of shifting massive datasets offshore to global data centres,” confirms Kree Govender, Managing Director of South Africa Qlik Master Reseller (SAQMR). 

The current hesitance towards cloud adoption in Africa is illustrated by Qlik implementations across the continent. Statistics show that as much as 95% of Qlik’s customers in Africa run the platform on-premises.

“Gartner predicts that by 2025, 80 percent of enterprises will migrate entirely away from on-premises data centres with the current trend of moving workloads to colocation, hosting and the cloud leading them to shut down their traditional data centre,” adds Govender. “If these predictions prove accurate, the new data centres will mean there’s no longer anything holding Africa back from catching up with the rest of the world.” 

Adam Barrie-Smith, Chief Technology Officer at SAQMR, believes that the Qlik platform is perfectly positioned to capitalise on the benefits that these data centres will offer. “This will complement extensive mobile analysis testing using Qlik’s SaaS and Cloud business, leveraging Qlik Sense’s multi-cloud capabilities. The first advantage is the data centre, the next will be the containerised cloud environment which is set to follow soon.”

To Barrie-Smith, one of the greatest benefits of local data centres is enhanced identity management. “Let’s consider the impact on the banking industry, for example. Most African banks still hold on-premise hardware, which is now reaching retirement age. The question now becomes: should they invest in more hardware or virtualise? With the new data centres, our banking customers will find it much simpler and more cost-effective to embrace the Cloud, through a hosted layer within Azure.”

While making Cloud adoption easier, the new data centres also offer rich integration capabilities, enhanced virtualisation opportunities, a more elastic environment and greater security. “With the local Azure data centres, African organisations will be empowered to embrace hybrid cloud, and we predict a much greater cloud drive,” concludes Govender.  

 Kree Govender is the Managing Director of South Africa Qlik Master Reseller (SAQMR). 

[Column] Trent Odgers: Maximizing data availability using a multi-cloud approach

The ways businesses leverage cloud to manage and maximize the value of their data continues to evolve.

Following the recent launch of two multinational data centres in South Africa, the years when adopting cloud-based solutions felt like a first step into some brave new world are well and truly behind us.

However, this is ushering in a new era of multi-cloud deployment – one which is attracting attention, questions, and scepticism from local businesses.

A hybrid cloud is an amalgamation of on-premises “private cloud”, public cloud and managed cloud service provider (CSP) environments into a single entity, with data physically located in multiple data centres to deliver the right fit for a specific workload. It is a nod towards the fact that businesses are increasingly using different clouds for different purposes.

In today’s digital economy, 81% of enterprises are embracing a multi-cloud strategy, and many South African businesses have already joined this digital gold rush, with many more planning to do so.

It is common for the IT industry to promote the idea of a one-stop-shop or single-provider strategy – to avoid the perceived inefficiency and confusion of dealing with multiple vendors.

This is the “traditional way” of doing IT, which had its place, but with the speed at which the world is changing, IT can only truly deliver on the business’s requirements using the hybrid approach.

Data is now described as the new oil of the digital economy, and it has become a company’s most valuable resource. As businesses demand an infrastructure which maximises the potential value of that data, IT departments are under pressure to deliver.

For example, a business may wish to store data from its business unit in Google Cloud for scalability at relatively low expense but use Amazon Web Services (AWS) for its R&D databases to enjoy the benefits of AI and voice-assisted search.

And in the same instance, that business could be using Microsoft Azure to help drive its productivity solutions or mission-critical enterprise resource planning processes, while keeping a copy of all the data on-premises or hosted at a local cloud provider. 

Previously, the only viable decision for the business would have been to make a judgment call based on its priority needs and budget constraints. Today, the best strategic option is to adopt a multi-cloud approach.

Data-driven transformation

Already, there is a movement for organisations to become more data-driven. Decision-makers are recognising the importance of data in both high-level business strategy as well as on the operational side of their business. 

Furthermore, consumers and employees are beginning to appreciate the true value of their data, which means businesses must ensure that the people who share data with them see the value in doing so through receiving more personalised experiences.

People want to know that their data is protected and secure, and they also want greater transparency about what it is being used for.

Of course, in South Africa, this is where it is critical to adhere to corporate governance requirements, especially the likes of the Protection of Personal Information Act (POPIA).

 Fortunately, with local multi-national data centres, aspects such as data sovereignty and speed of accessing data are no longer concerns.

But creating this data-driven culture is underpinned by continuous digital transformation – embracing the latest and greatest technologies which allow the business to repeatedly lift its performance levels. 

According to Gartner’s 2018 CIO Agenda report, making progress towards becoming a digital business is a top priority for CIOs – and the proliferation towards multi-cloud reflects this trend.

Despite this, the latest Veeam Cloud Data Management Report reveals that more than one in ten decision-makers said their organisation has experienced over 10 unplanned outages in the last 12 months, with unplanned outages lasting an average of 65 minutes.

Successful multi-cloud deployments depend on the always-on availability of all apps and data. So, businesses looking to take advantage of multi-cloud environments must ensure that their apps and data are always available – and that their culture of data-driven decision-making is fully supported to maintain customer confidence and brand reputation.

Availability in the multi-cloud

The complexity of maintaining availability within a multi-cloud environment stems from the reliance on multiple cloud service providers (CSPs). While all major vendors and CSPs will make backup and disaster recovery (DR) solutions available to their customers, each provider has different protocols, shared responsibility models, service level agreements (SLAs) and capabilities.

The last thing any business wants to hear when disaster strikes is that they are not adequately protected or that recovery has failed.

While no business, regardless of whether it is using multi-cloud or not, can guarantee that it will never experience unplanned downtime, every business can ensure that it is prepared for this possibility.

Even having local data centres is no guarantee that there will never be any downtime. South African businesses opting for multi-cloud need to ensure that they have an availability solution which sits across their entire cloud platform, making cloud data protection easy with a seamless process for sending data offsite to the cloud.
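
As a purely generic illustration of that “sending data offsite” step (not any specific vendor’s product), the sketch below copies a local backup archive to an object storage bucket in another location using boto3; the bucket name and file path are hypothetical.

```python
# Offsite-copy sketch (generic illustration; bucket name and file path are hypothetical).
# Assumes boto3 with credentials configured for the target object storage account.
import boto3

def send_backup_offsite(archive_path="backups/db-backup.tar.gz",
                        bucket="example-offsite-backups"):
    s3 = boto3.client("s3")
    # Upload the local archive so a copy of the data exists outside the primary environment.
    s3.upload_file(archive_path, bucket, archive_path)
    print(f"Copied {archive_path} to s3://{bucket}/{archive_path}")

if __name__ == "__main__":
    send_backup_offsite()
```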

For businesses using multi-cloud to power their digital transformation in a bid to establish a more data-driven culture across the organisation, data is akin to running water – a utility on which everyone relies and which must be available at all times.

Businesses embracing multi-cloud should not be put off by the prospect of working with multiple vendors, as software-based platforms can provide peace of mind and a turnkey solution for minimising downtime.

Trent Odgers is Cloud and Hosting Manager for Africa at Veeam.