Kiran
6 min read · Mar 25, 2021


Sustainable development with 5G Service-based architecture and AI-Neural Networks.

Sustainability is a very common topic of discussion these days in every industry. Cellular network operators and infrastructure providers are no exception, and they are continuously looking for ways to address this need.

The UN World Commission on Environment and Development defines sustainable development as “development that meets the needs of the present without compromising the ability of future generations to meet their own needs.” Within Ericsson, my former employer, there is a dedicated focus on achieving this through various initiatives; some info here: https://www.ericsson.com/en/about-us/sustainability-and-corporate-responsibility/sustainability-report.

Simply explained, sustainability means spending our resources efficiently and conservatively, with a great deal of prioritization, to appropriately accommodate our current needs while continuously optimizing to drive the longevity and reuse of those resources, and in the process contributing toward the reduction of carbon emissions.

Cellular networks are a complex piece of infrastructure. From the early days of 1G, through 2G, 3G, 4G, and now 5G, the architecture has evolved to drive efficiencies and new value propositions. That evolution led to changes in people's competencies, in processes, and in the systems governing and managing this infrastructure.

However, through this evolution, there have been continuing attempts to reduce complexity by refining network architectures and adopting newer technologies to deliver on coverage, capacity, and throughput, the three most important requirements a cellular network must meet. To accommodate that, 1G networks started with switching systems and radio base stations deployed across the country to provide coverage for voice services. Time went by and we introduced internet connectivity with 2G. Improvements continued into 3G and 4G networks, providing better coverage, increasing capacity to accommodate a larger number of subscribers, and delivering better throughput.

These improvements have not come cheap. With each incremental improvement, we introduced additional needs across the board. This meant building and augmenting huge data centers, with lots of hardware for computing and storage, leading to the utilization of more power to keep the lights on, which ended up increasing our carbon footprint.

In this process of providing better, ubiquitous service, did we forget about its impact on the environment? In other words, did we forget about sustainability? Infrastructure suppliers like Ericsson and Nokia, along with the wireless carriers, have worked toward optimizing their product offerings with the goal of reducing the carbon footprint, and have been looking to do more.

The introduction of 5G services, and the promise they will deliver, will lead to further expansion of this infrastructure and will therefore only add to the carbon footprint. The question is, will this new 5G technology, and new revisions of it, allow us to really move towards sustainable development and help reduce carbon emissions? What tools should we arm ourselves with to make that happen?

Can AI solve this need?

Let’s explore.

The new 5G architecture is heavily focused on a concept called “Service-based architecture” (SBA). This simply means that there is a separation of responsibility between individual network functions, and these functions cooperate to execute the overall service, e.g., voice and/or data.

Technologies like virtualization, containerization, and serverless computing play an important role in this architecture. These network functions are called virtual network functions (VNFs), and when containerized they are called containerized network functions (CNFs).

SBA departs from many of our traditional architectures in the cellular industry, where we used a monolithic piece of software operating on purpose-built, dedicated hardware hosted in a dedicated data center. Through virtualization and containerization, it is now possible to reuse the same hardware for multiple pieces of software, each delivering specific functions.
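As a purely conceptual illustration of that separation of responsibility (the class and service names below are made up for this post, not 3GPP-defined interfaces), a service-based design can be thought of as small, independent functions that discover each other through a catalog and cooperate to deliver a session:

```python
# Conceptual sketch only: toy "network functions" cooperating through a
# registry, to illustrate separation of responsibility in a service-based
# design. Names and interfaces are illustrative, not 3GPP APIs.

class Registry:
    """Minimal service catalog: functions register themselves and discover each other."""
    def __init__(self):
        self._services = {}

    def register(self, name, service):
        self._services[name] = service

    def lookup(self, name):
        return self._services[name]


class SubscriberData:
    """Owns subscriber profiles; nothing else in the system duplicates this responsibility."""
    def get_profile(self, imsi):
        return {"imsi": imsi, "allowed": True}


class SessionManager:
    """Sets up a data session, delegating subscriber checks to the subscriber-data function."""
    def __init__(self, registry):
        self.registry = registry

    def create_session(self, imsi):
        profile = self.registry.lookup("subscriber-data").get_profile(imsi)
        return {"session": "established" if profile["allowed"] else "rejected"}


registry = Registry()
registry.register("subscriber-data", SubscriberData())
registry.register("session-manager", SessionManager(registry))

print(registry.lookup("session-manager").create_session("001010123456789"))
```

In a real 5G core, these would be separately deployable services communicating over standardized service-based interfaces rather than in-process calls, but the principle of narrowly scoped, cooperating functions is the same.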

Cellular networks are all about coverage, capacity, and throughput. The 5G SBA paradigm, together with virtualization and containerization, now makes it possible to run these VNFs and CNFs on commercial off-the-shelf (COTS), energy-efficient, optimized hardware.

This change allows us to invoke these services on demand to handle capacity, and to scale to offer more throughput and coverage based on subscriber density at specific points in time and locations.

All of these activities can be executed using sophisticated orchestration processes currently operated manually.

However, given the scale at which these networks operate and the dynamic nature of subscriber activity, manual orchestration will not suffice. There will be a huge need for sophisticated automation of this orchestration, proactively managed by an intelligent orchestration management function.

How can AI help?

AI can play a big role in this type of automation. Let’s call this “intelligent automation,” or self-optimizing networks. Automation will be better served by decisions based on sophisticated AI deep learning algorithms, which can predict an outcome so that an intelligent action can be taken. These algorithms need to be trained on good data to make those intelligent decisions.

Cellular networks produce a lot of operational data. This is field data, which is mostly temporal. It can be collected, cleaned, and made available to AI models as training and test data.

As an example, the data set will include the following:

  1. Subscriber density at a specific location, which changes based on the time of day. We measure this as peak-hour load on the network, and most of our systems are dimensioned to handle peak-hour traffic. This will drive the infrastructure configuration needed to accommodate it.
  2. Traffic types at a particular location → video, voice, messaging.
  3. Volume of data traversing the network → different loads based on the density of subscribers at that location.
  4. Network behavior insights based on traffic progression.
  5. Radio signal characteristics/behaviors.
  6. Radio and other network resources consumed.

This temporal data will be a great source of training data and test data for a deep learning model.
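As a rough sketch of what that collection and cleaning step could look like (the counters, cell IDs, and intervals below are synthetic placeholders, not data from a real network), the raw measurements can be aligned onto a fixed time grid and split chronologically:

```python
# Hypothetical sketch: cleaning raw operational counters into a regular,
# evenly spaced time series for model training. Column names are illustrative;
# in practice `raw` would come from the network's performance-management system.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
timestamps = pd.date_range("2021-03-01", periods=4 * 24 * 7, freq="15min")
raw = pd.DataFrame({
    "timestamp": timestamps,
    "cell_id": "cell-42",
    "connected_users": rng.poisson(200, len(timestamps)),
    "prb_utilization": rng.uniform(0.2, 0.9, len(timestamps)),
})

# Keep one cell/location, sort chronologically, resample onto a fixed
# 15-minute grid so the sequence is evenly spaced, and interpolate short gaps.
series = (raw[raw["cell_id"] == "cell-42"]
          .drop(columns="cell_id")
          .sort_values("timestamp")
          .set_index("timestamp")
          .resample("15min")
          .mean()
          .interpolate(limit=4))

# Chronological split: the last day is held out as test data.
train, test = series.iloc[: -4 * 24], series.iloc[-4 * 24 :]
```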

The main responsibility of this model will be to predict the utilization in real-time, to help the network “reconfigure” itself in an autonomous way to better utilize the resources.

Recurrent Neural Network

I am thinking that neural networks like recurrent neural networks (RNNs) would be a great fit for such a problem. The available data is temporal and sequential, and the feedback mechanisms provided by RNNs will help the model reach the accuracy level that is needed.

The operational data collected is time-series data and serves as an input for the RNNs.

Use case

An example use case would be: we want to predict user density at a particular location and, based on that, activate specific VNFs/CNFs in the infrastructure to support that user density on demand.

RNNs are particularly good at handling inputs and outputs that change over time. In our case, the consumed radio resources are input parameters; directly correlated with those resources is the number of users consuming them, which serves as an additional input parameter.

An RNN-based model can be employed and trained using multiple input vectors such as signal strength, transmit power, and user-generated network data. The volume of data needed can be sourced from existing deployments to help the model train, learn, and adjust its predictions so that the right set of infrastructure resources is invoked to support that user density.

In our case, suppose we have a few input vectors (a sketch of how these could be framed for training follows the list):

  1. The number of users connected at timestamp “t”,
  2. The number of network resources utilized at that timestamp “t”,
  3. The number of compute cycles consumed by specific CNFs at that timestamp “t”.
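Assuming those three measurements are available and aligned per timestamp (the arrays below are synthetic stand-ins), one common way to frame this for an RNN is to stack them into feature vectors and cut the sequence into fixed-length windows, where each window is used to predict the next value:

```python
# Sketch: turn per-timestamp measurements into (window, features) samples
# for sequence learning. The series here are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(0)
T = 1000  # number of timestamps

num_users      = rng.poisson(200, T).astype(float)            # users connected at t
resources_used = 0.004 * num_users + rng.normal(0, 0.05, T)   # fraction of radio resources in use
compute_cycles = 1.5e6 * num_users + rng.normal(0, 1e5, T)    # compute consumed by specific CNFs

features = np.stack([num_users, resources_used, compute_cycles], axis=1)  # shape (T, 3)

def make_windows(x, target, window=24):
    """Cut the series into overlapping windows; each window predicts the next target value."""
    X, y = [], []
    for i in range(len(x) - window):
        X.append(x[i:i + window])
        y.append(target[i + window])
    return np.array(X), np.array(y)

# Predict the next timestamp's user count from the previous `window` steps.
X, y = make_windows(features, num_users)
print(X.shape, y.shape)   # (T - 24, 24, 3) and (T - 24,)
```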

RNNs extend feed-forward neural networks with a feedback loop: at each timestamp “t”, the network computes and maintains an internal state based on the current input, the previous state, the chosen weights, and the activation function. This internal state is remembered and serves as an input to the next step of the RNN, and the process continues through time (t+1, t+2, …, t+n) until we run out of sequential data.
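To make that state update concrete, here is a minimal hand-rolled sketch of a recurrent step (illustrative only, with tanh chosen as the activation and randomly initialized weights): the state at timestamp “t” is computed from the current input and the previous state, then carried forward.

```python
# Minimal sketch of the recurrent state update: h_t = tanh(Wx·x_t + Wh·h_{t-1} + b).
import numpy as np

rng = np.random.default_rng(1)
n_features, n_hidden = 3, 16

Wx = rng.normal(scale=0.1, size=(n_hidden, n_features))  # input-to-hidden weights
Wh = rng.normal(scale=0.1, size=(n_hidden, n_hidden))    # hidden-to-hidden (recurrent) weights
b  = np.zeros(n_hidden)

def rnn_forward(sequence):
    """Run one sequence of shape (T, n_features) through the cell and return the final state."""
    h = np.zeros(n_hidden)                    # initial hidden state
    for x_t in sequence:                      # step through timestamps t, t+1, ...
        h = np.tanh(Wx @ x_t + Wh @ h + b)    # state depends on current input and previous state
    return h

final_state = rnn_forward(rng.normal(size=(24, n_features)))
print(final_state.shape)  # (16,)
```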

The values of these vectors change over time. So, given a set of temporal training data, we have to train the model to accurately predict possible outcomes. Through each training iteration, the model continues to improve the F1 score of its predicted output. In this case, it will reach an accuracy level that indicates the resources the network needs to prepare in advance to accommodate a specific traffic pattern.

In our case, since these inputs arrive in a time sequence and keep changing, an RNN-based model could continuously learn and adjust its output to predict the resource utilization pattern. That prediction can be used to determine the network resources (compute, storage, network) that may be needed to accommodate a noticeably upward-trending user population in a given area. The network can then decide to auto-scale to handle the surge at a specific point in time, all intelligently and autonomously.
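A sketch of what such a model could look like in Keras (the layer sizes, window length, and training settings are illustrative assumptions, not tuned values), using an LSTM, a common RNN variant, to predict the next timestamp's user count from the windowed features built earlier:

```python
# Illustrative Keras sketch: an LSTM that predicts the next timestamp's
# user count from a window of past measurements. X and y are assumed to
# come from the windowing sketch above: X -> (samples, 24, 3), y -> (samples,).
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(24, 3)),   # 24 past timestamps, 3 features each
    tf.keras.layers.LSTM(32),               # recurrent layer carries state across the window
    tf.keras.layers.Dense(1),               # predicted user count at the next timestamp
])
model.compile(optimizer="adam", loss="mae")

# Chronological split so the model is validated on later data than it trained on.
split = int(0.8 * len(X))
X_train, X_val = X[:split], X[split:]
y_train, y_val = y[:split], y[split:]

model.fit(X_train, y_train, epochs=20, batch_size=64, validation_data=(X_val, y_val))
predicted_users = model.predict(X_val)
```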

That is our desired outcome.

Conclusion

If such a predictive model is deployed in the cellular network, it could efficiently help in the automation process by forking new instances of VNFs and CNFs on demand.
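As a final illustrative sketch (the per-instance capacity, bounds, and scaling hook are all hypothetical; a real deployment would act through the operator's orchestrator), the model's prediction could drive a simple scale-up/scale-down decision like this:

```python
# Hypothetical sketch: map a predicted user count to a desired number of CNF
# instances and report the scaling decision. All numbers are illustrative.
USERS_PER_INSTANCE = 250          # assumed capacity of one CNF instance
MIN_INSTANCES, MAX_INSTANCES = 2, 20

def desired_instances(predicted_users: float) -> int:
    """Translate a predicted load into an instance count, clamped to safe bounds."""
    needed = -(-int(predicted_users) // USERS_PER_INSTANCE)   # ceiling division
    return max(MIN_INSTANCES, min(MAX_INSTANCES, needed))

def reconcile(predicted_users: float, current_instances: int) -> int:
    """Scale up ahead of a predicted surge, scale down when the load recedes."""
    target = desired_instances(predicted_users)
    if target != current_instances:
        # In practice this would call the orchestrator's scaling interface;
        # here we only print the decision.
        print(f"scaling from {current_instances} to {target} instances")
    return target

current = reconcile(predicted_users=2300, current_instances=6)
```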

This will aid in the reuse of resources (compute/storage/network) based on need, scaling up or down based on traffic.

Finally, this will help improve overall operational efficiency and resource utilization, and will therefore contribute toward achieving the desired outcome of sustainable development.


Kiran

On a mission to build the new Enterprise. Technologist and part time cinematographer.