The UAE faces a combination of challenges, in common with many Gulf countries.
- The population comprises just over 1 million Emirati citizens and almost 9 million expatriates. As a result, services such as the police force are stretched thin.
- The average daytime temperature in Dubai has exceeded 29°C for several years, making manual labour during the day gruelling and potentially hazardous.
- The country is transitioning away from an oil-based economy.
While these may not seem closely related, they are all factors that lend themselves to a digital transition in the UAE: a refocusing on digital systems, robotics and AI to support humans in gruelling environments. As part of this, the authorities launched the Technology Innovation Institute, or TII. 6GWorld spoke to Professor Merouane Debbah about his role there and the work being done at the leading edge of AI. Much of that work, in essence, aims to make AI function more like people.
Leading AI models operate by deriving insights from billions of pieces of data. “Of course the big issue today in machine learning and AI is that we humans do not need to see 1 billion cat images before we recognise a cat,” Debbah observed.
This is supervised learning: the ‘supervision’ lies in the fact that somebody has labelled the data. While these huge ‘exa-scale’ models have been the way forward to date, they are not elegant and not how people work. Instead, a better way forward may well be “a routine called self-supervision, where you have a small dataset and then it starts learning on its own. This is the key to a lot of breakthroughs in all the big companies, which are now building these models where they don’t need to label everything you can imagine.”
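To make the distinction concrete, here is a minimal, illustrative sketch of a self-supervised objective (masking part of the input and asking the model to predict it back) written in PyTorch. The sizes, names and masking scheme are assumptions chosen for illustration, not a description of TII's models.

```python
# Illustrative sketch only: a self-supervised "mask and predict" objective.
# No human labels are used; the training signal comes from the data itself.
import torch
import torch.nn as nn

data = torch.randn(256, 32)            # unlabelled samples, 32 features each
encoder = nn.Sequential(nn.Linear(32, 16), nn.ReLU())
decoder = nn.Linear(16, 32)
opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)

for step in range(100):
    mask = (torch.rand_like(data) > 0.25).float()   # hide roughly 25% of each sample
    corrupted = data * mask
    reconstruction = decoder(encoder(corrupted))
    # The "label" is the data itself: predict the hidden values from the visible ones.
    loss = ((reconstruction - data) ** 2 * (1 - mask)).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```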
While this may well be a necessary step, in itself it’s not enough.
“There are several approaches where we’re trying to be more energy efficient in the way we’re learning,” Debbah explained.
This is a substantial issue. As this TechTarget article outlines, the energy consumed simply to train a single, relatively simple AI can outstrip the annual energy consumption of a household. By contrast, the human brain runs on roughly 20 watts. The difference is orders of magnitude. Indeed, few universities can afford to run their own models and instead rely on commercially operated systems such as OpenAI's, simply because the energy demands of running such an AI would be unaffordable.
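A rough back-of-the-envelope comparison makes the gap visible. The figures below are illustrative assumptions chosen only for scale (household usage and training hardware vary widely), not measurements:

```python
# Rough orders-of-magnitude comparison; all figures are illustrative assumptions.
HOURS_PER_YEAR = 24 * 365

brain_watts = 20
brain_kwh_per_year = brain_watts * HOURS_PER_YEAR / 1000      # about 175 kWh

household_kwh_per_year = 10_000        # assumed typical annual household usage

# Assume a large training run: e.g. 1,000 GPUs at ~300 W each for 30 days.
training_kwh = 1_000 * 300 * 24 * 30 / 1000                   # about 216,000 kWh

print(f"Brain:     {brain_kwh_per_year:,.0f} kWh/year")
print(f"Household: {household_kwh_per_year:,} kWh/year")
print(f"Training:  {training_kwh:,.0f} kWh for one run")
```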
How, then, has AI made such a difference to us in recent years? Mainly, Debbah argues, because the sheer computing power available has grown. AI itself hasn’t changed so much.
“Many of the algorithms we’re using date from the Nineties,” he said. “There are of course improvements, but I would say incremental stuff, and the majority of algorithms are based on them.”
Clearly this can’t continue if we want to realise an environment with more pervasive AI that is, at the same time, sustainable. In other words, we need artificial intelligence to work more like actual intelligence. There are a variety of angles Debbah’s team at TII is exploring.
“I would say humans are very robust, in the sense that we can transfer our learning,” he begins. “If you learn French, obviously Spanish would not be a big issue for you. So you’re very broad in how you learn; that’s the first thing.
“The second is that you have this self-supervision, in the sense that after a certain moment the system kicks off. As a kid, your teachers or parents taught you at first; but there’s a moment where the kid can go out and recognise new animals as different from a cat without anyone telling her what they are. You start defining.”
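In machine-learning terms, the ‘French to Spanish’ kind of transfer is usually pictured as reusing a pretrained model and retraining only a small task-specific head on a handful of new examples. A hedged sketch of that generic recipe (not TII's code; all sizes and data are invented):

```python
# Illustrative transfer-learning sketch: keep a pretrained "backbone" frozen
# and train only a small task-specific head on a few new examples.
import torch
import torch.nn as nn

backbone = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 16))
# Pretend the backbone was already trained elsewhere ("French"); freeze it.
for p in backbone.parameters():
    p.requires_grad = False

head = nn.Linear(16, 3)                  # new, small task ("Spanish"), 3 classes
opt = torch.optim.Adam(head.parameters(), lr=1e-2)

x_small = torch.randn(30, 32)            # only 30 labelled examples
y_small = torch.randint(0, 3, (30,))

for step in range(200):
    logits = head(backbone(x_small))
    loss = nn.functional.cross_entropy(logits, y_small)
    opt.zero_grad()
    loss.backward()
    opt.step()
```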
There are two goldfish in a tank. One of them says to the other “I can’t drive this thing!”
There is another important difference that sets human reasoning apart from machine reasoning today: contextuality. We instinctively frame comments, concepts and more within contexts; subverting expected contexts is one of the more important ways humans create humour.
If anything, context is more profound as a concept than we usually think. Debbah goes on to explain:
“I can define you: you have green eyes, you’re like this, you have this character and so on.
“The other way I can define you is by knowing your friends. You’re totally defined by the people you’re surrounded with, because if you’re a friend of that guy it means there’s some kind of link in your character. So if you give me all your network, you have the context; and if I have your context I know who you are as a person.”
This concept of context – of how things or concepts are linked to each other topologically – seems to be as important for human-like cognition as how things are defined in themselves, and enabling systems to create and adapt this understanding of context is a significant missing link in AI today.
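A toy way to express the idea in code is to represent an entity not by its own attributes but by aggregating the attributes of whatever it is linked to, as in a single step of graph message passing. This is only an illustrative sketch of the concept, with invented names and numbers:

```python
# Toy sketch: an entity's representation derived purely from its "context",
# i.e. the average of its neighbours' features (one step of message passing).
import numpy as np

features = {
    "alice": np.array([0.9, 0.1, 0.0]),
    "bob":   np.array([0.8, 0.2, 0.1]),
    "carol": np.array([0.1, 0.9, 0.7]),
}
friends = {"dave": ["alice", "bob"]}      # dave is defined only by his links

def context_embedding(person, graph, feats):
    """Represent a node by averaging the features of the nodes it is linked to."""
    neighbours = graph[person]
    return np.mean([feats[n] for n in neighbours], axis=0)

print(context_embedding("dave", friends, features))   # lands close to alice and bob
```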
“Scientifically we haven’t achieved that,” Debbah admits, “but things are ongoing on this question, and the main reason we’re studying it is the energy usage.”
Biological and digital intelligence operate quite differently, simply because one is digital. So should we be exploring ways to make an analogue AI?
“Yes, I think this is the trend we’re seeing today. The biggest trend is to look at how optical techniques work, which are within the field of analogue,” Debbah noted.
“The main problem we have there is that we can do some functions which are very precise, but if you want to change those functions you need to rebuild the whole thing. The digital arena is good because you have some kind of general-purpose processing, so you could reuse those GPUs.”
In other words, we are currently stuck with analogue computers that are very energy efficient at doing one thing, or digital computers which are very inefficient at doing many things.
Speak it quietly, but there is another form of computing being developed that goes beyond digital: quantum computing. Even though mainstream quantum computing is decades away, is it an area of research?
“Certainly this is one way to go,” Debbah admits. “And at TII we have a team working fully on quantum computing, but it’s more of a longer-term thing.”
Even here, some problems are becoming clear.
“Qubits are very specific to some functions; they’re not equal,” Debbah explained. “Quantum computing is very efficient in some specific tasks, the same as I was telling you about optical analogue today. But then if you want to do general stuff, no way – a classical computer will be much more efficient in solving it.”
AI for telecoms networks
For the telecoms industry, conversation about delivering more automated, more intelligent networks has been around for some time. However, it’s not straightforward.
“There are still many cases in telecom where we have a hard time improving the system because we don’t have a very good understanding of the end-to-end systems or end-to-end expectations,” Debbah explained. “Whenever you start improving one part of the network, it’s detrimental to another part.”
The word ‘expectations’ in the quote above is significant: telecoms providers are realising, in an increasingly granular way, that users’ quality of experience is not the same as the network’s quality-of-service metrics.
“The way you see something is very different from another person, and when you’re complaining to the operator it will be different from others. What matters for you is the stalling of a video, let’s say, while for the other person it’s voice quality. Everyone needs a very tailored kind of metric.”
This can open up an opportunity on two fronts: by understanding what people care about, an operator can improve their experience; at the same time, it can tailor the use of network resources so that, in constrained situations, people are affected only in ways they don’t care about.
“All the data in terms of what kind of experience you have – how you click on the system, how much you stop on a video when you go on YouTube – helps us a lot in providing the right scheduling mechanism,” Debbah summarised.
“I think this is going to be a big trend because we don’t know how to solve it outside of the AI realm. We need to know much more about the users’ experiences to be able to improve them.”
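One crude way to picture what a “very tailored kind of metric” could mean for scheduling is to weight each user’s share of constrained capacity by how sensitive their experience is to the impairment they actually notice. The toy below is an assumption-laden illustration, not an operator’s algorithm:

```python
# Toy QoE-aware allocation: split limited capacity in proportion to how much
# each user's experience (stalling, voice quality, ...) suffers without it.
# Weights, demands and capacity are invented for illustration.
capacity_mbps = 100

users = {
    "video_viewer": {"demand": 80, "qoe_sensitivity": 0.9},   # hates stalling
    "voice_caller": {"demand": 1,  "qoe_sensitivity": 1.0},   # hates drops and jitter
    "backup_job":   {"demand": 60, "qoe_sensitivity": 0.1},   # barely notices delay
}

total_weight = sum(u["qoe_sensitivity"] * u["demand"] for u in users.values())
for name, u in users.items():
    share = capacity_mbps * u["qoe_sensitivity"] * u["demand"] / total_weight
    allocation = min(share, u["demand"])      # never give more than requested
    print(f"{name}: {allocation:.1f} Mbps of {u['demand']} requested")
```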
This overview and enablement of end-to-end service quality is made possible by deploying artificial intelligence throughout the network. While this may create some unexpected emergent behaviour, Debbah pointed out that AI everywhere could also support a new paradigm in communications.
He likens this to processing chocolate. It’s not efficient to transport cacao to Switzerland for processing; the only reason you’d do that is if the expertise to process cacao into chocolate was not available where it’s grown. Much better, if possible, to transport the refined chocolate. Yet the internet, and much AI, works with data on very much a ‘cacao to Switzerland’ model.
“We send the data and we see what the cloud will do, and it will query whatever it wants on that data. So it brings about a lot more exchanges,” Debbah emphasised.
Instead, a provider in future should transport the AI algorithms to the data and process it there, only sharing the outputs demanded. This would mean that future networks would be involved in the transport not so much of data, but of AI itself. Deploying AI at the edge or on the end-point to process data there could reduce the network resources used and, by extension, reduce the energy required to transport the traffic.
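In code terms, the ‘refined chocolate’ model might look something like the sketch below: the raw data stays on the device, a local model runs over it, and only a small summary is transmitted. All names, thresholds and figures here are hypothetical:

```python
# Sketch of "moving the AI to the data": the raw data never leaves the device;
# only the small, refined output is transmitted. All names are hypothetical.
import json

def run_local_model(raw_samples):
    """Stand-in for an AI model executing on the edge device itself."""
    anomalies = [i for i, v in enumerate(raw_samples) if v > 0.95]
    return {"num_samples": len(raw_samples), "anomaly_indices": anomalies}

def send_to_network(payload: dict) -> int:
    """Pretend transmission; returns the number of bytes actually sent."""
    return len(json.dumps(payload).encode())

raw_samples = [0.2, 0.4, 0.6, 0.1] * 10_000 + [0.99]   # large local dataset, one outlier
summary = run_local_model(raw_samples)

sent = send_to_network(summary)          # tens of bytes, not hundreds of kilobytes
print(f"Sent {sent} bytes instead of ~{len(raw_samples) * 8} bytes of raw data")
```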
There is, Debbah highlighted, an important second consideration.
“Privacy and security issues in telecoms have made it so that you don’t want to transmit all the data any more. You want people to keep their data where it is and move only the ‘refined’ version, so people would not have access to your data this way.”
Running AI everywhere, though, brings us back to the focus on making it much less resource-intensive and capable of self-supervised learning. Today we are a long way from solving those challenges. It looks conceivable that specialised, analogue AI in network nodes could fulfil the specific tasks required for optimal network management; however, delivering on the promise of commercial ‘AI everywhere’ models appears to demand the flexibility of digital computing.
Debbah remains optimistic, though. “In many ways more optimistic than the telcos,” he enthuses. “They are in a process of optimisation, and there’s only so much you can optimise. We’re exploring breakthroughs, so there are currently no limits to how much that could change things.”