Typically, laws are meant to regulate or restrict. In contrast, Moore’s Law signalled a new era of technological accessibility. Now that Moore’s Law technically no longer applies, the rules of the game are changing too.
Moore’s Law No More?
Nick Harris, CEO of photonic computing company Lightmatter, recently talked to 6GWorld™. He relayed how speeds have become relatively stagnant, in large part due to the slowdown of Moore’s Law. The law, which was always more of a projection on the part of Intel co-founder Gordon Moore, states that the number of transistors on a chip will double roughly every two years, leading to improved processing.
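For a sense of how aggressive that projection is, here is a back-of-the-envelope sketch in Python. The 1971 baseline of 2,300 transistors (Intel’s 4004) is a commonly cited reference point; everything else is simply the stated doubling cadence compounded, not real product data.

```python
# Back-of-the-envelope illustration of Moore's Law as stated above:
# transistor counts doubling every two years from a hypothetical baseline.

def transistors(year: int, base_year: int = 1971, base_count: int = 2_300) -> float:
    """Projected transistor count if doubling every two years held exactly."""
    return base_count * 2 ** ((year - base_year) / 2)

for year in (1971, 1991, 2011):
    print(f"{year}: ~{transistors(year):,.0f} transistors")
# 1971: ~2,300
# 1991: ~2,355,200
# 2011: ~2,411,724,800
```

Twenty doublings over forty years is roughly a millionfold increase, which is why the end of that cadence is so consequential.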
“During our lifetimes up until I’d say 2010, computers were just getting better and cheaper, better and cheaper, over and over again,” he said. “If you have a [computer] from 2010, it’s going to be pretty similar in terms of speed and the reason for that is we’ve run into these technological barriers with scaling transistors.”
Harris said that, to keep pace with computing requirements, chipmakers have turned to using multiple chips to make calculations instead of just one. He cited Intel’s own Foveros chip-stacking technology as an example. However, even though progress is being made on that front, all the added complexity increases costs.
“The democratisation of technology that’s happened through transistors and transistor-based computing is coming to an end,” he concluded. “Going forward, the chips and the systems that they’ve built out of them are going to be so expensive average people won’t have access to the latest and the greatest like they do now.”
Harris broke down the challenge further. He said the demand for compute is growing at a faster rate than ever before, largely driven by new AI applications, for which Lightmatter’s photonic computing platform is purpose-built.
“Every time we shrink transistors (this is Moore’s Law) they’re supposed to get more energy-efficient, but they’re no longer doing so. At the same time you’ve got this really big drive for having more compute to run AI and high-performance computing applications,” he said. “If you want to run state-of-the-art algorithms, the only people that can afford to do that are the big companies who have massive servers.
“I think there’s a future where you’re going to be [buying] time on a cloud server to use the latest computer, because you personally could not afford to buy that computer chip or to build a system with these chips.”
Vipin Jain, CTO of Pensando Systems, which has developed a distributed services platform based on Domain-Specific Processors (DSPs), agreed with the assessment. In his opinion, there has been mounting evidence of compute capacity hitting its limits over the last three or four years in particular.
“I think the strategy was to go and expand it horizontally, plan out more cores and try to do distributed computing, in a way scale-out and/or parallel processing, but saturation is hitting that wall and people have already started working around it, for example not just the AI workloads, but other workloads as well,” he said. “Already, you see how clouds are getting built, and how cloud infrastructure is getting built. It starts to prove the limitations because the economies of scale work great there for special-purpose things.”
Democratisation vs. Consolidation
Jain spoke to 6GWorld about a piece he had written as a Forbes Technology Council member on DSPs, i.e. processors built for individual application areas such as automated vehicles or databases. He wrote that DSPs mitigate the ever-increasing need for compute, citing increased public cloud adoption as an example of how manufacturing efficiencies are improving. According to him, DSPs are also easier to program within their individual domains. With that in mind, greater accessibility through software has helped drive democratisation and could continue to do so.
“Software is very key [to the continued democratisation of technology],” he told 6GWorld. “The reason, for example, AI is successful is not because there’s a full piece of hardware, but it is easy to use. The consumption model really depends on how you start to put some face to the niceties of the engine that is powering these things. The engine is good to have, but what’s the user interface to that engine?”
“If I were to take a car as an example, you can have a fantastic engine, but what the user really sees is how easy it is to operate that engine. Once you make it easy to consume, it becomes super-easy for a large audience to adopt it almost instantly. That’s why I think software is really an enabler for things like that.”
Asked if he sees an alternative to the additional complexity to which Harris referred, Dean Bubley said it depends. An independent technology industry analyst, Bubley told 6GWorld that companies are responding by improving the efficiency of AI algorithms. He used image processing as an example.
“There are companies out there that are saying, ‘Can we take software which today has to run in a data centre in the cloud and scale it down so it works on an individual chip?’” he said. “Years ago you might have thought, to do things like facial recognition, you’d have to run that as a service on a server or a cloud platform, but these days you can actually run it on the silicon of the camera chip itself.
“So there’s a certain amount of people that will look for efficiently designed algorithms rather than the sort of brute-force approach that’s been done before. That doesn’t change the efficiency of the underlying hardware, but it improves the efficiency of how it’s used.”
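Bubley’s examples are generic rather than tied to a specific product, but the effect he describes is easy to demonstrate. Below is a toy sketch (the task and numbers are invented for illustration, not drawn from the interview): the same nearest-match problem solved by brute-force scanning and by a sort-plus-binary-search design. The algorithmic change, not the hardware, accounts for the speed-up.

```python
import bisect
import random
import time

# Toy illustration: for each query value, find its closest match
# in a reference set, first by brute force, then by a smarter design.
random.seed(0)
reference = [random.random() for _ in range(10_000)]
queries = [random.random() for _ in range(1_000)]

# Brute force: scan the whole reference set for every query -> O(n * m).
start = time.perf_counter()
brute = [min(reference, key=lambda r: abs(r - q)) for q in queries]
brute_time = time.perf_counter() - start

# Better-designed: sort once, then binary-search each query -> O((n + m) log n).
start = time.perf_counter()
ref_sorted = sorted(reference)
smart = []
for q in queries:
    i = bisect.bisect_left(ref_sorted, q)
    # The closest match is one of the neighbours of the insertion point.
    candidates = ref_sorted[max(i - 1, 0):i + 1]
    smart.append(min(candidates, key=lambda r: abs(r - q)))
smart_time = time.perf_counter() - start

assert brute == smart  # identical answers, very different cost
print(f"brute force: {brute_time:.3f}s, binary search: {smart_time:.3f}s")
```

The point mirrors Bubley’s: redesigning the algorithm changes how efficiently the hardware is used without changing the hardware at all.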
While there is undeniable consolidation in the chip industry, Bubley looked at it from a different angle. Only a select few companies, like Taiwan Semiconductor Manufacturing Company (TSMC), are physically making chips in factories, but he argued that opens up new opportunities in other layers.
“You’ve got a much larger ecosystem that is using design tools and creating its own products in terms of intellectual property, but then outsource the actual manufacture. So, you get democratisation of one layer. At the same time, you get consolidation at another and I think that’s replicated across the technology realm,” he said.
Heads in the Cloud
Bubley pointed to the cloud to illustrate. He argued that, even though hyperscalers put up barriers to entry in the industry, they enable other layers and forms of compute technology.
“Overall I feel like the wider concept of virtualised software or virtualised compute functions is expanding, even whilst the hyperscale data centre owners and cloud-computing platform owners are consolidating. So it’s a yin-yang really,” he said.
Jain asserted the cloud consumption model is a viable one. He agreed that it allows people to share the cost instead of shouldering it themselves. However, there are alternatives, some of which may not have been conceived yet.
“The demand for computing is not going to go away because processing technology is saturated. People will always look out for new frontiers of innovation,” he said. “In this day and age, people are very attracted to building software like crazy, which is good, but, at the same time, Apple, Amazon, you take these companies. They are super-successful, because they have been able to vertically integrate the stack and use and abuse the layers for measurable gains, and that’s very critical I think.”
“My take is that things will not be easy for sure, but, when boundaries are pushed, and you are in the corner, then that’s what you will have to go and innovate on. I think that’s just the starting point.”
Feature image courtesy of greenbutterfly (via Shutterstock).