MIT Future Compute: A round-up from the Emerging Technology engineers
- Posted on March 26, 2021
- Estimated reading time 3 minutes
MIT hosted their Future Compute event in February and the Emerging Technology engineers attended to understand some of the trends we’ll be seeing in the near future. The tagline was ‘How the Internet of Things, the speed of 5G, and the reasoning of Artificial Intelligence are combining to transform business.’
Here’s our take on MIT Future Compute, just in time for MIT EmTech Digital in March!
If you read nothing else – then know this: everything is converging. AI and machine learning infused every talk, from commentators and vendors alike – demonstrating to us that this is an inflection point, where great things will be created through combining, re-combining and experimenting with these technologies.
Cloud – life on the Edge, or consolidation in the Center
Cloud is very much here to stay – but what will cloud look like in the future? IBM CEO Arvind Krishna argued for the growth of hybrid cloud, an approach IBM is heavily invested in, but not the key trend we're observing right now.
These talks highlighted that we exist in a tech bubble, with cloud still misunderstood outside the bubble, leading to many opportunities for modernization.
DFINITY led with the excellent phrase, “The internet is a Rube Goldberg machine of outdated components.”
This highlighted our dependence on complex and poorly connected legacy components, emphasizing the need to modernize infrastructure and future-proof services.
This was linked to the rise of the ‘open internet’ and fog computing: the decentralization of compute. Moving forward, everyone will be thinking about how to make the most of existing internet-based infrastructure, particularly with the rise of the sovereign internet and the potential for a fragmented networked world.
5G – an advantage not a necessity
Another hot topic: connectivity advances allow for improved speed and increased use of remote compute. Interestingly, 5G was described as an advantage, not a necessity. It enables niche use cases, but the majority of remote IoT applications are still limited by the availability of and access to connectivity – whether 4G or 5G – rather than by connection speed. There is a balance between using what is in place today and the drive to use the latest and greatest.
Sustainability – from chip to datacenter
From chip power usage all the way to data centers and their environmental impact, sustainability was another pillar of the event, particularly where cloud compute is concerned. Energy efficiency has always mattered for mobile devices, if only for battery life, but it now matters for consumers and cloud providers alike.
Quantum – quality or quantity?
For us, thinking about future compute immediately leads to quantum. In (our macro-scale) reality, we don’t know enough to place a strong bet on timelines for availability, form factor, or access to useful quantum hardware. The race to quantum supremacy is still just that, a race. There were interesting opinions, but no industry consensus has yet coalesced. Honeywell highlighted the measure of ‘quantum volume,’ a gauge of the usability of a quantum system… and others argued that only million-qubit systems are useful, a reality that seems light years away from today’s double-digit-qubit systems.
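For the curious, quantum volume is commonly computed as 2ⁿ, where n is the largest ‘square’ circuit (n qubits, n layers deep) that a machine can run with acceptable fidelity. A minimal, purely illustrative sketch – the pass/fail data below is invented for the example, and real benchmarks involve statistical heavy-output tests:

```python
def quantum_volume(passed_sizes):
    """Illustrative quantum-volume calculation: 2**n for the largest
    square (n-qubit, n-layer) benchmark circuit size that passed.
    Real protocols require every size up to n to pass a fidelity test;
    this sketch just takes the largest passing size."""
    return 2 ** max(passed_sizes, default=0)

# A hypothetical device that passes square circuits up to n = 6:
print(quantum_volume([2, 3, 4, 5, 6]))  # 2**6 = 64
```

The exponential scaling is the point: each additional usable qubit-layer of depth doubles the reported quantum volume.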
Quantum computing was put into perspective for us with a focus on how it will transform whole systems of science. One such quote stated that “we don’t really understand anything about chemistry […] without a quantum computer, everything we know about chemistry today is simply trial and error.”
Whilst we may not be ready to deploy universal quantum computing solutions, we are certainly ready to optimize algorithms, and test infrastructure, with a Honeywell quantum computer now available through Azure.
Our advice? Identify your use cases, plan for quick change, and keep an eye on this field.
Classical compute – from gadgets to supercomputers
At the other end of the spectrum, we may be reaching the limits of traditional compute. Moore’s law is ending, as it becomes ever more challenging to cram additional transistors into the same nanoscale spaces. What replaces Moore’s law? More efficient calculations? Better CPU utilization? Specialized hardware for individual processes? Most likely, a combination of innovative approaches.
As more cloud infrastructure comes online every day, providers have started offering a supercomputer-as-a-service model for intensive calculations, renting out entire data centers by the hour. We expect to reach exascale this year: computers capable of 1 exaFLOPS, or 10^18 (a one followed by 18 zeros) floating-point operations per second. HPC will stay focused on very specific problems, like healthcare modelling or highly localized weather prediction.
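To put that number in perspective, here is a back-of-envelope comparison – the 100 GFLOPS laptop figure is an illustrative assumption, not a measured benchmark:

```python
# Back-of-envelope: how many laptops equal one exascale machine?
exaflops = 10**18           # 1 exaFLOPS = 10^18 floating-point ops/sec
laptop_flops = 100 * 10**9  # assumed laptop throughput: ~100 GFLOPS

ratio = exaflops // laptop_flops
print(f"One exascale machine ~= {ratio:,} such laptops")
```

Ten million laptops’ worth of sustained arithmetic is why these systems are rented by the hour for narrow, compute-bound problems rather than deployed broadly.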
How will this impact society?
Ultimately, much as we love technology – we live in a society, not a vacuum.
We don’t know where everything will land, but the impact of connectivity, and sovereign internet, led us to posit four scenarios:
- Collaborative Outputs – Knowledge workers leave the city, and work in a globalized highly connected world
- Global villages – Cities remain collaborative hubs for new ventures and technologies, with digital third spaces focusing on messaging
- Fragmented tribes – Travel becomes limited, and data fragments as corporations impose internal walls to accommodate data sovereignty, leading to competing standards for data sharing
- Urban foxes – Travel resurges, and corporations re-adapt to cross-border working, with connectivity focusing on the last mile, supporting the Intelligent Edge.
Find out more around the trends we’re seeing in Emerging Technology, particularly the convergence of all things, in our Trendlines report, “Experiences Without Boundaries: When devices don’t matter.”