Looking back on MIT Future Compute
- Posted on June 9, 2022
- Estimated reading time 6 minutes
MIT's Future Compute 2022 was held in May, and Avanade's Diana Wolfe and Fergus Kidd from the Emerging Technology team attended. Still largely virtual, the event this year also welcomed some attendees in person in Boston, who were able to interact more directly with the speakers and ask their questions as part of the discussions.
Didn't catch last year's blog?
GPT-3 summarizes 2021
The MIT Future Compute event 2021 was focused on how the Internet of Things, the speed of 5G, and the reasoning of Artificial Intelligence are combining to transform business. The event highlighted the importance of modernizing infrastructures and future-proofing services. 5G was described as an advantage, not a necessity, and quantum computing was put into perspective as a way to transform whole systems of science.
What stood out to us in 2022?
Arun Subramaniyan, Intel Corporation
"Thinking about this as a software-defined, hardware-enabled world rather than a hardware that requires software to run."
Still relevant on this year's agenda was the state of today's computing hardware. However, rather than continuing last year's battle between on-premises and public cloud, the panels focused on the ongoing silicon chip shortage. Notably, we've seen large investments toward bringing some chip manufacturing capability back on shore, but some interesting questions remain unanswered.
Jessica Kelly, NI
"The bulk of where the shortages are coming from is that loss of time from the crystal fab."
One such question relates to the growth rate of a silicon crystal, which is fixed by physics – does expanding crystal-growth capacity help to solve our present issues? Another question is how we can secure or produce the static-free packaging needed to transport the chips. In addition, consideration was given to the immense backlog of products waiting to have new chips retrofitted (this is especially true in the automotive industry).
Tomás Palacios, MIT
"We need new materials to address the amazing challenge we have in front of us."
New materials (germanium and silicon) have been critical to the development of technology to date, and we now look to compound semiconductor materials. The next generation of electronics will be about edge intelligence and future microsystems that show increasingly autonomous behaviours in energy, sensing, and compute. The new materials needed for these systems will open up the potential for novel applications.
Amy Webb, Future Today Institute
"Right now, the metaverse is a concept...on a long-term development trajectory...but this does not give us license to ignore the forces of change."
One term this year that cannot be avoided, whichever way you look, is Meta: the company, the metaverse, and the massive reaction to what it means for businesses. Interestingly, there isn't much new technology here, with many 'metaverse' experiences today being experiences we previously labelled as 'VR'. For compute, and future compute, however, there is a huge area of growth.
Current hardware and experiences are lacklustre compared to the metaverse of tomorrow we are being promised. Compute is seminal to delivering on that promise: it will need to create graphically rich environments that live up to the hype and expectation. These will require smaller, more powerful devices, as well as abstraction of worlds, code, and compute, and a transition away from our headsets and into the cloud. This echoes some of last year's themes around centralised and decentralised approaches, but with a slightly new twist.
It was really interesting to hear from Second Life and Roblox, both of which have arguably already created very credible and profitable, albeit centralised, experiences of their own, and hint at the future of the metaverse. Our favourite section was from Second Life founder Philip Rosedale, who is adamant that any metaverse that tracks our behaviour and tries to sell us things simply won't work in the long term, but is hopeful about a user-controlled approach, where your actions have real consequences for your digital identity.
Philip Rosedale, Linden Lab and Second Life
"The hope is we can build virtual worlds without causing harm to each other."
Denise Ruffner, Atom Computing
"These computers are still going to be governed by a set of rules and logic that are classical in nature, and so this will impose fundamental limits on the types of algorithms that can efficiently be run even with advancements in compute power...Quantum presents new rules."
Quantum IS exciting, and that's why we keep coming back to it. Our takeaway, though, is that it still isn't time. That's not unexpected – the timeline remains unclear – but realistically, is there more we can be doing today? Not really. We can continue to explore, learn, and prepare for the future, but we are still waiting for those crucial headlines of quantum supremacy.
There were discussions of what that may look like, with quantum supremacy (the point at which a quantum processor can solve a problem with some measurable advantage over a classical system) probably happening in the next five years, in some small niche or field of science that does not necessarily have a large impact on the whole of the computing infrastructure.
Jay Gambetta, IBM Quantum
"The only reason we should expect to find a quantum advantage is when the circuits that we run have some complexity. We believe they are hard to run on a classical computer and hard to simulate… What really matters is how fast you can really run something to get a time to solution."
It has however been interesting to watch and discuss the infrastructure growing around quantum, with more and more machines being made available in cloud environments, no matter how few qubits.
The technology most propelling businesses today is AI. Investment and infrastructure remain entry barriers to organizations that want to utilize AI/ML systems while consumers demand faster, cheaper, easier, and higher quality services. Toward that end, today's machine learning can benefit from operationalization. However, there are challenges such as (a) recruiting talent, (b) building models, and (c) improving those models with higher quality data.
Melisa Tokmak, Scale AI
"Many times, it is the data bottleneck that is keeping us (the companies and business problems) from getting the results that we want to get to."
When it comes to solving novel problems with AI, there is no one-size-fits-all solution. Models need to be agile across a diversity of environments with novel demands, and finding where you can quickly iterate on a model to improve accuracy will expedite development ROI. But what about big models like GPT-3: if large models become a widely accessible platform, how can they best be used? Leveraging these strides to reimagine steps in the processes we use every day will take human ingenuity, creativity, and an eye toward the future.
Johannes Gehrke, Microsoft Research Redmond
"We have to think 'how do we actually take AI and augment and make humans more productive.'"
Find out more around the trends we’re seeing in Emerging Technology, particularly the convergence of all things, in our Trendlines report: Navigating the Future with Trust.