March 13, 2023
by Ecole Polytechnique Federale de Lausanne
How does an iPhone predict the next word you're going to type in your messages? The technology behind this, which is also at the core of many AI applications, is called a transformer: a deep-learning algorithm that detects patterns in datasets.
Now, researchers at EPFL and KAIST have created a transformer for Metal-Organic Frameworks (MOFs), a class of porous crystalline materials. By combining organic linkers with metal nodes, chemists can synthesize millions of different materials with potential applications in energy storage and gas separation.
The "MOFTransformer" is designed to be the ChatGPT for researchers who study MOFs. Its architecture is based on the transformer, a design developed at Google Brain that can process natural language and forms the core of popular language models such as GPT-3, the predecessor to ChatGPT. The central idea behind these models is that they are pre-trained on a large amount of text, so when we start typing on an iPhone, for example, models like this "know" and autocomplete the most likely next word.
"We wanted to explore this idea for MOFs, but instead of giving a word suggestion, we wanted to have it suggest a property," says Professor Berend Smit, who led the EPFL side of the project. "We pre-trained the MOFTransformer with a million hypothetical MOFs to learn their essential characteristics, which we represented as a sentence. The model was then trained to complete these sentences to give the MOF's correct characteristics."
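The "MOF as a sentence" idea can be sketched in code. The snippet below is a toy illustration, not the actual MOFTransformer (which uses richer, multi-modal inputs): each MOF is written as a short sequence of hypothetical building-block tokens, and a small transformer encoder is set up to fill in a masked token, just as a phone keyboard completes a word.

```python
import torch
import torch.nn as nn

# Hypothetical vocabulary: a MOF "sentence" of building-block tokens
# (metal node, organic linker, topology). Token names are illustrative.
vocab = ["[MASK]", "Zn4O", "Cu2", "Zr6", "BDC", "BTC", "NDC", "pcu", "fcu", "tbo"]
tok2id = {t: i for i, t in enumerate(vocab)}

def encode(sentence):
    """Turn a MOF 'sentence' (list of tokens) into tensor ids."""
    return torch.tensor([tok2id[t] for t in sentence])

class ToyMOFTransformer(nn.Module):
    """A tiny transformer encoder that predicts masked tokens."""
    def __init__(self, vocab_size, dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(dim, vocab_size)  # logits over the vocabulary

    def forward(self, ids):
        h = self.encoder(self.embed(ids))
        return self.head(h)  # (batch, seq_len, vocab_size)

model = ToyMOFTransformer(len(vocab))
ids = encode(["Zn4O", "[MASK]", "pcu"]).unsqueeze(0)  # linker position masked
logits = model(ids)
print(logits.shape)  # one logit vector per token position
```

Pre-training would minimize the cross-entropy between the logits at the masked position and the true token over a large corpus of MOF sentences; after enough examples, the model internalizes which building blocks co-occur.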
The researchers then fine-tuned the MOFTransformer for tasks related to hydrogen storage, such as the storage capacity of hydrogen, its diffusion coefficient, and the band gap of the MOF (an "energy barrier" that determines how easily electrons can move through a material).
The approach showed that the MOFTransformer could get results using far less data than conventional machine-learning methods require. "Because of the pre-training, the MOFTransformer already knows many of the general properties of MOFs; and because of this knowledge, we need less data to train for another property," says Smit. Moreover, the same model could be used for all properties, while in conventional machine learning, a separate model must be developed for each application.
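The fine-tuning pattern Smit describes can be sketched as follows: keep the pre-trained encoder and bolt a small, freshly initialized head onto it for each new property. Everything here (dimensions, the stand-in encoder, the training loop) is illustrative, not the actual MOFTransformer API.

```python
import torch
import torch.nn as nn

dim = 32

# Stand-in for the pre-trained transformer encoder (its weights would
# come from the masked pre-training stage in practice).
pretrained_encoder = nn.Sequential(
    nn.Embedding(10, dim),
    nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True),
)

class PropertyPredictor(nn.Module):
    """Pre-trained encoder + a small regression head for one property
    (e.g. H2 storage capacity, diffusion coefficient, or band gap)."""
    def __init__(self, encoder):
        super().__init__()
        self.encoder = encoder            # shared, pre-trained weights
        self.head = nn.Linear(dim, 1)     # only this part is new per property

    def forward(self, ids):
        h = self.encoder(ids).mean(dim=1)  # pool over the MOF "sentence"
        return self.head(h).squeeze(-1)

model = PropertyPredictor(pretrained_encoder)
opt = torch.optim.Adam(model.parameters(), lr=1e-4)

# Fine-tune on a small labeled set: the encoder already "knows" MOFs,
# so far fewer labeled examples are needed than training from scratch.
ids = torch.randint(0, 10, (4, 3))        # 4 MOFs, 3 tokens each (toy data)
labels = torch.rand(4)                    # arbitrary property values
loss = nn.functional.mse_loss(model(ids), labels)
loss.backward()
opt.step()
```

Switching to another property means swapping in a new `head` while reusing the same encoder, which is why one pre-trained model can serve all downstream tasks.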
The MOFTransformer is a game-changer for the study of MOFs, providing faster results with less data and a more comprehensive understanding of the material. The researchers hope that the MOFTransformer will pave the way for the development of new MOFs with improved properties for hydrogen storage and other applications.
The findings are published in the journal Nature Machine Intelligence.
More information: Jihan Kim, A multi-modal pre-training transformer for universal transfer learning in metal–organic frameworks, Nature Machine Intelligence (2023). DOI: 10.1038/s42256-023-00628-2. www.nature.com/articles/s42256-023-00628-2
Journal information: Nature Machine Intelligence
Provided by Ecole Polytechnique Federale de Lausanne