Scaling Deep Learning for Material Discovery with Dr. Ekin Dogus Cubuk
Materials science data and computational resources are growing rapidly. While machine learning offers a promising set of tools for learning models from large volumes of in-distribution (IID) data, these tools are not inherently good at generalizing to new, out-of-distribution (OOD) data. For this reason, the impact of deep learning on the computational discovery of stable inorganic materials has been limited.
But, according to Dr. Ekin Dogus Cubuk, Staff Research Scientist at Google Brain, there are two observations from deep learning that encourage optimism:
- Scaling up neural networks with more data and compute can monotonically improve their IID generalization, and
- While OOD performance is almost always worse than IID performance, better IID performance correlates with better OOD performance.
In a recent publication in Nature, Dr. Cubuk and his colleagues explored these two directions to investigate whether stable material discovery can be made more efficient via deep learning. In Aionics Fortnightly Episode 34, Dr. Cubuk will discuss this work — he will introduce their approach for scaling up DFT calculations and graph neural networks, which enabled the discovery of a large number of inorganic crystals that are stable relative to both the Materials Project and the Open Quantum Materials Database. All in all, this work increased the number of known, promising material candidates for certain applications by more than an order of magnitude.
Subscribe to our newsletter and stay updated with the latest in material informatics.
Ekin Dogus Cubuk
Staff Research Scientist, Google
Dr. Cubuk received his PhD from Harvard University, where he used simulations and machine learning to study disordered solids and battery materials. He is currently a researcher at Google Brain, where he works on understanding and improving the generalization of large neural networks, with applications to the physical sciences and vision systems.