AI Mixes Concrete, Designs Molecules, and Thinks with Space Lasers

Welcome to Perceptron, TechCrunch’s weekly roundup of AI news and research from around the world. Machine learning is becoming a critical technology in almost every industry, and there is far too much happening for anyone to keep up with it all. This column collects and explains some of the most noteworthy recent findings and papers in artificial intelligence. Previously known as Deep Science; you can find earlier editions here.

This week’s roundup begins with a pair of forward-looking research projects from Facebook/Meta. The first, a collaboration with the University of Illinois at Urbana-Champaign, aims to reduce the emissions from concrete production. Concrete accounts for around 8% of carbon emissions, so even a minor improvement could help us meet our climate targets. The Meta/UIUC team trained its model on over a thousand concrete formulations, which varied in their proportions of sand, slag, powdered glass, and other ingredients (you can see a sample chunk of more photogenic concrete up top).
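If you’re curious what that kind of model might look like, here’s a minimal sketch in Python: a multi-output regressor trained on tabular formulation data to predict both compressive strength and embodied carbon. The dataset file, column names, and model choice are illustrative assumptions, not details from the Meta/UIUC work.

```python
# Hypothetical sketch: predicting strength and emissions from a concrete mix.
# The CSV schema, column names, and targets are assumptions for illustration.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

df = pd.read_csv("concrete_formulations.csv")  # hypothetical dataset
features = ["cement", "sand", "slag", "powdered_glass", "fly_ash", "water"]
targets = ["compressive_strength_mpa", "embodied_co2_kg"]

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df[targets], test_size=0.2, random_state=0
)

# A single forest maps each mix to (strength, emissions);
# RandomForestRegressor natively supports multi-output regression.
model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(X_train, y_train)

print("R^2 on held-out mixes:", model.score(X_test, y_test))
```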

After discovering the subtle tendencies in this dataset, the model was able to generate a number of new formulas optimized for both strength and low emissions. The winning formula produced 40% lower emissions than the regional benchmark while meeting… well, some of the strength criteria. It’s extremely promising, and field tests should get the ball rolling soon.
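One simple way to turn a predictor like the one sketched above into a formula generator is to screen candidate mixes and keep those the model scores as strong but low-carbon. This is again a hedged illustration that reuses the hypothetical `model` from the previous sketch; the actual research presumably used something more principled than random search, and the 35 MPa strength floor is an assumed figure.

```python
# Screening sketch: reuses `model` and the feature columns from the block above.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Sample hypothetical candidate mixes within plausible ranges (kg/m^3).
n = 10_000
candidates = pd.DataFrame({
    "cement": rng.uniform(150, 450, n),
    "sand": rng.uniform(500, 900, n),
    "slag": rng.uniform(0, 250, n),
    "powdered_glass": rng.uniform(0, 150, n),
    "fly_ash": rng.uniform(0, 200, n),
    "water": rng.uniform(120, 220, n),
})

pred = model.predict(candidates)  # columns: strength, embodied CO2
strength, co2 = pred[:, 0], pred[:, 1]

# Keep mixes predicted to clear the strength floor, then rank by emissions.
ok = strength >= 35.0  # assumed 35 MPa requirement
best = candidates[ok].assign(co2=co2[ok]).nsmallest(5, "co2")
print(best)
```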

The second Meta project concerns how language models operate. The company plans to work with neuroimaging specialists and other researchers to compare how language models and actual brain activity behave during similar tasks.

They’re particularly interested in the human ability to anticipate words far ahead of the current one while speaking or listening – for example, knowing how a sentence will end, or that a “but” is on the way. Even as AI models improve, they still mostly work by adding words one by one, like Lego bricks, occasionally looking backward to check that the result makes sense. The work has only just begun, but it has already produced some intriguing findings.
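To make the “one word at a time” point concrete, here is a minimal greedy-decoding loop in Python using Hugging Face’s transformers library and GPT-2. It is a generic illustration of autoregressive generation, not Meta’s research code.

```python
# Minimal greedy autoregressive decoding with GPT-2 (illustrative only).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

ids = tokenizer("The weather was nice, but", return_tensors="pt").input_ids
with torch.no_grad():
    for _ in range(20):
        logits = model(ids).logits          # scores for every position so far
        next_id = logits[0, -1].argmax()    # pick the single most likely next token
        ids = torch.cat([ids, next_id.view(1, 1)], dim=1)  # append it and repeat

print(tokenizer.decode(ids[0]))
```

Each pass through the loop adds exactly one token, conditioned only on what has come before – the brick-by-brick behavior described above.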

Back on the materials front, researchers at Oak Ridge National Laboratory are joining the AI formulation party. Using a dataset of quantum chemistry computations, whatever those may be, the team trained a neural network to predict a material’s characteristics – and then flipped it, so that they could input desired attributes and have it propose materials that fit. “Rather than predicting the properties of a given material, we wanted to choose the ideal properties for our purpose and work backward to design for them quickly and efficiently with a high degree of confidence. This is known as inverse design,” explained Victor Fung of ORNL. It appears to have worked, and you can check for yourself by running the code, which is available on GitHub.
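A common way to implement inverse design with a trained property predictor is to freeze the network and run gradient descent on the inputs instead of the weights. The PyTorch sketch below is a generic illustration of that idea under stated assumptions – the descriptor size, target values, and stand-in network are all hypothetical, and this is not the ORNL code.

```python
# Generic inverse-design sketch: freeze a trained property predictor and
# optimize a candidate descriptor vector until it hits target properties.
import torch
import torch.nn as nn

# Stand-in forward model: material descriptors (16 dims) -> properties (3 dims).
# In practice this would be the trained network; here it is randomly initialized.
predictor = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 3))
for p in predictor.parameters():
    p.requires_grad_(False)  # weights stay fixed; only the input is optimized

target = torch.tensor([1.2, 0.4, -0.7])   # hypothetical desired property values
x = torch.randn(16, requires_grad=True)   # candidate material descriptor
opt = torch.optim.Adam([x], lr=0.05)

for step in range(500):
    opt.zero_grad()
    loss = ((predictor(x) - target) ** 2).mean()  # distance from target properties
    loss.backward()                               # gradients flow to x, not to weights
    opt.step()

print("final property error:", loss.item())
print("proposed descriptor:", x.detach())
```

The catch, in any real system, is that the optimized descriptor still has to be decoded back into an actual, synthesizable material – which is where most of the hard work lives.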