Keith T. Butler

machine_learning

  • December 01, 2021

    Active Learning on Graph Nets

    Graph networks have become very popular for representing and modelling materials and molecular structures. Like most deep learning approaches, GNNs tend to require rather large training sets (typically a few thousand examples at minimum to outperform classical ML approaches); another issue is the lack of confidence intervals on their predictions. In this work we set out a method to address both of these problems.
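
    The post has the details of the method; purely as a flavour of how uncertainty estimates can drive an active learning loop, here is a minimal sketch. It is a hedged stand-in, not the post's code: tabular features and a bootstrap random-forest ensemble stand in for the graph network, and the data, pool sizes and query budget are all made up for illustration.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)

    # Toy features and target, standing in for graph-derived descriptors
    # and a material property.
    X_pool = rng.normal(size=(2000, 16))
    y_pool = X_pool[:, 0] ** 2 + 0.1 * rng.normal(size=2000)

    # Start with a small labelled set; the rest is the unlabelled pool.
    labelled = list(range(20))
    unlabelled = list(range(20, 2000))

    for step in range(5):
        # The spread of the ensemble members' predictions acts as a cheap
        # confidence interval on each unlabelled point.
        model = RandomForestRegressor(n_estimators=50, random_state=0)
        model.fit(X_pool[labelled], y_pool[labelled])
        per_tree = np.stack([t.predict(X_pool[unlabelled])
                             for t in model.estimators_])
        std = per_tree.std(axis=0)

        # Query the ten points the ensemble is least sure about
        # ("labelling" here just reveals y_pool values we already hold).
        query = set(np.argsort(std)[-10:])
        labelled += [unlabelled[i] for i in query]
        unlabelled = [i for j, i in enumerate(unlabelled) if j not in query]
        print(f"step {step}: mean pool std = {std.mean():.3f}")
    ```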

  • December 01, 2020

    MRS 2020 #2: Predicting materials synthesis

    Being able to predict the properties of a hypothetical material is one thing; being able to predict whether that material can ever be realised is quite another. I can recall hours spent swapping elements and playing what-if games in the DFT universe. You could discover all kinds of great new electronic structures, but show them to an experimental chemist and they would laugh you out of the room. This highlights two problems. One (probably the main one) is that simply being able to optimise a geometry is no guarantee that a structure is actually plausible; the other is that experimental chemists tend to be very focussed on the space of compounds and molecules that they know, so it can be hard to discover truly novel systems. Some kind of AI that can reliably explore chemical space in an unbiased manner should be able to overcome this, right? The problem is: how do we measure synthesisability to train such a system?

  • November 29, 2020

    MRS 2020 #1: Active learning and automated materials science

    Automation of materials discovery and robotic labs has had a big year; for a nice overview there is an article in Nature covering many recent breakthroughs. This popularity was reflected in a lot of exciting talks at MRS on the topic of automated labs. I’ll very quickly cover background on one of the most important enabling methods for automated labs, active learning, and then give my pick of three talks that really caught my attention. (I obviously missed a bunch of really interesting talks; this is by no means exhaustive.)

  • January 12, 2020

    Materials Hipster #8: Olexandr Isayev

    Olexandr (Oles) recently moved to Carnegie Mellon University, Pittsburgh. Oles is the guy I blame for getting me into machine learning: we met when we were both teaching at a summer school in Xi'an, and over street-side clams and beer Oles preached the potential of data-driven approaches to materials science and converted me. It's fair to say he has been one of the pioneers in this field, with seminal works on representations of crystal structures, generative models and active learning, to name a few. I caught up with Oles when he was visiting the UK last summer and we had a great chat over Xi'an noodles and cocktails in London!

  • November 17, 2018

    Materials Hipster #5: Martijn Zwijnenburg

    Martijn talks about how AI and co-polymers can change chemistry and has a melancholic lament about the decline of Phil Mag.

  • March 08, 2018

    Example of overfitting and underfitting in machine learning

    A Python-based example demonstrating a model that overfits or underfits depending on its complexity.
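
    The post walks through this with plots; the gist fits in a few lines. This is a hedged sketch rather than the post's exact code: polynomial degree is the complexity knob, and a noisy sine curve is the synthetic data.

    ```python
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import PolynomialFeatures
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import mean_squared_error

    rng = np.random.default_rng(0)

    # Noisy training samples of a smooth function, plus a clean test grid.
    X = np.sort(rng.uniform(0, 1, 30)).reshape(-1, 1)
    y = np.sin(2 * np.pi * X).ravel() + 0.2 * rng.normal(size=30)
    X_test = np.linspace(0, 1, 100).reshape(-1, 1)
    y_test = np.sin(2 * np.pi * X_test).ravel()

    # Degree 1 underfits (high train and test error), degree 4 is about
    # right, degree 15 overfits (low train error, high test error).
    for degree in (1, 4, 15):
        model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
        model.fit(X, y)
        train_err = mean_squared_error(y, model.predict(X))
        test_err = mean_squared_error(y_test, model.predict(X_test))
        print(f"degree {degree:2d}: train MSE {train_err:.3f}, "
              f"test MSE {test_err:.3f}")
    ```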