Bio-Automation and Robotics at Edinburgh  

The University has an active community of researchers interested in all aspects of automation and robotics. It has developed a roadmap for growing the deployment of these technologies and is working in close partnership with industry.

The vision for Edinburgh is to embrace the democratization of low-cost automation tools, enhanced by artificial intelligence algorithms and data-sharing protocols, which will revolutionize both how industry develops new bioproducts and how biological research is done in academia.

New ways of working

We are developing new ways of working, deploying robotics and automation in our labs. This improves the reproducibility of experiments, reduces cost, increases speed and saves valuable time. Researchers can then focus on creative design and analysis, rather than ‘doing.’

We are promoting the shift from manual experiments that vary one variable at a time towards automated, high-throughput experiments that optimize several variables at once.
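As a hypothetical sketch of what "several variables in a single experiment" can mean in practice (the factor names and values here are purely illustrative), a full-factorial layout screens every combination of conditions in one plate run instead of one variable per run:

```python
from itertools import product

# Illustrative factors for a hypothetical growth-medium screen
temperatures_c = [30, 34, 37]
inducer_um = [0.0, 0.1, 1.0]
glucose_gl = [2, 5]

# One-variable-at-a-time would need a separate run per factor;
# a full-factorial layout covers every combination in a single plate run.
conditions = list(product(temperatures_c, inducer_um, glucose_gl))

# Map each condition to a well on a 96-well plate (A1, A2, ...)
rows = "ABCDEFGH"
layout = {
    f"{rows[i // 12]}{i % 12 + 1}": cond
    for i, cond in enumerate(conditions)
}

print(len(conditions))   # 18 conditions from 3 * 3 * 2 factors
print(layout["A1"])      # (30, 0.0, 2)
```

An automated liquid handler can then pipette all 18 conditions in one pass, which is what makes multi-variable optimization affordable.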

We are embracing the capacity of automation tools to generate large, high-quality datasets and move towards data-driven research, taking advantage of machine learning algorithms to uncover new insights even where no correlations were initially evident.
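A minimal sketch of this data-driven mindset, using invented plate-reader numbers: rather than testing one hypothesized relationship, scan every pair of measured variables for strong, possibly unexpected, correlations.

```python
import math
from itertools import combinations

def pearson(xs, ys):
    """Pearson correlation coefficient for two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical dataset: several variables measured across six automated runs
data = {
    "temperature":  [30, 32, 34, 36, 38, 40],
    "od600":        [0.9, 1.0, 1.2, 1.1, 0.8, 0.6],
    "fluorescence": [18, 20, 24, 22, 16, 12],  # tracks od600, not temperature
}

# Scan every variable pair and flag the strong correlations
for a, b in combinations(data, 2):
    r = pearson(data[a], data[b])
    if abs(r) > 0.9:
        print(f"{a} ~ {b}: r = {r:.2f}")   # od600 ~ fluorescence: r = 1.00
```

With the larger datasets that automation produces, the same scan (or a proper machine learning model) can surface relationships a researcher never thought to test.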

We are incorporating artificial intelligence algorithms into our automated platforms, delegating the generation of internal hypotheses within the design cycle to the algorithm while freeing the researcher to focus on the bigger picture and generate higher-level hypotheses.
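The idea of a closed design cycle can be sketched in a few lines. This toy example is entirely illustrative (the "experiment" is a simulated response with noise, and real platforms would typically use something like Bayesian optimization rather than greedy search), but it shows the algorithm, not the researcher, proposing each next condition:

```python
import random

random.seed(0)

def run_experiment(inducer_um):
    """Stand-in for an automated wet-lab run: a hidden noisy response.
    (Illustrative only; the simulated optimum is at 5.0 uM.)"""
    return -((inducer_um - 5.0) ** 2) + random.gauss(0, 0.1)

# A minimal 'learn' step: greedy local search proposes the next condition
candidate = 1.0
step = 1.0
best_yield = run_experiment(candidate)
for _ in range(20):
    proposal = candidate + step
    y = run_experiment(proposal)
    if y > best_yield:            # keep moving in the improving direction
        candidate, best_yield = proposal, y
    else:                         # otherwise turn around with a smaller step
        step = -step / 2
print(f"suggested inducer concentration: {candidate:.2f} uM")
```

Each pass through the loop is one turn of the design cycle: the algorithm forms its internal hypothesis ("yield improves in this direction"), the robot tests it, and the result feeds the next proposal.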

Automation at Edinburgh

Edinburgh has several labs that use automation routinely in their processes:

  • The Edinburgh Genome Foundry – an automated platform that delivers high-throughput DNA assembly at scale
  • Edinburgh Genomics – Hamilton automated platforms for next-generation sequencing
  • The LeoRios Lab – low-cost open-source automation, Opentrons and digital microfluidics (Digi.Bio)
  • The Menolascina Lab – microfluidics combined with machine learning algorithms
  • The Edinburgh Centre for Robotics – a national centre of excellence for robotics and autonomous systems (https://www.edinburgh-robotics.org/about-us)
  • Dr Neil Carragher (Cancer Research UK Edinburgh Centre) – automated cancer drug discovery and screening platforms
  • Dr Eoghan O’Duibhir (Centre for Regenerative Medicine) – high-throughput cell screening
  • Prof Alistair Elfick (School of Engineering) – the Edwin platform: a Beckman automated platform for dynamic promoter characterization
  • Dr Maïwenn Kersaudy-Kerhoas (Edinburgh Medical School) – microfluidic tools for high-throughput biomarker detection
  • Prof Mark Bradley (School of Chemistry) – automated high-throughput chemical biology
  • Dr Guido Sanguinetti (School of Informatics) – machine learning algorithms to model high-throughput biological data
  • Dr Grant Mair (Centre for Clinical Brain Sciences) – automated tools for precision medicine
  • Dr Adam Stokes (School of Engineering) – soft robotic systems
  • Subramanian Ramamoorthy (School of Informatics, Edinburgh Centre for Robotics) – robot learning and decision-making

New ways of thinking

In the future, automation and robotics will become an ever-greater component of everyday research, and indeed of everyday life.

However, to derive the greatest value from automation, we often need to rethink why and how we traditionally do things so that we can transfer them to more automated systems. Training and education in the underpinning platforms – programming (scripting) of machines, developing and adapting protocols – is therefore essential.

This will involve:

  • Training in automation and coding skills, to perform experiments in a high-throughput way that optimizes several variables in a single experiment.
  • Training in skills relevant to the analysis of large datasets (e.g. machine learning), to uncover new insights even where no correlations were initially evident.
  • Training in designing automated experiments that carefully consider the different effects influencing the outputs.
  • Training in artificial intelligence, to delegate the generation of internal hypotheses in the design cycle to the AI algorithm while focusing on higher-level hypotheses.
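To give a flavour of what "programming (scripting) of machines" and "developing and adapting protocols" look like, here is a generic sketch of a protocol expressed as code. The classes and wells are illustrative, not any particular vendor's API (real platforms such as Opentrons expose their own Python interfaces), but the principle is the same: a bench procedure becomes a script that runs identically every time and is adapted by editing a line.

```python
from dataclasses import dataclass

@dataclass
class Transfer:
    """One liquid-handling step: move volume_ul from source to dest."""
    volume_ul: float
    source: str
    dest: str

def serial_dilution(stock_well, diluent_well, wells, volume_ul):
    """Build the transfer steps for a serial dilution along a plate row."""
    # First fill every target well with diluent...
    steps = [Transfer(volume_ul, diluent_well, w) for w in wells]
    # ...then carry sample from well to well, halving the concentration
    previous = stock_well
    for w in wells:
        steps.append(Transfer(volume_ul, previous, w))
        previous = w
    return steps

# Adapting the protocol is now a one-line change (more wells, new volume)
steps = serial_dilution("A1", "A12", ["B1", "B2", "B3", "B4"], 50)
print(len(steps))   # 8 transfer steps
```

Because the robot replays the same script on every run, the reproducibility gains described above come essentially for free.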