Update on Lightstage Project

In this post, I've taken the liberty of writing down some of my thoughts and reflections on why Lightstages are ("pretty cool in my book" and also) relevant amongst today's cutting-edge developments in machine learning and data-driven decision making.

Over the last few years, I've had the opportunity to work as a researcher on the Aber Lightstage project under Dr. Hannah Dee. During that time, I wrote a Python/OpenGL application to help us visualise and numerically evaluate lighting positions on our stage; the project is open source and on GitHub. Dr. Dee had successfully raised funding to bring together a team of engineers, researchers and advisors, each offering their specialist skills and knowledge to the project, and I got the chance to get involved.
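To give a flavour of what "numerically evaluating lighting positions" can involve, here is a minimal sketch, and only a sketch: it is not the project's actual code, and the function and variable names are my own illustrative assumptions. It spreads candidate lamps over a hemisphere with a Fibonacci lattice and scores how evenly they illuminate a central target.

```python
import numpy as np

def hemisphere_positions(n, radius=1.0):
    """Spread n candidate lamp positions over a hemisphere using a
    Fibonacci lattice, a common way to get near-even spherical coverage."""
    i = np.arange(n)
    golden = np.pi * (3.0 - np.sqrt(5.0))   # golden angle in radians
    z = i / max(n - 1, 1)                   # heights from 0 (rim) to 1 (pole)
    r = np.sqrt(1.0 - z**2)                 # ring radius at each height
    theta = golden * i
    return radius * np.stack([r * np.cos(theta), r * np.sin(theta), z], axis=1)

def uniformity_score(lamps, target=np.zeros(3)):
    """Score a lamp layout by the spread of incident light directions at
    the target: lower variance of pairwise cosines = more even coverage."""
    dirs = lamps - target
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    cosines = dirs @ dirs.T                 # pairwise angular similarity
    off_diag = cosines[~np.eye(len(lamps), dtype=bool)]
    return off_diag.var()                   # smaller = more uniform layout
```

In practice the real tool's criteria and stage geometry will differ; the point is simply that candidate lamp layouts can be generated and compared via a scalar score.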

What is a Lightstage and why use it?

There’s some history of Lightstage work at Aberystwyth University and other collaborating universities. Two influential British Computer Scientists Dr. Hannah Dee and Dr. Bernie Tiddeman worked together with Dr. Alassane Seck to bring a Lightstage data capture platform to Aber. Past collaborations with Dr. Abhishek Dutta, while at the University of York, enabled new datasets to be collected for facial skin condition recognition.

The Lightstage platform concept originated across the Atlantic at the University of Southern California (USC), as an evolution of theatrical lighting rigs into a tool for computer vision and computer graphics. Paul Debevec was instrumental in bringing this research technology to Hollywood audiences across the globe: his and his team's work contributed cutting-edge computer graphics, special effects and photorealistic 3D model integration to blockbusters like Spider-Man, X-Men and the latest Star Wars. Later, Debevec and USC brought a portable Lightstage platform to the White House to capture and stitch a 3D model of Barack Obama during his final term, using front-facing high-specification cameras.

The Aber Lightstage project fixes its gaze not on political figures (yet) but on solving applied computer vision problems, in areas such as early medical diagnosis and morphological identification to support the development of resilient crops. With this data capture and 3D modelling research technology, we're able to open up a host of new approaches to interdisciplinary problem domains and use cases.

One of the Aber Lightstage project's aims is a refreshed approach to improving crop resilience under the growing weather volatility that is forecast to continue alongside climate change. The UK National Phenomics Centre, a part-funder of the project, uses an advanced robotic greenhouse, in effect a conveyor process line, to manage, systematise and phenotypically measure crop growth stages and cycles. The Aber Lightstage's structure is well placed to fit into such a conveyor line and expand research into the automation of phenotypic measures.

Another aim of the Aber Lightstage project is improving the recognition and classification of facial skin conditions such as skin lesions, enlarged pores, wrinkles and acne. Early prognosis and diagnosis are key factors in reducing treatment time and public healthcare costs, among other future commercial benefits.

Both domains build on work and articles published by Aberystwyth University computer science researchers together with interdisciplinary teams. The medical imaging work furthers publications at the British Machine Vision Conference (BMVC) by Hannah Dee, Bernie Tiddeman and Alassane Seck, using data from an earlier Lightstage built by Debevec's team (2007). This was followed up with data collected at Aber from a prototype hemispherical Lightstage (capture device) of roughly 50cm diameter.

The project also furthers a series of interdisciplinary works at Aberystwyth on the discovery of phenotypes from crop plant genotypes. These include 3D crop modelling from laser scanning and rotational multi-viewpoint image correspondence of Arabidopsis by Lou, Lu and Doonan; growth modelling of Arabidopsis from top-down 2D images by Wang, Dee and Doonan; Arabidopsis growth stage dataset capture with leaf and region labelling by Bell and Dee; and oat panicle segmentation and modelling from 2D images across the crop's growth stages by Boyle, Corke and Howarth. Other recent Lightstage development work includes that of Kampouris and Ghosh at Imperial College London. Each adds to the promise that Lightstage research platforms have the structural characteristics to harmonise with problem domains requiring highly articulated photorealistic texture capture and precise 3D geometric reconstruction of targets.

The assumption is that the underlying barrier in both problem domains can be overcome by modelling the feature complexity, which in practice means improving data quality. The indicative features in both domains tend to be masked by other features: individuals have personal visual differences, there is (phenotypic) variation among individuals and species, and there is natural change over an individual's age (or growth stages). The governing idea behind the Aber Lightstage project is that success can be achieved in both problem domains if sufficiently distinguishable data is captured. This approach falls in line with DARPA's 2018-19 funding push on modelling intelligent decision making (machine learning and artificial intelligence) with less, but information-dense, data (according to an MIT Technology Review article from March 2019). Such modelling has been attempted with alternative methods of capture, such as active stereo, laser scanning, photometric stereo and multi-viewpoint capture (without precisely equalised stage lighting). There's a strong possibility that the structured and systematised capture a Lightstage platform offers will enable the precision data capture needed to address these problems.
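To make the lighting-precision point concrete, here is a minimal sketch of classic Lambertian photometric stereo, one of the capture methods mentioned above and a technique that calibrated Lightstage lighting directly supports: with three or more images under known light directions, per-pixel surface normals fall out of a least-squares solve. The function and array shapes are illustrative assumptions, not Aber project code.

```python
import numpy as np

def photometric_stereo(images, light_dirs):
    """Recover per-pixel surface normals and albedo from images of a
    static scene under known light directions (Lambertian assumption).

    images:     (k, h, w) array of k grayscale images, k >= 3
    light_dirs: (k, 3) array of unit light-direction vectors,
                non-coplanar for a well-conditioned solve
    """
    k, h, w = images.shape
    I = images.reshape(k, -1)               # (k, h*w) stacked intensities
    # Lambertian model: I = L @ (albedo * normal); least squares per pixel
    G, *_ = np.linalg.lstsq(light_dirs, I, rcond=None)   # (3, h*w)
    albedo = np.linalg.norm(G, axis=0)                   # (h*w,)
    normals = G / np.maximum(albedo, 1e-8)               # unit normals
    return normals.reshape(3, h, w), albedo.reshape(h, w)
```

The accuracy of the recovered normals rests on how precisely the light directions and intensities are known, which is exactly the kind of equalised, repeatable lighting a Lightstage is built to provide.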

What’s coming?

I’m excited about this project and looking forward to see what we’ll be able to release prior to article publication. For now, we’ll have to wait and see. I can say that data collection is certainly underway and you can find Dr. Hannah Dee’s site for more updates. Until the next release or tweet(!), you can find a set of Aber Lightstage presentation slides (Nov 2018) that I wrote focused on some of the project’s light positioning work.
