The way we access information in the virtual space is changing. Discovery and exploration are no longer constrained by a keyword entered into a blank search bar. Instead, museums, libraries, archives, and galleries worldwide are embracing a shift to 'generous interfaces' – presenting their collections online as browsable, linkable networks of information that let users explore and discover new ideas through meaningful, contextualised relationships.
A key component in this emerging virtual browsing landscape is 'visual search', an AI-based method that matches similar images by their visual characteristics (colour, pattern, shape) rather than by a keyword description. The Deep Discoveries project aims to create a computer vision search platform that can identify and match images across digitised collections on a national scale. The research will focus on botanically themed content, allowing us to test how far we can stretch the recognition capabilities of the technology (can it recognise a rose in a textile pattern and the same flower in a herbarium specimen? How about on a ceramic vase?).
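To make the idea of matching by visual characteristics concrete, here is a minimal, illustrative sketch in Python. It represents each image by a simple per-channel colour histogram and ranks a small collection by cosine similarity to a query image. This is a deliberate simplification: a production system like the one Deep Discoveries describes would use learned deep features rather than raw colour histograms, and the image names and data below are synthetic placeholders, not project data.

```python
import numpy as np

def colour_histogram(image, bins=8):
    """Describe an RGB image as a normalised, concatenated per-channel histogram."""
    hist = np.concatenate([
        np.histogram(image[..., channel], bins=bins, range=(0, 256))[0]
        for channel in range(3)
    ]).astype(float)
    return hist / hist.sum()

def cosine_similarity(a, b):
    """Cosine of the angle between two feature vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Synthetic 'collection': tiny random RGB images with a dominant colour channel.
rng = np.random.default_rng(0)
reddish = rng.integers(0, 256, (32, 32, 3))
reddish[..., 0] = 220          # force a strong red component
greenish = rng.integers(0, 256, (32, 32, 3))
greenish[..., 1] = 220         # force a strong green component
query = rng.integers(0, 256, (32, 32, 3))
query[..., 0] = 210            # query is also red-dominated

collection = {"reddish": reddish, "greenish": greenish}
query_vec = colour_histogram(query)
scores = {name: cosine_similarity(query_vec, colour_histogram(img))
          for name, img in collection.items()}
best_match = max(scores, key=scores.get)   # the red-dominated image should win
```

In practice, swapping the histogram for an embedding from a convolutional network is what lets visual search generalise across media – recognising the same flower in a textile pattern, a herbarium sheet, or a ceramic glaze – but the retrieval step (compare feature vectors, rank by similarity) stays the same.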
Deep Discoveries is a collaboration between The National Archives, the University of Surrey, the Royal Botanic Garden Edinburgh and the V&A. We will start with ~30,000 botanical images from these institutions and our project partners Gainsborough Weaving Company, the Sanderson Design Archive, and the Museum of Domestic Design and Architecture. This wide range of partners will also allow us to explore the criteria necessary for linking our nation's image collections, and to survey the searching needs of diverse audiences.