Day 4 consists of only one session, but it looks to be a strong one. I’ve heard Fabio Piano speak before, and he gives a good talk.
- Developmental phenotypes in the worm. Early embryonic activity is key; their methods centre on RNAi knockdowns, but lately they have been taking high-resolution DIC movies of embryos
- Trying to build phenotypic descriptors from the images by eye, then classify each embryo with these descriptors.
- Overlaid modules from PPI annotations, found lots of sensible interactions
- Data collection pipeline: strains -> protein patterns + imaging -> more protein patterns -> transformation -> clone into GFP vector -> imaging -> analysis
- Story about one particularly interesting protein (mel-28). Led them to try to sort animals using FACS.
- This pipeline showed them they could do knockdown & ts-allele screens by imaging. First round of scoring: 100,000 images scored by hand (yikes).
- Developed an algorithm to find shapes on the plate, classify the different object types (adults, larvae, eggs) with an SVM, and compute stage ratios (method is called DevStaR)
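The DevStaR-style scoring step is essentially shape classification followed by counting. Here is a minimal sketch of that idea, with invented shape features and toy training data (this is not the actual DevStaR implementation):

```python
# Hypothetical sketch: classify segmented plate objects into developmental
# stages with an SVM, then compute stage ratios. Feature choices and the
# toy training distributions below are invented for illustration.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Toy shape features per detected object: [area, eccentricity, perimeter]
X_train = np.vstack([
    rng.normal([400, 0.6, 90], [40, 0.05, 8], (30, 3)),  # "adults"
    rng.normal([120, 0.8, 45], [15, 0.05, 5], (30, 3)),  # "larvae"
    rng.normal([30, 0.3, 20], [5, 0.05, 3], (30, 3)),    # "eggs"
])
y_train = np.repeat(["adult", "larva", "egg"], 30)

clf = SVC(kernel="rbf", gamma="scale").fit(X_train, y_train)

# Classify objects detected on a new plate and compute the stage ratios
X_plate = rng.normal([120, 0.8, 45], [15, 0.05, 5], (20, 3))
labels = clf.predict(X_plate)
total = len(labels)
ratios = {s: int((labels == s).sum()) / total for s in ("adult", "larva", "egg")}
print(ratios)
```

The interesting design point is that once objects are reduced to stage counts, a whole plate collapses to a few ratios, which is what makes screening at the scale they describe tractable.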
Next up is William Schafer: Investigating genes & behaviour
- Human and rat nervous systems have ~10^11 neurons, too large to study; Drosophila & Danio are around 10^5. C. elegans has only 302, so one can study the path from gene -> molecule -> neuron
- C. elegans has many of the same senses, and can seek or avoid stimuli
- Have alternative behavioural states, behavioural plasticity (simple memory)
- Presented example of reverse genetic study of nervous system genes that have subtle phenotypes (or no apparent phenotype) by eye, but that can be detected by machine vision.
- Presents the WormTracker, which extracts behavioural features in terms of locomotion.
- Clustered similar behavioural phenotypes (253 features, using Kruskal's non-metric scaling of euclidean distances). Not sure what the clustering method was; single linkage, maybe?
- Will be publishing a database of nervous-system mutants in worm (phenotypic descriptors, under different conditions, with different knockouts): films, reduced descriptors, etc.
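For the clustering step above, hierarchical clustering over standardized behavioural feature vectors would look roughly like this (toy data; the actual distance and linkage used in the talk weren't clear from my notes, so single linkage here is just a guess):

```python
# Sketch: cluster strains by high-dimensional behavioural feature vectors.
# The data are synthetic; 253 matches the feature count from the talk.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import pdist

rng = np.random.default_rng(1)

# Toy: 12 strains x 253 behavioural features, two underlying phenotype groups
features = np.vstack([
    rng.normal(0.0, 1.0, (6, 253)),
    rng.normal(3.0, 1.0, (6, 253)),
])

# Standardize each feature, compute pairwise euclidean distances,
# then build a single-linkage hierarchical clustering
z = (features - features.mean(axis=0)) / features.std(axis=0)
d = pdist(z, metric="euclidean")
tree = linkage(d, method="single")
clusters = fcluster(tree, t=2, criterion="maxclust")
print(clusters)  # cluster label per strain
```

With this many features, distances concentrate, so even subtle per-feature differences can separate groups cleanly once aggregated, which is presumably why machine vision picks up phenotypes invisible by eye.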
Next up is H. Jhuang, going to talk about automated behavioural phenotyping in mouse.
- Purpose is to reduce human induced bias (measurement, environmental) in measuring mouse phenotype
- There are sensor-based and video-tracking approaches, but they are too coarse-grained to capture complex or subtle phenotypes
- Vision based approaches may be better, but they rely on ambiguous shape measurements. Commercial approaches are very expensive
- Their lab wants to build a hybrid approach that is open source & inexpensive.
- Use a series of directional Gabor filters to extract motion in different directions -> extract directional features.
- Combine the motion features with speed and direction features and feed them into an SVM-HMM model to characterize the mouse's behaviour in the video
- Use GPU implementation, neat!
- Paper is Jhuang, Garrote, Yu, … Nature Communications 2010
- The system can be applied to find behavioural differences and to classify mouse strains. It can learn new behaviours with minimal training video data.
- Further work will look into applying system to social behaviour
Next up is Pietro Perona talking about automating the analysis of social behaviour in Drosophila
- Machine vision guy, looking to classify and then detect behaviour.
- Has developed a factor graph of fly behaviour, using it to look at video to try and extract fly movements and use them to classify behaviour.
- Parameterizes flies with ~25 parameters. Detects behaviours by treating each fly as a point in a 25-dimensional space, with different behaviours appearing as sudden deviations in the fly's path through that space over time
- Performance of their system is good when compared with human expert annotation (Dankert et al., Nature Methods, April 2009)
- Really really interesting results from large arena studies (regularized stopping behaviour, jousting differentiated by sex, etc.)
- A couple of other demos of Ethometrics (automating the measurement of behaviour): more reproducible than behavioural observation by humans. Advantages include less anthropomorphic pollution, and forcing explicit assumptions about behavioural events.
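The "behaviour as a sudden deviation in a high-dimensional path" idea can be sketched with a simple outlier test on step lengths in parameter space. Everything here (the synthetic trajectory, the robust threshold) is my invention; Perona's actual detector is far more sophisticated:

```python
# Sketch: flag candidate behavioural events as outlier steps in a
# high-dimensional pose-parameter trajectory (synthetic data).
import numpy as np

rng = np.random.default_rng(2)

# Toy trajectory: 200 frames of a fly in a 25-D parameter space,
# smooth drift with an abrupt synthetic "lunge" inserted at frame 120
T, D = 200, 25
traj = np.cumsum(rng.normal(0, 0.05, (T, D)), axis=0)
traj[120:] += 3.0  # sudden jump in parameter space

# Flag frames whose step length is an outlier relative to a robust baseline
steps = np.linalg.norm(np.diff(traj, axis=0), axis=1)
med = np.median(steps)
mad = np.median(np.abs(steps - med))
events = np.where(steps > med + 10 * mad)[0] + 1
print(events)  # frames flagged as candidate behavioural events
```

Using the median and MAD rather than mean and standard deviation keeps the baseline from being dragged upward by the very events you are trying to detect.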
Last is Eugene Myers from Janelia Farm. He’s giving an overview of several interesting projects that he and his group have been working on over the past year and a bit.
- First up: Tracking centrosomes in worm embryogenesis. Centrosomes are very small, hard to detect.
- Find a core sub-track you are ‘sure’ of (where the spatial track and the appearance track agree); the meat of this part of the talk was published in Bioinformatics 2010 (Stephan Jaensch et al.). This might be useful for discovering other, less regular small objects in images, even though the model was built for video data.
- Learn statistics of true deltas, extrapolate using learned statistics.
- Now talking about the worm atlas; wants to use it to identify cells in L1 worms. Results are excellent; the limiting factor is labeled data for certain cells. Work continues on this, now on template-based (rather than segmentation-based) methods
- Tracking whiskers of head-fixed mice. Interesting example of a task that requires astoundingly accurate models that work fast to be useful at all (99.997 % accurate whisker detection and labeling, impressive).
- Next up is the structure of the fly brain. Upcoming work (PLoS 2011) is image analysis to map neurons in the fly brain. Can trace individual neurons but not collections of them. Can register to within 1–3 microns reliably
- Calls it “hierarchical shotgun mapping”: promoter -> individual neurons -> iterations.
- 3D registration of the brain images is hard (only 40% aligned well). After optimizing staining and acquisition this improved to 75%. The results were used to refine dissection protocols, and they now have around 95% accuracy. The aligned consensus of neurons reveals sub-compartments not visible in any one stack! Wow.
- Lastly, talking about early work on imaging a whole mouse brain. The volume is 4.2 trillion voxels! Developing a microscope to fully image a whole mouse brain in one week. Amazing!
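The centrosome-tracking idea from earlier (learn the statistics of the frame-to-frame deltas on a ‘sure’ core sub-track, then extrapolate) can be sketched like this. The Gaussian step model and the `extend` helper are my invention, not the paper’s method:

```python
# Sketch: extend a confident core track by scoring candidate detections
# in the next frame against step statistics learned from the core track.
import numpy as np

rng = np.random.default_rng(3)

# A confident 2-D core sub-track with a consistent drift (synthetic)
core = np.cumsum(rng.normal([1.0, 0.2], 0.1, (20, 2)), axis=0)

# Learn the statistics of the "true" deltas from the core track
deltas = np.diff(core, axis=0)
mu, cov = deltas.mean(axis=0), np.cov(deltas.T)
inv_cov = np.linalg.inv(cov)

def extend(track, candidates):
    """Pick the candidate whose implied step is most probable
    (minimum squared Mahalanobis distance under the learned model)."""
    d = candidates - track[-1]
    scores = np.einsum("ij,jk,ik->i", d - mu, inv_cov, d - mu)
    return candidates[np.argmin(scores)]

# Candidate detections in the next frame: one consistent with the learned
# motion, two spurious ones far off the expected step
expected = core[-1] + mu
candidates = np.vstack([
    expected + rng.normal(0, 0.05, 2),
    core[-1] + [5.0, -3.0],
    core[-1] + [-2.0, 2.0],
])
nxt = extend(core, candidates)
print(nxt)
```

Starting from a sub-track you trust and letting its own statistics guide the extrapolation is what makes this workable for objects as small and noisy as centrosomes, and is plausibly why it would transfer to other small-object tracking problems.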
That’s it! More on the big picture and my impressions a little later.