
Text mining on the introduction and conclusion sections of the CLARITY paper
Yesterday, Karl Deisseroth published one of the most revolutionary technical papers in the history of modern neuroscience: «Structural and molecular interrogation of intact biological systems», Nature 2013.
Chung and colleagues describe a revolutionary technique for clarifying brain tissue (here is the protocol). Previous attempts have been made to render tissue optically clear, but all of them achieved only partially clear tissue, and at the expense of tissue integrity.
The technique described by Chung and colleagues, named CLARITY, presents several key advantages that will catalyze a paradigm shift in neuroscience (read also about the Brain Activity Map initiative).
Here are some key features and future challenges posed by CLARITY.
Key features of CLARITY:
[1] Intact brain: The brain and its structures can be accessed without sectioning. This means no tissue deformation and more accurate 3D reconstruction of processes and cellular compartments.
[2] Reduced protein loss: only 8% of proteins are lost during the process.
[3] Macromolecule permeable: The whole brain can be stained with regular immunostaining protocols. Chung et al. were able to stain (and visualize!) synaptic puncta (PSD-95 on dendritic spines) in the whole brain.
[4] Multiple-round molecular phenotyping: This is really exciting. The whole brain can be stripped (similarly to what you would do on a membrane after western blotting): use and reuse the brain to stain for different markers. And this leads to the next point.
[5] Reduce, reuse: The very same brain can be used and reused, possibly reducing the number of subjects needed for an experiment.
[6] Fixed brain tissue: CLARITY works on fixed human brain tissue. Accurate reconstruction of neurites/projections will finally be available for human tissue. CLARITY was even used on stored tissue that had been sitting in formalin for years!
Key challenges posed by CLARITY:
This technique is extremely exciting and I truly believe it bears the potential for a paradigm shift in neuroscience. Of course, this great potential comes with a number of technical and theoretical challenges:
1. Treating the whole brain as a source of information. From the brain to the Data-Brain.
As Chung and colleagues state in the paper (page 1, end of the introduction), the whole point of CLARITY was to «physically support the tissue and SECURE BIOLOGICAL INFORMATION». The whole brain is now a bank storing a wealth of information. We can now access an unprecedented level of multi-layered/multi-scale data (from the subcellular to the systems level) about neurons and their activity.
But are we ready? We need to develop computational tools to deal with the whole brain at this expanded scale (3D registration of brains at the cellular scale, 3D image segmentation, automatic neurite tracing… see an example here). We will also need a new generation of microscopes to acquire and store all this information.
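To give a feel for the kind of 3D image-analysis tooling I mean, here is a minimal, entirely hypothetical sketch of one basic building block: thresholding a 3D volume and counting labeled connected components, on synthetic data rather than real CLARITY stacks (which are orders of magnitude larger and need far more sophisticated pipelines).

```python
# Hypothetical sketch: counting cell-like blobs in a synthetic 3D volume.
import numpy as np
from scipy import ndimage

# Synthetic 3D "image stack": background noise plus two bright blobs.
rng = np.random.default_rng(0)
volume = rng.normal(100, 5, size=(50, 50, 50))
volume[10:15, 10:15, 10:15] += 200   # blob 1
volume[30:36, 30:36, 30:36] += 200   # blob 2

# Threshold, then label connected components in 3D.
mask = volume > 200
labels, n_objects = ndimage.label(mask)

# Centroid of each labeled object, e.g. as input to downstream tracing.
centroids = ndimage.center_of_mass(mask, labels, range(1, n_objects + 1))
print(n_objects)  # 2
```

Real whole-brain data would of course demand out-of-core processing, registration across samples, and much smarter segmentation than a global threshold; this only illustrates why dedicated tools are needed.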
And once we have the whole brain mapped in all its detail? Do we have the theoretical frameworks to deal with this wealth of information? Do we really need to map the WHOLE brain at the cellular scale to understand behavior and brain pathologies?
Are we approaching the BIG DATA era in science? Storing immense databases of cellular and subcellular data for the whole brain? Will an army of data miners going through the data-brain understand how the brain produces behavior? I find the idea of transforming a living, wet, biological brain into a data-brain extremely exciting. It is a new framework, a new approach to (or point of view on) the brain and its biology. I am just saying that we now need to develop the tools (computational, statistical…) to deal with the richness we are about to encounter. We need to change the way we conceive experiments. And this is where the paradigm shift happens.
2. Clearing the brain and preserving your own brain.
A mixture of passion, excitement and dedication to our work (and in some cases stupidity…) often makes us sacrifice our own safety in favour of scientific discovery (read the gastritis story). I have spent many years in several labs, and sometimes I see things that should not happen (dealing with chemicals without gloves, handling chemicals while totally ignoring the risks they pose to your own health). Bad, very bad. Dangerous, very dangerous. So, back to CLARITY: be cautious when perfusing with the hydrogel (a nice name for a «lethal» mix of acrylamide, bis-acrylamide, PFA, and SDS too). Chung et al. clearly state in the methods section that the hydrogel is neurotoxic and carcinogenic. To conclude: kids, DO try this at home (lab), but your health (and your fellows', too) comes first! I don't think that reading your amazing Science/Nature/Neuron/Cell papers will ever be of comfort to you if you are lying in a hospital bed with lung cancer. (Sorry, it is an awful image, but laxity in lab-safety procedures really pisses me off.)
3. What next at the Allen Brain Institute?
The Allen Brain Institute was set up to provide a comprehensive map of the brain (see its recent connectivity project). The institute has an impressive automated workforce (and pipeline) to process sliced tissue. Will they adopt CLARITY in the future? (Luckily for them, brain slicing was the only step that they couldn't automate…)
4. When will CLARITY become a standard for brain studies?
As I said before, this is revolutionary. It will change the way we deal with the brain (data-brain). So my question is: when will it be a common procedure in neuroscience labs? 2–5 years? 10–20 years? And this leads to the next question.
5. Can I borrow 2 L of your primary antibody?
OK, this is an exaggeration. There is no need to use liters of antibody; increasing the incubation time will do (I think). But nonetheless, how many labs will have the infrastructure to acquire, store and process the data-brain? It is not yet clear to me whether the CLARITY procedure can be applied to brain parts (say the hippocampus, striatum or amygdala). This would make the approach more tractable (in terms of both money and complexity).
6. Reduce, Reuse, Reduce, Reuse…
Maybe we will no longer need large sample sizes. We can use and reuse the same brain. This may mean a reduction in animal usage for research purposes.
Maybe we will embrace a full paradigm shift and move from null hypothesis testing to a pure Bayesian approach, finally testing our hypotheses and achieving higher statistical power with smaller sample sizes.
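To make the Bayesian point concrete, here is a toy, purely illustrative sketch (my own example, not from the paper): the textbook conjugate normal-normal update for a group mean with known measurement noise. Even with a small sample, the posterior already concentrates around the underlying mean, which is the intuition behind doing more with fewer animals.

```python
# Toy sketch: Bayesian updating of a group mean with a small sample.
# Assumes a normal likelihood with KNOWN noise sd and a conjugate
# normal prior on the mean (the simplest textbook case).
import numpy as np

def posterior_mean_sd(data, sigma, prior_mu, prior_sd):
    """Conjugate normal-normal update: returns posterior mean and sd."""
    n = len(data)
    prec = 1.0 / prior_sd**2 + n / sigma**2                  # posterior precision
    post_mu = (prior_mu / prior_sd**2 + np.sum(data) / sigma**2) / prec
    return post_mu, np.sqrt(1.0 / prec)

rng = np.random.default_rng(1)
sigma = 2.0                        # assumed known measurement noise
data = rng.normal(5.0, sigma, 8)   # small sample: only n = 8 "subjects"

# Weak prior centered at 0; the data quickly dominate it.
mu, sd = posterior_mean_sd(data, sigma, prior_mu=0.0, prior_sd=10.0)
print(round(mu, 2), round(sd, 2))
```

The posterior sd here shrinks with every added observation, so the uncertainty about the mean is quantified directly instead of through a p-value against a null.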