Considering a broader range of responses to candidate drugs
Researchers at the Karolinska Institute were using a proteomic screen to determine the mechanism by which cancer drug candidates killed cells when they recognized a potential flaw in the standard experimental design. Amir Ata Saei and colleagues usually treated adherent cancer cells with a compound, rinsed away any cells that detached, and measured how the proteome had changed in the cells that survived. But, the researchers reasoned, the dying cells were responding most strongly to the experimental drug. Shouldn't their responses be examined?
In a study published in Molecular & Cellular Proteomics, the researchers compared the proteomes of cultured cancer cells that survived treatment with various drug candidates with the proteomes of cells that did not. Using drugs with known targets, they found that taking dying cells into consideration improved the accuracy of target identification. To their surprise, the researchers also found that some proteins were upregulated in all detached and dying cells, regardless of the drug used. They propose that these proteins, which had not previously been linked to cell death, may be cellular decision-makers and promising chemotherapeutic targets.
How soy resists salt
How will crops cope with rising sea levels and increasing salt in the water table as the climate changes? In a recent study in Molecular & Cellular Proteomics, researchers at Hangzhou Normal University examined the response of soybeans to salt stress. Salt disrupts mitochondria and increases reactive oxygen species in plant cells; therefore, increasing production of antioxidant molecules such as flavonoids may protect the plant. By combining phosphoproteomics and metabolomics, Erxu Pi and colleagues described a salt-stress signaling pathway in soybean roots that increases flavonoid synthesis and improves salt tolerance. DOI: 10.1074/mcp.RA117.000417
Monitoring longitudinal proteomics studies
Nothing is more frustrating than being forced to throw out data. After taking the time to prepare samples, take measurements and analyze them, a researcher may realize that a problem with an instrument has compromised the results. With luck, the realization happens early, but work still needs to be repeated. In long-running clinical proteomics studies, sometimes the realization doesn't occur until all of the data are pooled, and the cost can be high. In a paper in press in Molecular & Cellular Proteomics, now available online, researchers at the Pacific Northwest National Laboratory working on a longitudinal study of the development of diabetes report that they have developed a tool that can tell in real time whether the quality of spectra has dropped. Bryan Stanfill and colleagues developed an algorithm that considers statistical features of mass spectrometry data to build a continuously updated model for comparison to a baseline of known high-quality data. By intentionally manipulating the instrument, for example by changing an ion lens setting, they showed that the algorithm could identify times when the instrument needed to be recalibrated. The software also flagged points when unplanned cleaning and maintenance were required. The software, called QC-ART, is freely available on the software-sharing site GitHub.
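The paper itself defines QC-ART's statistics; as a loose illustration of the general idea only, not the authors' algorithm, real-time quality monitoring can be sketched as scoring each new run against a baseline of known-good runs and letting runs that pass enlarge the baseline. The function name, the single summary feature per run, and the z-score threshold below are all invented for this sketch.

```python
import statistics

def flag_drifting_runs(baseline, new_runs, threshold=3.0):
    """Flag runs whose quality-control feature deviates from a baseline
    of known high-quality runs. Runs that pass are added to the baseline,
    so the comparison model updates as data accumulate (a rough stand-in
    for a dynamically adjusted model; not QC-ART itself).

    baseline: values of one QC feature from known-good runs
    new_runs: the same feature measured on incoming runs, in order
    Returns a list of booleans, True where a run is flagged as drifted.
    """
    baseline = list(baseline)
    flags = []
    for value in new_runs:
        mean = statistics.mean(baseline)
        sd = statistics.stdev(baseline)
        z = abs(value - mean) / sd       # distance from the good-run baseline
        drifted = z > threshold
        flags.append(drifted)
        if not drifted:
            baseline.append(value)       # only runs that pass update the model
    return flags

# A run consistent with the baseline passes; a large jump is flagged,
# which is when an analyst might recalibrate or clean the instrument.
print(flag_drifting_runs([1.0, 1.1, 0.9, 1.05, 0.95], [1.02, 1.6]))
```

In practice a tool like this would track many features per run and a multivariate distance rather than a single z-score, but the control-flow idea, score against good runs and update the baseline only with runs that pass, is the same.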