462 medicinal products were withdrawn from the market between 1953 and 2013, most commonly because of hepatotoxicity. Determining toxicity at the early stages of the drug discovery pipeline is therefore crucial, and at this stage ‘in silico’ techniques offer a quick and effective way to do so. Over the past decades, rapid technological advances in storing and processing big data have allowed scientists to apply so-called data-hungry deep learning algorithms to tasks such as finding anomalies in climate simulations, learning patterns in cosmology mass maps, decoding speech from human neural recordings, and classifying new-physics events at the Large Hadron Collider. In recent years, deep learning has also entered pharmaceutical research. The Tox21 Data Challenge results showed that deep learning surpassed many other computational approaches, such as naive Bayes, support vector machines, and random forests. However, the problems of small and imbalanced datasets remained unsolved.

To address these problems, we set up a deep learning framework, named deepHUNT, that utilises a method called COVER (conformational oversampling). It is the first deep learning framework that classifies toxicity from images of 3D conformers. The architecture uses 2D convolutional layers and, to reduce possible bias towards the training set, employs 5×4-fold cross-validation. Preliminary results are quite encouraging and demonstrate the applicability of deepHUNT to ‘in silico’ toxicology. Finally, this work was made possible by the financial support provided by eTRANSAFE.
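Two ideas from the abstract can be sketched in a few lines of code: repeated k-fold cross-validation (the 5×4-fold scheme) and oversampling of the minority class, where COVER would generate additional 3D conformers of the same molecule. The sketch below is illustrative only: the function names and signatures are not the actual deepHUNT API, and the oversampling step merely replicates minority-class indices as a stand-in for generating real extra conformers.

```python
import random

def repeated_kfold(n_samples, n_repeats=5, n_folds=4, seed=0):
    """Yield (train_idx, test_idx) pairs for repeated k-fold CV.

    A minimal sketch of 5x4-fold cross-validation: the data are
    reshuffled for each of the 5 repeats and split into 4 folds,
    giving 20 train/test splits in total.
    """
    rng = random.Random(seed)
    for _ in range(n_repeats):
        idx = list(range(n_samples))
        rng.shuffle(idx)
        fold_size = n_samples // n_folds
        for f in range(n_folds):
            start = f * fold_size
            end = start + fold_size if f < n_folds - 1 else n_samples
            yield idx[:start] + idx[end:], idx[start:end]

def oversample_minority(labels, n_extra=2):
    """Crude stand-in for conformational oversampling (COVER).

    For each minority-class sample, append `n_extra` extra copies of
    its index; in the real method these copies would be images of
    additional 3D conformers of the same molecule, not duplicates.
    """
    minority = min(set(labels), key=labels.count)
    extra = [i for i, y in enumerate(labels) if y == minority] * n_extra
    return list(range(len(labels))) + extra
```

In practice the oversampling would be applied only inside each training split, never to the held-out test fold, so that the reported performance is not inflated by duplicated minority samples.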