
A method for calculating the toxicity level of chemicals

A new computational method, designed to assess in advance whether a chemical will be toxic, was published a while ago in the scientific journal International Journal of Data Mining and Bioinformatics.

Chemical laboratory. Photo: shutterstock

The chemical industry and related industries face increasing pressure today to ensure that their products meet a growing number of safety regulations. The ability to provide legislators, manufacturers and consumers with the data they need to make informed choices about use, waste disposal, recycling and anything else that touches on human health and environmental sustainability has become critical.

Now, researchers from the University of Kansas have developed a computational method that allows industry to predict whether a given compound will be toxic even at a low concentration, and to find a substitute for it if necessary. Toxicity is almost always a consideration when determining a drug's availability and dosage. A compound can be toxic whether it is natural or synthetic, from snake venom and jellyfish stings to petrochemicals and pesticides. Some chemicals, however, are more toxic than others, so exposure to even a small concentration of them can cause health problems or death. It is therefore very important to have a way of determining whether a given synthetic or natural chemical is likely to cause toxicity problems.

The research team points out that the list published by the US Environmental Protection Agency (EPA) and its Office of Toxic Substances (OTS) contains some 70,000 industrial chemicals, with about 1,000 added every year, for which not even a simple toxicity test has been performed. The reasons are mainly cost and logistics, along with the ethical question of whether performing so many experiments on laboratory animals is really justified.

Now, a team of researchers from the Department of Electrical Engineering and Computer Science at the University of Kansas has used a statistical algorithm on more than 300 chemicals whose toxicity characteristics are already known to science, comparing the known values with the algorithm's predictions. Their technique offers a computerized way to scan a huge number of compounds for toxicity very quickly, and it may eliminate the need for laboratory animals, provided that regulatory authorities such as the FDA do not require additional experimental data for a given compound.
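As an illustration only, this kind of validation can be sketched with standard statistical-learning tools. The sketch below is not the researchers' code: the data is synthetic, and the feature matrix X merely stands in for molecular descriptors of the kind discussed in the next paragraph; the point is the comparison of a trained classifier against a random-guess baseline on compounds with known toxicity labels.

```python
# Illustrative sketch only (not the researchers' code): a statistical classifier
# is scored by cross-validation on compounds with known toxicity labels and
# compared against a random-guess baseline. The data here is synthetic; in
# practice X would hold molecular descriptors and y the known labels.
import numpy as np
from sklearn.dummy import DummyClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 20))                    # 300 compounds x 20 placeholder descriptors
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)     # synthetic "toxic" / "non-toxic" labels

model = RandomForestClassifier(n_estimators=200, random_state=0)
baseline = DummyClassifier(strategy="uniform", random_state=0)  # random guessing

print("model accuracy:  ", cross_val_score(model, X, y, cv=5).mean())
print("random baseline: ", cross_val_score(baseline, X, y, cv=5).mean())
```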

The research is based on the well-known and established principles of a computational model from the field of pharmacy known as Quantitative Structure-Activity Relationships (QSAR), in which the types of atoms in a compound and the relationships between them are used to predict its activity as a drug. Certain molecular structures, for example, are soluble in water, or react in a defined way with various enzymes and proteins in the body, and this underlies the overall activity of the drug. Similar molecular characteristics give rise to similar molecular behavior in different environments: high or low solubility in water, strong or weak affinity for receptors in the body, and so on. The researchers used the same method to identify, across the various compounds, the chemical groups and patterns of atom connectivity that may cause toxicity in the body.
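A minimal sketch of the QSAR idea, assuming the open-source RDKit library (which is not mentioned in the study): each compound is reduced to a small vector of structural descriptors that can then feed a toxicity model. The SMILES input and the choice of descriptors are illustrative only.

```python
# A minimal QSAR-style sketch, assuming the open-source RDKit library: each
# compound is reduced to a handful of structural descriptors (molecular weight,
# lipophilicity, hydrogen-bonding capacity, ring count) that can feed a
# toxicity model. The SMILES input and descriptor choice are illustrative only.
from rdkit import Chem
from rdkit.Chem import Descriptors

def qsar_descriptors(smiles: str) -> dict:
    """Compute a small descriptor vector for one compound given as SMILES."""
    mol = Chem.MolFromSmiles(smiles)
    return {
        "mol_weight":  Descriptors.MolWt(mol),
        "logP":        Descriptors.MolLogP(mol),        # lipophilicity, related to water solubility
        "h_donors":    Descriptors.NumHDonors(mol),     # hydrogen-bond donors
        "h_acceptors": Descriptors.NumHAcceptors(mol),  # hydrogen-bond acceptors
        "rings":       Descriptors.RingCount(mol),
    }

print(qsar_descriptors("c1ccccc1O"))  # phenol, as an example input
```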

The researchers note that some previous attempts to predict the toxicity of chemicals did succeed, but most of the approaches used performed no better than random guessing. The researchers' own statistical approach raises the chances of success well beyond chance level. Given that roughly 100,000 industrial chemicals still require toxicity mapping, the new method will allow industry and the regulatory authorities to focus first on the most urgent chemicals in this collection, those for which the prediction indicates the highest toxicity, and postpone the treatment of the less toxic ones until additional data become available. The researchers are now focusing on improving the algorithm so that it runs faster and more accurately, while ignoring molecular features that are now known not to contribute to a substance's toxicity.
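The prioritization step described above can likewise be sketched, again with synthetic placeholder data and hypothetical compound names rather than anything from the study: a trained model scores the untested inventory, and compounds are ranked by predicted probability of toxicity so that the most suspect ones are examined first.

```python
# Illustrative prioritization sketch with synthetic placeholder data (nothing
# here comes from the study): a trained model scores an untested inventory,
# and compounds are ranked by predicted probability of toxicity so the most
# suspect ones can be tested first.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
X_known = rng.normal(size=(300, 20))            # descriptors of compounds with known toxicity
y_known = rng.integers(0, 2, size=300)          # their (synthetic) toxic / non-toxic labels
X_untested = rng.normal(size=(5, 20))           # descriptors of the inventory to screen
ids = [f"compound_{i}" for i in range(5)]       # hypothetical compound names

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_known, y_known)
p_toxic = model.predict_proba(X_untested)[:, 1]  # predicted probability of the "toxic" class
for i in np.argsort(p_toxic)[::-1]:              # highest predicted risk first
    print(f"{ids[i]}: predicted toxicity probability {p_toxic[i]:.2f}")
```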

The news about the study
The original article
