De Novo Design

The de novo design function uses transformer architectures, which are well suited to processing sequential data such as molecular representations like SMILES strings. Transformers use self-attention mechanisms to capture long-range dependencies and chemical patterns, enabling the generation of novel molecules with desired properties. Trained on large molecular datasets, transformers learn the syntax and structure of chemical sequences, making them effective for designing drug-like compounds, optimizing molecular properties, and advancing generative chemistry. For small molecules, we use force-field-based descriptors and atom types to ensure accurate chemistry and reliable estimation of protein-ligand interactions.
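Before a transformer can learn the syntax of chemical sequences, each SMILES string must be split into tokens (multi-character atoms such as Cl and Br, bracketed atoms, ring-closure digits, bond symbols). The sketch below shows a minimal regex-based SMILES tokenizer of the kind commonly used as a preprocessing step; the token pattern is a simplified assumption for illustration, not the tool's actual vocabulary.

```python
import re

# Simplified SMILES token pattern (an illustrative assumption):
# bracketed atoms, two-letter elements, common organic-subset atoms,
# ring-closure digits, bonds, branches, charges, and stereo marks.
SMILES_TOKEN_PATTERN = re.compile(
    r"(\[[^\]]+\]|Br|Cl|Si|@@|[BCNOPSFIbcnops]|%[0-9]{2}|[0-9]"
    r"|=|#|\(|\)|\+|-|/|\\|\.)"
)

def tokenize_smiles(smiles: str) -> list[str]:
    """Split a SMILES string into model-ready tokens."""
    tokens = SMILES_TOKEN_PATTERN.findall(smiles)
    # Sanity check: the tokens must reassemble the original string,
    # otherwise the string contains characters outside the vocabulary.
    assert "".join(tokens) == smiles, f"untokenizable characters in {smiles!r}"
    return tokens

# Example: aspirin, with aromatic ring-closure digits and branches.
print(tokenize_smiles("CC(=O)Oc1ccccc1C(=O)O"))
```

In a full generation pipeline, these tokens are mapped to integer IDs, the transformer is trained to predict the next token, and sampled sequences are decoded back to SMILES and filtered for chemical validity.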

Inputs:
- Upload the protein pocket (PDB format)
- Number of molecules to be generated (default: 10)

Click Submit to run the design job, or Load Example to populate the form with sample inputs.