Do you want:

- to generate samples that are both diverse and interesting, as in an adversarial process (GAN)?
- to direct this generative process towards certain objectives, as in Reinforcement Learning (RL)?
- to work with discrete sequence data (text, musical notation, SMILES, ...)?
Then maybe ORGAN (Objective-Reinforced Generative Adversarial Networks) is for you. Our approach lets you define simple reward functions to bias the model and generate sequences in an adversarial fashion, improving a given objective without losing "interestingness" in the generated data.
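To make the idea of a reward function concrete, here is a minimal sketch. Note that `tonality_reward` and `batch_rewards` are hypothetical illustrations, not part of ORGAN's actual metric API; the point is simply that each generated sequence is mapped to a score in [0, 1] that the RL component can push up while the GAN keeps samples realistic.

```python
# Hypothetical reward function for sequence generation (not ORGAN's real API).
# Each metric maps a generated sequence to a scalar score in [0, 1].

def tonality_reward(sequence, scale=frozenset("CDEFGAB")):
    """Fraction of tokens in `sequence` that belong to a given musical scale."""
    if not sequence:
        return 0.0
    in_scale = sum(1 for token in sequence if token in scale)
    return in_scale / len(sequence)

def batch_rewards(sequences):
    """Score a batch of generated sequences, as an RL step would each epoch."""
    return [tonality_reward(s) for s in sequences]
```

During training, scores like these are combined with the discriminator's output, so the generator is rewarded both for fooling the discriminator and for scoring well on the chosen objective.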
If you are interested in the specific application of ORGAN to chemistry, please check out ORGANIC.
How to train
First make sure you have all dependencies installed by running pip install -r requirements.txt.
We provide a working example that can be run with python example.py. ORGAN can be used in 5 lines of code:
```python
from organ import ORGAN

model = ORGAN('test', 'music_metrics')              # Loads an ORGAN with name 'test', using music metrics
model.load_training_set('../data/music_small.txt')  # Loads the training set
model.set_training_program(['tonality'], [50])      # Sets the training program: 50 epochs with the tonality metric
model.load_metrics()                                # Loads all the metrics
model.train()                                       # Proceeds with the training
```
Training might take several days, depending on the dataset and sequence length. For this reason, a GPU is recommended (although this model has not yet been parallelized for multiple GPUs).
About
Objective-Reinforced Generative Adversarial Networks (ORGAN) for Sequence Generation Models