ppo-LunarLander-v2 / config.json
{
  "policy": "MlpPolicy",
  "learning_rate": 0.0003,
  "n_steps": 1024,
  "batch_size": 64,
  "n_epochs": 4,
  "gamma": 0.999,
  "gae_lambda": 0.98,
  "ent_coef": 0.01
}
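The keys above match the keyword arguments of stable-baselines3's `PPO` constructor, which suggests (an assumption, not stated in the file itself) that the config was saved from an SB3 training run. A minimal sketch of reading the file and splitting it into a policy name plus constructor kwargs:

```python
import json

# Raw contents of config.json, copied verbatim from the file above.
raw = (
    '{"policy": "MlpPolicy", "learning_rate": 0.0003, "n_steps": 1024, '
    '"batch_size": 64, "n_epochs": 4, "gamma": 0.999, "gae_lambda": 0.98, '
    '"ent_coef": 0.01}'
)

config = json.loads(raw)
policy = config.pop("policy")  # "MlpPolicy"; remaining keys are PPO kwargs

# Assuming stable-baselines3 and gymnasium are installed, the model could
# then be rebuilt with:
#   from stable_baselines3 import PPO
#   model = PPO(policy, "LunarLander-v2", **config)
print(policy, config["gamma"])
```

Note that `n_steps=1024` with `batch_size=64` divides evenly (16 minibatches per rollout), which SB3's PPO expects; the high `gamma=0.999` reflects LunarLander's long episodes.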