PPO-FrozenLakeV1-rlclass / results.json
clement-w · "Try to upload the video preview" · 99b5bac
{"mean_reward": 0.8, "std_reward": 0.4, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2022-06-07T12:54:21.033217"}
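For context, a record like this can be produced by evaluating a trained agent over a fixed number of episodes and dumping the summary statistics. The sketch below is a minimal, hypothetical reconstruction (not the author's actual script): the `episode_rewards` list is invented to illustrate how `mean_reward` and `std_reward` could arise on FrozenLake-v1, where each episode returns 1.0 on reaching the goal and 0.0 otherwise.

```python
import json
import statistics
from datetime import datetime, timezone

# Hypothetical per-episode returns from a deterministic evaluation run
# (n_eval_episodes = 10, matching results.json). 8 successes out of 10
# yields mean 0.8 and population std 0.4, as in the file above.
episode_rewards = [1.0, 1.0, 0.0, 1.0, 1.0, 1.0, 0.0, 1.0, 1.0, 1.0]

results = {
    "mean_reward": statistics.fmean(episode_rewards),
    # population standard deviation (divides by N, like numpy's default np.std)
    "std_reward": statistics.pstdev(episode_rewards),
    "is_deterministic": True,
    "n_eval_episodes": len(episode_rewards),
    "eval_datetime": datetime.now(timezone.utc).isoformat(),
}

# Serialize to the same compact one-line JSON shape as results.json.
print(json.dumps(results))
```

Note that the standard deviation here is the population form; with the sample form (`statistics.stdev`, dividing by N-1) the same rewards would give a slightly larger value.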