NAS (Neural Architecture Search)
- [x] (ICLR 2017, Google Brain) Neural Architecture Search with Reinforcement Learning https://arxiv.org/abs/1611.01578
- [ ] (2019.11) Meta-Learning of Neural Architectures for Few-Shot Learning https://arxiv.org/abs/1911.11090v1 : combines meta-learning with NAS
- [x] (2019.01) Designing neural networks through neuroevolution : survey of neuroevolution (NE) methods
- [ ] (2017.09) Evolution Strategies as a Scalable Alternative to Reinforcement Learning https://openai.com/blog/evolution-strategies/, https://arxiv.org/abs/1703.03864 : NES matches DQN and A3C (though not fully gradient-free)
- [ ] (2018.04) Deep Neuroevolution: Genetic Algorithms Are a Competitive Alternative for Training Deep Neural Networks for Reinforcement Learning https://arxiv.org/abs/1712.06567 : gradient-free NE method competitive with DQN and A3C
- [ ] (2018.04) Simple random search provides a competitive approach to reinforcement learning https://arxiv.org/abs/1803.07055 : simplified NE approach (random search) competitive with TRPO, PPO, and DDPG
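The ES approach in the OpenAI paper above estimates a gradient purely from episode returns: perturb the parameters with Gaussian noise, then step along the fitness-weighted average of the noise vectors. A minimal sketch (function names, step sizes, and the toy fitness are illustrative, not from the paper):

```python
import numpy as np

def es_gradient_step(theta, fitness, rng, sigma=0.1, alpha=0.01, n=50):
    """One ES update: sample n Gaussian perturbations, evaluate fitness at
    each perturbed point, and step along the fitness-weighted noise average."""
    eps = rng.standard_normal((n, theta.size))                      # noise samples
    rewards = np.array([fitness(theta + sigma * e) for e in eps])
    rewards = (rewards - rewards.mean()) / (rewards.std() + 1e-8)   # rank-free normalization
    grad = eps.T @ rewards / (n * sigma)                            # score-function estimate
    return theta + alpha * grad

# toy fitness (assumed for illustration): maximize -||theta - 3||^2, optimum at theta = 3
f = lambda th: -np.sum((th - 3.0) ** 2)
rng = np.random.default_rng(0)
theta = np.zeros(2)
for _ in range(300):
    theta = es_gradient_step(theta, f, rng)
```

Only fitness values are needed, never backpropagation through the policy, which is why the method parallelizes so well across workers.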
Combining gradient-based methods with neuroevolution
- [ ] (2018.05) Safe Mutations for Deep and Recurrent Neural Networks through Output Gradients https://arxiv.org/abs/1712.06563 : scales mutations by output gradients so they preserve the network's existing state-to-action behavior
- [ ] (ICLR 2018.05) Policy Optimization by Genetic Distillation https://arxiv.org/abs/1711.01012 : genetic policy optimization
- [ ] (ICLR 2018) Noisy Networks for Exploration https://arxiv.org/abs/1706.10295
- [ ] (ICLR 2018) Parameter Space Noise for Exploration https://arxiv.org/abs/1706.01905
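The two ICLR 2018 papers above both move exploration noise from the action space into the parameter space: the agent acts greedily with one noisy copy of its weights per episode, giving temporally consistent exploration. A minimal sketch of the idea, including the adaptive noise scale from the parameter-space-noise paper (the helper names and the 1.01 factor are assumptions for illustration):

```python
import numpy as np

def perturb(params, stddev, rng):
    """Sample one noisy copy of the policy weights; the agent then acts
    greedily with this copy for the whole episode."""
    return {k: w + stddev * rng.standard_normal(w.shape) for k, w in params.items()}

def adapt_stddev(stddev, policy_distance, target, factor=1.01):
    """Grow the noise scale when the perturbed policy behaves too similarly
    to the unperturbed one; shrink it when behavior diverges too much."""
    return stddev * factor if policy_distance < target else stddev / factor

rng = np.random.default_rng(0)
noisy = perturb({"w": np.zeros((2, 2))}, stddev=0.1, rng=rng)
```

Adapting the scale from a measured policy distance keeps the perturbation meaningful even as weight magnitudes change during training.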
Next-generation evolutionary algorithms
- [ ] The Surprising Creativity of Digital Evolution: A Collection of Anecdotes from the Evolutionary Computation and Artificial Life Research Communities https://arxiv.org/abs/1803.03453 : survey
- [ ] (NIPS 2018) Improving Exploration in Evolution Strategies for Deep Reinforcement Learning via a Population of Novelty-Seeking Agents https://arxiv.org/abs/1712.06560
- [ ] (NIPS workshop 2018) Deep Curiosity Search: Intra-Life Exploration Can Improve Performance on Challenging Deep Reinforcement Learning Problems https://arxiv.org/abs/1806.00553
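The novelty-seeking agents in the NIPS 2018 paper above reward behaviors that differ from an archive of past behaviors rather than (or in addition to) raw return. The standard novelty score is the mean distance to the k nearest archived behavior characterizations; a minimal sketch (the behavior representation as a plain vector is an assumption):

```python
import numpy as np

def novelty(behavior, archive, k=3):
    """Novelty of a behavior characterization = mean Euclidean distance
    to its k nearest neighbors in the archive of past behaviors."""
    dists = np.linalg.norm(np.asarray(archive) - np.asarray(behavior), axis=1)
    return np.sort(dists)[:k].mean()

archive = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [6.0, 6.0]]
score = novelty([0.0, 0.0], archive)   # near old behaviors -> low novelty
```

Agents whose behavior lands far from everything in the archive score high and are favored, which keeps the population exploring even when the reward signal is deceptive.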
Architecture evolution
- [ ] From Nodes to Networks: Evolving Recurrent Neural Networks https://arxiv.org/abs/1803.04439
- [ ] (AAAI 2019) Regularized Evolution for Image Classifier Architecture Search https://arxiv.org/abs/1802.01548
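The regularized (aging) evolution of the AAAI 2019 paper above is a small loop: keep the population in a queue, each cycle sample a tournament, mutate the best sampled architecture, and discard the *oldest* member instead of the worst. A minimal sketch on a toy search space (the bit-string space and its fitness are assumptions for illustration, not the paper's image-classifier search space):

```python
import random
from collections import deque

def regularized_evolution(fitness, mutate, random_arch,
                          pop_size=20, sample_size=5, cycles=200, seed=0):
    """Aging evolution: each cycle samples a tournament, mutates the best
    sampled architecture, and removes the oldest population member."""
    rng = random.Random(seed)
    population = deque()
    for _ in range(pop_size):
        arch = random_arch(rng)
        population.append((arch, fitness(arch)))
    history = list(population)
    for _ in range(cycles):
        sample = rng.sample(list(population), sample_size)
        parent = max(sample, key=lambda p: p[1])
        child = mutate(parent[0], rng)
        population.append((child, fitness(child)))
        population.popleft()          # age-based removal, not worst-based
        history.append(population[-1])
    return max(history, key=lambda p: p[1])

# toy search space (assumed): 10-bit strings, fitness = number of 1 bits
rand_arch = lambda rng: tuple(rng.randint(0, 1) for _ in range(10))
def flip_one_bit(arch, rng):
    i = rng.randrange(len(arch))
    return arch[:i] + (1 - arch[i],) + arch[i + 1:]

best_arch, best_fit = regularized_evolution(sum, flip_one_bit, rand_arch)
```

Removing the oldest member regularizes the search: no architecture survives on one lucky evaluation, since staying in the population requires its descendants to keep re-proving its fitness.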