The official implementation of the Hybrid Self-Attention NEAT algorithm

Overview

[PUREPLES logo]

PUREPLES - Pure Python Library for ES-HyperNEAT

About

This is a library of evolutionary algorithms with a focus on neuroevolution, implemented in pure Python on top of the neat-python package. It contains faithful implementations of both HyperNEAT and ES-HyperNEAT, which are briefly described below.

NEAT (NeuroEvolution of Augmenting Topologies) is a method developed by Kenneth O. Stanley for evolving arbitrary neural networks.
HyperNEAT (Hypercube-based NEAT) is a method developed by Kenneth O. Stanley utilizing NEAT. It is a technique for evolving large-scale neural networks using the geometric regularities of the task domain.
ES-HyperNEAT (Evolvable-substrate HyperNEAT) is a method developed by Sebastian Risi and Kenneth O. Stanley utilizing HyperNEAT. It is a technique for evolving large-scale neural networks using the geometric regularities of the task domain. In contrast to HyperNEAT, the substrate is itself evolved during the run rather than designed by hand. This spares the user some initial work and often yields a more suitable substrate.

The library is extensible and designed to make it easy to move between experimental domains.

Getting started

This section briefly describes how to install and run experiments.

Installation Guide

First, make sure you have the dependencies installed: numpy, neat-python, graphviz, matplotlib and gym.
All the above can be installed using pip.
Next, download the source code and install the package by running pip install . from the root folder. Now you're able to use PUREPLES!

Experimenting

How to experiment using NEAT will not be described, since this is the responsibility of the neat-python library.

Setting up an experiment for HyperNEAT (a minimal code sketch follows these steps):

  1. Define a substrate with input nodes and output nodes as lists of tuples. The hidden nodes are a list of lists of tuples, where each inner list represents a layer; the first list is the topmost layer, the last the bottommost.
  2. Create a configuration file defining the various NEAT-specific parameters used for the CPPN.
  3. Define a fitness function that sets the fitness of each genome. This is where the CPPN and the ANN are constructed for each generation - use the create_phenotype_network method from the hyperneat module.
  4. Create a population with the configuration file made in (2).
  5. Run the population with the fitness function made in (3) and the configuration file made in (2). The output is the genome solving the task, or the one that came closest to solving it.
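
A minimal sketch of these steps, assuming the Substrate class lives in pureples.shared.substrate and that create_phenotype_network takes the CPPN and the substrate; check the bundled examples for the exact signatures:

    # Minimal HyperNEAT sketch; the import paths, the Substrate constructor and the
    # create_phenotype_network signature are assumptions based on the steps above.
    import neat
    from pureples.shared.substrate import Substrate                     # assumed location
    from pureples.hyperneat.hyperneat import create_phenotype_network   # assumed location

    # 1. Substrate: input/output nodes as (x, y) tuples, hidden nodes as layers.
    input_coordinates = [(-1.0, -1.0), (0.0, -1.0), (1.0, -1.0)]
    output_coordinates = [(0.0, 1.0)]
    hidden_coordinates = [[(-0.5, 0.0), (0.5, 0.0)]]                    # one hidden layer
    substrate = Substrate(input_coordinates, output_coordinates, hidden_coordinates)

    # 2. NEAT configuration for the CPPN (the file name here is hypothetical).
    config = neat.Config(neat.DefaultGenome, neat.DefaultReproduction,
                         neat.DefaultSpeciesSet, neat.DefaultStagnation,
                         "config_cppn_xor")

    # 3. Fitness function: build the CPPN, then the substrate ANN, for each genome.
    def eval_genomes(genomes, config):
        for genome_id, genome in genomes:
            cppn = neat.nn.FeedForwardNetwork.create(genome, config)
            net = create_phenotype_network(cppn, substrate)
            output = net.activate([1.0, 0.0, 1.0])                      # toy query
            genome.fitness = output[0]                                  # placeholder fitness

    # 4. + 5. Create and run the population.
    pop = neat.Population(config)
    winner = pop.run(eval_genomes, 300)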

Setting up an experiment for ES-HyperNEAT: use the same setup as for HyperNEAT, except for the following (a sketch of the differences follows this list):

  • Not declaring hidden nodes when defining the substrate.
  • Declaring ES-HyperNEAT specific parameters.
  • Using the create_phenotype_network method residing in the es_hyperneat module when creating the ANN.
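
A sketch of only the parts that differ from the HyperNEAT example above. The ESNetwork class and most parameter names are assumptions (only max_depth and iteration_level appear elsewhere on this page), so consult the bundled es_hyperneat examples for the exact set:

    # ES-HyperNEAT differences only; the ESNetwork constructor and most parameter
    # names below are assumptions - check the bundled es_hyperneat examples.
    import neat
    from pureples.shared.substrate import Substrate             # assumed location
    from pureples.es_hyperneat.es_hyperneat import ESNetwork    # assumed location

    # No hidden nodes: ES-HyperNEAT evolves them from the CPPN itself.
    input_coordinates = [(-1.0, -1.0), (0.0, -1.0), (1.0, -1.0)]
    output_coordinates = [(0.0, 1.0)]
    substrate = Substrate(input_coordinates, output_coordinates)

    # ES-HyperNEAT specific parameters (values are illustrative).
    params = {"initial_depth": 2,
              "max_depth": 3,
              "variance_threshold": 0.03,
              "band_threshold": 0.3,
              "iteration_level": 1,
              "division_threshold": 0.5,
              "max_weight": 8.0,
              "activation": "sigmoid"}

    def eval_genomes(genomes, config):
        for genome_id, genome in genomes:
            cppn = neat.nn.FeedForwardNetwork.create(genome, config)
            network = ESNetwork(substrate, cppn, params)         # assumed constructor
            net = network.create_phenotype_network()             # es_hyperneat version
            genome.fitness = net.activate([1.0, 0.0, 1.0])[0]    # placeholder fitness

Population creation and the run itself are identical to the HyperNEAT sketch.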

If the experiment is defined by an OpenAI Gym environment, experimenting is even easier: the shared module contains a file called gym_runner that does most of the work. Given the number of generations, the environment to run, a configuration file and a substrate, the relevant runner takes care of the population, the fitness function and so on, as sketched below.
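
A hedged sketch of the ES-HyperNEAT runner, reusing the config, params and substrate from the sketches above; the run_es argument order mirrors the call visible in the traceback quoted in the issues below, and the environment, step and trial counts are illustrative:

    # Gym runner sketch; argument order follows the run_es call in the traceback
    # below: (gens, env, max_steps, config, params, substrate, max_trials).
    import gym
    from pureples.shared.gym_runner import run_es   # run_hyper for plain HyperNEAT

    env = gym.make("CartPole-v1")                   # illustrative environment
    winner, stats = run_es(200, env, 200, config, params, substrate, max_trials=200)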

Please refer to the sample experiments included for further details on experimenting.

Comments
  • The query_cppn function returns a value of discontinuity range


    Hi,

    I have a small suggested improvement regarding the query_cppn function in hyperneat.py. In lines 85-88, a value below the threshold is replaced with 0.0, so the range [-0.2, 0.2] of values drops out in this implementation.

    However, the original paper (http://axon.cs.byu.edu/Dan/778/papers/NeuroEvolution/stanley3**.pdf) says "The magnitude of weights above this threshold are scaled to be between zero and a maximum magnitude in the substrate." on page 8.

    Thus, I suggest changing the query_cppn function so that it returns a value in the continuous range [-max_val, max_val].
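
    A hypothetical sketch of the scaling being proposed; the threshold and maximum magnitude are illustrative, and this is not the library's actual code:

    # Hypothetical sketch: weights whose magnitude is below the threshold are still
    # zeroed, but the remaining magnitudes are rescaled so the output covers the
    # continuous range [-max_val, max_val] with no dead band.
    def scale_cppn_output(w, threshold=0.2, max_val=5.0):
        if abs(w) < threshold:
            return 0.0                                    # no connection expressed
        sign = 1.0 if w > 0 else -1.0
        return sign * (abs(w) - threshold) / (1.0 - threshold) * max_val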

    opened by yamatakeru 14
  • Config always finds 5 inputs. [RuntimeError: Expected 840 inputs, got 5]


     ****** Running generation 0 ******
    
    Traceback (most recent call last):
      File "c:\Users\Silver\.vscode\extensions\ms-python.python-2020.2.64397\pythonFiles\ptvsd_launcher.py", line 48, in <module>
        main(ptvsdArgs)
      File "c:\Users\Silver\.vscode\extensions\ms-python.python-2020.2.64397\pythonFiles\lib\python\old_ptvsd\ptvsd\__main__.py", line 432, in main
        run()
      File "c:\Users\Silver\.vscode\extensions\ms-python.python-2020.2.64397\pythonFiles\lib\python\old_ptvsd\ptvsd\__main__.py", line 316, in run_file
        runpy.run_path(target, run_name='__main__')
      File "C:\Users\Silver\AppData\Local\Programs\Python\Python37\lib\runpy.py", line 263, in run_path
        pkg_name=pkg_name, script_name=fname)
      File "C:\Users\Silver\AppData\Local\Programs\Python\Python37\lib\runpy.py", line 96, in _run_module_code
        mod_name, mod_spec, pkg_name, script_name)
      File "C:\Users\Silver\AppData\Local\Programs\Python\Python37\lib\runpy.py", line 85, in _run_code
        exec(code, run_globals)
      File "g:\Emulators\ML AI open AI\env2.py", line 51, in <module>
        winner = run(200, env)[0]
      File "g:\Emulators\ML AI open AI\env2.py", line 37, in run
        winner, stats = run_es(gens, env, 200, config, params, sub, max_trials=200)
      File "C:\Users\Silver\AppData\Local\Programs\Python\Python37\lib\site-packages\pureples\shared\gym_runner.py", line 50, in run_es
        pop.run(eval_fitness, gens)
      File "C:\Users\Silver\AppData\Local\Programs\Python\Python37\lib\site-packages\neat\population.py", line 89, in run
        fitness_function(list(iteritems(self.population)), self.config)
      File "C:\Users\Silver\AppData\Local\Programs\Python\Python37\lib\site-packages\pureples\shared\gym_runner.py", line 25, in eval_fitness
        net = network.create_phenotype_network()
      File "C:\Users\Silver\AppData\Local\Programs\Python\Python37\lib\site-packages\pureples\es_hyperneat\es_hyperneat.py", line 46, in create_phenotype_network
        hidden_nodes, connections = self.es_hyperneat()
      File "C:\Users\Silver\AppData\Local\Programs\Python\Python37\lib\site-packages\pureples\es_hyperneat\es_hyperneat.py", line 151, in es_hyperneat
        root = self.division_initialization((x, y), True)
      File "C:\Users\Silver\AppData\Local\Programs\Python\Python37\lib\site-packages\pureples\es_hyperneat\es_hyperneat.py", line 110, in division_initialization
        c.w = query_cppn(coord, (c.x, c.y), outgoing, self.cppn, self.max_weight)
      File "C:\Users\Silver\AppData\Local\Programs\Python\Python37\lib\site-packages\pureples\hyperneat\hyperneat.py", line 84, in query_cppn
        w = cppn.activate(i)[0]
      File "C:\Users\Silver\AppData\Local\Programs\Python\Python37\lib\site-packages\neat\nn\feed_forward.py", line 14, in activate
        raise RuntimeError("Expected {0:n} inputs, got {1:n}".format(len(self.input_nodes), len(inputs)))
    RuntimeError: Expected 840 inputs, got 5
    

    I ran this through the debugger and found that, at some point, some random float values replace the number of inputs that initially gets set.

    I could even see that at some point during execution the correct number of inputs was actually used.

    I've been trying to track down the cause, and I've come to the conclusion that something has to be wrong in the module.

    For some context, I took one of the examples and attempted to configure it to run a gym-retro environment.

    As you can see, though, the only thing stopping me is the inputs being messed up somehow.

    If you need more information please let me know.

    opened by SilverDash 12
  • Question about discrete gym runner observation space


    Hi!

    Very cool project, thanks for making it available. I have a toy project I am working on with Gym for function approximation: the observation space is discrete-valued, consisting of 12 integers, and the action space is also discrete-valued, with three integers used to determine the correct agent action based on the sequence of 12 integers.

    So does pureples support discrete observation and action spaces, and would the cartpole experiment make for a good starting point for this?

    Thanks in advance!

    opened by pablogranolabar 5
  • Line 169 in es_hyperneat.py is different from the algorithm in the original paper


    Hi,

    The following part seems to be different from the algorithm in https://eplex.cs.ucf.edu/papers/risi_alife12.pdf.

    160 | for i in range(self.iteration_level):  # Explore from hidden.
    161 |     for x, y in unexplored_hidden_nodes:
    162 |         root = self.division_initialization((x, y), True)
    163 |         self.pruning_extraction((x, y), root, True)
    164 |         connections2 = connections2.union(self.connections)
    165 |         for c in connections2:
    166 |             hidden_nodes.add((c.x2, c.y2))
    167 |         self.connections = set()
    168 | 
    169 | unexplored_hidden_nodes -= hidden_nodes
    

    According to the pseudocode on page 47, line 169 should be indented one more level. Also, unexplored_hidden_nodes will always end up as the empty set if we remove hidden_nodes from it (because hidden_nodes is always a superset of unexplored_hidden_nodes). I think it needs to be corrected as follows.

    160 | for i in range(self.iteration_level):  # Explore from hidden.
    161 |     for x, y in unexplored_hidden_nodes:
    162 |         root = self.division_initialization((x, y), True)
    163 |         self.pruning_extraction((x, y), root, True)
    164 |         connections2 = connections2.union(self.connections)
    165 |         for c in connections2:
    166 |             hidden_nodes.add((c.x2, c.y2))
    167 |         self.connections = set()
    168 | 
    169 | -  unexplored_hidden_nodes -= hidden_nodes
    169 | +      unexplored_hidden_nodes = hidden_nodes - unexplored_hidden_nodes
    
    opened by yamatakeru 3
  • ES-HyperNEAT for OpenAI-Gyms SpaceInvader


    Hey,

    First of all, you did great work: easy to use and understand! What I am trying to do is use ES-HyperNEAT to exploit the geometric information in the pixels of an Atari game's frames. OpenAI Gym gives an observation space of (210, 160, 3); I have downsized it to (84, 84, 1) without colours. That is 7056 input nodes instead of 100800.

    Now the problem is that the outputs of the substrate's output nodes are always zero.

    The input layout is:

    for y in range(1,85):
    	for x in range(1,85):
    		input_coordinates.append((x , y))
    

    Is there some configuration in the CPPN I should watch out for? Is the substrate too large, or is there a maximum range for node placement in the substrate (e.g. just between -1 and 1)?
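
    For reference, a hypothetical way to place the 84x84 inputs in the conventional [-1, 1] HyperNEAT substrate range; whether PUREPLES actually requires this range is exactly the question above:

    # Hypothetical normalisation of an 84x84 pixel grid into [-1, 1] coordinates;
    # this illustrates the question above rather than a confirmed requirement.
    input_coordinates = []
    for y in range(84):
        for x in range(84):
            input_coordinates.append((-1.0 + 2.0 * x / 83.0,
                                      -1.0 + 2.0 * y / 83.0))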

    Thanks in advance!

    opened by Multiv4c 3
  • Question about inference with evolved ANN


    Hi @ukuleleplayer,

    I've been working on a PUREPLES-based project with your gym runner, but I can't find any resources on inference with an evolved ANN. It looks like the phenotype gets pickled and the model saved whenever the reward is +1., but what format is that model in, and how do I deploy it for inference tasks?
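
    A hypothetical sketch of loading such a pickled phenotype and querying it; the file name and observation are illustrative, and the saved object is assumed to expose the same activate interface used during evolution:

    # Hypothetical inference sketch: load a pickled phenotype and query it.
    import pickle

    with open("winner_network.pkl", "rb") as f:      # illustrative file name
        net = pickle.load(f)

    observation = [0.0, 0.0, 0.0, 0.0]               # must match the substrate inputs
    action_values = net.activate(observation)        # same interface as in evolution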

    What I want to do is implement an additional loop whenever a +1. reward is found, to test it n more times to see if it has generalized to other examples.

    And does it make sense to restart an episode on each of those saved pickles for subsequent runs?

    TIA!

    opened by pablogranolabar 2
  • Connection's __eq__ does not return a boolean in es_hyperneat.py.


    Hi.

    Connection's __eq__ is expected to return a boolean, but it returns a tuple (float, float, float, bool, float, float, float). However, the library seems to be working correctly at first glance.
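
    A minimal illustration of the kind of fix being described; the class and field names here are hypothetical, not the library's actual attributes:

    # Hypothetical illustration: compare the field tuples and return the resulting
    # boolean, rather than returning a tuple of values from __eq__.
    class Connection:
        def __init__(self, x1, y1, x2, y2, weight):
            self.x1, self.y1, self.x2, self.y2, self.weight = x1, y1, x2, y2, weight

        def __eq__(self, other):
            return (self.x1, self.y1, self.x2, self.y2, self.weight) == \
                   (other.x1, other.y1, other.x2, other.y2, other.weight)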

    Tentatively, I will create a PR.

    opened by yamatakeru 2
  • Missing list() in es_hyperneat.py / unsupported operand type(s) for +: 'range' and 'range'


    Hi, I think that in es_hyperneat.py, on lines 30-31, the ranges for the input and output nodes should be converted to lists with list().

    Otherwise, return neat.nn.RecurrentNetwork(input_nodes, output_nodes, node_evals) throws an error: unsupported operand type(s) for +: 'range' and 'range'.

    Without that change, scripts like es_hyperneat_xor_large.py do not work.

    The same problem seems to appear in hyperneat.py.
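
    A small illustration of the underlying Python 3 behaviour and the suggested conversion; the sizes and variable names here are purely illustrative:

    # In Python 3, range objects cannot be concatenated with "+", which is what
    # triggers the error above; converting them to lists first avoids it.
    number_of_inputs, number_of_outputs = 2, 1       # illustrative sizes
    input_nodes = list(range(number_of_inputs))
    output_nodes = list(range(number_of_outputs))
    combined = input_nodes + output_nodes            # plain list concatenation works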

    opened by DaKnick 2
  • The relationship between ESNetwork.activations and max_depth


    Could anyone please explain the following line of code in es_hyperneat.py?

            # Number of layers in the network.
            self.activations = 2 ** params["max_depth"] + 1
    

    Thank you very much.

    opened by lester1027 1
  • network.create_phenotype_network() executing for more than 30 minutes when input and output sizes are (49360,) and (1024,) respectively


    I have been trying to use ES-HyperNEAT on a custom environment. The input size of the ES network is (49360,) and the output size is (1024,). The net = network.create_phenotype_network() call sometimes takes more than 30 minutes to execute for a single genome. Does this mean that the larger the network's input and output sizes are, the longer it takes to create the network?

    Is there any solution for this?

    opened by Abdul-Wahab-mc 1
  • Multiple activation function support for ES-HyperNEAT?


    Hi @ukuleleplayer

    I've noticed that all of the examples use sigmoid activation functions for ES-HyperNEAT; is the use of multiple activation functions at the per-neuron level possible with PUREPLES?

    Or any activation function other than sigmoid for ES-HyperNEAT?

    TIA

    opened by pablogranolabar 1
  • Question about run_hyper()


    Hi, first of all thank you for your library, it's great! I am going through the code trying to understand what each step does, with regard to the pole-balancing environment. There is one point that really leaves me confused: in run_hyper(), it seems we create the population and test it for one trial, then again for 10 trials, and then for max_trials trials. Is there any reason to do that? Thanks

    opened by ValerioB88 0
Releases (v0.0-alpha)
Owner
Adrian Westh
Data Conscious Software Developer