Building AGI on the Tokio Runtime.

Note: I am not, nor do I claim to be, an expert in machine learning or neuroscience. This will become amply evident as you read on.

A few weeks ago I decided to build AGI. Since OpenAI, DeepMind and xAI haven't delivered yet with the smartest researchers and billions in compute, I have to take matters into my own hands.

I bought a couple of books on Artificial Intelligence and Neuroscience and got started:

Suppose you are racing a Formula 1 car. You are in last place. You are a worse driver in a worse car. If you follow the same strategy as the cars in front of you, pit at the same time and pick the same tires, you will certainly fail. The only chance you have is to pick a different strategy.

The same goes for me. If I go down the transformer / deep learning route I am outgunned. The only hope I have is to try something completely novel (or, more accurately, think I'm working on something novel only to discover it was done in the 1970s).

Concrete Strategy

For reasons we'll cover in the following sections, I decided to go down the fully biologically inspired path. I would build a fully asynchronous neural network and run it in a data center.

Neurons and Brains

When I started reading neuroscience it became quite evident that we don't really understand how the brain works. It's complex and complicated, and the books I read model neuronal firing as partial differential equations. But before that, a small primer.

At a high level a neuron consists of 3 main components.

The dendrites on the left act as inputs to the neuron from other neurons (we'll call those "pre-synaptic neurons"). The cell body has a membrane which acts as a barrier between the inside of the neuron and the goop surrounding it. The axon on the right is connected to the dendrites of other downstream neurons (we'll call those "post-synaptic neurons").

When a neuron receives a signal from a pre-synaptic neuron, the potential in the neuron's cell body increases. If this potential rises past some threshold voltage (relative to the surrounding goop), it triggers a response: the neuron fires a signal down its axon to the post-synaptic neurons and resets its internal voltage. After firing, a neuron has a rest period called the "refractory" period, during which it does not react to stimuli. After the refractory period the neuron is ready to fire again.

This is massively simplified. There are different types of neurons and a bunch of chemistry, but I'm going to hand-wave those away and call them "implementation details". In fact, I'm going to assume that the continuous nature of the signals fired is an implementation detail due to the substrate, i.e. the underlying biological wetware, and is not functionally important. There is no reason why signals can't be binary.

Conductance-Based Models

I didn't mention earlier that the cell body leaks potential into the surrounding goop over time. In 1963 Alan Hodgkin and Andrew Huxley received the Nobel Prize in Physiology or Medicine for describing this as a dynamical system governed by a series of nonlinear differential equations. They modelled the relationship between the flow of ions across the neuron's cell membrane and the voltage of the cell. The experimental work for this was done on the squid giant axon, because it was large enough for an electrode to be placed inside it.

Again, I'm going to hand-wave the chemistry away, call it an implementation detail, and use a simplified "leaky integrate and fire" model.

C_m · dV_m(t)/dt = I(t) − V_m(t)/R_m

This is also a differential equation relating the capacitance, resistance and current across the neuron membrane to the voltage of the cell. But really it boils down to:

  1. Pre-synaptic impulses increase membrane potential
  2. Time decreases membrane potential (exponentially, as shown below)
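
With no input current (I(t) = 0), the equation above integrates to an exponential decay toward the resting potential (taken to be zero here): V_m(t) = V_m(t₀) · e^(−(t − t₀)/(R_m C_m)). The factor 1/(R_m C_m) is the decay constant k in the pseudocode below.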

Or in pseudocode:

let k = ...     // some decay constant
let delta = ... // some potential difference constant
loop {
    if signal.next() {
        let now = time::now()
        membrane_potential = membrane_potential * e^-k(now - previous_firing)
        membrane_potential += delta
        if membrane_potential > firing_threshold {
            fire()
            membrane_potential = 0
            previous_firing = now
        }
    }
}

Encoding Information in Neuronal Signals

It looks like the jury is still out on how exactly neurons encode information. Namely, is information encoded in neuron timings, i.e. when a neuron fires, or in neuron firing rates, i.e. the rate at which a neuron fires? There's a bunch of statistics and math that's been developed to talk intelligently about neuronal firing rates, but I'm going to assume that I don't care, because the firing rates are going to be emergent from the underlying neuron timings anyway.

Design

Meditating on the design of a neuron described above and on modern artificial neural networks like transformers, a few questions jump out at you.

Even if a network of these neurons is not being driven externally, there are certain configurations which allow signals to propagate in cycles in your neuronal graph. There are configurations which sustain themselves without needing external stimuli to drive them, while at the same time not having distinct outputs.

This is far-fetched, but it feels like something that might implement consciousness rather than a pure feed-forward system.

Implementation

I decided to implement this network using something like an Actor Model on the Tokio runtime. Tokio is a fast asynchronous runtime for Rust and exposes primitives which make my life easier, such as broadcast channels to implement synapses. It would also be simple to hot-swap it for a non-local version if I want to run my AI across multiple machines.
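
As a minimal illustration of the primitive this leans on (a standalone example, not code from the project), a Tokio broadcast channel fans every message out to all subscribers, which is exactly the axon-to-many-dendrites shape we want:

use tokio::sync::broadcast;

#[tokio::main]
async fn main() {
    // One sender (the axon), many receivers (downstream dendrites).
    let (tx, mut rx1) = broadcast::channel::<&str>(16);
    let mut rx2 = tx.subscribe();

    tx.send("impulse").unwrap();

    // Every subscriber sees every message.
    assert_eq!(rx1.recv().await.unwrap(), "impulse");
    assert_eq!(rx2.recv().await.unwrap(), "impulse");
}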

Neurons

Neurons are implemented pretty much as described above.

pub struct Neuron {
    #[allow(unused)]
    index: usize,
    membrane_potential: u32,
    axon: broadcast::Sender<Impulse>,
    dendrites: Vec<broadcast::Receiver<Impulse>>,
}

A broadcast::Sender is used to broadcast signals to the post-synaptic neurons, and the signals from the pre-synaptic neurons, received on the broadcast::Receivers, are used to drive the neuron.

An Impulse is just an empty tuple struct for now – we are assuming that the signal potential isn't important (or is constant) and that information is encoded purely in the timing of firings (and consequently the firing rates).
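
A minimal definition consistent with that would be a unit struct; the derives below are my assumption (the broadcast channel requires the payload to be Clone):

#[derive(Clone, Copy, Debug)]
pub struct Impulse;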

To run the neuron we merge the dendrite receivers into a single stream and keep popping from it, applying the leaky integrate and fire method:

impl Neuron {
    async fn start(mut self) {
        // Convert each receiver to a stream of messages
        let streams = self
            .dendrites
            .drain(..)
            .map(|mut rx| {
                Box::pin(async_stream::stream! {
                        loop {
                            match rx.recv().await {
                                Ok(msg) => yield msg,
                                Err(broadcast::error::RecvError::Closed) => break,
                                Err(broadcast::error::RecvError::Lagged(skipped)) => {
                                    // debug!("Receiver lagged by {} messages", skipped);
                                    continue;
                                }
                            }
                        }
                })
            })
            .collect::<Vec<_>>();

        // Combine all streams into a single unified stream
        let mut merged = stream::select_all(streams);
        let mut last_fire = Instant::now();

        // Process each message as it arrives from any receiver
        while let Some(impulse) = merged.next().await {
            let firings = FIRINGS.fetch_add(1, Ordering::Relaxed);
            // Implement the "Integrate and fire" method.
            let now = Instant::now();
            // if last_fire + Duration::from_nanos(self.initiation_delay) > now {
            // Hard-coding this for now.
            if last_fire + ABSOLUTE_REFRACTORY_PERIOD > now {
                self.membrane_potential = self.membrane_potential + 1;
                if self.membrane_potential > FIRING_THRESHOLD {
                    self.discharge(Impulse);
                    self.membrane_potential = 0;
                    last_fire = now;
                }
            }
        }
    }

    fn discharge(&self, impulse: Impulse) {
        if let Err(e) = self.axon.send(impulse) {
            println!("{}", FIRINGS.fetch_add(0, Ordering::Relaxed));
            panic!()
        }
    }
}

Brains

Brains are modelled as a bag of neurons with a set of inputs and outputs.

pub struct Brain {
    neurons: Vec<Neuron>,
    inputs: Vec<broadcast::Sender<Impulse>>,
    outputs: Vec<broadcast::Receiver<Impulse>>,
}

The synapses for the neurons are already constructed beforehand, as a brain is built from DNA.

impl From<&Dna> for Brain {
    fn from(dna: &Dna) -> Self {
        let mut neurons = Vec::new();
        let mut broadcasts = Vec::new();

        // Step 1: Initialize neurons and broadcast channels
        for index in 0..Dna::num_neurons() {
            let (tx, rx) = broadcast::channel(CHANNEL_CAPACITY);
            neurons.push(Neuron {
                index,
                membrane_potential: 0,
                axon: tx.clone(),
                dendrites: Vec::new(),
            });
            broadcasts.push((tx, rx));
        }
        let connectivity = dna.connectivity();

        for (src, row) in connectivity.iter().enumerate() {
            for (dest, &value) in row.iter().enumerate() {
                if src == dest {
                    // TODO: do not allow neurons to wire back to themselves
                    continue;
                }
                if value == 1 {
                    let receiver = broadcasts[src].0.subscribe();
                    neurons[dest].dendrites.push(receiver);
                }
            }
        }

        let inputs = dna
            .inputs()
            .iter()
            .map(|input_id| broadcasts[*input_id].0.clone())
            .collect::<Vec<_>>();

        let outputs = dna
            .outputs()
            .iter()
            .map(|output_id| broadcasts[*output_id].0.subscribe())
            .collect::<Vec<_>>();

        Brain {
            neurons,
            inputs,
            outputs,
        }
    }
}
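
The Organism code further down calls brain.start(), which isn't shown in the post. Presumably it spawns each neuron as its own Tokio task and hands the input senders and output receivers back to the caller; a minimal sketch under that assumption:

impl Brain {
    // Hypothetical: spawn every neuron as a task on the current runtime
    // and give the caller the channel ends it needs to drive the brain.
    fn start(self) -> (Vec<broadcast::Sender<Impulse>>, Vec<broadcast::Receiver<Impulse>>) {
        for neuron in self.neurons {
            tokio::spawn(neuron.start());
        }
        (self.inputs, self.outputs)
    }
}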

DNA

The average human brain has around 85 billion neurons and over 100 trillion synaptic connections. If every neuron is connected to every other neuron you get n(n−1)/2 synapses. Even in a sparsely connected brain you still get an unfeasibly large number of synapses for my 64 GB of RAM (neurons are thought to have 1,000-100,000 connections each, depending on the type of neuron, its location, etc.).

The sheer number of neurons and synapses means that they are not deterministically encoded in your DNA. Instead, your DNA defines rules for protein synthesis which build these neurons and synapses.

This seems hard. I'm going to go down the road of the C. elegans nematode, which has exactly 302 neurons. I'm not sure whether its synapses are hard-wired, but mine will be.

pub struct Dna<const NUM_NEURONS: usize, const NUM_INPUT: usize, const NUM_OUTPUT: usize> {
    potential_decay_ns: f64,
    threshold: u16,
    initiation_delay_ns: u64,
    connectivity: Box<[[u8; NUM_NEURONS]; NUM_NEURONS]>,
    // point to the input neurons of the connectivity matrix.
    input_neurons: [usize; NUM_INPUT],
    // point to the output neurons of the connectivity matrix.
    output_neurons: [usize; NUM_OUTPUT],
}

We define a hard-coded connectivity matrix in our brain's DNA. The inputs and outputs point to specific neurons in the brain, irrespective of positioning.
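
The Brain::from impl above also relies on a few Dna accessors that aren't shown (and elsewhere the code refers to a bare Dna, so I assume the const parameters are pinned down by a type alias). The bodies below are my guesses, inferred purely from the call sites:

impl<const NUM_NEURONS: usize, const NUM_INPUT: usize, const NUM_OUTPUT: usize>
    Dna<NUM_NEURONS, NUM_INPUT, NUM_OUTPUT>
{
    pub const fn num_neurons() -> usize {
        NUM_NEURONS
    }

    pub fn connectivity(&self) -> &[[u8; NUM_NEURONS]; NUM_NEURONS] {
        &self.connectivity
    }

    pub fn inputs(&self) -> &[usize; NUM_INPUT] {
        &self.input_neurons
    }

    pub fn outputs(&self) -> &[usize; NUM_OUTPUT] {
        &self.output_neurons
    }
}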

Games

Our brain is going to try to get better at playing a simple game I created for it. The game is basically Snake. Your score increases every time you eat food. You can only go up, down, left and right. A higher score is better.

#[derive(Clone, Copy, PartialEq, Debug)]
pub enum Direction {
    Up,
    Down,
    Left,
    Right,
}

#[derive(Clone, PartialEq)]
pub struct Position {
    x: i32,
    y: i32,
}

pub struct Game {
    pub width: usize,
    pub height: usize,
    pub snake: Position,
    pub direction: Direction,
    pub food: Position,
    pub(crate) score: usize,
    pub game_over: bool,
}

Organism

In order for our brain to play this game, it needs to be wrapped up in an organism. The organism is responsible for driving the inputs of the brain by reading the game state, and for playing the game using the brain's outputs.

The brain is constantly driven by the organism feeding it the game's state, even if that state hasn't changed (much like you keep seeing an image in front of you even if it hasn't changed).

pub struct Organism {
    pub(crate) dna: Dna,
    inputs: Vec<broadcast::Sender<Impulse>>,
    outputs: Vec<broadcast::Receiver<Impulse>>,
}

impl Organism {
    pub fn new(dna: Dna) -> Organism {
        let brain = Brain::from(&dna);
        let (inputs, outputs) = brain.start();
        Self {
            dna,
            inputs,
            outputs,
        }
    }

    // Given a 2D representation of the world state,
    // stimulates the appropriate input neurons.
    pub(crate) fn drive_input(&self, state: Vec<Vec<u8>>) {
        for (i, row) in state.iter().enumerate() {
            for (j, val) in row.iter().enumerate() {
                match *val {
                    0 => continue,
                    _ => {
                        let index = i * row.len() + j;
                        self.inputs
                            .get(index)
                            .unwrap()
                            .send(Impulse)
                            .expect(&format!("Failed at index {}", index));
                    }
                }
            }
        }
    }
...
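
The post only shows the input side. For the output side, one plausible (entirely hypothetical) approach is to periodically drain each output receiver, count the impulses, and steer the snake toward whichever output neuron fired most; the method name and the mapping below are my own, not the post's code:

impl Organism {
    pub(crate) fn read_output(&mut self) -> Option<Direction> {
        let directions = [Direction::Up, Direction::Down, Direction::Left, Direction::Right];
        let mut counts = [0usize; 4];
        for (i, rx) in self.outputs.iter_mut().enumerate().take(4) {
            // Drain whatever impulses arrived since the last poll.
            while rx.try_recv().is_ok() {
                counts[i] += 1;
            }
        }
        // Pick the most active output neuron; do nothing if none fired.
        let (best, &max) = counts.iter().enumerate().max_by_key(|(_, c)| **c)?;
        if max == 0 {
            None
        } else {
            Some(directions[best])
        }
    }
}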

Training

Ok, how the hell do we train this thing? Stochastic gradient descent with back-propagation won't work here (or if it does, I have no idea how to implement it).

Instead I resorted to genetic algorithms. Genetic algorithms are a class of optimisation algorithms inspired by nature, combining Darwinian selection based on individual fitness with a small probability of genetic mutation to help explore the domain's search space and escape from local minima.

Doing this for our Tokio brains requires a few steps:

  1. Initialise a population of N DNA with random connectivity matrices.
  2. Create brains from the DNA, put those brains in organisms and let them play our game.
  3. The N individuals with the highest scores are bred with each other, resulting in a new population.
  4. Breeding works by splitting the connectivity matrix into sections and randomly picking sections from each parent (along with any other relevant genes); see the sketch after the training loop below.
  5. Repeat.
  6. Profit.

The training loop itself:

    pub fn train(&mut self) {
        info!("Starting training.");
        let mut population = self.initialize_population();
        while self.epoch < self.max_epoch {
            let runtime = tokio::runtime::Runtime::new().unwrap();
            runtime.block_on(async {
                info!("Starting epoch: {}", self.epoch);
                let mut handles = vec![];
                for (id, dna) in population.iter().enumerate() {
                    let dna = dna.clone();
                    let handle = tokio::spawn(async move { Simulation::simulate(id, dna).await });
                    handles.push(handle);
                }
                let population_with_scores = join_all(handles)
                    .await
                    .into_iter()
                    .filter_map(|handle| match handle {
                        Ok(dna_and_score) => Some(dna_and_score),
                        Err(e) => {
                            error!("{}", e);
                            None
                        }
                    })
                    .collect::<Vec<_>>();
                let top_score = population_with_scores
                    .iter()
                    .map(|pop_with_score| pop_with_score.1)
                    .max();
                info!("Epoch: {}, Top Score: {:?}", self.epoch, top_score);
                population = self.recreate_top_performers(population_with_scores);
                println!("{}", population.get(0).unwrap());
                self.epoch += 1;
            });
        }
    }
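
Step 4 above glosses over the crossover itself. A minimal sketch of what breeding two connectivity matrices could look like, where the "sections" are whole rows (the chunking, the mutation scheme and the use of the rand crate are all my assumptions, not the post's actual code):

use rand::Rng;

fn breed<const N: usize>(
    a: &[[u8; N]; N],
    b: &[[u8; N]; N],
    mutation_rate: f64,
) -> Box<[[u8; N]; N]> {
    let mut rng = rand::thread_rng();
    let mut child = Box::new([[0u8; N]; N]);
    for i in 0..N {
        // Pick each row ("section") from one of the two parents at random.
        child[i] = if rng.gen_bool(0.5) { a[i] } else { b[i] };
    }
    // Small chance of flipping any given synapse on or off (mutation).
    for row in child.iter_mut() {
        for synapse in row.iter_mut() {
            if rng.gen_bool(mutation_rate) {
                *synapse ^= 1;
            }
        }
    }
    child
}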

Results

Nothing. Nada. I couldn't get this to work at all past a score of 3, which would then disappear in the next epoch!

For reference, a human easily gets arbitrarily high scores. My brains have 512 neurons with up to ~13,000 synapses. I'm not sure if this is due to the shortage of neurons, but I doubt it.

If I had to guess I would say the culprits are:

  1. The huge number of impulses being generated means that Tokio struggled to process them all in a timely manner, and these neurons are timing-sensitive.
  2. Trying to do optimisation over a connectivity matrix by breaking it down into small chunks probably doesn't work.

Mother nature has beaten me once more. I'm going to put this project on ice for now. I'm going to continue reading neuroscience and pick it back up if / when inspiration strikes.

