Artificial Neural Network

artswan

Member
I was thinking of experimenting with an artificial neural network using Picaxe-08M or Picaxe-14M as each individual neuron "cell". I was going to start with a small network of 3 "cells" and instead of trying to use serial communications between the cells (serin wait problem) I am going to use analog, more like nature does. ADC would be for input (dendrites) and PWM for output (axon). Also, I would use onboard eeprom to store levels of each input and output as necessary. By using analog levels I could "weight" the inputs and outputs, just as real neurons do, and there would be near instantaneous influence of affected inputs and outputs of other "cells" rather than wait for serial digital transmission and interpretation. I was also thinking of starting off with simple 0 - 255 single byte analog levels to ease storage in eeprom.
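The per-cell update described above (read 8-bit ADC levels, scale by byte weights from EEPROM, drive a PWM output) can be sketched in Python to check the integer maths. The convention that a weight of 128 means unity gain is my own illustrative assumption, not anything from the PICAXE manuals:

```python
def neuron_cycle(adc_inputs, weights, bias=0):
    """One update of an analogue 'cell': 8-bit ADC readings in,
    single-byte weights (0-255, with 128 treated as unity gain)
    pulled from EEPROM, and an 8-bit PWM duty out."""
    total = bias
    for adc, w in zip(adc_inputs, weights):
        # Integer maths only, as an 08M would have to do it;
        # dividing by 128 undoes the weight scaling.
        total += (adc * w) // 128
    # Clamp to the 0-255 range a single-byte PWM duty can express
    return max(0, min(255, total))

# Three weighted inputs feeding one output, as in the proposed 3-cell net
duty = neuron_cycle([100, 40, 0], [128, 64, 128])
```

The whole cycle stays within word-sized integer arithmetic, which is roughly what an 08M could manage natively.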

Do you think it would work? Any constructive thoughts on the subject are welcome.
:)
 

BeanieBots

Moderator
This is something I've been toying with for years.
I've had a few half-hearted attempts but keep running into the same problems.

The only reason 'neurons' have the ability to 'learn' the required function is because the 'maths' is embedded within the input weights and the fine tuning associated with the transfer function (usually a sigmoid such as 1/(1+e^-x)).

Using a basic 8-bit input resolution might yield something useful, but I don't think it would offer sufficient resolution to warrant more than a two-layer net, and probably only a few neurons wide. The bigger concern is the internal mathematics. Not only is floating point required, but also very high resolution. This can be worked around by using a PICAXE which supports I2C and adding a floating-point co-processor such as the uFPU V3, which also offers two 12-bit inputs.

Bottom line: IMHO, 8-bit input with 16-bit fixed-point internals will not offer anything of much use, but please try and let me (us) know how you get on.

To save you some time (if you were thinking of trying this), I did try using a look-up table for the sigmoid function. It was completely hopeless with anything other than functions which had very clear distinctions between inputs. Unfortunately, such a function doesn't really require a network, as the solution is already obvious. (I only used 16-bit fixed-point internals; anything above that I deemed too slow to be worthwhile.)
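For anyone wanting to see where a fixed-point lookup table struggles, here is a rough Python sketch of a 16-bit (Q8.8) sigmoid table; the table size and input range are illustrative assumptions, not values from the attempt described above:

```python
import math

FRAC_BITS = 8                     # Q8.8 fixed point: value = raw / 256
SCALE = 1 << FRAC_BITS

def build_sigmoid_table(steps=256, x_range=8.0):
    """Quantise sigmoid(x) over [-x_range, +x_range] into 16-bit
    fixed-point (Q8.8) entries, as a lookup table might store them."""
    table = []
    for i in range(steps):
        x = -x_range + 2.0 * x_range * i / (steps - 1)
        y = 1.0 / (1.0 + math.exp(-x))   # the logistic sigmoid
        table.append(round(y * SCALE))
    return table

table = build_sigmoid_table()
# The curve is steepest near x = 0, so that is where quantisation
# jumps between adjacent entries are largest.
steps_between = [abs(table[i + 1] - table[i]) for i in range(len(table) - 1)]
```

The jumps of several fixed-point counts between adjacent entries near the middle of the curve hint at why coarse resolution blurs anything without very clear distinctions between inputs.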

I never got as far as putting in any back-propagation, so I'd be interested in any thoughts you have about that.
 
Last edited:

tiscando

Senior Member
That's a very interesting subject.
To make a leech's brain, 350 08Ms will do: a 350-core analogue PICAXE processor with about 80kB of shared program and EEPROM space, over 100kB of RAM and lots of IO, taking up at least a 10cm cube (or roughly a 25x25cm, ATX-motherboard-size PCB), and costing about £600.

To make a honey bee's brain, 950,000 08Ms will do.

To make a human brain, 100 billion 08Ms will do. :eek: That would be a 100,000,000,000-core analogue PICAXE processor with 25,600GB of shared program and EEPROM space, over 30,000GB of RAM, taking up more than a 50-metre cube of space (or a 4km x 4km PCB), with loads of IO too, and it would cost £200 billion! :eek: Which is way too much and not worth doing.

However many cells it ends up with (tens, rather than thousands, millions or billions), it is a project worth doing! :)
 
Last edited:

artswan

Member
BeanieBots - I was just going to use the 8-bit input just to get the darn thing up and running, then try to move up to 10-bit, splitting words into separate bytes for eeprom, etc. If that worked, and I wanted to go beyond, then I figure I will have to pull out my PicBasic Pro or C and jump over to regular PIC or Atmel chips. But I want to learn to crawl before I try to run. :) Thanks for your encouragement.
 

BeanieBots

Moderator
You should be able to get something approaching the level of a wasp (excluding flight, which I guess would be an ant??) with just 15 neurons (assuming infinite resolution).

A 28X2 + uFPU V3 would make a viable neuron, but that would not be cost-viable for a dozen or so of them. I'm sure it can be done with an 08M; it's just that I've tried, failed and given up. Another 'mind' having a look is refreshing and has in itself inspired me to have another think about it.

Perhaps just working with a single quadrant transfer function might be an option?
 

artswan

Member
tiscando - At 950,000 08Ms I demand one hell of a quantity discount!! LOL :)

If I can ever build something that would simply duplicate that science article from 2004 in which some science guys got a few thousand rat brain cells to fly an F-22 flight simulator, then my purpose in life will have been fulfilled. :) ;)
 

inglewoodpete

Senior Member
For those interested, I (with a team of others) have been interfacing a network of PICAXEs with a real rat neural network over the past couple of years.

The rat neurons are growing in-vitro at Georgia Tech in Atlanta, USA. The "body" of the semi-living being is currently showing in an Art Gallery, Itau Cultural, in Sao Paulo, Brazil. The "body" consists of a network of 32 PICAXE-powered robotic objects arranged in a matrix. There are 37 PICAXEs networked together in the gallery, connected to the neurons via the internet. We hope to exhibit Silent Barrage in Europe early next year.

For a PICAXE perspective on the project see my previous thread: PICAXE forms part of robotic body of semi-living art

Trying to simulate neural activity in an 08M or 14M would be a challenge. I think the number of inter-neural connections required would call for something larger, with a whole lot more programme space. The learning and randomness "disciplines" of the neural relationships would be quite a challenge to simulate.

Peter
http://silentbarrage.com
 

John West

Senior Member
The numbers and letters "18M2" come to the attention of this decidedly human neural network. Perhaps we can finally declare an exception to the validity of the old phrase: "Cheap-fast-good. Pick two."

Soon... soon...
 

chris bate

New Member
tiscando - At 950,000 08Ms I demand one hell of a quantity discount!! LOL :)

If I can ever build something that would simply duplicate that science article from 2004 in which some science guys got a few thousand rat brain cells to fly an F-22 flight simulator, then my purpose in life will have been fulfilled. :) ;)

never mind the discount, what about the 95kW of power (19,000A at 5V) it would take to run them ... assuming they each use 20mA and no loss along the way!..:)
 

tiscando

Senior Member
never mind the discount, what about the 95kW of power (19,000A at 5V) it would take to run them ... assuming they each use 20mA and no loss along the way!..:)
My 08M @ 8MHz (s/n OHT 0727, firmware 9.2) running a varying loop of PWMOUT 2, 249, xxx commands uses only 1.7mA with no IO connected. :) Here, I would use the PULSIN command for the data inputs rather than READADC, as it would be more accurate and would eliminate the need for analogue buffer circuitry and hence slow transition times (and may use fewer milliamps). If PULSIN is used, then the mega-multi-core PICAXEs I described earlier would instead be xxx-core pulse-width-modulated PICAXE processors.

350 08Ms, if each consumes just 2mA on average (some above, some below), would draw 0.7A at 5V, and would require a 12V 1A wall wart plus a 2A switching regulator, which is enough. Make sure you have stripboards with altogether more board space than a WATX motherboard.

950,000 08Ms would draw an average of 1,900A at 5V, and would need 20 IPS-1000TNS 1000W PSUs, each with a maximum 83A output on the 12V line, plus 33 LT1070 5A switching regulators per PSU, for a total of 660 switching regulators. :eek: This is really not worth doing, but the specs are amazing.

Some more specs on 950,000 08Ms: a 950,000-core pulse-width-modulated PICAXE processor with 243MB of shared program and data memory, over 300MB RAM, loads of IO, and would cost at least £1,000,000. Still not worth doing. :rolleyes:
 
Last edited:

BeanieBots

Moderator
OK, so we've established that it would require a budget of >£1e6 and a dedicated nuclear power station to replicate a human brain made from 08Ms:p

So what would be realistically possible?
What would be a sensible number of neurons to aim for?
This is where we need to do some maths:eek: Oh... where did everybody go?:rolleyes:

The "knowledge" stored in a neuron is limited by its mathematical resolution, so there is a clear limitation there for starters. Even with an infinite number of neurons, the network will reach saturation quite early on (adding more neurons brings no advantage).

All the neurons need to be networked, so that would make the use of pulsin very tricky. Without complex hardware, that limits us to analogue inputs, which fixes the input resolution at either 8 or 10 bits.

The inputs need to be scaled; this adds a further finite limit to avoid overflows, unless complex maths is employed, which may well render it too slow to be of any practical use.

In addition to that, we need to run the sum of all the inputs through a non-linear transfer function. This is usually (but doesn't have to be) a sigmoid of the type 1/(1+e^-x). Again, resolution will be a key factor here.

Long story short, my estimates indicate that it should be possible to create a three-layer network with no more than four neurons in each layer, giving a maximum of a 4-4-4 network but more realistically a 2-3-2 structure. Anything beyond that would offer no advantage.
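For reference, a 2-3-2 forward pass is tiny when written out. This floating-point Python sketch (with arbitrary illustrative weights, not trained values) shows the two layer evaluations a PICAXE implementation would need to approximate in fixed point:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights, biases):
    """One fully connected layer: weighted sum per neuron, then sigmoid."""
    return [sigmoid(sum(w * v for w, v in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

def forward_232(x):
    # 2 inputs -> 3 hidden neurons -> 2 outputs.
    # The weights here are arbitrary; a real net would learn them.
    hidden = layer(x, [[0.5, -0.3], [0.8, 0.1], [-0.6, 0.9]], [0.0, 0.1, -0.1])
    return layer(hidden, [[1.0, -1.0, 0.5], [0.2, 0.4, -0.7]], [0.0, 0.0])

out = forward_232([0.25, 0.75])
```

Only twelve multiply-accumulates and five sigmoid evaluations per pass, which makes the resolution of each operation, rather than the operation count, the limiting factor.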

Anyone up for the challenge?
Or anyone even have a problem for a PICAXE based neural network to solve?
Maybe something simple. A common (and actually quite difficult) problem to solve is the X-OR function, but I was hoping somebody might have something a little more imaginative and maybe even of some use.
 
Last edited:

boriz

Senior Member
What's the sigmoid for? I thought the inputs were just weighted, summed and compared to a threshold.
 

BeanieBots

Moderator
The sigmoid is the most essential part of how a neuron works.

All inputs when combined together produce (via some function) the required output.
Each input contributes to the output via its own function, which for a finite range is of the type y = mx + c.
EVERY input requires its own function.
The function is determined by where it sits on the sigmoid. Where it sits is determined by the absolute input value scaled by its weighting.

The resultant effect of any input is then:
delta y = a + bx + cx^2 + dx^3 + ex^4 + ... up to infinity, though limited by mathematical resolution in reality.

Without the sigmoid it would just be a linear equation and not actually able to solve anything.

How would you solve a non-linear expression (which most real-life problems are) using only linear equations?

Take for example the simple problem of a two-input X-OR gate.
If A and B are the inputs and C is the output, and you only have linear scaling, you need to find two multipliers x and y which give C for every input combination:
A*x + B*y = C
You should be able to see clearly that this is impossible; no pair of multipliers satisfies all four rows of the X-OR truth table.
However, pass the weighted inputs through a non-linear transfer function in a hidden layer and a solution does exist; providing that non-linearity is exactly the sigmoid's job.

The advantage the 1/(1+e^-x) sigmoid has over the hyperbolic functions is that its derivative can be expressed in terms of its own output: sigma' = sigma*(1-sigma). This means that once you have calculated the forward-propagation value, the figure needed for the backward-propagation (learning) pass comes almost for free. It makes the internal calculations significantly faster, as the exponential only has to be evaluated once per pass.
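To be precise, for the logistic sigmoid sigma(x) = 1/(1+e^-x) the derivative is sigma(x)*(1-sigma(x)), so the learning pass can reuse the value already computed on the forward pass rather than evaluating the exponential again. A quick numerical check:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def deriv_from_output(s):
    # s is the already-computed forward value sigmoid(x); no second
    # exponential evaluation is needed for the learning pass.
    return s * (1.0 - s)

# Compare against a central-difference estimate of the true derivative
for x in (-2.0, 0.0, 1.5):
    s = sigmoid(x)
    numeric = (sigmoid(x + 1e-6) - sigmoid(x - 1e-6)) / 2e-6
    assert abs(deriv_from_output(s) - numeric) < 1e-6
```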
 

boriz

Senior Member
Hmm. Sounds awfully complicated. To date, my experiments have allowed only 1 and 0 for inputs and outputs, and a floating point -1 to +1 as the weights (multiplier). When the sum of the weighted inputs is equal to or greater than 1, then the output is 1, otherwise the output is zero. So far this has worked for me. I’ve never even heard of a sigmoid. Though I agree, a single node XOR function would be impossible. I guess my neural networks are the primitive type :)
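boriz's threshold scheme can be sketched directly. The Python below (the weights and search grid are illustrative choices of mine) confirms that a single such unit handles AND, but that no weight pair makes it compute XOR:

```python
def threshold_unit(inputs, weights, threshold=1.0):
    """boriz's scheme: 0/1 inputs, weighted sum, output 1 iff sum >= threshold."""
    total = sum(w * v for w, v in zip(weights, inputs))
    return 1 if total >= threshold else 0

# AND is linearly separable, so one unit can do it
assert all(threshold_unit([a, b], [0.6, 0.6]) == (a & b)
           for a in (0, 1) for b in (0, 1))

# XOR is not: a brute-force sweep over a weight grid finds no single
# unit that matches the truth table
grid = [i / 10.0 for i in range(-20, 21)]
xor_solvable = any(
    all(threshold_unit([a, b], [w1, w2]) == (a ^ b)
        for a in (0, 1) for b in (0, 1))
    for w1 in grid for w2 in grid)
assert not xor_solvable
```

This is the single-node XOR impossibility mentioned above: XOR needs w1 >= 1, w2 >= 1 and w1 + w2 < 1 simultaneously, which no pair of weights can satisfy.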
 

BeanieBots

Moderator
It simply doesn't work without the sigmoid (or at least a non-linear function).
This explains it better than I can.
http://www.cs.nott.ac.uk/~gxk/courses/g5aiai/006neuralnetworks/neural-networks.htm

EDIT:
When you say "So far this has worked for me." what exactly "worked"?
How did you 'teach' it?
What problems did it solve?

A linear network would be capable of solving linear problems, but there are so few linear problems that can't be solved with a simple equation that I don't see the point of using a network for them.

Neural networks really aren't complex as far as an overview of their functionality goes (except the maths, if you want to UNDERSTAND them). The real beauty is their SIMPLICITY. You don't NEED to understand them. The only issue with a PICAXE implementation is the need to understand the consequences of resolution restrictions, which unfortunately DOES require a slightly more detailed understanding of how they work.
 
Last edited:

kranenborg

Senior Member
Could my SerialPower network play a useful role here? A SerialPower network essentially implements a distributed processing system on 08M-based nodes, where all nodes may have equal rights to send messages on a commonly shared bus... It is a loosely coupled network where the nodes decide for themselves what to do with the network information.

Link: http://www.picaxeforum.co.uk/showthread.php?t=7694

/Jurjen
 

BeanieBots

Moderator
Yes, I think it could.
It would certainly open up the ability for each 'neuron' to be able to pass a high resolution value to any other 'neuron'.

I can see it working well once the network has learned a function. Where I see a gotcha is in the learning process. Learning requires many thousands of passes over the sample data set. Serial comms would take a lot longer than hardware (e.g. analogue read) communication. This might make it time-prohibitive.

One thought I had was to use something like a VB simulation running on a powerful PC to teach a network, and then hard-code a PICAXE version with the resulting weights. Your network would certainly be of use in such a scheme. However, it would mean that the network could not learn any more, which somewhat defeats the object of using it. An interesting option though. Certainly worthy of further investigation.
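The "train on a PC, then burn the weights in" idea boils down to quantising floating-point weights into EEPROM bytes. A minimal Python sketch, assuming a clipping range of ±2.0 for the weights (an arbitrary choice of mine):

```python
def quantise_weights(float_weights, w_min=-2.0, w_max=2.0):
    """Map PC-trained floating-point weights onto single bytes (0-255)
    for storage in PICAXE EEPROM; w_min/w_max is an assumed clip range."""
    out = []
    for w in float_weights:
        w = max(w_min, min(w_max, w))               # clip outliers
        out.append(round((w - w_min) / (w_max - w_min) * 255))
    return out

def dequantise(byte, w_min=-2.0, w_max=2.0):
    """The inverse mapping the PICAXE would apply (in fixed point)."""
    return w_min + byte / 255.0 * (w_max - w_min)
```

With this scheme each weight survives the round trip to within about 0.008, which gives a concrete feel for how much resolution the hard-coded network would lose relative to the PC simulation that trained it.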
 

chris bate

New Member
here's a thought, since 900,XX &%(%& 08Ms is out of the question,
how about using something similar to what one of the SETI projects uses...

why not write a program that uses an easily accessible component, eg the ICQ components, to form an interconnected network across multiple PCs around the world, each simulating a neuron or a handful of neurons...

personally I've got three PCs constantly running and connected to the internet; I'm sure there are a few people out there with servers who would be glad to donate some processing power

you could even have certain groups of neurons performing certain tasks
 

BeanieBots

Moderator
why not write a program that uses an easily accessible component, eg the ICQ components, to form an interconnected network across multiple PCs around the world, each simulating a neuron or a handful of neurons...
One of the most significant advantages that a neural network has over more 'conventional' processing methods is the speed of data transfer within the network. Even hardwired PCs would simply be too slow, which is also the main concern with kranenborg's suggestion.

However, hooking up several PC's all running a LARGE number of neurons would be an interesting 'thought'. (pun intended).

For me, the problem would be what type of problem such a system could solve, and what (who) would mediate. A bit like posting a question on a forum: you would get lots of answers coming back, some good, some bad, and you would need to sort them out yourself. You would also then need to 'teach' each one with the "correct" answer.:eek:
 

chris bate

New Member
One of the most significant advantages that a neural network has over more 'conventional' processing methods is the speed of data transfer within the network. Even hardwired PCs would simply be too slow, which is also the main concern with kranenborg's suggestion.

However, hooking up several PC's all running a LARGE number of neurons would be an interesting 'thought'. (pun intended).

For me, the problem would be what type of problem such a system could solve, and what (who) would mediate. A bit like posting a question on a forum: you would get lots of answers coming back, some good, some bad, and you would need to sort them out yourself. You would also then need to 'teach' each one with the "correct" answer.:eek:
one would imagine that it would be very useful for complex pattern recognition, eg Forex price patterns to start with, cross-data-type correlation, advanced data mining ...

control over the whole thing would have to rest with one or several people; as for teaching... it could be an interesting adventure

how many neurons would it require for simple English comprehension? That would be a massive step in the right direction for a bit of useful data mining
 

BeanieBots

Moderator
I'd guess. :D
And that's exactly what a neural network does. It has a guess:(
Then you tell it how good its guess was.:)
Next time it makes an educated guess and again you tell it how well it did.
After that, it usually gets it spot on:D

There are many real life (and published and in use) neural network solutions based around PC's and high level software but let's have a few examples of what a PICAXE solution might be good for.

One simple task I've toyed with is simple robotic object avoidance.
Maybe for micromouse to get the absolute optimum turning point when approaching a wall?
 
Last edited:

hippy

Ex-Staff (retired)
In addition to neural networks there is fuzzy logic, and the two can be combined. With fuzzy logic being the 'big thing' a while ago, there are quite a few examples of it on 8-bit micros, so tools and algorithms could probably be adapted for the PICAXE.

There are two things in any such project though: the ability to do it, and the ability to do it well (or quickly). The first is most important; how well the second is achieved dictates its practical usefulness. Don't get bogged down in the premature optimisation of How Best To Do It; proof of concept should come first.
 