Surveyor Robotics Forum

Neural Net (finally) ...

admin - 01/27/09 at 3:18pm

Okay - I've put this one off long enough.  Now that the new vblob() code seems to be stable, we need to add the long-promised neural net functions.
 
The basic plan is to start with console functions, and then follow up with support in C and Scheme.  The core code is here -  
    http://code.google.com/p/surveyor-srv1-firmware/source/browse/trunk/blackfin/srv/neural.c
 
This code is configured to match 8x8 pixel patterns, which means that we have a net with 64 inputs.  This isn't a hard limitation, and we can build larger nets, but we'll use this as a starting point.
 
If you look at the neural.c code, you'll see that we already have some pre-programmed 8x8 pixel patterns, represented like this
 
pattern_t pattern[] = {
    {  // solid ball
    {    0,    0,    0, 1024, 1024,    0,    0,    0,   // 64 inputs, one per pixel
        0, 1024, 1024, 1024, 1024, 1024, 1024,    0,
        0, 1024, 1024, 1024, 1024, 1024, 1024,    0,
     1024, 1024, 1024, 1024, 1024, 1024, 1024, 1024,
     1024, 1024, 1024, 1024, 1024, 1024, 1024, 1024,
        0, 1024, 1024, 1024, 1024, 1024, 1024,    0,
        0, 1024, 1024, 1024, 1024, 1024, 1024,    0,
        0,    0,    0, 1024, 1024,    0,    0,    0 },
    { 1024, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 }   // 16 target outputs - this pattern trains output #0
    },
    ...   // additional built-in patterns follow
};
 
That means of storage is very inefficient, so the first step in modifying the code will be to collapse the pattern storage to 8x8 = 64 bits (8 bytes).  With that change, the above pattern collapses to
    { 0x18, 0x7E, 0x7E, 0xFF, 0xFF, 0x7E, 0x7E, 0x18 }
 
Initially, we'll support 16 different 8x8 patterns, and will allow overwrite of the built-in patterns using 'np'  followed by the pattern number 0-F, followed by the 8 bytes in hex format.  So to program the above pattern into neuron "bin" #0, the command will be
    np0187E7EFFFF7E7E18
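 
For anyone who wants to generate these command strings programmatically, here's a rough C sketch of packing a 64-entry 0/1024 pattern down to 8 bytes and building the matching "np" string.  The helper names (pack8x8, make_np_command) are made up for illustration and are not part of the firmware -
 
#include <stdio.h>

/* Collapse a 64-entry pattern (0 or 1024 per cell, row-major, as in the
   original pattern_t) down to 8 bytes - one byte per row, MSB = leftmost. */
void pack8x8(const int cells[64], unsigned char out[8])
{
    int row, col;
    for (row = 0; row < 8; row++) {
        unsigned char b = 0;
        for (col = 0; col < 8; col++)
            if (cells[row * 8 + col] > 0)
                b |= (unsigned char)(0x80 >> col);
        out[row] = b;
    }
}

/* Build the console command: "np" + pattern number (0-F) + 16 hex digits. */
void make_np_command(int pat, const unsigned char bytes[8], char cmd[20])
{
    int i, n = sprintf(cmd, "np%X", pat & 0x0F);
    for (i = 0; i < 8; i++)
        n += sprintf(cmd + n, "%02X", bytes[i]);
}
 
Running pack8x8 on the solid ball pattern above gives { 0x18, 0x7E, 0x7E, 0xFF, 0xFF, 0x7E, 0x7E, 0x18 }, and make_np_command then produces the string np0187E7EFFFF7E7E18.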
 
Also, we'll need a function to display stored patterns
    nd #
 
I'm not certain yet whether we will have the addition of a new pattern automatically trigger a retraining of the net, or whether we'll use a specific command for training, e.g.
    nt1000     (neural-net train 1000 * 1000 iterations)
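 
To make the training step concrete, here's a minimal single-layer sketch - floating point, delta rule, one-hot targets - of what "train for N iterations on the stored patterns" can mean.  This is illustration only; the net in neural.c (and its fixed-point scoring) may be structured quite differently -
 
#include <math.h>
#include <stdlib.h>

#define NUM_NPATTERNS 16

static double weight[16][64];        /* one weight row per output neuron */
static double bias[16];

/* expand an 8-byte stored pattern into 64 inputs of 0.0 / 1.0 */
static void unpack(const unsigned char pat[8], double in[64])
{
    int r, c;
    for (r = 0; r < 8; r++)
        for (c = 0; c < 8; c++)
            in[r * 8 + c] = (pat[r] & (0x80 >> c)) ? 1.0 : 0.0;
}

static double out_neuron(int k, const double in[64])
{
    int i;
    double sum = bias[k];
    for (i = 0; i < 64; i++)
        sum += weight[k][i] * in[i];
    return 1.0 / (1.0 + exp(-sum));  /* sigmoid output, 0..1 */
}

/* "ni" - start from small random weights */
void net_init(void)
{
    int k, i;
    for (k = 0; k < 16; k++) {
        bias[k] = 0.0;
        for (i = 0; i < 64; i++)
            weight[k][i] = ((double)rand() / RAND_MAX - 0.5) * 0.1;
    }
}

/* "nt" - cycle through the stored patterns; pattern p drives output p
   toward 1 and every other output toward 0 (delta rule) */
void net_train(const unsigned char npattern[NUM_NPATTERNS * 8], int iterations)
{
    int it, k, i;
    double in[64];
    const double rate = 0.1;
    for (it = 0; it < iterations; it++) {
        int p = it % NUM_NPATTERNS;
        unpack(&npattern[p * 8], in);
        for (k = 0; k < 16; k++) {
            double y = out_neuron(k, in);
            double target = (k == p) ? 1.0 : 0.0;
            double delta = rate * (target - y) * y * (1.0 - y);
            bias[k] += delta;
            for (i = 0; i < 64; i++)
                weight[k][i] += delta * in[i];
        }
    }
}
 
A 0-100 match score would then just be each output scaled, e.g. (int)(y * 100).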
 
Three other commands:
 
One to initialize the net
    ni
 
One to test the network with a user-specified pattern
    nx1E7C7EFDFF7E7B1C
 
One to use blob coordinates to scale a color-segmented portion of the live image down to 8x8 dimensions and pattern match against the stored patterns
    nb   color#  blob#
 
So, in summary, here are the commands we will add initially -
 
np - store a new pattern
nd - display a stored pattern
ni  - initialize the network with random weights
nt  - train the network from stored patterns
nx  - test the network with sample pattern
nb  - match pattern against specific blob from "vb"
 
Comments ?
 
 

michael - Reply #1 - 01/27/09 at 7:17pm

My only comment is that I'm looking forward to seeing this develop. Do you have an ETA for the console code integration?

admin - Reply #2 - 01/27/09 at 7:45pm

This shouldn't take more than a day to code.  If there aren't any distractions, I'll probably start on it tomorrow, as I finally have a pretty clear idea of what approach to take.

mshapiro - Reply #3 - 01/27/09 at 8:13pm

I'm also looking forward to it.  I am hoping that it will find intersections for the line maze that I want to navigate.  It seems like this should be the best way to do it.  I haven't actually used neural nets before.  Do the 'nc' and 'nb' commands return an integer index to the matching pattern in the stored array?

admin - Reply #4 - 01/27/09 at 9:15pm

Exactly.
 
The net will have 64 inputs and 16 outputs, corresponding to how well each of the stored patterns matches the input pattern.   My plan is to dump out the score (0 - 100) for each of the pattern matches, though it would make sense to first display the index of the best match.  So for example, if the best matching pattern was #5, the data for patterns 0-15 might look like
    15  10  7  20  15  85 ... ... 15
 
So we might display the result from nx, nb, or nc in this format -
 
##nx 5    15 10 7 20 15 85 ... ...
 
Once we move these into C and Scheme calls, the return value would be the index of the best matching pattern, but it will be useful to see the raw data in the console version of the functions.
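 
So the eventual C/Scheme call boils down to an argmax over the 16 scores.  A minimal sketch, where score[] just stands in for whatever the firmware computes -
 
#include <stdio.h>

/* Given the 16 match scores, return the index of the best match and
   optionally dump the raw scores in the console format proposed above. */
int neural_best_match(const int score[16], int dump)
{
    int k, best = 0;
    for (k = 1; k < 16; k++)
        if (score[k] > score[best])
            best = k;
    if (dump) {
        printf("##nx %d    ", best);
        for (k = 0; k < 16; k++)
            printf("%d ", score[k]);
        printf("\r\n");
    }
    return best;
}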
 

admin - Reply #5 - 01/28/09 at 3:20pm

Quick update ...  
 
np - store a new pattern
nd - display a stored pattern
ni  - initialize the network with random weights
nt  - train the network from stored patterns
nx  - test the network with sample pattern
nb  - match pattern against specific blob from "vb"
 
From the above list, I have the np, nd, ni, nt and nx functions coded, and they seem to be working correctly.  All that remains is to code nb, which is a bit more complicated, but once I have a working version, I'll post code and usage instructions.
 
By the way, we have a complete 8x8 font pattern for letters and numbers in  
    http://code.google.com/p/surveyor-srv1-firmware/source/browse/trunk/blackfin/srv/font8x8.h

mshapiro - Reply #6 - 02/01/09 at 8:35am

Now that the neural net code is included in the posted firmware, can you please post a short tutorial on just what the proper steps are to utilize it?

admin - Reply #7 - 02/01/09 at 8:43am

The neural net code is included in the latest firmware build (020109 or later).  There are 16 built-in patterns -
 
unsigned char npattern[NUM_NPATTERNS * 8] = {
    0x18, 0x7E, 0x7E, 0xFF, 0xFF, 0x7E, 0x7E, 0x18,  // solid ball
    0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF,  // solid square
    0x18, 0x18, 0x18, 0xFF, 0xFF, 0x18, 0x18, 0x18,  // cross
    0xFF, 0xFF, 0xC3, 0xC3, 0xC3, 0xC3, 0xFF, 0xFF,  // box
    0x18, 0x7E, 0x66, 0xC3, 0xC3, 0x66, 0x7E, 0x18,  // circle
    0xC3, 0xC3, 0x24, 0x18, 0x18, 0x24, 0xC3, 0xC3,  // xing
    0x18, 0x3C, 0x66, 0xC3, 0xC3, 0x66, 0x3C, 0x18,  // diamond
    0x00, 0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0x00, 0x00,  // horizontal line
    0x3C, 0x3C, 0x3C, 0x3C, 0x3C, 0x3C, 0x3C, 0x3C,  // vertical line
    0x03, 0x03, 0x04, 0x18, 0x18, 0x20, 0xC0, 0xC0,  // slash
    0xC0, 0xC0, 0x20, 0x18, 0x18, 0x04, 0x03, 0x03,  // backslash
    0x18, 0x18, 0x3C, 0x3C, 0x66, 0x66, 0xC3, 0xC3,  // up arrow
    0xC3, 0xC3, 0x66, 0x66, 0x3C, 0x3C, 0x18, 0x18,  // down arrow
    0xC0, 0xF0, 0x3C, 0x07, 0x07, 0x3C, 0xF0, 0xC0,  // right arrow
    0x03, 0x0F, 0x3C, 0xE0, 0xE0, 0x3C, 0x0F, 0x03,  // left arrow
    0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,  // blank
};
 
You can display a pattern (0-f) using the "nd" command, e.g.
nd5
##nd 5
 ** **             ** **
 ** **             ** **
       **       **      
          ** **          
          ** **          
       **       **      
 ** **             ** **
 ** **             ** **
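 
The display itself is just the 8 pattern bytes rendered bit by bit, roughly like this (a sketch only - the firmware's exact console spacing may differ) -
 
#include <stdio.h>

/* Render one stored 8x8 pattern the way "nd" does: each set bit becomes
   a "**" cell, each clear bit stays blank. */
void nd_print(const unsigned char pat[8])
{
    int row, col;
    for (row = 0; row < 8; row++) {
        for (col = 0; col < 8; col++)
            printf((pat[row] & (0x80 >> col)) ? " **" : "   ");
        printf("\r\n");
    }
}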
 
You can replace a pattern using the "np" command
np57830303030307800
##np 5
nd5
##nd 5
   ** ** ** **          
       ** **            
       ** **            
       ** **            
       ** **            
       ** **            
   ** ** ** **          
 
Look at http://code.google.com/p/surveyor-srv1-firmware/source/browse/trunk/blackfin/srv/font8x8.h and you'll find a full set of 8x8 ASCII patterns -
 

 
Once you have your patterns,  
 
1.  send "ni" to initialize the network with random weights
2.  then send "nt" to train the network on the stored set of patterns
3.  then use "nx" to test the network against various patterns
 
For example, after adding the 'I' character as pattern 5, I tried
nx3030303030303030
##nx
   0   0   1   0   0  97   0   0  16   0   0   0   0   0   0   5
 
and you can see that it matches best against pattern 5
 
The next step is to add the "nb" command for matching blobs against patterns.  The problem I'm having is that the blob needs to be scaled into an 8x8 pattern to match, and I'm having an issue with aspect ratios, as demonstrated here  
 

 
These patterns will be okay, because they occupy the full width and height of a template.  The problem will occur with characters such as 'I' or numbers such as '1' which don't occupy the full width, or '-' which doesn't occupy the full height.  I'm open to suggestions on how to handle this.  Once we have scaling, we can directly connect the blob search to the neural pattern matching.  At that point, we'll add functions to the C and Scheme interpreters for accessing these features.
 

admin - Reply #8 - 02/02/09 at 11:05pm

I have been thinking about the blob scaling for neural net processing, and have decided that for the present, we won't worry about matching patterns with non-square aspect ratios.  So the plan is to scale any arbitrary blob to an 8x8 pattern for input to the neural net.  So in the case of these blobs
 

 
##vb0
1344 - 114 156 78 198  
1221 - 50 96 73 195  
932 - 174 212 77 192  
641 - 230 264 82 186    
 
where dimensions were typically around 40 wide by 120 tall, we'll squish the 1:3 aspect ratio down to 1:1 in an 8x8 image.  That means 5 x 15 pixels will subsample down to each of the 8x8 cells (i.e. the inputs to the neural net), averaging those 75 pixels to come up with a single value.  As a result, the pattern matcher will think it's looking at an image like this -
 

 
Hopefully, I will have a chance to code this tomorrow, replacing the current built-in patterns with numeric patterns '0' - '9' and maybe some math symbols ('+'  'x'  '/'  '=', using '~' for '-') to populate the remaining slots.  I suspect we will be expanding the limit on the number of patterns before long.
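 
For illustration, here's a rough sketch of that averaging subsample.  It assumes a hypothetical per-pixel segmentation mask (1 = pixel matched the blob's color bin); the real firmware works on its own frame and segmentation buffers, so treat the names and layout here as placeholders -
 
/* Scale a blob's bounding box (x1..x2, y1..y2) down to an 8x8 bit pattern
   by averaging each cell's source pixels.  "seg" is a hypothetical mask,
   one byte per pixel, 1 = pixel belongs to the blob's color bin. */
void blob_to_pattern(const unsigned char *seg, int img_width,
                     int x1, int x2, int y1, int y2, unsigned char out[8])
{
    int bw = x2 - x1 + 1, bh = y2 - y1 + 1;
    int row, col, x, y;
    for (row = 0; row < 8; row++) {
        out[row] = 0;
        for (col = 0; col < 8; col++) {
            /* source rectangle feeding this cell - e.g. 5 x 15 pixels
               for a 40 x 120 blob */
            int sx0 = x1 + (col * bw) / 8, sx1 = x1 + ((col + 1) * bw) / 8;
            int sy0 = y1 + (row * bh) / 8, sy1 = y1 + ((row + 1) * bh) / 8;
            int count = 0, total = 0;
            for (y = sy0; y < sy1; y++)
                for (x = sx0; x < sx1; x++) {
                    count += seg[y * img_width + x];
                    total++;
                }
            /* cell turns "on" if more than half of its source pixels matched */
            if (total > 0 && 2 * count > total)
                out[row] |= (unsigned char)(0x80 >> col);
        }
    }
}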

wheagy - Reply #9 - 02/03/09 at 3:15pm

Hi,
 
I'm able to replicate your above example of adding the char I.  After training, I test it with
nx3030303030303030
and get
##nx
   0   0   0   0   0  59   6   0  11   0   0   0   0   1   0   4
It finds pattern 5 as the best match, but not with as much certainty...59 compared to your 97.  What could cause this?  Is it the training?  Is it possible to train longer than the default produced by simply entering nt (10000)?  Or is it irrelevant as long as it finds the best match?
 
Thanks...Win

admin - Reply #10 - 02/03/09 at 4:13pm

Glad to see that someone else is trying this...
 
Make certain to first initialize the neural net weights by sending "ni"
 
Each time you send nt, it will train for 10000 iterations.  The training results are dumped out on completion, with the patterns tested against the network, so you'll see a diagonal line of matches.  The value of each match should be 95+.  I've never seen values less than 101 after a single round of training, so I wonder why you are getting different results.
 
Sorry I haven't been able to post the neural blob function yet - events have conspired against me today. It's still at the top of my list.

wheagy - Reply #11 - 02/04/09 at 3:18pm

Ok, I reran tonight and got better results.  The first time I tried, it matched on #5 with a 78.  I did everything identically a second time (including initializing) and it was 92.  I know neural nets can have varying results, so is that what I'm seeing?  Better than yesterday at 59.  But then I ran a third time and got this...
 
##nd 5
    ** ** ** **
      ** **
      ** **
      ** **
      ** **
      ** **
    ** ** ** **
 
##ni - init neural net
##nt - train 10000 iterations
 102 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000
 000 102 000 000 000 000 000 000 000 000 000 000 000 000 000 000
 000 000 102 000 000 000 000 000 000 000 000 000 000 000 000 000
 000 000 000 102 000 000 000 000 000 000 000 000 000 000 000 000
 000 000 000 000 102 000 000 000 000 000 000 000 000 000 000 000
 000 000 000 000 000 102 000 000 000 000 000 000 000 000 000 000
 000 000 000 000 000 000 102 000 000 000 000 000 000 000 000 000
 000 000 000 000 000 000 000 102 000 000 000 000 000 000 000 000
 000 000 000 000 000 000 000 000 102 000 000 000 000 000 000 000
 000 000 000 000 000 000 000 000 000 102 000 000 000 000 000 000
 000 000 000 000 000 000 000 000 000 000 102 000 000 000 000 000
 000 000 000 000 000 000 000 000 000 000 000 102 000 000 000 000
 000 000 000 000 000 000 000 000 000 000 000 000 102 000 000 000
 000 000 000 000 000 000 000 000 000 000 000 000 000 102 000 000
 000 000 000 000 000 000 000 000 000 000 000 000 000 000 102 000
 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 101
##nx
   6   0   0   0   0  19   0   0   2   0   0   0   0   0   0   0
 
And a final time...
 
##ni - init neural net
##nt - train 10000 iterations
 102 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000
 000 102 000 000 000 000 000 000 000 000 000 000 000 000 000 000
 000 000 102 000 000 000 000 000 000 000 000 000 000 000 000 000
 000 000 000 102 000 000 000 000 000 000 000 000 000 000 000 000
 000 000 000 000 102 000 000 000 000 000 000 000 000 000 000 000
 000 000 000 000 000 102 000 000 000 000 000 000 000 000 000 000
 000 000 000 000 000 000 102 000 000 000 000 000 000 000 000 000
 000 000 000 000 000 000 000 102 000 000 000 000 000 000 000 000
 000 000 000 000 000 000 000 000 102 000 000 000 000 000 000 000
 000 000 000 000 000 000 000 000 000 102 000 000 000 000 000 000
 000 000 000 000 000 000 000 000 000 000 102 000 000 000 000 000
 000 000 000 000 000 000 000 000 000 000 000 102 000 000 000 000
 000 000 000 000 000 000 000 000 000 000 000 000 102 000 000 000
 000 000 000 000 000 000 000 000 000 000 000 000 000 102 000 000
 000 000 000 000 000 000 000 000 000 000 000 000 000 000 102 000
 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 101
##nx
   0   0   0   0   0  62   0   0  44   0   0   0   7   2   0   4
 
 
Thoughts?  For nx I'm using
nx3030303030303030  
 
This is a very cool piece of work.  I'll be interested to see how creatively it can be used.
 
Win

admin - Reply #12 - 02/04/09 at 3:48pm

Try running "ni" each time before you run "nt".
 
I'm getting pretty consistent results -  
 
ni
##ni - init neural net
np57830303030303078
##np 5
nt
##nt - train 10000 iterations
 102   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
   0 102   0   0   0   0   0   0   0   0   0   0   0   0   0   0
   0   0 102   0   0   0   0   0   0   0   0   0   0   0   0   0
   0   0   0 102   0   0   0   0   0   0   0   0   0   0   0   0
   0   0   0   0 102   0   0   0   0   0   0   0   0   0   0   0
   0   0   0   0   0 102   0   0   0   0   0   0   0   0   0   0
   0   0   0   0   0   0 102   0   0   0   0   0   0   0   0   0
   0   0   0   0   0   0   0 102   0   0   0   0   0   0   0   0
   0   0   0   0   0   0   0   0 102   0   0   0   0   0   0   0
   0   0   0   0   0   0   0   0   0 102   0   0   0   0   0   0
   0   0   0   0   0   0   0   0   0   0 102   0   0   0   0   0
   0   0   0   0   0   0   0   0   0   0   0 102   0   0   0   0
   0   0   0   0   0   0   0   0   0   0   0   0 102   0   0   0
   0   0   0   0   0   0   0   0   0   0   0   0   0 102   0   0
   0   0   0   0   0   0   0   0   0   0   0   0   0   0 102   0
   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0 101
nx3030303030303030
##nx
   0   0   0   0   0  97   0   0   2   0   1   0   0   0   0   0
np53c1818181818183c
##np 5
nt
##nt - train 10000 iterations
 102   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
   0 102   0   0   0   0   0   0   0   0   0   0   0   0   0   0
   0   0 102   0   0   0   0   0   0   0   0   0   0   0   0   0
   0   0   0 102   0   0   0   0   0   0   0   0   0   0   0   0
   0   0   0   0 102   0   0   0   0   0   0   0   0   0   0   0
   0   0   0   0   0 102   0   0   0   0   0   0   0   0   0   0
   0   0   0   0   0   0 102   0   0   0   0   0   0   0   0   0
   0   0   0   0   0   0   0 102   0   0   0   0   0   0   0   0
   0   0   0   0   0   0   0   0 102   0   0   0   0   0   0   0
   0   0   0   0   0   0   0   0   0 102   0   0   0   0   0   0
   0   0   0   0   0   0   0   0   0   0 102   0   0   0   0   0
   0   0   0   0   0   0   0   0   0   0   0 102   0   0   0   0
   0   0   0   0   0   0   0   0   0   0   0   0 102   0   0   0
   0   0   0   0   0   0   0   0   0   0   0   0   0 102   0   0
   0   0   0   0   0   0   0   0   0   0   0   0   0   0 102   0
   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0 101
nx1818181818181818
##nx
   0   0   6   0   0  78   0   0   0   0   0   0   0   0   0   0
nx3030303030303030
##nx
   0   0   0   0   0 101   0   0   0   0   1   0   0   0   0   0
nd5
##nd 5
      ** ** ** **      
         ** **          
         ** **          
         ** **          
         ** **          
         ** **          
         ** **          
      ** ** ** **      
nx3838383838383838
##nx
   0   0   0   0   0  99   0   0   2   0   0   0   0   0   0   0
nx3c3c3c3c3c3c3c3c
##nx
   0   0   0   0   0   0   0   0 102   0   0   0   0   0   0   0
 

admin - Reply #13 - 02/04/09 at 3:55pm

By the way, I am in the process of creating a new set of patterns - numbers '0' through '9' and graphics for spade, heart, diamond and club.   That leaves two free slots - maybe a 'J' in one.
 
Here's what I have for 0 - 5.  These aren't great, and I welcome others to propose better patterns.  Unfortunately, the font8x8 patterns aren't useful because they leave the last row and column blank, so they are actually 7x7.
 
unsigned char npattern[NUM_NPATTERNS * 8] = {
    0x18, 0x3c, 0x66, 0xc3, 0xc3, 0x66, 0x3c, 0x18,  // '0'
    0x18, 0x38, 0x18, 0x18, 0x18, 0x18, 0x18, 0xff,  // '1'
    0x1c, 0x7e, 0xff, 0x46, 0x0c, 0x38, 0x7f, 0xff,  // '2'
    0x7e, 0xff, 0xc7, 0x1c, 0x1c, 0xc7, 0xff, 0x7e,  // '3'
    0x18, 0x38, 0x78, 0xff, 0xff, 0x18, 0x18, 0x18,  // '4'
    0xff, 0xff, 0x70, 0x1c, 0x06, 0xc7, 0x7e, 0x1c,  // '5'
...
...
};
 
Once I have the patterns, I'll go ahead and hook up the blob pattern matching code and look at adding C and Scheme functions.  

admin - Reply #14 - 02/05/09 at 8:12am

Perhaps a better idea - instead of the pain of coding these patterns, we should have a command that can grab an actual blob as the template for the input pattern to be saved, e.g.
    "ng"  pattern#   (this will grab the largest blob as pattern input)
 
then use
    "nd" pattern#  
to see what we captured
 
So the entire training sequence, assuming colors had already been set with "vc", would look like
 
vb0    (do a blob search on color #0)
ng5    (grab and save pattern #5 from largest blob)
nd5    (display the results)
ni     (if satisfied with the results, re-initialize the network)
nt     (and retrain it on the stored patterns)
 
then to test the result
 
vb0    (do a blob search on color #0)
nb0    (look for best match in neural net against largest blob)
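 
Since "ng" isn't written yet, here's only a hypothetical sketch of what it might do: take the largest blob from the last "vb" search, scale it to 8x8 (as in the subsampling sketch earlier in the thread), and drop the result into the pattern table.  None of these symbol names are actual firmware code -
 
/* hypothetical helpers - blob_x1[] etc. stand in for whatever the blob code
   really exposes (index 0 = largest blob), and blob_to_pattern() is the
   8x8 subsampling sketch posted earlier */
extern int blob_x1[], blob_x2[], blob_y1[], blob_y2[];
extern unsigned char npattern[16 * 8];
void blob_to_pattern(const unsigned char *seg, int img_width,
                     int x1, int x2, int y1, int y2, unsigned char out[8]);

/* "ng" pattern# - grab the largest blob as stored pattern #pat */
void neural_grab(const unsigned char *seg, int img_width, int pat)
{
    unsigned char bits[8];
    int i;
    blob_to_pattern(seg, img_width, blob_x1[0], blob_x2[0],
                    blob_y1[0], blob_y2[0], bits);
    for (i = 0; i < 8; i++)
        npattern[(pat & 0x0F) * 8 + i] = bits[i];
    /* after replacing a pattern, send "ni" then "nt" to retrain the net */
}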
 