Neural networks from scratch in forth

Subject: Neural networks from scratch in forth
From: melahi_ahmed (at) *nospam* yahoo.fr (Ahmed)
Newsgroups: comp.lang.forth
Date: 02 Dec 2024, 21:12:56
Organization: novaBBS
Message-ID: <06eabe944364625b1eba7ea6e09791ad@www.novabbs.com>
User-Agent: Rocksolid Light
Hi,
Here is a session (with gforth) using neural networks
(neural_networks.fs) applied to the XOR operation.
----------------the session begins here---------
Gforth 0.7.9_20200709
Authors: Anton Ertl, Bernd Paysan, Jens Wilke et al., for more type
`authors'
Copyright © 2019 Free Software Foundation, Inc.
License GPLv3+: GNU GPL version 3 or later
<https://gnu.org/licenses/gpl.html>
Gforth comes with ABSOLUTELY NO WARRANTY; for details type `license'
Type `help' for basic help
\ this is a session using neural networks for the XOR operation  ok
include neural_networks.fs
neural_networks.fs:134:1: warning: redefined b with B
locate1.fs:142:3: warning: original location ok
\ create data  ok
4 >n_samples  ok
create data1  ok
0e f, 0e f, 0e f,  ok
0e f, 1e f, 1e f,  ok
1e f, 0e f, 1e f,  ok
1e f, 1e f, 0e f,  ok
data1 >data  ok
\ this concerns the XOR operation  ok
\ create the neural network: it has 2 inputs, 1 output, and 2 hidden layers with 5 neurons in each hidden layer  ok
1 5 5 2 2 neuralnet: net1  ok
' net1 is net  ok
net_layers  ok
\ activation functions   ok
' dlatan is act_func  ok
' dllinear is act_func_ol \ a linear activation function for the output layer  ok
\ setting learning rate   ok
1e-3 >eta  ok
0e >beta  ok
\ tolerance and relative tolerance  ok
1e-4 >tol  ok
0e >rtol  ok
\ epochs  ok
1000000 >epochs  ok
\ this is the maximum number of epochs; the algorithm terminates when the Cost is less than the tolerance tol  ok
\ setting display steps when learning  ok
1000 >display_step  ok
\ adaptation of eta to speedup learning if possible  ok
false >adapt_eta  ok
\ reinitialize the weights and biases each time the learning phase is redone  ok
true >init_net  ok
\ method to initialize weights and biases  ok
' init_weights_2 is init_weights  ok
' init_biases_2 is init_biases  ok
\ now we launch the learning (backpropagation algorithm)  ok
learn
Learning...
-----------
epochs| Cost
------+ ----
0    1.9799033462046
1000    0.478161583121087
2000    0.435711003426376
3000    0.376641058924564
4000    0.289059769511348
5000    0.175586135423502
6000    0.0717553727810072
7000    0.0181228454797771
8000    0.00315094688675379
9000    0.000449783250624701  ok
\ now we verify it  ok
test
inputs | outputs (desired outputs)
-------+--------------------------
0. 0.  |  0.006715207738167  (0. )
0. 1.  |  0.991841706392265  (1. )
1. 0.  |  0.993839285400743  (1. )
1. 1.  |  0.00680589396777978  (0. )   ok
\ we can also do predictions  ok
0e 1e to_inputs forward_pass .outputs
out_n°| value
------+------
0     | 0.991841706392265  ok
\ which is correct (approximately equal to 1)   ok
-----------the session finishes here----------------------
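For readers who don't run Forth, here is a rough NumPy sketch of the same experiment: a 2-5-5-1 network with atan hidden activations, a linear output layer, and plain batch backpropagation on the four XOR samples, stopping when the cost drops below tol. The details (treating dlatan as arctan, the learning rate, the Gaussian weight init, the cost scaling) are my assumptions for illustration, not the actual code of neural_networks.fs.

```python
# Hypothetical NumPy re-creation of the session's XOR experiment.
# The arctan activation, eta=0.5, and N(0,1) init are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# XOR data, same as the data1 table in the session
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
Y = np.array([[0.], [1.], [1.], [0.]])

# layer sizes: 2 inputs, two hidden layers of 5 neurons, 1 output
sizes = [2, 5, 5, 1]
W = [rng.normal(0.0, 1.0, (a, c)) for a, c in zip(sizes, sizes[1:])]
b = [np.zeros((1, n)) for n in sizes[1:]]

act = np.arctan                          # hidden activation (dlatan analogue)
def dact(z):                             # d/dz arctan(z)
    return 1.0 / (1.0 + z * z)

def forward(x):
    """Return pre-activations zs and layer outputs; last layer is linear."""
    zs, outs = [], [x]
    for i, (w, bi) in enumerate(zip(W, b)):
        z = outs[-1] @ w + bi
        zs.append(z)
        outs.append(z if i == len(W) - 1 else act(z))
    return zs, outs

eta, tol = 0.5, 1e-4
for epoch in range(50_000):
    zs, outs = forward(X)
    err = outs[-1] - Y
    cost = 0.5 * np.mean(np.sum(err ** 2, axis=1))
    if cost < tol:                       # terminate when Cost < tol
        break
    delta = err / len(X)                 # gradient at the linear output layer
    for i in reversed(range(len(W))):
        gW = outs[i].T @ delta
        gb = delta.sum(axis=0, keepdims=True)
        if i > 0:                        # propagate before updating W[i]
            delta = (delta @ W[i].T) * dact(zs[i - 1])
        W[i] -= eta * gW
        b[i] -= eta * gb

# predictions should approach [0, 1, 1, 0] once the cost is small
print(np.round(forward(X)[1][-1].ravel(), 3))
```

Note the backward loop reuses W[i] to propagate delta before overwriting it with the gradient step; updating first would backpropagate through the wrong weights.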
The program works with gforth, iforth and vfxforth.
Ahmed
--
