
Commit 617a2eb

Added example of JRuby calling Java neural network code
1 parent 15f991e commit 617a2eb

5 files changed: 74 additions, 1 deletion

Lines changed: 10 additions & 0 deletions
@@ -0,0 +1,10 @@
+clean:
+	rm -r -f *.jar temp_build
+
+jar:
+	rm -r -f temp_build
+	mkdir -p temp_build/neuralnetworks
+	cp ../../src/neuralnetworks/Neural_2H_momentum.java temp_build/neuralnetworks/
+	(cd temp_build; javac neuralnetworks/Neural_2H_momentum.java)
+	(cd temp_build; jar cvf ../nn.jar .)
+	rm -r -f temp_build
Lines changed: 17 additions & 0 deletions
@@ -0,0 +1,17 @@
+# Using the Backpropagation neural network example in JRuby
+
+One reason why I like Java is that it is easy to package up Java code and use it with
+other JVM languages like JRuby, Clojure, and Scala.
+
+Here is an example using a Makefile (works on Mac OS X and Linux - for Windows developers, it would
+be great if someone would provide me with a Windows .bat command file) to build a JAR file
+and use it in a small JRuby test program:
+
+~~~~~~~~
+make jar
+jruby test.rb
+~~~~~~~~
+
+This is a simple example with a tiny amount of training data. The Java code that this example uses
+can be used to train very large networks, but you must increase the number of training iterations.
+
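As the README notes, larger networks need many more training iterations than this toy example. Here is a minimal JRuby sketch (not part of the commit) of a longer training loop that stops once the error reported by train() falls below a tolerance; it assumes nn.jar has been built with `make jar`, that train() returns a numeric output error (as the test program below suggests), and the iteration count and tolerance values are illustrative only.

~~~~~~~~
require 'java'
require 'nn.jar'
java_import 'neuralnetworks.Neural_2H_momentum'

# Same tiny training set as test.rb; a real network would have more data.
nn = Neural_2H_momentum.new(3, 3, 3, 3, 0.75)
nn.addTrainingExample([0.1, 0.1, 0.9], [0.9, 0.1, 0.1])
nn.addTrainingExample([0.1, 0.9, 0.1], [0.1, 0.1, 0.9])
nn.addTrainingExample([0.9, 0.1, 0.1], [0.1, 0.9, 0.1])

max_iterations = 50_000 # assumption: big networks need far more than 100 iterations
tolerance      = 0.05   # assumption: acceptable error at the output neurons

max_iterations.times do |i|
  error = nn.train  # returns the error at the output neurons
  if error < tolerance
    puts "converged after #{i + 1} iterations (error = #{error})"
    break
  end
end

p nn.recall([0.1, 0.1, 0.9])
~~~~~~~~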

jruby_examples/neural_network/nn.jar (binary file, 6.1 KB; contents not shown)

jruby_examples/neural_network/test.rb

Lines changed: 35 additions & 0 deletions
@@ -0,0 +1,35 @@
+require 'java'   # for Java interop
+
+require 'nn.jar' # load the compiled neural network Java code
+java_import 'neuralnetworks.Neural_2H_momentum'
+
+## training data to rotate three input values:
+
+in1 = [0.1, 0.1, 0.9]
+in2 = [0.1, 0.9, 0.1]
+in3 = [0.9, 0.1, 0.1]
+
+out1 = [0.9, 0.1, 0.1]
+out2 = [0.1, 0.1, 0.9]
+out3 = [0.1, 0.9, 0.1]
+
+test1 = [0.1, 0.1, 0.9]
+test2 = [0.1, 0.9, 0.1]
+test3 = [0.9, 0.1, 0.1]
+
+nn = Neural_2H_momentum.new(3, 3, 3, 3, 0.75)
+p nn
+
+nn.addTrainingExample(in1, out1)
+nn.addTrainingExample(in2, out2)
+nn.addTrainingExample(in3, out3)
+
+100.times do |training_iteration|
+  p nn.train() # train and print the error at output neurons
+end
+
+## test to make sure we have learned the input patterns:
+
+p nn.recall(test1)
+p nn.recall(test2)
+p nn.recall(test3)

src/neuralnetworks/Neural_2H_momentum.java

Lines changed: 12 additions & 1 deletion
@@ -48,7 +48,7 @@
     public float TRAINING_RATE = 0.5f;
     private float alpha = 0f; // momentum scaling term that is applied to last delta weight
 
-    Neural_2H_momentum(int num_in, int num_hidden1, int num_hidden2, int num_output,
+    public Neural_2H_momentum(int num_in, int num_hidden1, int num_hidden2, int num_output,
                        float alpha) {
         this.alpha = alpha;
         numInputs = num_in;
@@ -81,6 +81,12 @@ public void addTrainingExample(float[] inputs, float[] outputs) {
         outputTraining.add(outputs);
     }
 
+    /**
+     * Load a trained network from a serialized file
+     *
+     * @param serialized_file_name
+     * @return
+     */
     public static Neural_2H_momentum Factory(String serialized_file_name) {
         Neural_2H_momentum nn = null;
         try {
@@ -102,6 +108,11 @@ public static Neural_2H_momentum Factory(String serialized_file_name) {
         return nn;
     }
 
+    /**
+     * Save a trained network to a serialized file for re-use without re-training
+     *
+     * @param file_name
+     */
     public void save(String file_name) {
         try {
             FileOutputStream ostream = new FileOutputStream(file_name);
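The save and Factory methods documented in this diff allow a network to be trained once and reloaded later without re-training. Here is a minimal JRuby sketch (not part of the commit) of that round trip; it assumes nn.jar has been rebuilt with `make jar` after these changes, and the file name trained_nn.ser is a hypothetical example.

~~~~~~~~
require 'java'
require 'nn.jar'
java_import 'neuralnetworks.Neural_2H_momentum'

# Train a small network as in test.rb, then persist it.
nn = Neural_2H_momentum.new(3, 3, 3, 3, 0.75)
nn.addTrainingExample([0.1, 0.1, 0.9], [0.9, 0.1, 0.1])
nn.addTrainingExample([0.1, 0.9, 0.1], [0.1, 0.1, 0.9])
nn.addTrainingExample([0.9, 0.1, 0.1], [0.1, 0.9, 0.1])
100.times { nn.train }

nn.save('trained_nn.ser')  # serialize the trained network (hypothetical file name)

# Later, or in another process: reload without re-training.
nn2 = Neural_2H_momentum.Factory('trained_nn.ser')
p nn2.recall([0.1, 0.1, 0.9])
~~~~~~~~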
