Commit 61ff0e3

Added TravisCI badge.

1 parent 0c74a9a commit 61ff0e3

File tree

1 file changed (+21 −19 lines)

README.md

Lines changed: 21 additions & 19 deletions
@@ -1,5 +1,7 @@
 # DN2A - Digital Neural Network Architecture #
 
+[![Build Status](https://travis-ci.org/dn2a/dn2a-javascript.svg?branch=master)](https://travis-ci.org/dn2a/dn2a-javascript)
+
 ---
 
 ## About ##
@@ -17,8 +19,8 @@ DN2A aims to allow you to create and train simple Neural Networks as well as ver
 - **Modularized components**: helps the development and the clear separation of concerns with great benefits for who wants to use mixed solutions.
 - **Configurable computation precision**: helps to avoid the noise deriving from operation errors and default system precision limits with great improvement of the learning speed and performance stability.
 - **Configuration checker**: helps to write less details about configuration and to keep compatibility with older version while the project evolves.
-- **StepByStep or Continuous training**: helps to train neural networks without being limited to a particular approach for the greater good of projects with very complex project's needs.
-- TODO (Bios) **Data normalization**: helps to simplify the interaction within your real domain.
+- **StepByStep or Continuous training**: helps to train neural networks without being limited to a particular approach for the greater good of projects with very complex project's needs.
+- TODO (Bios) **Data normalization**: helps to simplify the interaction within your real domain.
 - TODO (Host) **Networks composition**: helps to create very effective architectures of multiple neural networks able to obtain advanced behaviours like in deep learning.
 - TODO (Host) **Computation parallelization**: helps to improve the scalability of your whole system.
 - TODO (Bios) **Sessions intercommunication**: helps to improve the scalability of your whole system.
@@ -126,12 +128,12 @@ To import through your preferred loader configure it to point to the right place
 
 ### Training/Querying a Network with default parametrization ###
 
-// Importation
+// Importation
 var DN2A = require("dn2a");
-
+
 // Instantiation
 var neuralNetwork = new DN2A.NetworkAlpha();
-
+
 // Training
 var trainingPatterns = [
 {
@@ -152,7 +154,7 @@ To import through your preferred loader configure it to point to the right place
 }
 ];
 neuralNetwork.train(trainingPatterns);
-
+
 // Querying
 //
 // The object passed to the callback function contains information about the querying process.
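The hunks above only show the closing brackets of the `trainingPatterns` array, so its element shape is not visible in this diff. As a standalone sketch with no `dn2a` dependency, assuming the `{ input, output }` field names used elsewhere in the DN2A README, an XOR-style training set would look roughly like:

```javascript
// Hypothetical training set for an XOR-like task.
// The { input, output } field names are an assumption based on the
// DN2A README; this diff does not show them directly.
var trainingPatterns = [
    { input: [0, 0], output: [0] },
    { input: [0, 1], output: [1] },
    { input: [1, 0], output: [1] },
    { input: [1, 1], output: [0] }
];

// Minimal structural check: every pattern pairs an input vector
// with an output vector.
var wellFormed = trainingPatterns.every(function (pattern) {
    return Array.isArray(pattern.input) && Array.isArray(pattern.output);
});

console.log(wellFormed); // prints true
```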
@@ -172,13 +174,13 @@ To import through your preferred loader configure it to point to the right place
 
 // Importation
 // ...
-
+
 // Instantiation
 //
-// The object passed to the constructor function contains properties describing the neural network.
+// The object passed to the constructor function contains properties describing the neural network.
 // The list of the properties is reported in the main README file.
-// In case one or more properties are not present they are substituted with defaults.
-// Same thing happens if the object is not passed at all.
+// In case one or more properties are not present they are substituted with defaults.
+// Same thing happens if the object is not passed at all.
 var neuralNetwork = new DN2A.NetworkAlpha({
 layerDimensions: [2, 4, 4, 1], // the default would be [2, 4, 1]
 learningMode: "continuous",
@@ -195,21 +197,21 @@ To import through your preferred loader configure it to point to the right place
 },
 numbersPrecision: 32
 });
-
+
 // Training
 // ...
-
+
 // Querying
 // ...
 
 ### Training/Querying a Network with evolution feedback ###
 
 // Importation
 // ...
-
+
 // Instantiation
 // ...
-
+
 // Training
 //
 // The object passed to the callback function contains information about the training process.
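The evolution-feedback style above passes a status callback that fires during training; the diff shows the callback receiving an object with an `elapsedEpochCounter` field. As a rough, dependency-free sketch of that callback-per-epoch pattern (the `fakeTrain` helper is hypothetical; only `elapsedEpochCounter` comes from the diff):

```javascript
// Stand-in for DN2A's callback-per-epoch reporting.
// fakeTrain is a hypothetical helper, not part of the dn2a API;
// only trainingStatus.elapsedEpochCounter mirrors the diff above.
function fakeTrain(totalEpochs, onStatus) {
    for (var epoch = 1; epoch <= totalEpochs; epoch += 1) {
        // Report progress after each simulated epoch.
        onStatus({ elapsedEpochCounter: epoch });
    }
}

var epochsSeen = [];
fakeTrain(3, function (trainingStatus) {
    epochsSeen.push(trainingStatus.elapsedEpochCounter);
});

console.log(epochsSeen.join(",")); // prints 1,2,3
```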
@@ -235,15 +237,15 @@ To import through your preferred loader configure it to point to the right place
 neuralNetwork.train(trainingPatterns, function(trainingStatus) {
 console.log("Epoch: " + trainingStatus.elapsedEpochCounter);
 });
-
+
 // Querying
 // ...
 
 ### Training/Querying a specific Network through the Host ###
 
 // Importation
 // ...
-
+
 // Instantiation
 var cerebrum = new DN2A.Cerebrum({
 minds: [
@@ -277,7 +279,7 @@ To import through your preferred loader configure it to point to the right place
 "firstNeuralNetwork"
 ]
 });
-
+
 // Training
 //
 // The name passed to the trainMind method specifies which specific mind to train
@@ -302,7 +304,7 @@ To import through your preferred loader configure it to point to the right place
 cerebrum.trainMind(trainingPatterns, function(trainingStatus) {
 console.log("Epoch: " + trainingStatus.elapsedEpochCounter);
 }, "firstNeuralNetwork");
-
+
 // Querying
 //
 // The name passed to the queryMind method specifies which specific mind to query
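The Host API above takes the mind name as the last argument after the callback, i.e. `trainMind(patterns, callback, mindName)`. One way such a callback-style call can be adapted to Promises is sketched below; `mockCerebrum` and `trainMindAsync` are hypothetical stand-ins for illustration, not part of dn2a:

```javascript
// Hypothetical stand-in for a DN2A Cerebrum: same trainMind
// argument order as the diff above (patterns, callback, mindName),
// but the implementation here is a mock, not the real library.
var mockCerebrum = {
    trainMind: function (patterns, callback, mindName) {
        callback({ elapsedEpochCounter: patterns.length, mind: mindName });
    }
};

// Wrap the callback-style call in a Promise so callers can chain it.
function trainMindAsync(cerebrum, patterns, mindName) {
    return new Promise(function (resolve) {
        cerebrum.trainMind(patterns, resolve, mindName);
    });
}

trainMindAsync(mockCerebrum, [{}, {}], "firstNeuralNetwork").then(function (status) {
    console.log(status.mind); // prints firstNeuralNetwork
});
```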
@@ -346,4 +348,4 @@ To import through your preferred loader configure it to point to the right place
 
 # License #
 
-MIT
+MIT
