Anaimo AI SDK User’s Guide
Artificial Intelligence to Turn Challenges into Benefits.
Version: 2022-03 (build 10900).
Copyright
This document and all its content are protected by international copyright law and are the property of ANAIMO SPAIN, S.L. (hereinafter "Anaimo"), a private company registered in Spain with fiscal ID B16943706. Modifying this document or removing any copyright notice is expressly prohibited without prior written permission from Anaimo. For more information, please contact https://anaimo.com
Introduction
The Anaimo AI SDK is a software library that allows you to create, train, and use neural networks in any application developed in C/C++, C#, VB.NET and more. Its main advantages are:
- Wide range of functionalities:
- Create neural networks, from fully connected, to convolutional, to manually connected neurons.
- Easily create different neural network architectures using mnemonics.
- Augment images automatically.
- Reorder data sets automatically.
- Take snapshots to make predictions based on previously acquired knowledge.
- Optimized:
- For high speed, using multiple cores, threads, and SIMD instructions.
- To consume as little memory as possible.
- Runs on cloud or on-premises (Edge AI), and potentially on any hardware.
- Unique features:
- Dynamic outputs: the number of outputs can change without losing knowledge.
- Flexible topology: connect neurons in any way you need, from fully connected automatically to fully manual.
- Auto-adaptive: it can adapt its own topology for faster and better performance, like the human brain.
This document constitutes the User’s Guide of the Anaimo AI SDK.
For more information or support, please visit or contact us at: https://anaimo.ai
License
Anaimo AI SDK can run with a:
- Free license: for neural networks with less than 1000 neurons.
- Paid license: for neural networks with 1000 or more neurons.
To obtain a paid license, you need to provide Anaimo with all the hardware ID codes of the computer that will run the neural network. To obtain them, run the library on the computer you need to license and use the public HardwareId function to retrieve all the hardware identifiers.
Then email these 16 hardware identifiers, with the subject "License request", to marketing@anaimo.com
Installation
The Anaimo AI SDK comes in different versions:
Anaimo AI SDK Version | Anaimo AI SDK File
Windows (x32 & x64) |
General Release | AnaimoAI.dll
Optimized for computers with the AVX2 instruction set | AnaimoAI_AVX2.dll
Optimized for computers with the AVX512 instruction set | AnaimoAI_AVX512.dll
Linux |
General Release | libAnaimoAI_nn.so
The Windows version of the Anaimo AI SDK requires the following component to be pre-installed:
- Microsoft’s Visual Studio C++ Redistributable (version 2015 or higher).
The Linux version was compiled with Ubuntu GLIBC 2.31-0ubuntu9.9.
Depending on the folder in which the above files are located, the following software must also be pre-installed before use:
- Intel folder: requires Intel’s Redistributable Libraries for Intel® C++ and Fortran 2020 Compilers for Windows.
If you run software developed with the Anaimo AI SDK on a computer that does not have these components pre-installed, you will probably get an error indicating a "dll not found" exception.
For Python, only the General Release version has been tested.
When opening the source code of the NNViewer application (see later in this document), please remember that it requires .NET 5.0 and MS Visual Studio version 2022 or higher.
Inputs, outputs
Inputs and outputs of the neural network are regular neurons. You indicate how many inputs your network has in the NetCreate function. To indicate how many outputs your network has, you then use the NetOutputAdd function. Inputs are therefore the initial neurons, added up to the number of inputs indicated in NetCreate; there is no specific function to add inputs.
Values of inputs and outputs should be in the range [0, 1].
Topology
The calls NetTopologyGet and NetTopologySet allow you to get or set the desired topology. Supported topologies are:
- Manual: neurons and their connections are established programmatically.
- Full connected: neurons per layer are established programmatically, but they are all automatically fully connected between layers.
Control of cycles
Depending on the network topology, neurons can be interconnected so that they establish closed loops, for example when neuron A outputs to neuron B, which outputs back to neuron A. This can also happen across multiple levels, in which case cycles will not be easily seen.
If your topology connects neurons in loops, the neural network could enter an infinite loop and hang, or even drain the resources of the computer. The functions that might experience this problem require the parameter CyclesControl to be set to 1 to avoid it.
Numbering of neurons
Neurons are numbered starting with 0. In general, this is applied to all other items (inputs, outputs, weights, etc.).
The set and the auto-adaptive feature
One of the disadvantages of the learning process of an artificial neural network versus a person is that the former needs thousands of records and iterations to learn what the latter can learn with just a few records and almost no iteration.
To reduce that difference, Anaimo AI developed:
- Set: all the records that are used to learn can be memorized inside the neural network.
- Self-training from the set: the neural network can use the set for self-training.
- Auto adaptation: once self-training has finished, the neural network adapts (auto-adaptive feature) its topology to increase speed and save computational resources. This feature is only available in Dynamic propagation mode.
The Anaimo AI SDK will only enter the auto-adaptive feature when the set functionality is used. The related Set functions are also explained in this guide.
Please note that the set cannot be changed (records can neither be added nor deleted) while the learning process is being applied to it.
Modes
There are different working modes for the neural network. The mode mainly affects training, but it can also affect other operations. The available modes are:
- Normal: fully optimized for learning and highly parallel computation.
- Standard back propagation:
- Calculates neurons outputs.
- Calculates deltas.
- Adjusts biases and weights.
- Dynamic propagation (beta): for all neurons, adjusts biases and weights by calculating neuron outputs and deltas when needed. This mode supports any network topology and is more than 50% faster than standard back propagation, although it is still being validated in different use cases.
- Back propagation (beta): same as standard back propagation but optimized for parallel computation and high volumes of neural networks and data.
The Standard back propagation and Dynamic propagation (beta) modes are only compatible with manual topology, and therefore you need to manually connect the neurons first.
The Normal and Back propagation (beta) modes automatically create an internal memory structure. You can also create this internal memory structure manually, every time you change the topology of the neural network, with the function NetLayersAnalyze, which consumes some time and memory. Once created, it is automatically maintained during operation. In these modes, some functions (for example NeuWeightUpdatedGet and NetValueUpdatedGet) will not work, because their internal counters are not updated for efficiency; please see the documentation for NetModeSet later in this document.
Convolutional layers (Conv2D, MaxPool and AvgPool) and SoftMax are only supported in the mode Normal.
Performance
The neural network is highly optimized for speed and minimal memory usage. For the best results:
- Use the Normal mode (this is the default).
- Use the full connected topology (this is the default).
- Set the number of threads, normally equal to or less than the number of cores of your processor.
- Set the memory cache, normally equal to or smaller than the L3 cache of your processor (this feature is subject to additional testing: beta).
Disclaimer of use
The Anaimo AI SDK is a powerful tool capable of providing great benefits to its users. But, unlike an algorithm, and similarly to other Artificial Intelligences, humans cannot fully understand how it predicts its outputs. Because of this, we recommend validating the outputs before applying them to actions that could have potential negative impacts, including, but not limited to, ethical or safety problems. Therefore, ANAIMO WILL NOT ACCEPT, UNDER ANY CIRCUMSTANCE, ANY RESPONSIBILITY OR LIABILITY FOR ANY NEGATIVE IMPACT RELATED TO ANY USAGE OF ANAIMO'S PRODUCTS, as it is ultimately the sole responsibility of the user whether and how to use them.
For awareness and more information, please consult:
- https://www.codedbias.com/
Public functions
The following are the available public functions.
Common functions
HardwareId
Purpose
Provides 16 hardware identifiers, in 4 rows by 4 columns, which uniquely identify the computer running the neural network. This data must be sent to Anaimo to obtain a licensed version of the neural network library.
Declarations
Standard C:
extern int HardwareId(unsigned int pIntRow, unsigned int pIntCol);
MS Visual Studio:
extern int _cdecl HardwareId(unsigned int pIntRow, unsigned int pIntCol);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall HardwareId(unsigned int pIntRow, unsigned int pIntCol);
Parameters
- Row [0,3]
- Column [0,3]
Returns
An integer.
Usage
The following C example will provide the hardware id of the machine running the neural network:
#include <stdio.h>
#include <string.h>
#include <stdbool.h>
#include "AnaimoAI_nn.h"

int main(int argc, char *argv[]) {
    char lStrTmp[1024] = "";
    char lStrTmp2[1024] = "";
    for (int i = 0; i < 4; i++) {
        for (int j = 0; j < 4; j++) {
            snprintf(lStrTmp, sizeof(lStrTmp), "%X", HardwareId(i, j));
            strcat(lStrTmp2, lStrTmp);
            if (!((i == 3) && (j == 3)))
                strcat(lStrTmp2, ":");
        }
    }
    printf("%s\n", lStrTmp2);
    return 0;
}
To compile the above program, you can execute:
g++ -Wall -O3 -o main main.cpp -I./ -L./ -l:libAnaimoAI_nn.a -lgomp -pthread -lm -fopenmp
Licensed
Purpose
Returns the status of the license of the library.
Declarations
Standard C:
extern int Licensed();
MS Visual Studio:
extern int _cdecl Licensed();
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall Licensed();
Parameters
None.
Returns
An integer with the status of the license. Please see the possible values in the method NetCreate.
Usage
The following C example will provide the status of the license:
int lIntLicenseStatus = Licensed();
Version
Purpose
Returns the version and build numbers of the library.
Declarations
Standard C:
extern int Version(int *pIntBuild);
MS Visual Studio:
extern int _cdecl Version(int *pIntBuild);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall Version(int *pIntBuild);
Parameters
An output parameter which receives the build number.
Returns
An integer with the version number.
Usage
The following C example will provide the version and the build of the library:
int lIntBuild, lIntVersion = Version(&lIntBuild);
Network functions
A network is a group of neurons.
NetActivationGet
Purpose
Returns the activation function to be used.
Declarations
Standard C:
extern int NetActivationGet(int pIntLayerNumber);
MS Visual Studio:
extern int _cdecl NetActivationGet(int pIntLayerNumber);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetActivationGet(int pIntLayerNumber);
Parameters
- Layer number whose activation function to return. The inputs layer is the first layer and is number 1. This parameter can be 0, in which case the default for layers created in the future is returned.
Returns
Activation function, with possible values of:
- 0: for Sigmoid.
- 1: for ReLU.
- 2: for fast Sigmoid.
It can also return -1 if the parameter layer number is not correct.
Usage
The following C example gets the activation function which will be used for layers created in the future:
#define ACTIVATION_F_Sigmoid 0
#define ACTIVATION_F_ReLU 1
#define ACTIVATION_F_FastSigmoid 2
int lIntActivation = NetActivationGet(0);
NetActivationSet
Purpose
Selects the activation function to be used.
Declarations
Standard C:
extern void _cdecl NetActivationSet(int pInt, int pIntLayerNumber);
MS Visual Studio:
extern void NetActivationSet(int pInt, int pIntLayerNumber);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall NetActivationSet(int pInt, int pIntLayerNumber);
Parameters
- Activation function, with possible values of:
- 0: for Sigmoid.
- 1: for ReLU.
- 2: for fast Sigmoid.
- Layer number to apply the activation function. The inputs layer is the first layer and is number 1. This parameter can be 0 and then it will set the default for layers created in the future.
Returns
Nothing.
Usage
The following C example will set ReLU as the activation function for layers created in the future.
#define ACTIVATION_F_Sigmoid 0
#define ACTIVATION_F_ReLU 1
#define ACTIVATION_F_FastSigmoid 2
NetActivationSet(ACTIVATION_F_ReLU, 0);
NetConnect
Purpose
Connects two neurons.
Declarations
Standard C:
extern bool NetConnect(int pIntSrc, int pIntDst);
MS Visual Studio:
extern bool _cdecl NetConnect(int pIntSrc, int pIntDst);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API bool _stdcall NetConnect(int pIntSrc, int pIntDst);
Parameters
- Number of the source neuron.
- Number of the destination neuron.
Returns
True if connection was successful.
Usage
The following C example connects the 10th neuron as an input of the 20th neuron and returns, in a boolean variable, true on success or false on error:
bool lBolTmp = NetConnect(9, 19);
NetConnectConsecutive
Purpose
Connects a consecutive range of neurons to another single neuron. Its purpose is to speed up the connection of a high number of neurons to a destination neuron.
NetConnectConsecutive behaves as NetConnect when the first two parameters are the same number (pIntSrc1 equals pIntSrc2).
Declarations
Standard C:
extern bool NetConnectConsecutive(int pIntSrc1, int pIntSrc2, int pIntDst);
MS Visual Studio:
extern bool _cdecl NetConnectConsecutive(int pIntSrc1, int pIntSrc2, int pIntDst);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API bool _stdcall NetConnectConsecutive(int pIntSrc1, int pIntSrc2, int pIntDst);
Parameters
- Number of the source initial neuron.
- Number of the source final neuron.
- Number of the destination neuron.
Note that all neurons from initial to final will be connected to the destination neuron.
Returns
True if connection was successful.
Usage
The following C example connects the 10th, 11th, and 12th neurons as inputs of the 22nd neuron and returns, in a boolean variable, true on success or false on error:
bool lBolTmp = NetConnectConsecutive(9, 11, 21);
NetConnectLayer
Purpose
Connects a layer of neurons to another layer of neurons. Its purpose is to speed up the connection of a high number of neurons.
Declarations
Standard C:
extern bool NetConnectLayer(int pIntSrc1, int pIntSrc2, int pIntDst1, int pIntDst2);
MS Visual Studio:
extern bool _cdecl NetConnectLayer(int pIntSrc1, int pIntSrc2, int pIntDst1, int pIntDst2);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API bool _stdcall NetConnectLayer(int pIntSrc1, int pIntSrc2, int pIntDst1, int pIntDst2);
Parameters
- Number of the source initial neuron.
- Number of the source final neuron.
- Number of the destination initial neuron.
- Number of the destination final neuron.
Note that all neurons from source initial to source final will be connected to all neurons from the destination initial to destination final neurons.
Returns
True if connection was successful.
Usage
The following C example connects the 10th, 11th, and 12th neurons as inputs of the 20th, 21st, and 22nd neurons and returns, in a boolean variable, true on success or false on error:
bool lBolTmp = NetConnectLayer(9, 11, 19, 21);
NetCreate
Purpose
Creates the foundation of the neural network by dynamically allocating part of the memory structures. This function only creates the base memory structure, not the neural network itself; you will have to create that manually. If you want to create the full neural network automatically, use NetTopologyCreate instead of this function.
Declarations
Standard C:
extern int NetCreate(int pIntMaxNeurons, int pIntNumberOfInputs, int pIntNumberOfOutputs, bool pBolSnapshotsFree);
MS Visual Studio:
extern int _cdecl NetCreate(int pIntMaxNeurons, int pIntNumberOfInputs, int pIntNumberOfOutputs, bool pBolSnapshotsFree);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetCreate(int pIntMaxNeurons, int pIntNumberOfInputs, int pIntNumberOfOutputs, bool pBolSnapshotsFree);
Parameters
- Maximum number of neurons.
- Number of inputs.
- Number of outputs.
- Whether the snapshot should also be freed first.
Returns
An integer, indicating:
- 0: success.
- 1: success, but license will expire in less than 30 days.
- 2: not licensed: 1000 or more neurons were requested and the neural network library is not licensed for the current hardware running it. In this case, you need to send Anaimo the 16 integers obtained with the HardwareId function to qualify for a licensed version.
- 3: out of memory: you are trying to create more neurons or connections than the memory of your device supports.
- 4: unknown error.
- 5: the number of inputs must be greater than the number of outputs.
- 6: select a full connected topology to perform this function.
- 7: error in parameters.
- 8: reduction between layers must be greater than 1.
- 9: maximum number of neurons reached.
- 10: last layer is less or equal than the number of needed outputs.
- 11: could not add neurons.
- 12: layers could not be analyzed.
Usage
The following C example will create in memory the neural network.
#define NetCreate_Success 0
#define NetCreate_LicenseExpiresInLessThan30Days 1
#define NetCreate_NotLicensed 2
#define NetCreate_OutOfMemory 3
#define NetCreate_UnknownError 4
#define NetCreate_InputsMustBeGreaterThanOutputs 5
#define NetCreate_OnlyAvailableForFullConnectedTopology 6
#define NetCreate_IndicatedParametersAreIncorrect 7
#define NetCreate_ReductionBetweenLayersMustBeGreaterThanOne 8
#define NetCreate_MaximumNumberOfNeuronsReached 9
#define NetCreate_LastLayerIsLessOrEqualThanTheNumberOfNeededOutputs 10
#define NetCreate_CouldNotAddNeurons 11
#define NetCreate_LayersCouldNotBeAnalyzed 12
int lIntTmp = NetCreate(pIntFinalNumberOfTotalNeurons, pIntInputsNumber, pIntOutputsNumber, true);
NetDecayRateGet
Purpose
Returns the current decay rate, expressed as a value in the range [0, 1]. The decay rate is applied to the learning rate following this formula:
LearningRate = LearningRate * DecayRate ^ CurrentEpochNumber
Declarations
Standard C:
extern float NetDecayRateGet();
MS Visual Studio:
extern float _cdecl NetDecayRateGet();
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall NetDecayRateGet();
Parameters
None
Returns
A float with the current decay rate in the range [0, 1].
Usage
The following C example gets the decay rate and puts it into a variable:
float lSngDecayRate = NetDecayRateGet();
NetDecayRateSet
Purpose
Sets the current decay rate, expressed as a value in the range [0, 1]. For more information, please read the function NetDecayRateGet in this document.
Declarations
Standard C:
extern void NetDecayRateSet(float pSng);
MS Visual Studio:
extern void _cdecl NetDecayRateSet(float pSng);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall NetDecayRateSet(float pSng);
Parameters
- The decay rate, in the range [0, 1].
Returns
Nothing.
Usage
The following C example sets the decay rate at 95%:
NetDecayRateSet(0.95);
NetDropoutGet
Purpose
Returns the current dropout rate, indicated with a number in the range [0, 1].
Please see NetDropoutSet for more information.
Declarations
Standard C:
extern float NetDropoutGet(int pIntLayerNumber);
MS Visual Studio:
extern float _cdecl NetDropoutGet(int pIntLayerNumber);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall NetDropoutGet(int pIntLayerNumber);
Parameters
The layer number whose current dropout rate to obtain, or 0 to obtain the default dropout rate that will be applied to newly created layers.
Returns
A float with the current dropout rate as a value in the range of [0, 1].
It will return -1 if the layer number is not correct.
Usage
The following C example gets the default dropout rate and puts it into a variable.
float lSngDropOut = NetDropoutGet(0);
NetDropoutSet
Purpose
Sets the dropout rate, indicated with a number in the range [0, 1]. You can set it to 0 for no dropout; it is set to 0 by default.
If it is different than 0, that fraction of the inputs of each neuron will not be considered on each training cycle. The ignored inputs are randomly selected.
Inputs that are not ignored are scaled up by 1/(1 – dropout) so that the sum over all inputs remains unchanged.
Note that the rate and the actual measured percentage might not match exactly; please check the following table:
DropOut Rate | Actual measured % of weights considered
0.05 | 90.98%
0.10 | 83.90%
0.50 | 50.11%
0.90 | 16.09%
0.95 | 8.92%
Declarations
Standard C:
extern void NetDropoutSet(float pSng, int pIntLayerNumber);
MS Visual Studio:
extern void _cdecl NetDropoutSet(float pSng, int pIntLayerNumber);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall NetDropoutSet(float pSng, int pIntLayerNumber);
Parameters
- The dropout rate [0, 1].
- The layer number to apply the new dropout rate. Indicate 0 for a default dropout rate which will be applied to all new layers.
Returns
Nothing.
Usage
The following C example sets the dropout rate at 10% for all new future layers:
NetDropoutSet(0.1, 0);
NetErrorGet
Purpose
Returns a float which is the sum, in absolute values, of the errors of all neurons.
Declarations
Standard C:
extern float NetErrorGet(int pIntCyclesControl);
MS Visual Studio:
extern float _cdecl NetErrorGet(int pIntCyclesControl);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall NetErrorGet(int pIntCyclesControl);
Parameters
- Control of cycles (for more information please view the introduction):
- 0: no cycles will be checked.
- 1: cycles will be checked and avoided.
Returns
A float.
Usage
The following C code puts into a float variable the network total error:
float lSngError = NetErrorGet(0);
NetFree
Purpose
Destroys the neural network and frees memory. This function does not destroy the set.
Please remember that the last NetFree should also free the snapshot.
Declarations
Standard C:
extern void NetFree(bool pBolSnapshotsFree);
MS Visual Studio:
extern void _cdecl NetFree(bool pBolSnapshotsFree);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall NetFree(bool pBolSnapshotsFree);
Parameters
- Boolean indicating whether the snapshot should also be freed.
Returns
Nothing.
Usage
The following C example frees the memory of the current neural network, including the snapshot:
NetFree(true);
NetInitialize
Purpose
Initializes the neural network by setting, in memory:
- Biases: to 0.
- Values: to 0.
- Neurons' weights: to a random number.
Random numbers are generated with a Gaussian distribution.
Declarations
Standard C:
extern bool NetInitialize(float pSngMean, float pSngVariance, float pSngPercentage);
MS Visual Studio:
extern bool _cdecl NetInitialize(float pSngMean, float pSngVariance, float pSngPercentage);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API bool _stdcall NetInitialize(float pSngMean, float pSngVariance, float pSngPercentage);
Parameters
- Mean of the random values, normally zero.
- Variance of the random values, normally 1.
- Percentage of the network that should be initialized. Indicate 1 for the whole network.
Returns
A boolean: true if the initialization went OK, false otherwise. False is typically returned when there is not enough memory to analyze all the layers and create the internal memory structures.
Usage
The following C code initializes the neural network:
NetInitialize(0, 1, 1);
NetInitTypeGet
Purpose
Returns the type of initialization performed by the function NetInitialize.
Declarations
Standard C:
extern int NetInitTypeGet();
MS Visual Studio:
extern int _cdecl NetInitTypeGet();
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetInitTypeGet();
Parameters
None.
Returns
Initialization type:
- 0: Normal.
- 1: Xavier.
- 2: HE.
Usage
The following C code gets the initialization type of the neural network:
#define INIT_TYPE_Normal 0
#define INIT_TYPE_Xavier 1
#define INIT_TYPE_HE 2
int lIntInitType = NetInitTypeGet();
NetInitTypeSet
Purpose
Sets the type of initialization performed by the function NetInitialize.
Declarations
Standard C:
extern void NetInitTypeSet(int pIntInitType);
MS Visual Studio:
extern void _cdecl NetInitTypeSet(int pIntInitType);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall NetInitTypeSet(int pIntInitType);
Parameters
Initialization type:
- 0: Normal.
- 1: Xavier.
- 2: HE.
Returns
Nothing.
Usage
The following C code sets the initialization type of the neural network to be Xavier:
#define INIT_TYPE_Normal 0
#define INIT_TYPE_Xavier 1
#define INIT_TYPE_HE 2
NetInitTypeSet(INIT_TYPE_Xavier);
NetInputGet
Purpose
Gets the current value of an input.
Declarations
Standard C:
extern float NetInputGet(int pIntInput);
MS Visual Studio:
extern float _cdecl NetInputGet(int pIntInput);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall NetInputGet(int pIntInput);
Parameters
- The number of the input.
Returns
The current value of the input.
Usage
The following C example gets the value of the 10th input and puts it into a variable:
float lSngInputValue = NetInputGet(9);
NetInputSet
Purpose
Sets the value of an input.
Declarations
Standard C:
extern void NetInputSet(int pIntInput, float pSng);
MS Visual Studio:
extern void _cdecl NetInputSet(int pIntInput, float pSng);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall NetInputSet(int pIntInput, float pSng);
Parameters
- The number of the input.
- The new value.
Returns
Nothing.
Usage
The following C example sets the value of the 10th input to 0.1.
NetInputSet(9, 0.1);
NetInputsAddedNumberGet
Purpose
Returns the number of inputs that have been added with NetNeuronsAdd or NetNeuronsAddConsecutive to the neural network. The maximum number of inputs that can be added is determined by the function NetCreate.
Declarations
Standard C:
extern int NetInputsAddedNumberGet();
MS Visual Studio:
extern int _cdecl NetInputsAddedNumberGet();
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetInputsAddedNumberGet();
Parameters
None.
Returns
An integer.
Usage
The following C example gets the number of inputs that have been added and puts it into a variable:
int lIntTmp = NetInputsAddedNumberGet();
NetInputsMaxNumberGet
Purpose
Returns the number of inputs that the neural network has.
Declarations
Standard C:
extern int NetInputsMaxNumberGet();
MS Visual Studio:
extern int _cdecl NetInputsMaxNumberGet();
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetInputsMaxNumberGet();
Parameters
None.
Returns
An integer.
Usage
The following C example gets the number of inputs of the neural network and puts it into a variable:
int lIntTmp = NetInputsMaxNumberGet();
NetKnowledgeFilePathSet
Purpose
Sets the full or relative path and file name (.nnk) that will be used to read and save the knowledge of the network.
Declarations
Standard C:
extern void NetKnowledgeFilePathSet(unsigned long pLngBufferLength, const char* pStrBuffer);
MS Visual Studio:
extern void _cdecl NetKnowledgeFilePathSet(DWORD pLngBufferLength, LPCSTR pStrBuffer);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall NetKnowledgeFilePathSet(DWORD pLngBufferLength, LPCSTR pStrBuffer);
Parameters
- Length of the string which defines the path to the file.
- Pointer to the string which defines the path to the file.
Returns
Nothing.
Usage
The following C code changes the file path and name to read and save knowledge:
char lStrTmp[1024] = "";
// note that the directory ./know must exist
snprintf(lStrTmp, sizeof(lStrTmp), "./know/AnaimoAI.nnk");
NetKnowledgeFilePathSet(strlen(lStrTmp), lStrTmp);
NetKnowledgeLoadFromFile
Purpose
Loads the knowledge of the neural network from a file.
For the characteristics of the file, please read NetKnowledgeSaveToFile.
Declarations
Standard C:
extern int NetKnowledgeLoadFromFile(bool pBolNetworkChangeIfNecessary, int pIntOutputs, float pSngInitMean, float pSngInitVariance, float pSngInitPercentage);
MS Visual Studio:
extern int _cdecl NetKnowledgeLoadFromFile(bool pBolNetworkChangeIfNecessary, int pIntOutputs, float pSngInitMean, float pSngInitVariance, float pSngInitPercentage);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetKnowledgeLoadFromFile(bool pBolNetworkChangeIfNecessary, int pIntOutputs, float pSngInitMean, float pSngInitVariance, float pSngInitPercentage);
Parameters
- Boolean: true indicates that the neural network should be created to match the knowledge being loaded.
- The number of outputs for the network that will be created. Indicate 0 here to create the network with the number of outputs contained in the file; if you indicate a number different than 0, the knowledge will only be loaded if the number of outputs in the file is exactly this parameter.
- The mean of the random numbers to initialize the network, in case it is created, normally 0.
- The variance of the random numbers to initialize the network, in case it is created, normally 1.
- The percentage of the network that will be initialized, normally 1.
Returns
An integer indicating with:
- 0: file was loaded successfully.
- -1: file could not be opened.
- -2: version could not be read.
- -3: line with rows, cols, … could not be read.
- -4: architecture of network could not be read.
- -5: could not calculate internal architecture of network.
- -6: knowledge file is not compatible with number of layers or neurons per layer of current neural network.
- -7: number of layers and neurons per layer could not be read.
- -8: number of layers or neurons per layer was not provided.
- -9: architecture of network did not include information for all layers.
- -10: knowledge file is not compatible with number of layers or neurons per layer of current neural network.
- -11: line with total number of records and maximum number of neurons, inputs and outputs, could not be read.
- -12: any of these parameters was not provided: total number of records, max number of neurons, max. number of inputs or max. number of outputs.
- -13: file indicates to have 0 data records.
- -14: knowledge file is not compatible with maximum number of neurons, number of inputs or outputs of current neural network.
- -15: a neuron record was not correct.
- -16: error setting the bias of a neuron.
- -17: number of inputs does not match those of a current neuron.
- -18: error setting the weight of an input of a neuron.
- -19: file does not include complete data for inputs of a neuron.
- -20: file does not include complete data for the number of records indicated in the header.
- -21: knowledge file is not compatible with number of inputs or outputs of current neural network.
- In case the value is positive, please read the help of NetArchitectureCreate in this document.
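When logging load failures, it can help to map these codes to short messages. The helper below is a sketch of such a mapping; the function name is ours and is not part of the SDK:

```c
#include <assert.h>
#include <string.h>

/* Hypothetical helper: maps the return codes of NetKnowledgeLoadFromFile
   to short diagnostic messages, following the list above. */
static const char *KnowledgeLoadErrorText(int code)
{
    if (code > 0)
        return "see NetArchitectureCreate";
    switch (code) {
    case 0:   return "file was loaded successfully";
    case -1:  return "file could not be opened";
    case -2:  return "version could not be read";
    case -3:  return "line with rows, cols, ... could not be read";
    case -4:  return "architecture of network could not be read";
    case -5:  return "could not calculate internal architecture of network";
    case -6:  return "incompatible number of layers or neurons per layer";
    case -7:  return "number of layers and neurons per layer could not be read";
    case -8:  return "number of layers or neurons per layer was not provided";
    case -9:  return "architecture did not include information for all layers";
    case -10: return "incompatible number of layers or neurons per layer";
    case -11: return "line with record and maximum counts could not be read";
    case -12: return "a record or maximum count parameter was not provided";
    case -13: return "file indicates to have 0 data records";
    case -14: return "incompatible max neurons, inputs or outputs";
    case -15: return "a neuron record was not correct";
    case -16: return "error setting the bias of a neuron";
    case -17: return "number of inputs does not match a current neuron";
    case -18: return "error setting the weight of an input of a neuron";
    case -19: return "incomplete data for inputs of a neuron";
    case -20: return "incomplete data for the records in the header";
    case -21: return "incompatible number of inputs or outputs";
    default:  return "unknown error";
    }
}
```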
Usage
The following C code loads all the neural network knowledge from a file:
int lIntRes = NetKnowledgeLoadFromFile(0, 0, 1, 1);
NetKnowledgeSaveToFile
Purpose
Saves current knowledge of neural network, which is useful to load it later with NetKnowledgeLoadFromFile.
The file has the following characteristics:
- The file is internally a comma separated value (CSV) file.
- It can be opened by the NNViewer application, whose source code is provided.
- The file is named AnaimoAI.nnk and is created where the AnaimoAI.dll is.
- The content of the file is:
- 1st row: the version of the Anaimo AI SDK which created the file, in format YYYYMM. For example, for this version: 202203
- 2nd row: 5 integers indicating the number of:
- Input channels, for example 3 for RGB (red, green, blue) images.
- Rows for inputs. It is optional and therefore can be zero.
- Columns for inputs. It is optional and therefore can be zero.
- Rows for outputs. It is optional and therefore can be zero.
- Columns for outputs. It is optional and therefore can be zero.
- 3rd row is a string indicating the network architecture. This string contains as many tuples separated by ; as there are layers, each tuple containing values separated by , to indicate:
- The number of channels of the layer.
- The type of the layer. For more information, please read NetLayersTypeGet.
- The width of the layer.
- The height of the layer.
- The width stride of the layer, to convolve.
- The height stride of the layer, to convolve.
- The padding to convolve.
- 4th row:
- An integer indicating the number of layers.
- A comma separated string indicating the number of neurons per layer.
- 5th row:
- The total number of following records.
- The maximum number of neurons.
- The maximum number of inputs.
- The maximum number of outputs.
- The rest of the rows are of one of these types and have the following content:
- Type of record for a neuron. Containing:
- The number of the neuron.
- The bias.
- The number of inputs of this neuron. If this value is different than zero, then there will be as many records of the following type as inputs of this neuron.
- Type of record for an input of a neuron. Containing:
- The weight for the input of the neuron.
As you can see, in the 2nd row of the file, channels, columns and rows for inputs and outputs can be zero indicating that no specific visualization grid is recommended. Read the parameters section below for more information.
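As an illustration of the format, the following standalone C sketch parses a 3rd-row architecture string into the seven per-layer values listed above. The struct and function names are ours, not part of the SDK:

```c
#include <assert.h>
#include <stdio.h>
#include <string.h>

/* One tuple of the 3rd row: seven comma-separated integers per layer. */
typedef struct {
    int channels, type, width, height, strideW, strideH, padding;
} LayerInfo;

/* Hypothetical parser: fills up to maxLayers entries from a string such as
   "3,0,50,50,0,0,0;3,1,3,3,1,1,0" and returns the number of layers read. */
static int ParseArchitecture(const char *s, LayerInfo *out, int maxLayers)
{
    int n = 0;
    char buf[1024];
    strncpy(buf, s, sizeof(buf) - 1);
    buf[sizeof(buf) - 1] = '\0';
    for (char *tuple = strtok(buf, ";"); tuple != NULL && n < maxLayers;
         tuple = strtok(NULL, ";")) {
        LayerInfo *L = &out[n];
        if (sscanf(tuple, "%d,%d,%d,%d,%d,%d,%d", &L->channels, &L->type,
                   &L->width, &L->height, &L->strideW, &L->strideH,
                   &L->padding) == 7)
            n++;
    }
    return n;
}
```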
Declarations
Standard C:
extern int NetKnowledgeSaveToFile(int pIntNumberOfInputsChannels, int pIntIRows, int pIntICols, int pIntORows, int pIntOCols, int pIntPrecision);
MS Visual Studio:
extern int _cdecl NetKnowledgeSaveToFile(int pIntNumberOfInputsChannels, int pIntIRows, int pIntICols, int pIntORows, int pIntOCols, int pIntPrecision);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetKnowledgeSaveToFile(int pIntNumberOfInputsChannels, int pIntIRows, int pIntICols, int pIntORows, int pIntOCols, int pIntPrecision);
Parameters
Integers indicating the number of:
- Channels in the inputs (for example 3 for RGB [red, green, blue] images).
- Rows for inputs.
- Columns for inputs.
- Rows for outputs.
- Columns for outputs.
- Decimals for the floating-point numbers used to store the information. Fewer than 15 decimals can be enough to save knowledge, but saving exactly what is in memory requires at least 15 in this parameter.
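To see why the precision matters, the standalone snippet below round-trips a float through text: with too few significant digits the reread value differs, while 9 significant digits for floats (and 15 or more for doubles) reproduce the value exactly. This is standard C behaviour, independent of the SDK; the helper name is ours:

```c
#include <assert.h>
#include <stdio.h>
#include <stdlib.h>

/* Prints a float with the given number of significant digits and reads it
   back; returns 1 if the round-trip reproduces the value exactly. */
static int RoundTrips(float value, int digits)
{
    char buf[64];
    snprintf(buf, sizeof(buf), "%.*g", digits, value);
    return (float)strtod(buf, NULL) == value;
}
```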
Returns
An integer indicating:
- 0: the file was created successfully.
- 1: the file could not be created.
- 2: could not analyze current architecture of neural network.
Usage
The following C code saves the current knowledge into a file as indicated above:
int lIntRes = NetKnowledgeSaveToFile(3, 50, 50, 10, 1, 6);
NetLayersAnalyze
Purpose
Analyzes the current neural network and automatically builds the internal memory structure. This function will be automatically called internally when setting certain modes. For more information, please read the information about the different working modes.
Declarations
Standard C:
extern bool NetLayersAnalyze();
MS Visual Studio:
extern bool _cdecl NetLayersAnalyze();
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API bool _stdcall NetLayersAnalyze();
Parameters
None.
Returns
A boolean, indicating:
- true: success.
- false: failure, possibly out of memory.
Usage
The following C example analyzes the neural network and creates the internal memory structure:
NetLayersAnalyze();
NetLayersBiasGet
Purpose
Returns the value of the bias of an element of a layer.
Declarations
Standard C:
extern float NetLayersBiasGet(int pIntLayerNumber, int pIntIndex);
MS Visual Studio:
extern float _cdecl NetLayersBiasGet(int pIntLayerNumber, int pIntIndex);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall NetLayersBiasGet(int pIntLayerNumber, int pIntIndex);
Parameters
- The number of the layer, starting with 1 for the first layer, which is the inputs layer.
- The index of the element of the layer, starting with zero.
Returns
A floating-point number which is the bias.
Usage
The following C example obtains the bias of the 1st element of the 2nd layer:
float lSngBias = NetLayersBiasGet(2, 0);
NetLayersBiasSet
Purpose
Sets the value of the bias of an element of a layer.
Declarations
Standard C:
extern bool NetLayersBiasSet(int pIntLayerNumber, int pIntIndex, float pSng);
MS Visual Studio:
extern bool _cdecl NetLayersBiasSet(int pIntLayerNumber, int pIntIndex, float pSng);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API bool _stdcall NetLayersBiasSet(int pIntLayerNumber, int pIntIndex, float pSng);
Parameters
- The number of the layer, starting with 1 for the first layer, which is the inputs layer.
- The index of the element of the layer, starting with zero.
- The new value of the bias.
Returns
True if the bias was set correctly.
Usage
The following C example sets to 1.41 the bias of the 1st element of the 2nd layer:
NetLayersBiasSet(2, 0, 1.41);
NetLayersInfoGet
Purpose
Returns information of a layer.
Declarations
Standard C:
extern bool NetLayersInfoGet(int pIntLayerNumber, int* pIntChannels, int* pIntType, int* pIntWidth, int* pIntHeight, int* pIntStrideW, int* pIntStrideH, int* pIntPadding);
MS Visual Studio:
extern bool _cdecl NetLayersInfoGet(int pIntLayerNumber, int* pIntChannels, int* pIntType, int* pIntWidth, int* pIntHeight, int* pIntStrideW, int* pIntStrideH, int* pIntPadding);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API bool _stdcall NetLayersInfoGet(int pIntLayerNumber, int* pIntChannels, int* pIntType, int* pIntWidth, int* pIntHeight, int* pIntStrideW, int* pIntStrideH, int* pIntPadding);
Parameters
- The number of the layer of which to obtain information. The first layer is the inputs layer and is the layer number 1.
- A returned parameter indicating the number of channels (for example 3 for RGB [red, green, blue] images).
- The type of layer (for more information read the help of NetLayersTypeGet in this document).
- The width of the layer.
- The height of the layer.
- The stride in width to convolve.
- The stride in height to convolve.
- The padding to convolve.
Returns
A boolean, indicating:
- true: success.
- false: failure.
Usage
The following C example returns information about the 4th layer:
int lIntChannels, lIntType, lIntWidth, lIntHeight, lIntStrideW, lIntStrideH, lIntPadding;
NetLayersInfoGet(4, &lIntChannels, &lIntType, &lIntWidth, &lIntHeight, &lIntStrideW, &lIntStrideH, &lIntPadding);
NetLayersNeuronsNumberGet
Purpose
Returns the number of neurons in a layer.
Declarations
Standard C:
extern int NetLayersNeuronsNumberGet(int pIntLayerNumber);
MS Visual Studio:
extern int _cdecl NetLayersNeuronsNumberGet(int pIntLayerNumber);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetLayersNeuronsNumberGet(int pIntLayerNumber);
Parameters
The number of the layer. The first layer is the layer number 1 and is the inputs layer.
Returns
The number of neurons of the parameter layer.
Usage
The following C example gets the number of neurons in the 2nd layer of the neural network and puts it into a variable:
int lIntTmp = NetLayersNeuronsNumberGet(2);
NetLayersNumberGet
Purpose
Returns the number of layers in the current neural network, including the outputs but not the inputs.
Declarations
Standard C:
extern int NetLayersNumberGet();
MS Visual Studio:
extern int _cdecl NetLayersNumberGet();
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetLayersNumberGet();
Parameters
None.
Returns
The number of layers. This function is only available after calling NetLayersAnalyze.
Usage
The following C example gets the number of layers of the neural network and puts it into a variable:
int lIntTmp = NetLayersNumberGet();
NetLayersQuantityOfNeurons
Purpose
Returns the number of neurons that will be created for a layer.
Declarations
Standard C:
extern int NetLayersQuantityOfNeurons(int pIntChannels, int pIntType, int pIntSubWidth, int pIntSubHeight, int pIntSubPadding, int pIntChannelsOfPrevLayer, int pIntWidthOfPrevLayer, int pIntHeightOfPrevLayer, int* pIntChannelsResult, int* pIntLayerWidth, int* pIntLayerHeight, int* pIntSubStrideW, int* pIntSubStrideH);
MS Visual Studio:
extern int _cdecl NetLayersQuantityOfNeurons(int pIntChannels, int pIntType, int pIntSubWidth, int pIntSubHeight, int pIntSubPadding, int pIntChannelsOfPrevLayer, int pIntWidthOfPrevLayer, int pIntHeightOfPrevLayer, int* pIntChannelsResult, int* pIntLayerWidth, int* pIntLayerHeight, int* pIntSubStrideW, int* pIntSubStrideH);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetLayersQuantityOfNeurons(int pIntChannels, int pIntType, int pIntSubWidth, int pIntSubHeight, int pIntSubPadding, int pIntChannelsOfPrevLayer, int pIntWidthOfPrevLayer, int pIntHeightOfPrevLayer, int* pIntChannelsResult, int* pIntLayerWidth, int* pIntLayerHeight, int* pIntSubStrideW, int* pIntSubStrideH);
Parameters
- The number of channels (for example 3 for RGB [red, green, blue] images).
- The type of the layer. For more information, please read NetLayersTypeGet also in this manual.
- The width of the layer.
- The height of the layer.
- The padding used to convolve.
- The number of channels of the previous layer.
- The width of the previous layer.
- The height of the previous layer.
- Returning parameters with the resulting:
- Number of channels.
- Width of the layer (for convolutional layers, of the feature map).
- Height of the layer (for convolutional layers, of the feature map).
- Stride of width, to convolve.
- Stride of height, to convolve.
Returns
The number of neurons needed to store all the information of the layer.
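For convolutional layers the count follows the usual feature-map arithmetic: per dimension, out = (in - filter + 2 * padding) / stride + 1, multiplied across both dimensions and the number of channels. The sketch below applies that formula; it mirrors the expected result but is not the SDK's internal code:

```c
#include <assert.h>

/* Feature-map size along one dimension for a convolution. */
static int ConvOutSize(int in, int filter, int stride, int padding)
{
    return (in - filter + 2 * padding) / stride + 1;
}

/* Estimated neurons for a Conv2D layer: channels x outW x outH. */
static int ConvNeurons(int channels, int inW, int inH, int filterW,
                       int filterH, int strideW, int strideH, int padding)
{
    return channels * ConvOutSize(inW, filterW, strideW, padding)
                    * ConvOutSize(inH, filterH, strideH, padding);
}
```

For example, a 3×3 filter with stride 1 and no padding over 50×50 RGB inputs yields a 48×48 feature map per channel.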
Usage
The following C example obtains the number of neurons needed for a 3×3 convolutional 2D layer placed after the inputs layer, which holds RGB images of 50×50 pixels:
int lIntResultingChannels, lIntResultingWidth, lIntResultingHeight, lIntResultingStrideW, lIntResultingStrideH;
int lIntNumberOfNeurons = NetLayersQuantityOfNeurons(3, LAYERS_TYPE_Conv2D, 3, 3, 0, 3, 50, 50, &lIntResultingChannels, &lIntResultingWidth, &lIntResultingHeight, &lIntResultingStrideW, &lIntResultingStrideH);
NetLayersTypeGet
Purpose
Returns the type of the parameter layer.
Declarations
Standard C:
extern int NetLayersTypeGet(int pIntLayerNumber);
MS Visual Studio:
extern int _cdecl NetLayersTypeGet(int pIntLayerNumber);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetLayersTypeGet(int pIntLayerNumber);
Parameters
The number of the layer of which to obtain its type. The first layer is the inputs layer and is the layer number 1.
Returns
The type of the layer, according to the following list:
#define LAYERS_TYPE_Normal 0
#define LAYERS_TYPE_Conv2D 1
#define LAYERS_TYPE_MaxPooling 2
#define LAYERS_TYPE_AvgPooling 3
#define LAYERS_TYPE_SoftMax 4
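For logging, these constants can be mapped to readable names. The helper below is our own sketch, not part of the SDK:

```c
#include <assert.h>
#include <string.h>

#define LAYERS_TYPE_Normal     0
#define LAYERS_TYPE_Conv2D     1
#define LAYERS_TYPE_MaxPooling 2
#define LAYERS_TYPE_AvgPooling 3
#define LAYERS_TYPE_SoftMax    4

/* Hypothetical helper mapping a layer type constant to its name. */
static const char *LayerTypeName(int type)
{
    switch (type) {
    case LAYERS_TYPE_Normal:     return "Normal";
    case LAYERS_TYPE_Conv2D:     return "Conv2D";
    case LAYERS_TYPE_MaxPooling: return "MaxPooling";
    case LAYERS_TYPE_AvgPooling: return "AvgPooling";
    case LAYERS_TYPE_SoftMax:    return "SoftMax";
    default:                     return "Unknown";
    }
}
```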
Usage
The following C example obtains the type of the 3rd layer:
int lIntLayerType = NetLayersTypeGet(3);
NetLayersWeightGet
Purpose
Returns the value of the weight of an element of a layer.
Declarations
Standard C:
extern float NetLayersWeightGet(int pIntLayerNumber, int pIntIndex);
MS Visual Studio:
extern float _cdecl NetLayersWeightGet(int pIntLayerNumber, int pIntIndex);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall NetLayersWeightGet(int pIntLayerNumber, int pIntIndex);
Parameters
- The number of the layer, starting with 1 for the first layer, which is the inputs layer.
- The index of the element of the layer, starting with zero.
Returns
A floating-point number which is the weight.
Usage
The following C example obtains the weight of the 1st element of the 2nd layer:
float lSngWeight = NetLayersWeightGet(2, 0);
NetLayersWeightSet
Purpose
Sets the value of the weight of an element of a layer.
Declarations
Standard C:
extern bool NetLayersWeightSet(int pIntLayerNumber, int pIntIndex, float pSng);
MS Visual Studio:
extern bool _cdecl NetLayersWeightSet(int pIntLayerNumber, int pIntIndex, float pSng);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API bool _stdcall NetLayersWeightSet(int pIntLayerNumber, int pIntIndex, float pSng);
Parameters
- The number of the layer, starting with 1 for the first layer, which is the inputs layer.
- The index of the element of the layer, starting with zero.
- The new value of the weight.
Returns
True if the weight was set correctly.
Usage
The following C example sets to 1.41 the weight of the 1st element of the 2nd layer:
NetLayersWeightSet(2, 0, 1.41);
NetLearn
Purpose
Makes the neural network learn with the current inputs and outputs.
Declarations
Standard C:
extern int NetLearn(int pIntCyclesControl, bool pBolStopIfGradientsVanish, int pIntCurrentEpochNumber, bool* pBolGradientExploded, bool* pBolGradientsVanished, float* pSngEntropy);
MS Visual Studio:
extern int _cdecl NetLearn(int pIntCyclesControl, bool pBolStopIfGradientsVanish, int pIntCurrentEpochNumber, bool* pBolGradientExploded, bool* pBolGradientsVanished, float* pSngEntropy);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetLearn(int pIntCyclesControl, bool pBolStopIfGradientsVanish, int pIntCurrentEpochNumber, bool* pBolGradientExploded, bool* pBolGradientsVanished, float* pSngEntropy);
Parameters
- Control of cycles (for more information please view the introduction):
- 0: no cycles will be checked.
- 1: cycles will be checked and avoided.
- A boolean which indicates if back propagation should stop if all gradients of the outputs layer became zero with output target values different than zero.
- The current number of epoch, which must start with zero.
- A returning boolean indicating if, during forward propagation, a gradient became infinite. This only applies to layers of type SoftMax, and will indicate that the last step of SoftMax was not performed. This could indicate that the learning rate should be lowered.
- A returning boolean indicating if, during back propagation, all gradients became zero with output target values different than zero.
- A returning float containing the entropy.
Returns
An integer, indicating:
- 0: success.
- 1: learning process generated NaNs (Not a Number values), although it continued. This could mean that you should use Sigmoid as the activation function.
- 2: error managing threads.
- 3: the set has no records.
- 4: error analyzing layers.
- 5: parameters are incorrect:
- The number of epochs must be greater than zero.
- The number of batches must be greater than zero.
- The number of records for training must be greater than zero.
- The number of records for validation must not be negative.
- The number of records in the set must be greater than the number of batches.
- The threshold to consider active must be greater than zero.
- 6: error creating topology.
- 7: success of test while learning.
- 8: objective was not achieved and there are no more layers to try.
- 9: epochs finished.
- 10: could not initialize topology and start.
- 11: incorrect mode and/or topology.
- 12: number of inputs or outputs of set do not match those of the neural network.
- 13: out of memory.
- 14: could not augment the data set.
- 15: options finished.
Usage
The following C example makes the neural network learn, without considering cycles, that the current inputs generate the current outputs.
#define NetLearn_Success 0
#define NetLearn_NAN 1
#define NetLearn_ThreadsError 2
#define NetLearn_SetHasNoRecords 3
#define NetLearn_ErrorAnalyzingLayers 4
#define NetLearn_IndicatedParametersAreIncorrect 5
#define NetLearn_ErrorCreatingTopology 6
#define NetLearn_Success_Tested 7
#define NetLearn_Objective_Not_Achieved_And_No_More_Layers_To_Try 8
#define NetLearn_Epochs_Finished 9
#define NetLearn_Could_Not_Initialize_Topology_And_Start 10
#define NetLearn_Incorrect_Mode_And_Or_Topology 11
#define NetLearn_Number_Of_Inputs_Or_Outputs_Of_Set_Do_Not_Match_Neural_Network 12
#define NetLearn_Out_Of_Memory 13
#define NetLearn_Could_not_augment_data_set 14
#define NetLearn_Options_Finished 15
#define VALIDATION_TYPE_ByThreshold 0
#define VALIDATION_TYPE_ByMax 1
#define VALIDATION_TYPE_ByOutputs 2
bool lBolGradientExploded, lBolGradientsVanished;
float lSngEntropy;
NetLearn(0, true, 0, &lBolGradientExploded, &lBolGradientsVanished, &lSngEntropy);
NetLearningRateGet
Purpose
Returns a float with the current learning rate.
Declarations
Standard C:
extern float NetLearningRateGet();
MS Visual Studio:
extern float _cdecl NetLearningRateGet();
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall NetLearningRateGet();
Parameters
None
Returns
A float with the current learning rate.
Usage
The following C example gets the current learning rate and introduces it into a variable:
float lSngTmp = NetLearningRateGet();
NetLearningRateSet
Purpose
Sets the learning rate.
Declarations
Standard C:
extern void NetLearningRateSet(float pSng);
MS Visual Studio:
extern void _cdecl NetLearningRateSet(float pSng);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall NetLearningRateSet(float pSng);
Parameters
- The learning rate. It must be in the range of [0, 1]. If not set, it defaults to 0.5.
Returns
Nothing.
Usage
The following C example sets learning rate to 0.5.
NetLearningRateSet(0.5);
NetMemoryCacheGet (beta)
Purpose
Returns an integer with the size of the cache memory, in kilobytes (KB), that the neural network is considering. Normally, cache memory should be equal to or less than the level 3 cache of the computer executing the neural network. The memory cache is disabled if this parameter is set to zero.
This parameter can affect performance of the training process of the neural network, as it allows the SDK to optimize the use of the memory cache and to reduce memory bottlenecks.
This parameter is currently subject to additional testing (beta).
Declarations
Standard C:
extern int NetMemoryCacheGet();
MS Visual Studio:
extern int _cdecl NetMemoryCacheGet();
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetMemoryCacheGet();
Parameters
None.
Returns
An integer indicating the size of the cache memory in kilobytes (KB).
Usage
The following C example gets the size in kilobytes (KB) of the memory cache that the neural network is considering and puts it into a variable:
int lIntMemoryCache = NetMemoryCacheGet();
NetMemoryCacheSet (beta)
Purpose
Sets the size of the cache memory, in kilobytes (KB), that the neural network is considering. Normally, cache memory should be equal to or less than the level 3 cache of the computer executing the neural network. The memory cache is disabled if this parameter is set to zero.
This parameter can affect performance of the training process of the neural network, as it allows the SDK to optimize the use of the memory cache and to reduce memory bottlenecks.
This parameter is currently subject to additional testing (beta).
Declarations
Standard C:
extern void NetMemoryCacheSet(int pInt);
MS Visual Studio:
extern void _cdecl NetMemoryCacheSet(int pInt);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall NetMemoryCacheSet(int pInt);
Parameters
- The size, in kilobytes (KB), of the cache memory that the neural network should consider.
Returns
Nothing.
Usage
The following C example sets the size in kilobytes (KB) of the memory cache that the neural network will consider:
NetMemoryCacheSet(31000);
NetModeGet
Purpose
Returns an integer with the current working mode.
Declarations
Standard C:
extern int NetModeGet();
MS Visual Studio:
extern int _cdecl NetModeGet();
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetModeGet();
Parameters
None.
Returns
Returns an integer with the current working mode.
Usage
The following C example gets the current network mode and puts it into a variable:
#define MODE_NORMAL 0
#define MODE_STANDARD_BACKPROPAGATION 1
#define MODE_DYNAMIC_PROPAGATION 2
#define MODE_BACKPROP_EXPERIMENTAL 3
int lIntMode = NetModeGet();
NetModeSet
Purpose
Sets the current working mode. Changing the working mode could initialize the network, therefore, if
you want to keep the model you should save it before changing the mode.
Modes Standard back propagation and Dynamic propagation (beta) are only compatible with manual topology and therefore they need you to manually connect the neurons first.
Please note that:
- If you want to use manual topology with the faster mode Normal, you can do it by using manual topology to create the neural network and then, changing mode to Normal.
- Changing to Standard back propagation or Dynamic modes will automatically change topology to manual.
Declarations
Standard C:
extern void NetModeSet(int pInt);
MS Visual Studio:
extern void _cdecl NetModeSet(int pInt);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall NetModeSet(int pInt);
Parameters
- The desired working mode.
Returns
Nothing.
Usage
The following C example sets mode to dynamic.
#define MODE_NORMAL 0
#define MODE_STANDARD_BACKPROPAGATION 1
#define MODE_DYNAMIC_PROPAGATION 2
#define MODE_BACKPROP_EXPERIMENTAL 3
NetModeSet(MODE_DYNAMIC_PROPAGATION);
NetMomentumGet
Purpose
Returns the current momentum rate. A momentum rate different than 0 will add to the weights and biases the values of the changes from the previous training iteration, multiplied by this rate. A momentum of 0.1 adds 10% of the previous changes, whereas a momentum of 0.9 adds 90%.
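The effect can be written as a plain update rule: the change applied this iteration is the new gradient step plus the momentum rate times the previous change. The following standalone C sketch illustrates the arithmetic; it is a generic formulation, not the SDK's internal code:

```c
#include <assert.h>

/* One momentum update: returns the change applied to a weight this
   iteration, given the current gradient step, the change applied in the
   previous iteration, and the momentum rate. */
static float MomentumDelta(float gradientStep, float previousDelta,
                           float momentum)
{
    return gradientStep + momentum * previousDelta;
}
```

With momentum 0.1 and a previous change of 2.0, a gradient step of 1.0 becomes a change of 1.2 (the step plus 10% of the previous change).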
Declarations
Standard C:
extern float NetMomentumGet();
MS Visual Studio:
extern float _cdecl NetMomentumGet();
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall NetMomentumGet();
Parameters
None
Returns
A float with the current momentum rate.
Usage
The following C example gets the momentum and puts it into a variable:
float lSngTmp = NetMomentumGet();
NetMomentumSet
Purpose
Sets the momentum rate. It is only applied during training.
Declarations
Standard C:
extern void NetMomentumSet(float pSng);
MS Visual Studio:
extern void _cdecl NetMomentumSet(float pSng);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall NetMomentumSet(float pSng);
Parameters
- The momentum rate.
Returns
Nothing.
Usage
The following C example sets momentum to 0.9.
NetMomentumSet(0.9);
NetNeuronsAdd
Purpose
Adds a neuron to the neural network. You must use this call also for inputs and outputs. Inputs are indicated with the parameter value 1 in this call; outputs must first be created with this call and then qualified as outputs with NetOutputAdd.
Declarations
Standard C:
extern int NetNeuronsAdd(int pIntLayerNumber);
MS Visual Studio:
extern int _cdecl NetNeuronsAdd(int pIntLayerNumber);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetNeuronsAdd(int pIntLayerNumber);
Parameters
- The number of the layer where the neuron will be located, starting with 1 for the inputs. For the first layer (1), the inputs, this parameter is mandatory. For the rest of the layers (2, …), you can omit this parameter by indicating a zero value, but only if you are not using the fully connected topology.
Returns
The number of added neurons so far if the neuron was added, or zero if the neuron could not be added because of:
- Trying to add more neurons than the maximum indicated in NetCreate
- The topology is fully connected and the layer number was not indicated.
Usage
The following C code adds a neuron to the network.
NetNeuronsAdd(0);
NetNeuronsAddConsecutive
Purpose
Adds several neurons to the neural network. You must use this call also for inputs and outputs. Inputs are indicated with the parameter value 1 in this call; outputs must first be created with this call and then qualified as outputs with NetOutputAdd.
Please note that this function does not allow creating channels, for example to store images separated into RGB (red, green, blue) channels. If you want to create separate channels, use NetNeuronsAddLayer instead.
Declarations
Standard C:
extern int NetNeuronsAddConsecutive(int pIntLayerNumber, int pIntQuantity);
MS Visual Studio:
extern int _cdecl NetNeuronsAddConsecutive(int pIntLayerNumber, int pIntQuantity);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetNeuronsAddConsecutive(int pIntLayerNumber, int pIntQuantity);
Parameters
- The number of the layer where the neurons will be located, starting with 1 for the inputs. For the first layer (1), the inputs, this parameter is mandatory. For the rest of the layers (2, …), you can omit this parameter by indicating a zero value, but only if you are not using the fully connected topology.
- The quantity of neurons to add.
Returns
The id of the last added neuron if the neurons were added, or -1 if they could not be added because of:
- Trying to add more neurons than the maximum indicated in NetCreate
- The layer number was not indicated and was mandatory.
Usage
The following C code adds 100 input neurons to the network:
NetNeuronsAddConsecutive(1, 100);
NetNeuronsAddLayer
Purpose
Adds a new layer to the neural network.
Please note that convolutional layers will only work in mode MODE_NORMAL and topology TOPOLOGY_FULL_CONNECTED.
Declarations
Standard C:
extern int NetNeuronsAddLayer(int pIntLayerNumber, int pIntChannels, int pIntType, int pIntSubWidth, int pIntSubHeight, int pIntSubStrideW, int pIntSubStrideH, int pIntSubPadding);
MS Visual Studio:
extern int _cdecl NetNeuronsAddLayer(int pIntLayerNumber, int pIntChannels, int pIntType, int pIntSubWidth, int pIntSubHeight, int pIntSubStrideW, int pIntSubStrideH, int pIntSubPadding);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetNeuronsAddLayer(int pIntLayerNumber, int pIntChannels, int pIntType, int pIntSubWidth, int pIntSubHeight, int pIntSubStrideW, int pIntSubStrideH, int pIntSubPadding);
Parameters
- The number of the layer, starting with 1 for the inputs.
- The channels of the layer (for example 3 for RGB [red, green, blue] images).
- The type of the layer. For more information, please read NetLayersTypeGet in this document.
- The width of the layer. In case of convolutional layers this is the width of the filter.
- The height of the layer. In case of convolutional layers this is the height of the filter.
- The stride for width of the layer, to convolve, normally 1.
- The stride for height of the layer, to convolve, normally 1.
- The padding to convolve, normally zero.
Returns
The id of the last added neuron if the layer was added, or -1 if it could not be added because of:
- Trying to add more neurons than the maximum indicated in NetCreate
- The layer number was not indicated and was mandatory.
Usage
The following C code adds a second layer which is a complete convolutional layer of 3×3 to convolve RGB images:
NetNeuronsAddLayer(2, 3, LAYERS_TYPE_Conv2D, 3, 3, 1, 1, 0);
NetNeuronsAddedNumberGet
Purpose
Returns the number of neurons that have been added with NetNeuronsAdd to the neural network. The maximum number of neurons that can be added is determined by the function NetCreate.
Declarations
Standard C:
extern int NetNeuronsAddedNumberGet();
MS Visual Studio:
extern int _cdecl NetNeuronsAddedNumberGet();
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetNeuronsAddedNumberGet();
Parameters
None
Returns
An integer with the number of neurons that have been added.
Usage
The following C example gets the number of neurons that have been added and puts it into a variable:
int lIntTmp = NetNeuronsAddedNumberGet();
NetNeuronsMaxNumberGet
Purpose
Returns the maximum number of neurons that can be added with NetNeuronsAdd to the neural network.
This value is set by the function NetCreate.
Declarations
Standard C:
extern int NetNeuronsMaxNumberGet();
MS Visual Studio:
extern int _cdecl NetNeuronsMaxNumberGet();
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetNeuronsMaxNumberGet();
Parameters
None
Returns
An integer with the maximum number of neurons.
Usage
The following C example gets the maximum number of neurons that can be added and puts it into a variable:
int lIntTmp = NetNeuronsMaxNumberGet();
NetOutputAdd
Purpose
Adds an output neuron to the neural network.
Declarations
Standard C:
extern void NetOutputAdd(int pIntNeuron);
MS Visual Studio:
extern void _cdecl NetOutputAdd(int pIntNeuron);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall NetOutputAdd(int pIntNeuron);
Parameters
- Neuron number or id to become an output.
Returns
None.
Usage
The following C example makes the 100th neuron an output:
NetOutputAdd(99);
NetOutputPredict
Purpose
Computes the value of an output and returns it.
Declarations
Standard C:
extern float NetOutputPredict(int pIntOutput, int pIntCyclesControl, bool pBolUseSnapshot, bool* pBolGradientExploded);
MS Visual Studio:
extern float _cdecl NetOutputPredict(int pIntOutput, int pIntCyclesControl, bool pBolUseSnapshot, bool* pBolGradientExploded);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall NetOutputPredict(int pIntOutput, int pIntCyclesControl, bool pBolUseSnapshot, bool* pBolGradientExploded);
Parameters
- The number of the output.
- Control of cycles (for more information please view the introduction):
- 0: no cycles will be checked.
- 1: cycles will be checked and avoided.
- A boolean: false if the prediction must be performed based upon the current status of the neural network, or true to base it upon the snapshot. This allows obtaining predictions from a particular moment of the neural network while it continues learning with the set. This feature is only available for the fully connected topology, in normal or back propagation modes.
- A returning boolean indicating if, during forward propagation, a gradient became infinite. This only applies to layers of type SoftMax, and will indicate that the last step of SoftMax was not performed. This could indicate that the learning rate should be lowered.
Returns
A float.
Usage
The following C example computes the value of the 10th output and puts it into a float variable, without controlling cycles:
bool lBolGradientExploded;
float lSngTmp = NetOutputPredict(9, 0, false, &lBolGradientExploded);
NetOutputsAddedNumberGet
Purpose
Returns the number of outputs that have been added with NetOutputAdd to the neural network. The maximum number of outputs that can be added is determined by the function NetCreate.
Declarations
Standard C:
extern int NetOutputsAddedNumberGet();
MS Visual Studio:
extern int _cdecl NetOutputsAddedNumberGet();
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetOutputsAddedNumberGet();
Parameters
None.
Returns
An integer with the number of outputs that have been added.
Usage
The following C example gets the number of outputs that have been added and puts it into a variable:
int lIntTmp = NetOutputsAddedNumberGet();
NetOutputGet
Purpose
Gets the current value of an output.
Declarations
Standard C:
extern float NetOutputGet(int pIntOutput);
MS Visual Studio:
extern float _cdecl NetOutputGet(int pIntOutput);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall NetOutputGet(int pIntOutput);
Parameters
- The number of the output.
Returns
A float.
Usage
The following C example gets the current value of the 10th output and stores it into a variable:
float lSngOutputValue = NetOutputGet(9);
NetOutputSet
Purpose
Sets the value of an output.
Declarations
Standard C:
extern void NetOutputSet(int pIntOutput, float pSng);
MS Visual Studio:
extern void _cdecl NetOutputSet(int pIntOutput, float pSng);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall NetOutputSet(int pIntOutput, float pSng);
Parameters
- The number of the output.
- The new value.
Returns
Nothing.
Usage
The following C example sets the value of the 10th output to 0.5.
NetOutputSet(9, 0.5);
NetOutputsMaxNumberGet
Purpose
Returns the maximum number of outputs that can be added with NetOutputAdd to the neural network.
The maximum number of outputs that can be added is determined by the function NetCreate.
Declarations
Standard C:
extern int NetOutputsMaxNumberGet();
MS Visual Studio:
extern int _cdecl NetOutputsMaxNumberGet();
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetOutputsMaxNumberGet();
Parameters
None.
Returns
An integer.
Usage
The following C example gets the maximum number of outputs that can be added and puts it into a variable:
int lIntTmp = NetOutputsMaxNumberGet();
NetSetMatch
Purpose
Indicates, with true, if current channels, inputs and outputs of the data set match those of the current neural network.
Declarations
Standard C:
extern bool NetSetMatch();
MS Visual Studio:
extern bool _cdecl NetSetMatch();
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API bool _stdcall NetSetMatch();
Parameters
None.
Returns
True if current channels, inputs and outputs of the data set match those of the current neural network.
Usage
The following C example checks if the current data set matches the current neural network:
bool lBolSetAndNetMatch = NetSetMatch();
NetSnapshotGet
Purpose
Sets the current state of the neural network to exactly how it was when NetSnapshotTake was last called.
When using the full connected topology, a snapshot can be restored even if it was taken with a neural network that had a different number of neurons in only one layer. This is very useful, for example, to increase or decrease the number of outputs of the neural network without losing its knowledge.
Declarations
Standard C:
extern int NetSnapshotGet();
MS Visual Studio:
extern int _cdecl NetSnapshotGet();
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetSnapshotGet();
Parameters
None.
Returns
An integer, indicating:
- 0: success.
- -1: could not analyze neural network structure.
- -2: no snapshot was previously taken.
- -3: no neurons were added to the neural network.
- -4: there are changes, since the snapshot, in the number of neurons of 2 or more consecutive layers.
Usage
The following C example sets the neural network to how it was when NetSnapshotTake was called for the last time:
NetSnapshotGet();
NetSnapshotTake
Purpose
Saves the current state of the neural network to a snapshot in memory, to be recalled later by NetSnapshotGet.
When using the full connected topology, a snapshot can be restored even if it was taken with a neural network that had a different number of neurons in only one layer. This is very useful, for example, to increase or decrease the number of outputs of the neural network without losing its knowledge.
Declarations
Standard C:
extern int NetSnapshotTake();
MS Visual Studio:
extern int _cdecl NetSnapshotTake();
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetSnapshotTake();
Parameters
None.
Returns
An integer, indicating:
- 0: success.
- -1: could not analyze neural network structure.
- -2: could not create memory structure.
- -3: could not memorize internal neural network structure.
- -4: there are no neurons to snapshot.
Usage
The following C example saves the current state of the neural network to a snapshot in memory, to be recalled later by NetSnapshotGet:
NetSnapshotTake();
NetThreadsMaxNumberGet
Purpose
Returns the current maximum number of internal threads that will be used to train the neural network. It is set to 1 by default.
Declarations
Standard C:
extern int NetThreadsMaxNumberGet();
MS Visual Studio:
extern int _cdecl NetThreadsMaxNumberGet();
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetThreadsMaxNumberGet();
Parameters
None.
Returns
An integer indicating the current maximum number of threads.
Usage
The following C example puts into a variable the current maximum number of threads:
int lIntTmp = NetThreadsMaxNumberGet();
NetThreadsMaxNumberSet
Purpose
Sets the current maximum number of internal threads that will be used to train the neural network. It is set to 1 by default.
Declarations
Standard C:
extern void NetThreadsMaxNumberSet(int pInt);
MS Visual Studio:
extern void _cdecl NetThreadsMaxNumberSet(int pInt);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall NetThreadsMaxNumberSet(int pInt);
Parameters
An integer indicating the maximum number of threads.
Returns
Nothing.
Usage
The following C example sets the maximum number of threads that will be used to train the neural
network:
NetThreadsMaxNumberSet(8);
NetArchitectureCreate
Purpose
Creates the foundation of the neural network (like NetCreate) and the architecture of the neural network, based on a string which defines all the layers. Only supports the full connected topology. You do not need to call either NetCreate or NetInitialize before or after calling this function.
This function allows creating a complete neural network architecture with just a string, which is very useful for trying different architectures in search of the best model.
Declarations
Standard C:
extern int NetArchitectureCreate(unsigned long pLngBufferLength, const char* pStrBuffer, float pSngInitMean, float pSngInitVariance, float pSngInitPercentage);
MS Visual Studio:
extern int _cdecl NetArchitectureCreate(DWORD pLngBufferLength, LPCSTR pStrBuffer, float pSngInitMean, float pSngInitVariance, float pSngInitPercentage);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetArchitectureCreate(DWORD pLngBufferLength, LPCSTR pStrBuffer, float pSngInitMean, float pSngInitVariance, float pSngInitPercentage);
Parameters
- Length of the string which defines the layers.
- Pointer to the string which defines the layers. The string defining the layers can be of two types:
- Detailed (see example below): a string containing as many tuples, separated by ";", as layers, each tuple containing values separated by "," to indicate:
- The number of channels of the layer.
- The type of the layer. For more information, please read NetLayersTypeGet.
- The width of the layer.
- The height of the layer.
- The width stride of the layer, to convolve.
- The height stride of the layer, to convolve.
- The padding to convolve.
- Mnemotechnic (see example below):
- To indicate the inputs layer: the number of channels, then "I" (uppercase i), then the width of the inputs (the height must be the same).
- Rest of layers: the number of subelements, then TYPE, then the width (the height must be the same), where TYPE can be:
- C: convolutional 2D layer.
- P: max pooling layer.
- S: SoftMax layer.
- Initialization mean for the random numbers, normally 0.
- Initialization variance for the random numbers, normally 1.
- Percentage of the network that will be initialized, should be 1.
For example, to create a network architecture with these characteristics:
- 1st layer: 28 x 28 inputs with 1 channel (black and white).
- 2nd layer: 10 convolutional 2D filters of 3 x 3, with strides of 1 and no padding.
- 3rd layer: 10 convolutional 2D filters of 3 x 3, with strides of 1 and no padding.
- 4th layer: a max pool layer of 2 x 2, with strides of 1 and no padding.
- 5th layer: 20 convolutional 2D filters of 3 x 3, with strides of 1 and no padding.
- 6th layer: 20 convolutional 2D filters of 3 x 3, with strides of 1 and no padding.
- 7th layer: a max pool layer of 2 x 2, with strides of 1 and no padding.
- 8th layer: a fully connected (dense) layer of 128 neurons.
- 9th layer: a fully connected (dense) layer of 10 outputs with SoftMax function.
The detailed string of the indicated architecture is:
1,0,28,28,1,1,0;10,1,3,3,1,1,0;10,1,3,3,1,1,0;10,2,2,2,1,1,0;20,1,3,3,1,1,0;20,1,3,3,1,1,0;20,2,2,2,1,1,0;128;10,4,0,0,0,0,0
And the equivalent with mnemotechnic is:
1I28-10C3-10C3-P2-20C3-20C3-P2-128-10S
Returns
An integer, indicating:
- 0: success.
- 1: success, but license will expire in less than 30 days.
- 2: not licensed, as 1000 or more neurons were requested to be created and the neural network library is not licensed for the current hardware running it. In this case, you need to send Anaimo the 16 integers obtained with the HardwareId function to qualify for a licensed version.
- 3: out of memory, you are trying to create more neurons or connections than the memory of your device supports.
- 4: unknown error.
- 5: inputs number must be greater than outputs number.
- 6: this function is only available for full connected topology.
- 7: indicated parameters are incorrect.
- 8: reduction between layers must be greater than 1.
- 9: maximum number of neurons reached.
- 10: the number of neurons of the last layer is less than or equal to the number of needed outputs.
- 11: could not add neurons.
- 12: layers could not be analyzed.
Usage
The following C example will create in memory the neural network.
// see NetCreate for the definition of possible returning values
int lIntTmp = NetArchitectureCreate(strlen("1I28-10C3-10C3-P2-20C3-20C3-P2-128-10S"), "1I28-10C3-10C3-P2-20C3-20C3-P2-128-10S", 0, 1, 1);
NetArchitectureCreateAuto
Purpose
Creates the foundation of the neural network (like NetCreate) and the architecture of the neural network. Only supports the full connected topology. You do not need to call NetCreate before calling this function.
Declarations
Standard C:
extern int NetArchitectureCreateAuto(int pIntMaxNeurons, int pIntNumberOfInputs, int pIntNumberOfOutputs, int pIntMaxNumberOfLayers, float pSngInitMean, float pSngInitVariance, float pSngInitPercentage, int* pIntNumberOfLayers);
MS Visual Studio:
extern int _cdecl NetArchitectureCreateAuto(int pIntMaxNeurons, int pIntNumberOfInputs, int pIntNumberOfOutputs, int pIntMaxNumberOfLayers, float pSngInitMean, float pSngInitVariance, float pSngInitPercentage, int* pIntNumberOfLayers);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetArchitectureCreateAuto(int pIntMaxNeurons, int pIntNumberOfInputs, int pIntNumberOfOutputs, int pIntMaxNumberOfLayers, float pSngInitMean, float pSngInitVariance, float pSngInitPercentage, int* pIntNumberOfLayers);
Parameters
- Maximum number of neurons: this parameter is optional; you can pass 0 and it will create as many neurons as memory allows. If you set it, it acts as a limit and an error will be returned if the creation would exceed it.
- Maximum number of inputs.
- Maximum number of outputs.
- Maximum number of hidden layers including the layer for outputs.
- Initialization mean for the random numbers, normally 0.
- Initialization variance for the random numbers, normally 1.
- Percentage of the network that will be initialized, should be 1.
- Output parameter: returns the number of layers created (including the outputs layer).
Returns
An integer, indicating:
- 0: success.
- 1: success, but license will expire in less than 30 days.
- 2: not licensed, as 1000 or more neurons were requested to be created and the neural network library is not licensed for the current hardware running it. In this case, you need to send Anaimo the 16 integers obtained with the HardwareId function to qualify for a licensed version.
- 3: out of memory, you are trying to create more neurons or connections than the memory of your device supports.
- 4: unknown error.
- 5: inputs number must be greater than outputs number.
- 6: this function is only available for full connected topology.
- 7: indicated parameters are incorrect.
- 8: reduction between layers must be greater than 1.
- 9: maximum number of neurons reached.
- 10: the number of neurons of the last layer is less than or equal to the number of needed outputs.
- 11: could not add neurons.
- 12: layers could not be analyzed.
Usage
The following C example will create in memory the neural network.
// see NetCreate for the definition of possible returning values
int lIntTmp = NetArchitectureCreateAuto(0, pIntInputsNumber, pIntOutputsNumber, 3, 0, 1, 1, &lIntNumberOfLayers);
NetTopologyGet
Purpose
Returns an integer with the current topology of the neural network. There are these topologies available:
- 0: Manual. The topology, that is, the number of neurons, inputs, outputs, and their connections, is completely free and is the responsibility of the programmer.
- 1: Full connected. All neurons from one layer are connected to all neurons of the next layer.
Declarations
Standard C:
extern int NetTopologyGet();
MS Visual Studio:
extern int _cdecl NetTopologyGet();
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetTopologyGet();
Parameters
None.
Returns
An integer indicating:
- 0 for manual topology, where connections between neurons must be established programmatically with calls to NetConnect, NetConnectConsecutive or NetConnectLayer.
- 1 for full connected topology, where all neurons from one layer are automatically connected to all neurons of the next layer.
Usage
The following C example gets the current neural network topology and puts it into a variable:
#define TOPOLOGY_MANUAL 0
#define TOPOLOGY_FULL_CONNECTED 1
lIntTmp = NetTopologyGet();
NetTopologySet
Purpose
Sets the current topology of the neural network. There are these topologies available:
0: Manual. The topology, that is, the number of neurons, inputs, outputs, and their connections, is completely free and is the responsibility of the programmer.
1: Full connected. All neurons from one layer are connected to all neurons of the next layer.
Full connected topology is much faster and uses about 5% of the memory used by the manual topology, but manual topology allows you to connect neurons as you wish, which can be convenient for scientific and experimentation purposes.
Modes Standard back propagation and Dynamic propagation (beta) are only compatible with manual topology and therefore they need you to connect the neurons first.
Please note that:
- If you create the network in a topology different than manual, then, if you change to manual topology, you will have to connect neurons, or the network will not work.
- Modes Standard back propagation and Dynamic propagation can only be used in manual topology; therefore, they need you to connect the neurons first.
- If you want to use the faster mode Normal with manual topology, you can do so by using manual topology and Standard back propagation mode to create the neural network and then changing the mode to Normal.
The following function calls do not work in full connected topology:
- NetConnect
- NetConnectConsecutive
- NetConnectLayer
- NeuWeightUpdatedGet
- NeuValueUpdatedGet
- NetErrorGet
Declarations
Standard C:
extern void NetTopologySet(int pInt);
MS Visual Studio:
extern void _cdecl NetTopologySet(int pInt);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall NetTopologySet(int pInt);
Parameters
- The desired topology.
Returns
Nothing.
Usage
The following C example sets topology to full connected:
#define TOPOLOGY_MANUAL 0
#define TOPOLOGY_FULL_CONNECTED 1
NetTopologySet(TOPOLOGY_FULL_CONNECTED);
Neuron functions
A neuron is an entity, kept in memory, which internally has a value, a bias and a delta. Neurons can also be inputs and outputs. A neuron can be connected to any other neuron in the network.
NeuValueGet
Purpose
Returns the current value of a neuron.
Declarations
Standard C:
extern float NeuValueGet(int pIntNeuron);
MS Visual Studio:
extern float _cdecl NeuValueGet(int pIntNeuron);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall NeuValueGet(int pIntNeuron);
Parameters
- Number of neuron to obtain its current value.
Returns
A float.
Usage
The following C code puts into a float variable the current value of the 10th neuron:
float lSngTmp = NeuValueGet(9);
NeuGetValueUpdatedGet
Purpose
Returns the number of times that the value was calculated for a neuron, since these functions were called for the last time:
- NetCreate
- NetInitialize
Declarations
Standard C:
extern int NeuGetValueUpdatedGet(int pIntNeuron);
MS Visual Studio:
extern int _cdecl NeuGetValueUpdatedGet(int pIntNeuron);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NeuGetValueUpdatedGet(int pIntNeuron);
Parameters
- Number of neuron, starting with zero.
Returns
An integer.
Usage
The following C code puts into an integer variable the number of times that the 10th neuron’s value was updated:
int lIntTmp = NeuGetValueUpdatedGet(9);
NeuWeightGet
Purpose
Returns the current weight, bias or delta of a neuron.
Declarations
Standard C:
extern float NeuWeightGet(int pIntNeuron, int pIntInput, bool pBolDelta);
MS Visual Studio:
extern float _cdecl NeuWeightGet(int pIntNeuron, int pIntInput, bool pBolDelta);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall NeuWeightGet(int pIntNeuron, int pIntInput, bool pBolDelta);
Parameters
- Number of neuron, starting with zero.
- Number of input. Put zero here for the bias.
- False to obtain the current weight or true to obtain the delta that will be applied to the weight.
Returns
A float.
Usage
The following C code puts into a float variable the current value of the 10th neuron's 1st weight:
float lSngTmp = NeuWeightGet(9, 1, false);
NeuWeightUpdatedGet
Purpose
Returns the number of times that the weight was calculated for a neuron since the last call to the function NetInitialize.
Declarations
Standard C:
extern int NeuWeightUpdatedGet(int pIntNeuron, int pIntInput);
MS Visual Studio:
extern int _cdecl NeuWeightUpdatedGet(int pIntNeuron, int pIntInput);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NeuWeightUpdatedGet(int pIntNeuron, int pIntInput);
Parameters
- Number of neuron, starting with zero.
- Number of input, starting with 1.
Returns
An integer.
Usage
The following C code puts into an integer variable the number of times that the 1st weight was updated for 10th neuron:
int lIntTmp = NeuWeightUpdatedGet(9, 1);
NeuWeightSet
Purpose
Sets the current weight, bias or delta of a neuron.
Declarations
Standard C:
extern bool NeuWeightSet(int pIntNeuron, int pIntInput, float pSng, bool pBolDelta);
MS Visual Studio:
extern bool _cdecl NeuWeightSet(int pIntNeuron, int pIntInput, float pSng, bool pBolDelta);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API bool _stdcall NeuWeightSet(int pIntNeuron, int pIntInput, float pSng, bool pBolDelta);
Parameters
- Number of neuron, starting with zero.
- Number of weight. Put zero here for the bias.
- Weight value.
- False to set the current weight or true to set the delta that will be applied to the weight.
Returns
True if weight was set. False if neuron is an input or is outside of the total number of added neurons.
Usage
The following C code sets the 1st weight value of the 10th neuron:
NeuWeightSet(9, 1, 0.25, false);
NeuWeightsNumberGet
Purpose
Returns the number of weights of a neuron. If this function returns -2, it means that this neuron is not relevant for bias and weights and therefore its bias and weights should not be saved.
Declarations
Standard C:
extern int NeuWeightsNumberGet(int pIntNeuron);
MS Visual Studio:
extern int _cdecl NeuWeightsNumberGet(int pIntNeuron);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NeuWeightsNumberGet(int pIntNeuron);
Parameters
- Number of neuron.
Returns
An integer with the number of weights or:
- -1: error.
- -2: this neuron is not relevant for biases and weights.
Usage
The following C code puts into an integer variable the number of weights of the 10th neuron:
int lIntTmp = NeuWeightsNumberGet(9);
Set functions
If you want, as this is optional, you can create a set with your records. Then, you can train the network directly in memory from the set. This results in faster learning and requires fewer records.
The procedure is:
- Make sure the set is empty by calling SetFree.
- Every time you have set up the inputs and the outputs, call SetRecordInsert to create a new record in the set. If you want to speed up SetRecordInsert, and save time and computing power, you can call SetCreate before any SetRecordInsert so that the memory will be pre-reserved at once.
- At any moment, you can update the inputs and outputs of a record with SetRecordSet or delete a record by calling SetRecordDelete.
- When all the records that you want to use in the training have been created, then you must, for each epoch:
- SetLearnStart (once)
- SetLearnContinue (per record) or SetLearnConsecutive (per group of consecutive records).
- SetLearnEnd (once) to finish. This call is optional, but recommended in case you are using batches to learn, and needed in Dynamic propagation mode to allow the auto adaptation of the neural network.
Remember to use NetSnapshotTake to store the network configuration while the set is being used for training and, after SetLearnEnd, to use NetSnapshotGet to recall the best network configuration obtained during training.
The set also provides auto machine learning functions which automate certain tasks of a search of a learning model:
- SetAutoLearnStart
- SetAutoLearnContinue
- SetAutoLearnEnd
The set auto learn feature only works with MODE_NORMAL.
Please check a source code example to better understand the set and its great benefits.
SetAutoLearnContinue
Purpose
Continues the set auto learn feature. The function SetAutoLearnStart must be called before calling this function. The function SetAutoLearnEnd must be called at the end of the auto learn process.
Declarations
Standard C:
extern float SetAutoLearnContinue(int* pIntNumberOfLayers, int* pIntCurrentRecord, int* pIntCurrentEpochNumber, int *pIntAutoCurrentOption, bool pBolStopIfGradientsVanish, bool* pBolThereIsNaN, int* pIntStatus, bool* pBolGradientExploded, bool* pBolGradientsVanished, float* pSngEntropy);
MS Visual Studio:
extern float _cdecl SetAutoLearnContinue(int* pIntNumberOfLayers, int* pIntCurrentRecord, int* pIntCurrentEpochNumber, int *pIntAutoCurrentOption, bool pBolStopIfGradientsVanish, bool* pBolThereIsNaN, int* pIntStatus, bool* pBolGradientExploded, bool* pBolGradientsVanished, float* pSngEntropy);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall SetAutoLearnContinue(int* pIntNumberOfLayers, int* pIntCurrentRecord, int* pIntCurrentEpochNumber, int *pIntAutoCurrentOption, bool pBolStopIfGradientsVanish, bool* pBolThereIsNaN, int* pIntStatus, bool* pBolGradientExploded, bool* pBolGradientsVanished, float* pSngEntropy);
Parameters
- An integer which returns the number of layers that the function is trying to create. The actual number of layers of the current neural network may differ; to find out the real number of layers, please use NetLayersNumberGet.
- An integer which returns the current number of the record of the data set being learned.
- An integer which returns the current number of the epoch in the learning process.
- An integer which returns the current option (from the list of options defined by SetAutoLearnStart) being tested.
- A boolean which indicates if back propagation should stop if all gradients of the outputs layer became zero with output target values different than zero.
- A returned boolean which indicates with true if the training generated NaN (not a number) values.
- An integer which returns the status, following the constants related to NetLearn_Success.
- A returning boolean indicating if, during forward propagation, a gradient became infinite. This only applies to layers of type SoftMax, and will indicate that the last step of SoftMax was not performed. This could indicate that the learning rate should be lowered.
- A returning boolean indicating if, during back propagation, all gradients became zero with output target values different than zero.
- A returning float containing the entropy.
Returns
A float with the current success rate.
Usage
The following C example continues the set auto learn process:
float lSngSuccessRate = SetAutoLearnContinue(&lIntNumberOfLayers, &lIntCurrentRecord, &lIntCurrentEpochNumber, &lIntCurrentOption, true, &lBolThereIsNaN, &lIntStatus, &lBolGradientsExploded, &lBolGradientsVanished, &lSngEntropy);
SetAutoLearnEnd
Purpose
Finishes the set auto learn feature. In the mode AUTOLEARN_MODE_INFINITE_DATA, this function will restore the data set to the content it had when SetAutoLearnStart was called.
Functions SetAutoLearnStart and SetAutoLearnContinue must be called before calling this function.
Declarations
Standard C:
extern bool SetAutoLearnEnd();
MS Visual Studio:
extern bool _cdecl SetAutoLearnEnd();
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API bool _stdcall SetAutoLearnEnd();
Parameters
None.
Returns
True if no error.
Usage
The following C example finishes the set auto learn process:
bool lBolOk = SetAutoLearnEnd();
SetAutoLearnStart
Purpose
Starts the auto learn features of the set. This function must be called before calling SetAutoLearnContinue. The function SetAutoLearnEnd must be called at the end of the auto learn process.
There are 2 possible modes:
- AUTOLEARN_MODE_FINITE_DATA: this mode is recommended when we have a finite number of records. It tests different neural network architectures, training with the part of the data set reserved for training and testing with the part reserved for testing, for a number of epochs, and performs an early stop, moving to the next network architecture, if the trend of the success rate will not reach the target.
- AUTOLEARN_MODE_INFINITE_DATA: this mode is recommended when we expect an infinite quantity of data, for example, data coming from humans interacting with our system (also known as "lifelong learning"). In this mode, the data set will be altered until SetAutoLearnEnd is called. It tests a list of neural network architectures, together with a list of possible data augmentations, for a number of epochs, and provides the trend of the success rate and the error of the trend. The error of the trend indicates how likely it is that the real success rate could be temporarily much lower than the provided trend value.
Declarations
Standard C:
extern int SetAutoLearnStart(int pIntAutoMode, int pIntCyclesControl, int pIntMaxNeurons, int pIntEpochs, int pIntBatches, int pIntLayersStart, int pIntLayersEnd, int pIntNumberOfFirstRecordForTraining, int pIntNumberOfLastRecordForTraining, int pIntNumberOfWhichForValidation, float pSngThresholdForActive, float pSngDeviationPctTarget, float pSngTargetSuccessesOverRecords, int pIntTestType, int pIntTimeForTrend, int pIntItemsForTrend, int pIntIWidth, int pIntIHeight, int pIntNumberOfOptions, char** pNets, float* pLearningRates, float* pInitMeans, float* pInitVariances, char** pDropouts, int* pMinNumberOfRecords, int* pReturnNumberOfPredictions, int* pAugShapes, float* pAugRotationFrom, float* pAugRotationTo, float* pAugRotationStep, float* pAugBrightnessFrom, float* pAugBrightnessTo, float* pAugBrightnessStep, float* pAugContrastFrom, float* pAugContrastTo, float* pAugContrastStep, float* pAugGammaFrom, float* pAugGammaTo, float* pAugGammaStep, float* pAugMoveHorizontalFrom, float* pAugMoveHorizontalTo, float* pAugMoveHorizontalStep, float* pAugMoveVerticalFrom, float* pAugMoveVerticalTo, float* pAugMoveVerticalStep, float* pAugZoomFrom, float* pAugZoomTo, float* pAugZoomStep, float* pAugHueFrom, float* pAugHueTo, float* pAugHueStep, float* pAugSaturationFrom, float* pAugSaturationTo, float* pAugSaturationStep, int* pAugNumberOfRecordsPerRecord, float* pTrendFinal, float* pError);
MS Visual Studio:
extern int _cdecl SetAutoLearnStart(int pIntAutoMode, int pIntCyclesControl, int pIntMaxNeurons, int pIntEpochs, int pIntBatches, int pIntLayersStart, int pIntLayersEnd, int pIntNumberOfFirstRecordForTraining, int pIntNumberOfLastRecordForTraining, int pIntNumberOfWhichForValidation, float pSngThresholdForActive, float pSngDeviationPctTarget, float pSngTargetSuccessesOverRecords, int pIntTestType, int pIntTimeForTrend, int pIntItemsForTrend, int pIntIWidth, int pIntIHeight, int pIntNumberOfOptions, char** pNets, float* pLearningRates, float* pInitMeans, float* pInitVariances, char** pDropouts, int* pMinNumberOfRecords, int* pReturnNumberOfPredictions, int* pAugShapes, float* pAugRotationFrom, float* pAugRotationTo, float* pAugRotationStep, float* pAugBrightnessFrom, float* pAugBrightnessTo, float* pAugBrightnessStep, float* pAugContrastFrom, float* pAugContrastTo, float* pAugContrastStep, float* pAugGammaFrom, float* pAugGammaTo, float* pAugGammaStep, float* pAugMoveHorizontalFrom, float* pAugMoveHorizontalTo, float* pAugMoveHorizontalStep, float* pAugMoveVerticalFrom, float* pAugMoveVerticalTo, float* pAugMoveVerticalStep, float* pAugZoomFrom, float* pAugZoomTo, float* pAugZoomStep, float* pAugHueFrom, float* pAugHueTo, float* pAugHueStep, float* pAugSaturationFrom, float* pAugSaturationTo, float* pAugSaturationStep, int* pAugNumberOfRecordsPerRecord, float* pTrendFinal, float* pError);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall SetAutoLearnStart(int pIntAutoMode, int pIntCyclesControl, int pIntMaxNeurons, int pIntEpochs, int pIntBatches, int pIntLayersStart, int pIntLayersEnd, int pIntNumberOfFirstRecordForTraining, int pIntNumberOfLastRecordForTraining, int pIntNumberOfWhichForValidation, float pSngThresholdForActive, float pSngDeviationPctTarget, float pSngTargetSuccessesOverRecords, int pIntTestType, int pIntTimeForTrend, int pIntItemsForTrend, int pIntIWidth, int pIntIHeight, int pIntNumberOfOptions, char** pNets, float* pLearningRates, float* pInitMeans, float* pInitVariances, char** pDropouts, int* pMinNumberOfRecords, int* pReturnNumberOfPredictions, int* pAugShapes, float* pAugRotationFrom, float* pAugRotationTo, float* pAugRotationStep, float* pAugBrightnessFrom, float* pAugBrightnessTo, float* pAugBrightnessStep, float* pAugContrastFrom, float* pAugContrastTo, float* pAugContrastStep, float* pAugGammaFrom, float* pAugGammaTo, float* pAugGammaStep, float* pAugMoveHorizontalFrom, float* pAugMoveHorizontalTo, float* pAugMoveHorizontalStep, float* pAugMoveVerticalFrom, float* pAugMoveVerticalTo, float* pAugMoveVerticalStep, float* pAugZoomFrom, float* pAugZoomTo, float* pAugZoomStep, float* pAugHueFrom, float* pAugHueTo, float* pAugHueStep, float* pAugSaturationFrom, float* pAugSaturationTo, float* pAugSaturationStep, int* pAugNumberOfRecordsPerRecord, float* pTrendFinal, float* pError);
Parameters
- pIntAutoMode: the mode to be used, over:
- #define AUTOLEARN_MODE_FINITE_DATA 0
- #define AUTOLEARN_MODE_INFINITE_DATA 1
- pIntCyclesControl: control of cycles (for more information please view the introduction):
- 0: no cycles will be checked.
- 1: cycles will be checked and avoided.
- pIntMaxNeurons: maximum number of neurons. Put zero for no limit.
- pIntEpochs: number of epochs to learn.
- pIntBatches: number of batches to perform the learning. If this parameter is not zero, actions will be taken and execution will return after each batch.
- pIntLayersStart: number of layers with which to start creating neural networks. The minimum value is 2, although you can put zero in this parameter to ignore it and use the current neural network. See NetArchitectureCreateAuto in this document for more information.
- pIntLayersEnd: number of layers at which to stop creating neural networks. See NetArchitectureCreateAuto in this document for more information. You can put zero in this parameter to ignore it and use the current neural network.
- pIntNumberOfFirstRecordForTraining: number of the first record for training, starting at zero. Not necessary in the AUTOLEARN_MODE_INFINITE_DATA.
- pIntNumberOfLastRecordForTraining: number of the last record for training. Not necessary in the AUTOLEARN_MODE_INFINITE_DATA.
- pIntNumberOfWhichForValidation: number of records, from the range pIntNumberOfFirstRecordForTraining to pIntNumberOfLastRecordForTraining, which will be used for validation, not for test. The records of the data set which are not included in that range will be used for test. If this parameter is not zero, the learning rate will also be auto-adjusted.
- pSngThresholdForActive: float indicating which activation value means that an output is active.
- pSngDeviationPctTarget: float indicating which percentage of deviation in the activation value of an output will mean that the output is not active. This is an alternative to pSngThresholdForActive, put zero in this parameter if you are going to use a threshold for active.
- pSngTargetSuccessesOverRecords: a float in the range [0, 1] indicating the target percentage of successful predictions over the total records. In AUTOLEARN_MODE_FINITE_DATA, this parameter determines an early stop: learning stops when the trend of the success rate will not reach this target in the remaining epochs.
- pIntTestType: test type. For more information, please read SetLearnStart in this document.
- pIntTimeForTrend: time in seconds to analyze if the trend of the success rate is going to reach the target parameter pSngTargetSuccessesOverRecords. If indicated, leave the following parameter as zero.
- pIntItemsForTrend: the number of success rates to analyze if the trend of the success rate is going to reach the target parameter pSngTargetSuccessesOverRecords.
- The following parameters only apply to the mode AUTOLEARN_MODE_INFINITE_DATA:
- pIntIWidth: width of the inputs.
- pIntIHeight: height of the inputs.
- pIntNumberOfOptions: number of elements in the following lists.
- pNets: pointer to a list of strings with definitions of network architectures. Please read NetArchitectureCreate in this document for more information on detailed and mnemonic strings.
- pLearningRates: a list of floats with learning rates.
- pInitMeans: list of floats with the mean values to randomly initialize the neural network.
- pInitVariances: list of floats with the variance values to randomly initialize the neural network.
- pDropouts: pointer to a list of strings with the dropout rates of each layer, separated by commas (,).
- pMinNumberOfRecords: list of integers indicating, for each option, the number of records which will be stored in an internal data set, augmented, and then used as a big batch for learning. Please note that, inside this big batch, there can still be the learning batches indicated with the previous parameter. The difference is that the batch created by this parameter is designed to guarantee that the network learns enough different classes of records, while the previous batches are only intended to accumulate the adjustment of biases and weights in batches.
- pReturnNumberOfPredictions: list of integers with, for each option, the number of the most probable predictions that will be used to check success. For example: let us consider that the neural network is learning to classify 10 types of fruits. A value of 3 in this parameter will consider a success if a record of the dataset was an image of a banana and the neural network predicted banana as the 1st, 2nd or 3rd most probable output.
- For parameters from pAugShapes to pAugSaturationStep, please read the function SetImagesAugment also in this document.
- pAugNumberOfRecordsPerRecord: a list of integers returning the number of augmented records that will be created per record of the data set.
- pTrendFinal: a list of float numbers returning the final trend of success rate (percentage) calculated by this function.
- pError: a list of float numbers returning the final error of the measured success rates with respect to the trend calculated by this function. Errors are calculated as the average of the sum of the squares of the differences between success rates and the trend.
Returns
An integer, indicating the result:
- 0: success.
- 1: learning process generated NaN (not-a-number) values, although it continued. This could mean that you should use Sigmoid as the activation function.
- 2: error managing threads.
- 3: the set has no records.
- 4: error analyzing layers.
- 5: parameters are incorrect:
- The number of epochs must be greater than zero.
- The number of batches must be greater than zero.
- The number of records for training must be greater than zero.
- The number of records for validation must not be negative.
- The number of records in the set must be greater than the number of batches.
- The threshold to consider active must be greater than zero.
- 6: error creating topology.
- 7: success of test while learning.
- 8: objective was not achieved and there are no more layers to try.
- 9: epochs finished.
- 10: could not initialize topology and start.
- 11: incorrect mode and/or topology.
- 12: number of inputs or outputs of set do not match those of the neural network.
Usage
The following C example prepares the set auto learn feature to test different options to find the best learning model:
#define NetLearn_Success 0
#define NetLearn_NAN 1
#define NetLearn_ThreadsError 2
#define NetLearn_SetHasNoRecords 3
#define NetLearn_ErrorAnalyzingLayers 4
#define NetLearn_IndicatedParametersAreIncorrect 5
#define NetLearn_ErrorCreatingArchitecture 6
#define NetLearn_Success_Tested 7
#define NetLearn_Objective_Not_Achieved_And_No_More_Layers_To_Try 8
#define NetLearn_Epochs_Finished 9
#define NetLearn_Could_Not_Initialize_Architecture_And_Start 10
#define NetLearn_Incorrect_Mode_And_Or_Topology 11
#define NetLearn_Number_Of_Inputs_Or_Outputs_Of_Set_Do_Not_Match_Neural_Network 12
#define NetLearn_Out_Of_Memory 13
#define NetLearn_Could_not_augment_data_set 14
#define NetLearn_Options_Finished 15
#define NUM_EPOCHS 5
#define NEURONS_THRESHOLD_FOR_ACTIVE 0.5
#define NUMBER_OF_RECENT_PREDICTIONS_TO_COUNT_SUCCESS 100
if (SetAutoLearnStart(AUTOLEARN_MODE_INFINITE_DATA, 0, 0, NUM_EPOCHS, 0, 0, 0, 0, 0, 0, NEURONS_THRESHOLD_FOR_ACTIVE, 0, 0, TEST_TYPE_ByMax, 0, NUMBER_OF_RECENT_PREDICTIONS_TO_COUNT_SUCCESS, mIntImageWidth, mIntImageHeight, lAutoNets->count, lAutoNets->item, lLearningRates->item, lInitMeans->item, lInitVariances->item, lAutoDropouts->item, lMinNumberOfRecords->item, lReturnNumberOfPredictions->item, lAugShapes->item, lAugRotationFrom->item, lAugRotationTo->item, lAugRotationStep->item, lAugBrightnessFrom->item, lAugBrightnessTo->item, lAugBrightnessStep->item, lAugContrastFrom->item, lAugContrastTo->item, lAugContrastStep->item, lAugGammaFrom->item, lAugGammaTo->item, lAugGammaStep->item, lAugMoveHorizontalFrom->item, lAugMoveHorizontalTo->item, lAugMoveHorizontalStep->item, lAugMoveVerticalFrom->item, lAugMoveVerticalTo->item, lAugMoveVerticalStep->item, lAugZoomFrom->item, lAugZoomTo->item, lAugZoomStep->item, lAugHueFrom->item, lAugHueTo->item, lAugHueStep->item, lAugSaturationFrom->item, lAugSaturationTo->item, lAugSaturationStep->item, lAugNumberOfRecordsPerRecord->item, lTrendFinal->item, lError->item) == NetLearn_Success)
printf("Ready to start.\n");
SetCreate
Purpose
Creates the memory structure for a set of inputs and outputs for the neural network. This function is not strictly necessary, since each call to SetRecordInsert will also reserve memory for its record, but that is slower. Calling SetCreate once, before the multiple SetRecordInsert calls, reserves all the memory needed for the whole set at once and is therefore faster.
Please remember that SetFree must be called after SetCreate; the memory is not freed by any other function, so omitting the call will cause a memory leak.
Declarations
Standard C:
extern bool SetCreate(int pIntTotalNumberOfRecords, int pIntNumberOfInputsChannels, int pIntNumberOfInputs, int pIntNumberOfOutputs);
MS Visual Studio:
extern bool _cdecl SetCreate(int pIntTotalNumberOfRecords, int pIntNumberOfInputsChannels, int pIntNumberOfInputs, int pIntNumberOfOutputs);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API bool _stdcall SetCreate(int pIntTotalNumberOfRecords, int pIntNumberOfInputsChannels, int pIntNumberOfInputs, int pIntNumberOfOutputs);
Parameters
- Total number of records of the data set.
- The number of channels of the inputs (for example 3 for RGB [red, green, blue] images).
- Number of inputs.
- Number of outputs.
Returns
- True: memory was reserved successfully.
- False: out of memory.
Usage
The following C example reserves memory for a data set of 5000 records of 3 inputs channels of 100 inputs each and 1 output:
SetCreate(5000, 3, 100, 1);
SetFree
Purpose
Destroys the current set of records and frees the memory.
Declarations
Standard C:
extern void SetFree();
MS Visual Studio:
extern void _cdecl SetFree();
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall SetFree();
Parameters
None.
Returns
Nothing.
Usage
The following C example destroys the current set and frees the memory:
SetFree();
SetImagesAugment
Purpose
Helps the neural network generalize better when learning images. It takes a subset of consecutive records of the data set and generates new records, based on the originals, by varying different image parameters. This process is known as augmentation.
If you need to know the number of records that will be generated by this function before using it, call first SetImagesAugmentNumberGet.
This function will produce a vertical flip if requested to rotate 360 degrees.
Declarations
Standard C:
extern bool SetImagesAugment(int pIntFirstRecord, int pIntLastRecord, int pIntWidth, int pIntHeight, int pIntMaxNumberOfRecordsPerRecord, int pIntFilterShape, float pSngRotationFrom, float pSngRotationTo, float pSngRotationStep, float pSngBrightnessFrom, float pSngBrightnessTo, float pSngBrightnessStep, float pSngContrastFrom, float pSngContrastTo, float pSngContrastStep, float pSngGammaFrom, float pSngGammaTo, float pSngGammaStep, float pSngMoveHorizontalFrom, float pSngMoveHorizontalTo, float pSngMoveHorizontalStep, float pSngMoveVerticalFrom, float pSngMoveVerticalTo, float pSngMoveVerticalStep, float pSngZoomFrom, float pSngZoomTo, float pSngZoomStep, float pSngHueFrom, float pSngHueTo, float pSngHueStep, float pSngSaturationFrom, float pSngSaturationTo, float pSngSaturationStep);
MS Visual Studio:
extern bool _cdecl SetImagesAugment(int pIntFirstRecord, int pIntLastRecord, int pIntWidth, int pIntHeight, int pIntMaxNumberOfRecordsPerRecord, int pIntFilterShape, float pSngRotationFrom, float pSngRotationTo, float pSngRotationStep, float pSngBrightnessFrom, float pSngBrightnessTo, float pSngBrightnessStep, float pSngContrastFrom, float pSngContrastTo, float pSngContrastStep, float pSngGammaFrom, float pSngGammaTo, float pSngGammaStep, float pSngMoveHorizontalFrom, float pSngMoveHorizontalTo, float pSngMoveHorizontalStep, float pSngMoveVerticalFrom, float pSngMoveVerticalTo, float pSngMoveVerticalStep, float pSngZoomFrom, float pSngZoomTo, float pSngZoomStep, float pSngHueFrom, float pSngHueTo, float pSngHueStep, float pSngSaturationFrom, float pSngSaturationTo, float pSngSaturationStep);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API bool _stdcall SetImagesAugment(int pIntFirstRecord, int pIntLastRecord, int pIntWidth, int pIntHeight, int pIntMaxNumberOfRecordsPerRecord, int pIntFilterShape, float pSngRotationFrom, float pSngRotationTo, float pSngRotationStep, float pSngBrightnessFrom, float pSngBrightnessTo, float pSngBrightnessStep, float pSngContrastFrom, float pSngContrastTo, float pSngContrastStep, float pSngGammaFrom, float pSngGammaTo, float pSngGammaStep, float pSngMoveHorizontalFrom, float pSngMoveHorizontalTo, float pSngMoveHorizontalStep, float pSngMoveVerticalFrom, float pSngMoveVerticalTo, float pSngMoveVerticalStep, float pSngZoomFrom, float pSngZoomTo, float pSngZoomStep, float pSngHueFrom, float pSngHueTo, float pSngHueStep, float pSngSaturationFrom, float pSngSaturationTo, float pSngSaturationStep);
Parameters
- First record to be augmented.
- Last record to be augmented.
- Width of images (all images must have the same width).
- Height of images (all images must have the same height).
- Maximum number of records that will be generated per record. Put 0 to generate all possible records.
- Filter the image by a shape (rotations, horizontal and vertical moves, zoom, and shape filters are only possible when the width and the height of the image are equal).
- Each of the following parameters has three variants: a from parameter (the value to start at), a to parameter (the value to finish at), and a step parameter (the increment applied when going from the start value to the end value):
- Degrees to rotate images.
- Brightness.
- Contrast.
- Gamma.
- Horizontal move.
- Vertical move.
- Zoom.
- Hue.
- Saturation.
These parameters must be in the range of these constants:
#define AUGMENT_FILTER_SHAPE_ALL 0
#define AUGMENT_FILTER_SHAPE_CIRCLE 1
#define AUGMENT_BRIGHT_MIN -150.0
#define AUGMENT_BRIGHT_MAX 150.0
#define AUGMENT_BRIGHT_NORMAL 0.0
#define AUGMENT_CONTRAST_MIN -100.0
#define AUGMENT_CONTRAST_MAX 100.0
#define AUGMENT_CONTRAST_NORMAL 0.0
#define AUGMENT_GAMMA_MIN 0.01
#define AUGMENT_GAMMA_MAX 10.0
#define AUGMENT_GAMMA_NORMAL 1.0
#define AUGMENT_MOVE_MIN -100.0
#define AUGMENT_MOVE_MAX 100.0
#define AUGMENT_MOVE_NORMAL 0.0
#define AUGMENT_ZOOM_MIN 0.01
#define AUGMENT_ZOOM_MAX 10.0
#define AUGMENT_ZOOM_NORMAL 1.0
#define AUGMENT_HUE_MIN 0.0
#define AUGMENT_HUE_MAX 360.0
#define AUGMENT_HUE_NORMAL 0.0
#define AUGMENT_SATURATION_MIN 0.0
#define AUGMENT_SATURATION_MAX 10.0
#define AUGMENT_SATURATION_NORMAL 0.0
Returns
True if the images were created inside the data set; false if they could not be created, typically because of insufficient memory.
Usage
The following C function augments the data set or calculates the number of new augmented records that will be created. It is important to use the same function to calculate the number of augmented records and to augment them, to avoid putting different parameters in both calls. All “mSng…” variables are float module variables:
bool augmentRecordsPerRecord(bool pBolOnlyCalculate, bool pBolOnlyShape, int pIntRecordFrom, int pIntRecordTo, int *pIntNumberOfRecordsPerRecord)
{
if (pBolOnlyShape)
{
mSngRotationFrom = 0;
mSngRotationTo = 0;
mSngBrightnessFrom = AUGMENT_BRIGHT_NORMAL;
mSngBrightnessTo = AUGMENT_BRIGHT_NORMAL;
mSngContrastFrom = AUGMENT_CONTRAST_NORMAL;
mSngContrastTo = AUGMENT_CONTRAST_NORMAL;
mSngZoomFrom = AUGMENT_ZOOM_NORMAL;
mSngZoomTo = AUGMENT_ZOOM_NORMAL;
mSngHueFrom = AUGMENT_HUE_NORMAL;
mSngHueTo = AUGMENT_HUE_NORMAL;
mSngSaturationFrom = AUGMENT_SATURATION_NORMAL;
mSngSaturationTo = AUGMENT_SATURATION_NORMAL;
}
else
{
mSngRotationFrom = AUGMENTATION_ROTATION_FROM + mSngRotationStep * myRand(1);
/* mSngRotationTo keeps its previously configured value */
mSngBrightnessFrom = AUGMENTATION_BRIGHTNESS_FROM + mSngBrightnessStep * myRand(1);
/* mSngBrightnessTo keeps its previously configured value */
mSngContrastFrom = AUGMENTATION_CONTRAST_FROM + mSngContrastStep * myRand(1);
mSngHueFrom = AUGMENTATION_HUE_FROM + mSngHueStep * myRand(1);
mSngSaturationFrom = AUGMENTATION_SATURATION_FROM + mSngSaturationStep * myRand(1);
}
if (pBolOnlyCalculate)
{
if (mBolAugmentationEnabled)
(*pIntNumberOfRecordsPerRecord) = SetImagesAugmentNumberGet(pIntRecordFrom, pIntRecordTo, INPUTS_CHANNELS, mIntImageWidth, mIntImageHeight, 0, mIntFilterShape, mSngRotationFrom, mSngRotationTo, mSngRotationStep, mSngBrightnessFrom, mSngBrightnessTo, mSngBrightnessStep, mSngContrastFrom, mSngContrastTo, mSngContrastStep, mSngGammaFrom, mSngGammaTo, mSngGammaStep, mSngMoveHorizontalFrom, mSngMoveHorizontalTo, mSngMoveHorizontalStep, mSngMoveVerticalFrom, mSngMoveVerticalTo, mSngMoveVerticalStep, mSngZoomFrom, mSngZoomTo, mSngZoomStep, mSngHueFrom, mSngHueTo, mSngHueStep, mSngSaturationFrom, mSngSaturationTo, mSngSaturationStep);
else
(*pIntNumberOfRecordsPerRecord) = 1;
return(true);
}
else
return(SetImagesAugment(pIntRecordFrom, pIntRecordTo, mIntImageWidth, mIntImageHeight, 0, mIntFilterShape, mSngRotationFrom, mSngRotationTo, mSngRotationStep, mSngBrightnessFrom, mSngBrightnessTo, mSngBrightnessStep, mSngContrastFrom, mSngContrastTo, mSngContrastStep, mSngGammaFrom, mSngGammaTo, mSngGammaStep, mSngMoveHorizontalFrom, mSngMoveHorizontalTo, mSngMoveHorizontalStep, mSngMoveVerticalFrom, mSngMoveVerticalTo, mSngMoveVerticalStep, mSngZoomFrom, mSngZoomTo, mSngZoomStep, mSngHueFrom, mSngHueTo, mSngHueStep, mSngSaturationFrom, mSngSaturationTo, mSngSaturationStep));
}
SetImagesAugmentNumberGet
Purpose
Calculates the number of new records that will be created in the data set by the function SetImagesAugment.
Declarations
Standard C:
extern int SetImagesAugmentNumberGet(int pIntFirstRecord, int pIntLastRecord, int pIntNumberOfChannels, int pIntWidth, int pIntHeight, int pIntMaxNumberOfRecordsPerRecord, int pIntFilterShape, float pSngRotationFrom, float pSngRotationTo, float pSngRotationStep, float pSngBrightnessFrom, float pSngBrightnessTo, float pSngBrightnessStep, float pSngContrastFrom, float pSngContrastTo, float pSngContrastStep, float pSngGammaFrom, float pSngGammaTo, float pSngGammaStep, float pSngMoveHorizontalFrom, float pSngMoveHorizontalTo, float pSngMoveHorizontalStep, float pSngMoveVerticalFrom, float pSngMoveVerticalTo, float pSngMoveVerticalStep, float pSngZoomFrom, float pSngZoomTo, float pSngZoomStep, float pSngHueFrom, float pSngHueTo, float pSngHueStep, float pSngSaturationFrom, float pSngSaturationTo, float pSngSaturationStep);
MS Visual Studio:
extern int _cdecl SetImagesAugmentNumberGet(int pIntFirstRecord, int pIntLastRecord, int pIntNumberOfChannels, int pIntWidth, int pIntHeight, int pIntMaxNumberOfRecordsPerRecord, int pIntFilterShape, float pSngRotationFrom, float pSngRotationTo, float pSngRotationStep, float pSngBrightnessFrom, float pSngBrightnessTo, float pSngBrightnessStep, float pSngContrastFrom, float pSngContrastTo, float pSngContrastStep, float pSngGammaFrom, float pSngGammaTo, float pSngGammaStep, float pSngMoveHorizontalFrom, float pSngMoveHorizontalTo, float pSngMoveHorizontalStep, float pSngMoveVerticalFrom, float pSngMoveVerticalTo, float pSngMoveVerticalStep, float pSngZoomFrom, float pSngZoomTo, float pSngZoomStep, float pSngHueFrom, float pSngHueTo, float pSngHueStep, float pSngSaturationFrom, float pSngSaturationTo, float pSngSaturationStep);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall SetImagesAugmentNumberGet(int pIntFirstRecord, int pIntLastRecord, int pIntNumberOfChannels, int pIntWidth, int pIntHeight, int pIntMaxNumberOfRecordsPerRecord, int pIntFilterShape, float pSngRotationFrom, float pSngRotationTo, float pSngRotationStep, float pSngBrightnessFrom, float pSngBrightnessTo, float pSngBrightnessStep, float pSngContrastFrom, float pSngContrastTo, float pSngContrastStep, float pSngGammaFrom, float pSngGammaTo, float pSngGammaStep, float pSngMoveHorizontalFrom, float pSngMoveHorizontalTo, float pSngMoveHorizontalStep, float pSngMoveVerticalFrom, float pSngMoveVerticalTo, float pSngMoveVerticalStep, float pSngZoomFrom, float pSngZoomTo, float pSngZoomStep, float pSngHueFrom, float pSngHueTo, float pSngHueStep, float pSngSaturationFrom, float pSngSaturationTo, float pSngSaturationStep);
Parameters
- First record to be augmented.
- Last record to be augmented.
- Number of channels (for example 3 in case of RGB [red, green blue]).
- Width of images (all images must have the same width).
- Height of images (all images must have the same height).
- Maximum number of records that will be generated per record. Put 0 to generate all possible records.
- Filter the image by a shape (rotations, horizontal and vertical moves, zoom, and shape filters are only possible when the width and the height of the image are equal).
- Each of the following parameters has three variants: a from parameter (the value to start at), a to parameter (the value to finish at), and a step parameter (the increment applied when going from the start value to the end value):
- Degrees to rotate images.
- Brightness.
- Contrast.
- Gamma.
- Horizontal move.
- Vertical move.
- Zoom.
- Hue.
- Saturation.
These parameters must be in the range of the constants that you can read in this manual in the function SetImagesAugment.
Returns
The number of new augmented records that will be created in the data set.
Usage
The following C function augments the data set or calculates the number of new augmented records that will be created. It is important to use the same function to calculate the number of augmented records and to augment them, to avoid putting different parameters in both calls. All “mSng…” variables are float module variables:
bool augmentRecordsPerRecord(bool pBolOnlyCalculate, bool pBolOnlyShape, int pIntRecordFrom, int pIntRecordTo, int *pIntNumberOfRecordsPerRecord)
{
if (pBolOnlyShape)
{
mSngRotationFrom = 0;
mSngRotationTo = 0;
mSngBrightnessFrom = AUGMENT_BRIGHT_NORMAL;
mSngBrightnessTo = AUGMENT_BRIGHT_NORMAL;
mSngContrastFrom = AUGMENT_CONTRAST_NORMAL;
mSngContrastTo = AUGMENT_CONTRAST_NORMAL;
mSngZoomFrom = AUGMENT_ZOOM_NORMAL;
mSngZoomTo = AUGMENT_ZOOM_NORMAL;
mSngHueFrom = AUGMENT_HUE_NORMAL;
mSngHueTo = AUGMENT_HUE_NORMAL;
mSngSaturationFrom = AUGMENT_SATURATION_NORMAL;
mSngSaturationTo = AUGMENT_SATURATION_NORMAL;
}
else
{
mSngRotationFrom = AUGMENTATION_ROTATION_FROM + mSngRotationStep * myRand(1);
/* mSngRotationTo keeps its previously configured value */
mSngBrightnessFrom = AUGMENTATION_BRIGHTNESS_FROM + mSngBrightnessStep * myRand(1);
/* mSngBrightnessTo keeps its previously configured value */
mSngContrastFrom = AUGMENTATION_CONTRAST_FROM + mSngContrastStep * myRand(1);
mSngHueFrom = AUGMENTATION_HUE_FROM + mSngHueStep * myRand(1);
mSngSaturationFrom = AUGMENTATION_SATURATION_FROM + mSngSaturationStep * myRand(1);
}
if (pBolOnlyCalculate)
{
if (mBolAugmentationEnabled)
(*pIntNumberOfRecordsPerRecord) = SetImagesAugmentNumberGet(pIntRecordFrom, pIntRecordTo, INPUTS_CHANNELS, mIntImageWidth, mIntImageHeight, 0, mIntFilterShape, mSngRotationFrom, mSngRotationTo, mSngRotationStep, mSngBrightnessFrom, mSngBrightnessTo, mSngBrightnessStep, mSngContrastFrom, mSngContrastTo, mSngContrastStep, mSngGammaFrom, mSngGammaTo, mSngGammaStep, mSngMoveHorizontalFrom, mSngMoveHorizontalTo, mSngMoveHorizontalStep, mSngMoveVerticalFrom, mSngMoveVerticalTo, mSngMoveVerticalStep, mSngZoomFrom, mSngZoomTo, mSngZoomStep, mSngHueFrom, mSngHueTo, mSngHueStep, mSngSaturationFrom, mSngSaturationTo, mSngSaturationStep);
else
(*pIntNumberOfRecordsPerRecord) = 1;
return(true);
}
else
return(SetImagesAugment(pIntRecordFrom, pIntRecordTo, mIntImageWidth, mIntImageHeight, 0, mIntFilterShape, mSngRotationFrom, mSngRotationTo, mSngRotationStep, mSngBrightnessFrom, mSngBrightnessTo, mSngBrightnessStep, mSngContrastFrom, mSngContrastTo, mSngContrastStep, mSngGammaFrom, mSngGammaTo, mSngGammaStep, mSngMoveHorizontalFrom, mSngMoveHorizontalTo, mSngMoveHorizontalStep, mSngMoveVerticalFrom, mSngMoveVerticalTo, mSngMoveVerticalStep, mSngZoomFrom, mSngZoomTo, mSngZoomStep, mSngHueFrom, mSngHueTo, mSngHueStep, mSngSaturationFrom, mSngSaturationTo, mSngSaturationStep));
}
SetImagesDownsize
Purpose
Reduces the size of all the images (records) in the data set.
Declarations
Standard C:
extern bool SetImagesDownsize(int pIntCurrentWidth, int pIntCurrentHeight, int pIntNewWidth, int pIntNewHeight);
MS Visual Studio:
extern bool _cdecl SetImagesDownsize(int pIntCurrentWidth, int pIntCurrentHeight, int pIntNewWidth, int pIntNewHeight);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API bool _stdcall SetImagesDownsize(int pIntCurrentWidth, int pIntCurrentHeight, int pIntNewWidth, int pIntNewHeight);
Parameters
- The current width of the images of the data set.
- The current height of the images of the data set.
- The new width of the images of the data set, which must be lower than the current.
- The new height of the images of the data set, which must be lower than the current.
Returns
True if the operation succeeded.
Usage
The following C code resizes all the images (all the records) of the data set from 200 x 200 pixels to 50 x 50 pixels:
SetImagesDownsize(200, 200, 50, 50);
SetLearnContinue
Purpose
Makes the neural network learn one record from the set. Internally, it loads the inputs and outputs of the given record into memory and then runs the learning step.
Returns an estimation of the success in predictions of the current network.
Declarations
Standard C:
extern float SetLearnContinue(int pIntRecordNumber, bool pBolEstimateSuccess, int pIntCurrentEpochNumber, bool pBolStopIfGradientsVanish, bool* pBolThereIsNan, bool* pBolGradientExploded, bool* pBolGradientsVanished, float* pSngEntropy);
MS Visual Studio:
extern float _cdecl SetLearnContinue(int pIntRecordNumber, bool pBolEstimateSuccess, int pIntCurrentEpochNumber, bool pBolStopIfGradientsVanish, bool* pBolThereIsNan, bool* pBolGradientExploded, bool* pBolGradientsVanished, float* pSngEntropy);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall SetLearnContinue(int pIntRecordNumber, bool pBolEstimateSuccess, int pIntCurrentEpochNumber, bool pBolStopIfGradientsVanish, bool* pBolThereIsNan, bool* pBolGradientExploded, bool* pBolGradientsVanished, float* pSngEntropy);
Parameters
- The number of the record to learn from the data set.
- A boolean to indicate if this function must return an estimation of the success in predictions.
- The current epoch number, which must start at zero.
- A boolean which indicates if back propagation should stop if all gradients of the outputs layer became zero while the output target values are different than zero.
- A returned boolean which will be true if the training generated NaN (not-a-number) values.
- A returned boolean indicating if, during forward propagation, a gradient became infinite. This only applies to layers of type SoftMax, and indicates that the last step of SoftMax was not performed. This could indicate that the learning rate should be lowered.
- A returned boolean indicating if, during back propagation, all gradients became zero while the output target values are different than zero.
- A returned float containing the entropy.
Returns
Returns an estimation of the success in predictions of the current network. This estimation is based on how many records of the set had their outputs predicted correctly so far. Although quite precise, it is just an estimation; for the exact success ratio, the function SetSuccessGet should be used.
Usage
The following C example continues training with record number i, reports whether NaN (“Not a Number”) values occurred through a boolean variable named lBolThereIsNan, and obtains more information:
SetLearnContinue(i, true, 0, true, &lBolThereIsNan, &lBolGradientExploded, &lBolGradientsVanished, &lSngEntropy);
SetLearnConsecutive
Purpose
Makes the neural network learn exactly as SetLearnContinue does, but for all the consecutive records indicated in the parameters.
Returns an estimation of the success in predictions of the current network.
Declarations
Standard C:
extern float SetLearnConsecutive(int pIntRecordStart, int pIntRecordEnd, bool pBolEstimateSuccess, int pIntCurrentEpochNumber, bool pBolStopIfGradientsVanish,
bool* pBolThereIsNan, bool* pBolGradientExploded, bool* pBolGradientsVanished, float* pSngEntropy);
MS Visual Studio:
extern float _cdecl SetLearnConsecutive(int pIntRecordStart, int pIntRecordEnd, bool pBolEstimateSuccess, int pIntCurrentEpochNumber, bool pBolStopIfGradientsVanish, bool* pBolThereIsNan, bool* pBolGradientExploded, bool* pBolGradientsVanished, float* pSngEntropy);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall SetLearnConsecutive(int pIntRecordStart, int pIntRecordEnd, bool pBolEstimateSuccess, int pIntCurrentEpochNumber, bool pBolStopIfGradientsVanish, bool* pBolThereIsNan, bool* pBolGradientExploded, bool* pBolGradientsVanished, float* pSngEntropy);
Parameters
- The number of the start record to learn.
- The number of the end record to learn.
- A boolean to indicate if this function must return an estimation of the success in predictions.
- The current epoch number, which must start at zero.
- A boolean which indicates if back propagation should stop if all gradients of the outputs layer became zero while the output target values are different than zero.
- A returned boolean which will be true if the training generated NaN (not-a-number) values.
- A returned boolean indicating if, during forward propagation, a gradient became infinite. This only applies to layers of type SoftMax, and indicates that the last step of SoftMax was not performed. This could indicate that the learning rate should be lowered.
- A returned boolean indicating if, during back propagation, all gradients became zero while the output target values are different than zero.
- A returned float containing the entropy.
Returns
Returns an estimation of the success in predictions of the current network. This estimation is based on how many records of the set had their outputs predicted correctly so far. Although quite precise, it is just an estimation; for the exact success ratio, the function SetSuccessGet should be used.
Usage
The following C example continues training with all records from number i to number j, reports whether NaN (“Not a Number”) values occurred through a boolean variable named lBolThereIsNan, and obtains more information:
SetLearnConsecutive(i, j, false, 0, true, &lBolThereIsNan, &lBolGradientExploded, &lBolGradientsVanished, &lSngEntropy);
SetLearnStart
Purpose
Prepares the set for the self-learning process, performing:
- Creation of the necessary internal memory structures.
- Internal random repositioning of the data set records.
Please note that batches are only considered in MODE_NORMAL mode.
Declarations
Standard C:
extern int SetLearnStart(int pIntCyclesControl, int pIntBatches, int pIntNumberOfFirstRecordForTraining, int pIntNumberOfLastRecordForTraining, int pIntNumberOfWhichForValidation, float pSngThresholdForActive, float pSngDeviationPctTarget, int pIntTestType);
MS Visual Studio:
extern int _cdecl SetLearnStart(int pIntCyclesControl, int pIntBatches, int pIntNumberOfFirstRecordForTraining, int pIntNumberOfLastRecordForTraining, int pIntNumberOfWhichForValidation, float pSngThresholdForActive, float pSngDeviationPctTarget, int pIntTestType);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall SetLearnStart(int pIntCyclesControl, int pIntBatches, int pIntNumberOfFirstRecordForTraining, int pIntNumberOfLastRecordForTraining, int pIntNumberOfWhichForValidation, float pSngThresholdForActive, float pSngDeviationPctTarget, int pIntTestType);
Parameters
- Control of cycles (for more information please view the introduction):
- 0: no cycles will be checked.
- 1: cycles will be checked and avoided.
- An integer indicating the number of batches to process. The batch size is the number of training records of the data set divided by this parameter. Pass 0 to disable batches.
- An integer indicating the number of the first record of the data set that will be used for training.
- An integer indicating the number of the last record of the data set that will be used for training. This, and the previous parameter, define the subset that will be used for training. The rest of the records of the data set will be used for testing. The records for testing must be contiguous.
- An integer indicating the number of records, from the records for training indicated in the previous parameter, that will be used for validation. These records will be the ones at the end of the training subset.
- A float value indicating the threshold, typically in the range of [0, 1], to consider that an output is active (when its value is equal or greater than this parameter) or inactive.
- A float value indicating the target percentage of deviation, between the real value and the predicted value, to consider that a prediction is correct. This parameter is only used when the previous parameter threshold is zero.
- An integer to indicate:
- TEST_TYPE_ByThreshold: an output is considered active when its value is equal to or higher than the activation threshold; a record is considered a success when the active predicted outputs match exactly those indicated during learning.
- TEST_TYPE_ByMax: only the output with the highest activation value is considered active. Additionally, if the activation value of that output does not reach pSngThresholdForActive, it will be set to pSngThresholdForActive.
- TEST_TYPE_ByOutputs: same behavior as by threshold, but it counts how many outputs were predicted correctly with respect to the total outputs predicted, instead of counting correct and incorrect records of the set.
Returns
An integer, indicating:
- 0: success.
- 1: the learning process generated NaN ("Not a Number") values, although it continued. This could mean that you should use Sigmoid as the activation function.
- 2: error managing threads.
- 3: the set has no records.
- 4: error analyzing layers.
- 5: parameters are incorrect:
- The number of epochs must be greater than zero
- The number of batches must be greater than zero.
- The number of records for training must be greater than zero.
- The number of records for validation must not be negative.
- The number of records in the set must be greater than the number of batches.
- The threshold to consider active must be greater than zero.
- 6: error creating topology.
- 7: success of test while learning.
- 8: objective was not achieved and there are no more layers to try.
- 9: epochs finished.
- 10: could not initialize topology and start.
- 11: incorrect mode and/or topology.
- 12: the number of inputs or outputs of the set does not match those of the neural network.
Usage
The following C example starts the self-learning process using the previously memorized set. As it sets the threshold to zero and indicates a deviation target, it will most likely be used to learn value predictions, for example to forecast time series. It does not use cycle control, and batches are not used either:
#define NetLearn_Success 0
#define NetLearn_NAN 1
#define NetLearn_ThreadsError 2
#define NetLearn_SetHasNoRecords 3
#define NetLearn_ErrorAnalyzingLayers 4
#define NetLearn_IndicatedParametersAreIncorrect 5
#define NetLearn_ErrorCreatingTopology 6
#define NetLearn_Success_Tested 7
#define NetLearn_Objective_Not_Achieved_And_No_More_Layers_To_Try 8
#define NetLearn_Epochs_Finished 9
#define NetLearn_Could_Not_Initialize_Topology_And_Start 10
#define NetLearn_Incorrect_Mode_And_Or_Topology 11
#define NetLearn_Number_Of_Inputs_Or_Outputs_Of_Set_Do_Not_Match_Neural_Network 12
SetLearnStart(0, 0, lIntNumberOfTheFirstRecordForTraining, lIntNumberOfTheLastRecordForTraining, 0, lSngActivationPercentage, lSngActivationPercentage, lIntTestType);
SetLearnEnd
Purpose
Marks as finished the training based on the current memorized set of records. It sets the neural network into auto-adaptative mode only when in Dynamic propagation mode; in all other cases its use is optional, but recommended when using a batch size different from 0, in order to consolidate the learning.
It returns an estimation of the prediction success of the current network.
Declarations
Standard C:
extern float SetLearnEnd(int pIntCurrentEpochNumber);
MS Visual Studio:
extern float _cdecl SetLearnEnd(int pIntCurrentEpochNumber);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall SetLearnEnd(int pIntCurrentEpochNumber);
Parameters
The current epoch number.
Returns
Returns an estimation of the prediction success of the current network. This estimate is based on how many records of the set had their outputs predicted correctly so far. Although quite precise, it is only an estimate; for the exact success ratio, use the function SetSuccessGet.
Usage
The following C example finishes training with the memorized set and activates the auto-adaptative feature:
SetLearnEnd(0);
SetLoadFromFile
Purpose
Loads the data set from a file. This function will internally use SetFree and SetCreate, therefore, you do not need to call these other functions before calling it.
For the characteristics of the file, please read SetSaveToFile.
Declarations
Standard C:
extern int SetLoadFromFile(int* pIntNumberOfInputs, int* pIntNumberOfOutputs, int* pIntNumberOfRecords);
MS Visual Studio:
extern int _cdecl SetLoadFromFile(int* pIntNumberOfInputs, int* pIntNumberOfOutputs, int* pIntNumberOfRecords);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall SetLoadFromFile(int* pIntNumberOfInputs, int* pIntNumberOfOutputs, int* pIntNumberOfRecords);
Parameters
Three integers, returned by reference, indicating the number of:
- Inputs.
- Outputs.
- Records.
If the values provided are not high enough to store all the inputs and outputs, the function will recalculate them accordingly. Therefore, a 0 can be passed in each of them.
Returns
An integer indicating:
- 0: file was loaded successfully.
- 1: file could not be opened.
- 2: version could not be read.
- 3: second line could not be read.
- 4: header is not correct.
- 5: file indicates in header to have 0 data records.
- 6: file does not include complete data for the number of records indicated in the header.
Usage
The following C code loads the current data set from a file:
int lIntRes = SetLoadFromFile(&lIntNumberOfInputs, &lIntNumberOfOutputs, &lIntNumberOfRecords);
SetRecordDelete
Purpose
Deletes the parameter record number from the set. Calling this function reduces the in-memory size of the set to the current number of records, overriding the allocation made by the call to SetCreate.
Declarations
Standard C:
extern int SetRecordDelete(int pIntRecordNumber);
MS Visual Studio:
extern int _cdecl SetRecordDelete(int pIntRecordNumber);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall SetRecordDelete(int pIntRecordNumber);
Parameters
The record number to be deleted from the set, starting with 0 for the first record.
Returns
- An integer: indicating the number of records memorized since the last call to SetFree.
- -1: if error, typically, out of memory.
Usage
The following C example deletes the first record from the set and puts the total number of records memorized in an integer variable:
int lIntTmp = SetRecordDelete(0);
SetRecordInsert
Purpose
Inserts into the set, as a new record, the current inputs and outputs of the neural network. For faster operation, it is recommended to call SetCreate beforehand with the total number of records that you intend to create via SetRecordInsert. By calling SetCreate first, all the necessary memory is reserved at once, instead of per record by SetRecordInsert.
Declarations
Standard C:
extern int SetRecordInsert();
MS Visual Studio:
extern int _cdecl SetRecordInsert();
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall SetRecordInsert();
Parameters
None.
Returns
- An integer: indicating the number of records memorized since the last call to SetFree.
- -1: if error, typically, out of memory.
Usage
The following C example memorizes in the set, as a new record, the current inputs and outputs of the neural network, and puts the total number of records memorized in an integer variable:
int lIntTmp = SetRecordInsert();
SetRecordGet
Purpose
Updates the current inputs and outputs of the neural network with the ones in the parameter record of the set.
Declarations
Standard C:
extern bool SetRecordGet(int pIntRecordNumber);
MS Visual Studio:
extern bool _cdecl SetRecordGet(int pIntRecordNumber);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API bool _stdcall SetRecordGet(int pIntRecordNumber);
Parameters
The record number of the set, starting with 0 for the first record.
Returns
- True: if it got the record correctly.
- False: if there are not records in the set or the requested number of record is higher than the number of records in the set.
Usage
The following C example updates the current inputs and outputs of the network with the ones memorized in the 10th record of the set:
bool lBolTmp = SetRecordGet(9);
SetRecordOutputMaxGet
Purpose
Returns the number of the output of the record of the data set which has the maximum value (not the predicted value), and also returns the original position that this record had in the data set before being shuffled.
This function is useful to know which is the active output of a record from the data set after it has been shuffled by functions like SetLearnStart.
Declarations
Standard C:
extern int SetRecordOutputMaxGet(int pIntRecordNumber);
MS Visual Studio:
extern int _cdecl SetRecordOutputMaxGet(int pIntRecordNumber);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall SetRecordOutputMaxGet(int pIntRecordNumber);
Parameters
- The record number of the set, starting with 0 for the first record.
Returns
The number of the output which has the maximum value (not the predicted value), or one of the following errors:
- -1: inputs and outputs do not match between the network and the data set.
- -2: there was an error when reordering the data set.
Usage
The following C example gets, from the data set, and from the record in the 10th position, the number of the output which has the maximum value:
int lIntOutputNumberWithMaxValue = SetRecordOutputMaxGet(9);
SetRecordSet
Purpose
Updates the parameter record number of the set with the current inputs and outputs of the neural network. If the parameter record number does not exist, it will perform a SetRecordInsert.
Declarations
Standard C:
extern int SetRecordSet(int pIntRecordNumber);
MS Visual Studio:
extern int _cdecl SetRecordSet(int pIntRecordNumber);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall SetRecordSet(int pIntRecordNumber);
Parameters
The record number to be updated from the set, starting with 0 for the first record.
Returns
- 0: if it updated the record correctly.
- An integer: if it had to create a new record, indicating the number of records memorized since the last call to SetFree.
- -1: if error, typically, out of memory.
Usage
The following C example updates the first record of the set with the current inputs and outputs, and puts the total number of records memorized in an integer variable:
int lIntTmp = SetRecordSet(0);
SetRecordsNumberGet
Purpose
Returns the current number of records inside the data set.
Declarations
Standard C:
extern int SetRecordsNumberGet();
MS Visual Studio:
extern int _cdecl SetRecordsNumberGet();
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall SetRecordsNumberGet();
Parameters
None.
Returns
- The current number of records inside the data set.
Usage
The following C example puts into a variable the current number of records of the data set:
int lIntTmp = SetRecordsNumberGet();
SetRecordsNumberSet
Purpose
Sets the current number of records of the data set, provided the indicated number is lower than the current number of records.
This function is useful to update the data set with new records without having to call SetFree and SetCreate.
Setting the parameter of this function to zero does not free the memory of the data set; therefore, SetFree must still be called when you are done with the data set.
Declarations
Standard C:
extern void SetRecordsNumberSet(int pIntNumberOfRecords);
MS Visual Studio:
extern void _cdecl SetRecordsNumberSet(int pIntNumberOfRecords);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall SetRecordsNumberSet(int pIntNumberOfRecords);
Parameters
The new number of records of the data set.
Returns
Nothing.
Usage
The following C example sets the current number of records of the data set to 10:
SetRecordsNumberSet(10);
SetRecordsOrderSet
Purpose
Resets the order of the records in the data set to their original position.
This function is useful to restore the original position of the records of the data set after they have been shuffled by a function like SetLearnStart.
Declarations
Standard C:
extern bool SetRecordsOrderSet();
MS Visual Studio:
extern bool _cdecl SetRecordsOrderSet();
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API bool _stdcall SetRecordsOrderSet();
Parameters
None.
Returns
True if the operation succeeded.
Usage
The following C example restores the position of the records of the data set to their original position:
SetRecordsOrderSet();
SetRecordSuccess
Purpose
Checks if a record from the data set is currently predicted correctly, considering the current test type established by functions like SetLearnStart.
Declarations
Standard C:
extern bool SetRecordSuccess(int pIntRecordNumber, bool *pBolThereIsNan, int *pIntIdxMaxOutput, bool* pBolGradientExploded);
MS Visual Studio:
extern bool _cdecl SetRecordSuccess(int pIntRecordNumber, bool *pBolThereIsNan, int *pIntIdxMaxOutput, bool* pBolGradientExploded);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API bool _stdcall SetRecordSuccess(int pIntRecordNumber, bool *pBolThereIsNan, int *pIntIdxMaxOutput, bool* pBolGradientExploded);
Parameters
- The record number to be checked, starting with 0 for the first record.
- A returned parameter which will indicate if, when predicting, any not a number (NaN) was obtained.
- A returned parameter which will indicate the number of the output which had the maximum predicted value.
- A returning boolean indicating if, during forward propagation, a gradient became infinite. This only applies to layers of type SoftMax and will indicate that the last step of SoftMax was not performed. This could indicate that the learning rate should be lowered.
Returns
- True: if the predicted output matches with the active output of the record.
- False: otherwise.
Usage
The following C example checks if the 10th record of the set is predicted correctly. The result will be stored in a variable:
bool lBolTmp = SetRecordSuccess(9, &lBolThereIsNaN, &lIntIdxMaxOutput, &lBolGradientExploded);
SetRecordSwap
Purpose
Exchanges the content of 2 records of the data set.
Declarations
Standard C:
extern void SetRecordSwap(int pIntSrcRecordNumber, int pIntDstRecordNumber);
MS Visual Studio:
extern void _cdecl SetRecordSwap(int pIntSrcRecordNumber, int pIntDstRecordNumber);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall SetRecordSwap(int pIntSrcRecordNumber, int pIntDstRecordNumber);
Parameters
- The source record number to be swapped, starting with 0 for the first record.
- The destination record number to be swapped, starting with 0 for the first record.
Returns
Nothing.
Usage
The following C example swaps the content of the 10th and 11th records of the set:
SetRecordSwap(9, 10);
SetReorderTrainingAndTest
Purpose
Rearranges the data set so that a certain percentage of the records representing all the different outputs are placed at the end of the data set.
This function is useful because it allows you to:
- Create a data set by adding records without having to plan training and test records in advance.
- Separate the data set into the training and the test subsets with a single call, once all the records have been created.
Declarations
Standard C:
extern int SetReorderTrainingAndTest(float pSngPercentageForTesting, float pSngThresholdForActive);
MS Visual Studio:
extern int _cdecl SetReorderTrainingAndTest(float pSngPercentageForTesting, float pSngThresholdForActive);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall SetReorderTrainingAndTest(float pSngPercentageForTesting, float pSngThresholdForActive);
Parameters
- A float in the range [0, 1] indicating the percentage of records for testing.
- A float indicating the threshold value to consider that an output is active. Consider that this is affected by the defined test type.
Returns
An integer indicating the number of records that were rearranged. Subtracting this value from the total number of records gives the number of the first record for testing.
Usage
The following C example rearranges the data set so that 15% of the records are reserved for testing at the end of the data set. The number of records that were rearranged is also stored in a variable:
int lIntTotalNumberOfRecordsRearranged = SetReorderTrainingAndTest(0.15, 0.5);
SetSaveToFile
Purpose
Saves current data set in a file, which is useful to load it later with SetLoadFromFile or to visually examine the data set with, for example, the NNViewer application.
The file has the following characteristics:
- The file is internally a comma separated value (CSV) file.
- It can be opened by the NNViewer application, whose source code is provided.
- The file is named AnaimoAI.nnv and is created where the AnaimoAI.dll is.
- The content of the file is:
- 1st row: the version of the Anaimo AI SDK which created the file, in format YYYYMM. For example, for this version: 202203
- 2nd row: 7 integers indicating the number of:
- Inputs.
- Outputs.
- Rows for inputs. It is optional and therefore can be zero.
- Columns for inputs. It is optional and therefore can be zero.
- Rows for outputs. It is optional and therefore can be zero.
- Columns for outputs. It is optional and therefore can be zero.
- Number of following records of the set in this file. This number can be lower than the real number of records, so it is recommended to keep reading the file after this number of records has been reached.
- Rest of rows: for every record of the data set, in consecutive order, the values of all the inputs followed by all the outputs.
As you can see, in the 2nd row of the file, columns and rows for inputs and outputs can be zero indicating that no specific visualization grid is recommended for the data set of inputs and outputs. Read the parameters section below for more information.
Declarations
Standard C:
extern int SetSaveToFile(int pIntIRows, int pIntICols, int pIntORows, int pIntOCols, int pIntPrecision);
MS Visual Studio:
extern int _cdecl SetSaveToFile(int pIntIRows, int pIntICols, int pIntORows, int pIntOCols, int pIntPrecision);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall SetSaveToFile(int pIntIRows, int pIntICols, int pIntORows, int pIntOCols, int pIntPrecision);
Parameters
5 integers indicating the number of:
- Rows for inputs.
- Columns for inputs.
- Rows for outputs.
- Columns for outputs.
- Precision or number of digits after the decimal symbol. Put -1 here to ignore this parameter.
Put 0 in the rows and columns parameters above if you are not interested in storing any type of recommended visualization for the data.
Returns
An integer indicating:
- 0: the file was created successfully.
- 1: the file could not be created.
Usage
The following C code saves the current data set into a file as indicated above:
int lIntRes = SetSaveToFile(0, 0, 0, 0, -1);
SetSaveAddToFile
Purpose
Appends new records of the current data set to the file previously saved by SetSaveToFile. This function is much faster than SetSaveToFile because it only appends the new records to the file, but it requires SetSaveToFile to have been called beforehand.
For more information and the characteristics of the outputted file, please read SetSaveToFile.
Declarations
Standard C:
extern int SetSaveAddToFile(int pIntRecordToStartSaving, int pIntIRows, int pIntICols, int pIntORows, int pIntOCols, int pIntPrecision);
MS Visual Studio:
extern int _cdecl SetSaveAddToFile(int pIntRecordToStartSaving, int pIntIRows, int pIntICols, int pIntORows, int pIntOCols, int pIntPrecision);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall SetSaveAddToFile(int pIntRecordToStartSaving, int pIntIRows, int pIntICols, int pIntORows, int pIntOCols, int pIntPrecision);
Parameters
6 integers indicating:
- The number of the record from which to start saving in the file.
- Rows for inputs.
- Columns for inputs.
- Rows for outputs.
- Columns for outputs.
- Precision or number of digits after the decimal symbol. Put -1 here to ignore this parameter.
Put 0 in the rows and columns parameters above if you are not interested in storing any type of recommended visualization for the data.
Returns
An integer indicating:
- 0: the file was created successfully.
- 1: the file could not be created.
Usage
The following C code adds to the current file all the available records of the data set starting with record number 5:
int lIntRes = SetSaveAddToFile(5, 0, 0, 0, 0, -1);
SetSuccessGet
Purpose
Returns the current success rate in predictions based on the data set and the current neural network. The records of the data set and the test type used to check the predictions are those indicated for validation (if specified) or test in SetLearnStart or SetAutoStart.
Declarations
Standard C:
extern float SetSuccessGet(bool* pBolThereIsNan, bool* pBolGradientExploded);
MS Visual Studio:
extern float _cdecl SetSuccessGet(bool* pBolThereIsNan, bool* pBolGradientExploded);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall SetSuccessGet(bool* pBolThereIsNan, bool* pBolGradientExploded);
Parameters
- A returned parameter which informs with true if there were "Not a Number" (NaN) values in the outputs. Detecting NaN is very important; if this parameter returns true, the current training hyperparameters will probably not lead to success.
- A returning boolean indicating if, during forward propagation, a gradient became infinite. This only applies to layers of type SoftMax and will indicate that the last step of SoftMax was not performed. This could indicate that the learning rate should be lowered.
Returns
A float in the range [0, 1] indicating the success rate of the predictions of the current neural network. This success rate is calculated by comparing the expected outputs with the predicted outputs of the records being checked, and dividing the number of records where all outputs were correctly predicted by the total number of records checked.
Usage
The following C code puts into a float variable the current success rate of the current neural network when using the set:
#define TEST_TYPE_ByThreshold 0
#define TEST_TYPE_ByMax 1
#define TEST_TYPE_ByOutputs 2
float lSngTmp = SetSuccessGet(&lBolThereIsNan, &lBolGradientExploded);
Example: .NET\NNViewer
An example is very important to understand how the library can be used.
NNViewer is a complete, fully functional, easy-to-understand MS Windows example application provided with source code. It allows drawing inputs and outputs to train the neural network and seeing how it learns the patterns.
For this purpose, the source code of the NNViewer example is provided as an MS Visual Studio VB.NET project (the source code requires .NET 5.0 and MS Visual Studio 2022 or higher).
How does it work?
You can type the M key at any time to obtain help about the available commands. Most of them should be self-explanatory.
NNViewer has a graphical user interface showing, on the left side, a grid with the inputs, and on the right side, a grid with the outputs.
Every input and output (every cell of the grids) can have a value in the range from 0 to 1 inclusive. The value of each cell will be shown with a color, from white for 0 to black for 1. A value of a cell of 0.5 will be shown in grey.
The user can change the inputs at any time by clicking with the mouse on an input cell. The user can change the outputs in the same way, but only when the application is not in "thinking" mode. In "thinking" mode (after pressing the T key), the outputs will be colored by the neural network.
You can create and delete pages, clear them, and navigate through them. Once you have all the pages of inputs and outputs that you want, you can make the neural network learn them.
Once the neural network has learnt your pages, you can switch to “thinking” with the T key, and then the neural network will show you, on the outputs, which outputs it understands from the current inputs.
When the neural network is in "thinking" mode (T key), you can create a new page with the N key, draw new inputs, and see in real time the outputs that the neural network predicts.
Example files
Once you have created your pages of inputs and outputs, you can save them with the S key.
The application comes with 4 examples files that can be loaded. These files are internally CSV files but renamed to extension NNV:
- handwritten_28x28_50000_10000.nnv: the popular MNIST handwritten digits data set with 28×28 pixel images, 50k pages of training images and 10k pages of test images. Open it with 2 layers, a learning rate of 3 (L key), test type (V key) by maximum, and make it learn (A key) for 30 epochs with 5,000 batches (a batch size of 10 pages), 50,000 pages for training, and no auto-tuning of hyperparameters. This should result in about an 80% success rate in the first epoch.
- numbers.nnv: a few handwritten numbers, in low quality, to show how the neural network can learn with very few and poor-quality inputs and outputs.
- ranges.nnv: marks ranges of inputs and outputs at the same height. The purpose of this file is to show how the neural network learns to mark outputs at the same height as the inputs. Please note that 5% of the outputs are wrong, and that the neural network, in "thinking" mode, will mark the correct answer for the outputs. In other words, it will be detecting anomalies.
- symbols.nnv: a few pages of symbols, to show how the neural network can learn with very few inputs and outputs. Once learned (with the A key), you can create a new page (with N key), draw a similar symbol, and see how the neural network shows the output.
- xor.nnv: a very simple example of a XOR operation. Choose 3 layers of neural network (in this case, because there are not enough neurons for 3 layers, NNViewer will generate a hidden layer with the same number of neurons as inputs) and learn with 3,000 epochs; it will learn the operation.
Example: Windows\Python\Jupyter\Handwritten_Digits
Before opening the Jupyter file you need to:
- Install Anaconda for Windows; in our tests we used Anaconda 2023.03-1 downloaded from the official web page https://www.anaconda.com/
- From the Windows start menu, execute Anaconda Navigator, and Launch Jupyter Notebook to verify that you can open Jupyter notebooks in your computer.
- Additionally, you can open Jupyter Notebook quicker by executing, from the Windows start menu, Anaconda Prompt and then typing: jupyter notebook
- From the Jupyter notebook, navigate through your hard disk and open the file provided with the SDK: \Examples\Python\Jupyter_Handwritten_Digits\IpnybAnaimo.ipynb
- Once opened, you can execute cell by cell or the whole notebook with the menu Cell and Run all.
The code trains the neural network with 50k handwritten digits and then tests it with 10k different handwritten digits.
Layers created (including the inputs and outputs layers): 3
Creating set of records 1000 / 60000...
Creating set of records 2000 / 60000...
...
Creating set of records 60000 / 60000...
2023-06-28 09:39:24.088862 - Initiating learning...
2023-06-28 09:39:29.557353 - Epoch 0: Success rate: 91.26999974250793 %
2023-06-28 09:39:35.440770 - Epoch 1: Success rate: 92.96000003814697 %
2023-06-28 09:39:41.059279 - Epoch 2: Success rate: 93.33999752998352 %
2023-06-28 09:39:46.695869 - Epoch 3: Success rate: 92.8600013256073 %
2023-06-28 09:39:51.703843 - Epoch 4: Success rate: 94.0500020980835 %
2023-06-28 09:39:57.409123 - Epoch 5: Success rate: 93.88999938964844 %
2023-06-28 09:40:03.194934 - Epoch 6: Success rate: 94.09000277519226 %
2023-06-28 09:40:08.700750 - Epoch 7: Success rate: 94.55999732017517 %
2023-06-28 09:40:14.001746 - Epoch 8: Success rate: 94.66999769210815 %
2023-06-28 09:40:19.160740 - Epoch 9: Success rate: 94.62000131607056 %
2023-06-28 09:40:24.720267 - Epoch 10: Success rate: 94.34999823570251 %
2023-06-28 09:40:30.188844 - Epoch 11: Success rate: 94.6399986743927 %
2023-06-28 09:40:35.903739 - Epoch 12: Success rate: 94.92999911308289 %
2023-06-28 09:40:41.646582 - Epoch 13: Success rate: 94.83000040054321 %
2023-06-28 09:40:47.024246 - Epoch 14: Success rate: 94.60999965667725 %
2023-06-28 09:40:52.813525 - Epoch 15: Success rate: 94.9500024318695 %
2023-06-28 09:40:58.321525 - Epoch 16: Success rate: 94.8199987411499 %
2023-06-28 09:41:03.793734 - Epoch 17: Success rate: 94.65000033378601 %
2023-06-28 09:41:09.400702 - Epoch 18: Success rate: 95.02000212669373 %
2023-06-28 09:41:15.238250 - Epoch 19: Success rate: 94.91999745368958 %
2023-06-28 09:41:20.422186 - Epoch 20: Success rate: 94.84999775886536 %
2023-06-28 09:41:26.132038 - Epoch 21: Success rate: 95.169997215271 %
2023-06-28 09:41:31.708575 - Epoch 22: Success rate: 95.14999985694885 %
2023-06-28 09:41:36.972715 - Epoch 23: Success rate: 95.02999782562256 %
2023-06-28 09:41:42.391361 - Epoch 24: Success rate: 95.08000016212463 %
2023-06-28 09:41:47.966994 - Epoch 25: Success rate: 95.09000182151794 %
2023-06-28 09:41:53.446017 - Epoch 26: Success rate: 94.9899971485138 %
2023-06-28 09:41:59.323663 - Epoch 27: Success rate: 95.06000280380249 %
2023-06-28 09:42:04.690296 - Epoch 28: Success rate: 95.09999752044678 %
2023-06-28 09:42:09.867939 - Epoch 29: Success rate: 95.02999782562256 %
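The run above stops after 30 epochs, and the best success rate is not necessarily the last one printed. If you capture this console output, the best-performing epoch can be picked out with a short script. The sketch below is plain Python, independent of the SDK, and only assumes the log line format shown above:

```python
import re

# A few lines in the exact format produced by the training run above.
log_lines = [
    "2023-06-28 09:39:29.557353 - Epoch 0: Success rate: 91.26999974250793 %",
    "2023-06-28 09:41:26.132038 - Epoch 21: Success rate: 95.169997215271 %",
    "2023-06-28 09:42:09.867939 - Epoch 29: Success rate: 95.02999782562256 %",
]

def best_epoch(lines):
    """Return (epoch, success_rate) for the line with the highest rate."""
    pattern = re.compile(r"Epoch (\d+): Success rate: ([\d.]+) %")
    results = []
    for line in lines:
        m = pattern.search(line)
        if m:
            results.append((int(m.group(1)), float(m.group(2))))
    return max(results, key=lambda r: r[1])

print(best_epoch(log_lines))  # -> (21, 95.169997215271)
```

Applied to the full run above, this reports epoch 21 (95.17 %) as the best of the 30 epochs.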
Example: Windows\Python\MS Visual Studio\Handwritten_Digits
The NNViewer example solves the handwritten digits challenge (by opening the project handwritten_28x28_50000_10000.nnv). This Python example contains just the code needed to run on MS Windows and solve the same challenge. You can open the file PyAnaimo_hw.py with any Python editor, or open the file Python_Handwritten_Digits.sln with MS Visual Studio version 2022 or higher.
The output will be identical to that of the Jupyter notebook version.
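Since the Anaimo AI SDK is a native library, the Python examples reach it through ctypes. The fragment below sketches only the general loading pattern; the DLL file name and the error handling shown here are placeholders introduced for illustration, not the SDK's actual file names or API, so refer to PyAnaimo_hw.py for the real bindings:

```python
import ctypes
import os

def load_anaimo(dll_path):
    """Load the native Anaimo AI SDK library via ctypes.

    Hypothetical sketch: the real DLL name and exported symbols are
    defined in PyAnaimo_hw.py; nothing here is the actual SDK API.
    """
    if not os.path.exists(dll_path):
        raise FileNotFoundError(f"Anaimo library not found: {dll_path}")
    # ctypes.CDLL loads the shared library and exposes its C exports
    # as attributes of the returned object.
    return ctypes.CDLL(dll_path)
```

Checking for the file first gives a clearer error than the raw OSError that ctypes.CDLL raises when the path is wrong.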
Support
In case you need support, please contact us at support@anaimo.com or through the Support page on the web.
Change log
Date | Version | Change |
20/02/2023 | 2022-03 (B.010900) | Multiple improvements in several functions. |
23/02/2023 | 2022-03 (B.010900) | SetAutoLearnContinue now learns, in infinite mode, all records, not just those whose prediction was not in first place. |
05/03/2023 | 2022-03 (B.010900) | Weights are now saved in NNK files exactly as they are stored in memory. Previous NNK files will no longer work. |
06/03/2023 | 2022-03 (B.010900) | NeuBiasSet/Get and NeuDeltaGet have been removed. Everything is now managed with NeuWeightSet/Get. |
06/03/2023 | 2022-03 (B.010900) | NeuWeightUpdatedGet: the input parameter must now start at 1. |
28/06/2023 | 2022-03 (B.010900) | Added the Jupyter version of the Handwritten Digits example. |