Anaimo AI SDK User’s Guide
Artificial Intelligence to Turn Challenges into Benefits.
Version: 2024-01 (build 10010).
Copyright
This document and all its content are protected by international copyright laws and are the property of a private company whose legal name is ANAIMO SPAIN, S.L. (hereinafter “Anaimo”), registered in Spain with fiscal id B16943706. Modifying this document or removing any copyright notice is expressly prohibited without prior written permission from Anaimo. For more information, please contact https://anaimo.com
Introduction
The Anaimo AI SDK is a software library for creating, training, and using neural networks in any application developed in C/C++, C#, VB.NET and more. Its main advantages are:
- Wide range of functionalities:
- Create neural networks, from fully connected, to convolutional, to manually connected neurons.
- Easily create different neural network architectures using mnemonics.
- Augment images automatically.
- Reorder data sets automatically.
- Snapshot to make predictions based on previous knowledge.
- Optimized:
- For high speed with multiple cores, threads and SIMD.
- To consume the least memory possible.
- Runs on cloud or on-premises (Edge AI), and potentially on any hardware.
- Unique features:
- Dynamic outputs: the number of outputs can change without losing knowledge.
- Flexible topology: connect neurons automatically (fully connected) or fully manually, as you need.
- Auto adaptative: it can self-adapt its topology for faster and better performance, like the human brain.
This document constitutes the User’s Guide of the Anaimo AI SDK.
For more information or support, please visit or contact us at: https://anaimo.ai
License
Anaimo AI SDK can run with a:
- Free license: for neural networks with less than 1000 neurons.
- Paid license: for neural networks with 1000 or more neurons.
If you want to obtain a paid license, you need to provide Anaimo with the hardware id codes of the computer that will run the neural network. To obtain them, run the library on the computer you need to license and use the HardwareId public function to retrieve all the hardware identifiers.
Then please email these 16 hardware identifiers, with subject “License request”, to marketing@anaimo.com
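As a quick illustration, the following C sketch collects the 16 identifiers into a single colon-separated string (a complete, compilable example is given later under the HardwareId function):
char lStrIds[1024] = "";
char lStrTmp[64];
for (int i = 0; i < 4; i++) {
    for (int j = 0; j < 4; j++) {
        /* HardwareId returns one integer per row/column position */
        snprintf(lStrTmp, sizeof(lStrTmp), "%X", HardwareId(i, j));
        strcat(lStrIds, lStrTmp);
        if (!(i == 3 && j == 3))
            strcat(lStrIds, ":");
    }
}
/* lStrIds now holds the 16 identifiers to include in the license request */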
Installation
The Anaimo AI SDK comes in different versions:
Anaimo AI SDK Version | Anaimo AI SDK File
Windows (x32 & x64) |
General Release | AnaimoAI.dll
Optimized for computers with the AVX2 instruction set | AnaimoAI_AVX2.dll
Optimized for computers with the AVX512 instruction set | AnaimoAI_AVX512.dll
Linux |
General Release | libAnaimoAI_nn.so
For the NPUs (Neural Processing Units) version, please request support to Anaimo, indicating the target hardware of the NPU.
The Windows version of the Anaimo AI SDK requires the following component to be pre-installed:
- Microsoft’s Visual Studio C++ Redistributable (version 2015 or higher).
Linux version was compiled with: Ubuntu GLIBC 2.31-0ubuntu9.9.
Depending on the folder where the previous files are located, the following software must additionally be preinstalled before use:
- Intel folder: requires Intel’s Redistributable Libraries for Intel® C++ and Fortran 2020 Compilers for Windows.
If you try to execute software developed with the Anaimo AI SDK on a computer that does not have these cited components pre-installed, you will probably get an error indicating a “dll not found exception”.
For Python, only the General Release version has been tested.
When opening the source code of the NNViewer application (see later in this document), please remember that it requires .NET 5.0 and MS Visual Studio version 2022 or higher.
Neural Processing Units (NPUs, GPUs and other)
The SDK can run on devices like Neural Processing Units (NPU), Graphics Processing Units (GPU) and more. To activate this, use the function ComputeUnitSet to select the new device and afterwards call ComputeUnitGet to determine whether the change was successful, meaning that your device could be accessed.
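For example, the following C sketch (using the compute unit constants shown later in this guide) selects the NPU and then verifies that it is actually active:
#define COMPUTE_UNIT_CPU 0
#define COMPUTE_UNIT_NPU 1
ComputeUnitSet(COMPUTE_UNIT_NPU);              /* request the NPU */
if (ComputeUnitGet() != COMPUTE_UNIT_NPU) {
    /* the NPU could not be accessed; execution continues on the CPU */
}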
When using NPUs, please consider that:
- Only the function SetsLearn is currently running on NPUs.
- When running on an NPU, you cannot determine the maximum number of threads, and therefore functions like NetThreadsMaxNumberSet will have no effect.
- Function SetRecordsOrderSet cannot be used after SetsLearn.
- When calling SetLearnStart before using SetsLearn, do not repeat the index of network to be learned. In other words, all indices of the networks to be learned must be different.
- Only full connected topologies are currently supported.
Multiple neural networks and sets
The SDK allows you to create multiple neural networks, which can be useful for classifying with one network per class or, in general, in cases where multiple networks are required.
You can add a network using the NetsAdd function. To remove networks, you use the NetsRemove function, indicating the index of the network you want to remove. The indices of the created networks range from 0 to the total number of created networks minus one. If you want to know how many networks have been created so far, you can use the NetsNumberGet function.
Multiple sets can also be created and used from the networks. The equivalent functions SetsAdd, SetsRemove and SetsNumberGet can be used.
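As an illustration, the following C sketch adds two networks, queries how many exist and removes the first one (NetsAdd, NetsNumberGet and NetsRemove are documented later in this guide):
NetsAdd();                         /* creates network 0 */
NetsAdd();                         /* creates network 1 */
int lIntTotal = NetsNumberGet();   /* lIntTotal is now 2 */
NetsRemove(0);                     /* deletes the first network */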
Inputs and outputs
Inputs and outputs of the neural network are regular neurons. You indicate how many inputs your network has in the NetCreate function. After that, you use the function NetOutputAdd to indicate how many outputs your network has. Inputs are therefore the initial neurons, added until the number of inputs indicated in NetCreate is reached; there is no specific function to add inputs.
Values of inputs and outputs should be in the range [0, 1].
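For example, the following C sketch creates the base structure of a small network and sets two of its input values (NetCreate and NetInputSet are documented later in this guide; the outputs are then declared with NetOutputAdd):
/* Base structure for network 0: up to 100 neurons, 4 inputs, 2 outputs. */
int lIntRes = NetCreate(0, 100, 4, 2, true);
/* Outputs are then declared with NetOutputAdd (see its reference entry). */
/* Input values must be in the range [0, 1]: */
NetInputSet(0, 0, 0.5);   /* first input  */
NetInputSet(0, 3, 1.0);   /* fourth input */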
Topology
The calls NetTopologyGet and NetTopologySet allow you to get or set the desired topology (see the sketch after this list). Supported topologies are:
- Manual: neurons and their connections are established programmatically.
- Full connected: neurons per layer are established programmatically, but they are all automatically fully connected between layers.
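A minimal sketch is shown below. Please note that the numeric values of the topology constants and the exact parameter list of NetTopologySet are assumptions made for illustration only; check the reference entries of NetTopologyGet and NetTopologySet for the actual values:
/* Assumed constants and call form -- for illustration only. */
#define TOPOLOGY_MANUAL         0
#define TOPOLOGY_FULL_CONNECTED 1
NetTopologySet(0, TOPOLOGY_FULL_CONNECTED);   /* select the full connected topology */
int lIntTopology = NetTopologyGet(0);         /* read back the active topology */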
Control of cycles
Depending on the network topology, neurons can be interconnected so that they establish closed loops, for example when neuron A outputs to neuron B, which outputs back to neuron A. This can also happen across multiple levels, so cycles are not always easy to spot.
If your topology connects neurons in loops, the neural network could enter an infinite loop and hang, or even drain the resources of the computer. The functions which might experience this problem take a CyclesControl parameter, which must be set to 1 to avoid it.
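For example, NetErrorGet (documented later in this guide) takes such a parameter; passing 1 makes the SDK check and avoid cycles:
float lSngError = NetErrorGet(0, 1);   /* CyclesControl = 1: cycles are checked and avoided */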
Numbering of neurons
Neurons are numbered starting with 0. In general, this is applied to all other items (inputs, outputs, weights, etc.).
The set and the auto adaptative feature
One of the disadvantages of the learning process of an artificial neural network compared with a person is that the former needs thousands of records and iterations to learn what the latter can learn with just a few records and almost no iteration.
To reduce that difference, Anaimo AI developed:
- Set: all the records that are used to learn can be memorized inside the neural network.
- Self-training from the set: the neural network can use the set for self-training.
- Auto adaptation: once the self-training has finished, the neural network self-adapts its topology (the auto adaptative feature) to increase speed and save computational resources. This feature is only available in Dynamic propagation mode.
Only when using the set functionality, the Anaimo AI SDK will enter into the auto adaptative feature. The related Set functions are also explained in this guide.
Please note that the set cannot be changed (records can neither be added nor deleted) while the learning process is being applied on the set.
Modes
There are different working modes for the neural network. The mode mainly affects training, but it can also affect other operations. The following modes are available:
- Normal: fully optimized for learning and highly parallel computation.
- Standard back propagation:
- Calculates neurons outputs.
- Calculates deltas.
- Adjusts biases and weights.
- Dynamic propagation (beta): for all neurons, adjusts biases and weights by calculating neuron outputs and deltas when needed. This mode supports any network topology. This mode is more than 50% faster than standard back propagation, although it is still being validated in different use cases.
- Back propagation (beta): same as mode standard back propagation but optimized for parallel computation and high volume of neural networks and data.
Modes Standard back propagation and Dynamic propagation (beta) are only compatible with manual topology and therefore you need to manually connect the neurons first.
Modes Normal and Back propagation (beta) will automatically create an internal memory structure. You can also create this internal memory structure manually, every time you change the topology of the neural network, with the function NetLayersAnalyze, which will consume some time and memory. Once created, it will be automatically maintained during operation. In these modes, some functions (for example NeuWeightUpdatedGet and NetValueUpdatedGet) will not work, because their internal counters are not updated for efficiency; please see the documentation related to the mode (NetModeSet) later in this document.
Convolutional layers (Conv2D, MaxPool and AvgPool) and SoftMax are only supported in the mode Normal.
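As an illustrative sketch of selecting a mode (the numeric mode values and the exact parameter list of NetModeSet are assumptions here; please check the NetModeSet reference entry for the actual values):
/* Assumed constants and call form -- for illustration only. */
#define NET_MODE_NORMAL                    0
#define NET_MODE_STANDARD_BACK_PROPAGATION 1
#define NET_MODE_DYNAMIC_PROPAGATION       2
#define NET_MODE_BACK_PROPAGATION          3
NetModeSet(0, NET_MODE_NORMAL);   /* keep the first network in the default Normal mode */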
Performance
The neural network is highly optimized for speed and the use of the least memory possible. For the best results:
- Use the Normal mode (this is the default).
- Use the full connected topology (this is the default).
- Set the number of threads, normally equal to or less than the number of cores of your processor (see the sketch after this list).
- Set the memory cache, normally equal to or smaller than the size of the L3 cache memory of your processor (this feature is subject to additional testing: beta).
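A minimal sketch of the thread setting above is shown below; the exact parameter list of NetThreadsMaxNumberSet is an assumption here (the memory cache has its own setting function, not shown), so please check its reference entry:
/* Assumed call form -- for illustration only: limit the first network to 8 threads,
   for example on a processor with 8 physical cores. */
NetThreadsMaxNumberSet(0, 8);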
Disclaimer of use
The Anaimo AI SDK is a powerful tool capable of providing great benefits to its users. However, unlike with a conventional algorithm, and as with other Artificial Intelligences, humans cannot fully understand how it predicts its outputs. Because of this, we recommend validating the outputs before applying them to actions that could have potential negative impacts such as, without being an exhaustive list, ethical or safety problems. Therefore, ANAIMO WILL NOT ACCEPT, UNDER ANY CIRCUMSTANCE, ANY RESPONSIBILITY OR LIABILITY FOR ANY NEGATIVE IMPACT RELATED TO ANY USAGE OF ANAIMO’S PRODUCTS, as it is ultimately the sole responsibility of the user whether and how to use them.
For awareness and more information, please consult:
- https://www.codedbias.com/
Public functions
The following are the published available functions.
Common functions
ComputeUnitGet
Purpose
Returns the current active compute unit.
Declarations
Standard C:
extern int ComputeUnitGet();
MS Visual Studio:
extern int _cdecl ComputeUnitGet();
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall ComputeUnitGet();
Parameters
None.
Returns
An integer indicating the current active compute unit.
Usage
The following C example will obtain the current active compute unit:
#define COMPUTE_UNIT_CPU 0
#define COMPUTE_UNIT_NPU 1
int lIntComputeUnit = ComputeUnitGet();
ComputeUnitSet
Purpose
Sets the current active compute unit.
After calling this function, it is recommended to call ComputeUnitGet to verify that the NPU is usable.
Declarations
Standard C:
extern void ComputeUnitSet(int pIntComputeUnit);
MS Visual Studio:
extern void _cdecl ComputeUnitSet(int pIntComputeUnit);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall ComputeUnitSet(int pIntComputeUnit);
Parameters
- An integer indicating the desired compute unit, according to the constants shown in the usage example below.
Returns
Nothing.
Usage
The following C example will set the current active compute unit:
#define COMPUTE_UNIT_CPU 0
#define COMPUTE_UNIT_NPU 1
ComputeUnitSet(COMPUTE_UNIT_NPU);
HardwareId
Purpose
Provides 16 hardware identifiers, in 4 rows by 4 columns, which uniquely identify the computer running the neural network. This data must be sent to Anaimo to obtain a licensed version of the neural network library.
Declarations
Standard C:
extern int HardwareId(unsigned int pIntRow, unsigned int pIntCol);
MS Visual Studio:
extern int _cdecl HardwareId(unsigned int pIntRow, unsigned int pIntCol);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall HardwareId(unsigned int pIntRow, unsigned
int pIntCol);
Parameters
- Row [0,3]
- Column [0,3]
Returns
An integer.
Usage
The following C example will provide the hardware id of the machine running the neural network:
#include <stdio.h>
#include <string.h>
#include <stdbool.h>
#include "AnaimoAI_nn.h"

int main(int argc, char *argv[]) {
    char lStrTmp[1024] = "";
    char lStrTmp2[1024] = "";
    for (int i = 0; i < 4; i++) {
        for (int j = 0; j < 4; j++) {
            snprintf(lStrTmp, sizeof(lStrTmp), "%X", HardwareId(i, j));
            strcat(lStrTmp2, lStrTmp);
            if (!((i == 3) && (j == 3)))
                strcat(lStrTmp2, ":");
        }
    }
    printf("%s\n", lStrTmp2);
    return 0;
}
To compile the above program, you can execute:
g++ -Wall -O3 -o main main.cpp -I./ -L./ -l:libAnaimoAI_nn.a -lgomp -pthread -lm -fopenmp
Licensed
Purpose
Returns the status of the license of the library.
Declarations
Standard C:
extern int Licensed();
MS Visual Studio:
extern int _cdecl Licensed();
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall Licensed();
Parameters
None.
Returns
An integer with the status of the license. Please see the possible values in the method NetCreate.
Usage
The following C example will provide the status of the license:
int lIntLicenseStatus = Licensed();
Version
Purpose
Returns the version and build numbers of the library.
Declarations
Standard C:
extern int Version(int *pIntBuild);
MS Visual Studio:
extern int _cdecl Version(int *pIntBuild);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall Version(int *pIntBuild);
Parameters
A returned parameter indicating the build number.
Returns
An integer with the version number.
Usage
The following C example will provide the version and the build of the library:
int lIntBuild, lIntVersion = Version(&lIntBuild);
AutoLearnEnable
Purpose
Enable the AutoLearn functions. It must be enabled before creating the neural network. For more information, please refer to the “Set functions” section.
Declarations
Standard C:
extern void AutoLearnEnable(bool pBolEnable);
MS Visual Studio:
extern void _cdecl AutoLearnEnable(bool pBolEnable);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall AutoLearnEnable(bool pBolEnable);
Parameters
Whether the function is activated or not. Set ‘true’ to activate it.
Usage
The following C example activates the AutoLearn functions:
AutoLearnEnable(true);
Multiple neural network functions
A multiple neural network is a group of neural networks.
NetsAdd
Purpose
Adds a new neural network.
Declarations
Standard C:
extern int NetsAdd();
MS Visual Studio:
extern int _cdecl NetsAdd();
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetsAdd();
Parameters
None.
Returns
Nothing.
Usage
The following C example adds a new neural network:
NetsAdd();
NetsRemove
Purpose
Deletes a neural network.
Declarations
Standard C:
extern int NetsRemove(int pIntNet);
MS Visual Studio:
extern int _cdecl NetsRemove(int pIntNet);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetsRemove(int pIntNet);
Parameters
The index of the network you want to delete. For the first network, it will be 0. It is possible to delete all networks by iterating over the total number and deleting network 0 in each iteration.
Returns
Nothing.
Usage
The following C example deletes the first neural network:
NetsRemove(0);
NetsNumberGet
Purpose
Returns the total number of neural networks created so far.
Declarations
Standard C:
extern int NetsNumberGet();
MS Visual Studio:
extern int _cdecl NetsNumberGet();
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetsNumberGet();
Parameters
None.
Returns
The total number of neural networks.
Usage
The following C example stores the total number of networks in a variable:
int lIntTotalNumberOfNets = NetsNumberGet();
Network functions
A network is a group of neurons.
NetActivationGet
Purpose
Returns the activation function to be used.
Declarations
Standard C:
extern int NetActivationGet(int pIntNet, int pIntLayerNumber);
MS Visual Studio:
extern int _cdecl NetActivationGet(int pIntNet, int pIntLayerNumber);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetActivationGet(int pIntNet, int pIntLayerNumber);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Layer number from which to get the activation function. The inputs layer is the first layer and is number 1. This parameter can be 0, in which case the default used for layers created in the future is returned.
Returns
Activation function, with possible values of:
- 0: for Sigmoid.
- 1: for ReLU.
- 2: for fast Sigmoid.
It can also return -1 if the parameter layer number is not correct.
Usage
The following C example gets the activation function which will be used for layers created in the future:
#define ACTIVATION_F_Sigmoid 0
#define ACTIVATION_F_ReLU 1
#define ACTIVATION_F_FastSigmoid 2
int lIntActivation = NetActivationGet(0, 0);
NetActivationSet
Purpose
Selects the activation function to be used.
Declarations
Standard C:
extern void NetActivationSet(int pIntNet, int pIntActivationFunction, int pIntLayerNumber);
MS Visual Studio:
extern void _cdecl NetActivationSet(int pIntNet, int pIntActivationFunction, int pIntLayerNumber);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall NetActivationSet(int pIntNet, int pIntActivationFunction, int pIntLayerNumber);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Activation function, with possible values of:
- 0: for Sigmoid.
- 1: for ReLU.
- 2: for fast Sigmoid.
- Layer number to apply the activation function. The inputs layer is the first layer and is number 1. This parameter can be 0 and then it will set the default for layers created in the future.
Returns
Nothing.
Usage
The following C example will set ReLU as the activation function for layers created in the future:
#define ACTIVATION_F_Sigmoid 0
#define ACTIVATION_F_ReLU 1
#define ACTIVATION_F_FastSigmoid 2
NetActivationSet(0, ACTIVATION_F_ReLU, 0);
NetConnect
Purpose
Connects two neurons.
Declarations
Standard C:
extern bool NetConnect(int pIntNet, int pIntSrc, int pIntDst);
MS Visual Studio:
extern bool _cdecl NetConnect(int pIntNet, int pIntSrc, int pIntDst);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API bool _stdcall NetConnect(int pIntNet, int pIntSrc, int pIntDst);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Number of the source neuron.
- Number of the destination neuron.
Returns
True if connection was successful.
Usage
The following C example connects the 10th neuron of the first network as an input of the 20th neuron and stores in a boolean variable true if the connection succeeded, or false if there was an error:
bool lBolTmp = NetConnect(0, 9, 19);
NetConnectConsecutive
Purpose
Connects a consecutive range of neurons to another single neuron. Its purpose is to speed up the connection of a high number of neurons to a destination neuron.
NetConnectConsecutive behaves like NetConnect when pIntSrc1 equals pIntSrc2.
Declarations
Standard C:
extern bool NetConnectConsecutive(int pIntNet, int pIntSrc1, int pIntSrc2, int pIntDst);
MS Visual Studio:
extern bool _cdecl NetConnectConsecutive(int pIntNet, int pIntSrc1, int pIntSrc2, int pIntDst);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API bool _stdcall NetConnectConsecutive(int pIntNet, int pIntSrc1, int pIntSrc2, int pIntDst);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Number of the source initial neuron.
- Number of the source final neuron.
- Number of the destination neuron.
Note that all neurons from initial to final will be connected to the destination neuron.
Returns
True if connection was successful.
Usage
The following C example connects the 10th, 11th, and 12th neurons of the first network as inputs of the 22nd neuron and stores in a boolean variable true if the connections succeeded, or false if there was an error:
bool lBolTmp = NetConnectConsecutive(0, 9, 11, 21);
NetConnectLayer
Purpose
Connects a layer of neurons to another layer of neurons. Its purpose is to speed up the connection of a high number of neurons.
Declarations
Standard C:
extern bool NetConnectLayer(int pIntNet, int pIntSrc1, int pIntSrc2, int pIntDst1, int pIntDst2);
MS Visual Studio:
extern bool _cdecl NetConnectLayer(int pIntNet, int pIntSrc1, int pIntSrc2, int pIntDst1, int pIntDst2);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API bool _stdcall NetConnectLayer(int pIntNet, int pIntSrc1, int pIntSrc2, int pIntDst1, int pIntDst2);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Number of the source initial neuron.
- Number of the source final neuron.
- Number of the destination initial neuron.
- Number of the destination final neuron.
Note that all neurons from source initial to source final will be connected to all neurons from the destination initial to destination final neurons.
Returns
True if connection was successful.
Usage
The following C example connects the 10th, 11th, and 12th neurons of the first network as inputs of the 20th, 21st, and 22nd neurons and stores in a boolean variable true if the connections succeeded, or false if there was an error:
bool lBolTmp = NetConnectLayer(0, 9, 11, 19, 21);
NetCreate
Purpose
Creates the foundation of the neural network by dynamically creating part of the memory structures. This function only creates the base memory structure, not the neural network itself, which you will have to create manually. If you want to create the full neural network automatically, do not use this function; use NetTopologyCreate instead.
Declarations
Standard C:
extern int NetCreate(int pIntNet, int pIntMaxNeurons, int pIntNumberOfInputs, int pIntNumberOfOutputs, bool pBolSnapshotsFree);
MS Visual Studio:
extern int _cdecl NetCreate(int pIntNet, int pIntMaxNeurons, int pIntNumberOfInputs, int pIntNumberOfOutputs, bool pBolSnapshotsFree);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetCreate(int pIntNet, int pIntMaxNeurons, int pIntNumberOfInputs, int pIntNumberOfOutputs, bool pBolSnapshotsFree);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Maximum number of neurons.
- Number of inputs.
- Number of outputs.
- Whether the snapshot should also be freed first.
Returns
An integer, indicating:
- 0: success.
- 1: success, but the license will expire in less than 30 days.
- 2: not licensed, as 1000 or more neurons were requested and the neural network library is not licensed for the current hardware running it. In this case, you need to send Anaimo the 16 integers obtained with the HardwareId function to qualify for a licensed version.
- 3: out of memory; you are trying to create more neurons or connections than the memory of your device supports.
- 4: unknown error.
- 5: the number of inputs must be greater than the number of outputs.
- 6: select a full connected topology to perform this function.
- 7: error in parameters.
- 8: reduction between layers must be greater than 1.
- 9: maximum number of neurons reached.
- 10: last layer is less than or equal to the number of needed outputs.
- 11: could not add neurons.
- 12: layers could not be analyzed.
Usage
The following C example will create in memory the first neural network.
#define NetCreate_Success 0
#define NetCreate_LicenseExpiresInLessThan30Days 1
#define NetCreate_NotLicensed 2
#define NetCreate_OutOfMemory 3
#define NetCreate_UnknownError 4
#define NetCreate_InputsMustBeGreaterThanOutputs 5
#define NetCreate_OnlyAvailableForFullConnectedTopology 6
#define NetCreate_IndicatedParametersAreIncorrect 7
#define NetCreate_ReductionBetweenLayersMustBeGreaterThanOne 8
#define NetCreate_MaximumNumberOfNeuronsReached 9
#define NetCreate_LastLayerIsLessOrEqualThanTheNumberOfNeededOutputs 10
#define NetCreate_CouldNotAddNeurons 11
#define NetCreate_LayersCouldNotBeAnalyzed 12
int lIntTmp = NetCreate(0, pIntFinalNumberOfTotalNeurons, pIntInputsNumber, pIntOutputsNumber, true);
NetDecayRateGet
Purpose
Returns the current decay rate in percentage expressed in the range [0, 1]. The decay rate will be applied to the learning rate following this formula:
LearningRate = LearningRate * DecayRate ^ CurrentEpochNumber
Declarations
Standard C:
extern float NetDecayRateGet(int pIntNet);
MS Visual Studio:
extern float _cdecl NetDecayRateGet(int pIntNet);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall NetDecayRateGet(int pIntNet);
Parameters
Index of the network. For more information, please refer to the “Multiple neural network” section.
Returns
A float with the current decay rate in percentage in the range [0, 1].
Usage
The following C example gets the decay rate of the first network and puts it into a variable:
float lSngDecayRate = NetDecayRateGet(0);
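For illustration, the decay described above can be reproduced on the caller side as follows (a sketch only; the SDK applies the decay internally during training):
#include <math.h>
float lSngBaseRate  = NetLearningRateGet(0);   /* e.g. 0.01 */
float lSngDecayRate = NetDecayRateGet(0);      /* e.g. 0.95 */
int   lIntEpoch     = 10;
/* effective learning rate after 10 epochs, per the formula above */
float lSngEffective = lSngBaseRate * powf(lSngDecayRate, (float)lIntEpoch);
/* with 0.01 and 0.95, lSngEffective is approximately 0.006 */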
NetDecayRateSet
Purpose
Sets the current decay rate in percentage expressed in the range [0, 1]. For more information, please read the function NetDecayRateGet in this document.
Declarations
Standard C:
extern void NetDecayRateSet(int pIntNet, float pSng);
MS Visual Studio:
extern void _cdecl NetDecayRateSet(int pIntNet, float pSng);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall NetDecayRateSet(int pIntNet, float pSng);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- The decay rate in percentage and in the range [0, 1].
Returns
Nothing.
Usage
The following C example sets the decay rate of the first network at 95%:
NetDecayRateSet(0, 0.95);
NetDropoutGet
Purpose
Returns the current dropout rate in percentage, indicated with a number in the range [0, 1].
Please see NetDropoutSet for more information.
Declarations
Standard C:
extern float NetDropoutGet(int pIntNet, int pIntLayerNumber);
MS Visual Studio:
extern float _cdecl NetDropoutGet(int pIntNet, int pIntLayerNumber);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall NetDropoutGet(int pIntNet, int pIntLayerNumber);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- The layer number of which to obtain the current dropout rate, or 0 to obtain the default dropout rate which will be applied to newly created layers.
Returns
A float with the current dropout rate as a value in the range of [0, 1].
It will return -1 if the layer number is not correct.
Usage
The following C example gets the default dropout rate of the first network and puts it into a variable.
float lSngDropOut = NetDropoutGet(0, 0);
NetDropoutSet
Purpose
Sets the dropout rate in percentage, indicated with a number in the range [0, 1]. You can set it to 0 for no dropout. It is set to 0 by default.
If different from 0, that percentage of the neurons’ inputs will not be considered on each training cycle. The ignored inputs are randomly selected.
Inputs that are not ignored are scaled up by 1/(1 – dropout) so that the sum over all inputs remains unchanged.
Note that the rate and the actual measured percentage might not match exactly; please check the following table:
DropOut rate | Actual measured % of weights considered
0.05 | 90.98%
0.10 | 83.90%
0.50 | 50.11%
0.90 | 16.09%
0.95 | 8.92%
Declarations
Standard C:
extern void NetDropoutSet(int pIntNet, float pSng, int pIntLayerNumber);
MS Visual Studio:
extern void _cdecl NetDropoutSet(int pIntNet, float pSng, int pIntLayerNumber);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall NetDropoutSet(int pIntNet, float pSng, int pIntLayerNumber);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- The dropout rate [0, 1].
- The layer number to apply the new dropout rate. Indicate 0 for a default dropout rate which will be applied to all new layers.
Returns
Nothing.
Usage
The following C example sets the dropout rate of the first network at 10% for all new future layers:
NetDropoutSet(0, 0.1, 0);
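For illustration, the scaling described above can be reproduced as follows (a sketch only; the SDK performs this scaling internally during training):
/* With a dropout rate of 0.1, about 90% of the inputs are kept on each
   training cycle, and the kept inputs are scaled up by 1/(1 - 0.1). */
float lSngDropout   = NetDropoutGet(0, 0);           /* e.g. 0.1 */
float lSngKeptShare = 1.0f - lSngDropout;            /* expected fraction kept: 0.9 */
float lSngScale     = 1.0f / (1.0f - lSngDropout);   /* scale factor: about 1.111 */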
NetErrorGet
Purpose
Returns a float which is the sum, in absolute values, of the errors of all neurons.
Declarations
Standard C:
extern float NetErrorGet(int pIntNet, int pIntCyclesControl);
MS Visual Studio:
extern float _cdecl NetErrorGet(int pIntNet, int pIntCyclesControl);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall NetErrorGet(int pIntNet, int pIntCyclesControl);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Control of cycles (for more information please view the introduction):
- 0: no cycles will be checked.
- 1: cycles will be checked and avoided.
Returns
A float.
Usage
The following C code puts into a float variable the first network total error:
float lSngError = NetErrorGet(0, 0);
NetFree
Purpose
Destroys the neural network and frees memory. This function does not destroy the set.
Please remember that the last NetFree should also free the snapshot.
Declarations
Standard C:
extern void NetFree(int pIntNet, bool pBolSnapshotsFree);
MS Visual Studio:
extern void _cdecl NetFree(int pIntNet, bool pBolSnapshotsFree);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall NetFree(int pIntNet, bool pBolSnapshotsFree);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Boolean to indicate if the snapshot should be also freed.
Returns
Nothing.
Usage
The following C example frees the memory of the first neural network, including the snapshot:
NetFree(0, true);
NetInitialize
Purpose
Initializes the neural network, by initializing in memory:
- Bias: puts 0.
- Values: puts 0.
- Neurons’ weights: a random number.
Random numbers are generated with a Gaussian distribution.
Declarations
Standard C:
extern bool NetInitialize(int pIntNet, float pSngMean, float pSngVariance, float pSngPercentage);
MS Visual Studio:
extern bool _cdecl NetInitialize(int pIntNet, float pSngMean, float pSngVariance, float pSngPercentage);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API bool _stdcall NetInitialize(int pIntNet, float pSngMean, float pSngVariance, float pSngPercentage);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Mean of the random values, normally zero.
- Variance of the random values, normally 1. If different than 1, it will change the variance of the random numbers used to initialize all layers except the last one.
- Percentage of the network that should be initialized. Indicate 1 for all the network.
Returns
A boolean indicating with true if the initialization went ok, or false otherwise. False will be returned typically when there is not enough memory to analyze all the layers and to create the internal memory structures.
Usage
The following C code initializes the first neural network:
NetInitialize(0, 0, 1, 1);
NetInitTypeGet
Purpose
Returns the type of initialization performed by the function NetInitialize.
Declarations
Standard C:
extern int NetInitTypeGet(int pIntNet);
MS Visual Studio:
extern int _cdecl NetInitTypeGet(int pIntNet);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetInitTypeGet(int pIntNet);
Parameters
Index of the network. For more information, please refer to the “Multiple neural network” section.
Returns
Initialization type:
- 0: Normal.
- 1: Xavier.
- 2: HE.
Usage
The following C code gets the initialization type of the first neural network:
#define INIT_TYPE_Normal 0
#define INIT_TYPE_Xavier 1
#define INIT_TYPE_HE 2
int lIntInitType = NetInitTypeGet(0);
NetInitTypeSet
Purpose
Sets the type of initialization performed by the function NetInitialize.
Declarations
Standard C:
extern void NetInitTypeSet(int pIntNet, int pIntInitType);
MS Visual Studio:
extern void _cdecl NetInitTypeSet(int pIntNet, int pIntInitType);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall NetInitTypeSet(int pIntNet, int pIntInitType);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Initialization type:
- 0: Normal.
- 1: Xavier.
- 2: HE.
Returns
Nothing.
Usage
The following C code sets the initialization type of the first neural network to be Xavier:
#define INIT_TYPE_Normal 0
#define INIT_TYPE_Xavier 1
#define INIT_TYPE_HE 2
NetInitTypeSet(0, INIT_TYPE_Xavier);
NetInputGet
Purpose
Gets the current value of an input.
Declarations
Standard C:
extern float NetInputGet(int pIntNet, int pIntInput);
MS Visual Studio:
extern float _cdecl NetInputGet(int pIntNet, int pIntInput);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall NetInputGet(int pIntNet, int pIntInput);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section
- The number of the input.
Returns
The current value of the input.
Usage
The following C example gets the value of the 10th input from the first network and puts it into a variable:
float lSngInputValue = NetInputGet(0, 9);
NetInputSet
Purpose
Sets the value of an input.
Declarations
Standard C:
extern void NetInputSet(int pIntNet, int pIntInput, float pSng);
MS Visual Studio:
extern void _cdecl NetInputSet(int pIntNet, int pIntInput, float pSng);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall NetInputSet(int pIntNet, int pIntInput, float pSng);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- The number of the input.
- The new value.
Returns
Nothing.
Usage
The following C example sets the value of the 10th input from the first network to 0.1.
NetInputSet(0, 9, 0.1);
NetInputsAddedNumberGet
Purpose
Returns the number of inputs that have been added with NetNeuronsAdd or NetNeuronsAddConsecutive to the neural network. The maximum number of inputs that can be added is determined by the function NetCreate.
Declarations
Standard C:
extern int NetInputsAddedNumberGet(int pIntNet);
MS Visual Studio:
extern int _cdecl NetInputsAddedNumberGet(int pIntNet);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetInputsAddedNumberGet(int pIntNet);
Parameters
Index of the network. For more information, please refer to the “Multiple neural network” section.
Returns
An integer.
Usage
The following C example gets the number of inputs that have been added to the first network and puts it into a variable:
int lIntTmp = NetInputsAddedNumberGet(0);
NetInputsMaxNumberGet
Purpose
Returns the number of inputs that the neural network has.
Declarations
Standard C:
extern int NetInputsMaxNumberGet(int pIntNet);
MS Visual Studio:
extern int _cdecl NetInputsMaxNumberGet(int pIntNet);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetInputsMaxNumberGet(int pIntNet);
Parameters
Index of the network. For more information, please refer to the “Multiple neural network” section.
Returns
An integer with the number of inputs of the neural network.
Usage
The following C example gets the number of inputs of the first neural network and puts it into a variable:
int lIntTmp = NetInputsMaxNumberGet(0);
NetKnowledgeFilePathSet
Purpose
Sets the full or relative path and file name (.nnk) that will be used to read and save the knowledge of the network.
Declarations
Standard C:
extern void NetKnowledgeFilePathSet(int pIntNet, unsigned long pLngBufferLength, const char* pStrBuffer);
MS Visual Studio:
extern void _cdecl NetKnowledgeFilePathSet(int pIntNet, DWORD pLngBufferLength, LPCSTR pStrBuffer);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall NetKnowledgeFilePathSet(int pIntNet, DWORD pLngBufferLength, LPCSTR pStrBuffer);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Length of the string which defines the path to the file.
- Pointer to the string which defines the path to the file.
Returns
Nothing.
Usage
The following C code changes the file path and name to read and save knowledge of the first network:
char lStrTmp[1024] = "";
// note that the directory ./know must exist
snprintf(lStrTmp, sizeof(lStrTmp), "./know/AnaimoAI.nnk");
NetKnowledgeFilePathSet(0, strlen(lStrTmp), lStrTmp);
NetKnowledgeLoadFromFile
Purpose
Loads the knowledge of the neural network from a file.
For the characteristics of the file, please read NetKnowledgeSaveToFile.
Declarations
Standard C:
extern int NetKnowledgeLoadFromFile(int pIntNet, bool pBolNetworkChangeIfNecessary, int pIntOutputs, float pSngInitMean, float pSngInitVariance, float pSngInitPercentage, int *pIntNumberOfNeurons);
MS Visual Studio:
extern int _cdecl NetKnowledgeLoadFromFile(int pIntNet, bool pBolNetworkChangeIfNecessary, int pIntOutputs, float pSngInitMean, float pSngInitVariance, float pSngInitPercentage, int *pIntNumberOfNeurons);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetKnowledgeLoadFromFile(int pIntNet, bool pBolNetworkChangeIfNecessary, int pIntOutputs, float pSngInitMean, float pSngInitVariance, float pSngInitPercentage, int *pIntNumberOfNeurons);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Boolean to indicate with true if the neural network should be created to match the knowledge being loaded.
- The number of outputs for the network that will be created. Indicate 0 here to create the network with the number of outputs contained in the file, or a number different from 0, in which case the knowledge will only be loaded if the number of outputs in the file matches this parameter exactly.
- The mean of the random numbers to initialize the network, in case it is created, normally 0.
- The variance of the random numbers to initialize the network, in case it is created, normally 1.
- The percentage of the network that will be initialized, normally 1.
- Number of neurons created.
Returns
An integer indicating with:
- 0: file was loaded successfully.
- -1: file could not be opened.
- -2: version could not be read.
- -3: line with rows, cols, … could not be read.
- -4: architecture of network could not be read.
- -5: could not calculate internal architecture of network.
- -6: knowledge file is not compatible with number of layers or neurons per layer of current neural network.
- -7: number of layers and neurons per layer could not be read.
- -8: number of layers or neurons per layer was not provided.
- -9: architecture of network did not include information for all layers.
- -10: knowledge file is not compatible with number of layers or neurons per layer of current neural network.
- -11: line with total number of records and maximum number of neurons, inputs and outputs, could not be read.
- -12: any of these parameters was not provided: total number of records, max number of neurons, max. number of inputs or max. number of outputs.
- -13: file indicates to have 0 data records.
- -14: knowledge file is not compatible with maximum number of neurons, number of inputs or outputs of current neural network.
- -15: a neuron record was not correct.
- -16: error setting the bias of a neuron.
- -17: number of inputs does not match those of a current neuron.
- -18: error setting the weight of an input of a neuron.
- -19: file does not include complete data for inputs of a neuron.
- -20: file does not include complete data for the number of records indicated in the header.
- -21: knowledge file is not compatible with number of inputs or outputs of current neural network.
- In case the value is positive, then please read the help of NetArchitectureCreate in this document.
Usage
The following C code loads all the first neural network knowledge from a file:
int lIntNumberOfNeurons = 0;
int lIntRes = NetKnowledgeLoadFromFile(0, false, 0, 0, 0, 0, &lIntNumberOfNeurons);
NetKnowledgeSaveToFile
Purpose
Saves current knowledge of neural network, which is useful to load it later with NetKnowledgeLoadFromFile.
The file has the following characteristics:
- The file is internally a comma separated value (CSV) file.
- It can be opened by the NNViewer application, whose source code is provided.
- The file is named AnaimoAI.nnk and is created where the AnaimoAI.dll is.
- The content of the file is:
- 1st row: the version of the Anaimo AI SDK which created the file, in format YYYYMM. For example, for this version: 202203
- 2nd row are 5 integers indicating the number of:
- Number of inputs channels, for example 3 for RGB (red, green, blue) images.
- Rows for inputs. It is optional and therefore can be zero.
- Columns for inputs. It is optional and therefore can be zero.
- Rows for outputs. It is optional and therefore can be zero.
- Columns for outputs. It is optional and therefore can be zero.
- 3rd row is a string indicating the network architecture. This string contains as many tuples separated by ; as there are layers, each tuple containing values separated by , to indicate:
- The number of channels of the layer.
- The type of the layer. For more information, please read NetLayersTypeGet.
- The width of the layer.
- The height of the layer.
- The width stride of the layer, to convolve.
- The height stride of the layer, to convolve.
- The padding to convolve.
- 4th row:
- An integer indicating the number of layers.
- A comma separated string indicating the number of neurons per layer.
- 5th row:
- The total number of records that follow.
- The maximum number of neurons.
- The maximum number of inputs.
- The maximum number of outputs.
- Rest of rows are any of these types and have the following content:
- Type of record for a neuron. Containing:
- The number of the neuron.
- The bias.
- The number of inputs of this neuron. If this value is different than zero, then there will be as many records of the following type as inputs of this neuron.
- Type of record for an input of a neuron. Containing:
- The weight for the input of the neuron.
As you can see, in the 2nd row of the file, channels, columns and rows for inputs and outputs can be zero indicating that no specific visualization grid is recommended. Read the parameters section below for more information.
Declarations
Standard C:
extern int NetKnowledgeSaveToFile(int pIntNet, int pIntNumberOfInputsChannels, int pIntIRows, int pIntICols, int pIntORows, int pIntOCols, int pIntPrecision);
MS Visual Studio:
extern int _cdecl NetKnowledgeSaveToFile(int pIntNet, int pIntNumberOfInputsChannels, int pIntIRows, int pIntICols, int pIntORows, int pIntOCols, int pIntPrecision);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetKnowledgeSaveToFile(int pIntNet, int pIntNumberOfInputsChannels, int pIntIRows, int pIntICols, int pIntORows, int pIntOCols, int pIntPrecision);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Integers indicating the number of:
- Channels in the inputs (for example 3 for RGB [red, green, blue] images).
- Rows for inputs.
- Columns for inputs.
- Rows for outputs.
- Columns for outputs.
- Decimals for the floating-point numbers used to store the information. Fewer than 15 decimals can be enough to save knowledge, but at least 15 are needed in this parameter to save exactly what is in memory.
Returns
An integer indicating with:
- 0: the file was created successfully.
- 1: the file could not be created.
- 2: could not analyze current architecture of neural network.
Usage
The following C code saves the current knowledge of the first network into a file as indicated above:
int lIntRes = NetKnowledgeSaveToFile(0, 3, 50, 50, 10, 1, 6);
NetLayersAnalyze
Purpose
Analyzes the current neural network and automatically builds the internal memory structure. This function will be automatically called internally when setting certain modes. For more information, please read the information about the different working modes.
Declarations
Standard C:
extern bool NetLayersAnalyze(int pIntNet);
MS Visual Studio:
extern bool _cdecl NetLayersAnalyze(int pIntNet);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_extern "C" AnaimoAI_API bool _stdcall NetLayersAnalyze(int pIntNet);
Parameters
Index of the network. For more information, please refer to the “Multiple neural network” section.
Returns
A boolean, indicating:
- true: success.
- false: failure, possibly out of memory.
Usage
The following C example analyzes the first neural network and creates the internal memory structure:
NetLayersAnalyze(0);
NetLayersBiasGet
Purpose
Returns the value of the bias of an element of a layer.
Declarations
Standard C:
extern float NetLayersBiasGet(int pIntNet, int pIntLayerNumber, int pIntIndex);
MS Visual Studio:
extern float _cdecl NetLayersBiasGet(int pIntNet, int pIntLayerNumber, int pIntIndex);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_extern "C" AnaimoAI_API float _stdcall NetLayersBiasGet(int pIntNet, int pIntLayerNumber, int pIntIndex);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- The number of the layer, starting with 1 for the first layer, which is the inputs layer.
- The index of the element of the layer, starting with zero.
Returns
A floating-point number which is the bias.
Usage
The following C example obtains the bias of the 1st element of the 2nd layer from the first network:
float lSngBias = NetLayersBiasGet(0, 2, 0);
NetLayersBiasSet
Purpose
Sets the value of the bias of an element of a layer.
Declarations
Standard C:
extern bool NetLayersBiasSet(int pIntNet, int pIntLayerNumber, int pIntIndex, float pSng);
MS Visual Studio:
extern bool _cdecl NetLayersBiasSet(int pIntNet, int pIntLayerNumber, int pIntIndex, float pSng);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_extern "C" AnaimoAI_API bool _stdcall NetLayersBiasSet(int pIntNet, int pIntLayerNumber, int pIntIndex, float pSng);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- The number of the layer, starting with 1 for the first layer, which is the inputs layer.
- The index of the element of the layer, starting with zero.
- The new value of the bias.
Returns
True if the bias was set correctly.
Usage
The following C example sets to 1.41 the bias of the 1st element of the 2nd layer from the first network:
NetLayersBiasSet(0, 2, 0, 1.41);
NetLayersInfoGet
Purpose
Returns information of a layer.
Declarations
Standard C:
extern bool NetLayersInfoGet(int pIntNet, int pIntLayerNumber, int* pIntChannels, int* pIntType, int* pIntWidth, int* pIntHeight, int* pIntStrideW, int* pIntStrideH, int* pIntPadding);
MS Visual Studio:
extern bool _cdecl NetLayersInfoGet(int pIntNet, int pIntLayerNumber, int* pIntChannels, int* pIntType, int* pIntWidth, int* pIntHeight, int* pIntStrideW, int* pIntStrideH, int* pIntPadding);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_extern "C" AnaimoAI_API bool _stdcall NetLayersInfoGet(int pIntNet, int pIntLayerNumber, int* pIntChannels, int* pIntType, int* pIntWidth, int* pIntHeight, int* pIntStrideW, int* pIntStrideH, int* pIntPadding);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- The number of the layer of which to obtain information. The first layer is the inputs layer and is the layer number 1.
- A returned parameter indicating the number of channels (for example 3 for RGB [red, green, blue] images).
- The type of layer (for more information read the help of NetLayersTypeGet in this document).
- The width of the layer.
- The height of the layer.
- The stride in width to convolve.
- The stride in height to convolve.
- The padding to convolve.
Returns
A boolean, indicating:
- true: success.
- false: failure.
Usage
The following C example returns information about the 4th layer of the first network:
int lIntChannels, lIntType, lIntWidth, lIntHeight, lIntStrideW, lIntStrideH, lIntPadding;
NetLayersInfoGet(0, 4, &lIntChannels, &lIntType, &lIntWidth, &lIntHeight, &lIntStrideW, &lIntStrideH, &lIntPadding);
NetLayersNeuronsNumberGet
Purpose
Returns the number of neurons in a layer.
Declarations
Standard C:
extern int NetLayersNeuronsNumberGet(int pIntNet, int pIntLayerNumber);
MS Visual Studio:
extern int _cdecl NetLayersNeuronsNumberGet(int pIntNet, int pIntLayerNumber);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetLayersNeuronsNumberGet(int pIntNet, int pIntLayerNumber);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- The number of the layer. The first layer is the layer number 1 and is the inputs layer.
Returns
The number of neurons of the parameter layer.
Usage
The following C example gets the number of neurons in the 2nd layer of the first neural network and puts it into a variable:
int lIntTmp = NetLayersNeuronsNumberGet(0, 2);
NetLayersNumberGet
Purpose
Returns the number of layers in the current neural network, including the outputs but not the inputs.
Declarations
Standard C:
extern int NetLayersNumberGet(int pIntNet);
MS Visual Studio:
extern int _cdecl NetLayersNumberGet(int pIntNet);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetLayersNumberGet(int pIntNet);
Parameters
Index of the network. For more information, please refer to the “Multiple neural network” section.
Returns
The number of layers. This function is only available after calling NetLayersAnalyze.
Usage
The following C example gets the number of layers of the first neural network and puts it into a variable:
int lIntTmp = NetLayersNumberGet(0);
NetLayersQuantityOfNeurons
Purpose
Returns the number of neurons that will be created for a layer.
Declarations
Standard C:
extern int NetLayersQuantityOfNeurons(int pIntChannels, int pIntType, int pIntSubWidth, int pIntSubHeight, int pIntSubPadding, int pIntChannelsOfPrevLayer, int pIntWidthOfPrevLayer, int pIntHeightOfPrevLayer, int* pIntChannelsResult, int* pIntLayerWidth, int* pIntLayerHeight, int* pIntSubStrideW, int* pIntSubStrideH);
MS Visual Studio:
extern int _cdecl NetLayersQuantityOfNeurons(int pIntChannels, int pIntType, int pIntSubWidth, int pIntSubHeight, int pIntSubPadding, int pIntChannelsOfPrevLayer, int pIntWidthOfPrevLayer, int pIntHeightOfPrevLayer, int* pIntChannelsResult, int* pIntLayerWidth, int* pIntLayerHeight, int* pIntSubStrideW, int* pIntSubStrideH);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetLayersQuantityOfNeurons(int pIntChannels, int pIntType, int pIntSubWidth, int pIntSubHeight, int pIntSubPadding, int pIntChannelsOfPrevLayer, int pIntWidthOfPrevLayer, int pIntHeightOfPrevLayer, int* pIntChannelsResult, int* pIntLayerWidth, int* pIntLayerHeight, int* pIntSubStrideW, int* pIntSubStrideH);
Parameters
- The number of channels (for example 3 for RGB [red, green, blue] images).
- The type of the layer. For more information, please read NetLayersTypeGet also in this manual.
- The width of the layer.
- The height of the layer.
- The padding used to convolve.
- The number of channels of the previous layer.
- The width of the previous layer.
- The height of the previous layer.
- Returning parameters with the resulting:
- Number of channels.
- Width of the layer, in case of convolutional layers, of the feature map.
- Height of the layer, in case of convolutional layers, of the feature map
- Stride of width, to convolve.
- Stride of height, to convolve.
Returns
The number of neurons needed to store all the information of the layer.
Usage
The following C example obtains the number of neurons needed for a 3×3 convolutional 2D layer which is after the inputs layers, composed of RGB images of 50×50 pixels:
#define LAYERS_TYPE_Conv2D 1
int lIntResultingChannels, lIntResultingWidth, lIntResultingHeight, lIntResultingStrideW, lIntResultingStrideH;
int lIntNumberOfNeurons = NetLayersQuantityOfNeurons(3, LAYERS_TYPE_Conv2D, 3, 3, 0, 3, 50, 50, &lIntResultingChannels, &lIntResultingWidth, &lIntResultingHeight, &lIntResultingStrideW, &lIntResultingStrideH);
NetLayersTypeGet
Purpose
Returns the type of the parameter layer.
Declarations
Standard C:
extern int NetLayersTypeGet(int pIntNet, int pIntLayerNumber);
MS Visual Studio:
extern int _cdecl NetLayersTypeGet(int pIntNet, int pIntLayerNumber);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetLayersTypeGet(int pIntNet, int pIntLayerNumber);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- The number of the layer of which to obtain its type. The first layer is the inputs layer and is the layer number 1.
Returns
The type of the layer, according to the following list:
#define LAYERS_TYPE_Normal 0
#define LAYERS_TYPE_Conv2D 1
#define LAYERS_TYPE_MaxPooling 2
#define LAYERS_TYPE_AvgPooling 3
#define LAYERS_TYPE_SoftMax 4
Usage
The following C example obtains the type of the 3rd layer of the first network:
int lIntLayerType = NetLayersTypeGet(0, 3);
NetLayersWeightGet
Purpose
Returns the value of the weight of an element of a layer.
Declarations
Standard C:
extern float NetLayersWeightGet(int pIntNet, int pIntLayerNumber, int pIntIndex);
MS Visual Studio:
extern float _cdecl NetLayersWeightGet(int pIntNet, int pIntLayerNumber, int pIntIndex);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_extern "C" AnaimoAI_API float _stdcall NetLayersWeightGet(int pIntNet, int pIntLayerNumber, int pIntIndex);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- The number of the layer, starting with 1 for the first layer, which is the inputs layer.
- The index of the element of the layer, starting with zero.
Returns
A floating-point number which is the weight.
Usage
The following C example obtains the weight of the 1st element of the 2nd layer of the first network:
float lSngWeight = NetLayersWeightGet(0, 2, 0);
NetLayersWeightSet
Purpose
Sets the value of the weight of an element of a layer.
Declarations
Standard C:
extern bool NetLayersWeightSet(int pIntNet, int pIntLayerNumber, int pIntIndex, float pSng);
MS Visual Studio:
extern bool _cdecl NetLayersWeightSet(int pIntNet, int pIntLayerNumber, int pIntIndex, float pSng);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_extern "C" AnaimoAI_API bool _stdcall NetLayersWeightSet(int pIntNet, int pIntLayerNumber, int pIntIndex, float pSng);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- The number of the layer, starting with 1 for the first layer, which is the inputs layer.
- The index of the element of the layer, starting with zero.
- The new value of the weight.
Returns
True if the weight was set correctly.
Usage
The following C example sets to 1.41 the weight of the 1st element of the 2nd layer of the first network:
NetLayersWeightSet(0, 2, 0, 1.41);
NetLearn
Purpose
Makes the neural network learn with the current inputs and outputs.
Declarations
Standard C:
extern int NetLearn(int pIntNet, int pIntSet, int pIntCyclesControl, bool pBolStopIfGradientsVanish, int pIntCurrentEpochNumber, bool* pBolGradientExploded, bool* pBolGradientsVanished, float* pSngEntropy);
MS Visual Studio:
extern int _cdecl NetLearn(int pIntNet, int pIntSet, int pIntCyclesControl, bool pBolStopIfGradientsVanish, int pIntCurrentEpochNumber, bool* pBolGradientExploded, bool* pBolGradientsVanished, float* pSngEntropy);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetLearn(int pIntNet, int pIntSet, int pIntCyclesControl, bool pBolStopIfGradientsVanish, int pIntCurrentEpochNumber, bool* pBolGradientExploded, bool* pBolGradientsVanished, float* pSngEntropy);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Index of the set.
- Control of cycles (for more information please view the introduction):
- 0: no cycles will be checked.
- 1: cycles will be checked and avoided.
- A boolean which indicates if back propagation should stop if all gradients of the outputs layer became zero with output target values different than zero.
- The current number of epoch, which must start with zero.
- A returning boolean indicating if, during forward propagation, a gradient became infinite. This only applies to layers of type SoftMax, and will indicate that the last step of SoftMax was not performed. This could indicate that the learning rate should be lowered.
- A returning boolean indicating if, during back propagation, all gradients became zero with output target values different than zero.
- A returning float containing the entropy.
Returns
An integer, indicating:
- 0: success.
- 1: the learning process generated NaN (Not a Number) values, although it continued. This could mean that you should use Sigmoid as the activation function.
- 2: error managing threads.
- 3: the set has no records.
- 4: error analyzing layers.
- 5: parameters are incorrect:
- The number of epochs must be greater than zero.
- The number of batches must be greater than zero.
- The number of records for training must be greater than zero.
- The number of records for validation must not be negative.
- The number of records in the set must be greater than the number of batches.
- The threshold to consider active must be greater than zero.
- 6: error creating topology.
- 7: success of test while learning.
- 8: objective was not achieved and there are no more layers to try.
- 9: epochs finished.
- 10: could not initialize topology and start.
- 11: incorrect mode and or topology.
- 12: number of inputs or outputs of set do not match those of the neural network.
Usage
The following C example makes the first neural network learn, without considering cycles, that the current inputs generate the current outputs.
#define NetLearn_Success 0
#define NetLearn_NAN 1
#define NetLearn_ThreadsError 2
#define NetLearn_SetHasNoRecords 3
#define NetLearn_ErrorAnalyzingLayers 4
#define NetLearn_IndicatedParametersAreIncorrect 5
#define NetLearn_ErrorCreatingTopology 6
#define NetLearn_Success_Tested 7
#define NetLearn_Objective_Not_Achieved_And_No_More_Layers_To_Try 8
#define NetLearn_Epochs_Finished 9
#define NetLearn_Could_Not_Initialize_Topology_And_Start 10
#define NetLearn_Incorrect_Mode_And_Or_Topology 11
#define NetLearn_Number_Of_Inputs_Or_Outputs_Of_Set_Do_Not_Match_Neural_Network 12
#define NetLearn_Out_Of_Memory 13
#define NetLearn_Could_not_augment_data_set 14
#define NetLearn_Options_Finished 15
#define VALIDATION_TYPE_ByThreshold 0
#define VALIDATION_TYPE_ByMax 1
#define VALIDATION_TYPE_ByOutputs 2
bool lBolGradientExploded = false, lBolGradientsVanished = false;
float lSngEntropy = 0.0f;
int lIntResult = NetLearn(0, 0, 0, true, 0, &lBolGradientExploded, &lBolGradientsVanished, &lSngEntropy);
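As an additional illustrative sketch, the returned value can then be checked against the constants listed above; the handling shown is only a suggestion:
if (lIntResult == NetLearn_Success || lIntResult == NetLearn_Success_Tested) {
    /* the learning step finished correctly */
} else if (lBolGradientExploded) {
    /* a gradient became infinite during forward propagation (SoftMax layers only);
       this guide suggests lowering the learning rate, for example: */
    NetLearningRateSet(0, NetLearningRateGet(0) / 2.0f);
}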
NetLearningRateGet
Purpose
Returns a float with the current learning rate.
Declarations
Standard C:
extern float NetLearningRateGet(int pIntNet);
MS Visual Studio:
extern float _cdecl NetLearningRateGet(int pIntNet);
MS Visual Studio compatible with win32 COM:
extern "C"extern "C" AnaimoAI_API float _stdcall NetLearningRateGet(int pIntNet);
Parameters
Index of the network. For more information, please refer to the “Multiple neural network” section.
Returns
A float with the current learning rate.
Usage
The following C example gets the current learning rate of the first network and introduces it into a variable:
lSngTmp = NetLearningRateGet(0);
NetLearningRateSet
Purpose
Sets the learning rate.
Declarations
Standard C:
extern void NetLearningRateSet(int pIntNet, float pSng);
MS Visual Studio:
extern void _cdecl NetLearningRateSet(int pIntNet, float pSng);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall NetLearningRateSet(int pIntNet, float pSng);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- The learning rate. It must be in the range of [0, 1]. If not set, it defaults to 0.5.
Returns
Nothing.
Usage
The following C example sets the learning rate of the first network to 0.5.
NetLearningRateSet(0, 0.5);
NetMemoryCacheGet (beta)
Purpose
Returns an integer with the size of the cache memory, in kilobytes (KB), that the neural network is considering. Normally, the cache memory should be equal to or less than the level 3 cache of the computer executing the neural network. The memory cache will be disabled if this value is set to zero.
This parameter can affect the performance of the training process of the neural network, as it allows optimizing the use of the memory cache and reducing memory bottlenecks.
This parameter is currently subject to additional testing (beta).
Declarations
Standard C:
extern float NetMemoryCacheGet();
MS Visual Studio:
extern float _cdecl NetMemoryCacheGet();
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall NetMemoryCacheGet();
Parameters
None.
Returns
An integer indicating the size of the cache memory in kilobytes (KB).
Usage
The following C example gets the size in kilobytes (KB) of the memory cache that the neural network is considering and puts it into a variable:
lIntMemoryCache = NetMemoryCacheGet();
NetMemoryCacheSet (beta)
Purpose
Sets the size of the cache memory, in kilobytes (KB), that the neural network will consider. Normally, the cache memory should be equal to or less than the level 3 cache of the computer executing the neural network. The memory cache will be disabled if this parameter is set to zero.
This parameter can affect the performance of the training process of the neural network, as it allows optimizing the use of the memory cache and reducing memory bottlenecks.
This parameter is currently subject to additional testing (beta).
Declarations
Standard C:
extern void NetMemoryCacheSet(int pInt);
MS Visual Studio:
extern void _cdecl NetMemoryCacheSet(int pInt);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall NetMemoryCacheSet(int pInt);
Parameters
- The size, in kilobytes (KB), of the cache memory that the neural network should consider.
Returns
Nothing.
Usage
The following C example sets the size in kilobytes (KB) of the memory cache that the neural network will consider:
NetMemoryCacheSet(31000);
NetModeGet
Purpose
Returns an integer with the current working mode.
Declarations
Standard C:
extern float NetModeGet(int pIntNet);
MS Visual Studio:
extern float _cdecl NetModeGet(int pIntNet);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall NetModeGet(int pIntNet);
Parameters
Index of the network. For more information, please refer to the “Multiple neural network” section.
Returns
Returns an integer with the current working mode.
Usage
The following C example gets the current mode of the first network and puts it into a variable:
#define MODE_NORMAL 0
#define MODE_STANDARD_BACKPROPAGATION 1
#define MODE_DYNAMIC_PROPAGATION 2
#define MODE_BACKPROP_EXPERIMENTAL 3
lIntMode = NetModeGet(0);
NetModeSet
Purpose
Sets the current working mode. Changing the working mode could initialize the network; therefore, if you want to keep the model, you should save it before changing the mode.
Modes Standard back propagation and Dynamic propagation (beta) are only compatible with manual topology and therefore they need you to manually connect the neurons first.
Please note that:
- If you want to use manual topology with the faster mode Normal, you can do it by using manual topology to create the neural network and then, changing mode to Normal.
- Changing to Standard back propagation or Dynamic modes will automatically change topology to manual.
Declarations
Standard C:
extern void NetModeSet(int pIntNet, int pInt);
MS Visual Studio:
extern void _cdecl NetModeSet(int pIntNet, int pInt);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall NetModeSet(int pIntNet, int pInt);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- The desired working mode.
Returns
Nothing.
Usage
The following C example sets the mode of the first network to Dynamic propagation.
#define MODE_NORMAL 0
#define MODE_STANDARD_BACKPROPAGATION 1
#define MODE_DYNAMIC_PROPAGATION 2
#define MODE_BACKPROP_EXPERIMENTAL 3
NetModeSet(0, MODE_DYNAMIC_PROPAGATION);
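As an additional illustrative sketch, switching to Standard back propagation automatically changes the topology to manual, which can be verified with NetTopologyGet:
NetModeSet(0, MODE_STANDARD_BACKPROPAGATION);
if (NetTopologyGet(0) == 0 /* TOPOLOGY_MANUAL, see NetTopologyGet */) {
    /* neurons must now be connected manually before training, for example with NetConnect */
}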
NetMomentumGet
Purpose
Returns the current momentum rate. A momentum rate different from 0 will add to the weights and biases the changes of the previous training iteration, multiplied by this rate. A momentum of 0.1 will add 10% of the previous changes, whereas a momentum of 0.9 will add 90%.
Declarations
Standard C:
extern float NetMomentumGet(int pIntNet);
MS Visual Studio:
extern float _cdecl NetMomentumGet(int pIntNet);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall NetMomentumGet(int pIntNet);
Parameters
Index of the network. For more information, please refer to the “Multiple neural network” section.
Returns
A float with the current momentum rate.
Usage
The following C example gets the momentum of the first network and puts it into a variable:
lSngTmp = NetMomentumGet(0);
NetMomentumSet
Purpose
Sets the momentum rate. It is only applied during training.
Declarations
Standard C:
extern void NetMomentumSet(int pIntNet, float pSng);
MS Visual Studio:
extern void _cdecl NetMomentumSet(int pIntNet, float pSng);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall NetMomentumSet(int pIntNet, float pSng);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- The momentum rate.
Returns
Nothing.
Usage
The following C example sets the momentum of the first network to 0.9.
NetMomentumSet(0, 0.9);
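As a combined illustrative sketch (the values are examples only), the learning rate and the momentum are typically configured together before training:
NetLearningRateSet(0, 0.5); /* learning rate in the range [0, 1]; 0.5 is the default      */
NetMomentumSet(0, 0.9);     /* adds 90% of the previous changes to the weights and biases */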
NetNeuronsAdd
Purpose
Adds a neuron to the neural network. You must also use this call for inputs and outputs. Inputs are indicated by passing 1 as the layer parameter of this call, and outputs must first be created with this call and then qualified with NetOutputAdd.
Declarations
Standard C:
extern int NetNeuronsAdd(int pIntNet, int pIntLayerNumber);
MS Visual Studio:
extern int _cdecl NetNeuronsAdd(int pIntNet, int pIntLayerNumber);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetNeuronsAdd(int pIntNet, int pIntLayerNumber);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- The number of the layer, starting with 1 for the inputs, where the neuron will be located. For the first layer (1), the inputs, this parameter is mandatory. For the rest of layers (2, …), you can ignore this parameter by indicating a zero value only if you are not using the full connected topology.
Returns
The number of added neurons so far if the neuron was added, or zero if the neuron could not be added because of:
- Trying to add more neurons than the maximum indicated in NetCreate
- When topology is full connected, and the layer number was not indicated.
Usage
The following C code adds a neuron to the first network.
NetNeuronsAdd(0, 1);
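The following illustrative sketch (which assumes the manual topology and a network already created with NetCreate) shows the complete pattern described above, including the creation of an output:
NetNeuronsAdd(0, 1);                 /* 1st input neuron, in layer 1                       */
NetNeuronsAdd(0, 1);                 /* 2nd input neuron, in layer 1                       */
int lIntCount = NetNeuronsAdd(0, 2); /* a neuron in the 2nd layer                          */
NetOutputAdd(0, lIntCount - 1);      /* qualify it as an output (neuron ids start at zero) */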
NetNeuronsAddConsecutive
Purpose
Adds several neurons to the neural network. You must also use this call for inputs and outputs. Inputs are indicated by passing 1 as the layer parameter of this call, and outputs must first be created with this call and then qualified with NetOutputAdd.
Please note that this function does not allow creating channels, for example to store images separated into RGB (red, green, blue) channels. If you want to create separate channels, use NetNeuronsAddLayer instead.
Declarations
Standard C:
extern int NetNeuronsAddConsecutive(int pIntNet, int pIntLayerNumber, int pIntQuantity);
MS Visual Studio:
extern int _cdecl NetNeuronsAddConsecutive(int pIntNet, int pIntLayerNumber, int pIntQuantity);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetNeuronsAddConsecutive(int pIntNet, int pIntLayerNumber, int pIntQuantity);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- The number of the layer, starting with 1 for the inputs, where the neuron will be located. For the first layer (1), the inputs, this parameter is mandatory. For the rest of layers (2, …), you can ignore this parameter by indicating a zero value only if you are not using the full connected topology.
- The quantity of neurons to add.
Returns
The id of the last added neuron, if the neurons were added, or -1 if they could not be added because of:
- Trying to add more neurons than the maximum indicated in NetCreate
- The layer number was not indicated and was mandatory.
Usage
The following C code adds 100 input neurons to the first network:
NetNeuronsAddConsecutive(0, 1, 100);
NetNeuronsAddLayer
Purpose
Adds a new layer to the neural network.
Please note that convolutional layers will only work in mode MODE_NORMAL and topology TOPOLOGY_FULL_CONNECTED.
Declarations
Standard C:
extern int NetNeuronsAddLayer(int pIntNet, int pIntLayerNumber, int pIntChannels, int pIntType, int pIntSubWidth, int pIntSubHeight, int pIntSubStrideW, int pIntSubStrideH, int pIntSubPadding);
MS Visual Studio:
extern int _cdecl NetNeuronsAddLayer(int pIntNet, int pIntLayerNumber, int pIntChannels, int pIntType, int pIntSubWidth, int pIntSubHeight, int pIntSubStrideW, int pIntSubStrideH, int pIntSubPadding);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetNeuronsAddLayer(int pIntNet, int pIntLayerNumber, int pIntChannels, int pIntType, int pIntSubWidth, int pIntSubHeight, int pIntSubStrideW, int pIntSubStrideH, int pIntSubPadding);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- The number of the layer, starting with 1 for the inputs.
- The channels of the layer (for example 3 for RGB [red, green, blue] images).
- The type of the layer. For more information, please read NetLayersTypeGet in this document.
- The width of the layer. In case of convolutional layers this is the width of the filter.
- The height of the layer. In case of convolutional layers this is the height of the filter.
- The stride for width of the layer, to convolve, normally 1.
- The stride for height of the layer, to convolve, normally 1.
- The padding to convolve, normally zero.
Returns
The id of the last added neuron, if the layer was added, or -1 if it could not be added because of:
- Trying to add more neurons than the maximum indicated in NetCreate
- The layer number was not indicated and was mandatory.
Usage
The following C code adds a second layer which is a complete convolutional layer of 3×3 to the first network to convolve RGB images:
NetNeuronsAddLayer(0, 2, 3, LAYERS_TYPE_Conv2D, 3, 3, 1, 1, 0);
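As a further illustrative sketch, and assuming that layer 1 (the inputs) has already been created for a 28x28 single-channel image, the first convolutional block of the architecture used as an example in NetArchitectureCreate could be built layer by layer:
NetNeuronsAddLayer(0, 2, 10, LAYERS_TYPE_Conv2D, 3, 3, 1, 1, 0);     /* 10 filters of 3x3, stride 1, no padding */
NetNeuronsAddLayer(0, 3, 10, LAYERS_TYPE_Conv2D, 3, 3, 1, 1, 0);     /* 10 filters of 3x3, stride 1, no padding */
NetNeuronsAddLayer(0, 4, 10, LAYERS_TYPE_MaxPooling, 2, 2, 1, 1, 0); /* 2x2 max pooling, stride 1, no padding   */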
NetNeuronsAddedNumberGet
Purpose
Returns the number of neurons that have been added with NetNeuronsAdd to the neural network. The maximum number of neurons that can be added is determined by the function NetCreate.
Declarations
Standard C:
extern int NetNeuronsAddedNumberGet(int pIntNet);
MS Visual Studio:
extern int _cdecl NetNeuronsAddedNumberGet(int pIntNet);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetNeuronsAddedNumberGet(int pIntNet);
Parameters
Index of the network. For more information, please refer to the “Multiple neural network” section.
Returns
An integer.
Usage
The following C example gets the number of neurons that have been added to the first network and puts it into a variable:
int lIntTmp = NetNeuronsAddedNumberGet(0);
NetNeuronsMaxNumberGet
Purpose
Returns the maximum number of neurons that can be added with NetNeuronsAdd to the neural network.
This value is set by the function NetCreate.
Declarations
Standard C:
extern int NetNeuronsMaxNumberGet(int pIntNet);
MS Visual Studio:
extern int _cdecl NetNeuronsMaxNumberGet(int pIntNet);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetNeuronsMaxNumberGet(int pIntNet);
Parameters
Index of the network. For more information, please refer to the “Multiple neural network” section.
Returns
An integer.
Usage
The following C example gets the maximum number of neurons that can be added to the first network and puts it into a variable:
int lIntTmp = NetNeuronsMaxNumberGet(0);
NetOutputAdd
Purpose
Adds an output neuron to the neural network.
Declarations
Standard C:
extern void NetOutputAdd(int pIntNet, int pIntNeuron);
MS Visual Studio:
extern void _cdecl NetOutputAdd(int pIntNet, int pIntNeuron);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_Aextern "C" AnaimoAI_API void _stdcall NetOutputAdd(int pIntNet, int pIntNeuron);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Neuron number or id to become an output.
Returns
None.
Usage
The following C example makes the 100th neuron of the first network an output:
NetOutputAdd(0, 99);
NetOutputPredict
Purpose
Computes the value of an output and returns it.
Declarations
Standard C:
extern float NetOutputPredict(int pIntNet, int pIntSet, int pIntOutput, int pIntCyclesControl, bool pBolUseSnapshot, bool* pBolGradientExploded);
MS Visual Studio:
extern float _cdecl NetOutputPredict(int pIntNet, int pIntSet, int pIntOutput, int pIntCyclesControl, bool pBolUseSnapshot, bool* pBolGradientExploded);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall NetOutputPredict(int pIntNet, int pIntSet, int pIntOutput, int pIntCyclesControl, bool pBolUseSnapshot, bool* pBolGradientExploded);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Index of the set.
- The number of the output.
- Control of cycles (for more information please view the introduction):
- 0: no cycles will be checked.
- 1: cycles will be checked and avoided.
- A boolean: false if the prediction must be performed based on the current status of the neural network, or true to base it on the snapshot. This allows obtaining predictions from a particular moment of the neural network while it continues learning with the set. This feature is only available for the full connected topology, in Normal or back propagation modes.
- A returning boolean indicating if, during forward propagation, a gradient became infinite. This only applies to layers of type SoftMax, and will indicate that the last step of SoftMax was not performed. This could indicate that the learning rate should be lowered.
Returns
A float.
Usage
The following C example computes the value of the 10th output of the first network and puts it into a float variable, without controlling cycles.
float lSngTmp = NetOutputPredict(0, 0, 9, 0, false, &lBolGradientExploded);
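The following illustrative sketch predicts every output of the first network from the first set, without controlling cycles and without using the snapshot:
bool lBolGradientExploded = false;
int lIntOutputs = NetOutputsAddedNumberGet(0);
for (int i = 0; i < lIntOutputs; i++) {
    float lSngValue = NetOutputPredict(0, 0, i, 0, false, &lBolGradientExploded);
    /* ... use lSngValue, for example to find the output with the highest value ... */
}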
NetOutputsAddedNumberGet
Purpose
Returns the number of outputs that have been added with NetOutputAdd to the neural network. The maximum number of outputs that can be added is determined by the function NetCreate.
Declarations
Standard C:
extern int NetOutputsAddedNumberGet(int pIntNet);
MS Visual Studio:
extern int _cdecl NetOutputsAddedNumberGet(int pIntNet);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetOutputsAddedNumberGet(int pIntNet);
Parameters
Index of the network. For more information, please refer to the “Multiple neural network” section.
Returns
An integer.
Usage
The following C example gets the number of outputs that have been added to the first network and puts it into a variable:
lIntTmp = NetOutputsAddedNumberGet(0);
NetOutputGet
Purpose
Gets the current value of an output.
Declarations
Standard C:
extern float NetOutputGet(int pIntNet, int pIntOutput);
MS Visual Studio:
extern float _cdecl NetOutputGet(int pIntNet, int pIntOutput);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall NetOutputGet(int pIntNet, int pIntOutput);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- The number of the output.
Returns
A float with the current value of the output.
Usage
The following C example gets the current value of the 10th output of the first network and stores it into a variable:
float lSngOutputValue = NetOutputGet(0, 9);
NetOutputSet
Purpose
Sets the value of an output.
Declarations
Standard C:
extern void NetOutputSet(int pIntNet, int pIntOutput, float pSng);
MS Visual Studio:
extern void _cdecl NetOutputSet(int pIntNet, int pIntOutput, float pSng);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall NetOutputSet(int pIntNet, int pIntOutput, float pSng);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- The number of the output.
- The new value.
Returns
Nothing.
Usage
The following C example sets the value of the 10th output of the first network to 0.5.
NetOutputSet(0, 9, 0.5);
NetOutputsMaxNumberGet
Purpose
Returns the maximum number of outputs that can be added with NetOutputAdd to the neural network.
The maximum number of outputs that can be added is determined by the function NetCreate.
Declarations
Standard C:
extern int NetOutputsMaxNumberGet(int pIntNet);
MS Visual Studio:
extern int _cdecl NetOutputsMaxNumberGet(int pIntNet);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetOutputsMaxNumberGet(int pIntNet);
Parameters
Index of the network. For more information, please refer to the “Multiple neural network” section.
Returns
An integer.
Usage
The following C example gets the maximum number of outputs that can be added to the first network and puts it into a variable:
int lIntTmp = NetOutputsMaxNumberGet(0);
NetSetMatch
Purpose
Indicates, with true, if current channels, inputs and outputs of the data set match those of the current neural network.
Declarations
Standard C:
extern bool NetSetMatch(int pIntNet, int pIntSet);
MS Visual Studio:
extern bool _cdecl NetSetMatch(int pIntNet, int pIntSet);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API bool _stdcall NetSetMatch(int pIntNet, int pIntSet);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Index of the set.
Returns
True if current channels, inputs and outputs of the data set match those of the current neural network.
Usage
The following C example checks if the first data set matches with the first neural network:
bool lBolSetAndNetMatch = NetSetMatch(0, 0);
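As an illustrative sketch, this check can be used to guard a training call, since NetLearn returns error 12 when the inputs or outputs of the set do not match those of the neural network:
if (NetSetMatch(0, 0)) {
    /* safe to train network 0 with set 0, for example with NetLearn */
} else {
    /* fix the data set or recreate the network before training */
}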
NetSnapshotGet
Purpose
Sets the current state of the neural network to exactly how it was when NetSnapshotTake was called for the last time.
When using the full connected topology, the snapshot can be restored even if it was taken from a neural network with a different number of neurons in only one layer. This is very useful to, for example, increase or decrease the number of outputs of the neural network without losing the knowledge.
Declarations
Standard C:
extern int NetSnapshotGet(int pIntNet);
MS Visual Studio:
extern int _cdecl NetSnapshotGet(int pIntNet);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetSnapshotGet(int pIntNet);
Parameters
Index of the network. For more information, please refer to the “Multiple neural network” section.
Returns
An integer, indicating:
- 0: success.
- -1: could not analyze neural network structure.
- -2: no snapshot was taken previously.
Usage
The following C example sets the first neural network to how it was when NetSnapshotTake was called for the last time:
NetSnapshotGet(0);
NetSnapshotTake
Purpose
Saves the current state of the neural network to a snapshot in memory, to be recalled later by NetSnapshotGet.
When using the full connected topology, the snapshot can be restored even if it was taken from a neural network with a different number of neurons in only one layer. This is very useful to, for example, increase or decrease the number of outputs of the neural network without losing the knowledge.
Declarations
Standard C:
extern int NetSnapshotTake(int pIntNet);
MS Visual Studio:
extern int _cdecl NetSnapshotTake (int pIntNet);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetSnapshotTake(int pIntNet);
Parameters
Index of the network. For more information, please refer to the “Multiple neural network” section.
Returns
An integer, indicating:
- 0: success.
- -1: could not analyze neural network structure.
- -2: could not create memory structure.
- -3: could not memorize internal neural network structure.
- -4: there is nothing to snapshot. This happens when there are no neurons.
Usage
The following C example saves the current state of the first neural network to a snapshot in memory, to be recalled later by NetSnapshotGet:
NetSnapshotTake(0);
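The following illustrative sketch combines both snapshot functions to implement the dynamic outputs pattern described above (full connected topology; the outputs layer is assumed to be layer 3, which is only an example):
NetSnapshotTake(0);                  /* memorize the current knowledge              */
int lIntCount = NetNeuronsAdd(0, 3); /* add one more neuron to the outputs layer    */
NetOutputAdd(0, lIntCount - 1);      /* qualify it as an output                     */
NetSnapshotGet(0);                   /* restore the knowledge over the new topology */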
NetThreadsMaxNumberGet
Purpose
Returns the current maximum number of internal threads that will be used to train the neural network. It is set to 1 by default.
Declarations
Standard C:
extern int NetThreadsMaxNumberGet();
MS Visual Studio:
extern int _cdecl NetThreadsMaxNumberGet();
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetThreadsMaxNumberGet();
Parameters
None.
Returns
An integer indicating the current maximum number of threads.
Usage
The following C example puts into a variable the current maximum number of threads:
int lIntTmp = NetThreadsMaxNumberGet();
NetThreadsMaxNumberSet
Purpose
Sets the current maximum number of internal threads that will be used to train the neural network. It is set to 1 by default.
This function has no effect when running on NPUs.
Declarations
Standard C:
extern void NetThreadsMaxNumberSet(int pInt);
MS Visual Studio:
extern void _cdecl NetThreadsMaxNumberSet(int pInt);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall NetThreadsMaxNumberSet(int pInt);
Parameters
An integer indicating the maximum number of threads.
Returns
Nothing.
Usage
The following C example sets the maximum number of threads that will be used to train the neural
network:
NetThreadsMaxNumberSet(8);
NetArchitectureCreate
Purpose
Creates the base of the neural network (like NetCreate) and the architecture of the neural network, based on a string which defines all the layers. It only supports the full connected topology. You do not need to call either NetCreate or NetInitialize before or after calling this function.
This function allows creating a complete neural network architecture with just a string, which is very useful to try different architectures in search of the best model.
Declarations
Standard C:
extern int NetArchitectureCreate(int pIntNet, unsigned long pLngBufferLength, const char* pStrBuffer, float pSngInitMean, float pSngInitVariance, float pSngInitPercentage, int* pIntNumberOfNeurons);
MS Visual Studio:
extern int _cdecl NetArchitectureCreate(int pIntNet, DWORD pLngBufferLength, LPCSTR pStrBuffer, float pSngInitMean, float pSngInitVariance, float pSngInitPercentage, int* pIntNumberOfNeurons);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetArchitectureCreate(int pIntNet, DWORD pLngBufferLength, LPCSTR pStrBuffer, float pSngInitMean, float pSngInitVariance, float pSngInitPercentage, int* pIntNumberOfNeurons);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Length of the string which defines the layers.
- Pointer to the string which defines the layers. The string can be of two types:
- Detailed (see example below): a string containing as many tuples, separated by ';', as layers, each tuple containing values separated by ',' to indicate:
- The number of channels of the layer.
- The type of the layer. For more information, please read NetLayersTypeGet.
- The width of the layer.
- The height of the layer.
- The width stride of the layer, to convolve.
- The height stride of the layer, to convolve.
- The padding to convolve.
- Mnemotechnic (see example below):
- To indicate the inputs layer: the number of channels, followed by 'I' (uppercase i), followed by the width of the inputs (the height must be the same).
- For the rest of the layers: the number of subelements, followed by the TYPE and the width (the height must be the same), where TYPE can be:
- C: convolutional 2D layer.
- P: max pooling layer.
- S: SoftMax layer.
- Initialization mean for the random numbers, normally 0.
- Initialization variance for the random numbers, normally 1.
- Percentage of the network that will be initialized, should be 1.
- Number of neurons in the created network.
For example, to create a network architecture with these characteristics:
- 1st layer: 28 x 28 inputs with 1 channel (black and white).
- 2nd layer: 10 convolutional 2D filters of 3 x 3, with strides of 1 and no padding.
- 3rd layer: 10 convolutional 2D filters of 3 x 3, with strides of 1 and no padding.
- 4th layer: a max pool layer of 2 x 2, with strides of 1 and no padding.
- 5th layer: 20 convolutional 2D filters of 3 x 3, with strides of 1 and no padding.
- 6th layer: 20 convolutional 2D filters of 3 x 3, with strides of 1 and no padding.
- 7th layer: a max pool layer of 2 x 2, with strides of 1 and no padding.
- 8th layer: a fully connected (dense) layer of 128 neurons.
- 9th layer: a fully connected (dense) layer of 10 outputs with SoftMax function.
The detailed string of the indicated architecture is:
1,0,28,28,1,1,0;10,1,3,3,1,1,0;10,1,3,3,1,1,0;10,2,2,2,1,1,0;20,1,3,3,1,1,0;20,1,3,3,1,1,0;20,2,2,2,1,1,0;128;10,4,0,0,0,0,0
And the equivalent with mnemotechnic is:
1I28-10C3-10C3-P2-20C3-20C3-P2-128-10S
Returns
An integer, indicating:
- 0: success.
- 1: success, but license will expire in less than 30 days.
- 2: not licensed: 1000 or more neurons were requested and the neural network library is not licensed for the hardware running it. In this case, you need to send Anaimo the 16 integers obtained with the HardwareId function to qualify for a licensed version.
- 3: out of memory, you are trying to create more neurons or connections than the memory of your device supports.
- 4: unknown error.
- 5: inputs number must be greater than outputs number.
- 6: this function is only available for full connected topology.
- 7: indicated parameters are incorrect.
- 8: reduction between layers must be greater than 1.
- 9: maximum number of neurons reached.
- 10: the number of neurons of the last layer is less than or equal to the number of needed outputs.
- 11: could not add neurons.
- 12: layers could not be analyzed.
Usage
The following C example will create in memory the first neural network.
// see NetCreate for the definition of possible returning values
int lIntNumberOfNeurons = 0;
int lIntTmp = NetArchitectureCreate(0, strlen("1I28-10C3-10C3-P2-20C3-20C3-P2-128-10S"), "1I28-10C3-10C3-P2-20C3-20C3-P2-128-10S", 0, 1, 1, &lIntNumberOfNeurons);
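As an additional illustrative example, the same architecture can be created with the detailed string listed above instead of the mnemotechnic one (reusing the variables declared in the previous example):
const char* lStrArchitecture = "1,0,28,28,1,1,0;10,1,3,3,1,1,0;10,1,3,3,1,1,0;10,2,2,2,1,1,0;20,1,3,3,1,1,0;20,1,3,3,1,1,0;20,2,2,2,1,1,0;128;10,4,0,0,0,0,0";
lIntTmp = NetArchitectureCreate(0, (unsigned long)strlen(lStrArchitecture), lStrArchitecture, 0, 1, 1, &lIntNumberOfNeurons);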
NetArchitectureCreateAuto
Purpose
Creates the base of the neural network (like NetCreate) and its architecture. It only supports the full connected topology. You do not need to call NetCreate before calling this function.
Declarations
Standard C:
extern int NetArchitectureCreateAuto(int pIntNet, int pIntMaxNeurons, int pIntNumberOfInputs, int pIntNumberOfOutputs, int pIntMaxNumberOfLayers, float pSngInitMean, float pSngInitVariance, float pSngInitPercentage, int pIntTopology, int* pIntNumberOfLayers, int* pIntNumberOfNeurons);
MS Visual Studio:
extern int _cdecl NetArchitectureCreateAuto(int pIntNet, int pIntMaxNeurons, int pIntNumberOfInputs, int pIntNumberOfOutputs, int pIntMaxNumberOfLayers, float pSngInitMean, float pSngInitVariance, float pSngInitPercentage, int pIntTopology, int* pIntNumberOfLayers, int* pIntNumberOfNeurons);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NetArchitectureCreateAuto(int pIntNet, int pIntMaxNeurons, int pIntNumberOfInputs, int pIntNumberOfOutputs, int pIntMaxNumberOfLayers, float pSngInitMean, float pSngInitVariance, float pSngInitPercentage, int pIntTopology, int* pIntNumberOfLayers, int* pIntNumberOfNeurons);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Maximum number of neurons: this parameter is optional; you can put 0 and as many neurons as memory allows will be created. If you set it, it acts as a limit and an error will be returned if more neurons than that are attempted to be created.
- Maximum number of inputs.
- Maximum number of outputs.
- Maximum number of hidden layers including the layer for outputs.
- Initialization mean for the random numbers, normally 0.
- Initialization variance for the random numbers, normally 1.
- Percentage of the network that will be initialized, should be 1.
- Desired topology, 0 for “Manual” and 1 for “Full connected”. For more information, please refer to the “Topology” section.
- Output parameter: returns the number of layers created (including the outputs layer).
- Number of neurons in the created network.
Returns
An integer, indicating:
- 0: success.
- 1: success, but license will expire in less than 30 days.
- 2: not licensed: 1000 or more neurons were requested and the neural network library is not licensed for the hardware running it. In this case, you need to send Anaimo the 16 integers obtained with the HardwareId function to qualify for a licensed version.
- 3: out of memory, you are trying to create more neurons or connections than the memory of your device supports.
- 4: unknown error.
- 5: inputs number must be greater than outputs number.
- 6: this function is only available for full connected topology.
- 7: indicated parameters are incorrect.
- 8: reduction between layers must be greater than 1.
- 9: maximum number of neurons reached.
- 10: the number of neurons of the last layer is less than or equal to the number of needed outputs.
- 11: could not add neurons.
- 12: layers could not be analyzed.
Usage
The following C example will create in memory the first neural network.
// see NetCreate for the definition of possible returning values
int lIntNumberOfLayers = 0, lIntNumberOfNeurons = 0;
int lIntTmp = NetArchitectureCreateAuto(0, 0, pIntInputsNumber, pIntOutputsNumber, 3, 0, 1, 1, 1 /* full connected topology */, &lIntNumberOfLayers, &lIntNumberOfNeurons);
NetTopologyGet
Purpose
Returns an integer with the current topology of the neural network. The available topologies are:
- 0: Manual. The topology, that is, the number of neurons, inputs, outputs, and their connections, is completely free and is the responsibility of the programmer.
- 1: Full connected. All neurons from one layer are connected to all neurons of the next layer.
Declarations
Standard C:
extern float NetTopologyGet(int pIntNet);
MS Visual Studio:
extern float _cdecl NetTopologyGet(int pIntNet);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall NetTopologyGet(int pIntNet);
Parameters
Index of the network. For more information, please refer to the “Multiple neural network” section.
Returns
An integer indicating:
- 0 for manual topology, where connections between neurons must be established programmatically with calls to NetConnect, NetConnectConsecutive or NetConnectLayer.
- 1 for full connected topology, where all neurons from one layer are automatically connected to all neurons of the next layer.
Usage
The following C example gets the topology of the first neural network and puts it into a variable:
#define TOPOLOGY_MANUAL 0
#define TOPOLOGY_FULL_CONNECTED 1
lIntTmp = NetTopologyGet(0);
NetTopologySet
Purpose
Sets the current topology of the neural network. The available topologies are:
0: Manual. The topology, that is, the number of neurons, inputs, outputs, and their connections, is completely free and is the responsibility of the programmer.
1: Full connected. All neurons from one layer are connected to all neurons of the next layer.
Full connected topology is much faster and uses about 5% of the memory used by the manual topology, but manual topology allows you to connect neurons as you wish, which can be convenient for scientific and experimentation purposes.
Modes Standard back propagation and Dynamic propagation (beta) are only compatible with manual topology and therefore they need you to connect the neurons first.
Please note that:
- If you create the network in a topology different than manual, then, if you change to manual topology, you will have to connect neurons, or the network will not work.
- Modes Standard back propagation and Dynamic propagation can only be used in manual topology; therefore, they need you to connect the neurons first.
- If you want to use manual topology with the faster mode Normal, you can do it by using manual topology and Standard back propagation mode to create the neural network and then, changing mode to Normal.
The following function calls do not work in full connected topology:
- NetConnect
- NetConnectConsecutive
- NetConnectLayer
- NeuWeightUpdatedGet
- NeuValueUpdatedGet
- NetErrorGet
Declarations
Standard C:
extern void NetTopologySet(int pIntNet, int pInt);
MS Visual Studio:
extern void _cdecl NetTopologySet(int pIntNet, int pInt);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall NetTopologySet(int pIntNet, int pInt);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- The desired topology.
Returns
Nothing.
Usage
The following C example sets topology of the first network to full connected:
#define TOPOLOGY_MANUAL 0
#define TOPOLOGY_FULL_CONNECTED 1
NetTopologySet(0, TOPOLOGY_FULL_CONNECTED);
Neuron functions
A neuron is an entity, kept in memory, which internally has a value, a bias and a delta. Neurons can also be inputs and outputs. A neuron can be connected to any other neuron in the network.
NeuValueGet
Purpose
Returns the current value of a neuron.
Declarations
Standard C:
extern float NeuValueGet(int pIntNet, int pIntNeuron);
MS Visual Studio:
extern float _cdecl NeuValueGet(int pIntNet, int pIntNeuron);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall NeuValueGet(int pIntNet, int pIntNeuron);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Number of neuron to obtain its current value.
Returns
A float.
Usage
The following C code puts into a float variable the current value of the 10th neuron of the first network:
float lSngTmp = NeuValueGet(0, 9);
NeuValueUpdatedGet
Purpose
Returns the number of times that the value was calculated for a neuron, since these functions were called for the last time:
- NetCreate
- NetInitialize
Declarations
Standard C:
extern int NeuValueUpdatedGet(int pIntNet, int pIntNeuron);
MS Visual Studio:
extern int _cdecl NeuValueUpdatedGet(int pIntNet, int pIntNeuron);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NeuValueUpdatedGet(int pIntNet, int pIntNeuron);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Number of neuron, starting with zero.
Returns
An integer.
Usage
The following C code puts into an integer variable the number of times that the value of the 10th neuron in the first network was updated:
int lIntTmp = NeuValueUpdatedGet(0, 9);
NeuWeightGet
Purpose
Returns the current weight, bias or delta of a neuron.
Declarations
Standard C:
extern float NeuWeightGet(int pIntNet, int pIntNeuron, int pIntInput, bool pBolDelta);
MS Visual Studio:
extern float _cdecl NeuWeightGet(int pIntNet, int pIntNeuron, int pIntInput, bool pBolDelta);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall NeuWeightGet(int pIntNet, int pIntNeuron, int pIntInput, bool pBolDelta);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Number of neuron, starting with zero.
- Number of input. Put zero here for the bias.
- False to obtain the current weight or true to obtain the delta that will be applied to the weight.
Returns
A float.
Usage
The following C code puts into a float variable the current weight of the first input of the 10th neuron in the first network.
float lSngTmp = NeuWeightGet(0, 9, 1, false);
NeuWeightUpdatedGet
Purpose
Returns the number of times that the weight was calculated for a neuron since the last call to the function NetInitialize.
Declarations
Standard C:
extern int NeuWeightUpdatedGet(int pIntNet, int pIntNeuron, int pIntInput);
MS Visual Studio:
extern int _cdecl NeuWeightUpdatedGet(int pIntNet, int pIntNeuron, int pIntInput);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NeuWeightUpdatedGet(int pIntNet, int pIntNeuron, int pIntInput);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Number of neuron, starting with zero.
- Number of input, starting with 1.
Returns
An integer.
Usage
The following C code puts into an integer variable the number of times that the weight of the first input of the 10th neuron in the first network was updated:
int lIntTmp = NeuWeightUpdatedGet(0, 9, 1);
NeuWeightSet
Purpose
Sets the current weight, bias or delta of a neuron.
Declarations
Standard C:
extern bool NeuWeightSet(int pIntNet, int pIntNeuron, int pIntInput, float pSng, bool pBolDelta);
MS Visual Studio:
extern bool _cdecl NeuWeightSet(int pIntNet, int pIntNeuron, int pIntInput, float pSng, bool pBolDelta);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API bool _stdcall NeuWeightSet(int pIntNet, int pIntNeuron, int pIntInput, float pSng, bool pBolDelta);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Number of neuron, starting with zero.
- Number of weight. Put zero here for the bias.
- Weight value.
- False to set the current weight or true to set the delta that will be applied to the weight.
Returns
True if weight was set. False if neuron is an input or is outside of the total number of added neurons.
Usage
The following C code sets the current weight value of the first input of the tenth neuron in the first network:
NeuWeightSet(0, 9, 1, 0.25, false);
NeuWeightsNumberGet
Purpose
Returns the number of weights of a neuron. If this function returns -2, it means that this neuron is not relevant for bias and weights and therefore its bias and weights should not be saved.
Declarations
Standard C:
extern int NeuWeightsNumberGet(int pIntNet, int pIntNeuron);
MS Visual Studio:
extern int _cdecl NeuWeightsNumberGet(int pIntNet, int pIntNeuron);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall NeuWeightsNumberGet(int pIntNet, int pIntNeuron);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Number of neuron.
Returns
An integer with the number of weights or:
- -1: error.
- -2: this neuron is not relevant for biases and weights.
Usage
The following C code puts into an integer variable the number of weights of the 10th neuron of the first network:
int lIntTmp = NeuWeightsNumberGet(0, 9);
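The following illustrative sketch (which assumes that the weights of a neuron are indexed from 1 up to the value returned by NeuWeightsNumberGet, as in NeuWeightUpdatedGet) reads the bias and all the weights of the 10th neuron of the first network, skipping neurons that are not relevant:
int lIntWeights = NeuWeightsNumberGet(0, 9);
if (lIntWeights > 0) {
    float lSngBias = NeuWeightGet(0, 9, 0, false);  /* input 0 is the bias      */
    for (int i = 1; i <= lIntWeights; i++) {
        float lSngW = NeuWeightGet(0, 9, i, false); /* weights start at input 1 */
        /* ... store lSngBias and lSngW, for example to save the model manually ... */
    }
}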
Multiple set functions
The SDK is multiset, meaning it allows creating and deleting multiple sets in memory, over which we can later apply multiple neural networks. For more information, please refer to the “Set functions” section.
SetsAdd
Purpose
Adds a new set.
Declarations
Standard C:
extern int SetsAdd();
MS Visual Studio:
extern int _cdecl SetsAdd();
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall SetsAdd();
Parameters
None.
Returns
- 0: SetCreate_Success
- 1: SetCreate_OutOfMemory
- 2: SetCreate_UnknownError
- 3: SetCreate_IndicatedParametersAreIncorrect
Usage
The following C example adds a new set:
SetsAdd();
SetsClone
Purpose
Clones a source set to a destination set, deleting the previous content of the destination set.
Declarations
Standard C:
extern int SetsClone(int pIntSetSrc, int pIntSetDst);
MS Visual Studio:
extern int _cdecl SetsClone(int pIntSetSrc, int pIntSetDst);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall SetsClone(int pIntSetSrc, int pIntSetDst);
Parameters
- Index of source set.
- Index of destination set.
Returns
- 0: success.
- -1: error, typically could not create the new set.
Usage
The following C example clones set index 1 into index 0:
SetsClone(1, 0);
SetsLearn
Purpose
Makes all the indicated networks learn all the sets and all the records indicated previously by one or multiple calls to SetLearnStart.
SetsLearn, when running on NPUs (GPUs or similar), can be executed asynchronously, and then the function SetsLearnProgressGet can be called to obtain the percentage of progress.
Once SetsLearn has finished, the user must call SetsLearnEnd.
When calling SetLearnStart before using SetsLearn, do not repeat the index of network to be learned. In other words, all indices of the networks to be learned must be different.
Declarations
Standard C:
extern void SetsLearn(int pIntCurrentEpochNumber, bool pBolStopIfGradientsVanish, bool pBolAsync);
MS Visual Studio:
extern void _cdecl SetsLearn(int pIntCurrentEpochNumber, bool pBolStopIfGradientsVanish, bool pBolAsync);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall SetsLearn(int pIntCurrentEpochNumber, bool pBolStopIfGradientsVanish, bool pBolAsync);
Parameters
- The current epoch number, which must start at zero.
- A boolean which indicates if back propagation should stop if all gradients of the outputs layer became zero with output target values different than zero.
- If the call should be asynchronous, which is only possible when running on NPUs.
Returns
Nothing.
Usage
The following C example starts learning all nets, with all the sets and records indicated by one or multiple calls to SetLearnStart:
// one or multiple SetLearnStart calls must be performed before calling SetsLearn, to define the networks, sets and records to be learned by SetsLearn
SetsLearn(0, mBolStopLearningIfGradientsVanish, true);
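The following illustrative sketch (which assumes that SetsLearnProgressGet returns 1.0 once the asynchronous learning has finished) combines the three calls of the asynchronous workflow:
bool lBolThereIsNaN, lBolGradientExploded, lBolGradientsVanished;
float lSngEntropy;
SetsLearn(0, true, true); /* epoch 0, stop if gradients vanish, asynchronous (NPUs only) */
while (SetsLearnProgressGet() < 1.0f) {
    /* for example, report the progress to the user */
}
SetsLearnEnd(0, &lBolThereIsNaN, &lBolGradientExploded, &lBolGradientsVanished, &lSngEntropy);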
SetsLearnEnd
Purpose
Marks as finished the training based on the currently memorized sets of records and networks, indicated by one or multiple calls to SetLearnStart and learned by SetsLearn.
Declarations
Standard C:
extern void SetsLearnEnd(int pIntCurrentEpochNumber, bool* pBolThereIsNan, bool* pBolGradientExploded, bool* pBolGradientsVanished, float* pSngEntropy);
MS Visual Studio:
extern void _cdecl SetsLearnEnd(int pIntCurrentEpochNumber, bool* pBolThereIsNan, bool* pBolGradientExploded, bool* pBolGradientsVanished, float* pSngEntropy);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall SetsLearnEnd(int pIntCurrentEpochNumber, bool* pBolThereIsNan, bool* pBolGradientExploded, bool* pBolGradientsVanished, float* pSngEntropy);
Parameters
- The current epoch number, which must start at zero.
- A returned boolean parameter which indicates with true if the training generated NaN (Not a Number) values.
- A returning boolean indicating if, during forward propagation, a gradient became infinite. This only applies to layers of type SoftMax and will indicate that the last step of SoftMax was not performed. This could indicate that the learning rate should be lowered.
- A returning boolean indicating if, during back propagation, all gradients became zero with output target values different than zero.
- A returning float containing the entropy.
Returns
Nothing.
Usage
The following C example marks as finished the learning of all nets, with all the sets and records indicated by one or multiple previous calls to SetLearnStart:
SetsLearnEnd(0, &mBolThereIsNaN, &mBolGradientExploded, &mBolGradientsVanished, &mSngDatasetEntropy);
SetsLearnProgressGet
Purpose
Returns the percentage of progress, in the range [0, 1], of the learning process initiated by an asynchronous call to SetsLearn.
Declarations
Standard C:
extern float SetsLearnProgressGet();
MS Visual Studio:
extern float _cdecl SetsLearnProgressGet();
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall float SetsLearnProgressGet();
Parameters
None.
Returns
A float with the percentage of progress, in the range [0, 1].
Usage
The following C example stores into a variable the progress as a percentage in the range [0, 100]:
lSngPercentageOfAdvance = SetsLearnProgressGet() * 100.0;
SetsRemove
Purpose
Deletes a set.
Declarations
Standard C:
extern int SetsRemove(int pIntSet);
MS Visual Studio:
extern int _cdecl SetsRemove(int pIntSet);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall SetsRemove(int pIntSet);
Parameters
Index of the set.
Returns
- 0: SetCreate_Success
- 1: SetCreate_OutOfMemory
- 2: SetCreate_UnknownError
- 3: SetCreate_IndicatedParametersAreIncorrect
Usage
The following C example deletes the first set:
SetsRemove(0);
SetsNumberGet
Purpose
Returns the total number of sets created so far.
Declarations
Standard C:
extern int SetsNumberGet();
MS Visual Studio:
extern int _cdecl SetsNumberGet();
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall SetsNumberGet();
Parameters
None.
Returns
The total number of sets.
Usage
The following C example stores the total number of sets in a variable:
int lIntTotalNumberOfSets = SetsNumberGet();
Set functions
If you want, as this is optional, you can create a set with your records. Then, you can train the network directly in memory from the set. This will result in faster learning and the need of a smaller number of records.
The procedure is as follows (see also the outline sketched after this list):
- Make sure that the set is empty by calling SetFree.
- Every time you have set up the inputs and the outputs, call SetRecordInsert to create a new record in the set. If you want to speed up SetRecordInsert, and to save time and computing power, you can call SetCreate before any SetRecordInsert so that the memory will be pre-reserved at once.
- At any moment, you can update the inputs and outputs of a record with SetRecordSet or delete a record by calling SetRecordDelete.
- When all the records that you want to use in the training have been created, then you must, for each epoch:
- SetLearnStart (once)
- SetLearnContinue (per record) or SetLearnConsecutive (per group of consecutive records).
- SetLearnEnd (once) to finish. This call is optional, but recommended in case you are using batches to learn, and needed in Dynamic propagation mode to allow the auto adaptation of the neural network.
Remember to use NetSnapshotTake to store the network configuration while the set is being used for training and, after SetLearnEnd, to use NetSnapshotGet to recall the best network configuration obtained during training.
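The following illustrative outline summarizes the procedure; the Set* calls are shown as comments because their exact parameters are documented in their own sections of this guide:
/* SetFree(...);              make sure the set is empty                        */
/* SetCreate(...);            optional: pre-reserve the memory for the records  */
/* SetRecordInsert(...);      once per record, after setting inputs and outputs */
for (int lIntEpoch = 0; lIntEpoch < 10; lIntEpoch++) { /* 10 epochs, as an example */
    /* SetLearnStart(...);    once per epoch                                    */
    /* SetLearnContinue(...); once per record (or SetLearnConsecutive)          */
    /* SetLearnEnd(...);      once per epoch (optional, see above)              */
    /* if the success rate improved, memorize this configuration:               */
    NetSnapshotTake(0);
}
NetSnapshotGet(0);             /* recall the best configuration after training  */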
The set also provides auto machine learning functions which automate certain tasks of a search of a learning model:
- SetAutoLearnStart
- SetAutoLearnContinue
- SetAutoLearnEnd
The set auto learn feature only works with MODE_NORMAL.
Please check a source code example to better understand the set and its great benefits.
SetAutoLearnContinue (beta)
Purpose
Continues the set auto learn feature. The function SetAutoLearnStart must be called before calling this function. The function SetAutoLearnEnd must be called at the end of the auto learn process.
Declarations
Standard C:
extern float SetAutoLearnContinue(int pIntNet, int pIntSet, int* pIntNumberOfLayers, int* pIntCurrentRecord, int* pIntCurrentEpochNumber, int *pIntAutoCurrentOption, bool pBolStopIfGradientsVanish, bool* pBolThereIsNaN, int* pIntStatus, bool* pBolGradientExploded, bool* pBolGradientsVanished, float* pSngEntropy);
MS Visual Studio:
extern float _cdecl SetAutoLearnContinue(int pIntNet, int pIntSet, int* pIntNumberOfLayers, int* pIntCurrentRecord, int* pIntCurrentEpochNumber, int *pIntAutoCurrentOption, bool pBolStopIfGradientsVanish, bool* pBolThereIsNaN, int* pIntStatus, bool* pBolGradientExploded, bool* pBolGradientsVanished, float* pSngEntropy);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall SetAutoLearnContinue(int pIntNet, int pIntSet, int* pIntNumberOfLayers, int* pIntCurrentRecord, int* pIntCurrentEpochNumber, int *pIntAutoCurrentOption, bool pBolStopIfGradientsVanish, bool* pBolThereIsNaN, int* pIntStatus, bool* pBolGradientExploded, bool* pBolGradientsVanished, float* pSngEntropy);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Index of the set.
- An integer which returns the number of layers of the current neural network. This value is what the function is trying to create, but the actual number of layers of the neural network may differ. To find out the real current number of layers of the neural network, please use NetLayersNumberGet.
- An integer which returns the current number of the record of the data set being learned.
- An integer which returns the current number of the epoch in the learning process.
- An integer which returns the current option (from the list of options defined by SetAutoLearnStart) being tested.
- A boolean which indicates if back propagation should stop if all gradients of the outputs layer became zero with output target values different than zero.
- A returned boolean parameter which indicates with true if the training generated NaN (Not a Number) values.
- An integer which returns the status, following the constants related to NetLearn_Success.
- A returning boolean indicating if, during forward propagation, a gradient became infinite. This only applies to layers of type SoftMax, and will indicate that the last step of SoftMax was not performed. This could indicate that the learning rate should be lowered.
- A returning boolean indicating if, during back propagation, all gradients became zero with output target values different than zero.
- A returning float containing the entropy.
Returns
A float with the current success rate.
Usage
The following C example continues the set auto learn process of the first network:
float lSngSuccessRate = SetAutoLearnContinue(0, 0, &lIntNumberOfLayers, &lIntCurrentRecord, &lIntCurrentEpochNumber, &lIntCurrentOption, true, &lBolThereIsNaN, &lIntStatus, &lBolGradientsExploded, &lBolGradientsVanished, &lSngEntropy);
SetAutoLearnEnd (beta)
Purpose
Finishes the set auto learn feature. In the mode AUTOLEARN_MODE_INFINITE_DATA, this function will restore the data set to the content it had when the function SetAutoLearnStart was called.
Functions SetAutoLearnStart and SetAutoLearnContinue must be called before calling this function.
Declarations
Standard C:
extern bool SetAutoLearnEnd(int pIntNet, int pIntSet);
MS Visual Studio:
extern bool _cdecl SetAutoLearnEnd(int pIntNet, int pIntSet);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API bool _stdcall SetAutoLearnEnd(int pIntNet, int pIntSet);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Index of the set.
Returns
True if no error.
Usage
The following C example finishes the set auto learn process of the first network:
bool lBolOk = SetAutoLearnEnd(0, 0);
SetAutoLearnStart (beta)
Purpose
Starts the auto learn features of the set. This function must be called before calling SetAutoLearnContinue. The function SetAutoLearnEnd must be called at the end of the auto learn process.
There are 2 possible modes:
- AUTOLEARN_MODE_FINITE_DATA: this mode is recommended when we have a finite number of data records. It tests different neural network architectures, training with the part of the data set reserved for training and testing with the part reserved for testing, for a number of epochs, and performs an early stop, moving to the next network architecture, if the trend of the success rate will not reach the target.
- AUTOLEARN_MODE_INFINITE_DATA: this mode is recommended when we expect an infinite quantity of data, for example data coming from humans interacting with our system (also known as "lifelong learning"). In this mode, the data set will be altered until SetAutoLearnEnd is called. It tests a list of neural network architectures, together with a list of possible data augmentations, for a number of epochs, and provides the value of the trend of the success rate and the error of the trend. The error of the trend indicates how likely it is that the real success rate could be temporarily much lower than the provided trend value.
Declarations
Standard C:
extern int SetAutoLearnStart(int pIntNet, int pIntSet, int pIntAutoMode, int pIntCyclesControl, int pIntMaxNeurons, int pIntEpochs, int pIntBatches, int pIntLayersStart, int pIntLayersEnd, int pIntNumberOfFirstRecordForTraining, int pIntNumberOfLastRecordForTraining, int pIntNumberOfWhichForValidation, float pSngThresholdForActive, float pSngDeviationPctTarget, float pSngTargetSuccessesOverRecords, int pIntTestType, int pIntTimeForTrend, int pIntItemsForTrend, int pIntIWidth, int pIntIHeight, int pIntNumberOfOptions, char** pNets, float* pLearningRates, float* pInitMeans, float* pInitVariances, char** pDropouts, int* pMinNumberOfRecords, int* pReturnNumberOfPredictions, int* pAugShapes, float* pAugRotationFrom, float* pAugRotationTo, float* pAugRotationStep, float* pAugBrightnessFrom, float* pAugBrightnessTo, float* pAugBrightnessStep, float* pAugContrastFrom, float* pAugContrastTo, float* pAugContrastStep, float* pAugGammaFrom, float* pAugGammaTo, float* pAugGammaStep, float* pAugMoveHorizontalFrom, float* pAugMoveHorizontalTo, float* pAugMoveHorizontalStep, float* pAugMoveVerticalFrom, float* pAugMoveVerticalTo, float* pAugMoveVerticalStep, float* pAugZoomFrom, float* pAugZoomTo, float* pAugZoomStep, float* pAugHueFrom, float* pAugHueTo, float* pAugHueStep, float* pAugSaturationFrom, float* pAugSaturationTo, float* pAugSaturationStep, int* pAugNumberOfRecordsPerRecord, float* pTrendFinal, float* pError);
MS Visual Studio:
extern int _cdecl SetAutoLearnStart(int pIntNet, int pIntSet, int pIntAutoMode, int pIntCyclesControl, int pIntMaxNeurons, int pIntEpochs, int pIntBatches, int pIntLayersStart, int pIntLayersEnd, int pIntNumberOfFirstRecordForTraining, int pIntNumberOfLastRecordForTraining, int pIntNumberOfWhichForValidation, float pSngThresholdForActive, float pSngDeviationPctTarget, float pSngTargetSuccessesOverRecords, int pIntTestType, int pIntTimeForTrend, int pIntItemsForTrend, int pIntIWidth, int pIntIHeight, int pIntNumberOfOptions, char** pNets, float* pLearningRates, float* pInitMeans, float* pInitVariances, char** pDropouts, int* pMinNumberOfRecords, int* pReturnNumberOfPredictions, int* pAugShapes, float* pAugRotationFrom, float* pAugRotationTo, float* pAugRotationStep, float* pAugBrightnessFrom, float* pAugBrightnessTo, float* pAugBrightnessStep, float* pAugContrastFrom, float* pAugContrastTo, float* pAugContrastStep, float* pAugGammaFrom, float* pAugGammaTo, float* pAugGammaStep, float* pAugMoveHorizontalFrom, float* pAugMoveHorizontalTo, float* pAugMoveHorizontalStep, float* pAugMoveVerticalFrom, float* pAugMoveVerticalTo, float* pAugMoveVerticalStep, float* pAugZoomFrom, float* pAugZoomTo, float* pAugZoomStep, float* pAugHueFrom, float* pAugHueTo, float* pAugHueStep, float* pAugSaturationFrom, float* pAugSaturationTo, float* pAugSaturationStep, int* pAugNumberOfRecordsPerRecord, float* pTrendFinal, float* pError);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall SetAutoLearnStart(int pIntNet, int pIntSet, int pIntAutoMode, int pIntCyclesControl, int pIntMaxNeurons, int pIntEpochs, int pIntBatches, int pIntLayersStart, int pIntLayersEnd, int pIntNumberOfFirstRecordForTraining, int pIntNumberOfLastRecordForTraining, int pIntNumberOfWhichForValidation, float pSngThresholdForActive, float pSngDeviationPctTarget, float pSngTargetSuccessesOverRecords, int pIntTestType, int pIntTimeForTrend, int pIntItemsForTrend, int pIntIWidth, int pIntIHeight, int pIntNumberOfOptions, char** pNets, float* pLearningRates, float* pInitMeans, float* pInitVariances, char** pDropouts, int* pMinNumberOfRecords, int* pReturnNumberOfPredictions, int* pAugShapes, float* pAugRotationFrom, float* pAugRotationTo, float* pAugRotationStep, float* pAugBrightnessFrom, float* pAugBrightnessTo, float* pAugBrightnessStep, float* pAugContrastFrom, float* pAugContrastTo, float* pAugContrastStep, float* pAugGammaFrom, float* pAugGammaTo, float* pAugGammaStep, float* pAugMoveHorizontalFrom, float* pAugMoveHorizontalTo, float* pAugMoveHorizontalStep, float* pAugMoveVerticalFrom, float* pAugMoveVerticalTo, float* pAugMoveVerticalStep, float* pAugZoomFrom, float* pAugZoomTo, float* pAugZoomStep, float* pAugHueFrom, float* pAugHueTo, float* pAugHueStep, float* pAugSaturationFrom, float* pAugSaturationTo, float* pAugSaturationStep, int* pAugNumberOfRecordsPerRecord, float* pTrendFinal, float* pError);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Index of the set.
- pIntAutoMode: the mode to be used, one of:
- #define AUTOLEARN_MODE_FINITE_DATA 0
- #define AUTOLEARN_MODE_INFINITE_DATA 1
- pIntCyclesControl: control of cycles (for more information please view the introduction):
- 0: no cycles will be checked.
- 1: cycles will be checked and avoided.
- pIntMaxNeurons: maximum number of neurons. Put zero for no limit.
- pIntEpochs: number of epochs to learn.
- pIntBatches: number of batches to perform the learning. If this parameter is not zero, actions will be taken and execution will return after each batch.
- pIntLayersStart: the number of layers at which to start creating neural networks. The minimum value to start with is 2, although you can put zero in this parameter to ignore it and use the current neural network. See NetArchitectureCreateAuto in this document for more information.
- pIntLayersEnd: the number of layers at which to stop creating neural networks. See NetArchitectureCreateAuto in this document for more information. You can put zero in this parameter to ignore it and use the current neural network.
- pIntNumberOfFirstRecordForTraining: number of the first record for training, starting at zero. Not necessary in AUTOLEARN_MODE_INFINITE_DATA mode.
- pIntNumberOfLastRecordForTraining: number of the last record for training. Not necessary in AUTOLEARN_MODE_INFINITE_DATA mode.
- pIntNumberOfWhichForValidation: number of records, from the range pIntNumberOfFirstRecordForTraining to pIntNumberOfLastRecordForTraining, which will be used for validation, not for test. The records of the data set which are not included in that range will be used for test. If this parameter is not zero, the learning rate will also be auto adjusted.
- pSngThresholdForActive: float indicating which activation value means that an output is active.
- pSngDeviationPctTarget: float indicating which percentage of deviation in the activation value of an output will mean that the output is not active. This is an alternative to pSngThresholdForActive; put zero in this parameter if you are going to use a threshold for active instead.
- pSngTargetSuccessesOverRecords: a float in the range [0, 1] indicating the target percentage of successful predictions over the total records. In AUTOLEARN_MODE_FINITE_DATA, this parameter determines an early stop when the trend of the success rate will not reach it in the remaining epochs.
- pIntTestType: test type. For more information, please read SetLearnStart in this document.
- pIntTimeForTrend: time in seconds to analyze if the trend of the success rate is going to reach the target parameter pSngTargetSuccessesOverRecords. If indicated, leave the following parameter as zero.
- pIntItemsForTrend: the number of success rates to analyze if the trend of the success rate is going to reach the target parameter pSngTargetSuccessesOverRecords.
- The following parameters only apply to the mode AUTOLEARN_MODE_INFINITE_DATA:
- pIntIWidth: width of the inputs.
- pIntIHeight: height of the inputs.
- pIntNumberOfOptions: number of elements in the following lists.
- pNets: pointer to a list of strings with definitions of network architectures. Please read NetArchitectureCreate in this document for more information on detailed and mnemotechnic strings.
- pLearningRates: a list of floats with learning rates.
- pInitMeans: list of floats with the mean values to randomly initialize the neural network.
- pInitVariances: list of floats with the variance values to randomly initialize the neural network.
- pDropouts: pointer to a list of strings with the dropout rates of each layer, separated by commas (,).
- pMinNumberOfRecords: list of integers which indicates, for each option, the number of records which will be stored in an internal data set, augmented, and then used as a big batch for learning. Please note that, inside this batch, there can still be the learning batches indicated with pIntBatches. The difference between the batch created by this parameter and the batches created by pIntBatches is that the batch of this parameter is designed to guarantee that the network learns enough different classes of records, while the pIntBatches batches are only intended to accumulate the adjustment of biases and weights in batches.
- pReturnNumberOfPredictions: list of integers with, for each option, the number of the most probable predictions that will be used to check success. For example: let us consider that the neural network is learning to classify 10 types of fruits. A value of 3 in this parameter will consider a success if a record of the dataset was an image of a banana and the neural network predicted banana as the 1st, 2nd or 3rd most probable output.
- For parameters from pAugShapes to pAugSaturationStep, please read the function SetImagesAugment also in this document.
- pAugNumberOfRecordsPerRecord: a list of integers returning the number of augmented records that will be created per record of the data set.
- pTrendFinal: a list of float numbers returning the final trend of success rate (percentage) calculated by this function.
- pError: a list of float numbers returning the final error of the measured success rates with respect to the trend calculated by this function. Errors are calculated as the average of the sum of the squares of the differences between success rates and the trend.
Returns
An integer, indicating the result:
- 0: success.
- 1: learning process generated NaN (Not a Number) values, although it continued. This could mean that you should use Sigmoid as the activation function.
- 2: error managing threads.
- 3: the set has no records.
- 4: error analyzing layers.
- 5: parameters are incorrect:
- The number of epochs must be greater than zero.
- The number of batches must be greater than zero.
- The number of records for training must be greater than zero.
- The number of records for validation must not be negative.
- The number of records in the set must be greater than the number of batches.
- The threshold to consider active must be greater than zero.
- 6: error creating topology.
- 7: success of test while learning.
- 8: objective was not achieved and there are no more layers to try.
- 9: epochs finished.
- 10: could not initialize topology and start.
- 11: incorrect mode and/or topology.
- 12: number of inputs or outputs of set do not match those of the neural network.
Usage
The following C example prepares the set auto learn feature to test different options to find the best learning model for the first network:
#define NetLearn_Success 0
#define NetLearn_NAN 1
#define NetLearn_ThreadsError 2
#define NetLearn_SetHasNoRecords 3
#define NetLearn_ErrorAnalyzingLayers 4
#define NetLearn_IndicatedParametersAreIncorrect 5
#define NetLearn_ErrorCreatingArchitecture 6
#define NetLearn_Success_Tested 7
#define NetLearn_Objective_Not_Achieved_And_No_More_Layers_To_Try 8
#define NetLearn_Epochs_Finished 9
#define NetLearn_Could_Not_Initialize_Architecture_And_Start 10
#define NetLearn_Incorrect_Mode_And_Or_Topology 11
#define NetLearn_Number_Of_Inputs_Or_Outputs_Of_Set_Do_Not_Match_Neural_Network 12
#define NetLearn_Out_Of_Memory 13
#define NetLearn_Could_not_augment_data_set 14
#define NetLearn_Options_Finished 15
#define NUM_EPOCHS 5
#define NEURONS_THRESHOLD_FOR_ACTIVE 0.5
#define NUMBER_OF_RECENT_PREDICTIONS_TO_COUNT_SUCCESS 100
if (SetAutoLearnStart(0, 0, AUTOLEARN_MODE_INFINITE_DATA, 0, 0, NUM_EPOCHS, 0, 0, 0, 0, 0, 0, NEURONS_THRESHOLD_FOR_ACTIVE, 0, 0, TEST_TYPE_ByMax, 0, NUMBER_OF_RECENT_PREDICTIONS_TO_COUNT_SUCCESS, mIntImageWidth, mIntImageHeight, lAutoNets->count, lAutoNets->item, lLearningRates->item, lInitMeans->item, lInitVariances->item, lAutoDropouts->item, lMinNumberOfRecords->item, lReturnNumberOfPredictions->item, lAugShapes->item, lAugRotationFrom->item, lAugRotationTo->item, lAugRotationStep->item, lAugBrightnessFrom->item, lAugBrightnessTo->item, lAugBrightnessStep->item, lAugContrastFrom->item, lAugContrastTo->item, lAugContrastStep->item, lAugGammaFrom->item, lAugGammaTo->item, lAugGammaStep->item, lAugMoveHorizontalFrom->item, lAugMoveHorizontalTo->item, lAugMoveHorizontalStep->item, lAugMoveVerticalFrom->item, lAugMoveVerticalTo->item, lAugMoveVerticalStep->item, lAugZoomFrom->item, lAugZoomTo->item, lAugZoomStep->item, lAugHueFrom->item, lAugHueTo->item, lAugHueStep->item, lAugSaturationFrom->item, lAugSaturationTo->item, lAugSaturationStep->item, lAugNumberOfRecordsPerRecord->item, lTrendFinal->item, lError->item) == NetLearn_Success)
    printf("Ready to start.\n");
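The option lists used in the example above (lAutoNets, lLearningRates, lAutoDropouts, and so on) are not SDK objects; they are assumed to be simple application-side containers exposing a count and an item pointer. A minimal, hypothetical sketch for a single option could be:
/* Hypothetical application-side containers; only the count and item members are used by the example above. */
typedef struct { int count; char** item; } StrList;
typedef struct { int count; float* item; } FloatList;
typedef struct { int count; int* item; } IntList;
char* lNetDefinitions[1] = { "<architecture string, see NetArchitectureCreate>" };
float lRates[1] = { 0.01f };
StrList lAutoNetsData = { 1, lNetDefinitions };
FloatList lLearningRatesData = { 1, lRates };
StrList* lAutoNets = &lAutoNetsData;
FloatList* lLearningRates = &lLearningRatesData;
/* ...the remaining lists (lInitMeans, lAugShapes, lTrendFinal, lError, etc.) follow the same pattern, with one element per option... */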
SetCreate
Purpose
Creates the memory structure for a set of inputs and outputs for the neural network. This function is not strictly necessary, as calling SetRecordInsert multiple times will also reserve memory for each record, but that is slower than calling SetCreate once beforehand. Therefore, SetCreate should be called before the multiple calls to SetRecordInsert, as it will be faster by reserving all the needed memory for the whole set at once.
Please remember that SetFree must be called after SetCreate or the memory will not be freed by any other function, which could lead to memory corruption problems.
Declarations
Standard C:
extern bool SetCreate(int pIntSet, int pIntTotalNumberOfRecords, int pIntNumberOfInputsChannels, int pIntNumberOfInputs, int pIntNumberOfOutputs);
MS Visual Studio:
extern bool _cdecl SetCreate(int pIntSet, int pIntTotalNumberOfRecords, int pIntNumberOfInputsChannels, int pIntNumberOfInputs, int pIntNumberOfOutputs);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API bool _stdcall SetCreate(int pIntSet, int pIntTotalNumberOfRecords, int pIntNumberOfInputsChannels, int pIntNumberOfInputs, int pIntNumberOfOutputs);
Parameters
- Index of the set.
- Total number of records of the data set.
- The number of channels of the inputs (for example 3 for RGB [red, green, blue] images).
- Number of inputs.
- Number of outputs.
Returns
- True: memory was reserved successfully.
- False: out of memory.
Usage
The following C example reserves memory for a data set of 5000 records, with 3 input channels of 100 inputs each and 1 output:
SetCreate(0, 5000, 3, 100, 1);
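A hedged sketch of the full lifecycle described above (reserve once, insert records, free at the end); the calls that load each record's inputs and outputs into the network are omitted, as they are documented in their own sections of this guide:
#define TOTAL_RECORDS 5000
if (SetCreate(0, TOTAL_RECORDS, 3, 100, 1))               /* reserve memory for the whole set at once */
{
    for (int i = 0; i < TOTAL_RECORDS; i++)
    {
        /* ...set the current inputs and outputs of the neural network here... */
        if (SetRecordInsert(0, 0) == -1)                  /* memorize them as a new record; -1 means error */
            break;
    }
    /* ...train or predict using the set... */
    SetFree(0, true);                                     /* free the set and its snapshot */
}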
SetFree
Purpose
Destroys the current set of records and frees the memory.
Declarations
Standard C:
extern void SetFree(int pIntSet, bool pBolSnapshotsFree);
MS Visual Studio:
extern void _cdecl SetFree(int pIntSet, bool pBolSnapshotsFree);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall SetFree(int pIntSet, bool pBolSnapshotsFree);
Parameters
- Index of the set.
- A boolean indicating if the snapshot should also be freed first.
Returns
Nothing.
Usage
The following C example destroys the first set of records and frees the snapshot and the memory:
SetFree(0, true);
SetImagesAugment
Purpose
Helps the neural network generalize better when learning images. This is achieved by taking a subset of consecutive records of the data set and generating new records, based on the original records, by changing different image parameters, a process also known as augmentation.
If you need to know the number of records that will be generated by this function before using it, call first SetImagesAugmentNumberGet.
This function will produce a vertical flip if requested to rotate 360 degrees.
Declarations
Standard C:
extern bool SetImagesAugment(int pIntSet, int pIntFirstRecord, int pIntLastRecord, int pIntWidth, int pIntHeight, int pIntSetWidth, int pIntSetHeight, int pIntMaxNumberOfRecordsPerRecord, int pIntFilterShape, float pSngRotationFrom, float pSngRotationTo, float pSngRotationStep, float pSngBrightnessFrom, float pSngBrightnessTo, float pSngBrightnessStep, float pSngContrastFrom, float pSngContrastTo, float pSngContrastStep, float pSngGammaFrom, float pSngGammaTo, float pSngGammaStep, float pSngMoveHorizontalFrom, float pSngMoveHorizontalTo, float pSngMoveHorizontalStep, float pSngMoveVerticalFrom, float pSngMoveVerticalTo, float pSngMoveVerticalStep, float pSngZoomFrom, float pSngZoomTo, float pSngZoomStep, float pSngHueFrom, float pSngHueTo, float pSngHueStep, float pSngSaturationFrom, float pSngSaturationTo, float pSngSaturationStep);
MS Visual Studio:
extern bool _cdecl SetImagesAugment(int pIntSet, int pIntFirstRecord, int pIntLastRecord, int pIntWidth, int pIntHeight, int pIntSetWidth, int pIntSetHeight, int pIntMaxNumberOfRecordsPerRecord, int pIntFilterShape, float pSngRotationFrom, float pSngRotationTo, float pSngRotationStep, float pSngBrightnessFrom, float pSngBrightnessTo, float pSngBrightnessStep, float pSngContrastFrom, float pSngContrastTo, float pSngContrastStep, float pSngGammaFrom, float pSngGammaTo, float pSngGammaStep, float pSngMoveHorizontalFrom, float pSngMoveHorizontalTo, float pSngMoveHorizontalStep, float pSngMoveVerticalFrom, float pSngMoveVerticalTo, float pSngMoveVerticalStep, float pSngZoomFrom, float pSngZoomTo, float pSngZoomStep, float pSngHueFrom, float pSngHueTo, float pSngHueStep, float pSngSaturationFrom, float pSngSaturationTo, float pSngSaturationStep);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API bool _stdcall SetImagesAugment(int pIntSet, int pIntFirstRecord, int pIntLastRecord, int pIntWidth, int pIntHeight, int pIntSetWidth, int pIntSetHeight, int pIntMaxNumberOfRecordsPerRecord, int pIntFilterShape, float pSngRotationFrom, float pSngRotationTo, float pSngRotationStep, float pSngBrightnessFrom, float pSngBrightnessTo, float pSngBrightnessStep, float pSngContrastFrom, float pSngContrastTo, float pSngContrastStep, float pSngGammaFrom, float pSngGammaTo, float pSngGammaStep, float pSngMoveHorizontalFrom, float pSngMoveHorizontalTo, float pSngMoveHorizontalStep, float pSngMoveVerticalFrom, float pSngMoveVerticalTo, float pSngMoveVerticalStep, float pSngZoomFrom, float pSngZoomTo, float pSngZoomStep, float pSngHueFrom, float pSngHueTo, float pSngHueStep, float pSngSaturationFrom, float pSngSaturationTo, float pSngSaturationStep);
Parameters
- Index of the set.
- First record to be augmented.
- Last record to be augmented.
- Width of images. All images must have the same width.
- Height of images. All images must have the same height.
- Width of the set. It can be larger or equal to pIntWidth, but not smaller.
- Height of the set. It can be larger or equal to pIntHeight, but not smaller.
- Maximum number of records that will be generated per record. Put 0 to generate all possible records.
- Filter the image by a shape. Rotations, horizontal and vertical moves, zoom, and shape filters can only be applied if the width and height of the images are equal.
- The following parameters are all optional. Each has a from parameter to indicate the starting value, a to parameter to indicate the finishing value, and a step parameter to indicate the increment applied when going from the start value to the end value:
- Degrees to rotate images.
- Brightness.
- Contrast.
- Gamma.
- Horizontal move.
- Vertical move.
- Zoom.
- Hue.
- Saturation.
These parameters must be in the range of these constants:
#define AUGMENT_FILTER_SHAPE_ALL 0
#define AUGMENT_FILTER_SHAPE_CIRCLE 1
#define AUGMENT_BRIGHT_MIN -150.0
#define AUGMENT_BRIGHT_MAX 150.0
#define AUGMENT_BRIGHT_NORMAL 0.0
#define AUGMENT_CONTRAST_MIN -100.0
#define AUGMENT_CONTRAST_MAX 100.0
#define AUGMENT_CONTRAST_NORMAL 0.0
#define AUGMENT_GAMMA_MIN 0.01
#define AUGMENT_GAMMA_MAX 10.0
#define AUGMENT_GAMMA_NORMAL 1.0
#define AUGMENT_MOVE_MIN -100.0
#define AUGMENT_MOVE_MAX 100.0
#define AUGMENT_MOVE_NORMAL 0.0
#define AUGMENT_ZOOM_MIN 0.01
#define AUGMENT_ZOOM_MAX 10.0
#define AUGMENT_ZOOM_NORMAL 1.0
#define AUGMENT_HUE_MIN 0.0
#define AUGMENT_HUE_MAX 360.0
#define AUGMENT_HUE_NORMAL 0.0
#define AUGMENT_SATURATION_MIN 0.0
#define AUGMENT_SATURATION_MAX 10.0
#define AUGMENT_SATURATION_NORMAL 0.0
Returns
True if the images were created inside the data set, or false if it could not, typically because of out of memory.
Usage
The following C function augments the first data set or calculates the number of new augmented records that will be created. It is important to use the same function to calculate the number of augmented records and to augment them, to avoid putting different parameters in both calls. All “mSng…” variables are float module variables:
bool augmentRecordsPerRecord(bool pBolOnlyCalculate, bool pBolOnlyShape, int pIntRecordFrom, int pIntRecordTo, int *pIntNumberOfRecordsPerRecord)
{
    if (pBolOnlyShape)
    {
        /* Only the shape filter will be applied: keep every augmentation range at its normal value. */
        mSngRotationFrom = 0;
        mSngRotationTo = 0;
        mSngBrightnessFrom = AUGMENT_BRIGHT_NORMAL;
        mSngBrightnessTo = AUGMENT_BRIGHT_NORMAL;
        mSngContrastFrom = AUGMENT_CONTRAST_NORMAL;
        mSngContrastTo = AUGMENT_CONTRAST_NORMAL;
        mSngZoomFrom = AUGMENT_ZOOM_NORMAL;
        mSngZoomTo = AUGMENT_ZOOM_NORMAL;
        mSngHueFrom = AUGMENT_HUE_NORMAL;
        mSngHueTo = AUGMENT_HUE_NORMAL;
        mSngSaturationFrom = AUGMENT_SATURATION_NORMAL;
        mSngSaturationTo = AUGMENT_SATURATION_NORMAL;
    }
    else
    {
        /* Randomize the starting value of each range; the corresponding "To" values keep their current module values. */
        mSngRotationFrom = AUGMENTATION_ROTATION_FROM + mSngRotationStep * myRand(1);
        mSngBrightnessFrom = AUGMENTATION_BRIGHTNESS_FROM + mSngBrightnessStep * myRand(1);
        mSngContrastFrom = AUGMENTATION_CONTRAST_FROM + mSngContrastStep * myRand(1);
        mSngHueFrom = AUGMENTATION_HUE_FROM + mSngHueStep * myRand(1);
        mSngSaturationFrom = AUGMENTATION_SATURATION_FROM + mSngSaturationStep * myRand(1);
    }
    if (pBolOnlyCalculate)
    {
        /* Only calculate how many augmented records would be created per record. */
        if (mBolAugmentationEnabled)
            (*pIntNumberOfRecordsPerRecord) = SetImagesAugmentNumberGet(0, pIntRecordFrom, pIntRecordTo, INPUTS_CHANNELS, mIntImageWidth, mIntImageHeight, 0, mIntFilterShape, mSngRotationFrom, mSngRotationTo, mSngRotationStep, mSngBrightnessFrom, mSngBrightnessTo, mSngBrightnessStep, mSngContrastFrom, mSngContrastTo, mSngContrastStep, mSngGammaFrom, mSngGammaTo, mSngGammaStep, mSngMoveHorizontalFrom, mSngMoveHorizontalTo, mSngMoveHorizontalStep, mSngMoveVerticalFrom, mSngMoveVerticalTo, mSngMoveVerticalStep, mSngZoomFrom, mSngZoomTo, mSngZoomStep, mSngHueFrom, mSngHueTo, mSngHueStep, mSngSaturationFrom, mSngSaturationTo, mSngSaturationStep);
        else
            (*pIntNumberOfRecordsPerRecord) = 1;
        return(true);
    }
    else
        /* Augment the records in place; the set width and height are assumed equal to the image size. */
        return(SetImagesAugment(0, pIntRecordFrom, pIntRecordTo, mIntImageWidth, mIntImageHeight, mIntImageWidth, mIntImageHeight, 0, mIntFilterShape, mSngRotationFrom, mSngRotationTo, mSngRotationStep, mSngBrightnessFrom, mSngBrightnessTo, mSngBrightnessStep, mSngContrastFrom, mSngContrastTo, mSngContrastStep, mSngGammaFrom, mSngGammaTo, mSngGammaStep, mSngMoveHorizontalFrom, mSngMoveHorizontalTo, mSngMoveHorizontalStep, mSngMoveVerticalFrom, mSngMoveVerticalTo, mSngMoveVerticalStep, mSngZoomFrom, mSngZoomTo, mSngZoomStep, mSngHueFrom, mSngHueTo, mSngHueStep, mSngSaturationFrom, mSngSaturationTo, mSngSaturationStep));
}
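For reference, a minimal hedged call that follows the declared parameter list exactly, generating only rotations (from 0 to 90 degrees in steps of 30) on square 100 x 100 images, could look as follows. The set width and height are assumed equal to the image size, and the unused ranges are kept at their normal values with a non-zero step (an assumption made only to avoid a zero step size):
bool lBolOk = SetImagesAugment(0, 0, 99,                   /* first set, records 0 to 99 */
    100, 100,                                              /* image width and height (square, so rotation is allowed) */
    100, 100,                                              /* set width and height, assumed equal to the image size */
    0,                                                     /* 0 = generate all possible records */
    AUGMENT_FILTER_SHAPE_ALL,
    0, 90, 30,                                             /* rotation from, to, step */
    AUGMENT_BRIGHT_NORMAL, AUGMENT_BRIGHT_NORMAL, 1,
    AUGMENT_CONTRAST_NORMAL, AUGMENT_CONTRAST_NORMAL, 1,
    AUGMENT_GAMMA_NORMAL, AUGMENT_GAMMA_NORMAL, 1,
    AUGMENT_MOVE_NORMAL, AUGMENT_MOVE_NORMAL, 1,
    AUGMENT_MOVE_NORMAL, AUGMENT_MOVE_NORMAL, 1,
    AUGMENT_ZOOM_NORMAL, AUGMENT_ZOOM_NORMAL, 1,
    AUGMENT_HUE_NORMAL, AUGMENT_HUE_NORMAL, 1,
    AUGMENT_SATURATION_NORMAL, AUGMENT_SATURATION_NORMAL, 1);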
SetImagesAugmentNumberGet
Purpose
Calculates the number of new records that will be created in the data set by the function SetImagesAugment.
Declarations
Standard C:
extern int SetImagesAugmentNumberGet(int pIntSet, int pIntFirstRecord, int pIntLastRecord, int pIntNumberOfChannels, int pIntWidth, int pIntHeight, int pIntMaxNumberOfRecordsPerRecord, int pIntFilterShape, float pSngRotationFrom, float pSngRotationTo, float pSngRotationStep, float pSngBrightnessFrom, float pSngBrightnessTo, float pSngBrightnessStep, float pSngContrastFrom, float pSngContrastTo, float pSngContrastStep, float pSngGammaFrom, float pSngGammaTo, float pSngGammaStep, float pSngMoveHorizontalFrom, float pSngMoveHorizontalTo, float pSngMoveHorizontalStep, float pSngMoveVerticalFrom, float pSngMoveVerticalTo, float pSngMoveVerticalStep, float pSngZoomFrom, float pSngZoomTo, float pSngZoomStep, float pSngHueFrom, float pSngHueTo, float pSngHueStep, float pSngSaturationFrom, float pSngSaturationTo, float pSngSaturationStep);
MS Visual Studio:
extern int _cdecl SetImagesAugmentNumberGet(int pIntSet, int pIntFirstRecord, int pIntLastRecord, int pIntNumberOfChannels, int pIntWidth, int pIntHeight, int pIntMaxNumberOfRecordsPerRecord, int pIntFilterShape, float pSngRotationFrom, float pSngRotationTo, float pSngRotationStep, float pSngBrightnessFrom, float pSngBrightnessTo, float pSngBrightnessStep, float pSngContrastFrom, float pSngContrastTo, float pSngContrastStep, float pSngGammaFrom, float pSngGammaTo, float pSngGammaStep, float pSngMoveHorizontalFrom, float pSngMoveHorizontalTo, float pSngMoveHorizontalStep, float pSngMoveVerticalFrom, float pSngMoveVerticalTo, float pSngMoveVerticalStep, float pSngZoomFrom, float pSngZoomTo, float pSngZoomStep, float pSngHueFrom, float pSngHueTo, float pSngHueStep, float pSngSaturationFrom, float pSngSaturationTo, float pSngSaturationStep);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall SetImagesAugmentNumberGet(int pIntSet, int pIntFirstRecord, int pIntLastRecord, int pIntNumberOfChannels, int pIntWidth, int pIntHeight, int pIntMaxNumberOfRecordsPerRecord, int pIntFilterShape, float pSngRotationFrom, float pSngRotationTo, float pSngRotationStep, float pSngBrightnessFrom, float pSngBrightnessTo, float pSngBrightnessStep, float pSngContrastFrom, float pSngContrastTo, float pSngContrastStep, float pSngGammaFrom, float pSngGammaTo, float pSngGammaStep, float pSngMoveHorizontalFrom, float pSngMoveHorizontalTo, float pSngMoveHorizontalStep, float pSngMoveVerticalFrom, float pSngMoveVerticalTo, float pSngMoveVerticalStep, float pSngZoomFrom, float pSngZoomTo, float pSngZoomStep, float pSngHueFrom, float pSngHueTo, float pSngHueStep, float pSngSaturationFrom, float pSngSaturationTo, float pSngSaturationStep);
Parameters
- Index of the set.
- First record to be augmented.
- Last record to be augmented.
- Number of channels, for example 3 in case of RGB [red, green blue].
- Width of images. All images must have the same width.
- Height of images. All images must have the same height.
- Maximum number of records that will be generated per record. Put 0 to generate all possible records.
- Filter the image by a shape. Rotations, horizontal and vertical moves, zoom, and shape filters can only be applied if the width and height of the images are equal.
- The following parameters are all optional. Each has a from parameter to indicate the starting value, a to parameter to indicate the finishing value, and a step parameter to indicate the increment applied when going from the start value to the end value:
- Degrees to rotate images.
- Brightness.
- Contrast.
- Gamma.
- Horizontal move.
- Vertical move.
- Zoom.
- Hue.
- Saturation.
These parameters must be in the range of the constants that you can read in this manual in the function SetImagesAugment.
Returns
The number of new augmented records that will be created in the data set.
Usage
The following C function augments the first data set or calculates the number of new augmented records that will be created. It is important to use the same function to calculate the number of augmented records and to augment them, to avoid putting different parameters in both calls. All “mSng…” variables are float module variables:
bool augmentRecordsPerRecord(bool pBolOnlyCalculate, bool pBolOnlyShape, int pIntRecordFrom, int pIntRecordTo, int *pIntNumberOfRecordsPerRecord)
{
    if (pBolOnlyShape)
    {
        /* Only the shape filter will be applied: keep every augmentation range at its normal value. */
        mSngRotationFrom = 0;
        mSngRotationTo = 0;
        mSngBrightnessFrom = AUGMENT_BRIGHT_NORMAL;
        mSngBrightnessTo = AUGMENT_BRIGHT_NORMAL;
        mSngContrastFrom = AUGMENT_CONTRAST_NORMAL;
        mSngContrastTo = AUGMENT_CONTRAST_NORMAL;
        mSngZoomFrom = AUGMENT_ZOOM_NORMAL;
        mSngZoomTo = AUGMENT_ZOOM_NORMAL;
        mSngHueFrom = AUGMENT_HUE_NORMAL;
        mSngHueTo = AUGMENT_HUE_NORMAL;
        mSngSaturationFrom = AUGMENT_SATURATION_NORMAL;
        mSngSaturationTo = AUGMENT_SATURATION_NORMAL;
    }
    else
    {
        /* Randomize the starting value of each range; the corresponding "To" values keep their current module values. */
        mSngRotationFrom = AUGMENTATION_ROTATION_FROM + mSngRotationStep * myRand(1);
        mSngBrightnessFrom = AUGMENTATION_BRIGHTNESS_FROM + mSngBrightnessStep * myRand(1);
        mSngContrastFrom = AUGMENTATION_CONTRAST_FROM + mSngContrastStep * myRand(1);
        mSngHueFrom = AUGMENTATION_HUE_FROM + mSngHueStep * myRand(1);
        mSngSaturationFrom = AUGMENTATION_SATURATION_FROM + mSngSaturationStep * myRand(1);
    }
    if (pBolOnlyCalculate)
    {
        /* Only calculate how many augmented records would be created per record. */
        if (mBolAugmentationEnabled)
            (*pIntNumberOfRecordsPerRecord) = SetImagesAugmentNumberGet(0, pIntRecordFrom, pIntRecordTo, INPUTS_CHANNELS, mIntImageWidth, mIntImageHeight, 0, mIntFilterShape, mSngRotationFrom, mSngRotationTo, mSngRotationStep, mSngBrightnessFrom, mSngBrightnessTo, mSngBrightnessStep, mSngContrastFrom, mSngContrastTo, mSngContrastStep, mSngGammaFrom, mSngGammaTo, mSngGammaStep, mSngMoveHorizontalFrom, mSngMoveHorizontalTo, mSngMoveHorizontalStep, mSngMoveVerticalFrom, mSngMoveVerticalTo, mSngMoveVerticalStep, mSngZoomFrom, mSngZoomTo, mSngZoomStep, mSngHueFrom, mSngHueTo, mSngHueStep, mSngSaturationFrom, mSngSaturationTo, mSngSaturationStep);
        else
            (*pIntNumberOfRecordsPerRecord) = 1;
        return(true);
    }
    else
        /* Augment the records in place; the set width and height are assumed equal to the image size. */
        return(SetImagesAugment(0, pIntRecordFrom, pIntRecordTo, mIntImageWidth, mIntImageHeight, mIntImageWidth, mIntImageHeight, 0, mIntFilterShape, mSngRotationFrom, mSngRotationTo, mSngRotationStep, mSngBrightnessFrom, mSngBrightnessTo, mSngBrightnessStep, mSngContrastFrom, mSngContrastTo, mSngContrastStep, mSngGammaFrom, mSngGammaTo, mSngGammaStep, mSngMoveHorizontalFrom, mSngMoveHorizontalTo, mSngMoveHorizontalStep, mSngMoveVerticalFrom, mSngMoveVerticalTo, mSngMoveVerticalStep, mSngZoomFrom, mSngZoomTo, mSngZoomStep, mSngHueFrom, mSngHueTo, mSngHueStep, mSngSaturationFrom, mSngSaturationTo, mSngSaturationStep));
}
SetImagesDownsize
Purpose
Reduces the size of all the images (records) in the data set.
Declarations
Standard C:
extern bool SetImagesDownsize(int pIntSet, int pIntCurrentWidth, int pIntCurrentHeight, int pIntNewWidth, int pIntNewHeight);
MS Visual Studio:
extern bool _cdecl SetImagesDownsize(int pIntSet, int pIntCurrentWidth, int pIntCurrentHeight, int pIntNewWidth, int pIntNewHeight);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API bool _stdcall SetImagesDownsize(int pIntSet, int pIntCurrentWidth, int pIntCurrentHeight, int pIntNewWidth, int pIntNewHeight);
Parameters
- Index of the set.
- The current width of the images of the data set.
- The current height of the images of the data set.
- The new width of the images of the data set, which must be lower than the current.
- The new height of the images of the data set, which must be lower than the current.
Returns
True if the operation went well.
Usage
The following C code resizes all the images (all the records) of the first data set from 200 x 200 pixels to 50 x 50 pixels:
SetImagesDownsize(0, 200, 200, 50, 50);
SetLearnContinue
Purpose
Makes the neural network learn a record from the set. Internally, it puts the inputs and outputs of the indicated record of the set into memory, and then makes the neural network learn.
Returns an estimation of the success in predictions of the current network.
Declarations
Standard C:
extern float SetLearnContinue(int pIntNet, int pIntSet, int pIntRecordNumber, bool pBolEstimateSuccess, int pIntCurrentEpochNumber, bool pBolStopIfGradientsVanish, bool* pBolThereIsNan, bool* pBolGradientExploded, bool* pBolGradientsVanished, float* pSngEntropy);
MS Visual Studio:
extern float _cdecl SetLearnContinue(int pIntNet, int pIntSet, int pIntRecordNumber, bool pBolEstimateSuccess, int pIntCurrentEpochNumber, bool pBolStopIfGradientsVanish, bool* pBolThereIsNan, bool* pBolGradientExploded, bool* pBolGradientsVanished, float* pSngEntropy);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall SetLearnContinue(int pIntNet, int pIntSet, int pIntRecordNumber, bool pBolEstimateSuccess, int pIntCurrentEpochNumber, bool pBolStopIfGradientsVanish, bool* pBolThereIsNan, bool* pBolGradientExploded, bool* pBolGradientsVanished, float* pSngEntropy);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Index of the set.
- The number of the record to learn from the data set.
- A boolean to indicate if this function must return an estimation of the success in predictions.
- The current epoch number, which must start at zero.
- A boolean which indicates if back propagation should stop if all gradients of the outputs layer become zero while output target values are different from zero.
- A returned boolean parameter which will be true if the training generated NaN (Not a Number) values.
- A returned boolean indicating if, during forward propagation, a gradient became infinite. This only applies to layers of type SoftMax, and will indicate that the last step of SoftMax was not performed. This could indicate that the learning rate should be lowered.
- A returned boolean indicating if, during back propagation, all gradients became zero while output target values were different from zero.
- A returned float containing the entropy.
Returns
Returns an estimation of the success in predictions of the current network. This estimation is based on how many records of the set had their outputs predicted correctly so far. Although quite precise, this is just an estimation; for the exact success ratio, the function SetSuccessGet should be used.
Usage
The following C example continues training with record number i, returns whether a NaN (“Not a Number”) appeared in a boolean parameter variable named lBolThereIsNan, and obtains more information:
SetLearnContinue(0, 0, i, true, 0, true, &lBolThereIsNan, &lBolGradientExploded, &lBolGradientsVanished, &lSngEntropy);
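Putting SetLearnStart, SetLearnContinue and SetLearnEnd together, the training of a whole set can be sketched as follows. This is a minimal, hedged sketch: all records are used for training, the threshold and test type are arbitrary example values, and calling SetLearnEnd once after the epoch loop (rather than after each epoch) is an assumption:
#define NUM_EPOCHS 10
int lIntRecords = SetRecordsNumberGet(0);                  /* number of records in the first set */
if (SetLearnStart(0, 0, 0, 0, 0, lIntRecords - 1, 0, 0.5, 0, TEST_TYPE_ByMax) == 0)
{
    bool lBolThereIsNan, lBolGradientExploded, lBolGradientsVanished;
    float lSngEntropy, lSngEstimatedSuccess = 0;
    for (int lIntEpoch = 0; lIntEpoch < NUM_EPOCHS; lIntEpoch++)
        for (int i = 0; i < lIntRecords; i++)
            lSngEstimatedSuccess = SetLearnContinue(0, 0, i, true, lIntEpoch, true, &lBolThereIsNan, &lBolGradientExploded, &lBolGradientsVanished, &lSngEntropy);
    lSngEstimatedSuccess = SetLearnEnd(0, NUM_EPOCHS - 1);  /* consolidate the learning */
}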
SetLearnConsecutive
Purpose
Makes the neural network learn, exactly as SetLearnContinue does, but for all the consecutive records indicated in the parameters.
Returns an estimation of the success in predictions of the current network.
Declarations
Standard C:
extern float SetLearnConsecutive(int pIntNet, int pIntSet, int pIntRecordStart, int pIntRecordEnd, bool pBolEstimateSuccess, int pIntCurrentEpochNumber, bool pBolStopIfGradientsVanish, bool* pBolThereIsNan, bool* pBolGradientExploded, bool* pBolGradientsVanished, float* pSngEntropy);
MS Visual Studio:
extern float _cdecl SetLearnConsecutive(int pIntNet, int pIntSet, int pIntRecordStart, int pIntRecordEnd, bool pBolEstimateSuccess, int pIntCurrentEpochNumber, bool pBolStopIfGradientsVanish, bool* pBolThereIsNan, bool* pBolGradientExploded, bool* pBolGradientsVanished, float* pSngEntropy);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall SetLearnConsecutive(int pIntNet, int pIntSet, int pIntRecordStart, int pIntRecordEnd, bool pBolEstimateSuccess, int pIntCurrentEpochNumber, bool pBolStopIfGradientsVanish, bool* pBolThereIsNan, bool* pBolGradientExploded, bool* pBolGradientsVanished, float* pSngEntropy);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Index of the set.
- The number of the start record to learn.
- The number of the end record to learn.
- A boolean to indicate if this function must return an estimation of the success in predictions.
- The current epoch number, which must start at zero.
- A boolean which indicates if back propagation should stop if all gradients of the outputs layer become zero while output target values are different from zero.
- A returned boolean parameter which will be true if the training generated NaN (Not a Number) values.
- A returned boolean indicating if, during forward propagation, a gradient became infinite. This only applies to layers of type SoftMax, and will indicate that the last step of SoftMax was not performed. This could indicate that the learning rate should be lowered.
- A returned boolean indicating if, during back propagation, all gradients became zero while output target values were different from zero.
- A returned float containing the entropy.
Returns
Returns an estimation of the success in predictions of the current network. This estimation is based on how many records of the set had their outputs predicted correctly so far. Although quite precise, this is just an estimation; for the exact success ratio, the function SetSuccessGet should be used.
Usage
The following C example continues training with all records from number i to number j, returns whether a NaN (“Not a Number”) appeared in a boolean parameter variable named lBolThereIsNan, and obtains more information:
SetLearnConsecutive(0, 0, i, j, false, 0, true, &lBolThereIsNan, &lBolGradientExploded, &lBolGradientsVanished, &lSngEntropy);
SetLearnStart
Purpose
Prepares the set for the self-learning process, performing:
- Creation of the necessary internal memory structures.
- Internal random reordering of the data set records.
Please note that batches are only considered in MODE_NORMAL mode.
Declarations
Standard C:
extern int SetLearnStart(int pIntNet, int pIntSet, int pIntCyclesControl, int pIntBatches, int pIntNumberOfFirstRecordForTraining, int pIntNumberOfLastRecordForTraining, int pIntNumberOfWhichForValidation, float pSngThresholdForActive, float pSngDeviationPctTarget, int pIntTestType);
MS Visual Studio:
extern int _cdecl SetLearnStart(int pIntNet, int pIntSet, int pIntCyclesControl, int pIntBatches, int pIntNumberOfFirstRecordForTraining, int pIntNumberOfLastRecordForTraining, int pIntNumberOfWhichForValidation, float pSngThresholdForActive, float pSngDeviationPctTarget, int pIntTestType);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall SetLearnStart(int pIntNet, int pIntSet, int pIntCyclesControl, int pIntBatches, int pIntNumberOfFirstRecordForTraining, int pIntNumberOfLastRecordForTraining, int pIntNumberOfWhichForValidation, float pSngThresholdForActive, float pSngDeviationPctTarget, int pIntTestType);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Index of the set.
- Control of cycles (for more information please view the introduction):
- 0: no cycles will be checked.
- 1: cycles will be checked and avoided.
- An integer indicating the number of batches to process. The batch size is defined as the number of records for training of the data set divided by this parameter. Put 0 to not use batches.
- An integer indicating the number of the first record of the data set that will be used for training.
- An integer indicating the number of the last record of the data set that will be used for training. This, and the previous parameter, define the subset that will be used for training. The rest of the records of the data set will be used for testing. The records for testing must be contiguous.
- An integer indicating the number of records, from the records for training indicated in the previous parameter, that will be used for validation. These records will be the ones at the end of the training subset.
- A float value indicating the threshold, typically in the range of [0, 1], to consider that an output is active (when its value is equal or greater than this parameter) or inactive.
- A float value indicating the target percentage of deviation, between the real value and the predicted value, to consider that a prediction is correct. This parameter is only used when the previous parameter threshold is zero.
- An integer to indicate:
- TEST_TYPE_ByThreshold: when an output is equal to or higher than the activation value, it will be considered active; and when the active predicted outputs match exactly those that were indicated in the learning process, the record is considered a success.
- TEST_TYPE_ByMax: only the output with the highest activation value will be considered active. Additionally, with this test type, if the activation value of the output with the maximum activation value does not reach pSngThresholdForActive, it will be set to pSngThresholdForActive.
- TEST_TYPE_ByOutputs: same behavior as by threshold, but it counts how many outputs were predicted correctly with respect to the total outputs predicted, instead of counting correct and incorrect records of the set.
Returns
An integer, indicating:
- 0: success.
- 1: learning process generated NaN (Not a Number) values, although it continued. This could mean that you should use Sigmoid as the activation function.
- 2: error managing threads.
- 3: the set has no records.
- 4: error analyzing layers.
- 5: parameters are incorrect:
- The number of epochs must be greater than zero.
- The number of batches must be greater than zero.
- The number of records for training must be greater than zero.
- The number of records for validation must not be negative.
- The number of records in the set must be greater than the number of batches.
- The threshold to consider active must be greater than zero.
- 6: error creating topology.
- 7: success of test while learning.
- 8: objective was not achieved and there are no more layers to try.
- 9: epochs finished.
- 10: could not initialize topology and start.
- 11: incorrect mode and/or topology.
- 12: number of inputs or outputs of set do not match those of the neural network.
Usage
The following C example starts the self-learning process, using the previously memorized set. As it sets the threshold to zero and indicates a deviation target, it will most probably be used to learn value prediction, for example to forecast time series. It does not use control of cycles, and batches are not used either:
#define NetLearn_Success 0
#define NetLearn_NAN 1
#define NetLearn_ThreadsError 2
#define NetLearn_SetHasNoRecords 3
#define NetLearn_ErrorAnalyzingLayers 4
#define NetLearn_IndicatedParametersAreIncorrect 5
#define NetLearn_ErrorCreatingTopology 6
#define NetLearn_Success_Tested 7
#define NetLearn_Objective_Not_Achieved_And_No_More_Layers_To_Try 8
#define NetLearn_Epochs_Finished 9
#define NetLearn_Could_Not_Initialize_Topology_And_Start 10
#define NetLearn_Incorrect_Mode_And_Or_Topology 11
#define NetLearn_Number_Of_Inputs_Or_Outputs_Of_Set_Do_Not_Match_Neural_Network 12
SetLearnStart(0, 0, 0, 0, lIntNumberOfTheFirstRecordForTraining, lIntNumberOfTheLastRecordForTraining, 0, 0, lSngActivationPercentage, lIntTestType);
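By contrast, a hedged sketch of a classification-oriented call would typically use a non-zero activation threshold, a zero deviation target and TEST_TYPE_ByMax; the record range variables are assumed to be the same as in the example above:
int lIntRes = SetLearnStart(0, 0, 0, 0, lIntNumberOfTheFirstRecordForTraining, lIntNumberOfTheLastRecordForTraining, 0, 0.5f, 0.0f, TEST_TYPE_ByMax);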
SetLearnEnd
Purpose
Marks as finished the training based on the current memorized set of records. It sets the neural network into auto adaptative mode only when in Dynamic propagation mode; in the rest of cases, its use is optional but recommended if you are using a batch size different from 0, to consolidate the learning.
Returns an estimation of the success in predictions of the current network.
Declarations
Standard C:
extern float SetLearnEnd(int pIntNet, int pIntCurrentEpochNumber);
MS Visual Studio:
extern float _cdecl SetLearnEnd(int pIntNet, int pIntCurrentEpochNumber);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall SetLearnEnd(int pIntNet, int pIntCurrentEpochNumber);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- The current epoch number.
Returns
Returns an estimation of the success in predictions of the current network. This estimation is based on how many records of the set had their outputs predicted correctly so far. Although quite precise, this is just an estimation; for the exact success ratio, the function SetSuccessGet should be used.
Usage
The following C example finishes training with the memorized set and activates the auto adaptative feature of the first network:
SetLearnEnd(0, 0);
SetLoadFromFile
Purpose
Loads the data set from a file. This function will internally use SetFree and SetCreate, therefore, you do not need to call these other functions before calling it.
For the characteristics of the file, please read SetSaveToFile.
Declarations
Standard C:
extern int SetLoadFromFile(int pIntSet, int* pIntNumberOfInputs, int* pIntNumberOfOutputs, int* pIntNumberOfRecords);
MS Visual Studio:
extern int _cdecl SetLoadFromFile(int pIntSet, int* pIntNumberOfInputs, int* pIntNumberOfOutputs, int* pIntNumberOfRecords);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall SetLoadFromFile(int pIntSet, int* pIntNumberOfInputs, int* pIntNumberOfOutputs, int* pIntNumberOfRecords);
Parameters
- Index of the set.
- 3 integers which are returned, indicating the number of:
- Inputs.
- Outputs.
- Records.
If the values provided are not high enough to store all the inputs and outputs, the function will recalculate them accordingly. Therefore, a 0 can be passed in each of them.
Returns
An integer indicating:
- 0: file was loaded successfully.
- 1: file could not be opened.
- 2: version could not be read.
- 3: second line could not be read.
- 4: header is not correct.
- 5: file indicates in header to have 0 data records.
- 6: file does not include complete data for the number of records indicated in the header.
- 7: out of memory.
Usage
The following C code loads the first data set from a file:
int lIntRes = SetLoadFromFile(0, &lIntNumberOfInputs, &lIntNumberOfOutputs, &lIntNumberOfRecords);
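A hedged, self-contained variant that also checks the returned code against the values listed under Returns:
int lIntNumberOfInputs = 0, lIntNumberOfOutputs = 0, lIntNumberOfRecords = 0;
int lIntRes = SetLoadFromFile(0, &lIntNumberOfInputs, &lIntNumberOfOutputs, &lIntNumberOfRecords);
if (lIntRes == 0)
    printf("Loaded %d records with %d inputs and %d outputs.\n", lIntNumberOfRecords, lIntNumberOfInputs, lIntNumberOfOutputs);
else
    printf("SetLoadFromFile failed with code %d.\n", lIntRes);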
SetRecordDelete
Purpose
Deletes the parameter record number from the set. Calling this function will reduce the size of the set in memory to the current number of records, overriding the memory reservation made by the call to SetCreate.
Declarations
Standard C:
extern int SetRecordDelete(int pIntSet, int pIntRecordNumber);
MS Visual Studio:
extern int _cdecl SetRecordDelete(int pIntSet, int pIntRecordNumber);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall SetRecordDelete(int pIntSet, int pIntRecordNumber);
Parameters
- Index of the set.
- The record number to be deleted from the set, starting with 0 for the first record.
Returns
- An integer: indicating the number of records memorized since the last call to SetFree.
- -1: if error, typically, out of memory.
Usage
The following C example deletes the first record from the first set and puts the total number of records memorized in an integer variable:
int lIntTmp = SetRecordDelete(0, 0);
SetRecordInsert
Purpose
Inserts into the set, as a new record, the current inputs and outputs of the neural network. For faster operation of SetRecordInsert, it is recommended to have previously called SetCreate with the total number of records that you intend to create via SetRecordInsert. By calling SetCreate before SetRecordInsert, all the necessary memory is reserved at once, instead of per record with SetRecordInsert.
Declarations
Standard C:
extern int SetRecordInsert(int pIntNet, int pIntSet);
MS Visual Studio:
extern int _cdecl SetRecordInsert(int pIntNet, int pIntSet);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall SetRecordInsert(int pIntNet, int pIntSet);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Index of the set.
Returns
- An integer: indicating the number of records memorized since the last call to SetFree.
- -1: if error, typically, out of memory.
Usage
The following C example memorizes in the first set, as a new record, the current inputs and outputs of the first neural network, and puts the total number of records memorized in an integer variable:
int lIntTmp = SetRecordInsert(0, 0);
SetRecordGet
Purpose
Updates the current inputs and outputs of the neural network with the ones in the parameter record of the set.
Declarations
Standard C:
extern bool SetRecordGet(int pIntNet, int pIntSet, int pIntRecordNumber);
MS Visual Studio:
extern bool _cdecl SetRecordGet(int pIntNet, int pIntSet, int pIntRecordNumber);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API bool _stdcall SetRecordGet(int pIntNet, int pIntSet, int pIntRecordNumber);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Index of the set.
- The record number of the set, starting with 0 for the first record.
Returns
- True: if it got the record correctly.
- False: if there are no records in the set or the requested record number is higher than the number of records in the set.
Usage
The following C example updates the current inputs and outputs of the first network with the ones memorized in the 10th record of the first set:
bool lBolTmp = SetRecordGet(0, 0, 9);
SetRecordOutputGet
Purpose
Returns the output value of a record of the data set.
Declarations
Standard C:
extern float SetRecordOutputGet(int pIntSet, int pIntRecordNumber, int pIntOutput);
MS Visual Studio:
extern float _cdecl SetRecordOutputGet(int pIntSet, int pIntRecordNumber, int pIntOutput);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall SetRecordOutputGet(int pIntSet, int pIntRecordNumber, int pIntOutput);
Parameters
- Index of the set.
- The record number of the set, starting with 0 for the first record.
- The index of the output.
Returns
The value of the indicated output of the record (the value stored in the data set, not the predicted value), or the following error:
- -1: inputs and outputs do not match between the network and the data set.
Usage
The following C example gets, from the first data set and from the record in the 10th position, the value of the first output:
float lSngOutputValue = SetRecordOutputGet(0, 9, 0);
SetRecordOutputMaxGet
Purpose
Returns the number of the output of the record of the data set which has the maximum value (not the predicted value), and also returns the original position that this record had in the data set before being shuffled.
This function is useful to know which is the active output of a record from the data set after it has been shuffled by functions like SetLearnStart.
Declarations
Standard C:
extern int SetRecordOutputMaxGet(int pIntSet, int pIntRecordNumber);
MS Visual Studio:
extern int _cdecl SetRecordOutputMaxGet(int pIntSet, int pIntRecordNumber);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall SetRecordOutputMaxGet(int pIntSet, int pIntRecordNumber);
Parameters
- Index of the set.
- The record number of the set, starting with 0 for the first record.
Returns
The number of the output which has the maximum value, not the predicted value, or the following errors:
- -1: inputs and outputs do not match between the network and the data set.
- -2: there was an error when reordering the data set.
Usage
The following C example gets, from the first data set, and from the record in the 10th position, the number of the output which has the maximum value:
int lIntOutputNumberWithMaxValue = SetRecordOutputMaxGet(0, 9);
SetRecordSet
Purpose
Updates the parameter record number of the set with the current inputs and outputs of the neural network. The number of inputs and outputs of the dataset can be larger than those of the neural network, but not smaller. If the parameter record number does not exist, it will perform a SetRecordInsert.
Declarations
Standard C:
extern int SetRecordSet(int pIntNet, int pIntSet, int pIntRecordNumber);
MS Visual Studio:
extern int _cdecl SetRecordSet(int pIntNet, int pIntSet, int pIntRecordNumber);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall SetRecordSet(int pIntNet, int pIntSet, int pIntRecordNumber);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Index of the set.
- The record number to be updated, starting with 0 for the first record.
Returns
- 0: if it updated the record correctly.
- An integer: if it had to create a new record, indicating the number of records memorized since the last call to SetFree.
- -1: if error, typically, out of memory.
Usage
The following C example updates the first record of the first set with the current inputs and outputs, and puts the total number of records memorized in an integer variable:
int lIntTmp = SetRecordSet(0, 0, 0);
SetRecordsNumberGet
Purpose
Returns the current number of records inside the data set.
Declarations
Standard C:
extern int SetRecordsNumberGet(int pIntSet);
MS Visual Studio:
extern int _cdecl SetRecordsNumberGet(int pIntSet);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall SetRecordsNumberGet(int pIntSet);
Parameters
Index of the set.
Returns
- The current number of records inside the data set.
Usage
The following C example puts into a variable the current number of records of the first data set:
int lIntTmp = SetRecordsNumberGet(0);
SetRecordsNumberSet
Purpose
Sets the current number of records inside the data set, if the indicated number is lower than the current number of records inside the data set.
This function is useful to update the data set with new records without having to SetFree and SetCreate.
Setting the parameter of this function to zero does not free the memory of the data set, therefore, SetFree must be called at the end of using the data set.
Declarations
Standard C:
extern void SetRecordsNumberSet(int pIntSet, int pIntNumberOfRecords);
MS Visual Studio:
extern void _cdecl SetRecordsNumberSet(int pIntSet, int pIntNumberOfRecords);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall SetRecordsNumberSet(int pIntSet, int pIntNumberOfRecords);
Parameters
- Index of the set.
- The new number of records of the data set.
Returns
Nothing.
Usage
The following C example sets the current number of records of the first data set to 10:
SetRecordsNumberSet(0, 10);
SetRecordsOrderSet
Purpose
Resets the order of the records in the data set to their original position.
This function is useful to restore the original position of the records of the data set after they have been shuffled by a function like SetLearnStart.
This function must not be called after calling SetsLearn when running on NPU.
Declarations
Standard C:
extern bool SetRecordsOrderSet(int pIntSet);
MS Visual Studio:
extern bool _cdecl SetRecordsOrderSet(int pIntSet);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API bool _stdcall SetRecordsOrderSet(int pIntSet);
Parameters
Index of the set.
Returns
- NetLearn_Success: if success.
- NetLearn_SetHasNoRecords: if no records in data set.
- NetLearn_Out_Of_Memory: if out of memory.
Usage
The following C example restores the position of the records of the first data set to their original position:
SetRecordsOrderSet(0);
SetRecordSuccess
Purpose
Checks if a record from the data set is currently predicted correctly, considering the current test type established by functions like SetLearnStart.
Declarations
Standard C:
extern bool SetRecordSuccess(int pIntNet, int pIntSet, int pIntRecordNumber, bool *pBolThereIsNan, int *pIntIdxMaxOutput, bool* pBolGradientExploded);
MS Visual Studio:
extern bool _cdecl SetRecordSuccess(int pIntNet, int pIntSet, int pIntRecordNumber, bool *pBolThereIsNan, int *pIntIdxMaxOutput, bool* pBolGradientExploded);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API bool _stdcall SetRecordSuccess(int pIntNet, int pIntSet, int pIntRecordNumber, bool *pBolThereIsNan, int *pIntIdxMaxOutput, bool* pBolGradientExploded);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Index of the set.
- The record number to be checked, starting with 0 for the first record.
- A returned parameter which will indicate if, when predicting, any not a number (NaN) was obtained.
- A returned parameter which will indicate the number of the output which had the maximum predicted value.
- A returning boolean indicating if, during forward propagation, a gradient became infinite. This only applies to layers of type SoftMax and will indicate that the last step of SoftMax was not performed. This could indicate that the learning rate should be lowered.
Returns
- True: if the predicted output matches with the active output of the record.
- False: otherwise.
Usage
The following C example checks if the 10th record of the first set is predicted correctly. The result will be stored in a variable:
bool lBolTmp = SetRecordSuccess(0, 0, 9, &lBolThereIsNaN, &lIntIdxMaxOutput, &lBolGradientExploded);
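A hedged sketch of how SetRecordSuccess could be combined with SetRecordsNumberGet to measure a success rate over the test records (the index of the first test record, lIntFirstTestRecord, is an assumed application variable):
bool lBolThereIsNaN, lBolGradientExploded;
int lIntIdxMaxOutput;
int lIntTotal = SetRecordsNumberGet(0);                   /* total records in the first set */
int lIntCorrect = 0;
for (int i = lIntFirstTestRecord; i < lIntTotal; i++)
    if (SetRecordSuccess(0, 0, i, &lBolThereIsNaN, &lIntIdxMaxOutput, &lBolGradientExploded))
        lIntCorrect++;
float lSngSuccessRate = (lIntTotal > lIntFirstTestRecord) ? (float)lIntCorrect / (lIntTotal - lIntFirstTestRecord) : 0;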
SetRecordSwap
Purpose
Exchanges the content of 2 records of the data set.
Declarations
Standard C:
extern void SetRecordSwap(int pIntSet, int pIntSrcRecordNumber, int pIntDstRecordNumber);
MS Visual Studio:
extern void _cdecl SetRecordSwap(int pIntSet, int pIntSrcRecordNumber, int pIntDstRecordNumber);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API void _stdcall SetRecordSwap(int pIntSet, int pIntSrcRecordNumber, int pIntDstRecordNumber);
Parameters
- Index of the set.
- The source record number to be swapped, starting with 0 for the first record.
- The destination record number to be swapped, starting with 0 for the first record.
Returns
Nothing.
Usage
The following C example swaps the content of the 10th and 11th records of the first set:
SetRecordSwap(0, 9, 10);
SetReorderTrainingAndTest
Purpose
Rearranges the data set so that a certain percentage of the records representing all the different outputs are placed at the end of the data set.
This function is useful as it allows you to:
- Create a data set by adding records without having to separate training and test records while creating it.
- Once all the records have been created, a call to this function will separate the data set into the training and the test subsets.
Declarations
Standard C:
extern int SetReorderTrainingAndTest(int pIntNet, int pIntSet, float pSngPercentageForTesting, float pSngThresholdForActive);
MS Visual Studio:
extern int _cdecl SetReorderTrainingAndTest(int pIntNet, int pIntSet, float pSngPercentageForTesting, float pSngThresholdForActive);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall SetReorderTrainingAndTest(int pIntNet, int pIntSet, float pSngPercentageForTesting, float pSngThresholdForActive);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Index of the set.
- A float in the range [0, 1] indicating the percentage of records for testing.
- A float indicating the threshold value to consider that an output is active. Consider that this is affected by the defined test type.
Returns
An integer indicating the number of records that were rearranged. Subtracting this value from the total number of records gives the index of the first test record.
Usage
The following C example rearranges the first data set so that 15% of the records are reserved for testing at the end of the data set. The number of records that were rearranged is stored in a variable:
int lIntTotalNumberOfRecordsRearranged = SetReorderTrainingAndTest(0, 0, 0.15, 0.5);
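Following the Returns description above, the index of the first test record can then be derived as in this sketch (the total number of records, lIntTotalRecords, is an assumed value known by the caller):
int lIntTotalRecords = 1000;   /* assumed: known by the caller */
int lIntRearranged = SetReorderTrainingAndTest(0, 0, 0.15, 0.5);
int lIntFirstTestRecord = lIntTotalRecords - lIntRearranged;   /* first record reserved for testing */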
SetSaveToFile
Purpose
Saves the current data set in a file, which is useful for loading it later with SetLoadFromFile or for visually examining the data set with, for example, the NNViewer application.
The file has the following characteristics:
- The file is internally a comma separated value (CSV) file.
- It can be opened by the NNViewer application, whose source code is provided.
- The file is named AnaimoAI.nnv and is created where the AnaimoAI.dll is.
- The content of the file is:
- 1st row: the version of the Anaimo AI SDK which created the file, in format YYYYMM. For example, for this version: 202203
- 2nd row are 7 integers indicating the number of:
- Inputs.
- Outputs.
- Rows for inputs. It is optional and therefore can be zero.
- Columns for inputs. It is optional and therefore can be zero.
- Rows for outputs. It is optional and therefore can be zero.
- Columns for outputs. It is optional and therefore can be zero.
- The number of records of the set that follow in this file. This number can be lower than the real number of records, so it is recommended to keep reading the file after this number of records has been reached.
- Rest of the rows: for each record of the data set, in consecutive order, the content of the cells of all the inputs followed by all the outputs.
As you can see, in the 2nd row of the file, the rows and columns for inputs and outputs can be zero, indicating that no specific visualization grid is recommended for the inputs and outputs of the data set. Read the Parameters section below for more information.
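For illustration only, and assuming the layout described above (the exact separators and value formatting shown here are an assumption of this illustration, not taken from an actual file), a file for a set with 2 inputs, 1 output, no recommended visualization grid, and 2 records could look like:
202401
2,1,0,0,0,0,2
0.10,0.90,1.00
0.80,0.20,0.00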
Declarations
Standard C:
extern int SetSaveToFile(int pIntSet, int pIntIRows, int pIntICols, int pIntORows, int pIntOCols, int pIntPrecision);
MS Visual Studio:
extern int _cdecl SetSaveToFile(int pIntSet, int pIntIRows, int pIntICols, int pIntORows, int pIntOCols, int pIntPrecision);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall SetSaveToFile(int pIntSet, int pIntIRows, int pIntICols, int pIntORows, int pIntOCols, int pIntPrecision);
Parameters
- Index of the set.
- 5 integers indicating:
- The number of rows for inputs.
- The number of columns for inputs.
- The number of rows for outputs.
- The number of columns for outputs.
- The precision, or number of digits after the decimal symbol. Put -1 here to ignore this parameter.
Put 0 in the rows and columns parameters above if you are not interested in storing any type of recommended visualization for the data.
Returns
An integer indicating:
- 0: the file was created successfully.
- 1: the file could not be created.
Usage
The following C code saves the first data set into a file as indicated above:
int lIntRes = SetSaveToFile(0, 0, 0, 0, 0, -1);
SetSaveAddToFile
Purpose
Adds new records to the current data set in the file previously saved by SetSaveToFile. This function is much faster than SetSaveToFile as it only adds the new records to the file, but it requires SetSaveToFile to be called before using this function.
For more information and the characteristics of the outputted file, please read SetSaveToFile.
Declarations
Standard C:
extern int SetSaveAddToFile(int pIntSet, int pIntRecordToStartSaving, int pIntIRows, int pIntICols, int pIntORows, int pIntOCols, int pIntPrecision);
MS Visual Studio:
extern int _cdecl SetSaveAddToFile(int pIntSet, int pIntRecordToStartSaving, int pIntIRows, int pIntICols, int pIntORows, int pIntOCols, int pIntPrecision);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall SetSaveAddToFile(int pIntSet, int pIntRecordToStartSaving, int pIntIRows, int pIntICols, int pIntORows, int pIntOCols, int pIntPrecision);
Parameters
- Index of the set.
- 6 integers indicating:
- The record number from which to start saving in the file.
- The number of rows for inputs.
- The number of columns for inputs.
- The number of rows for outputs.
- The number of columns for outputs.
- The precision, or number of digits after the decimal symbol. Put -1 here to ignore this parameter.
Put 0 in the rows and columns parameters above if you are not interested in storing any type of recommended visualization for the data.
Returns
An integer indicating:
- 0: the file was created successfully.
- 1: the file could not be created.
Usage
The following C code adds to the current file all the available records of the data set starting with record number 5:
int lIntRes = SetSaveAddToFile(0, 5, 0, 0, 0, 0, -1);
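For illustration only, the following sketch combines SetSaveToFile and SetSaveAddToFile in the intended workflow (the point where new records are added to the set is only indicated as a comment):
int lIntRes = SetSaveToFile(0, 0, 0, 0, 0, -1);          /* full save of the first set */
/* ... new records are added to the set here, for example from record number 5 on ... */
if (lIntRes == 0)
    lIntRes = SetSaveAddToFile(0, 5, 0, 0, 0, 0, -1);    /* append only the new records */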
SetSnapshotGet
Purpose
Restores all the content from the set’s snapshot to the set. The set must have been previously created with SetCreate and must be equal to or larger than the one in the snapshot in inputs, outputs, and number of records.
Declarations
Standard C:
extern int SetSnapshotGet(int pIntSet, int pIntSetWidth, int pIntSetHeight);
MS Visual Studio:
extern int _cdecl SetSnapshotGet(int pIntSet, int pIntSetWidth, int pIntSetHeight);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall SetSnapshotGet(int pIntSet, int pIntSetWidth, int pIntSetHeight);
Parameters
- Index of the set.
- Width of the set. It must be equal to or larger than (not smaller than) the pIntWidth indicated when the snapshot was taken.
- Height of the set. It must be equal to or larger than (not smaller than) the pIntHeight indicated when the snapshot was taken.
Returns
Nothing.
Usage
The following C example sets the first set to how it was when SetSnapshotTake was called for the last time:
SetSnapshotGet(0, 100, 500);
SetSnapshotTake
Purpose
Copies all the content from the current set into the set’s snapshot. If the set includes images, their size should be specified in the parameters width and height.
Declarations
Standard C:
extern int SetSnapshotTake(int pIntSet, int pIntWidth, int pIntHeight);
MS Visual Studio:
extern int _cdecl SetSnapshotTake (int pIntSet, int pIntWidth, int pIntHeight);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API int _stdcall SetSnapshotTake(int pIntSet, int pIntWidth, int pIntHeight);
Parameters
- Index of the set.
- Width of images. All images must have the same width.
- Height of images. All images must have the same height.
Returns
A boolean which is true if the copy has been made.
Usage
The following C example saves the current state of the first set to a snapshot in memory, to be recalled later by SetSnapshotGet:
SetSnapshotTake(0, 100, 500);
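For illustration only, the following sketch shows the take/restore pattern formed by SetSnapshotTake and SetSnapshotGet, assuming the first set contains 100×500 images as in the examples above:
SetSnapshotTake(0, 100, 500);   /* copy the current content of the first set into its snapshot */
/* ... the set is modified here, for example by augmenting or swapping records ... */
SetSnapshotGet(0, 100, 500);    /* restore the first set to the state saved in the snapshot */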
SetSuccessGet
Purpose
Returns the current success rate in predictions based on the data set and the current neural network. The records from the data set and the test type used to check the predictions are those indicated for Validation (if specified) or for Test in SetLearnStart or SetAutoStart.
Declarations
Standard C:
extern float SetSuccessGet(int pIntNet, int pIntSet, bool* pBolThereIsNan, bool* pBolGradientExploded);
MS Visual Studio:
extern float _cdecl SetSuccessGet(int pIntNet, int pIntSet, bool* pBolThereIsNan, bool* pBolGradientExploded);
MS Visual Studio compatible with win32 COM:
extern "C" AnaimoAI_API float _stdcall SetSuccessGet(int pIntNet, int pIntSet, bool* pBolThereIsNan, bool* pBolGradientExploded);
Parameters
- Index of the network. For more information, please refer to the “Multiple neural network” section.
- Index of the set.
- A returned parameter which is true if any “not a number” (NaN) values appeared in the outputs. Detecting NaN is very important: if this parameter returns true, the current training hyperparameters will probably not lead to success.
- A returned boolean indicating whether, during forward propagation, a gradient became infinite. This only applies to layers of type SoftMax and indicates that the last step of SoftMax was not performed; if it returns true, the learning rate should probably be lowered.
Returns
A float in the range [0, 1] indicating the success rate of the predictions of the current neural network. This success rate is calculated by comparing the expected outputs with the predicted outputs of the evaluated records and returning the number of records where all outputs were correctly predicted divided by the total number of records evaluated.
Usage
The following C code puts into a float variable the current success rate of the first neural network when using the set:
#define TEST_TYPE_ByThreshold 0
#define TEST_TYPE_ByMax 1
#define TEST_TYPE_ByOutputs 2
float lSngTmp = SetSuccessGet(0, 0, &lBolThereIsNan, &lBolGradientExploded);
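For illustration only, the following sketch extends the example above by declaring the flag variables and reacting to them (the printed messages are just an assumption of how an application could react):
#include <stdbool.h>
#include <stdio.h>
bool lBolThereIsNan = false, lBolGradientExploded = false;
float lSngTmp = SetSuccessGet(0, 0, &lBolThereIsNan, &lBolGradientExploded);
if (lBolThereIsNan || lBolGradientExploded)
    printf("NaN or exploded gradient detected: consider lowering the learning rate.\n");
else
    printf("Success rate: %.2f %%\n", lSngTmp * 100.0f);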
Example: .NET/NNViewer
It is very important to have an example to understand how the library can be used.
NNViewer is a complete, fully functional, and easy-to-understand MS Windows example application, provided as source code. It allows drawing inputs and outputs to train the neural network and seeing how it learns the patterns.
For this purpose, the source code of the example NNViewer is provided in version MS Visual Studio VB.NET (source code requires .NET 5.0 and MS Visual Studio version 2022 or higher).


How does it work?
You can press the M key at any time to obtain help about the available commands. Most of them should be self-explanatory.
NNViewer has a graphical user interface showing, on the left side, a grid with the inputs, and on the right side, a grid with the outputs.
Every input and output (every cell of the grids) can have a value in the range from 0 to 1 inclusive. The value of each cell is shown with a color, from white for 0 to black for 1; a cell value of 0.5 is shown in grey.
The user can change the inputs at any time by clicking with the mouse on an input cell. The user can change the outputs in the same way, but only when the application is not in “thinking” mode. In “thinking” mode (after pressing the T key), the outputs are colored by the neural network.
You can create and delete pages, clear them, and navigate through them. Once you have all the pages of inputs and outputs that you want, you can make the neural network learn them.
Once the neural network has learnt your pages, you can switch to “thinking” with the T key, and the neural network will then show you, on the outputs, what it understands from the current inputs.
When the neural network is in “thinking” mode (T key), you can create a new page with the N key, draw new inputs, and see in real time the outputs that the neural network predicts.
Example files
Once you have created your pages of inputs and outputs, you can save them with the S key.
The application comes with 5 example files that can be loaded. These files are internally CSV files but renamed with the extension NNV:
- handwritten_28x28_50000_10000.nnv: the popular MNIST handwritten digits data set with 28×28 pixel images, 50k pages of training images and 10k pages of test images. Open it with 2 layers, a learning rate of 3 (L key), test type by maximum (V key), and make it learn (A key) for 30 epochs with 5000 batches (this means a batch size of 10 pages), 50,000 pages for training and no auto-tuning of hyperparameters. This should result in a success rate of about 80% in the first epoch.
- numbers.nnv: a few handwritten numbers, in low quality, to show how the neural network can learn with very few and poor-quality inputs and outputs.
- ranges.nnv: marks ranges of inputs and outputs at the same height. The purpose of this file is to show how the neural network learns to mark outputs at the same height as the inputs. Please note that 5% of the outputs are wrong, and that the neural network, in “thinking”, will mark the correct answer for the outputs. In other words, it will be detecting anomalies.
- symbols.nnv: a few pages of symbols, to show how the neural network can learn with very few inputs and outputs. Once learned (with the A key), you can create a new page (with N key), draw a similar symbol, and see how the neural network shows the output.
- xor.nnv: a very simple example of an XOR operation. Choose 3 layers for the neural network (in this case, because there are not enough neurons for 3 layers, NNViewer will generate a hidden layer with the same number of neurons as inputs) and train for 3,000 epochs; it will learn the operation.
Example: Windows/Python/Jupyter/Handwritten_Digits
Before opening the Jupyter file you need to:
- Install Anaconda for Windows; in our tests we used Anaconda 2023.03-1 downloaded from the official web page https://www.anaconda.com/
- From the Windows start menu, execute Anaconda Navigator, and Launch Jupyter Notebook to verify that you can open Jupyter notebooks in your computer.
- Additionally, you can open Jupyter Notebook quicker by executing, from the Windows start menu, Anaconda Prompt and then typing: jupyter notebook
- From the Jupyter notebook, navigate through your hard disk and open the file provided with the SDK: ExamplesPythonJupyter_Handwritten_DigitsIpnybAnaimo.ipynb
- Once opened, you can execute it cell by cell, or execute the whole notebook with the Cell menu and Run All.
The code trains the neural network with 50k handwritten digits and then tests it with 10k different handwritten digits. The output of the notebook is similar to the following:
Layers created (including the inputs and outputs layers): 3
Creating set of records 1000 / 60000...
Creating set of records 2000 / 60000...
Creating set of records 3000 / 60000...
...
Creating set of records 60000 / 60000...
2023-06-28 09:39:24.088862 - Initiating learning...
2023-06-28 09:39:29.557353 - Epoch 0: Success rate: 91.26999974250793 %
2023-06-28 09:39:35.440770 - Epoch 1: Success rate: 92.96000003814697 %
2023-06-28 09:39:41.059279 - Epoch 2: Success rate: 93.33999752998352 %
2023-06-28 09:39:46.695869 - Epoch 3: Success rate: 92.8600013256073 %
2023-06-28 09:39:51.703843 - Epoch 4: Success rate: 94.0500020980835 %
2023-06-28 09:39:57.409123 - Epoch 5: Success rate: 93.88999938964844 %
2023-06-28 09:40:03.194934 - Epoch 6: Success rate: 94.09000277519226 %
2023-06-28 09:40:08.700750 - Epoch 7: Success rate: 94.55999732017517 %
2023-06-28 09:40:14.001746 - Epoch 8: Success rate: 94.66999769210815 %
2023-06-28 09:40:19.160740 - Epoch 9: Success rate: 94.62000131607056 %
2023-06-28 09:40:24.720267 - Epoch 10: Success rate: 94.34999823570251 %
2023-06-28 09:40:30.188844 - Epoch 11: Success rate: 94.6399986743927 %
2023-06-28 09:40:35.903739 - Epoch 12: Success rate: 94.92999911308289 %
2023-06-28 09:40:41.646582 - Epoch 13: Success rate: 94.83000040054321 %
2023-06-28 09:40:47.024246 - Epoch 14: Success rate: 94.60999965667725 %
2023-06-28 09:40:52.813525 - Epoch 15: Success rate: 94.9500024318695 %
2023-06-28 09:40:58.321525 - Epoch 16: Success rate: 94.8199987411499 %
2023-06-28 09:41:03.793734 - Epoch 17: Success rate: 94.65000033378601 %
2023-06-28 09:41:09.400702 - Epoch 18: Success rate: 95.02000212669373 %
2023-06-28 09:41:15.238250 - Epoch 19: Success rate: 94.91999745368958 %
2023-06-28 09:41:20.422186 - Epoch 20: Success rate: 94.84999775886536 %
2023-06-28 09:41:26.132038 - Epoch 21: Success rate: 95.169997215271 %
2023-06-28 09:41:31.708575 - Epoch 22: Success rate: 95.14999985694885 %
2023-06-28 09:41:36.972715 - Epoch 23: Success rate: 95.02999782562256 %
2023-06-28 09:41:42.391361 - Epoch 24: Success rate: 95.08000016212463 %
2023-06-28 09:41:47.966994 - Epoch 25: Success rate: 95.09000182151794 %
2023-06-28 09:41:53.446017 - Epoch 26: Success rate: 94.9899971485138 %
2023-06-28 09:41:59.323663 - Epoch 27: Success rate: 95.06000280380249 %
2023-06-28 09:42:04.690296 - Epoch 28: Success rate: 95.09999752044678 %
2023-06-28 09:42:09.867939 - Epoch 29: Success rate: 95.02999782562256 %
Example: Windows/Python/MS Visual Studio/Handwritten_Digits
The NNViewer example solves the handwritten digits challenge (by opening the project handwritten_28x28_50000_10000.nnv). This Python example contains just the code needed to run on MS Windows and solve the same challenge. You can open the file PyAnaimo_hw.py with any Python editor, or open the file Python_Handwritten_Digits.sln with MS Visual Studio version 2022 or higher.
The output will be identical to that of the Jupyter notebook version.
Support
In case you need support, please use https://support.anaimo.com
Change log
Date | Version | Change |
20/02/2023 | 2022-03 (B.010900) | Multiple improvements in several functions. |
23/02/2023 | 2022-03 (B.010900) | SetAutoLearnContinue now learns, in the infinite mode, all records, not just those which did not predict in the 1st place. |
05/03/2023 | 2022-03 (B.010900) | Now weights are saved in NNK files as they are stored in memory. Previous NNK files will not work. |
06/03/2023 | 2022-03 (B.010900) | NeuBiasSet/Get & NeuDeltaGet have been deleted. Now all is managed with NeuWeightSet/Get. |
06/03/2023 | 2022-03 (B.010900) | NeuWeightUpdatedGet, parameter input now must start with 1. |
28/06/2023 | 2022-03 (B.010900) | Added the Jupyter version of the Handwritten Digits example. |
25/07/2023 | 2022-03 (B.010900) | New Hue and Saturation augmentation. |
26/07/2023 | 2022-03 (B.010900) | Changed AUGMENT_FILTER_SHAPE_ALL to AUGMENT_FILTER_SHAPE_NONE |
20/08/2023 | 2022-03 (B.010900) | SoftMax function has been improved. |
23/08/2023 | 2022-03 (B.010900) | Now you can limit the number of augmentations that are performed. |
09/09/2023 | 2022-03 (B.010900) | Added new method NetKnowledgeFilePathSet. |
14/09/2023 | 2024-01 (B.010010) | Removed parameter pIntRecordNumberOriginal from function SetRecordOutputMaxGet. |
17/09/2023 | 2024-01 (B.010010) | Added multiple neural network capacity. |
30/09/2023 | 2024-01 (B.010010) | Added that NetArchitectureCreateAuto, NetArchitectureCreate and NetKnowledgeLoadFromFile return the number of neurons created. |
01/11/2023 | 2024-01 (B.010010) | Changed in SetImagesAugment that number of inputs of set can be bigger than number of inputs of network. |
01/11/2023 | 2024-01 (B.010010) | Added multiset capacity. |
10/11/2023 | 2024-01 (B.010010) | Improved networks’ snapshot management. |
10/11/2023 | 2024-01 (B.010010) | New snapshots for sets supported. |
05/01/2024 | 2024-01 (B.010010) | Fixed a bug in SetRecordIncrease. |
09/03/2024 | 2024-01 (B.010010) | Improved parallelization. |
10/03/2024 | 2024-01 (B.010010) | Improved memory alignment for Intel CPUs. |
10/03/2024 | 2024-01 (B.010010) | Removed long long unnecessary overhead. |
14/03/2024 | 2024-01 (B.010010) | Improved internal management of lists. |
29/03/2024 | 2024-01 (B.010010) | Now files are saved first in a .tmp file to avoid corruption. |
30/07/2024 | 2024-01 (B.010010) | Now NetSnapshotTake and NetSnapshotGet return success if there are no neurons in the net. |
04/08/2024 | 2024-01 (B.010010) | Internal improvements for NPUs. |
23/09/2024 | 2024-01 (B.010010) | Full support for NPUs (GPUs, etc.). |