
Neural Networks Programming API

What is "Cortex Neural Networks Programming API"?

The Cortex program comes with SP_NN.DLL, the core of the Cortex artificial neural networks data processor. However, this DLL exports classes rather than functions, so to work with it you need to be a C++ programmer, and even then it is quite inconvenient.

The Cortex API DLL solves this problem by providing a wrapper that you can use the way most programmers do: through calls to functions exported by the DLL.

Installation of Cortex Neural Networks Programming API.

Download the Cortex API DLL, headers and examples archive. Unzip it to the folder of your choice, preserving the directory structure.

Some programs that you can download from this site can work together. If you want this kind of functionality (strongly recommended), create a common folder, for example S_PROJECTS, and unzip all the software into it. The sub-folders will be created for you automatically during the unzip procedure.

When specifying options in your unzip software, make sure that all subdirectories (subfolders) are restored; this is usually the default setting in WinZip. Installation is complete.

Uninstallation of Cortex Neural Networks Programming API.

Delete the folder containing the NN_API_Demo and SP_NN_API files.

Registration of Cortex NN Programming API

If you choose to register the Cortex Neural Networks Programming API software, you will need to enter the password (provided in the e-mail that you receive after registration) into the registration prompt.

Neural Network Library: Getting started.

The following chapters walk you through a simple example of using the "Cortex API" Neural Networks software: we are going to use the jaco.nn network that was created by the Cortex program.

The API itself is located in the SP_NN_API directory, and the sample code can be found in NN_API_Demo.

The sample comes with two functions (available from the "View" item of the main menu). The first function loads the data file and produces the output file for these data. The second function does the same record by record, so if you want to process your data as they arrive, you can use it as an example.

The sample code is written in C++; in theory, any language that can import DLL functions will do.

Note that in a real-life situation you may want to use your own data and to provide your own minimum and maximum values for the range. Usually this is not a problem. In the example, however, we used the .LGG file produced by Cortex as the source of data, which means that we don't have to produce lags, and the min/max values are calculated automatically, too. Just keep in mind that if you don't want to load data from a file, you don't have to.

Cortex Neural Networks Programming API functions

Functions are listed in NN_API.h file.

DllExport BOOL SP_NnLoad(char* strFileName, BOOL bIsPathRelative);

Loads the back propagation neural network description from the file. If bIsPathRelative is TRUE, the path is expected to be relative to the location of the executable (all DLLs and the executable are supposed to be in the same place, a Bin folder).

DllExport BOOL SP_FileLoader(char* strFileName, int* pnInputs, int nInputs, 
	BOOL bReverse);	

Loads the data from the file. The file is expected to be a comma-separated table with a one-line header (only one header line is supported; the function is not very sophisticated). pnInputs contains the zero-based numbers of the columns to load, and nInputs their count. bReverse is a flag specifying that the latest data are located first in the file, so the arrays should be reversed after loading.

You don't have to use this function, if you prefer to supply your own data, record by record.

DllExport BOOL SP_Normalize(int nDataSize, double dExtendRange);

Can only be used after SP_FileLoader(). It works with the internally stored data, normalizing them to the 0 - 1 range.

You don't have to use this function, if you prefer to supply your own data, record by record.

DllExport BOOL SP_NnLoadAndApplyData(char* strFileName, 
	int nNumOfLearningRecords,	// Obtain dMin, dMax for this interval
	double dExtendRange,
	int* pnInputs, int nInputs, int* pnOutputs, int nOutputs,
	int* pnLags, int nNumOfLags, BOOL bReverse);

The use of this function is shown in an example that comes with the API. strFileName is the data file. nNumOfLearningRecords defines the range (0 to nNumOfLearningRecords - 1) that is used for normalization; it is the same range that was used when Cortex created the network.

You don't have to use this function, if you prefer to supply your own data, record by record.

DllExport int SP_NnGetDataLength();		

Returns the length of the data loaded by SP_FileLoader.

You don't have to use this function, if you prefer to supply your own data, record by record.

DllExport int SP_NnGetDataWidth();		

Returns the width (number of columns) of the data loaded by SP_FileLoader.

You don't have to use this function, if you prefer to supply your own data, record by record.

DllExport double SP_NnGetData(int nRow, int nColumn);		

Retrieves one value from the internal data structure.

You don't have to use this function, if you prefer to supply your own data, record by record.

DllExport double* SP_GetMinArray();
DllExport double* SP_GetMaxArray();

Return the minimums and maximums for the internally stored data; these can be used for normalization / de-normalization.

You don't have to use these functions, if you prefer to supply your own data, record by record.

DllExport BOOL SP_NnApplyToRecord(double* pdData, int nInputs, int nOutputs);		

Takes a crosscut of the data (a record) and produces the NN output. The output is written to the same array, right after the inputs. Call this function in a cycle to process your data record by record.

DllExport BOOL SP_NnUnLoad();

Performs the cleanup.

Back Propagation Neural Networks: Walking through the code

The code for sample functions is located in MainFrm.cpp.


The first sample function, CMainFrame::OnApplyNn(), serves as an example of the simple case when you have data stored in a file. The file does not include lags; the NN will calculate them for you.

First we load the NN. Then we specify the indexes for the inputs and outputs. The file has the following format: Date,Open,High,Low,Close,Volume. We use Open, High, Low, Close and their lags (...-4, ...-5, ...-6, ...-9).

The data are in reversed order (last dates first), so we need to reverse the arrays.

After the job is done, we export the data to the jaco.APL file.

void CMainFrame::OnApplyNn() 
{
	SP_NnLoad("c:\\S_Projects\\NN_API_Demo\\data\\jaco.nn", FALSE);

	// Important: We have Date, Open, High, Low, Close, Volume
	// First array is referencing 
	// Open, High, Low, Close (1, 2, 3, 4)
	// The second array is referencing Close WITHIN the first 
	// array, so it is 3, and not 4
	int pnInputs[] = { 1, 2, 3, 4 };
	int pnOutputs[] = { 3 };
	int pnLags[] = { 4, 5, 6, 9 };

	SP_NnLoadAndApplyData(
		"c:\\S_Projects\\NN_API_Demo\\data\\jaco.csv", // data file (name assumed)
		100,	// Obtain dMin, dMax for this interval
		1.0,	// dExtendRange
		pnInputs, 4, 
		pnOutputs, 1,
		pnLags, 4, 
		TRUE);	// bReverse

	FILE* file = fopen(
		"c:\\S_Projects\\NN_API_Demo\\data\\jaco.apl", "wb");
	if(file)
	{
		fputs("Date, Close, NN:Close\r\n", file);
		for(int i = 0; i < SP_NnGetDataLength(); i++)
			fprintf(file, "%d, %f, %f\r\n", i, SP_NnGetData(i, 5), 
				SP_NnGetData(i, SP_NnGetDataWidth() - 1));
		fclose(file);
	}

	SP_NnUnLoad();
}

To build the chart of the Close vs NN:Close prediction, you can use the last tab of the Cortex program.


The second sample function is a slightly more complex task, as we need to do more of the work explicitly rather than calling a single DLL function.

Note that we need to get the data from somewhere, so we take them from the jaco.LGG file produced by Cortex. This makes our task simpler, as we don't have to write a data loader, normalizer, and so on. If you want to use your own (non-file) data source, you can do that as well.

Before we start, we need to create arrays with indexes of input and (for the reason explained below) output columns.

	// ...-4, ...-5, ...-6, ...-9 for Open, High, Low, 
	// Close, and the last entry for the Close
	int pnInputs[] = { 5, 6, 7, 10, 15, 16, 17, 20, 25, 
		26, 27, 30, 35, 36, 37, 40, 41 };
	// Close
	int pnOutputs[] = { 41 }; // or 31
	// The real number of inputs NN is expecting is 16. 
	// We load Close to get it normalized, so that we 
	// know the normalization range. Later we will decrease 
	// the nInputs to 16.
	int nInputs = 17;
	int nOutputs = 1;
	double dExtendRange = 1.0;

We load the NN.

SP_NnLoad("c:\\S_Projects\\NN_API_Demo\\data\\jaco.nn", FALSE);

Then we load the data file. As mentioned above, this is not a required step; we do it simply because we need to get the data from somewhere, and since we already have the loader function, we use it for demo purposes. In a real-life situation you will compose the input array from whatever data you want.

SP_FileLoader("c:\\S_Projects\\NN_API_Demo\\data\\jaco.lgg", 
	pnInputs, nInputs, FALSE);

Also note that we use the .LGG file here, which already contains lags. Again, in real life it is up to you to create the lag arrays, normalize them, and so on.

Here is the list of headers for the .LGG file we use, with column numbers; these numbers are used in the "inputs" and "outputs" arrays:

No(0),Open(1),Open-1(2),Open-2(3),Open-3(4),Open-4(5),Open-5(6),Open-6(7), Open-7(8),Open-8(9),Open-9(10),High(11),High-1(12),High-2(13),High-3(14),High-4(15), High-5(16),High-6(17),High-7(18),High-8(19),High-9(20),Low(21),Low-1(22),Low-2(23), Low-3(24),Low-4(25),Low-5(26),Low-6(27),Low-7(28),Low-8(29),Low-9(30),Close(31), Close-1(32),Close-2(33),Close-3(34),Close-4(35),Close-5(36),Close-6(37),Close-7(38), Close-8(39),Close-9(40),Close(41)

The NN we use in this example was trained using ...-4, ...-5, ...-6, ...-9 as inputs, and Close as the output. This is exactly what we need to supply, otherwise the output will be wrong.

Once again: we only load the data file to have some data source. You don't have to load the data file; you can use any data source, record by record.

Note that bReverse is FALSE, as our lag file is already reversed.

Then we need to normalize the data to the 0 - 1 range. Again, as this is just an example, we cheat. The SP_FileLoader function loads the data into an internal data structure that you can only access through the SP_NnGetDataLength, SP_NnGetDataWidth and SP_NnGetData functions. The SP_Normalize function normalizes the data in this internal structure.

If we load the data using the SP_FileLoader function, we can simply call the DLL's SP_Normalize function:

SP_Normalize(100 /*nNumOfLearningRecords*/, dExtendRange);

In the next step we create an array pdData with one number for each input column. We copy the input data there and feed this array (a crosscut of the input arrays, a record) to the NN. The output data are copied to the end of the pdData array, after the inputs, so you must allocate enough space.

double* pdData = new double[nInputs + nOutputs];

After the NN has produced the output data, we need to de-normalize it. We have a choice. If you feed your own data (without cheating, as we do), you are supposed to normalize them yourself, and therefore you know the min and max of the data. If you do it the way we did here, you can call DLL functions to find the min and max stored during normalization.

Important: the SP_GetMinArray and SP_GetMaxArray functions return arrays of minimums and maximums for the LOADED data, and SP_NnGetData likewise indexes the LOADED data; therefore, to access individual values we need to use the index within the LOADED data. For example, the Close (column 41 in the data file) can be accessed by index 16. This is the ONLY reason we included 41 (Close) as the last input in the list of inputs: to get it loaded and normalized, so that we can get its max and min. After that, we decrease nInputs (nInputs--), so that all further processing ignores the Close.

If we did it without "cheating", we would have to write our own functions to obtain the data arrays, calculate the max and min, and normalize the data; then we would go straight to SP_NnApplyToRecord. However, if we work with a file in the appropriate format, the functions are already in the DLL, so why not use them?

double* dMinArray = SP_GetMinArray();
double* dMaxArray = SP_GetMaxArray();

We fill the array (the record) that we will feed to the network:

for(int j = 0; j < nInputs; j++)
	pdData[j] = SP_NnGetData(i, j);

And we process all records in a cycle. As we obtain the data, we write them to the output file:

if(SP_NnApplyToRecord(pdData, nInputs, nOutputs))
	fprintf(file, "%d, %f, %f\r\n", i,
		SP_NnGetData(i, 16) * (dMaxArray[16] - dMinArray[16])
			+ dMinArray[16], 
		pdData[16] * (dMaxArray[16] - dMinArray[16])
			+ dMinArray[16]);

More samples come with the product archive.

(C), all rights reserved

Please read the disclaimer