Odd data preparation with NARX network

Technical Source
4 min read · Aug 5, 2021

Hi,

I am currently working on a NARX network for a time-series prediction problem. I am using a 3*4644 array as inputs and a 1*4644 array as my targets. I do not have any delays on the input or the feedback, and here is the code that I am running:

% Solve an Autoregression Problem with External Input with a NARX Neural Network
% Script generated by Neural Time Series app
% Created 16-Jul-2015 12:18:44
%
% This script assumes these variables are defined:
%
% inputs - input time series.
% targets - feedback time series.
X = tonndata(inputs,true,false);
T = tonndata(targets,true,false);
% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. Suitable in low memory situations.
trainFcn = 'trainlm'; % Levenberg-Marquardt backpropagation.
% Create a Nonlinear Autoregressive Network with External Input
inputDelays = 1:1;
feedbackDelays = 1:1;
hiddenLayerSize = 8;
net = narxnet(inputDelays,feedbackDelays,hiddenLayerSize,'open',trainFcn);
% Choose Input and Feedback Pre/Post-Processing Functions
% Settings for feedback input are automatically applied to feedback output
% For a list of all processing functions type: help nnprocess
% Customize input parameters at: net.inputs{i}.processParam
% Customize output parameters at: net.outputs{i}.processParam
net.inputs{1}.processFcns = {'removeconstantrows','mapminmax'};
net.inputs{2}.processFcns = {'removeconstantrows','mapminmax'};
% Prepare the Data for Training and Simulation
% The function PREPARETS prepares timeseries data for a particular network,
% shifting time by the minimum amount to fill input states and layer
% states. Using PREPARETS allows you to keep your original time series data
% unchanged, while easily customizing it for networks with differing
% numbers of delays, with open loop or closed loop feedback modes.
[x,xi,ai,t] = preparets(net,X,{},T);
% Setup Division of Data for Training, Validation, Testing
% For a list of all data division functions type: help nndivide
net.divideFcn = 'divideblock';
net.divideMode = 'time'; % Divide up every sample
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
net.performFcn = 'mse'; % Mean Squared Error
% Choose Plot Functions
% For a list of all plot functions type: help nnplot
net.plotFcns = {'plotperform','plottrainstate', 'ploterrhist', ...
'plotregression', 'plotresponse', 'ploterrcorr', 'plotinerrcorr'};
% Train the Network
[net,tr] = train(net,x,t,xi,ai);
% Test the Network
y = net(x,xi,ai);
e = gsubtract(t,y);
performance = perform(net,t,y)
% Recalculate Training, Validation and Test Performance
trainTargets = gmultiply(t,tr.trainMask);
valTargets = gmultiply(t,tr.valMask);
testTargets = gmultiply(t,tr.testMask);
trainPerformance = perform(net,trainTargets,y)
valPerformance = perform(net,valTargets,y)
testPerformance = perform(net,testTargets,y)
% View the Network
%view(net)

Now my problem is that whenever I take a look at the variables inputs, x, targets and t, they don’t look like what I was expecting to see. Basically, my inputs and targets variables look like

[v1,v2,v3], [v2,v3,v4], [v3,v4,v5], ...

and

v4, v5, v6, ... respectively.
Because of how I set up my network, I expected my x and t variables to look like

inputs at time t | [v2,v3,v4], [v3,v4,v5], ...
target from t-1  |  v4,        v5,         ...

and

v5, v6, ...

respectively (which clearly isn’t perfect, since my target fed back from time t-1 is already contained in my input at time t, but I know how to fix that). Instead I get something even weirder and, this time, unexpected:

[v2,v3,v4], [v3,v4,v5], ...
 v5,        v6,         ...

and

v5, v6, ...

as if the feedback from time t-1 and the target at time t were the same. So I don’t know whether this is normal or whether I am doing something wrong (possibly the delays, which may need to be set to zero).
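
For what it’s worth, here is a minimal way to reproduce the layout I am describing on a toy dataset (simplenarx_dataset standing in for my actual data, with the same 1:1 delays and 8 hidden neurons):

% Minimal reproduction on a toy dataset (simplenarx_dataset stands in for my data)
[X,T] = simplenarx_dataset;            % 1x100 cell arrays
net = narxnet(1:1,1:1,8);              % open-loop NARX, 1:1 input and feedback delays
[x,xi,ai,t] = preparets(net,X,{},T);
size(x)                                % 2x99 cell: row 1 = X(2:100), row 2 = T(2:100)
size(xi)                               % 2x1  cell: initial delay states {X(1); T(1)}
isequal(x(2,:),t)                      % true: the shifted feedback row equals the shifted targets

So the feedback row of x and t really do contain the same values. Is that simply because PREPARETS leaves the one-step lag to the network’s internal tapped delay lines, or is something wrong with my setup?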

I’d be surprised if anyone understood what I mean, but I’m afraid I don’t know a better way to explain my problem. Sorry for the long and poorly structured post, and thank you for your help!

ANSWER


% You say that you have no delays. If you have no delays you should be using fitnet or patternnet.

% However, your code shows 1 input delay and 1 feedback delay.

% Your use of v for both input and target is confusing.

% In addition, your 3-D inputs should not be represented as row vectors

% Taking advantage of defaults, using the RNG seed 4151941, and using the simplenarx_dataset with the input tripled yields the following code. Are you able to use the following to better explain your concerns?

close all, clear all, clc
[ X0 T ] = simplenarx_dataset;
x0 = cell2mat( X0 );
X = con2seq( [ x0; x0; x0 ] );
x = cell2mat(X);
whos
% Name    Size     Bytes   Class
%  T      1x100    12000   cell
%  X      1x100    13600   cell
%  X0     1x100    12000   cell
%  x      3x100     2400   double
%  x0     1x100      800   double
ID = 1, FD = 1, H = 8
neto = narxnet( 1:ID, 1:FD, H );  % presumably; the rest of the answer's code is truncated here (see the link below)

% Check: 0.7*0.12489+0.15*(0.18163+0.23894) = 0.15051
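
% My reading of that check line (the three performance numbers come from the
% truncated part of the answer; the variable names below are mine): the overall
% MSE should equal the divide-ratio weighted sum of the training, validation and
% test MSEs, matching the 70/15/15 divideblock split used above.
trainPerf = 0.12489; valPerf = 0.18163; testPerf = 0.23894; % values quoted in the check
overall = 0.70*trainPerf + 0.15*( valPerf + testPerf )      % = 0.15051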

SEE THE COMPLETE ANSWER AT THE LINK BELOW

https://www.matlabsolutions.com/resources/odd-data-preparation-with-narx-network.php

