Applying z-score before scaling to [0,1]?

Technical Source
2 min read · Jun 9, 2022


Hello,

I'm currently using a neural network to classify a dataset. Before classification, either the data points or the features should be normalized, and the neural-network toolbox I'm using requires all values to lie in the range [0,1].

Does it make sense to first apply a z-score and then scale to the range [0,1]?

Second, should I normalize along the feature vectors or along the data points (whether applying a z-score or scaling to [0,1])?
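For concreteness, the two-step transform described in the question can be sketched as follows (a NumPy stand-in rather than MATLAB code; the toy values and the row-per-sample orientation are assumptions):

```python
import numpy as np

# Toy design matrix: rows are data points, columns are features
# (an assumption; some toolboxes expect the transpose).
X = np.array([[1.0, 200.0],
              [2.0, 400.0],
              [3.0, 600.0]])

# Step 1: z-score each feature (column) to zero mean, unit variance.
Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)

# Step 2: rescale each feature linearly to [0, 1], as the toolbox requires.
X01 = (Z - Z.min(axis=0)) / (Z.max(axis=0) - Z.min(axis=0))

print(X01.min(axis=0), X01.max(axis=0))  # each column now spans exactly [0, 1]
```

Note that because the z-score is an affine map per feature, the final values are identical to what min-max scaling alone would produce on this data; the z-score step only matters if something nonlinear (e.g. outlier clipping) happens in between.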


It is well known (e.g., see the comp.ai.neural-nets FAQ) that the most efficient MLP nets are those that have:

1. Bipolar sigmoid hidden-node transfer functions, e.g., TANSIG (== TANH), NOT LOGSIG.
2. Bipolar scaled input variables, for example:
   a. Normalized to [-1,1] via MAPMINMAX (MATLAB's default), or
   b. Standardized to zero mean / unit variance via MAPSTD or ZSCORE.
3. Initial weight assignments that ensure the initial hidden-node outputs lie in the linear region of the sigmoid.
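As a sketch, the two bipolar scalings in item 2 can be written out like this (NumPy stand-ins for MAPMINMAX and MAPSTD/ZSCORE, not MATLAB code; note that MATLAB's mapminmax operates row-wise by default, while the vector below is one-dimensional):

```python
import numpy as np

# One input variable across the training set (hypothetical values).
x = np.array([10.0, 20.0, 25.0, 40.0])

# 2a. Normalize to [-1, 1], as mapminmax does by default.
x_bipolar = 2 * (x - x.min()) / (x.max() - x.min()) - 1

# 2b. Standardize to zero mean / unit variance, as mapstd or zscore does.
x_std = (x - x.mean()) / x.std(ddof=1)

print(x_bipolar)  # spans exactly [-1, 1]
print(x_std)      # zero mean, unit sample variance
```

Both transforms are bipolar: the resulting values are centered around zero, which keeps the initial hidden-node inputs near the linear region of a tanh sigmoid.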

Before training I always use the functions MINMAX (NOT mapminmax), ZSCORE, and PLOT to find, and then eliminate or modify, outliers and incorrect data.
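A minimal sketch of that kind of z-score outlier screen (NumPy rather than MATLAB; the data and the 3-sigma cutoff are illustrative assumptions):

```python
import numpy as np

# 20 well-behaved values plus one gross outlier (hypothetical data).
x = np.r_[np.linspace(4.5, 5.5, 20), 50.0]

# z-score the sample, then flag points whose |z| exceeds a threshold.
# 3 is a common but arbitrary cutoff; with very small samples it can
# never fire, since |z| is bounded above by (n - 1) / sqrt(n).
z = (x - x.mean()) / x.std(ddof=1)
outliers = np.abs(z) > 3

print(x[outliers])  # the flagged points, for inspection before removal
```

Plotting the data alongside the flags (as the author does with PLOT) is important: the threshold only suggests candidates, and each one should be inspected before being deleted or modified.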

Even though I prefer standardization, I accept MATLAB’s [-1,1] default, which I assume is taken into account by MATLAB’s default weight initialization. (I guess I should check this … I’ve been burned by other logical assumptions).

