Cross-Validation Drop-Down Menu
A typical cross-validation procedure involves more than one sub-validation experiment, each of which selects a different subset of samples for model building and a different subset for model testing.
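This sub-experiment structure can be sketched in Python. This is a minimal illustration only; `cross_validate`, `build_model`, and `test_model` are hypothetical names, not part of the software described here:

```python
def cross_validate(objects, splits, build_model, test_model):
    """Run one sub-validation experiment per split: each split's objects are
    withheld for testing while the remaining objects build the model."""
    results = []
    for test_idx in splits:
        train = [objects[i] for i in range(len(objects)) if i not in test_idx]
        test = [objects[i] for i in test_idx]
        results.append(test_model(build_model(train), test))
    return results

# Toy usage: the "model" is just the training mean, and "testing" records
# that mean as the prediction for every withheld object.
data = list(range(10))
splits = [[0, 1, 2, 3, 4], [5, 6, 7, 8, 9]]
preds = cross_validate(data, splits,
                       build_model=lambda train: sum(train) / len(train),
                       test_model=lambda m, test: [m] * len(test))
print(preds)  # each test object gets the mean of the other half
```

Each sub-experiment's predictions come from a model that never saw the withheld objects, which is what distinguishes cross-validated predictions from predictions made by a model built on all of the data.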
In this case, there are two possible sources of "predicted" Y values: 1) Y values predicted by a model built from all of the available objects in the data set (called "Y Predicted"), and 2) Y values predicted during cross-validation (called "Y CV Predicted").

When using the Analysis GUI, the cross-validation settings can be accessed by clicking on the red checkmark on the corner of the Model button of the status panel, by selecting the "Choose Cross-Validation" button on the Analysis Flowchart, or by selecting Cross-Validation from the Tools menu of the Analysis GUI. Note that a set of data must first be loaded into the Analysis interface before the cross-validation utility becomes available.

In this case, the loaded X and Y variables, spec1 and conc from nir_data.mat, have 30 objects, and the default number of Data Splits is the integer closest to the square root of the number of objects, which is 5.

Once the parameters have been set to the desired values, select the "Apply" button to apply these settings and keep the Cross-Validation window open, or select the "OK" button to apply these settings and close the Cross-Validation window.
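The default Data Splits rule described above (the integer closest to the square root of the number of objects) can be checked with a short Python snippet; `default_num_splits` is an illustrative helper, not a function of the software itself:

```python
import math

def default_num_splits(n_objects):
    # Default number of Data Splits: the integer closest to sqrt(n_objects).
    return round(math.sqrt(n_objects))

print(default_num_splits(30))  # 5, matching the 30-object nir_data example
```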
The parameters that are available for each cross-validation method are specified in the second row of Table 1.

Figure: The Cross-Validation window, with parameter selection shown for the "contiguous block" method.
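As a rough sketch of how a contiguous-block method might partition the objects into splits, the snippet below divides the object indices into consecutive blocks. The exact handling of remainders when the object count does not divide evenly is an assumption here; the actual software may distribute leftover objects differently:

```python
def contiguous_blocks(n_objects, n_splits):
    # Partition indices 0..n_objects-1 into n_splits contiguous blocks;
    # earlier blocks take one extra object when n_objects % n_splits != 0
    # (an assumed convention for this sketch).
    base, extra = divmod(n_objects, n_splits)
    blocks, start = [], 0
    for i in range(n_splits):
        size = base + (1 if i < extra else 0)
        blocks.append(list(range(start, start + size)))
        start += size
    return blocks

print([len(b) for b in contiguous_blocks(30, 5)])  # [6, 6, 6, 6, 6]
```

With 30 objects and 5 splits, as in the nir_data example, each contiguous block contains 6 consecutive objects.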