The first step in first-level analysis is the creation of an "analysis". The analysis is a collection of information about how you want to analyze the functional data. This includes the specification of the input data, event types, event schedules, hemodynamic model, nuisance regressors, etc. For FSL users, this is more or less the equivalent of running the Feat GUI.

In FS-FAST, the work-flow is a little different from that of SPM or FSL. In those packages, you specify all the analysis parameters *every* time you analyze a data set. In FS-FAST, you do so once, regardless of how many data sets you have. This way you assure that all your data are analyzed with the same parameters. This also requires that you set up your data correctly in the first place. To do this, create a Study Directory (Study_Dir) that is independent of any of your sessions (though your sessions may be subdirectories of your Study_Dir).

Creating the analysis is one of the most important steps, but it is also one of the most complicated. It is done with the "mkanalysis-sess" command, many of whose options are described on the "mkanalysis-sess" wiki page. You can run it with -help to see documentation above and beyond what is written here. mkanalysis-sess has a lot of options. Rather than discussing each in the abstract, I'll give an example command line first:

mkanalysis-sess -analysis emotion -TR 2 -paradigm emotion.par -funcstem fmcsm5 -inorm -polyfit 2 -mcextreg

Note that this command returns very quickly and does not require any information about where the functional data are. This is because it does not actually perform any processing. The actual processing will be done in downstream stages. This just collects the parameters for later use.
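For example, when you later estimate the GLM, you point the downstream command at this analysis rather than re-specifying any parameters. A sketch, assuming selxavg3-sess (the FS-FAST GLM estimation command) and a hypothetical session directory named sess01:

```
# Estimate the first-level GLM using the parameters stored in the
# "emotion" analysis; "sess01" is an illustrative session name.
selxavg3-sess -analysis emotion -s sess01
```

The session is where the functional data finally enter the picture; the analysis itself remains just a bundle of parameters.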

This will create a directory called "emotion" in the Study_Dir, and there will be two files in this directory (including analysis.dat) with the parameters that you specified. When running future commands, you need only refer to this analysis (most of the downstream commands will take a -analysis option). In this command, the TR is set to 2 seconds. This information is redundant with the information in the functional session and so is used for quality control. The paradigm is the name of the file with the stimulus schedule information as described in FsFastCreateParadigmFile. Downstream commands will expect to find the emotion.par file in the same directory as each functional run. -funcstem fmcsm5 tells it to look for a functional volume called fmcsm5 as created by your preprocessing. -inorm tells downstream processes to perform the intensity normalization (note that there is an inorm stage in the preprocessing, but remember that this stage just computes the global mean and does not rescale the data). Note also that the intensity normalization is performed over the entire 4D volume (i.e., each time point is NOT rescaled individually). -polyfit 2 specifies that a 2nd-order polynomial be used to remove slow trends. -mcextreg instructs the analysis to use the motion correction parameters as nuisance regressors.
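For reference, a paradigm file such as emotion.par is just a plain-text table. A hypothetical example in one common format described in FsFastCreateParadigmFile (onset in seconds, numeric condition id, duration, weight, condition name; the condition names and timings here are made up, and condition 0 is conventionally the null/fixation condition):

```
0.00    0   16.00   1   fixation
16.00   1   16.00   1   neutral
32.00   2   16.00   1   fear
48.00   0   16.00   1   fixation
```

A copy of this file would go in each functional run directory so that downstream commands can find it next to the data.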