Longitudinal Two Stage Model
In order to analyze data processed with the longitudinal pipeline of FreeSurfer, you can run a simple two-stage model:
 The first stage reduces the temporal data within each subject to a single statistic (e.g. annualized atrophy rate or percent change).
 The second stage compares this measure across groups or correlates it with a covariate (e.g. age, gender, weight, ...).
Advantages:
 modeling the correlation structure of repeated measures can be avoided
 can deal with differently spaced time points
 works on stat files (e.g. aseg.stats or aparc.stats) and on cortical maps (e.g. thickness)
 the second stage can be performed with QDEC (simple GUI) or directly with mri_glmfit
 the second stage analysis can make use of different multiple comparison methods that come with mri_glmfit
 scripts are available (long_mris_slopes and long_stats_slopes), no MATLAB needed
 for the simple case of two time points and when looking at simple differences this model simplifies to a paired analysis, but can additionally compute (symmetrized) percent changes
 includes code for intersecting cortex labels (across time and across subjects) to make sure that all non-cortex measures are excluded
Disadvantages:
 does not account for different certainty of within subject slopes depending on the number of time points
 difficult to model nonlinear temporal behaviour
 difficult to deal with time-varying covariates (slopes would need to be fit to these for each subject to reduce them to a single number)
 cannot include information from subjects with only a single time point
For complex modeling and to overcome these disadvantages see our MATLAB scripts for LinearMixedEffectsModels.
Also take a look at the Longitudinal Tutorial for an example of how to apply the two-stage model.
When using the longitudinal processing stream (independent of your statistical analysis) please cite:
Within-Subject Template Estimation for Unbiased Longitudinal Image Analysis
M. Reuter, N.J. Schmansky, H.D. Rosas, B. Fischl.
NeuroImage 61(4), pp. 1402-1418, 2012.
Longitudinal QDEC Table
(see also LongitudinalQdecTable)
To get the longitudinal data ready for the two-stage analysis, you need to create a table (a space-separated text file) in the following format:
fsid           fsid-base  years  ...
OAS2_0001_MR1  OAS2_0001  0
OAS2_0001_MR2  OAS2_0001  1.25
OAS2_0004_MR1  OAS2_0004  0
OAS2_0004_MR2  OAS2_0004  1.47
...
where the first column is called fsid (containing each time point) and the second column is fsid-base, containing the within-subject template (= base) name, to group time points within subject. You can have many more columns such as gender, age, group, etc. Make sure there is a column containing an accurate time variable (optimally measured in years if you are interested in annualized change), such as age or the duration of the study (time from the initial scan). Here we use years to measure the time from the baseline scan (= tp1). You can see in the table that the two subjects OAS2_0001 and OAS2_0004 each have two time points (MR1, MR2) that are not equally spaced (approx. 15 and 18 months apart).
Note, the fsid column contains the original subject/time point IDs, not the longitudinal names. The scripts below know that this is a longitudinal table (because of the fsid-base column) and will automatically process the data from the longitudinal directories.
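Such a table can be written with any text editor. As a minimal sketch, the example table above could be created in the shell like this (the ./qdec path and the OASIS IDs are just the example values used throughout this page):

```shell
# Create the qdec directory and a minimal longitudinal qdec table
# (IDs and time values are the example values from the table above)
mkdir -p ./qdec
cat > ./qdec/long.qdec.table.dat <<'EOF'
fsid fsid-base years
OAS2_0001_MR1 OAS2_0001 0
OAS2_0001_MR2 OAS2_0001 1.25
OAS2_0004_MR1 OAS2_0004 0
OAS2_0004_MR2 OAS2_0004 1.47
EOF
```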
First Stage: Preparing the Data - QCACHE
Surface Analysis
The following commands can be used to prepare the data:
long_mris_slopes --qdec ./qdec/long.qdec.table.dat --meas thickness --hemi lh --do-avg --do-rate --do-pc1 --do-spc --do-stack --do-label --time years --qcache fsaverage
This will:
 (--qdec) read in the longitudinal qdec table
 (--meas) take the thickness measure of each time point
 (--hemi) work on the left hemisphere
 (--do-avg) compute the temporal average (thickness at the midpoint of the linear fit; here it is just the average)
 (--do-rate) compute the rate of change (thickening in mm/year)
 (--do-pc1) compute the percent change (with respect to time point 1)
 (--do-spc) compute a symmetrized percent change (with respect to the temporal average)
 (--do-stack) output a stacked thickness file for each subject (time series)
 (--do-label) intersect the cortex labels to ensure we don't include non-cortex regions
 (--time) specify the column in long.qdec.table.dat that contains the time variable (here 'years')
 (--qcache) and automatically smooth everything at different levels and map it to fsaverage for a potential group analysis using qdec
You then need to run the same command for the right hemisphere (--hemi rh). Note, if you split your table into smaller tables, each containing only the information for a specific subject, you can run this on a cluster in parallel for each subject to speed things up (you can use cd qdec ; long_qdec_table --qdec ./long.qdec.table.dat --split fsid-base ; cd .. to split the table into individual subjects inside the qdec directory).
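To see what this split does, here is a plain-awk sketch that mimics splitting a longitudinal table by its fsid-base column (the demo directory and the exact output file names are assumptions for illustration, not necessarily the script's naming convention):

```shell
# Build the example table, then split it into one table per fsid-base,
# repeating the header line in each piece (illustrative only)
mkdir -p qdec_split_demo
cat > qdec_split_demo/long.qdec.table.dat <<'EOF'
fsid fsid-base years
OAS2_0001_MR1 OAS2_0001 0
OAS2_0001_MR2 OAS2_0001 1.25
OAS2_0004_MR1 OAS2_0004 0
OAS2_0004_MR2 OAS2_0004 1.47
EOF
awk 'NR==1 { hdr=$0; next }
     { f = "qdec_split_demo/long.qdec.table." $2 ".dat"
       if (!(f in seen)) { print hdr > f; seen[f]=1 }
       print >> f }' qdec_split_demo/long.qdec.table.dat
ls qdec_split_demo/long.qdec.table.OAS2_*.dat
```

Each per-subject table keeps the header row, so it remains a valid qdec table that can be fed to long_mris_slopes on a separate cluster node.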
Now before continuing, let's get an idea about what the above measures mean in a simple setting (2 time points):
The temporal average is simply the average thickness: avg = 0.5 * (thick1 + thick2)
The rate of change is the difference per time unit, so rate = (thick2 - thick1) / (time2 - time1), here thickening in mm/year; we expect it to be negative in most regions (because of noise not necessarily for a single subject, but on average).
The percent change (pc1) is the rate with respect to the thickness at the first time point: pc1 = 100 * rate / thick1. We also expect it to be negative; it tells us how much percent thinning we have at a given cortical location. There is also a --do-pc1fit option that uses the value obtained from the linear fit at the baseline time, which is more reliable than the baseline value directly.
The symmetrized percent change (spc) is the rate with respect to the average thickness: spc = 100 * rate / avg. This is a more robust measure than pc1, because thickness at time point 1 is noisier than the average. Also, spc is symmetric: when reversing the order of tp1 and tp2 it switches sign; this is not true for pc1. Therefore, and for other reasons related to increased statistical power, we recommend using spc if all subjects have the same number of time points. If not, use pc1fit.
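As a sanity check, the four measures can be computed by hand for a hypothetical subject with two time points (the thickness and time values below are made up for illustration):

```shell
# Hypothetical two-time-point example: thickness 2.50 mm at year 0,
# 2.40 mm at year 1.25 (made-up values, not from any real subject)
awk 'BEGIN {
  thick1 = 2.50; thick2 = 2.40
  time1  = 0;    time2  = 1.25
  avg  = 0.5 * (thick1 + thick2)              # temporal average (mm)
  rate = (thick2 - thick1) / (time2 - time1)  # thickening in mm/year
  pc1  = 100 * rate / thick1                  # percent change w.r.t. tp1
  spc  = 100 * rate / avg                     # symmetrized percent change
  printf "avg=%.3f rate=%.3f pc1=%.3f spc=%.3f\n", avg, rate, pc1, spc
}'
```

This prints avg=2.450 rate=-0.080 pc1=-3.200 spc=-3.265, i.e. a thinning of 0.08 mm/year, or about 3.2 percent per year.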
The outputs of the first stage are stored in each subject-template (base) directory, e.g. $SUBJECTS_DIR/OAS2_0001/surf/lh.long.thickness-avg.fwhm15.mgh. There you can also find the surface map in fsaverage space, e.g. $SUBJECTS_DIR/OAS2_0001/surf/lh.long.thickness-spc.fwhm15.fsaverage.mgh.
Stats Analysis
Also take a look at
long_stats_slopes --help
which works very similarly to long_mris_slopes but processes stats files. There you won't need to map to fsaverage or smooth the data.
Second Stage: Option QDEC
In order to run a QDEC group analysis (described in the QDEC tutorial), you need a qdec table. Since QDEC cannot yet work with longitudinal qdec tables directly, you have to shrink the table into a cross-sectional form. Use
long_qdec_table --qdec ./qdec/long.qdec.table.dat --cross --out ./qdec/cross.qdec.table.dat
to create a table with only a single line for each subject. In this table, fsid will contain each subject's base (within-subject template) ID, because that is the location where we have stored the results of the first stage above. In this cross-sectional table, for each subject, numerical values such as age or height will be averaged across time; other values will be copied from the first time point as ordered in the input table. You see that time-varying covariates are not possible with this two-stage approach. If you are interested in working with time-varying covariates, you need to use LinearMixedEffectsModels.
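For the example table above, the resulting cross-sectional table would look roughly like this (fsid is now the base ID, and years has been averaged within subject; the exact column layout may differ):

```
fsid       years
OAS2_0001  0.625
OAS2_0004  0.735
```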
Also, QDEC will not know about our new files (e.g. lh.long.thickness-spc...). We can tell it to look for them by creating a $SUBJECTS_DIR/qdec/.Qdecrc file in the qdec directory that contains the following lines:
MEASURE1 = long.thickness-avg
MEASURE2 = long.thickness-rate
MEASURE3 = long.thickness-pc1
MEASURE4 = long.thickness-spc
You can then run QDEC and do all kinds of analysis on any of those files and other variables from the qdec table:
qdec --table ./qdec/cross.qdec.table.dat
You will see in the Design tab that you can now select e.g. long.thickness-rate under Measure as the dependent variable.
Second Stage: Option mri_glmfit
Instead of QDEC you can, of course, run mri_glmfit directly (either with your own design matrices or FSGD files). When using the --qcache fsaverage flag, the output of the first stage above is stored in the subject-template (base), one file per subject, already mapped to fsaverage space at all (or the specified) smoothing levels. Those files are named <base>/surf/lh.long.thickness-spc.fwhm5.fsaverage.mgh. You can simply stack them yourself (mri_concat) or, better, use one of the output flags of long_mris_slopes, e.g. --stack-spc or --stack-rate, to obtain the stacks directly, which can then be used in mri_glmfit. When creating your own FSGD file, make sure the order of the subjects is the same as the order used in the long qdec table for the stacking.
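Stacking the maps yourself with mri_concat could look like this (a sketch using the two example subjects and file names following the pattern above; your subject IDs and smoothing level will differ):

```
mri_concat --i $SUBJECTS_DIR/OAS2_0001/surf/lh.long.thickness-spc.fwhm5.fsaverage.mgh \
           --i $SUBJECTS_DIR/OAS2_0004/surf/lh.long.thickness-spc.fwhm5.fsaverage.mgh \
           --o lh.thickness-spc.stack.fwhm5.mgh
```

The order of the --i inputs determines the row order of the stack, which is why it must match the subject order in your FSGD file or design matrix.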
For example, for a one sample group mean to test if the percent thickness change is different from zero, you'd run:
long_mris_slopes --qdec long.testretest.qdec \
    --meas thickness \
    --hemi lh \
    --sd $SUBJECTS_DIR \
    --do-pc1 --do-label \
    --generic-time \
    --fwhm 15 \
    --qcache fsaverage \
    --stack-pc1 lh.testretest.thickness-pc1.stack.mgh \
    --isec-labels lh.testretest.fsaverage.cortex.label

mri_glmfit --osgm \
    --glmdir lh.testretest.thickness-pc1.fwhm15 \
    --y lh.testretest.thickness-pc1.stack.fwhm15.mgh \
    --label lh.testretest.fsaverage.cortex.label \
    --surf fsaverage lh

tksurfer fsaverage lh pial -overlay lh.testretest.thickness-pc1.fwhm15/osgm/sig.mgh
To test if percent volume changes of aseg structures are different from zero, you'd run:
long_stats_slopes --qdec long.testretest.qdec \
    --stats aseg.stats \
    --meas volume \
    --sd $SUBJECTS_DIR \
    --do-pc1 \
    --generic-time \
    --stack-pc1 testretest.aseg-pc1.stack.txt

mri_glmfit --osgm \
    --glmdir testretest.aseg-pc1 \
    --table testretest.aseg-pc1.stack.txt

cat testretest.aseg-pc1/sig.table.dat
See the Group Analysis Tutorial for more details on mri_glmfit.