Processor Distributed Fit of Parton Distribution Functions

Version 1.1 (December 30, 2006)


Further reading:

CTEQ webpage
PDFs: status report
Neural Nets
Condor
Globus


Phenomenology in the next decade will be dominated by data from the experiments at the Large Hadron Collider (LHC) at CERN. In order to fully exploit the output of a hadron collider like the LHC, a detailed description of the structure of the nucleon is needed. Such a description incorporates non-perturbative QCD dynamics, and can therefore only be computed from first principles on the lattice or extracted from experimental data. This project focuses on obtaining a faithful and unbiased determination of the parton distribution functions (PDFs) that describe the inner structure of the nucleon.

In order to compute errors and correlations reliably, a Monte Carlo sample of artificial datasets is generated in such a way that it reproduces the statistical features of the real experimental data. A Monte Carlo sample of PDFs is then obtained by fitting a PDF to each artificial dataset. The ensemble of fitted PDFs is then used to define a probability measure in the space of functions.
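As a rough illustration of the replica method, the sketch below fluctuates a toy dataset within uncorrelated Gaussian errors (real data would also carry correlated systematics) and fits each replica; the cubic polynomial in fit_pdf is a hypothetical stand-in for the actual PDF parametrisation, not the fitting code used in this project.

    import numpy as np

    def make_replicas(data, sigma, n_rep, seed=0):
        """Fluctuate each data point within its uncertainty to build
        artificial datasets with the same statistical spread as the data.
        Assumes uncorrelated Gaussian errors for simplicity."""
        rng = np.random.default_rng(seed)
        return data + sigma * rng.standard_normal((n_rep, data.size))

    def fit_pdf(x, replica, degree=3):
        """Toy stand-in fit: a cubic polynomial plays the role of the PDF
        parametrisation; each replica yields one member of the ensemble."""
        return np.polyfit(x, replica, degree)

    # Toy measurements of a falling distribution with 5% point-to-point errors.
    x = np.linspace(0.01, 0.9, 20)
    data = (1 - x) ** 3 / np.sqrt(x)
    sigma = 0.05 * data

    replicas = make_replicas(data, sigma, n_rep=100)
    fits = np.array([fit_pdf(x, r) for r in replicas])

    # The ensemble defines the probability measure: its mean and standard
    # deviation give the central fit and the error band.
    central, error = fits.mean(axis=0), fits.std(axis=0)

The spread of the fitted ensemble propagates the experimental uncertainties into the space of functions without assuming any particular error formula.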

The fitting task can naturally be run in parallel on a cluster of PCs or on the Grid, since each fit only requires the data corresponding to one replica of the original dataset. The aim of the project is to develop a framework that allows the user to run the fits, monitor their progress, and collect the results, in parallel for a large number of replicas. The use of existing tools on different platforms should be investigated, and extended where needed. The statistical analysis of the results, and some of their physical implications, should also be studied. A minimal sketch of the submit/monitor/collect cycle such a framework must implement is given below.
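The following fragment runs one toy fit per replica on a local pool of worker processes, reporting completions as they arrive; fit_one and run_fits are illustrative names rather than part of Condor, Globus, or any existing framework.

    from concurrent.futures import ProcessPoolExecutor, as_completed
    import numpy as np

    def fit_one(task):
        """Fit a single replica; in the real framework this would launch
        the full PDF fit on the data file shipped with the job."""
        i, x, replica = task
        return i, np.polyfit(x, replica, 3)

    def run_fits(x, replicas, n_workers=8):
        """Submit one job per replica, monitor completions as they
        arrive, and collect the fitted parameters in replica order."""
        results = {}
        with ProcessPoolExecutor(max_workers=n_workers) as pool:
            jobs = [pool.submit(fit_one, (i, x, r))
                    for i, r in enumerate(replicas)]
            for job in as_completed(jobs):
                i, params = job.result()
                results[i] = params
                print(f"replica {i:3d} done ({len(results)}/{len(replicas)})")
        return [results[i] for i in sorted(results)]

    if __name__ == "__main__":
        # Same toy dataset and Gaussian replicas as in the sketch above.
        rng = np.random.default_rng(0)
        x = np.linspace(0.01, 0.9, 20)
        data = (1 - x) ** 3 / np.sqrt(x)
        replicas = data + 0.05 * data * rng.standard_normal((100, x.size))
        fits = run_fits(x, replicas)

On a real cluster or Grid, each submitted task would instead become an independent batch job (for example through Condor or Globus), with the replica's dataset shipped as the job's input file and the fitted parameters collected as its output; the local pool above only mimics that cycle on a single machine.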