Data farming
Data Farming is the process of using a high-performance computer or computing grid to run a simulation thousands or millions of times across a large parameter and value space. The result is a “landscape” of output that can be analyzed for trends, anomalies, and insights across multiple parameter dimensions.
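As a minimal illustration of the idea (a sketch, not any particular data farming toolchain), the following Python snippet runs a toy stochastic model at every point of a small two-parameter grid, with repeated replications per point, and collects the mean output into a landscape keyed by parameter values. The simulate function, the parameter names, and all values are hypothetical.

    import itertools
    import random

    def simulate(speed, sensor_range, seed):
        # Toy stand-in for a simulation model: a noisy function of the inputs.
        rng = random.Random(seed)
        return rng.gauss(0.5 * speed + 0.3 * sensor_range, 1.0)

    speeds = [1, 2, 3, 4, 5]          # hypothetical parameter values
    sensor_ranges = [10, 20, 30, 40]  # hypothetical parameter values
    replications = 30                 # repeated runs per design point

    # "Plant" the runs across the grid and "harvest" the outputs.
    landscape = {}
    for speed, sensor_range in itertools.product(speeds, sensor_ranges):
        runs = [simulate(speed, sensor_range, seed) for seed in range(replications)]
        landscape[(speed, sensor_range)] = sum(runs) / replications

    # The resulting landscape can be inspected for trends in both dimensions.
    for point, mean_output in sorted(landscape.items()):
        print(point, round(mean_output, 2))

A real data farming study would distribute these runs across a computing grid and explore far larger spaces, but the plant-and-harvest pattern is the same.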

Origins of the term

The term Data Farming comes from the idea of planting data in the simulation and parameter/value space, and then harvesting the data that results from the simulation runs.

Usage

Data Farming was originally used in the Marine Corps' Project Albert. Small agent-based distillation models (simulations) were created to capture a specific military challenge. These models were run thousands or millions of times at the Maui High Performance Computing Center and other facilities. Project Albert analysts worked with military subject-matter experts to refine the models and interpret the results. The Naval Postgraduate School also worked closely with Project Albert in model generation, output analysis, and the creation of new experimental designs to better leverage the computing capabilities at Maui and other facilities.
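The article does not say which experimental designs were created; as one hedged example, space-filling designs such as Latin hypercube samples are a common way to cover a many-dimensional parameter space with far fewer runs than a full factorial grid. The sketch below uses SciPy's qmc module; the dimension, bounds, and point count are hypothetical.

    import numpy as np
    from scipy.stats import qmc

    # 50 design points in a hypothetical 3-parameter space; far fewer runs
    # than a 3-dimensional full factorial grid would require.
    sampler = qmc.LatinHypercube(d=3, seed=42)
    unit_points = sampler.random(n=50)  # points in the unit cube [0, 1)^3
    design = qmc.scale(unit_points,
                       l_bounds=[1.0, 10.0, 0.0],
                       u_bounds=[5.0, 40.0, 1.0])

    for point in design:
        print(np.round(point, 2))  # each row is one set of simulation inputs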

Workshops

International Data Farming Workshops are held twice each year, in the spring and fall. Workshop information, including proceedings from prior workshops and registration information for future ones, can be found at the Naval Postgraduate School's SEED Center for Data Farming.
