Multiscale Electrophysiology Format
Multiscale Electrophysiology Format (MEF) was developed to handle the large amounts of data produced by large-scale electrophysiology in human and animal subjects. MEF can store any time series data up to 24 bits in length, and employs lossless range-encoded difference compression. Subject identifying information in the file header can be encrypted using 128-bit AES encryption in order to comply with HIPAA requirements for patient privacy when transmitting data across an open network.
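The compression scheme pairs difference (delta) encoding with a range coder. The range-coding stage is too involved for a short sketch, but the difference step, which turns slowly varying neural signals into small residuals that an entropy coder can pack tightly, can be illustrated as follows (the function names are illustrative, not part of the MEF specification):

```python
def delta_encode(samples):
    """Replace each sample with its difference from the previous sample."""
    prev = 0
    diffs = []
    for s in samples:
        diffs.append(s - prev)
        prev = s
    return diffs

def delta_decode(diffs):
    """Invert delta_encode by cumulative summation."""
    samples = []
    acc = 0
    for d in diffs:
        acc += d
        samples.append(acc)
    return samples

# Slowly varying signal: large absolute values, small differences.
signal = [100000, 100004, 100007, 100006, 100010]
encoded = delta_encode(signal)   # → [100000, 4, 3, -1, 4]
assert delta_decode(encoded) == signal
```

The small residuals have a much narrower value distribution than the raw samples, which is what makes the subsequent lossless range-coding stage effective.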

Compressed data is stored in independent blocks to allow direct access to the data, facilitate parallel processing, and limit the effects of potential damage to files. Data fidelity is ensured by a 32-bit cyclic redundancy check in each compressed data block using the Koopman polynomial (0xEB31D82E), which provides a Hamming distance of 4 for data words up to 114 kbits in length.
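A per-block CRC with this polynomial can be sketched as below. 0xEB31D82E is the reflected (least-significant-bit-first) form of the Koopman polynomial, so a right-shifting bitwise implementation uses it directly; the 0xFFFFFFFF seed is a common CRC convention assumed here, not a detail taken from the MEF specification:

```python
def crc32_koopman(data: bytes, crc: int = 0xFFFFFFFF) -> int:
    """Bitwise CRC-32 over `data` using the reflected Koopman
    polynomial 0xEB31D82E.

    The 0xFFFFFFFF initial value is an assumption; MEF's actual
    seed and finalization steps may differ.
    """
    POLY = 0xEB31D82E
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ (POLY if crc & 1 else 0)
    return crc & 0xFFFFFFFF

block = b"example compressed data block"
checksum = crc32_koopman(block)

# A single flipped bit in the block changes the checksum,
# so corruption within a block is detected independently.
corrupted = bytes([block[0] ^ 0x01]) + block[1:]
assert crc32_koopman(corrupted) != checksum
```

Because each block carries its own checksum, a damaged region of a file invalidates only the affected blocks; the rest of the recording remains readable.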

A formal specification of the format is publicly available.

Sources

  • Martin, G.N.N. Range encoding: an algorithm for removing redundancy from a digitised message. Video & Data Recording Conference, Southampton, 1979.
  • Koopman, P. 32-bit cyclic redundancy codes for Internet applications. International Conference on Dependable Systems and Networks (DSN), June 2002, p. 459.
  • Brinkmann, B.H., et al. Large-scale electrophysiology: acquisition, compression, encryption, and storage of big data. Journal of Neuroscience Methods 180 (2009), 185–192.
The source of this article is Wikipedia, the free encyclopedia. The text of this article is licensed under the GFDL.
 