Immunological Computation

Biological systems such as human beings can be regarded as sophisticated information-processing systems, and can be expected to inspire a variety of ideas in science and engineering. Biologically motivated information-processing systems can be classified into brain-nervous systems (neural networks), genetic systems (evolutionary algorithms), and immune systems (artificial immune systems). Among these, nervous systems and genetic systems have been widely applied in various fields, whereas there have been relatively few applications of the immune system.

The natural immune system is a very complex system with several mechanisms for defense against pathogenic organisms. Its main purpose is to recognize all cells (or molecules) within the body and categorize them as self or nonself. Nonself cells are further categorized in order to induce the appropriate type of defensive mechanism. The immune system learns through evolution to distinguish between dangerous foreign antigens (e.g., bacteria and viruses) and the body's own cells or molecules.

From an information-processing perspective, the immune system is a remarkably parallel and distributed adaptive system. It uses learning, memory, and associative retrieval to solve recognition and classification tasks. In particular, it learns to recognize relevant patterns, remembers patterns it has seen previously, and uses combinatorics to construct pattern detectors efficiently. Moreover, the overall behavior of the system is an emergent property of many local interactions. These information-processing abilities of the immune system suggest several important ideas for computation. The resulting field is sometimes referred to as immunological computation or artificial immune systems, and a number of research groups are now working in it.

Dipankar Dasgupta's Specialization: Anomaly Detection

The objective of this research is to develop an efficient detection algorithm that can notice changes in the steady-state characteristics of a system or process. In these experiments, the notion of self is taken to be the normal behavior patterns of the monitored system. The normal behavior of a system or process can often be characterized by a series of observations over time, and it generally exhibits stable patterns when observed over a period. Any deviation that exceeds an allowable variation in the observed data is therefore considered an anomaly in the behavior pattern. The approach relies on a sufficiently large sample of normal data (one that captures the semantics of the data patterns) to generate a diverse set of detectors that probabilistically detect changes without requiring prior knowledge of anomaly (or fault) patterns.
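
As a concrete illustration, the following Python sketch shows one way such detectors could be generated and used, in the spirit of negative selection: random candidate detectors are discarded if they match any normal ("self") sample, and the surviving detectors flag observations that fall outside the normal region. The real-valued encoding, the Euclidean matching rule, and all parameter values here are illustrative assumptions, not the exact scheme used in these experiments.

```python
# A minimal sketch of negative-selection-style change detection, assuming
# observations are real-valued points normalized to [0, 1]^d. The matching
# rule (Euclidean distance within a fixed radius) and all parameters are
# illustrative choices, not the exact encoding used in the experiments.
import math
import random

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def generate_detectors(self_samples, n_detectors, radius, dim, rng):
    """Censor random candidates: keep only those matching no self sample."""
    detectors = []
    while len(detectors) < n_detectors:
        candidate = [rng.random() for _ in range(dim)]
        if all(distance(candidate, s) > radius for s in self_samples):
            detectors.append(candidate)
    return detectors

def is_anomalous(sample, detectors, radius):
    """Flag a sample whenever any detector covers it."""
    return any(distance(sample, d) <= radius for d in detectors)

rng = random.Random(42)
# "Self": observations clustered around a normal operating point (0.5, 0.5).
self_set = [[0.5 + rng.uniform(-0.1, 0.1), 0.5 + rng.uniform(-0.1, 0.1)]
            for _ in range(100)]
detectors = generate_detectors(self_set, n_detectors=200, radius=0.1,
                               dim=2, rng=rng)
print(is_anomalous([0.50, 0.52], detectors, radius=0.1))  # normal -> expected False
print(is_anomalous([0.90, 0.10], detectors, radius=0.1))  # deviation -> expected True
```

Because the detectors cover the nonself space only probabilistically, detection confidence grows with the number of detectors generated, which mirrors the probabilistic detection property described above.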