
Reverse BioEngineering

From Biomatics.org



Biologists are attempting to decipher genetic and protein codes and the workings of molecular networks. The general study of circuits and networks is the domain of computer scientists, who interpret biological phenomena in the context of their specialties while at the same time hoping to learn tricks of their trade from biological systems. This field is now decades old, encompassing topics such as molecular computation, genetic algorithms, genomics, and proteomics. Leonard Adleman showed in 1994 how DNA in a test tube could perform a calculation. In 2000, Allis coined the phrase “histone code” in recognition of the role of histone proteins in affecting transcription. Today scientists use DNA microarray and bioinformatics technologies to infer genome-wide functional linkages among proteins in cellular complexes and metabolic pathways.

Computer scientists and engineers work with gates and flip-flops, where biologists speak of chains of carbon atoms: nucleic acids, proteins, and metabolic pathways. Clearly, the basic building blocks in biological computing machinery are analogous to the concepts of gates and flip-flops. Logic designers make design choices based on analysis and testing; biology makes design choices based on random events and evolution. Both have the goal of optimization, so surely there must be some areas of similarity. The term “artificial intelligence” denotes a humanly engineered facsimile of a biological system. How different are the algorithms of image processing and computer vision from the human visual system? Do similar processes occur in both cases?

Computer scientists talk of “finite state machines” as described theoretically by Alan Turing and implemented in the von Neumann architecture (the terms "von Neumann architecture" and "stored-program computer" are generally used interchangeably). So the question naturally arises: does the von Neumann design occur, in some form, spontaneously in nature?
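A finite state machine can be made concrete in a few lines. The sketch below is a minimal, hypothetical two-state toggle; the state and input names are invented for illustration, not drawn from any biological system:

```python
# Minimal finite state machine, expressed as a transition table.
# States ("off", "on") and input symbols ("pulse", "hold") are
# hypothetical labels chosen for illustration.
TRANSITIONS = {
    ("off", "pulse"): "on",
    ("on",  "pulse"): "off",
    ("off", "hold"):  "off",
    ("on",  "hold"):  "on",
}

def run(state, inputs):
    """Feed a sequence of input symbols through the machine."""
    for symbol in inputs:
        state = TRANSITIONS[(state, symbol)]
    return state

print(run("off", ["pulse", "hold", "pulse"]))  # -> off
```

The entire behavior of the machine lives in the transition table, which is the sense in which a finite state machine is a purely structural, implementation-independent description of computation.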

The basic building blocks are clearly there. The NAND gate is known to be functionally complete, in that any circuit can theoretically be created from some combination of NAND gates. Proteins can be shown to theoretically have the ability to behave as NAND gates… but do they in vivo? Furthermore, at what level of organization does computation begin? Physicists have their theories of “quantum computation,” in which the quantum properties of particles are used to represent data and to perform operations on those data, i.e. intra-atomic computation.

 

Engineering involves building something. Reverse engineering starts with the final product and determines how it works. What lessons can biologists learn from the computing community to help in deciphering these biological networks?

 

Most literature in computational molecular biology has concentrated on hardware analogies. Principles of software design offer many insights as well; properties of algorithms may elucidate biological function.

 


Hardware, Software and Bioware

Man-made computer programs and biological computing face many of the same design issues. It may be enlightening to examine the nature of the man-made processes (hardware and software) to gain insights into biological function (bioware).

 

The programs that computer programmers write are referred to as source code. Source code must be translated, or compiled, into machine code before it can be executed on a computer. Reverse engineering is the process of determining the source code given the output or behaviour of the program. Knowledge of the techniques used is likely to be useful in some measure in the biological domain, as a set of heuristics for how to proceed.

Principles of computer software that may potentially manifest themselves in molecular systems:

 

  • Data- Information regarding the shape of a molecule, such as an antigen, for example.

 

  • Variables- Containers for storing the values used to describe any given situation: unknown quantities that are subject to change, as opposed to constants. Simple variables evolved into more complex arrangements such as arrays, structures, and stacks, which group several variables together into functional sets. See Data structures.

 

  • Assignment statements- Assign specific values to variables.

Proteins may analogously acquire a specific state.

 

  • Instruction set- The list of instructions, and their variations, that a processor can execute; programs are constructed from these as combinations of atomic functional units: subroutines, functions, objects, classes. For example, a partial list of the original 8086/8088 instructions (see also The Telomere Code):

 

      • Decrement by 1
      • Compare operands
      • Logical AND
      • Call procedure
      • Jump
      • Move

 

  • Control structures- Control the sequence of execution of the program commands
      • If then ---------- level of activation, threshold
      • Go to ----------- protein production
      • Call ------------ functional groups
      • Function -------- metabolic pathway, protein scaffolding
      • Do while -------- concentration of ATP above a certain level, telomere length greater than zero
      • For next -------- concentration of reactants
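One of these analogies can be made concrete. The sketch below renders "do while telomere length greater than zero" as a loop in which each cell division shortens the telomere until it is exhausted; the lengths and loss-per-division figure are illustrative numbers, not measured values:

```python
# "Do while telomere length greater than zero": each division
# shortens the telomere, and the loop (division) halts when the
# telomere is exhausted. All numbers here are illustrative.
def divide_until_senescent(telomere_length, loss_per_division=50):
    divisions = 0
    while telomere_length > 0:
        telomere_length -= loss_per_division
        divisions += 1
    return divisions

print(divide_until_senescent(500))  # -> 10
```

The loop condition plays the role of the biological halting criterion: the computation's extent is bounded by a consumable quantity.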

 

  • Program structure
      • Main section- The “Main” section is the top-level view of the program. Programs are structured as a main section that defines global variables and consists of a sequence of subroutine and function calls.

 

One can easily imagine DNA within this model. A protein is a function call to some subroutine embedded in a metabolic pathway. The protein in this context can also be data.

 

Computer data resides in memory encoded in binary. The data a protein carries may reside in the absence or presence of particular chemical functional groups, or in particular states of folding. Proteins are very powerful information-handling devices, capable of carrying and manipulating much information.
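The presence-or-absence picture maps naturally onto a bitfield, one bit per modification site. In the sketch below the site names are hypothetical placeholders; the point is only the encoding, in which adding or removing a functional group flips a bit, just as memory bits encode data:

```python
# Sketch: a protein's modification state as a bitfield, one bit per
# hypothetical site. The site names are placeholders for whatever
# functional groups a real protein could carry.
SITES = ("phospho", "methyl", "acetyl")

def set_group(state, site):
    """Return the state with the given modification bit switched on."""
    return state | (1 << SITES.index(site))

def has_group(state, site):
    """Test whether the given modification bit is set."""
    return bool(state & (1 << SITES.index(site)))

state = 0
state = set_group(state, "phospho")
state = set_group(state, "acetyl")
print(has_group(state, "phospho"), has_group(state, "methyl"))  # -> True False
```

Three binary sites already give 2³ = 8 distinguishable states, which hints at how quickly combinatorial modification turns a single protein into a substantial information carrier.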

 

 

Models of Computation

 

Definition: A formal, abstract definition of a computer. Using a model, one can more easily analyze the intrinsic execution time or memory space of an algorithm while ignoring many implementation issues. There are many models of computation, which differ in computing power (that is, some models can perform computations impossible for other models) and in the cost of various operations.

 

External Links

Inverse bifurcation analysis: application to simple gene systems
