E-CELL: software environment for whole-cell simulation. (1/4007)

MOTIVATION: Genome sequencing projects and further systematic functional analyses of complete gene sets are producing an unprecedented mass of molecular information for a wide range of model organisms. This provides us with a detailed account of the cell with which we may begin to build models for simulating intracellular molecular processes to predict the dynamic behavior of living cells. Previous work in biochemical and genetic simulation has isolated well-characterized pathways for detailed analysis, but methods for building integrative models of the cell that incorporate gene regulation, metabolism and signaling have not been established. We were therefore motivated to develop a software environment for building such integrative models based on gene sets, and for running simulations to conduct experiments in silico. RESULTS: E-CELL, a modeling and simulation environment for biochemical and genetic processes, has been developed. The E-CELL system allows a user to define functions of proteins, protein-protein interactions, protein-DNA interactions, regulation of gene expression and other features of cellular metabolism as a set of reaction rules. E-CELL simulates cell behavior by numerically integrating the differential equations described implicitly in these reaction rules. The user can observe, through a computer display, dynamic changes in the concentrations of proteins, protein complexes and other chemical compounds in the cell. Using this software, we constructed a model of a hypothetical cell with only 127 genes, sufficient for transcription, translation, energy production and phospholipid synthesis. Most of the genes are taken from Mycoplasma genitalium, the organism with the smallest known chromosome, whose complete 580 kb genome sequence was determined at TIGR in 1995. We discuss future applications of the E-CELL system, with particular respect to genome engineering. AVAILABILITY: The E-CELL software is available upon request. SUPPLEMENTARY INFORMATION: The complete list of rules of the developed cell model with kinetic parameters can be obtained via our web site at: http://e-cell.org/.
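The abstract describes E-CELL's core loop: reaction rules implicitly define a system of differential equations over substance concentrations, which the engine integrates numerically. The sketch below illustrates that idea with a simple forward-Euler integrator; the rule representation, rate law and all parameters are invented for illustration and are not E-CELL's actual data model.

```python
# Hypothetical sketch of the idea behind E-CELL's engine: reaction rules
# implicitly define ODEs over substance concentrations, which are
# integrated numerically. Names and rate laws are illustrative only.

def simulate(concentrations, reactions, dt=1e-3, steps=1000):
    """Forward-Euler integration of reaction-rule ODEs.

    concentrations: dict mapping substance name -> concentration
    reactions: list of (rate_fn, stoichiometry) pairs, where rate_fn maps
               the concentration dict to a flux and stoichiometry maps
               substance name -> signed coefficient.
    """
    for _ in range(steps):
        deltas = {name: 0.0 for name in concentrations}
        for rate_fn, stoich in reactions:
            flux = rate_fn(concentrations)
            for name, coeff in stoich.items():
                deltas[name] += coeff * flux
        for name in concentrations:
            concentrations[name] += deltas[name] * dt
    return concentrations

# Example: a single Michaelis-Menten rule "S -> P" (parameters invented).
vmax, km = 1.0, 0.5
reactions = [(lambda c: vmax * c["S"] / (km + c["S"]), {"S": -1, "P": +1})]
print(simulate({"S": 1.0, "P": 0.0}, reactions))
```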

A prognostic computer model to individually predict post-procedural complications in interventional cardiology: the INTERVENT Project. (2/4007)

AIMS: The purpose of this part of the INTERVENT project was (1) to redefine and individually predict post-procedural complications associated with coronary interventions, including alternative/adjunctive techniques to PTCA, and (2) to employ the prognostic INTERVENT computer model to clarify the structural relationship between (pre-)procedural risk factors and post-procedural outcome. METHODS AND RESULTS: In a multicentre study, 2500 data items from 455 consecutive patients (mean age 61.1+/-8.3 years; range 33-84 years) undergoing coronary interventions at three university centres were analysed. 80.4% of the patients were male, 16.7% had unstable angina, and 5.1%/10.1% had acute/subacute myocardial infarction. There were multiple or multivessel stenoses in 16.0%, vessel bending >90 degrees in 14.5%, irregular vessel contours in 65.0%, moderate calcifications in 20.9%, moderate/severe vessel tortuosity in 53.2% and a diameter stenosis of 90-99% in 44.4% of cases. The in-lab (out-of-lab) complications were: 0.4% (0.9%) death, 1.8% (0.2%) abrupt vessel closure with myocardial infarction and 5.5% (4.0%) haemodynamic disorders. CONCLUSION: Computer algorithms derived from artificial intelligence were able to predict the individual risk of these post-procedural complications with an accuracy of >95% and to explain the structural relationship between risk factors and post-procedural complications. The most important prognostic factors were: heart failure (NYHA class), use of adjunctive/alternative techniques (rotablation, atherectomy, laser), acute coronary ischaemia, pre-existing cardiac medication, stenosis length, stenosis morphology (calcification), gender, age, amount of contrast agent and smoking status. Pre-medication with aspirin or other cardiac medication had a beneficial effect. Techniques such as laser angioplasty or atherectomy were predictors of post-procedural complications. Single predictors alone were not able to describe the individual outcome completely.
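The abstract names the prognostic factors but does not disclose the INTERVENT algorithms, so the following is only a schematic sketch of how an individual risk score might be assembled from those factors, here as a logistic model with invented placeholder weights rather than fitted values.

```python
# Illustrative only: all weights and the intercept are invented
# placeholders, not the INTERVENT model's fitted parameters.
import math

WEIGHTS = {  # hypothetical log-odds contributions per prognostic factor
    "nyha_class": 0.40,            # heart failure severity (NYHA I-IV)
    "adjunctive_technique": 0.80,  # rotablation/atherectomy/laser used
    "acute_ischaemia": 0.70,
    "stenosis_length_mm": 0.05,
    "calcification": 0.30,
    "age_years": 0.02,
    "contrast_ml": 0.004,
    "smoker": 0.25,
    "aspirin_premedication": -0.50,  # protective, per the abstract
}
INTERCEPT = -6.0  # invented baseline log-odds

def complication_risk(patient):
    """Predicted probability of a post-procedural complication."""
    z = INTERCEPT + sum(WEIGHTS[k] * patient.get(k, 0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

print(complication_risk({"nyha_class": 2, "acute_ischaemia": 1,
                         "stenosis_length_mm": 18, "age_years": 64,
                         "contrast_ml": 180, "aspirin_premedication": 1}))
```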

Virtual management of radiology examinations in the virtual radiology environment using common object request broker architecture services. (3/4007)

In the Department of Defense (DoD), US Army Medical Command is now embarking on an extremely exciting new project: creating a virtual radiology environment (VRE) for the management of radiology examinations. The business of radiology in the military is therefore being reengineered on several fronts by the VRE Project. In the VRE Project, a set of intelligent agent algorithms determines where examinations are to be routed for reading, based on a knowledge base of the entire VRE. The set of algorithms, called the Meta-Manager, is hierarchical and uses object-based communications between medical treatment facilities (MTFs) and medical centers that have digital imaging network picture archiving and communications systems (DIN-PACS) networks. The communication is based on the use of common object request broker architecture (CORBA) objects and services to send patient demographics and examination images from DIN-PACS networks in the MTFs to the DIN-PACS networks at the medical centers for diagnosis. The Meta-Manager is also responsible for updating the diagnosis at the originating MTF. CORBA services are used to perform secure message communications between DIN-PACS nodes in the VRE network. The Meta-Manager has a fail-safe architecture that allows the master Meta-Manager function to float to regional Meta-Manager sites in case of server failure. A prototype of the CORBA-based Meta-Manager is being developed by the University of Arizona's Computer Engineering Research Laboratory using the unified modeling language (UML) as a design tool. The prototype will implement the main functions described in the Meta-Manager design specification. The results of this project are expected to reengineer the process of radiology in the military and to have extensions to commercial radiology environments.
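As a rough illustration of the routing decision attributed to the Meta-Manager, the sketch below selects a reading site for an examination from a small, invented knowledge base of the VRE; the site records, fields and scoring rule are assumptions, and the real system's CORBA transport is not modeled here.

```python
# Plain-Python sketch of knowledge-base-driven examination routing.
# Site names, fields and the shortest-queue rule are invented.

SITES = [  # hypothetical knowledge base of reading sites
    {"name": "Medical Center A", "modalities": {"CT", "MR"}, "queue": 12, "online": True},
    {"name": "Medical Center B", "modalities": {"CT", "CR"}, "queue": 3, "online": True},
    {"name": "Medical Center C", "modalities": {"MR"}, "queue": 1, "online": False},
]

def route_examination(modality, sites=SITES):
    """Return the online site that can read this modality with the shortest queue."""
    candidates = [s for s in sites if s["online"] and modality in s["modalities"]]
    if not candidates:
        raise RuntimeError(f"no reading site available for modality {modality}")
    return min(candidates, key=lambda s: s["queue"])

print(route_examination("CT")["name"])  # -> Medical Center B
```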

Meta-manager: a requirements analysis. (4/4007)

The digital imaging network picture archiving and communications system (DIN-PACS) will be implemented at ten sites within the Great Plains Regional Medical Command (GPRMC). This network of PACS and teleradiology technology over a shared T1 network has opened the door to round-the-clock radiology coverage of all sites. However, the concept of a virtual radiology environment poses new issues for military medicine, and a new workflow management system must be developed. This workflow management system must allow issues such as quality of care, availability, severe capitation, and quality of the workforce to be resolved efficiently. The design process for this management system must employ existing technology, operate over various telecommunication networks and protocols, be independent of platform operating systems, be flexible and scalable, and involve the end user in the design process from the outset. Using the unified modeling language (UML), the specifications for this new business management system were created jointly by the University of Arizona and the GPRMC. These specifications describe a management system operating in a common object request broker architecture (CORBA) environment. In this presentation, we characterize the Meta-Manager management system, including aspects of intelligence, interfacility routing, fail-safe operations, and expected improvements in patient care and efficiency.
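One of the fail-safe operations described above is letting the master Meta-Manager function float to regional sites on server failure. The sketch below shows one plausible, priority-ordered fail-over; the election mechanism and site names are assumptions, as the abstract does not specify them.

```python
# Sketch of "floating master" fail-over. The priority ordering and the
# heartbeat-style liveness check are assumptions for illustration.

REGIONAL_SITES = ["master", "region-1", "region-2", "region-3"]  # hypothetical

def current_master(is_alive):
    """Return the highest-priority site that is still reachable.

    is_alive: callable site_name -> bool (e.g., a heartbeat check).
    """
    for site in REGIONAL_SITES:
        if is_alive(site):
            return site
    raise RuntimeError("no Meta-Manager site reachable")

# Usage: with the master down, the role floats to region-1.
print(current_master(lambda site: site != "master"))
```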

Integrated radiology information system, picture archiving and communications system, and teleradiology--workflow-driven and future-proof. (5/4007)

The proliferation of integrated radiology information system/picture archiving and communication system (RIS/PACS) and teleradiology installations has been slow because of two concerns: usability and economic return. A major source of dissatisfaction on the usability side is that contemporary systems are not intelligent enough to support the logical workflow of radiologists. We propose to better understand the algorithms underlying radiologists' reading process, and then embed this intelligence into the software so that radiologists can interact with the system with less conscious effort. Regarding economic return, buyers are looking for insurance against obsolescence to protect their investments. We propose to future-proof a system by adhering to the following principles: compliance with industry standards, commercial off-the-shelf (COTS) components, and modularity. An integrated RIS/PACS and teleradiology system designed to be workflow-driven and future-proof is being developed at Texas Tech University Health Sciences Center.
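As a concrete, if hypothetical, example of embedding reading-workflow intelligence in software, the sketch below orders a worklist so that stat cases come first and same-modality studies are grouped to reduce context switching; the ordering policy is invented for illustration, not the system's actual rule.

```python
# Invented worklist-ordering policy: stat first, then grouped by
# modality, longest-waiting studies first within a group.
from dataclasses import dataclass

@dataclass
class Study:
    accession: str
    modality: str
    stat: bool
    minutes_waiting: int

def order_worklist(studies):
    return sorted(studies, key=lambda s: (not s.stat, s.modality, -s.minutes_waiting))

worklist = [
    Study("A100", "CT", stat=False, minutes_waiting=40),
    Study("A101", "CR", stat=True, minutes_waiting=5),
    Study("A102", "CT", stat=False, minutes_waiting=90),
]
for s in order_worklist(worklist):
    print(s.accession, s.modality, "STAT" if s.stat else "")
```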

Mapping of putative binding sites on the ectodomain of the type II TGF-beta receptor by scanning-deletion mutagenesis and knowledge-based modeling. (6/4007)

Binding surfaces of the type II transforming growth factor (TGF)-beta receptor extracellular domain (TbetaRII-ECD) are mapped by combining scanning-deletion mutagenesis results with knowledge-based modeling of the ectodomain structure. Of the 17 deletion mutants produced within the core binding domain of TbetaRII-ECD, only three retained binding to TGF-beta. Comparative modeling based on the crystal structure of the activin type II receptor extracellular domain (ActRII-ECD) indicates that the TbetaRII mutants which retain TGF-beta binding are deleted in some of the loops connecting the beta-strands in the TbetaRII-ECD model. Interpretation of the mutagenesis data within the structural framework of the ectodomain model allows for the prediction of potential binding sites at the surface of TbetaRII-ECD.
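The tiling logic behind scanning-deletion mutagenesis can be sketched in a few lines: slide a fixed-size deletion window across the domain and emit each mutant for binding assays. The window size, step and toy sequence below are illustrative; the study itself produced 17 mutants within the core binding domain of TbetaRII-ECD.

```python
# Sketch of scanning-deletion mutant generation. Window, step and the
# toy sequence are invented; real designs follow the domain boundaries.

def scanning_deletions(sequence, window=6, step=6):
    """Yield (start, deleted_segment, mutant_sequence) for each deletion."""
    for start in range(0, len(sequence) - window + 1, step):
        deleted = sequence[start:start + window]
        mutant = sequence[:start] + sequence[start + window:]
        yield start, deleted, mutant

domain = "ACDEFGHIKLMNPQRSTVWYACDEFGHIKLMNPQRS"  # toy sequence
for start, deleted, mutant in scanning_deletions(domain):
    print(f"del {start}-{start + len(deleted) - 1}: removed {deleted}")
```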

Integrated databases and computer systems for studying eukaryotic gene expression. (7/4007)

MOTIVATION: The goal of this work was to develop a WWW-oriented computer system providing maximal integration of informational and software resources on the regulation of gene expression and navigation through them. The rapid growth in the variety and volume of information accumulated in databases on the regulation of gene expression requires the development of computer systems for automated discovery of knowledge that can then be used for the analysis of regulatory genomic sequences. RESULTS: The GeneExpress system developed includes the following major informational and software modules: (1) Transcription Regulation (TRRD), which contains databases on the transcription regulatory regions of eukaryotic genes and the TRRD Viewer for data visualization; (2) Site Activity Prediction (ACTIVITY), a module for the analysis and prediction of functional site activity; (3) Site Recognition, which comprises (a) the B-DNA-VIDEO system for detecting the conformational and physicochemical properties of DNA sites significant for their recognition, (b) the Consensus and Weight Matrices (ConsFrec) system and (c) the Transcription Factor Binding Sites Recognition (TFBSR) system for detecting conserved contextual regions of functional sites and recognizing them; (4) Gene Networks (GeneNet), which contains an object-oriented database accumulating data on gene networks and signal transduction pathways, and a Java-based Viewer for exploring and visualizing the GeneNet information; (5) mRNA Translation (Leader mRNA), designed to analyze the structural and contextual properties of mRNA 5'-untranslated regions (5'-UTRs) and predict their translation efficiency; and (6) other program modules designed to study the structure-function organization of regulatory genomic sequences and regulatory proteins. AVAILABILITY: GeneExpress is available at http://wwwmgs.bionet.nsc.ru/systems/GeneExpress/ and links to mirror sites can be found at http://wwwmgs.bionet.nsc.ru/mgs/links/mirrors.html.
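Module (3) names consensus and weight-matrix methods for site recognition. As a minimal illustration of weight-matrix scoring, the sketch below slides a position weight matrix along a sequence and reports the top-scoring window; the matrix values are invented and are not GeneExpress's actual matrices.

```python
# Minimal position-weight-matrix (PWM) scan. The 4-position log-odds
# matrix below is a made-up example for illustration only.

PWM = {
    "A": [1.2, -0.5, -1.0, 0.8],
    "C": [-0.7, 1.0, -0.3, -1.2],
    "G": [-0.9, -0.2, 1.4, -0.6],
    "T": [0.1, -1.1, -0.8, 0.9],
}

def best_site(sequence, pwm=PWM):
    """Slide the matrix along the sequence and return the top-scoring window."""
    width = len(next(iter(pwm.values())))
    scored = [
        (sum(pwm[base][i] for i, base in enumerate(sequence[j:j + width])), j)
        for j in range(len(sequence) - width + 1)
    ]
    score, pos = max(scored)
    return pos, sequence[pos:pos + width], score

print(best_site("TTAGCAGGAT"))
```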

Automated diagnosis of data-model conflicts using metadata. (8/4007)

The authors describe a methodology for helping computational biologists diagnose discrepancies they encounter between experimental data and the predictions of scientific models. The authors call these discrepancies data-model conflicts. They have built a prototype system to help scientists resolve these conflicts in a more systematic, evidence-based manner. In computational biology, data-model conflicts are the result of complex computations in which data and models are transformed and evaluated. Increasingly, the data, models, and tools employed in these computations come from diverse and distributed resources, contributing to a widening gap between the scientist and the original context in which these resources were produced. This contextual rift can contribute to the misuse of scientific data or tools and amplifies the problem of diagnosing data-model conflicts. The authors' hypothesis is that systematic collection of metadata about a computational process can help bridge the contextual rift and provide information for supporting automated diagnosis of these conflicts. The methodology involves three major steps. First, the authors decompose the data-model evaluation process into abstract functional components. Next, they use this process decomposition to enumerate the possible causes of the data-model conflict and direct the acquisition of diagnostically relevant metadata. Finally, they use evidence statically and dynamically generated from the metadata collected to identify the most likely causes of the given conflict. They describe how these methods are implemented in a knowledge-based system called GRENDEL and show how GRENDEL can be used to help diagnose conflicts between experimental data and computationally built structural models of the 30S ribosomal subunit.
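The three-step method reads naturally as a small diagnostic engine: decompose the computation into components, enumerate candidate causes per component, and rank causes by evidence drawn from the collected metadata. The sketch below is schematic; the components, causes and evidence tests are invented placeholders, not GRENDEL's actual knowledge base.

```python
# Schematic evidence-based ranking of candidate conflict causes.
# Components, causes and metadata fields are invented examples.

CANDIDATE_CAUSES = [  # (component, cause, evidence test over metadata)
    ("data acquisition", "stale experimental dataset",
     lambda m: m.get("data_age_days", 0) > 365),
    ("model building", "parameter outside calibrated range",
     lambda m: m.get("param_out_of_range", False)),
    ("evaluation", "unit mismatch between data and prediction",
     lambda m: m.get("data_units") != m.get("model_units")),
]

def diagnose(metadata):
    """Return candidate causes ordered by how much evidence supports them."""
    scored = [(component, cause, 1 if test(metadata) else 0)
              for component, cause, test in CANDIDATE_CAUSES]
    return sorted(scored, key=lambda t: -t[2])

meta = {"data_age_days": 700, "data_units": "nm", "model_units": "angstrom"}
for component, cause, score in diagnose(meta):
    print(score, component, cause)
```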